By that I mean it must be an inherently comforting thing to believe: we know this instinctively and want there to be something after death, because it feels right, or more meaningful. There’s a reason basically every civilization ever has some sort of afterlife ethos.
I realize I am basically horseshoeing my way into evangelicalism, but still. Maybe life would be better if we believed there was something beyond this. [edit - please note that yes, the world is shitty, things are awful and getting worse, and that is exactly my point – we get THIS SHIT, and nothing else? god that’s awful]
Not believing in an afterlife is fantastic! When I die my life is complete, and what I did during my life is all that matters. No worries about having to meet some arbitrary moral code or fighting for eternity or being reincarnated, just the void like before I was born.
Now, living and becoming more and more aware of how completely awful people can be, that is depressing.