Longtermism:
Longtermism is an ethical stance which gives priority to improving the long-term future. It is an important concept in effective altruism and serves as a primary motivation for efforts to reduce existential risks to humanity.
Sigal Samuel from Vox summarizes the key argument for longtermism as follows: “future people matter morally just as much as people alive today;… there may well be more people alive in the future than there are in the present or have been in the past; and… we can positively affect future people’s lives.” These three ideas taken together suggest, to those advocating longtermism, that it is the responsibility of those living now to ensure that future generations get to survive and flourish. [Wikipedia]
Uh huh. A deliberate purging of the chronological aspect of existence, sounds like, with a dubious helping of the same intellectual errors made by anti-abortionists, to wit, equating the prior existence of some things with the later existence of different things.
Noted in “Why ‘longtermism’ isn’t ethically sound,” Christine Emba, WaPo:
Longtermism relies on the theory that humans have evolved fairly recently, and thus we can expect our species to grow long into the future. The world’s current population is really a blip; if all goes well, a huge number of humans will come after us. Thus, if we’re reasoning rationally and impartially (as EAs pride themselves on doing), we should tilt heavily toward paying attention to this larger future population’s concerns — not the concerns of people living right now.
Depending on how you crunch the numbers, making even the minutest progress on avoiding existential risk can be seen as more worthwhile than saving millions of people alive today. In the big picture, “neartermist” problems such as poverty and global health don’t affect enough people to be worth worrying about — what we should really be obsessing over is the chance of a sci-fi apocalypse.
It rather makes me wonder which group of humans gets the assistance. I mean, if not today’s suffering folks, how about tomorrow’s? The day after? A thousand years from now?
I wonder if they’ll be making a religion out of this, with your rank in the sainthood determined by how many people you, ah, “helped.” Every once in a while they can update the computer’s database of people with the latest tallies, just to give those so stored a bit of a thrill.