Scott Alexander on conservatism and the self-preservation-against-optimality urge

I think that it might be reasonable to have continuation of your own culture as a terminal goal, even if you know your culture is “worse” in some way than what would replace it. There’s a transhumanist joke — “Instead of protecting human values, why not reprogram humans to like hydrogen? After all, there’s a lot of hydrogen.” There’s way more hydrogen than beautiful art, or star-crossed romances, or exciting adventures. A human who likes beautiful art, star-crossed romances, and exciting adventures is in some sense “worse” than a human who likes hydrogen, since it would be much harder for her to achieve her goals and she would probably be much less happy. But knowing this does not make me any happier about the idea of being reprogrammed in favor of hydrogen-related goals. My own value system might not be objectively the best, or even very good, but it’s my value system and I want to keep it and you can’t take it away from me. I am an individualist and I think of this on an individual level, but I could also see having this self-preservation-against-optimality urge for my community and its values.

(I’ve sometimes heard this called Lovecraftian parochialism, based on H.P. Lovecraft’s philosophy that the universe is vast and incomprehensible and anti-human, and you’ve got to draw the line between Self and Other somewhere, so you might as well draw the line at 1920s Providence, Rhode Island, and call everywhere else from Boston all the way to the unspeakable abyss-city of Y’ha-nthlei just different degrees of horribleness.)

https://slatestarcodex.com/2016/07/25/how-the-west-was-won/

Quote conservatism transhumanism moral philosophy scott alexander