Tyler Cowen on moral principles and the margins at which we should give them up

Cardiff Garcia: Do you think that most or all public intellectuals should write a treatise like [Stubborn Attachments]?

Tyler Cowen: Only if they want to. But I think they all ought to want to. And if you don’t want to, then how can you believe anything? This is foundationalist Tyler coming out again. So you hear all kinds of claims about utilitarianism, about inequality being important or about meritocracy being important, but people never address the question: at what margin am I willing to give up this principle?

That’s a great defect in current political discussion. People have a lot of arguments for why their margin is good but they rarely have any arguments for why they stop at that margin. You wanna redistribute? Well maybe, but why don’t we redistribute even more? And if the people who oppose the redistribution you favour are evil, why aren’t you evil for not proposing even more?

https://thevalmy.com/26 // https://www.ft.com/content/4803dd39-9c14-3a4c-90e4-c9630999660f

quote tyler cowen applied epistemology pragmatism

State credences as fractions to avoid suggesting false precision

I often find it helpful to think in probabilities and adopt a Bayesian mindset.

As part of that, I sometimes state my subjective probabilities in conversation.

Sometimes people respond skeptically, asking “how can you possibly know that?”.

There are several reasons for this skepticism. Sometimes, the reason is that they’ve understood my credence to be much more precise than it is.

Two statements:

  1. I think there’s roughly a 70% chance X will happen
  2. I think there’s roughly a 2/3 chance X will happen

When I say (1), I usually intend to communicate a fairly imprecise credence. That is to say, I mean something equivalent to (2), i.e. I think there’s something between a 60–73% chance X will happen. But, despite the word “roughly”, I find people—especially people who aren’t familiar with the Bayesian mindset—often hear me as expressing a much narrower range, e.g. a “68–72% chance”.

If I state the probability as a fraction rather than as a percentage, I am forced to specify the denominator, and this carries a clearer implication about the precision of my credence. It becomes more natural to think that, after a small positive update, I would round things off and state the same credence, rather than making a minor adjustment (e.g. “I now think there’s roughly a 71% chance”).1
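One way to make the idea concrete (a toy sketch; the function name and the denominator cutoff are my own illustration, not anything from this note): Python’s standard library can round a probability to the nearest fraction with a small denominator, which is exactly the kind of coarse statement I have in mind.

```python
from fractions import Fraction

def coarse_credence(p: float, max_denominator: int = 4) -> Fraction:
    """Round a probability to the nearest fraction with a small denominator.

    Stating the result as e.g. 2/3 rather than 67% makes the coarseness
    of the estimate explicit.
    """
    return Fraction(p).limit_denominator(max_denominator)

print(coarse_credence(0.70))  # 2/3
print(coarse_credence(0.68))  # 2/3 (a small update leaves the stated fraction unchanged)
```

Note how 70% and 68% both collapse to the same stated credence of 2/3: the small denominator absorbs minor updates instead of inviting a spuriously precise adjustment.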

I’m writing this down because it’s a simple improvement I ought to have internalised by now. And also, because I’m starting to think this comms issue may have confused at least one eminent economist, and perhaps a lot of people in finance.


  1. I’ve seen this effect in personal conversations with n ≈ 10 people who weren’t familiar with the Bayesian mindset. I think there’s about a 2/3 chance that it’s real. I would have thought that Tetlock or some “science communication” people would have done studies on this, but I can’t quickly find them.↩︎

writing communication bayesianism

Ross Douthat speculates on religious responses to transformative technology

In a world that remains stable and wealthy and scientifically proficient, someone or someones may figure out how to achieve substantial life extension, artificial wombs, genetic selection for unusual strength or speed or appearance, the breeding of human-animal hybrids, and other scenarios that have belonged for years to the realm of “just around the corner” science fiction.

[…]

Human social institutions would change dramatically, to put it mildly, if babies could be grown in vats and people lived to be 125. Our political and economic debates would be scrambled if the rich suddenly began availing themselves of technologies and treatments that seemed to call a shared human nature into question.

[…]

There is a possible future where it becomes clear that the real bottleneck—the real source of temporary technological decadence—wasn’t technological proficiency so much as a dearth of societal ambition and centralized incentives and unsqueamishness about moral considerations. In this scenario, it would be Chinese scientists, subsidized and encouraged by an ambitious government and unencumbered by residual Christian qualms about certain forms of experimentation, who take the great leap from today’s CRISPR work to tomorrow’s more-than-human supermen—and suddenly the world would enter a scenario out of the original Star Trek timeline and its predicted Eugenics Wars, with a coterie of Khan Noonien Singhs in power in Beijing and the rest of the world trying to decide whether to adapt, surrender, or resist.

[…]

Indeed, you wouldn’t necessarily need the huge AI or biotech leap: even a refinement of existing virtual reality, one that draws more people more fully into the violence and pornography of unreal playgrounds, would seem to demand a religious response, even a dramatic one. Perhaps not quite a jihad… but at the very least an effort to tame and humanize the new technologies of simulation that escapes current culture war categories sufficiently to usher in a different—at long last—religious era than the one the baby boomers made.

[…]

Two things that certain modern prejudices assume can’t be joined together: scientific progress and religious revival. It isn’t just that some technological advance could, as suggested above, act in creative tension with religion by provoking a moral crusade or jihad in response. It’s also that scientific and religious experiments proceed from a similar desire for knowing, a similar belief that the universe is patterned and intelligible and that its secrets might somehow be unlocked. Which is why in periods of real intellectual ferment and development, there is often a general surge of experimentation that extends across multiple ways of seeking knowledge, from the scientific and experimental, to the theological and mystical, to the gray zones and disputed territories in between. Thus, the assumption, common to rationalists today, that religion represents a form of unreason that science has to vanquish on its way to new ages of discovery, is as mistaken as the religious reflex that regards the scientific mind-set as an inevitable threat to the pious simplicities of faith.

[…]

Much as the relationship between science and religion can be adversarial, there can also be a mysterious alchemy between the two forms of human exploration.

The Decadent Society, Chapter 10: Renaissance

quote ross douthat religion

Tyler Cowen on Straussian truths and rational choice ethics

The truths of literature and what you might call the “Straussian truths of the great books”—what you get from Homer or Plato—are at least as important as rational choice ethics. But the people who do rational choice ethics don’t think that. If the two perspectives aren’t integrated, it leads to absurdities—problems like fanaticism, the Repugnant Conclusion, and so on. Right now, though, rational choice ethics is the best we have—the problems of, e.g., Kantian ethics seem much, much worse.

If rational choice ethics were integrated with the “Straussian truths of the great books,” would it lead to different decisions? Maybe not—maybe it would lead to the same decisions with a different attitude. We might come to see rational choice ethics as an imperfect construct, a flawed bubble of meaning that we created for ourselves, and which we shouldn’t expect to keep working in unusual circumstances.

Tyler Cowen, interviewed by Nick Beckstead (2014)

quote tyler cowen moral philosophy pragmatism fanaticism

Tyler Cowen on our local cone of value, and the name of his book

How do you weigh the interests of humans versus animals or creatures that have very little to do with human beings? I think there’s no answer to that. The moral arguments of Stubborn Attachments — they’re all within a cone of sustainable growth for some set of beings. And comparing across beings, I don’t think anyone has good moral theories for that.

https://80000hours.org/podcast/episodes/tyler-cowen-stubborn-attachments/

If you think one of the next things growth will do is make us fundamentally different through something like genetic engineering or drugs… we’re not then animals, but we could then be fundamentally different beings that are outside of the moral cone I’m used to operating in, and then I’m back to not knowing how to evaluate it. There’s some common moral cone that has the humans, not the hamsters; future growth could push what are now humans outside that cone, and then I think I’m back to a kind of incommensurability. In the meantime, I think in terms of expected value: if it doesn’t happen, we’ll just be better off; if it does, we’re not sure. So full steam ahead is where I’m at. But I do think about this quite a bit.

https://thevalmy.com/27

Russ Roberts: Why did you call this book Stubborn Attachments?

Tyler Cowen: The idea that we as humans have stubborn attachments: to other people, to ideas, to schemes, and to our own world. And then trying to create a framework that can make sense of those, and tell people it’s rational that they ought to double down on their best stubborn attachments, and that that’s what makes life meaningful and creates this cornucopia in which moral and ethical philosophy can actually make sense and give us some answers.

https://pca.st/bHUr

[Image: Stubborn Attachments book cover]

quote tyler cowen pragmatism subjectivism

'Other Writers', Leonard Cohen

poetry leonard cohen