Readings & Past Writings
Readings
(selected for frequency and uniqueness)
- Substack (the Profile, Doxa, Mule's Musings, Deep into the Fores, Noah, Synbiobeta, Dreams of Electric Sheep, Avoid Boring People, Chinese Characteristics, Danco and The Diff)
- Financial Times Weekend and Lex. I've been quoted in the FT (see below)
- the Economist
- Less Wrong
- Foreign Policy
Rationalism
I closely follow a number of rationalist blogs, and I strongly recommend the readings below:
Byrne Hobart's description of rationalism:
Rationalism is a complex subculture that is hard to summarize in a few paragraphs. In one sense it's just a collection of sites that link to one another frequently, but in another sense it's a group of people who have to repeatedly insist that they're not a religion or cult despite having sacred texts, myths, unusual diets, distinctive family arrangements, and marriage rituals.
One thing rationalists explicitly prize is making quantifiable estimates of low probabilities. In fact, one important branch of their community, Effective Altruism, is devoted to making charitable donations from a utilitarian perspective. These decisions are highly sensitive to their underlying assumptions; if you change the moral weight ascribed to chicken suffering, you get a very different answer. In philosophical terms, it's an admirable effort to make first principles explicit, but in practical terms it doesn't get rid of the need to have those principles in the first place.
One thing rationalists implicitly prize is rigorous, quantifiable arguments that lead to very counterintuitive beliefs, such as: we probably live in a computer simulation, it would be a good idea to freeze your body or brain when you die, reducing the risk that a poorly-specified artificial intelligence destroys humanity is the single most important problem anyone could work on, or, in early 2020, that the respiratory disease outbreak in Wuhan was a big deal that would kill vast numbers of people if it wasn't stopped, perhaps by disregarding the expert consensus on masks.
And I believe I belong to the following group Byrne describes:
The influence of people who read rationalist blogs, but don't self-identify as rationalists, is quite wide—the blogs are very widely followed in technology circles, and anecdotally have a large audience in the more quantitative branches of finance. Identifying as a rationalist is a losing move, because non-rationalists will think it's weird (objectively true!) and rationalists will be relatively indifferent to a personal label.
Rationalists and Thymos
Applied Divinity Studies asks why adherents to the rationalist online subculture aren’t more successful. It’s a good question: rationalists try to identify and improve on their biases (the group first gelled on Overcoming Bias before moving to LessWrong—avoiding mistaken beliefs is their whole theme). Rationalists tend to be well-educated, and they’re overrepresented in the high-earning field of software. And yet, they don’t seem overwhelmingly successful. ADS has a theory:
Rather than general dishonesty, my theory is that founders neglect one kind of reasoning very specifically. The same kind most rationalists are obsessed with: taking the outside view.
…
Here’s a more concrete example: A rationalist has a good startup idea, so they set out to calculate expected value. YC’s acceptance rate is something like 1%, and even within YC companies, only 1% of them will ever be worth $1 billion. So your odds of actually having an exit of that magnitude are 10,000 to 1, and then you’re diluted down to 10% ownership and taxed at around 50%. Of course, there are exits under and above a billion, but back-of-the-napkin, you’re looking at an expected $5,000 for 10+ years of work so grueling that even successful founders describe it as “eating glass and staring at the abyss.”
This may be true. Rationalists are unusually likely to respond to Fermi Estimates by agreeing that, for example, the impact of runaway AI or nuclear war suggests that we should spend more time solving existential risks, or that the high payoff from living forever offsets the low probability that cryogenics will work. But they may have run the numbers on founding companies and decided that it’s a poor risk-reward tradeoff, and that they’re better off working for big tech. In that case, rationalists may be very successful on a Sharpe ratio basis, but under-levered.
There’s a collective action problem here. Many of the most successful companies are somewhat ideological, which gives them a monopoly on recruiting particular kinds of employees, and keeps the organization motivated even when things are difficult. Odd shared beliefs are a source of thymos. For any one person, starting such a company might be suboptimal, but for the group, the existence of a rationalist-dominated startup would be a net gain. So the outside view strikes again: rationalists are always in the habit of asking “why would I be special?” but if nobody asks that question, nothing special gets started.
(Another possibility is that there are a number of highly successful, rationalist-sympathetic people, but they’re discreet about their adherence to rationalism. This could be a prudent choice not to draw undue attention, or it could be because rationalists are much more introverted than average, and simply haven’t gotten around to telling people about their views.)
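ADS's back-of-the-napkin numbers can be checked directly. A minimal sketch of the same Fermi estimate, using only the figures given in the quote above (a 1% YC acceptance rate, a 1% unicorn rate within YC, a $1B exit, ~10% founder ownership after dilution, and ~50% tax):

```python
# Fermi estimate from the ADS quote: expected value of founding a startup.
yc_acceptance = 0.01         # ~1% of applicants get into YC
unicorn_rate = 0.01          # ~1% of YC companies reach a $1B valuation
exit_value = 1_000_000_000   # the $1B exit considered in the quote
ownership = 0.10             # founder diluted to ~10% by exit
tax_rate = 0.50              # ~50% tax on the proceeds

p_exit = yc_acceptance * unicorn_rate  # 1 in 10,000
expected_value = p_exit * exit_value * ownership * (1 - tax_rate)

print(f"P(exit) = 1 in {1 / p_exit:,.0f}")       # 1 in 10,000
print(f"Expected value = ${expected_value:,.0f}")  # $5,000
```

Multiplying through reproduces the quoted figure: a 1-in-10,000 shot at $1B, cut to 10% ownership and halved by tax, is an expected $5,000 for a decade of work.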
My undergraduate degree in Philosophy, Politics and Economics is basically a reading degree. In case you are interested, I am taking these papers:
- Comparative Government
- Political Sociology
- Marx and Marxism
- Politics in the Middle East
- Politics in China
- Microeconomics
- Quantitative Economics
- Game Theory
Past Writings
1. Medium
As a believer in complete transparency, I decided to show you my Medium stats:
My best-performing article is one about Pret's coffee subscription:
And it was also quoted in the Financial Times:
2. The Oxford Blue
I've written two articles for The Oxford Blue, a student-run online newspaper based in Oxford.