Writers move to sites like Substack (or, 15 years ago, Blogspot) funded by other people's money, the way a software developer gets into an AI startup (or, 5 years ago, crypto). You can make bank in the short term even if you should know it will not last. Substack subsidizes individual creators and markets their blogs as cooler than old blogs; Google subsidized web ads and upranked blogs in search results. Yes, it is no fun if you like stability, and it's not a game I play.
The party line has shifted, comrade. This year, with posts on Richard Lynn etc., Scott Alexander is saying in public the same thing he said in a private email: that he thinks race pseudoscientists and neoreactionaries are brilliant and precious, and that as many people as possible need to read the best 1% of their ideas. He is no longer pretending that he thinks they have nothing to offer and just has them in his blogroll because... why, exactly?
The private email from 2014 explained how he hoped people would respond to the anti-neoreactionary FAQ, and his posts this year are 100% consistent with that.
GiveWell is an example of the short-termist end of EA. At the long-termist end, people pay their friends to fantasize about Skynet at 'independent research institutes' like MIRI and Apollo Research. At the "trendy way to get rich people to donate" end, you get buying a retreat center in Berkeley, a stately home in England, and a castle in Czechia so Effective Altruists can relax and network.
It's important to know which type of EA organization you are supporting before you donate, because the movement includes all three.
I assume that GiveWell is the most popular of them. I mean, if you donate to MIRI, it is because you know about MIRI and specifically believe in their cause. But if your attitude is just "hey, I have some money I want to donate, show me a list of effective charities", then GiveWell is that list.
(And I assume that GiveWell top charities receive orders of magnitude more money, but I haven't actually checked the numbers.)
Even GiveWell partnered with the long-termist/hypothetical risk type of EA by funding something called Open Philanthropy. And there are EA organizations which talk about "animal welfare" and mean "what if we replaced the biosphere with something where nothing with a spinal cord ever gets eaten?" So you can't trust "if it calls itself EA, it must be highly efficient at turning donations into measurable good." EA orgs have literally hired personal assistants and bought stately homes for the use of the people running the orgs!
In that essay Scott Alexander more or less says "so Richard Lynn made up numbers about how stupid black and brown people are, but we all know he was right, if only those mean scientists would let us collect the data to prove it." That is the level of thinking most of us moved past in high school, and he is an MD who sees himself as a Public Intellectual! More evidence that thinking too much about IQ makes people stupid.
I think I saw Apollo Research behind a paper that was being hyped a few months ago. The longtermist/rationalist space seems to be creating a lot of new organizations with new names because a critical mass of people hear their old names and say "effective altruism, you mean like Sam Bankman-Fried?" or "LessWrong, like that murder cult?" (which is a bit oversimplified, but a good enough heuristic for most people).
US and California culture have lots of problems around arrogance and refusal to see or hear the rest of the world, but refusing to read research from American universities seems harsh. After all, the most powerful state that has ever existed is the imperial, entitled culture you meant, right?
Why are you projecting so hard? Afraid of something?
Instead of a knee-jerk reaction (useless in this context), you can read the links and find that the countries with the highest reported cheating in each link were France and the UAE.
The US and California are on the opposite end of that spectrum, with less cheating.
The actual Hacker News comment which I was responding to mentioned "imperial-entitled cultures ... the cultural, god-chosen center of the universe ... face culture", which sounds like he means China. The comment linked two papers, but not for specific facts, and they discuss only five countries, just one of which is a hotbed of fake research (the USA: psychology, enough said). There are only a million Bahrainis, so I don't think anyone would need a policy of ignoring research by Bahrainis, but a lot of people talk smack about Indian and Chinese research.
That singularity happened in the fifth century BCE, when people figured out that they could charge silver to teach the art of rhetoric and not just teach their sons and nephews.
Scott Alexander, for what it's worth, is a psychiatrist, race-science enthusiast, and blogger whose closest connection to software development is Bay Area house parties and a failed startup called MetaMed (2012-2015): https://rationalwiki.org/wiki/MetaMed