This reminds me a lot of climate skeptics pointing out that climate researchers stand to make money off books about climate change.

Selling AI doom books nets considerably less money than actually working on AI (easily an order of magnitude or two). Whatever hangups I have with Yudkowsky, I'm very confident he's not doing it for the money (or even prestige; being an AI thought leader at a lab gives you a built-in audience).

The inverse is true, though - climate skeptics are often paid by the (very rich) petrol lobby to espouse skepticism. It's not an asinine attack, just an insecure one from an audience that also overwhelmingly accepts money in exchange for astroturfing opinions. The clear fallacy in their polemic is that ad hominem attacks don't address the point people care about; they're a distraction from global warming, which is exactly the petrol lobby's end goal.

Yudkowsky's rhetoric is sabotaged by ridiculous forecasts that present zero supporting evidence for his claims. It's the same broken shtick as Cory Doctorow or Vitalik Buterin - grandiose observations that resemble fiction more than reality. He could scare people if he demonstrated causal proof that any of his claims are even possible. Instead he uses this detachment to create nonexistent boogeymen for his foreign policy commentary that would make Tom Clancy blush.

What sort of unsupported ridiculous forecast do you mean? Can you point to one?

I'm not the grandparent, but the more interesting question is what could possibly constitute "supporting evidence" for an AI doom scenario.

Depending on your viewpoint, this could range from "a really compelling analogy" to "a live demonstration akin to the Trinity nuclear test."