Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you'll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post — there's no quota for posting and the bar really isn't that high.
The post Xitter web has spawned soo many "esoteric" right wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be)
Like, there was one dude a while back who insisted that women couldn't be surgeons because they didn't believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can't escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this.)
I hear that even though Yud started blogging on his site, and even though George Mason University type economics is trendy with EA and LessWrong, Hanson never identified himself with EA or LessWrong as movements. So this is like Gabriele D'Annunzio insisting he is a nationalist not a fascist, not Nicholas Taleb denouncing phrenology.
I deeply regret having made posts in the past proclaiming LessWrong as amazing.
They do still have a decent article here and there, but that's like digging for strawberries in a pile of shit. Even if you find one, it won't be great.
We have some threads of Vaccinations in Book/Article Form which try to share good pop science and textbooks without the cult shit and Dunning-Kruger. People who think they know everything and are mysteriously underemployed tend to have the most time to post though.
It is pretty good as a source for science fiction ideas. I mean, lots of their ideas originate from science fiction, but their original ideas would make fun fantasy sci-fi concepts. Like looking off their current front page… https://www.lesswrong.com/posts/WLFRkm3PhJ3Ty27QH/the-cats-are-on-to-something cats deliberately latching on to humans as the laziest way of advancing their own values across the future seems like a solid point of fantasy worldbuilding…
He had me in the first half; I thought he was calling out the rationalists' problems (even if dishonestly disassociating himself from them). But then his recommended solution was prediction markets (a concept which rationalists have in fact been trying to play around with, albeit at a toy-model level with fake money).
Also a concept that Scott Aaronson praised Hanson for.
https://web.archive.org/web/20210425233250/https://twitter.com/arthur_affect/status/994112139420876800
(Crediting the "Great Filter" to Hanson, like Scott Computers there, sounds like some fuckin' bullshit to me. In Cosmos, Carl Sagan wrote, "Why are they not here? There are many possible answers. Although it runs contrary to the heritage of Aristarchus and Copernicus, perhaps we are the first. Some technical civilization must be the first to emerge in the history of the Galaxy. Perhaps we are mistaken in our belief that at least occasional civilizations avoid self-destruction." And in his discussion of abiogenesis: "Life had arisen almost immediately after the origin of the Earth, which suggests that life may be an inevitable chemical process on an Earth-like planet. But life did not evolve beyond blue-green algae for three billion years, which suggests that large lifeforms with specialized organs are hard to evolve, harder even than the origin of life. Perhaps there are many other planets that today have abundant microbes but no big beasts and vegetables." Boom! There it is, in only the most successful pop-science book of the century.)
A futarchy, you say? Tell me more, Robin Hanson
I noticed that Hanson speculated that "most of the Great Filter is most likely to be explained by […] the steps in the biological evolution of life and intelligence", and then lied by omission about Sagan's position. He said that Sagan appealed to "social science" and believed that the winnowing effect is civilizations blowing themselves up with nukes. He cites an obscure paper from 1983, while ignoring the, again, most successful pop-science book of the century.
Honestly Hanson is so awful the rationalists almost make him look better by association.
He's the one that used the phrase "silent gentle rape"? Yeah, he's at least as bad as the worst evo-psych pseudoscience misogyny posted on LessWrong, with the added twist that he has a position in academia to lend him more legitimacy.
I started reading his post with that title to refresh myself. Just to get your feet wet:
Man, what happened in the three years it took for a content warning?
Anyway I skimmed it, the rest of the post is a huge pile of shit that I don't want to read any more of, I'm sure it's been picked apart already. But JFC.
The Manifest networking event in Berkeley combines prediction markets, race cranks, EA, and LessWrong. Scott Alexander likes prediction markets, does Yud?
To add to blakestacey's answer, his fictional worldbuilding concept, dath ilan (which he treats like rigorous academic work to the point of citing it in tweets), uses prediction markets in basically everything, from setting government policy to healthcare plans to deciding what restaurant to eat at.
https://xcancel.com/ESYudkowsky/status/1933973423472164955
So Hanson is dissing one of the few movements that supports his pet contrarian policy? After the Defence Department lost interest the only people who like prediction markets seem to be LessWrongers / EAs / tech libertarians / crypto bros / worshippers of Friend Computer.
Apparently Donald Trump Jr. has found his way into the payroll of a couple of the bigger prediction markets, so they seem to be doing their darndest to change that.
Every tweet in that thread is sneerable, whether from failing to understand the current scientific process, vastly overestimating how easily cutting-edge research can be turned into cleanly resolvable predictions, or assuming prediction markets are magic.
Pretty easy to look at actually-existing instances and note just how laughable "traders trusted us enough for the market to be liquid" is.
This is just another data point begging what I believe to be the most important question an American can ask themselves right now: why be a sucker?
Bet it's more like assuming it will incentivize people with magical predicting genes to reproduce more so we can get a kwisatz haderach to fight AI down the line.
It's always dumber than expected.