As suggested in this thread, to a general “yeah, sounds cool”. Let’s see if this goes anywhere.

Original inspiration:

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

If your sneer seems higher quality than you thought, feel free to make it a post; there’s no quota here.

  • Sailor Sega Saturn@awful.systems · 20 points · 10 months ago

    So today I learned there are people who call themselves superforecasters®. Neat!

    The superforecasters® have had a melding of the minds and determined that covid-19 was 75% likely to not be a lab leak. Nifty! This is useless to me!

    Looking at the website of these people with good enough judgement to call themselves “Good Judgement”, you can learn that 100% of superforecasters® agree that there will be less than 100 deaths from H5N1 this year. I don’t know much about H5N1 but I guess that makes sense given that it’s been around since 1996 and would need a mutation to be contagious among humans.

    I found one of the superforecaster®-trainee discussion topics where they reveal some of the secrets of their (super)forecasting(®)-trainee instincts:

    I have used “Copilot” LLM AI to point me in the right direction. And to the point of the LLM they have been trained not to give a response about conflict as they say they are trying to permote peace instead of war using the LLM.

    Riveting!

    Let’s go next to find out how to give up our individuality and become a certified superforecaster® hive brain.

    To minimize the chance that outstanding accuracy resulted from luck rather than skill, we limited eligibility for GJP superforecaster status to those forecasters who participated in at least 50 forecasting questions during a tournament “season.”

    Fans of certain shonen anime may recognize this technique as Kodoku – a deadly poison created by putting a bunch of insects in a jar until only one remains:

    100 species of insects were collected, the larger ones were snakes, the smaller ones were lice, Place them inside, let them eat each other, and keep what is left of the last species. If it is a snake, it is a serpent, if it is a louse, it is a louse. Do this and kill a person.


    “But what’s the catch, Saturn?” I can hear you say. “Surely this is somehow a grift nerds fund, or a way to fleece money out of governments.”

    No no no, you’ve got the completely wrong idea. Good Judgement offers a $100 Superforecasting Fundamentals course out of the goodness of their hearts, I’m sure! I mean, after all, if they spread Superforecasting to the world then their Hari-Seldon-esque hivemind would lose its competitive edge, so they must not be profit-motivated.

    Anyway, if you work for the UK government, they want to hear from you:

    If you are a UK government entity interested in our services, contact us today.

    Maybe they have superforecasted the fall of the British Empire.


    And to end this, because I can never resist a web-design sneer:

    Dear programmers: if you apply the CSS word-break: break-all; to the string “Privacy Policy” it may end up rendered as “Pr[newline]ivacy Policy” which unfortunately looks pretty unprofessional :(

    • sc_griffith@awful.systems · 14 points · 9 months ago

      lmao this is one of my all time favorite grifts. I’ve never understood why it isn’t more popular among us connoisseurs. it’s so baldfaced to say “statistically, someone probably has oracular powers, and thanks to science, here they are. you need only pay us a small incense and rites fee to access them”

      • Architeuthis@awful.systems · 11 points · 9 months ago

        Imo because the whole topic of superforecasters and prediction markets is both undercriticized and kaleidoscopically preposterous, in a way that makes it feel like you shouldn’t broach the topic unless you are prepared to commit to some diatribe-length posting.

        Which somebody should; it’s a shame there is as yet no single place you can point to and say “here’s why this thing is weird and grifty and pretend science, while strictly promoted by the scientology of AI, and also there’s crypto involved”.

        • YouKnowWhoTheFuckIAM@awful.systems · 9 points · 9 months ago

          It’s really gotta be emphasised that these guys didn’t come out of internet atheism and frankly I would really like to know where that idea came from. It’s a completely different thing which, arguably, predates internet atheism (if we read “internet atheism” as beginning in the early 2000s - but we could obviously push back that date much earlier). These guys are more or less out of Silicon Valley, Emile P Torres has coined the term “TESCREALS” (modified to “TREACLES”) for - and I had to google this even though I know all the names independently - “Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism, and Longtermism”.

          It’s a confluence of futurism cults which primarily emerged online (even on the early internet), but also in airport books by e.g. Ray Kurzweil in the 90s, and has gradually made its way into the wider culture, with EA and longtermism now the most successful outgrowths of its spores in the academy.

          Whereas internet atheism kind of bottoms out in 1990s polemics against religion - nominally Christianity, but ultimately fuelled by the end of the Cold War and the West’s hunger for a new enemy (hey look over there, it’s some brown people with a weird religion) - the TREACLES “cluster of ideologies” (I prefer “genealogy”, because this is ultimately about a political genealogy) has deep roots in the weirdest end of libertarian economics/philosophy and rabid anti-communism. And therefore the Cold War (and even pre-Cold War) need for a capitalist political religion. OK the last part is my opinion, but (a) I think it stands up, and (b) it explains the clearly deeply felt need for a techno-religion which justifies the most insane shit as long as there’s money in it.

          • blakestacey@awful.systems · 8 points · 9 months ago

            Yeah, I hung out a lot in Internet skeptic/atheist circles during the 2005-10 era, and as far as I can recall, the overlap with LessWrong, Overcoming Bias, etc., was pretty much nil. This was how that world treated Ray Kurzweil.

            • Architeuthis@awful.systems · 7 points · 9 months ago

              I read

              Most of it was exactly like the example above: Kurzweil tosses a bunch of things into a graph, shows a curve that goes upward, and gets all misty-eyed and spiritual over our Bold Future. Some places it’s OK, when he’s actually looking at something measurable, like processor speed over time. In other places, where he puts bacteria and monkeys on the Y-axis and pontificates about the future of evolution, it’s absurd. I am completely baffled by Kurzweil’s popularity, and in particular the respect he gets in some circles, since his claims simply do not hold up to even casually critical examination.

              and immediately thought someone should introduce PZ Myers to rat/EA as soon as possible.

              Turns out he’s aware of them since at least 2016:

              Are these people for real?

              I’m afraid they are. Google sponsored a conference on “Effective Altruism”, which seems to be a code phrase designed to attract technoloons who think science fiction is reality, so the big worries we ought to have aren’t poverty or climate change or pandemics now, but rather, the danger of killer robots in the 25th century. They are very concerned about something they’ve labeled “existential risk”, which means we should be more concerned about the hypothetical existence of gigantic numbers of potential humans than about mere billions of people now. You have to believe them! They use math!

              More recently, it seems that, as an evolutionary biologist, he has thoughts on the rat concept of genetics: The eugenicists are always oozing out of the woodwork

              FWIW, I used to read PZM quite a bit before he pivoted to doing YouTube videos, which I don’t have the patience for, and he checked out of the new atheist movement (such as it was) pretty much as soon as it became evident that it was gradually turning into a safe space for Islamophobia and misogyny.

          • bitofhope@awful.systems · 6 points · 9 months ago

            I wonder if the existence of RationalWiki contributes to the confusion. Even though it’s unrelated to and critical of TREACLES, the similar name invites mix-ups.

    • Sailor Sega Saturn@awful.systems · 12 points · 10 months ago

      I know what you’re thinking.

      You’re thinking: Saturn, that could have been a post!

      I know, I know, but I can’t handle that kind of pressure. If someone else wants to make a post about this, or prediction markets, don’t let me stop you. It’s an under-sneered area at the intersection of tech weirdos, that other kind of tech weirdos, and that third kind of tech weirdos.

    • Soyweiser@awful.systems · 11 points · 9 months ago

      If you have ever wondered why so many Rationalists do weird end-of-year predictions and keep stats on them, it is because they all want to become superforecasters. (And remember: by correctly forecasting trivial things that are sure to happen, you can inflate your percentage of correct forecasts and become a superforecaster yourself. For the same reason, never try to forecast black swans; or just predict they won’t happen, for more superforecast points.)

      See also you could have been a winner! And got a free sub to ACX!
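      The padding trick above is just averaging: mix enough guaranteed-correct trivial forecasts into your track record and the headline hit rate climbs on its own. A minimal sketch (hypothetical numbers, not from any real forecaster's record):

```python
def padded_accuracy(real_correct: int, real_total: int, sure_things: int) -> float:
    """Overall hit rate after padding a real forecasting record with
    `sure_things` trivial forecasts that are certain to come true."""
    return (real_correct + sure_things) / (real_total + sure_things)

# 10 genuine forecasts, half right: 50% accuracy.
print(padded_accuracy(5, 10, 0))   # -> 0.5
# Pad with 40 sure things ("the sun will rise"): 90% accuracy,
# with zero additional forecasting skill.
print(padded_accuracy(5, 10, 40))  # -> 0.9
```

      The same arithmetic explains the incentive to never touch black swans: one hard miss drags the average down far more than another easy hit lifts it.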

      • David Gerard@awful.systems (OP, mod) · 11 points · 9 months ago

        this is what literary Bayesianism promises: the ability to pluck numbers out of your ass and announce them in a confident voice

    • swlabr@awful.systems · 7 points · 9 months ago

      Fans of certain shonen anime may recognize this technique as Kodoku – a deadly poison created by putting a bunch of insects in a jar until only one remains

      I understood this reference. I know it as Gu poison, which is listed in the Wikipedia article you linked!

      To minimize the chance that outstanding accuracy resulted from luck rather than skill, we limited eligibility for GJP superforecaster status to those forecasters who participated in at least 50 forecasting questions during a tournament “season.”

      When I was a kid I read a vignette about a guy trying to scam people into thinking he was amazing at predicting things. He chose 1024 stockbrokers and picked one stock; in 512 envelopes he said the stock would be up by the end of the month, and in the other 512 he said it would go down. You can see where this story is going: after ten rounds of halving, he would be left with one person who thought he had predicted 10 things in a row correctly and was therefore a superforecaster. This vignette was great at illustrating to child me that predicting things correctly isn’t necessarily some display of great intelligence or insight. Unfortunately, it was also setting me up for great disappointment, because forevermore I would see, time and time again, how easily people fall for this shit.

      (For some reason when I try to think of where I read that vignette, vonnegut comes to mind. I doubt it was him.)
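      The arithmetic of the envelope scam above is just repeated halving, and connects directly to the GJP quote: survivorship selection manufactures a "perfect" record by construction. A quick sketch (a hypothetical simulation, not from the vignette itself):

```python
def envelope_scam(marks: int = 1024) -> int:
    """Count how many 'correct predictions in a row' the last surviving
    mark has seen. Each round, opposite predictions go to each half of
    the pool, so whichever way the stock moves, half saw a correct call
    and the other half is discarded."""
    rounds = 0
    while marks > 1:
        marks //= 2   # keep only the half that received the correct prediction
        rounds += 1
    return rounds

# Starting from 1024 stockbrokers, exactly one person remains after
# 10 rounds, having witnessed 10 consecutive "correct" predictions.
print(envelope_scam(1024))  # -> 10
```

      The scammer never predicts anything; the selection procedure guarantees that someone, somewhere, ends up holding an unbroken streak.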