Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you'll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post; there's no quota for posting and the bar really isn't that high.

The post-Xitter web has spawned soo many "esoteric" right-wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality-challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn't be surgeons because they didn't believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can't escape them, I would love to sneer at them.

Last week's thread

(Semi-obligatory thanks to @dgerard for starting this)

  • gerikson@awful.systems · 13 points · 2 months ago

    Despite Soatok explicitly warning users that posting his latest rant[1] to the more popular tech aggregators would lead to loss of karma and/or public ridicule, someone did just that on lobsters and provoked this mask-slippage[2]. (The comment is in three paragraphs, which I will subcomment on below.)

    Obligatory note that, speaking as a rationalist-tribe member, to a first approximation nobody in the community is actually interested in the Basilisk and hasn't been for at least a decade. As far as I can tell, it's a meme that is exclusively kept alive by our detractors.

    This is the Rationalist version of the village worthy complaining that everyone keeps bringing up that one time he fucked a goat.

    Also, "this sure looks like a religion to me" can be - and is - argued about any human social activity. I'm quite happy to see rationality in the company of, say, feminism and climate change.

    Sure, "religion" is on a sliding scale, but Big Yud-flavored Rationality ticks more of the boxes on the "Religion or not" checklist than feminism or climate change do. In fact, treating the latter two as religions is often a way to denigrate them, and is never done in good faith.

    Finally, of course, it is very much not just rationalists who believe that AI represents an existential risk. We just got there twenty years early.

    Citation very much needed, bub.


    [1] https://soatok.blog/2024/09/18/the-continued-trajectory-of-idiocy-in-the-tech-industry/

    [2] link and username withheld to protect the guilty. Suffice to say that They Are On My List.

    • Soyweiser@awful.systems · 11 points · edited · 2 months ago

      nobody in the community is actually interested in the Basilisk

      But you should; y'all created an idea which some people do take seriously, and it is causing them mental harm. In fact, Yud took it so seriously in a way that shows that either he believes in potential acausal blackmail himself, or that enough people in the community believe in it that the idea would cause harm.

      A community he created to help people think better, which now has a mental minefield somewhere in it; but because they want to look sane to outsiders, people don't talk about it (and also pretend that the now mentally exploded people don't exist). This is bad.

      I get that we put them in a no-win situation: either take their own ideas seriously enough to talk about acausal blackmail, and then either help people by disproving the idea, or help people by going 'this part of our totally Rational way of thinking is actually toxic and radioactive and you should keep away from it (a bit like Hegel, am I right?(*))'. That makes them look a bit silly for taking it seriously (to which you could say: who cares?), or a bit openly culty if they go with the secret-knowledge route. Or they could pretend it never happened, never was a big deal, and isn't a big deal, in an attempt to not look silly. Of course, we know what happened, and that it still is causing harm to a small group of (proto-)Rationalists. This option makes them look insecure, potentially dangerous, and weak to social pressure.

      That they do the last one, while also having written a lot about acausal trading, just shows they don't take their own ideas that seriously. Or, if it is an open secret not to talk openly about acausal trade due to acausal blackmail, that is just more cult signs. You have to reach level 10 before they teach you about the lord Xenu type stuff.

      Anyway, I assume this is a bit of a problem for all communal worldbuilding projects: eventually somebody introduces a few ideas which have far-reaching consequences for the roleplay but which people would rather not have included. It gets worse when the non-larping outside world then notices you and the first reaction is to pretend larping isn't that important to your group because the incident was a bit embarrassing. Own the lightning-bolt tennis ball, it is fine. (**)

      *: I actually don't know enough about philosophy to know if this joke is correct, so apologies if Hegel is not hated.

      **: I admit, this joke was all a bit forced.

    • ShakingMyHead@awful.systems · 11 points · 2 months ago

      Obligatory note that, speaking as a rationalist-tribe member, to a first approximation nobody in the community is actually interested in the Basilisk and hasn't been for at least a decade.

      Sure, but that doesn't change the fact that the head EA guy wrote an op-ed for Time magazine arguing that a nuclear holocaust is preferable to a world that has GPT-5 in it.

        • ShakingMyHead@awful.systems · 6 points · 2 months ago

          Finally, of course, it is very much not just rationalists who believe that AI represents an existential risk. We just got there twenty years early.

          This one?

    • David Gerard@awful.systems (mod) · 8 points · 2 months ago

      nobody in the community is actually interested in the Basilisk

      except the ones still getting upset over it, but if we deny their existence as hard as possible they won't be there

      • gerikson@awful.systems · 8 points · edited · 2 months ago

        The reference to the Basilisk was literally one sentence and not central to the post at all, but this big-R Rationalist couldn't resist singling it out and loudly proclaiming it's not relevant anymore. The m'lady doth protest too much.