• Lime Buzz (fae/she)@beehaw.org · 1 hour ago

    There’s no such thing as a brainwashing machine. Brainwashing is a very specific process whose effects don’t last long outside the torture or other coercive scenario that creates it.

    What they mean is a conditioning machine.

    • TheRtRevKaiser@beehaw.org (OP, mod) · edited · 49 minutes ago

      You’re right, but I think they’re using “brainwashing” in a colloquial sense. The common perception is that misinformation on the internet persuades people into more extreme views. What the author of this article argues is that online misinformation mostly lets people easily justify beliefs they have already formed, and quickly resolve the cognitive dissonance of encountering information that contradicts those beliefs. People have always done this, but the modern internet has made it so easy that more and more people who might previously have been unable to cognitively sustain fringe worldviews are now embracing them.

      It’s a small difference in the way we think about misinformation online, but I think it’s important that we understand what is likely happening. It’s not so much that misinformation is changing people’s beliefs, but that it’s allowing people to hang onto beliefs that contradict reality more easily.

      • Lime Buzz (fae/she)@beehaw.org · 29 minutes ago

        Excellent reply.

        Honestly, I think it’s both. But you may be correct that, sadly, it’s mostly about being able to hang on to beliefs.

        • TheRtRevKaiser@beehaw.org (OP, mod) · 11 minutes ago

          I actually also think it’s probably both, to a degree; that’s just not what the author of the article is arguing. There’s probably a certain amount of persuasion pulling people deeper into a belief system they might only be partially invested in at first; then they’re sucked into ecosystems that reinforce those beliefs and pull them further in. I don’t have anything but vibes and lots of half-remembered reading about online radicalization, though.

  • Chris Remington@beehaw.org (mod) · 22 hours ago

    Confronted with information that could shake their worldviews, people can now search for confirming evidence and mainline conspiracist feeds or decontextualized videos. They can ask AI and their favorite influencers to tell them why they are right. They can build tailored feeds and watch as algorithms deliver what they’re looking for. And they will be overwhelmed with data.

    This is, precisely, what morons do. How do you fix stupid?

    • megopie@beehaw.org · 33 minutes ago

      Alter the incentive structures of these systems to make it unappealing to peddle sensationalism. Change how content-promotion algorithms work so they don’t reward rage bait. Don’t let bad behavior be rewarded with attention, especially when attention is money.
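
      To make the incentive point concrete, here is a minimal, purely hypothetical sketch of what “don’t reward rage bait” could look like in a ranking function. Nothing here reflects any platform’s actual code; the Post fields, the weights, and rank_score are all invented for illustration.

      ```python
      # Purely hypothetical sketch, not any platform's real ranking code:
      # one way to stop rewarding rage bait is to score posts on engagement
      # minus outrage-correlated signals, so provoking anger costs reach.
      from dataclasses import dataclass


      @dataclass
      class Post:
          upvotes: int       # baseline positive engagement
          replies: int       # raw attention, regardless of sentiment
          angry_reacts: int  # assumed proxy for outrage engagement
          reports: int       # assumed proxy for bad behavior


      def rank_score(post: Post) -> float:
          """Toy rule: reward engagement, but make outrage signals cost
          more reach than they earn. All weights are illustrative."""
          reward = post.upvotes + 0.5 * post.replies
          penalty = 2.0 * post.angry_reacts + 5.0 * post.reports
          return reward - penalty


      # A rage-bait post can draw more raw attention yet rank lower:
      civil = Post(upvotes=40, replies=10, angry_reacts=2, reports=0)
      ragebait = Post(upvotes=60, replies=80, angry_reacts=50, reports=12)
      print(rank_score(civil), rank_score(ragebait))  # 41.0 vs -60.0
      ```

      The design point is just that the penalty weights outgrow the engagement reward, so attention stops being free money for outrage.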