TLDR if you don’t wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this 5 times, each time changing their location to a random city in the US.

Below is the number of shorts after which alt-right content was recommended. Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to be recommended (with either AI Jesus talking to you, or an AI narrator reading verses from the Bible). After this, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started to be recommended. Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said that this seemed to be the norm for Chicago, as they had observed the same in another similar experiment (which dealt with long-form content instead of shorts). A few shorts in, there came one where AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He went on about how voting for “Kamilia” would lose you “10000 rizz”, and how voting for Trump would get you “1 million rizz”.

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion and therefore rank higher in the algorithm. They say the algorithm isn’t necessarily left wing or right wing, but that the alt-right has simply understood better how to capture and grow an audience on it.
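
To make that hypothesis concrete, here is a minimal sketch (my own illustration, not anything from the video; the titles, scores and weights are invented) of how a ranker that only optimizes predicted engagement would surface rage-bait regardless of ideology:

    # Toy illustration of the hypothesis: the ranker only maximizes predicted
    # engagement, so whatever reliably provokes reactions wins, regardless of
    # its politics. All titles and numbers are made up.

    videos = [
        {"title": "calm gardening tips",      "watch_ratio": 0.40, "reaction_rate": 0.02},
        {"title": "AI Jesus reads a verse",   "watch_ratio": 0.55, "reaction_rate": 0.10},
        {"title": "pundit DESTROYS opponent", "watch_ratio": 0.70, "reaction_rate": 0.35},
    ]

    def engagement_score(v, reaction_weight=2.0):
        """Hypothetical score: watch-through plus a bonus for likes/comments/shares."""
        return v["watch_ratio"] + reaction_weight * v["reaction_rate"]

    for v in sorted(videos, key=engagement_score, reverse=True):
        print(f"{engagement_score(v):.2f}  {v['title']}")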

  • Zoot@reddthat.com · 3 points · 2 days ago

    Just anecdotal but I only ever watch duck videos or funny animal videos with occasional other funnies or crazy science things, and that’s still all I ever get. Other days I get plenty of cool music like tesla coils making music or other piano music.

    Am I youtubing wrong?

    • UraniumBlazer@lemm.ee (OP) · 1 point · 2 days ago

      Nah good for you. Maybe it’s because of your geographical location/you just being lucky? I have experienced what the video above says quite a lot though.

      I’m not American, so I didn’t exactly see a lot of Trump (although there was some amount of it). I largely saw a lot of Hindu nationalist content (cuz of my geographical location). The more I disliked the videos, the more they got recommended to me. It was absolutely pathetic.

  • gmtom@lemmy.world · 45 points · 6 days ago

    So you’re saying we need to start pumping out low quality left wing brainrot?

    • eronth@lemmy.world · 20 points · 5 days ago

      Insanely, that seems to be the play. Not logic or reason, but brainrot and low blows. Which is a bit at odds with the actual desire.

      • xor@lemmy.dbzer0.com · 8 points · 5 days ago

        fight fire with fire i guess….
        maybe people get on board quicker if they feel the emotions first, and then learn the logic….
        one good example is Noam Chomsky: everything he says is gold, but he says it so slowly and dispassionately that even people who agree with him find it hard to watch.

    • sudo@programming.dev · 7 points · 5 days ago (edited)

      It only has to be extremely simplified and evoke emotional reactions. That’s just basic propaganda rules. The brainrot quality of the content is a consequence of its sheer quantity. You can’t make that volume of content without it being fully automated AI slop.

      What the experiment overlooks is that there are PR companies being paid to flood YouTube with right-wing content and actively game its algorithm. There simply isn’t a left wing with the capital to manufacture that much content. No Soros-bucks for AI minions in keffiyehs talking about Medicare.

  • Victor@lemmy.world · 74 points (1 downvote) · 7 days ago

    I keep getting recommendations for content like “this woke person got DESTROYED by logic” on YouTube. Even though I click “not interested”, and even “don’t recommend channel”, I keep getting the same channel, AND video recommendation(s). It’s pretty obvious bullshit.

    • SaharaMaleikuhm@feddit.org · 23 points (1 downvote) · 7 days ago

      Anything but the subscriptions page is absolute garbage on that site. Ideally get an app to track your subs without having to have an account. NewPipe, FreeTube etc.

      • Victor@lemmy.world · 7 points · 7 days ago (edited)

        Are those available on PC/Linux? On my TV? 😭 I have them on my phone but I feel like there’s too much hassle to do on my main viewing devices.

        • prole@lemmy.blahaj.zone · 3 points · 7 days ago (edited)

          I use FreeTube on Linux. I think it’s Chromium based, so some people don’t like it, and it’s usually one of the bigger resource hogs when I have it open, but it’s worth it for the ad-free, subscriptions-only experience imo…

          Though lately it hasn’t been behaving well with the vpn…

          • Victor@lemmy.world · 2 points · 7 days ago

            I mean, on PC I’m not really having much issue. I don’t fall for “recommendations”, and I run ublock origin in Firefox so I have zero ads. All good there. The TV is the worst though…

              • Victor@lemmy.world · 2 points · 7 days ago

                I know, sorry. I realized it wasn’t an issue on PC after the fact.

                But the TV… It’s brutal, the amount of long, unskippable ads. It’s worse than on regular/linear television.

    • lennivelkant@discuss.tchncs.de · 19 points · 7 days ago

      You’d think a recommendation algorithm should take your preferences into account - that’s the whole justification for tracking your usage in the first place: recommending relevant content for you…

        • lennivelkant@discuss.tchncs.de · 7 points · 7 days ago

          Even in the best-intentioned recommender system, trained on the content you watch to estimate what you’re interested in and recommend similar things, that would be the drift of things. You can’t really mathematically judge the emotions viewers might feel unless they express them in a measurable way, so the system observes their behaviour and recommends similar content by whatever heuristic it has. And if they keep clicking on rageposts, that’s what the system has to go on.

          But at least giving the explicit indication “I don’t want to see this” should be heavily weighted in that calculation. Just straight up ignoring that is an extra layer of awful.
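
          To put a number on “heavily weighted”, here is a rough sketch of what respecting that signal could look like (the scoring function, field names and penalty value are hypothetical, not YouTube’s actual system):

              # Sketch of weighting explicit feedback into a recommendation score.
              # The penalty value and field names are made up.

              def recommendation_score(similarity, not_interested_clicks, penalty=5.0):
                  """Behavioural similarity minus a large penalty per explicit rejection."""
                  return similarity - penalty * not_interested_clicks

              # A channel you watch a lot but have explicitly rejected should sink, not resurface:
              print(recommendation_score(similarity=3.2, not_interested_clicks=0))  # 3.2 -> shown
              print(recommendation_score(similarity=3.2, not_interested_clicks=2))  # -6.8 -> buried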

      • andallthat@lemmy.world · 11 points · 6 days ago (edited)

        it is. But who said that you get to decide what’s relevant for you? Welcome and learn to trust your algorithmic overlords

      • sudo@programming.dev · 3 points · 5 days ago

        Wrong, the whole purpose of tracking your usage is to identify what kind of consumer you are so they can sell your views to advertisers. Recommendations are based on what category of consumer you’ve been identified as. Maintaining your viewership is secondary to the process of selling your views.
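
        As a toy illustration of that consumer-bucketing idea (the segments, keywords and history below are invented, not any real ad system):

            # Toy version of "recommendations follow your advertiser segment".

            watch_history = ["gun review", "truck mod", "knife sharpening"]

            segment_keywords = {
                "outdoors/firearms buyer": {"gun", "truck", "knife", "hunting"},
                "beauty shopper":          {"makeup", "skincare", "haul"},
            }

            def classify(history):
                """Pick the segment whose keywords overlap the watch history the most."""
                words = {w for title in history for w in title.split()}
                return max(segment_keywords, key=lambda s: len(segment_keywords[s] & words))

            print("sold to advertisers as:", classify(watch_history))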

        • lennivelkant@discuss.tchncs.de · 2 points · 5 days ago

          I said justification, not purpose. They claim they want to track usage to tailor your experience to you.

          They don’t actually believe that, of course, but respecting your explicit expression of interest ought to be the minimum perfunctory concession to that pretense. By this we can see just how thin a pretense it is.

      • prole@lemmy.blahaj.zone · 1 point · 7 days ago

        I feel like YouTube at least used to pretend that it was doing this.

        I can’t say for recently as I use a third party client these days and do not log in.

  • socialmedia@lemmy.world · 50 points (1 downvote) · 7 days ago (edited)

    I realized a while back that social media is trying to radicalize everyone, and it might not even be entirely the fault of the oligarchs that control it.

    The algorithm was written with one thing in mind: maximizing engagement time. The longer you stay on the page, the more ads you watch, the more money they make.

    This is pervasive, and even if educated adults tune it out, there are always children, who get Mr. Beast and thousands of others trying to trick them into liking, subscribing and following.

    This is something governments should be looking at how to control. Propaganda created for the sole purpose of making money is still propaganda. I think at this point that sites feeding content that use an algorithm to personalize feeds for each user are all compromised.
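
    A back-of-the-envelope version of that incentive (the ad rates and watch times are made-up numbers, just to show why the longer session always wins):

        # The only quantity being maximized is time on site, because time on
        # site is ad impressions. All figures are invented for illustration.

        def session_revenue(minutes_watched, ads_per_minute=0.5, revenue_per_ad=0.01):
            return minutes_watched * ads_per_minute * revenue_per_ad

        calm_feed    = session_revenue(minutes_watched=12)   # viewer gets bored and leaves
        outrage_feed = session_revenue(minutes_watched=45)   # viewer keeps scrolling

        print(f"calm feed:    ${calm_feed:.2f} per session")
        print(f"outrage feed: ${outrage_feed:.2f} per session")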

    • whoisearth@lemmy.ca · 13 points · 7 days ago

      The problem is education. It’s a fool’s game to try to control human nature, which commodifies everything; you will always have commercials and propaganda.

      What is within our means is to strengthen education on how to think critically and understand your environment. This is where we have failed, and I’ll argue there are people actively destroying this for their own gain.

      Educated people are dangerous people.

      It’s not 1984. It’s Brave New World. Aldous Huxley was right.

      • Dark Arc@social.packetloss.gg · 11 points · 6 days ago

        I think we need to do better than just say “get an education.”

        There are educated people that still vote for Trump. Making it sound like liberalism is some result of going to college is part of why so many colleges are under attack.

        From their perspective I get it: many of the Trump voters didn’t go, so when they hear that they just assume brainwashing.

        We need to find a way to teach people to sort out information, to put their immediate emotions on pause and search for information, etc., not just the kind of “education” where you regurgitate talking points from teachers, the TV, or the radio as if they’re matter of fact. And the whole education system is pretty tuned around regurgitation, even at the college level. A lot of the culture of exploration surrounding college (outside of the classroom) is likely more where the liberal viewpoints come from, and we’d be ill advised to assume the right can’t destroy that.

        • CircuitGuy@lemmy.world · 4 points · 6 days ago

          We need to find a way to teach people to sort out information, to put their immediate emotions on pause and search for information

          This entire comment and @whoisearth@lemmy.ca’s comments are so powerful.

          I think people have two modes of getting information: digging into a newspaper article to try to figure out what’s going on, and glancing at a lurid headline in the tabloid rack. Most people do both ends of the spectrum and a lot in between. Modern technology lends itself to serving tabloid-like content while we’re waiting in line for a minute. This is why TikTok is concerned about being removed from the app store, even though it’s easy to install the app yourself, easier than signing up for a newspaper delivery subscription ever was. TikTok is more like a lurid tabloid that most people would not go two steps out of their way to find, but might read while waiting in a slow line. I’m hopeful that people will learn to manage the new technology and not keep being influenced by tabloid entertainment.

          • whoisearth@lemmy.ca · 2 points · 6 days ago

            My GF is a student at the university of TikTok whereas I am very much about traditional media. It’s interesting seeing where both succeed and both fail.

            That said, you hit the nail on the head. A video on TikTok about some conspiracy bullshit is no different than seeing Bat Boy on the Weekly World News. The problem to me is not just the increased accessibility of this bullshit but the echo chamber around it that keeps you engaged.

            Trust but verify should be in everyone’s lexicon.

        • CascadianGiraffe@lemmy.world · 1 point · 5 days ago

          I don’t think college education is the source we should be looking at.

          Critical thinking skills need to be taught at a much, much earlier phase.

            • CascadianGiraffe@lemmy.world · 1 point · 5 days ago

              It’s not. But it’s harder the longer you wait. We’ve stopped teaching children because skills like this don’t fit into check boxes that can be marked off. It’s far easier just to train them to memorize test answers. We don’t TEACH anymore (likely because we don’t pay people to). It’s hard work, and the students outnumber the instructors so it’s no surprise we’re struggling. Parents don’t continue the education at home and expect the system to do all the work because they are wasting all of their energy at work.

              It’s going to take generations and decades to fix this and we won’t see the results in our lifetime. But we need to start doing things differently if we want things to be better for our descendants. We need to reform education AND create an environment where you don’t need both adults to work 40+ hours a week to support the family.

              • Dark Arc@social.packetloss.gg · 1 point · 5 days ago

                I don’t think this is an “anymore” problem, I don’t think it ever has been taught. The majority of people that voted for Trump were not young people fresh out of school.

    • ayyy@sh.itjust.works · 7 points · 6 days ago

      This discussion existed before computers. Before that it was TV and before that it was radio. The core problem is ads. They ruined the internet, TV, radio, the press. Probably stone tablets somehow. Fuck ads.

    • trashboat@midwest.social · 2 points · 7 days ago

      sites feeding content that use an algorithm to personalize feeds for each user are all compromised.

      Not arguing against this at all because you’re completely correct, but this feels like a key example of governments being too slow (and perhaps too out of touch?) to properly regulate tech. People clearly like having an algorithm, but algorithms in their current form are a great excuse for tech companies to throw their hands up in the air and claim no foul play because of how opaque they are. “It only shows you what you tell it you want to see!” is easy for them to say, but until consumers are given the right to know how exactly each one works, almost like nutrition facts on food packaging, we’ll never know whether they’re telling the truth. The ability of a tech company to have near unlimited control and no oversight over what millions of people are looking at day after day is clearly a major factor in what got us here in the first place.

      Not that there’s any hope for new consumer protections during this US administration or anything, but just something I had been thinking about for a while

  • Queen HawlSera@lemm.ee · 60 points · 7 days ago

    I hate the double standards

    On a true crime video: “This PDF-File game ended himself after he was caught SAing this individual… Sorry Youtube forces me to talk like that or I might get demonetized” Flagged for discussing Suicide

    On PragerU: “The Transgender Agenda is full of rapists and freaks who will sexually assault your children, they are pedophiles who must be dealt with via final solution!” Completely fucking acceptable!

  • HoMaster@lemm.ee · 42 points · 7 days ago

    Alt-right videos are made to elicit outrage, hate, and shock, which our lizard brains react to more strongly (because of potential danger) than positive videos spreading unity and love. It’s all about getting as many eyeballs on the video as possible to make money, and this is the most effective way to do it.

    • sudo@programming.dev · 4 points · 5 days ago

      There’s also an entire industry around mass producing this content and deliberately gaming the algorithm.

    • intensely_human@lemm.ee · 7 points (1 downvote) · 6 days ago

      So all this stuff about climate change being an existential threat is actually alt right?

      • Duamerthrax@lemmy.world · 16 points (1 downvote) · 6 days ago

        Are people making clickbait/ragebait articles about climate change? Are people seeking out clickbait about climate change?

        I don’t need to be constantly reminded of climate change, but an old “friend” is constantly telling me about the politics of video games he doesn’t even have a system to play with.

      • deus@lemmy.world · 12 points · 6 days ago (edited)

        All alt-right content is made to generate outrage, but content that generates outrage is not necessarily alt-right.

        • I Cast Fist@programming.dev · 7 points · 6 days ago

          Another important part of alt right bullshit is that they blame people that viewers can easily identify on the streets. Crime? It’s the immigrants and blacks! Shit economy? Jews and the deep state!

          So, I guess the only way to fight climate change is by accusing every petrol CEO of being a deep state Jew gay communist

          • MaggiWuerze@feddit.org · 2 points · 6 days ago

            I don’t think you meant it that way, but how are Jews ‘easily identifiable’ on the street?

            • I Cast Fist@programming.dev · 3 points · 6 days ago

              Ever seen that caricature of a Jew? The one with a huge nose and a grin, curly hair? That’s how the idiots picture all Jews. It doesn’t matter that it’s a racist/xenophobic stereotype, it has a “clear, recognizable face” of the enemy. It creates an image of “the enemy” in their mind

      • Saltycracker@lemmy.world · 1 point (1 downvote) · 6 days ago

        I feel like they have a hard time defining alt-right. If you type in “is drinking coffee alt-right” there is an article; same for playing video games, or driving cars.

  • danciestlobster@lemm.ee · 30 points (1 downvote) · 6 days ago

    I don’t think it makes me feel better to know that our descent into fascism is because gru promised 1MM rizz for it

  • Blackmist@feddit.uk · 36 points · 7 days ago

    From my anecdotal experiences, it’s “manly” videos that seem to lead directly to right wing nonsense.

    Watch something about how a trebuchet is the superior siege machine, and the next video recommended is like “how DEI DESTROYED Dragon Age Veilguard!”

    • Valmond@lemmy.world · 22 points · 7 days ago

      Or “how to make ANY woman OBEY you!”

      Check out a short about knife sharpening or just some cringe shit and you’re all polluted.

  • ragebutt@lemmy.dbzer0.com · 48 points (1 downvote) · 7 days ago (edited)

    Do these companies put their fingers on the scale? Almost certainly

    But it’s exactly what he said that brought us here. They have not particularly given a shit about politics (aside from no taxes and let me do whatever I want all the time). However, the algorithms will consistently reward engagement. Engagement doesn’t care about “good” or “bad”; it just cares about eyes on it, clicks, comments. And who wins that? Controversial bullshit. Joe Rogan getting Elon to smoke weed. Someone talking about trans people playing sports. Etc.

    This is a natural extension of human behavior. Behavior occurs because it serves a function: I do X to achieve reinforcement, whether that’s attention, access to something, escape, or automatic reinforcement.

    Attention maintained behaviors are tricky because people are shitty at removing attention and attention is a powerful reinforcer. You tell everyone involved “this person feeds off of your attention, ignore them”. Everyone agrees. The problematic person pulls their bullshit and then someone goes “stop it”. They call it negative reinforcement (this is not negative reinforcement. it’s probably positive reinforcement. It’s maybe positive punishment, arguably, because it’s questionable how aversive it is).

    You get people to finally shut up and they still make eye contact, or non verbal gestures, or whatever. Attention is attention is attention. The problematic person continues to be reinforced and the behavior stays. You finally get everyone to truly ignore it and then someone new enters the mix who doesn’t get what’s going on.

    This is the complexity behind all of this. This is the complexity behind “don’t feed the trolls”. You can teach every single person on Lemmy or Reddit or wherever to simply block a malicious user, but tomorrow a dozen or more new and naive people will register who will fuck it all up.

    The complexity behind the algorithms is similar. The algorithms aren’t people, but they work in a similar way. If bad behavior is given attention, the content is weighted and given more importance. The more we, as a society, can’t resist commenting on, clicking, and sharing Trump, Rogan, Peterson, transphobic, misogynist, racist, homophobic, etc. content, the more the algorithms will weight this as “meaningful”.
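
    A rough sketch of that weighting loop (purely illustrative; the items and the boost factor are made up). The system can’t tell an angry comment from a happy one, it only sees interaction:

        # Every interaction, positive or negative, bumps the item's weight,
        # so the rage bait that provokes the most replies gets promoted hardest.

        weights = {"duck video": 1.0, "rage bait": 1.0}

        interactions = ["rage bait", "rage bait", "duck video", "rage bait"]  # hate-comments count too

        for item in interactions:
            weights[item] *= 1.2   # any engagement boosts reach

        print(weights)  # rage bait ends up weighted highest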

    This of course doesn’t mean these companies are without fault. This is where content moderation comes into play. This is where the many studies that found social media leads to higher irritability, more passive-aggressive behavior and lower empathy could potentially have led us to regulate these monsters into doing something to protect their users against the negative effects of their products.

    If we survive and move forward, in 100 years social media will likely be seen the way we look at tobacco now: an absolutely dangerous thing that it was absurd to allow to exist in a completely unregulated state with zero transparency as to its inner workings.

  • Yerbouti@sh.itjust.works · 7 points · 5 days ago

    There’s a Firefox extension to hide Shorts and another to default to your subscriptions page. Along with uBlock, those are the only things that make YouTube usable.

    • Ashelyn@lemmy.blahaj.zone · 2 points · 5 days ago

      That doesn’t fix the out-of-the-box experience of the platform for millions, if not billions of people. Yes, it’s a good step to take individually, but it’s insufficient to deal with the broader issue raised here of latent alt-right propagandizing.

      • x00z@lemmy.world · 1 point · 7 days ago

        Commenting on stuff definitely strengthens it, but I wouldn’t know if a shadow ban changes that. I don’t think there’s much difference if you are shadowbanned or not, you’re still interacting with the content.

          • x00z@lemmy.world · 6 points · 7 days ago

            That’s not what a shadowban is. A shadow ban is where the user does not know they are banned. These search terms were very obviously censorship and not a shadowban.

            If it were a shadowban then you would still get results and be able to interact with it. But some results might have been hidden and your interactions would be hidden to others too. A shadowban is meant to make you believe you were not censored.

  • doortodeath@lemmy.world · 19 points · 7 days ago

    I don’t know if any of you still look at memes on 9gag. It once felt like a relatively neutral place, but the site has slowly pushed right-wing content over the last few years and is now infested with alt-right and even blatantly racist “memes” and comment sections. Feels to me like astroturfing on the site to push viewers and posters in some political direction. As an example: during the US election, the war on Palestine all of a sudden became a recurring theme, depicting the Biden admin and Jews as “bad actors” and calling for Trump; after the election it became a flood of content about how Muslims are bad people and we shouldn’t intervene in Palestine…

    • blubfisch@discuss.tchncs.de · 2 points · 7 days ago

      From what I heard, the site was astroturfing long before it took a right turn. But my only sources are online rumors…

  • GhostlyPixel@lemmy.world · 21 points · 7 days ago (edited)

    The view farming in shorts makes it even harder to avoid as well. Sure, I can block the JRE channel, for example, but that doesn’t stop me from getting JRE clips from probably day-old accounts which just have some shitty music thrown on top. If you can somehow block those channels, there’s new ones the next day, ad infinitum.

    It’s too bad you can’t just disable the tab entirely, I feel like I get sucked in more than I should. I’ve tried browser extensions on mobile which remove the tab, but I haven’t had much luck with PiPing videos from the mobile website, so I can’t fully stop the app.

  • shalafi@lemmy.world · 26 points (1 downvote) · 7 days ago

    I’ll get downvoted for this, with no explanation, because it’s happened here and on reddit.

    I’m a liberal gun nut. Most of my limited YouTube is watching gun-related news and such. You would think I’d be overrun with right-wing bullshit, but I am not. I have no idea why this is. Can anyone explain? Maybe because I stick to the non-political, mainstream guntubers?

    The only thing I’ve seen start to push me to the right was watching survival videos. Not some, “dems gonna kill us all” bullshit, simply normal, factual stuff about how to survive without society. That got weird fast.

    • JackbyDev@programming.dev · 11 points · 7 days ago

      Their algorithms are probably good enough to know you’re interested in guns but not right wing stuff. Simple as that.

    • Ilovethebomb@lemm.ee · 4 points · 7 days ago

      I’ve noticed most firearms channels steer well clear of politics, unless it’s directly related to the topic at hand, I think partly to appeal to an international audience.

      I do think the algorithm puts firearms and politics into very separate categories, someone watching Forgotten Weapons probably isn’t going to be interested in political content.

    • sugar_in_your_tea@sh.itjust.works · 7 points (7 downvotes) · 7 days ago (edited)

      Yeah, I don’t think I’ve ever seen alt-right nonsense without actively looking for it. Occasionally I’ll get recommended some Joe Rogan or Ben Shapiro nonsense, but that’s about it.

      I consider myself libertarian and a lot of my watch time is on Mental Outlaw (cyber security and dark web stuff), Reason (love Remy and Andrew Heaton videos), and John Stossel, but other than that, I largely avoid political channels. I watch a fair amount of gun content as well.

      If I get recommended political stuff, it’s usually pretty mainstream news entertainment, like CNN or Fox News. Even the crypto nonsense is pretty rare, even though I’m pretty crypto-positive (not interested in speculation though, only its use as a currency and the technical details).

      If you’re seeing alt-right crap, it’s probably because you’ve watched a lot of other alt-right crap.

      • gdog05@lemmy.world · 13 points · 7 days ago

        I have had the opposite experience. I watch a few left-leaning commentary channels. Sam Seder, my boy Jesse Dollomore. If I watch a single video about guns (with no apparent ideological divide), within a single refresh I’m getting Shapiro and Jordan Peterson videos. I’m in a red Western state. My subscriptions are mostly mental health, tech, and woodworking. I have to delete history if I stray even a little bit.

        • sugar_in_your_tea@sh.itjust.works · 2 points (1 downvote) · 7 days ago

          I’ve watched some Sam Seder and related as well, mostly if I follow a link from Lemmy or something, but they don’t get recommended unless I watch a bunch. I’m more likely to see Bill Maher or something else more mainstream from the left than a smaller podcaster like Seder. I’d say I see Bill Maher about as much as Jordan Peterson, and I almost never watch either.

          I’m in a red state too. My voting district is also one of the more conservative in the state (70+% GOP according to voting stats), though I work in one of the more liberal areas.

      • Captain Aggravated@sh.itjust.works · 6 points · 7 days ago

        My watch history would peg me as NOT a Republican. YouTube’s Shorts feed will serve me:

        • excerpt from youtuber’s longer video
        • tiktok repost from like, the truck astrology guy or “rate yer hack, here we go” guy, etc
        • Artificial voice reading something scraped from Reddit with Sewer Jump or Minecraft playing in the background
        • Chris Boden
        • Clip from The West Wing
        • Clip from Top Gear or Jeremy Clarkson’s Farm
        • “And that’s why the Bible tells us that Jesus wants you to hate filthy fucking liberals.”

        “Do not recommend channel.” “The downvote button doesn’t even seem to be a button anymore but I clicked it anyway.” “Report video for misinformation and/or supporting terrorism.” But the algorithm keeps churning it up.

        • AngryRobot@lemmy.world · 5 points · 7 days ago

          Guy you replied to is trying to pretend his individual experience is representative of the whole.

          • Captain Aggravated@sh.itjust.works · 2 points · 7 days ago

            I’m not sure there is a “representative of the whole” here; I think the Youtube algorithm is modal.

            I think it’s an evolution of the old spam bots, like if you had an email address that in any way indicated you were male you’d get “v1agra” and “c1alis” ads nonstop, I’m sure you’d get makeup and breast enlargement spam or some shit in a woman’s inbox, whatever they can make you feel insecure enough to buy.

        • sugar_in_your_tea@sh.itjust.works · 1 point (1 downvote) · 7 days ago

          truck astrology guy

          Huh, never seen that, but of course that exists. I watched part of one and it was as cringy as I thought it would be.

          From that list, only one is anything close to “alt right” (the last one). I’m guessing a lot of people that like truck astrology or Top Gear also watch alt-right crap, so whatever is causing you to be recommended those videos is probably also leading to that last one.

          I don’t think YouTube’s algorithm looks at content; it more likely looks at what other people who watched similar videos to you also watched.
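
          Something like item co-occurrence (“people who watched X also watched Y”) would explain it. A minimal sketch with invented watch histories, not YouTube’s actual pipeline:

              # Recommend by co-watch counts, never looking at the video's content.
              from collections import Counter
              from itertools import combinations

              histories = [
                  {"trebuchet facts", "knife sharpening", "anti-DEI rant"},
                  {"trebuchet facts", "anti-DEI rant"},
                  {"knife sharpening", "piano covers"},
              ]

              co_watch = Counter()
              for h in histories:
                  for a, b in combinations(sorted(h), 2):
                      co_watch[(a, b)] += 1

              # After watching "trebuchet facts", the strongest co-watch pair wins:
              seed = "trebuchet facts"
              related = {pair: n for pair, n in co_watch.items() if seed in pair}
              print(max(related, key=related.get))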

        • sugar_in_your_tea@sh.itjust.works · 1 point · 7 days ago

          Perhaps. I’m in a very red part of a very red state, so following that logic, my feed would be filled with that crap.

          I mostly get tech videos because I mostly watch tech videos, and if they mention politics, they tend to be on the left end of the spectrum, because tech people lean left.

          • droporain@lemmynsfw.com · 1 point · 7 days ago

            I don’t work with tech people. Tech people are smart. They know how to prevent cross-contamination of social media. Lol, tech people also know how to curate opinions to suit the situation. My experience has been that tech people are privately very conservative. Kinda how everyone was shocked when they found out what a piece of shit Musk was. But what do I know; I hope I’m wrong, but I also made a lot of money betting Trump would win this election.

            • sugar_in_your_tea@sh.itjust.works · 1 point · 7 days ago

              Tech people are smart

              I don’t think that’s true, or at least I don’t think tech people are smarter on average. There are a lot of “blue collar” people in tech, by which I mean they learned a skill and apply it according to orders.

              I don’t know what you consider “smart,” but I recommend talking about serious issues with a tech person and someone working a skilled blue collar job (e.g. mining engineer, metal fabrication, etc), and I bet you’ll have a similar experience. Some of my favorite people to talk to as a kid worked in construction or something, because they had a very practical form of intelligence that really resonated with me, instead of the airy BS I got from financial or tech people.

              People say tech people are smart, but as someone who works in tech, I don’t buy it. I think tech people are just like anyone else, they just have an aptitude for coding.

              • droporain@lemmynsfw.com · 1 point · 7 days ago

                Sure pal, whatever you say. I have a feeling them blue collar boys have a different opinion, but they know you’re soft. Which is why you feel like the financial and tech people bully you, cuz.

                • sugar_in_your_tea@sh.itjust.works · 1 point · 7 days ago

                  Where did you get bullying from?

                  I’m just saying people in “smart” fields think they’re smarter than they are, and other people don’t realize how smart they are. At the end of the day, I think most people who take time to excel in some craft are probably about the same level of intelligence, whether that’s writing code, fixing machines, or trading securities.

                  I do well in my field (I’m a sr. software engineer and lead a team), and I’m passionate about finance (can speak confidently with most finance types), but I’m no smarter than my mechanic. We just picked different fields where we juggle different balls in our 9-5.