She’s almost 70, spends all day watching QAnon-style videos (but in Spanish), and every day she’s anguished about something new. Last week she was asking us to start digging a nuclear shelter because Russia had dropped a nuclear bomb on Ukraine. Before that she was begging us to install reinforced doors because the indigenous population was about to invade the cities and kill everyone with poisoned arrows. I have access to her YouTube account and I’m trying to unsubscribe and report the videos, but the recommended videos keep feeding her more crazy shit.

  • Ozymati@lemmy.nz
    link
    fedilink
    arrow-up
    98
    ·
    1 year ago

    Log in as her on your device. Delete the history, turn off ad personalisation, unsubscribe from and block dodgy stuff, like and subscribe to healthier things, and, this is the important part, keep coming back regularly to tell YouTube you don’t like any suggested videos that are down the QAnon path and to remove dodgy watched videos from her history.

    Also, subscribe to and interact with things she’ll like - cute pets, crafts, knitting, whatever she’s likely to watch more of. You can’t just block and report, you’ve gotta retrain the algorithm.
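
    If you want to script part of that cleanup instead of clicking through the UI, the subscription and rating side of it can be automated with the YouTube Data API. A minimal sketch of getting authorized access, assuming google-api-python-client and google-auth-oauthlib are installed and you’ve created an OAuth client in the Google Cloud console (the client_secret.json file name and the scope here are just the usual defaults, not anything specific to her account):

    ```python
    # Rough sketch: authorize against her Google account so later scripts can
    # manage subscriptions and rate videos. client_secret.json is a placeholder
    # path to the OAuth client file downloaded from the Google Cloud console.
    from google_auth_oauthlib.flow import InstalledAppFlow
    from googleapiclient.discovery import build

    # youtube.force-ssl covers managing subscriptions and rating videos.
    SCOPES = ["https://www.googleapis.com/auth/youtube.force-ssl"]

    def get_youtube_client():
        flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
        creds = flow.run_local_server(port=0)  # opens a browser; log in as her
        return build("youtube", "v3", credentials=creds)

    if __name__ == "__main__":
        youtube = get_youtube_client()
        # Sanity check: print the channel name the credentials belong to.
        me = youtube.channels().list(part="snippet", mine=True).execute()
        print(me["items"][0]["snippet"]["title"])
    ```

    Note that watch history, “Not interested” feedback, and ad personalisation aren’t exposed through this API, so those parts stay manual.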

    • sergih123@lemmy.world
      link
      fedilink
      arrow-up
      18
      ·
      1 year ago

      Yeah, when you go on the feed, make sure to click the three dots on every recommended video and choose “Don’t show content like this” and also “Block channel”, because chances are, if they uploaded one of these stupid videos, their whole channel is full of them.

    • lingh0e@lemmy.world
      link
      fedilink
      arrow-up
      9
      ·
      1 year ago

      Would it help to start liking/subscribing to videos that specifically debunk those kinds of conspiracy videos? Or, at the very least, demonstrate rational concepts and critical thinking?

      • RGB3x3@lemmy.world
        link
        fedilink
        arrow-up
        14
        ·
        1 year ago

        Probably not. This is an almost-70-year-old who doesn’t seem to really think rationally in the first place. She’s easily convinced by emotional misinformation.

        Probably just best to occupy her with harmless entertainment.

      • driving_crooner@lemmy.eco.brOP
        link
        fedilink
        arrow-up
        5
        ·
        1 year ago

        We recommended her a YouTube channel about linguistics and she didn’t like it because the PhD in linguistics was saying that it’s okay for language to change. Unfortunately, there comes a time when people just want to see what already confirms their worldview, and anything that challenges that is taken as an offense.

  • Malcriada Lala@lemmy.world
    link
    fedilink
    arrow-up
    69
    ·
    1 year ago

    In addition to everything everyone here said, I want to add this: don’t underestimate the value of adding new benign topics to her feed. Does she like cooking, gardening, DIY, art content? Find a playlist from a creator and let it autoplay. The algorithm will pick it up and start to recommend that creator and others like it. You just need to “confuse” the algorithm so it starts to cater to different interests. I wish there was a way to block or mute entire subjects on there. We need to protect our parents from this mess.

  • fennec@feddit.de
    link
    fedilink
    arrow-up
    63
    ·
    1 year ago

    At this point I would set up a new account for her - I’ve found Youtube’s algorithm to be very… persistent.

      • ubergeek77@lemmy.ubergeek77.chat
        link
        fedilink
        arrow-up
        42
        ·
        edit-2
        1 year ago

        You can make “brand accounts” on YouTube that are a completely different profile from the default account. She probably won’t notice if you make one and switch her to it.

        You’ll probably want to spend some time using it for yourself secretly to curate the kind of non-radical content she’ll want to see, and also set an identical profile picture on it so she doesn’t notice. I would spend at least a week “breaking it in.”

        But once you’ve done that, you can probably switch to the brand account without logging her out of her Google account.

        • AvoidMyRage@lemmy.world
          link
          fedilink
          arrow-up
          44
          ·
          1 year ago

          I love how we now have to monitor the content watched by the generation that told us “Don’t believe everything you see on the internet,” like we would for children.

          • ubergeek77@lemmy.ubergeek77.chat
            link
            fedilink
            arrow-up
            24
            ·
            edit-2
            1 year ago

            We can thank all that tetraethyllead gas that was pumping lead into the air from the 20s to the 70s. Everyone got a nice healthy dose of lead while they were young. Made 'em stupid.

            OP’s mom breathed nearly 20 years’ worth of lead-polluted air straight from birth, and OP’s grandmother had been breathing it for 33 years by the time OP’s mom was born. Probably not great for early development.

        • _finger_@lemmy.world
          link
          fedilink
          arrow-up
          9
          ·
          1 year ago

          She’s going to seek this stuff out and the algorithm will keep feeding her. This isn’t just a YouTube problem, this is also a mom problem.

      • Historical_General@lemmy.world
        link
        fedilink
        arrow-up
        9
        arrow-down
        12
        ·
        1 year ago

        Delete watch history, find and watch nice channels and her other interests, log in to the account on a spare browser on your own phone periodically to make sure there’s no repeat of what happened.

  • zombuey@lemmy.world
    link
    fedilink
    arrow-up
    62
    ·
    1 year ago

    YouTube has a delete option that will wipe the recorded viewing history. Then just watch a couple of videos and subscribe to some healthy stuff.

  • fuser@quex.cc
    link
    fedilink
    arrow-up
    57
    arrow-down
    2
    ·
    1 year ago

    the damage that corporate social media has inflicted on our social fabric and political discourse is beyond anything we could have imagined.

    • zeppo@lemmy.world
      link
      fedilink
      arrow-up
      5
      arrow-down
      2
      ·
      1 year ago

      This is true, but one could say the same about talk radio or television.

      • theragu40@lemmy.world
        link
        fedilink
        arrow-up
        19
        ·
        1 year ago

        Talk radio or television broadcasts the same stuff to everyone. It’s damaging, absolutely. But social media literally tailors the shit to be exactly what will force someone farther down the rabbit hole. It’s actively, aggressively damaging and sends people on a downward spiral way faster while preventing them from encountering diverse viewpoints.

        • zeppo@lemmy.world
          link
          fedilink
          arrow-up
          10
          ·
          1 year ago

          I agree it’s worse, but I was just thinking how there are regions where people play ONLY Fox on every public television, and if you turn on the radio it’s exclusively a right-wing propagandist ranting to tell you Democrats are taking all your money to give it to black people on welfare.

            • zeppo@lemmy.world
              link
              fedilink
              arrow-up
              8
              ·
              1 year ago

              Uh, no. For one, Republicans spend like fuck, they just also cut taxes so they end up running up the deficit. Social welfare programs account for a tiny fraction of government budgets. The vast majority is the military and interest payments on debt.

      • fuser@quex.cc
        link
        fedilink
        arrow-up
        4
        ·
        1 year ago

        Yes, I agree - there have always been malevolent forces at work within the media - but before Facebook started algorithmically whipping up old folks for clicks, cable TV news wasn’t quite as savage. The early days of hate-talk radio were really just Limbaugh ranting into the AM ether. Now it’s saturated. Social media isn’t the root cause of political hatred, but it gave it a bullhorn and a leg up to apparent legitimacy.

        • zeppo@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          1 year ago

          Social media is more extreme, but we can’t discount the damage Fox and people like Limbaugh or Michael Savage did.

    • bouh@lemmy.world
      link
      fedilink
      arrow-up
      3
      arrow-down
      2
      ·
      1 year ago

      Populism and racism are as old as societies. Ancient Greece already had them. Rome fell to them. Christianity was born out of them.

      Funnily enough, people have always complained about how bad their society had become because of some new thing - already five thousand years ago, probably even earlier.

      Which is not to say we shouldn’t do anything about it. We definitely should. But common sense won’t save us unfortunately.

  • Usernameblankface@lemmy.world
    link
    fedilink
    arrow-up
    55
    arrow-down
    1
    ·
    1 year ago

    As you plan on messing with her feed, I’d like to warn you that a sudden change in her recommendations could seem to her like the whole internet got censored and she can’t see the truth anymore. She would be cut off from a sense of community and a sense of having special inside knowledge, and that may make things worse rather than better.

    My non-professional prediction is that she would get bored with nothing to worry about and start actively seeking out bad news to worry over.

    • dox@lemmy.world
      link
      fedilink
      arrow-up
      24
      ·
      1 year ago

      I’d like to warn you that a sudden change in her recommendations could seem to her like the whole internet got censored and she can’t see the truth anymore.

      This is exactly the response my neighbors have: “Things get censored, so that must mean the government is hiding this info.” It’s truly insane to see it happen.

      • Jimmycrackcrack@lemmy.ml
        link
        fedilink
        arrow-up
        4
        ·
        1 year ago

        Well, yeah, but the concern is unintended consequences, which sound entirely likely. It’s kind of fucked up to even be considering doing this to an adult who is entirely entitled to their own choice of viewing habits, without their knowledge and by surreptitious use of their account. It’s only dubiously ethical because it’s an act of kindness set against machine-generated manipulation that is far more insidious and done for far less than altruistic reasons.

        You’d hardly want it to backfire after taking this step. By posing this question the OP obviously already considers the pseudo-cult his mum is getting sucked into to be a bad thing, so that doesn’t need further signalling, but doing it carefully, with an eye on the actual effect, is probably wise, or the whole endeavour would be a waste and might push her further into the waiting arms of lunatics and charlatans.

      • imPastaSyndrome@lemm.ee
        link
        fedilink
        arrow-up
        1
        ·
        edit-2
        1 year ago

        Cold turkey isn’t the best solution if you truly think that’s where they’re at, for exactly the reason they described. How do you not get that?

      • Usernameblankface@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        1 year ago

        I agree that a cult mindset or psychosis is what she has going on. But suddenly removing the source from her computer without her knowledge could backfire because she’s already so invested.

  • 001100 010010@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    50
    ·
    1 year ago

    I’m a bit disturbed by how people’s beliefs are literally shaped by an algorithm. Now I’m scared to watch YouTube because I might be inadvertently watching propaganda.

      • Mikina@programming.dev
        link
        fedilink
        arrow-up
        10
        ·
        edit-2
        1 year ago

        It’s even worse than “a lot easier”. Ever since the advances in ML went public, with things like Midjourney and ChatGPT, I’ve realized that ML models are way, way better at doing their thing than I thought.

        Midjourney’s purpose is to receive text and give out a picture, and it’s really good at that, even though the dataset wasn’t really that large. Same with ChatGPT.

        Now, Meta has (EDIT: just speculation, but I’m 95% sure they do) a model which receives all the data they have about a user (which is A LOT) and returns which posts to show him and in what order, to maximize his time on Facebook. And it was trained for years on a live dataset of 3 billion people interacting with the site daily. That’s a wet dream for any ML model. Imagine what it would be capable of even if it were only as good at its task as ChatGPT is at its own - and with an incomparably better dataset and learning opportunities.

        I’m really worried about the future in this regard, because it’s only a matter of time before someone with power decides that the model should not only keep people on the platform, but also make them vote for X. And there is nothing you can do to defend against it, other than never interacting with anything that has curated content, such as Google search, YT, or anything Meta - because even if you know there’s a model trying to manipulate you, the model knows there are a lot of people like that, and it’s already learning how to manipulate even them. After all, it has 3 billion test subjects.

        That’s why I’m extremely focused on privacy and my data - not that I have something to hide, but I take a really, really great issue with someone using such data to train models like that.

        • Cheers@sh.itjust.works
          link
          fedilink
          arrow-up
          2
          ·
          1 year ago

          Just to let you know, Meta has an open-source model, LLaMA, and it’s basically state of the art for the open-source community, but it falls short of GPT-4.

          The nice thing about the LLaMA branches (Vicuna and WizardLM) is that you can run them locally at roughly 80% of ChatGPT-3.5’s quality, so no one is tracking your searches/conversations.
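
          For anyone wondering what “run them locally” looks like in practice, here’s a minimal sketch using the llama-cpp-python bindings. The model file name is a placeholder for whatever quantized Vicuna/WizardLM build you’ve actually downloaded, so treat the specifics as assumptions rather than a recipe:

          ```python
          # Minimal local-inference sketch with llama-cpp-python
          # (pip install llama-cpp-python). The model path is hypothetical -
          # point it at the quantized model file you downloaded.
          from llama_cpp import Llama

          llm = Llama(model_path="./models/vicuna-7b.Q4_K_M.gguf", n_ctx=2048)

          prompt = "Q: Why do recommendation feeds tend to push sensational content? A:"
          result = llm(prompt, max_tokens=128, stop=["Q:"])

          # Everything runs on your own machine; nothing is sent to a remote service.
          print(result["choices"][0]["text"].strip())
          ```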

          • Mikina@programming.dev
            link
            fedilink
            arrow-up
            1
            ·
            1 year ago

            I was using ChatGPT only as an example - I don’t think that making a chatbot AI is their focus, so it’s understandable that they’re not as good at it. Plus, I’d guess that producing coherent text is a lot harder than deciding what kind of videos or posts to put in someone’s feed.

            And that AI - the one that takes a user’s data as input and outputs what to show in his feed to keep him glued to Facebook for as long as possible - I’m almost sure is one of the best ML models in the world right now, simply because of the user base, the time it has had to learn, and the sheer amount of data Meta has about its users. But that’s also something that will never be made public, naturally.

    • Mikina@programming.dev
      link
      fedilink
      English
      arrow-up
      8
      ·
      edit-2
      1 year ago

      My personal opinion is that it’s one of the first large cases of misalignment in ML models. I’m 90% certain that Google and other platforms have for years been using ML models that take a user’s history and the data they have about him as input, and produce the videos to offer him as output, with the goal of maximizing the time he spends watching videos (or on Facebook, etc.).

      And the models eventually found out that if you radicalize someone, isolate them in a conspiracy that makes them an outsider or a nutjob, and then provide a safe space and an echo chamber on the platform, be it Facebook or YouTube, they will eventually start spending most of their time there.

      I think this subject was touched upon in The Social Dilemma movie, but given what is happening in the world and how conspiracies and disinformation seem to be getting more and more common and people more radicalized, I’m almost certain that the algorithms are to blame.
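
      To make that concrete, here’s a toy sketch (with completely invented numbers, not anything from a real platform) of how a recommender that only maximizes expected watch time drifts toward whichever content category happens to keep people watching longest, with no notion of whether that content is harmful:

      ```python
      # Toy illustration of reward misalignment: an epsilon-greedy bandit that
      # only maximizes watch time. Per-category watch times are invented.
      import random

      # Hypothetical average minutes watched per recommendation, per category.
      TRUE_WATCH_TIME = {"cooking": 4.0, "news": 5.0, "conspiracy": 9.0}

      estimates = {c: 0.0 for c in TRUE_WATCH_TIME}
      counts = {c: 0 for c in TRUE_WATCH_TIME}

      def recommend(epsilon=0.1):
          if random.random() < epsilon:
              return random.choice(list(TRUE_WATCH_TIME))  # explore
          return max(estimates, key=estimates.get)         # exploit

      for _ in range(10_000):
          category = recommend()
          watch_time = random.gauss(TRUE_WATCH_TIME[category], 1.0)  # simulated user
          counts[category] += 1
          # Incremental mean update of the estimated reward per category.
          estimates[category] += (watch_time - estimates[category]) / counts[category]

      # The reward never mentions truth or wellbeing, so the stickiest
      # category ends up dominating the recommendations.
      print(counts)
      ```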

      • Ludrol@szmer.info
        link
        fedilink
        English
        arrow-up
        4
        ·
        1 year ago

        If YouTube’s “algorithm” is optimizing for watch time, then the most optimal solution is to make people addicted to YouTube.

        The scariest thing, I think, is that the way to optimize the reward is not to recommend a good video but to reprogram a human to watch as much as possible.

        • Mikina@programming.dev
          link
          fedilink
          English
          arrow-up
          4
          ·
          1 year ago

          I think that making someone addicted to YouTube would be harder than simply, slowly radicalizing them into a shunned echo chamber around a conspiracy theory. If you try to make someone addicted to YouTube, they still have an alternative in the real world - friends and family to return to.

          But if you radicalize them into something that makes them seem like a nutjob, you don’t have to compete with their surroundings - the only place where they are understood is on YouTube.

      • archomrade [he/him]@midwest.social
        link
        fedilink
        English
        arrow-up
        3
        ·
        1 year ago

        100% they’re using ML, and 100% it found a strategy they didn’t anticipate

        The scariest part of it, though, is their willingness to continue using it despite the obvious consequences.

        I think misalignment is not only likely to happen (for an eventual AGI), but likely to be embraced by the entities deploying it, because the consequences may not impact them. Misalignment is relative.

      • MonkCanatella@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        1
        ·
        1 year ago

        Fuck, this is dark and almost awesome, but not in a good way. I was thinking the fascist funnel was something of a deliberate thing, but maybe these engagement algorithms have more to do with it than large shadow actors putting the funnels into place. Then there’s the folks who will create any sort of content to game the algorithm, and you’ve got a perfect trifecta of radicalization.

        • floofloof@lemmy.ca
          link
          fedilink
          English
          arrow-up
          5
          ·
          edit-2
          1 year ago

          Fascist movements and cult leaders long ago figured out the secret to engagement: keep people feeling threatened, play on their insecurities, blame others for all the problems in people’s lives, use fear and hatred to cut them off from people outside the movement, make them feel like they have found a bunch of new friends, etc. Machine learning systems for optimizing engagement are dealing with the same human psychology, so they discover the same tricks to maximize engagement. Naturally, this leads to YouTube recommendations directing users towards fascist and cult content.

          • MonkCanatella@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            2
            ·
            1 year ago

            That’s interesting - that it’s almost a coincidence that fascists and engagement algorithms use similar methods to suck people in.

    • niktemadur@kbin.social
      link
      fedilink
      arrow-up
      3
      ·
      1 year ago

      You watch this one thing out of curiosity, morbid curiosity, or by accident, and at the slightest poke the goddamned mindless algorithm starts throwing this shit at you.

      The algorithm is “weaponized” in favor of whoever screams the loudest, and I truly believe it started due to myopic incompetence/greed, not political malice. Which doesn’t make it any better, as people don’t know how to protect themselves from this bombardment, but the corporations like to pretend that ~~they~~ people can, so they wash their hands of it for as long as they are able.

      Then on top of this, the algorithm has been further weaponized by even more malicious actors who have figured out how to game the system. That’s how toxic meatheads like Infowars and Joe Rogan get a huge bullhorn that reaches millions. “Huh… DMT experiences… sounds interesting”, the format is entertaining… and before you know it, you are listening to anti-vax and QAnon excrement, and your mind starts to normalize the most outlandish things.

      EDIT: a word, for clarity

      • Jaywarbs@kbin.social
        link
        fedilink
        arrow-up
        3
        ·
        1 year ago

        Whenever I end up watching something from a bad channel I always delete it from my watch history, in case that affects my front page too.

        • emptyother@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          1 year ago

          Huh, I tried that. Still got recommended incel videos for months after watching a moron “discuss” the Captain Marvel movie. Eventually I went through and clicked “don’t recommend this” on anything that showed up on my front page; that helped.

        • Sludgehammer@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          edit-2
          1 year ago

          I do that, too.

          However, I’m convinced that YouTube still has a “suggestion list” bound to IP addresses. Quite often I’ll have videos that other people in my household have watched suggested to me. Some of it can be explained by similar interests, but it happens suspiciously often.

          • Drunemeton@lemmy.world
            link
            fedilink
            arrow-up
            4
            ·
            1 year ago

            I can confirm the IP-based suggestions!

            My hubs and I watch very different things. Him: photography equipment reviews, photography how-tos, and old, OLD movies. Me: Pathfinder 2e, quantum field theory/mechanics, and Dip Your Car.

            Yet we each see videos the other recently watched show up in our own suggestions. Based on my watch history, there’s ZERO chance that YT would think I’m interested in watching a Hasselblad DX2 unboxing without IP-based suggestions. Same with him getting PBS Space Time suggestions.

    • static@kbin.social
      link
      fedilink
      arrow-up
      2
      ·
      1 year ago

      My normal YT algorithm was OK, but Shorts kept trying to pull me to the alt-right. I had to block many channels to get a sane Shorts algorithm.

      “Do not recommend channel” really helps

      • AstralPath@lemmy.ca
        link
        fedilink
        arrow-up
        2
        ·
        1 year ago

        It really does help. I’ve been heavily policing my YouTube feed for years and I can easily see when they make big changes to the algorithm, because it tries to force-feed me polarizing or lowest-common-denominator content. Shorts are incredibly quick to smother me in rage bait, and if you so much as linger on one of those videos too long, you get a cascade of alt-right bullshit shortly after.

      • Andreas@feddit.dk
        link
        fedilink
        arrow-up
        2
        ·
        1 year ago

        Using Piped/Invidious/NewPipe/insert your preferred alternative frontend or patched client here (YouTube’s legal threats are empty; these are still operational) helps even more by showing you only the content you have opted in to.

    • DaGuys470@kbin.social
      link
      fedilink
      arrow-up
      1
      ·
      1 year ago

      Just this week I stumbled across a new YT channel that seemed to talk about some really interesting science. I almost subscribed, but something seemed fishy. Went to the channel and saw the other videos, and immediately got the hell out. Conspiracies and propaganda lurk everywhere and no one is safe. Mind you, I’m about to get my bachelor’s degree next year, meaning I’ve received a proper scientific education. Yet I almost fell for it.

    • nLuLukna @sh.itjust.works
      link
      fedilink
      English
      arrow-up
      1
      ·
      1 year ago

      Reason and critical thinking are all the more important in this day and age. They’re just no longer taught in schools. Learn some simple key skills, like noticing fallacies or analogous reasoning, and you will find that your view of life is far more grounded and harder to shift.

      • cynar@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        1 year ago

        Just be aware that we can ALL be manipulated, the only difference is the method. Right now, most manipulation is on a large scale. This means they focus on what works best for the masses. Unfortunately, modern advances in AI mean that automating custom manipulation is getting a lot easier. That brings us back into the firing line.

        I’m personally an Aspie with a scientific background. This makes me fairly immune to a lot of manipulation tactics in widespread use. My mind doesn’t react how they expect, so they don’t achieve the intended result. I do know, however, that my own pressure points are likely particularly vulnerable; I’ve not had practice resisting having them pressed.

        A solid grounding gives you a good reference, but no more. As individuals, it is down to us to use that reference to resist undue manipulation.

      • Dark Arc@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        1 year ago

        I think it’s worth pointing out “no longer” is not a fair assessment since this is regularly an issue with older Americans.

        I’m inclined to believe it was never taught in schools, and it’s probably a subject teachers are increasingly likely to want to teach (i.e., if politics didn’t enter the classroom it would already be being taught, and it might be in some districts).

        The older generations were given catered news their entire lives, only in the last few decades have they had to face a ton of potentially insidious information. The younger generations have had to grow up with it.

        A good example is that old people regularly click malicious advertising, fall for scams, etc.; they’re generally not good at applying critical thinking to a computer, whereas younger people (typically, though I hear this is regressing some with smartphones) know about this stuff and are used to validating their information (or at least have a better “feel” for what’s fishy).

      • MonkCanatella@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        1
        ·
        1 year ago

        Imagine if they taught critical media literacy in schools. Of course, that would only be critical media literacy with an American propaganda backdoor, but still.

    • masquenox@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      edit-2
      1 year ago

      I have to clear out my youtube recommendations about once a week… no matter how many times I take out or report all the right-wing garbage, you can bet everything that by the end of the week there will be a Jordan Peterson or PragerU video in there. How are people who aren’t savvy to the right-wing’s little “culture war” supposed to navigate this?

    • Thorny_Thicket@sopuli.xyz
      link
      fedilink
      English
      arrow-up
      0
      ·
      1 year ago

      I find it interesting how some people have so vastly different experience with YouTube than me. I watch a ton of videos there, literally hours every single day and basically all my recommendations are about stuff I’m interested in. I even watch occasional political videos, gun videos and police bodycam videos but it’s still not trying to force any radical stuff down my throat. Not even when I click that button which asks if I want to see content outside my typical feed.

      • Andreas@feddit.dk
        link
        fedilink
        English
        arrow-up
        2
        ·
        1 year ago

        I watch a ton of videos there, literally hours every single day and basically all my recommendations are about stuff I’m interested in.

        The algorithm’s goal is to get you addicted to Youtube. It has already succeeded. For the rest of us who watch one video a day, if at all, it employs more heavy-handed strategies.

      • bstix@feddit.dk
        link
        fedilink
        English
        arrow-up
        1
        ·
        edit-2
        1 year ago

        The experience is different because it’s not one algorithm for everyone.

        Demographics are targeted differently. If you actually get a real feed, it’s only because no one has yet paid YouTube to guide you towards their product.

        It would be an interesting experiment to set up two identical devices and then create different Google profiles for each just to watch the algorithm take them in different directions.

      • livus@kbin.social
        link
        fedilink
        arrow-up
        1
        ·
        1 year ago

        My YouTube is usually OK, but the other day I googled an art exhibition on loan from the Tate Gallery, and now YouTube is trying to show me Andrew Tate.

    • froggers@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      1 year ago

      At this point, I just block any channel that I know is either bullshit or annoying af. Out of sight, out of mind.

      • youthinkyouknowme@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        1
        ·
        1 year ago

        Same. I have ads blocked and open YouTube directly to my subbed channels only. Rarely open the home tab or check related videos because of the amount of click bait and bs.

  • Jackolantern@lemmy.world
    link
    fedilink
    arrow-up
    48
    arrow-down
    1
    ·
    1 year ago

    Oof that’s hard!

    You may want to try the following, though, to clear the algorithm up (a rough script sketch for the dislike and subscription steps follows the list):

    Clear her YouTube watch history: This will reset the algorithm, getting rid of a lot of the data it uses to make recommendations. You can do this by going to “History” on the left menu, then clicking on “Clear All Watch History”.

    Clear her YouTube search history: This is also part of the data YouTube uses for recommendations. You can do this from the same “History” page, by clicking “Clear All Search History”.

    Change her ‘Ad personalization’ settings: This is found in her Google account settings. Turning off ad personalization will limit how much YouTube’s algorithms can target her based on her data.

    Introduce diverse content: Once the histories are cleared, start watching a variety of non-political, non-conspiracy content that she might enjoy, like cooking shows, travel vlogs, or nature documentaries. This will help teach the algorithm new patterns.

    Dislike, not just ignore, unwanted videos: If a video that isn’t to her taste pops up, make sure to click ‘dislike’. This will tell the algorithm not to recommend similar content in the future.

    Manually curate her subscriptions: Unsubscribe from the channels she’s not interested in, and find some new ones that she might like. This directly influences what content YouTube will recommend.
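
    If you end up doing the “dislike” and subscription steps in bulk, those two can be scripted with the YouTube Data API (history and ad-personalization changes still have to be done in the account settings). A rough sketch, assuming an authorized client like the OAuth example earlier in the thread, with the video and channel IDs as placeholders you’d fill in yourself:

    ```python
    # Rough sketch: bulk-dislike flagged videos and unsubscribe from flagged
    # channels via the YouTube Data API (google-api-python-client).
    # `youtube` is an authorized client; the IDs below are placeholders.
    FLAGGED_VIDEO_IDS = ["VIDEO_ID_1", "VIDEO_ID_2"]
    FLAGGED_CHANNEL_IDS = {"UC_PLACEHOLDER_CHANNEL_ID"}

    def cleanup(youtube):
        # "Dislike, not just ignore, unwanted videos"
        for video_id in FLAGGED_VIDEO_IDS:
            youtube.videos().rate(id=video_id, rating="dislike").execute()

        # "Manually curate her subscriptions": walk the list, drop flagged channels.
        request = youtube.subscriptions().list(part="snippet", mine=True, maxResults=50)
        while request is not None:
            response = request.execute()
            for sub in response["items"]:
                channel_id = sub["snippet"]["resourceId"]["channelId"]
                if channel_id in FLAGGED_CHANNEL_IDS:
                    youtube.subscriptions().delete(id=sub["id"]).execute()
                    print("Unsubscribed from", sub["snippet"]["title"])
            request = youtube.subscriptions().list_next(request, response)
    ```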

  • Chunk@lemmy.world
    link
    fedilink
    arrow-up
    44
    ·
    1 year ago

    I curate my feed pretty often so I might be able to help.

    The first, and easiest, thing to do is to tell YouTube you aren’t interested in its recommendations. If you hover over the name of a video, three little dots will appear on the right side. Clicking them opens a menu that contains, among other things, two options: Not Interested and Don’t Recommend Channel. Don’t Recommend Channel doesn’t actually remove the channel from recommendations, but it will discourage the algorithm from recommending it as often. Not Interested also informs the algorithm that you’re not interested; I think it discourages the entire topic, but that’s not clear to me.

    You can also unsubscribe from channels that you don’t want to see as often. Youtube will recommend you things that were watched by other people who are also subscribed to the same channels you’re subscribed to. So if you subscribe to a channel that attracts viewers with unsavory video tastes then videos that are often watched by those viewers will get recommended to you. Unsubscribing will also reduce how often you get recommended videos by that content creator.

    Finally, you should watch videos you want to watch. If you see something that you like then watch it! Give it a like and a comment and otherwise interact with the content. Youtube knows when you see a video and then go to the Channel’s page and browse all their videos. They track that stuff. If you do things that Youtube likes then they will give you more videos like that because that’s how Youtube monetizes you, the user.

    To de-radicalize your mom’s feed I would try the following (a rough script sketch for the retraining side follows this list):

    1. Watch videos that you like on her feed. This introduces them to the algorithm.
    2. Use Not Interested and Don’t Recommend Channel to slowly phase out the old content.
    3. Unsubscribe from some channels she doesn’t watch a lot, so she won’t notice.
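
    For the retraining side in step 1, here’s a sketch of the same idea in script form, using the same authorized YouTube Data API client as above and placeholder IDs (note there’s no public API for “Not Interested” or “Don’t Recommend Channel”, so step 2 stays manual):

    ```python
    # Sketch: seed the account with channels and videos she'd genuinely enjoy.
    # `youtube` is an authorized Data API client; the IDs are placeholders.
    HEALTHY_CHANNEL_IDS = ["UC_PLACEHOLDER_CHANNEL_ID"]   # e.g. cooking, crafts, pets
    HEALTHY_VIDEO_IDS = ["VIDEO_ID_1", "VIDEO_ID_2"]

    def retrain(youtube):
        # Subscribe to the curated channels.
        for channel_id in HEALTHY_CHANNEL_IDS:
            youtube.subscriptions().insert(
                part="snippet",
                body={"snippet": {"resourceId": {"kind": "youtube#channel",
                                                 "channelId": channel_id}}},
            ).execute()

        # Like a few videos from them so the algorithm picks up the signal.
        for video_id in HEALTHY_VIDEO_IDS:
            youtube.videos().rate(id=video_id, rating="like").execute()
    ```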
    • Ducks@ducks.dev
      link
      fedilink
      arrow-up
      7
      ·
      1 year ago

      OP, listen to this comment. YouTube’s goal is to feed you as much related content as possible to keep you on the site as long as possible. Radical content or otherwise, any engagement is positive to them. You can spend some time curating the feed so that the algorithm works in your favor, and it will adjust very quickly to new interests.

    • Beanerrr@lemmy.world
      link
      fedilink
      arrow-up
      3
      ·
      edit-2
      1 year ago

      I can confirm that this works quite well. I use these tactics all the time on my parents’ feed to keep them from watching too much “crap” news, pardon my French. There’s a very notorious news channel in our country that insists on feeding bad (and only bad) news - I often remove their channel from the suggested feed and play videos of funny fails/wins, cute cats, Daily Dose of Internet, and other happy nonsense.

      Give us an update sometime @driving_crooner@lemmy.eco.br , hope all goes well with your mom.

      • PlushySD@lemmy.world
        link
        fedilink
        arrow-up
        10
        ·
        1 year ago

        Yup, pretty easy too. I just checked in my mobile app; this is how you do it:

        • Click ‘Library’
        • On your History section, click ‘view all’
        • Then click the three dots upper right
        • Click ‘clear all watch history’

        You can also do this from the Google history management page on the web, but I’m not sure of the exact steps there.

        • smcharles@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          1 year ago

          I’m not sure about this tbh, I’ve had YouTube history disabled/cleared for years and it still recommends content that is relevant to things I’ve been watching. But maybe it at least can help?

      • Hamartiogonic@sopuli.xyz
        link
        fedilink
        arrow-up
        2
        ·
        1 year ago

        Yes. I’ve nuked my history once to start from a clean slate. Alternatively, you can also delete harmful videos from your history one by one, but that could take a while.

  • Snapz@lemmy.world
    link
    fedilink
    arrow-up
    37
    ·
    1 year ago

    First, unsub from the worst channels and flag a few of the worst channels in the general feed with “Don’t show me this anymore”. Then go into the actual Google profile settings, not just YouTube. Delete the watch history and pause/turn it off, and do the same for web history. It will still eventually creep back up, but it’s temporary relief.

  • NightOwl@lemmy.one
    link
    fedilink
    arrow-up
    35
    arrow-down
    1
    ·
    1 year ago

    Switch her to FreeTube on desktop. She can still subscribe to keep a list of channels she likes, but she won’t get the YouTube algorithm recommendations on the home page.

    For mobile, something like NewPipe gives the same algorithm-free experience.

    • CoatGhost@lemmy.world
      link
      fedilink
      arrow-up
      3
      ·
      1 year ago

      I love FreeTube, but sometimes I just can’t get videos to play, or they take forever because I have to pause for buffering. Still better than the actual site, but it can be frustrating.

  • KeisukeTakatou@lemmy.world
    link
    fedilink
    arrow-up
    27
    ·
    1 year ago

    I just want to share my sympathies on how hard it must be when she goes and listens to those assholes on YouTube and believes them but won’t accept her family’s help telling her what bullshit all that is.

    I hope you get her out of that zone, OP.

    • I Cast Fist@programming.dev
      link
      fedilink
      arrow-up
      11
      ·
      1 year ago

      Invidious is different enough that dummies will complain that “it’s not YouTube”.

      Source: had my mom and ex-gf complain exactly that. “But it doesn’t show ads” wasn’t enough of a quality-of-life improvement for them over the familiar UI.

  • lemmylommy@lemmy.world
    link
    fedilink
    arrow-up
    21
    ·
    1 year ago

    In the Google account privacy settings you can delete the watch and search history. You can also delete a service such as YouTube from the account without deleting the account itself. This might help start afresh.

    • twistedtxb@lemmy.ca
      link
      fedilink
      arrow-up
      9
      ·
      edit-2
      1 year ago

      I was so weirded out when I found out that you can hear ALL of your “hey Google” recordings in these settings.

      • BombOmOm@lemmy.world
        link
        fedilink
        English
        arrow-up
        8
        arrow-down
        6
        ·
        1 year ago

        Yeah, anything you send Google/Amazon/Facebook they will keep. I have been moving away from them. Protonmail for email, etc.