• Fades@lemmy.world · 4 months ago

    The children will defend TikTok with their lives, crying about how it doesn’t matter that non-allied foreign powers can manipulate the algorithms and narratives.

    • misk@sopuli.xyz (OP) · 4 months ago (edited)

      I don’t see this as a China problem. It’s a lack-of-regulations-and-oversight problem. Allied foreign powers (or one specific power, to be precise) push far-right content onto Europe via social media as well.

  • demesisx@infosec.pub · 4 months ago

    I am a fairly radical leftist and a pacifist, and you wouldn’t believe the amount of hoo-rah military, toxic masculinity, explosions-and-people-dying, gun-lover bullshit the YouTube algorithm has attempted to force down my throat in the past year. I block every single channel that they recommend, yet I am still inundated.

    Now, with Shorts, it’s like they reset their algorithm entirely and put it into sensationalist overdrive to compete with TikTok.

    • TheGrandNagus@lemmy.world · 4 months ago (edited)

      Same here. I’ve never watched anything like that, yet my recommendations are filled with far-right grifters, interspersed with tech and cooking videos that are more my jam.

      YouTube seems to think I love Farage, US gun nuts, US free speech (to be racist) people, and anti-LGBT (especially T) channels.

      I keep saying I’m not interested, yet they keep trying to convert me. Like, fuck off YouTube, no I don’t want to see “Jordan Peterson completely OWNS female liberal using FACTS and LOGIC”.

    • JustARegularNerd@lemmy.world · 4 months ago

      The algorithm is just garbage at this point. I now watch YouTube exclusively through Invidious and can’t imagine going back.

    • Tja@programming.dev · 4 months ago

      My Shorts are full of cooking channels, “fun fact” spiky-hair guy, Greenland lady, Michael from Vsauce, and aviation content. Solid 6.5/10.

    • cm0002@lemmy.world · 4 months ago

      > I am a fairly radical leftist and a pacifist and you wouldn’t believe the amount of hoo-ra military, toxic masculinity, explosions and people dying, gun-lover bullshit the YouTube algorithm has attempted to force down my throat in the past year. I block every single channel that they recommend yet I am still inundated.

      I really want to know why their algorithm varies so wildly from person to person; this isn’t the first time I’ve seen people say this about YT.

      But in comparison, their algorithm seems to be fairly good at recommending what I’m actually interested in, with none of the other crap people always mention. And when it does recommend something I’m not interested in, it’s usually something benign, like a video on knitting.

      None of this out-of-nowhere far-right BS gets pushed to me, and for most of what it does recommend, I can tell why.

      For example, my feed is starting to show some lawn care/landscaping videos, and I know it’s likely related to the fact that I was looking up videos on how to restring my weed trimmer.

      • marron12@lemmy.world · 4 months ago

        Maybe it depends on what you watch. I use YouTube for music (only things that I search for) and sometimes live streams of an owl nest or something like that.

        If I stick to that, the recommendations are sort of OK. Usually stuff I watched before. Little to no clickbait or random topics.

        I clicked on one reaction video to a song I listened to, just to see what would happen. The recommendations turned into like 90% reaction videos, plus a bunch of topics I’ve never shown any interest in: U.S. politics, the death penalty in Japan, gaming, Brexit, some Christian hymns, and brand-new videos on random topics.

      • Telorand@reddthat.com · 4 months ago

        I think it depends on the things you watch. For example, if you watch a lot of counter-apologetics targeted at Christianity, YouTube will eventually try sending you pro-Christian apologetics videos. Similarly, if you watch a lot of anti-Conservative commentary, YouTube will try sending you Conservative crap, because they’re adjacent and share that “Conservative” thread.

        Additionally, if you click on those videos and add a negative comment, the algorithm just registers that you engaged with it, and it will then flood your feed with more.

        It doesn’t care what your core interests are; it just aims to increase your engagement by any means necessary.

    • pycorax@lemmy.world · 4 months ago (edited)

      I gave up and used an extension to remove video recommendations, block Shorts, and auto-redirect me from the home page to my subscriptions list. It’s a lot more pleasant to use now.

    • The Menemen!@lemmy.world · 4 months ago (edited)

      Don’t know about YouTube, but I have a similar experience on Twitter. I believe they probably count blocking, muting, or reporting as “interaction” and show more of the same as a result.

      On YouTube, on the other hand, I’ve never blocked a channel and almost never see militaristic or right-wing stuff (despite following some gun nerds, because I think they’re funny).

  • LengAwaits@lemmy.world · 4 months ago (edited)

    Did anyone actually read the whole article? These comments sorta read like the answer is no.

    > The researchers say that their findings prove no active collaboration between TikTok and far-right parties like the AfD but that the platform’s structure gives bad actors an opportunity to flourish. “TikTok’s built-in features such as the ‘Others Searched For’ suggestions provides a poorly moderated space where the far-right, especially the AfD, is able to take advantage,” Miazia Schüler, a researcher with AI Forensics, tells WIRED.

    A better headline might have been “TikTok algorithm gamed by far-right AfD party in Germany”, but I doubt that would drive as many clicks.

    For more info, check out this article: Germany’s AfD on TikTok: The political battle for the youth

    • daddy32@lemmy.world · 4 months ago (edited)

      It doesn’t matter whether it was intentional; only the result matters: TikTok gave them a visibility boost. Whether it was “an algorithm” or any other aspect of the company is irrelevant.

      • ZILtoid1991@lemmy.world · 4 months ago

        Also, knowing how scummy these social media companies are and how they operate internally, there could be some internal memo telling moderators to give the AfD free rein (it happened with Twitter and Libs of TikTok).