• yarr@feddit.nl · 8 months ago

    Are the platforms guilty, or are the users who supplied the radicalizing content guilty? Last I checked, most of the content on YouTube, Facebook and Reddit is not generated by the companies themselves.

    • baru@lemmy.world · 8 months ago

      Those sites determine what they promote. They often promote extreme views because it gets people to watch or click the next thing. Facebook, for instance, researched this outcome and then ignored what it had learned.

    • KneeTitts@lemmy.world · 8 months ago

      most of the content on YouTube, Facebook and Reddit is not generated by the companies themselves

      It’s their job to block that content before it reaches an audience, but since that’s how they make their money, they don’t or won’t do it. The monetization of evil is the problem, and those platforms are the biggest perpetrators.

      • yarr@feddit.nl · 8 months ago

        It’s their job to block that content before it reaches an audience

        The problem is (or isn’t, depending on your perspective) that it is NOT their job. Facebook, YouTube, and Reddit are private companies that have the right to develop and enforce their own community guidelines or terms of service, which dictate what type of content can be posted on their platforms. This includes blocking or removing content they deem harmful, objectionable, or radicalizing. While these platforms are protected under Section 230 of the Communications Decency Act (CDA), which provides immunity from liability for user-generated content, this protection does not extend to knowingly facilitating or encouraging illegal activities.

        There is no specific U.S. legislation requiring social media platforms like Facebook, YouTube, and Reddit to block radicalizing content. However, many countries, including the United Kingdom and Australia, have enacted laws that hold platforms accountable if they fail to remove extremist content. In the United States, there have been proposals to amend or repeal Section 230 of the CDA to make tech companies more responsible for moderating the content on their sites.

        • ITGuyLevi@programming.dev · 8 months ago

          The argument could be made (and probably will be) that they promote those activities by allowing their algorithms to promote that content. It’s a dangerous precedent to set, but not an unlikely one given the recent rulings.

          • FlyingSpaceCow@lemmy.world · edited · 8 months ago

            Any precedent here, regardless of outcome, will have a significant (and dangerous) impact, as the status quo is already causing significant harm.

            For example, Meta/Facebook used to prioritize content that generated an angry-face emoji reaction over content that got a “like”, because it resulted in more engagement and revenue.

            However, the problem still exists. If you combat problematic content with a reply of your own (because you want to push back against hatred, misinformation, or disinformation), then they have even more incentive to show you similar content. And they justify it by saying, “if you engaged with content, then you’ve clearly indicated that you WANT to engage with content like that”.

            The financial incentives, as they currently exist, run counter to the public good.
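
            A rough illustration of that incentive, as a minimal sketch: the weights, the `Post` class, and the sample posts below are purely hypothetical (not Meta’s actual ranking system), but they show how weighting angry reactions and replies above plain likes pushes outrage-bait to the top of a feed even when much of the “engagement” is pushback.

            ```python
            # Minimal sketch of reaction-weighted feed ranking.
            # All weights and data are hypothetical, for illustration only.
            from dataclasses import dataclass, field

            # Hypothetical weights: an "angry" reaction and a reply count for
            # more than a plain "like", regardless of why people reacted.
            REACTION_WEIGHTS = {"like": 1.0, "angry": 5.0}
            REPLY_WEIGHT = 3.0

            @dataclass
            class Post:
                title: str
                reactions: dict = field(default_factory=dict)  # e.g. {"like": 40, "angry": 12}
                replies: int = 0

            def engagement_score(post: Post) -> float:
                """Score a post purely by weighted engagement, ignoring intent."""
                score = sum(REACTION_WEIGHTS.get(kind, 1.0) * count
                            for kind, count in post.reactions.items())
                return score + REPLY_WEIGHT * post.replies

            def rank_feed(posts):
                """Order the feed so the highest-engagement posts come first."""
                return sorted(posts, key=engagement_score, reverse=True)

            if __name__ == "__main__":
                feed = rank_feed([
                    Post("Calm, accurate explainer", reactions={"like": 120}, replies=4),
                    Post("Inflammatory misinformation", reactions={"like": 10, "angry": 40}, replies=60),
                ])
                for post in feed:
                    print(f"{engagement_score(post):7.1f}  {post.title}")
            ```

            Under these made-up weights, the inflammatory post scores 390 to the explainer’s 132, so it leads the feed, and every angry reply pushes it further up.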

          • joel_feila@lemmy.world · 8 months ago

            Yeah, I have made that argument before. By pushing content via recommendation lists and autoplay, YouTube becomes a publisher and needs to be held accountable.

            • hybrid havoc@lemmy.world · 8 months ago

              Not how it works. Also, your use of “becomes a publisher” suggests to me that you are misinformed, as so many people are, into thinking that there is some sort of publisher vs. platform distinction in Section 230. There is not.

              • joel_feila@lemmy.world · 8 months ago

                Oh no, I am aware of that distinction. I just think it needs to go away and be replaced.

                Currently, Section 230 treats websites as not responsible for user-generated content. For example, if I made a video defaming someone, I could get sued, but YouTube would be in the clear. Yet if The New York Times publishes an article defaming someone, the paper gets sued, not just the writer.

                Why? Because the NYT published that article, while YouTube just hosts it. This publisher/platform distinction is not stated in Section 230, but it is part of U.S. law.

                • hybrid havoc@lemmy.world · 7 months ago

                  This is frankly bizarre. I don’t understand how you can even write that and reasonably think that the platform hosting the hypothetical defamation should have any liability there. Like this is actually a braindead take.

        • Alien Nathan Edward@lemm.ee · 8 months ago

          this protection does not extend to knowingly facilitating or encouraging illegal activities.

          if it’s illegal to encourage illegal activities, it’s illegal to build an algorithm that automates encouraging illegal activities

        • hybrid havoc@lemmy.world · 8 months ago

          Repealing Section 230 would actually have the opposite effect and lead to less moderation, as it would incentivize not knowing about the content in the first place.

          • afraid_of_zombies@lemmy.world · 8 months ago

            I can’t see that. Not knowing about it would be an impossible position to maintain, since you would be getting reports. Now, you might say they will disable reports, which they might try, but they have to do business with other companies that will require them to take reports. Apple isn’t going to let your social media app into its store if people are yelling at Apple about the child porn and bomb threats on it; AWS will kick you off as well; even Cloudflare might decide you aren’t worth the legal risk. This has already happened multiple times, even with Section 230 providing a lot of immunity to these companies. Without that immunity, they would be even more likely to block that content.

      • yarr@feddit.nl · 8 months ago

        That’s why we have separate charges for drug manufacturing and distribution.

      • afraid_of_zombies@lemmy.world · 8 months ago

        I never liked that logic. It’s basically “success has many fathers, but failure is an orphan” applied.

        Are you involved with something immoral? The extent of your involvement is the extent of how immoral your actions are. The same goes for doing the right thing.