‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • jivandabeast@lemmy.browntown.dev · 1 year ago

    Sus question lmfao

    These things have been around since the onset of deepfakes, and truly, if you take a couple of seconds to look, you’ll find them. It’s a massive issue, and the content is everywhere.

      • jivandabeast@lemmy.browntown.dev · 1 year ago

        We’re talking specifically about AI-enhanced fakes, not the old-school Photoshop fakes – they’re two completely different beasts.

          • jivandabeast@lemmy.browntown.dev · 1 year ago

            No, I disagree: before, you could tell a fake from a mile away, but deepfakes bring it to a whole new level of creepy because they can be EXTREMELY convincing.

              • Delta_V@midwest.social · 1 year ago

                Or maybe an accessibility improvement. You don’t need to practice creating your own works of art over many years anymore, or have enough money to commission a master artist. The AI artists are good enough and work for cheap.

              • jivandabeast@lemmy.browntown.dev · 1 year ago

                I’m not saying that it’s a shift in nature? All I’ve been saying is:

                A) tools to create realistic nudes have been publicly available ever since deepfakes became a thing

                B) deepfakes are worse than traditional photoshopped nudes because (as you put it, a quality improvement) they’re more convincing and therefore can have more detrimental effects

              • barsoap@lemm.ee · 1 year ago

                The difference is that we can now do video. In principle that was possible before, but it was also a hell of a lot of work. Making it look real hasn’t been a problem since before Photoshop; if anything, people get sloppy with AI, partly because what feels like 99% of the people who use it don’t have an artistic bone in their body.

            • lolcatnip@reddthat.com · 1 year ago

              There was a brief period between now and the invention of photography when that was true. For thousands of years before that it was possible to create a visual representation of anything you imagine without any hint that it wasn’t something real. Makes me wonder if there were similar controversies about drawings or paintings.