A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.

  • JackGreenEarth@lemm.ee · +69/−13 · 8 months ago

    That’s a ripoff. It costs them at most $0.10 to run a simple Stable Diffusion img2img pass. And most people could do it themselves; they’re deliberately exploiting people who aren’t tech savvy.
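The $0.10 figure above can be sanity-checked with back-of-envelope arithmetic. The GPU rental price and per-image runtime below are illustrative assumptions, not numbers from the thread:

```python
# Rough marginal compute cost of one Stable Diffusion img2img pass.
# Both inputs are assumptions for illustration: a rented consumer GPU
# at about $0.50/hour, and roughly 10 seconds of generation per image.
GPU_DOLLARS_PER_HOUR = 0.50
SECONDS_PER_IMAGE = 10

cost_per_image = GPU_DOLLARS_PER_HOUR / 3600 * SECONDS_PER_IMAGE
print(f"~${cost_per_image:.4f} per image")  # ~$0.0014 per image
```

Under those assumptions the marginal cost is well under a cent, which is consistent with the commenter's point that $0.10 is already a generous upper bound.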

    • Khrux@ttrpg.network · +64/−8 · edited · 8 months ago

      I have no sympathy for the people who are being scammed here, I hope they lose hundreds to it. Making fake porn of somebody else without their consent, particularly that which could be mistaken for real if it were to be seen by others, is awful.

      I wish everyone involved in this use of AI a very awful day.

    • echo64@lemmy.world · +49/−5 · 8 months ago

      The people being exploited are the ones who are the victims of this, not people who paid for it.

        • sbv@sh.itjust.works · +31/−1 · 8 months ago

          It seems like there’s a news story every month or two about a kid who kills themselves because videos of them are circulating. Or they’re being blackmailed.

          I have a really hard time thinking of the people who spend ten bucks making deep fakes of other people as victims.

          • Sentient Loom@sh.itjust.works · +5/−16 · edited · 8 months ago

            I have a really hard time thinking

            Your lack of imagination doesn’t make the plight of non-consensual AI-generated porn artists any less tragic.

        • Vanth@reddthat.com · +10/−1 · edited · 8 months ago

          How are the perpetrators victims?

          I could see an argument for someone in need of money making AI generated porn of themselves. Like, don’t judge sex workers, they’re just trying to make money. But taking someone else’s image without their consent is more akin to Tate coercing his “girlfriends” into doing cam work and taking all the money and ensuring they can’t escape. He’s not a victim nor a sex worker, he’s a criminal.

          • Sentient Loom@sh.itjust.works · +5/−10 · 8 months ago

            Writing /s would have implied that my fellow lemurs don’t get jokes, and I give them more credit than that.

            • Vanth@reddthat.com · +10/−1 · edited · 8 months ago

              I would love to assume Lemmy users are intelligent enough to realize text-only sarcastic jokes about sex criminals are almost never a good idea, but alas, I’ve been on the internet longer than two weeks.

              • Sentient Loom@sh.itjust.works · +2/−9 · 8 months ago

                Some people just don’t have a sense of humor.

                And those people are YOU!!

                Thanks for the finger-wagging, you moralistic rapist!

      • Dkarma@lemmy.world · +3/−11 · 8 months ago

        No one’s a victim, no one’s being exploited. Same as taping a head onto a porno mag.

    • sugar_in_your_tea@sh.itjust.works · +27/−2 · 8 months ago

      IDK, $10 seems pretty reasonable to run a script for someone who doesn’t want to. A lot of people have that type of arrangement for a job…

      That said, I would absolutely never do this for someone; I’m not making nudes of a real person.

    • IsThisAnAI@lemmy.world · +15/−1 · edited · 8 months ago

      The scam is another thing. Fuck these people selling it.

      But fuck, dude, they aren’t taking advantage of anyone buying the service. That’s not how the fucking world works. It turns out that if you have money, you can pay people to do shit like clean your house or do an oil change.

      NOBODY on that side of the equation is being exploited 🤣

    • OKRainbowKid@feddit.de · +6/−3 · 8 months ago

      In my experience with SD, getting images that aren’t obviously “wrong” in some way takes multiple iterations and quite a bit of time spent tuning prompts and parameters.

    • M500@lemmy.ml · +2/−3 · 8 months ago

      Wait, this is a tool built into Stable Diffusion?

      As for people doing it themselves, it might be a bit too technical for some people to set up. But I’ve never tried Stable Diffusion.

        • bassomitron@lemmy.world · +8 · 8 months ago

          Img2img isn’t always spot-on with what you want it to do, though. I was making extra pictures for my kid’s bedtime books that we made together and it was really hit or miss. I’ve even goofed around with my own pictures to turn myself into various characters and it doesn’t work out like you want it to much of the time. I can imagine it’s the same when going for porn, where you’d need to do numerous iterations and tweaking over and over to get the right look/facsimile. There are tools/SD plugins like Roop that make transferring faces with img2img easier and more reliable, but even then it’s still not perfect. I haven’t messed around with it in several months, so maybe it’s better and easier now.

        • BlackPenguins@lemmy.world · +3 · 8 months ago

          It depends on the models you use, too. There are specially trained models out there, and all you need to do is give one a prompt like “naked” and it’s scary good at making something realistic in two minutes. But yeah, there is a learning curve to setting everything up.

        • M500@lemmy.ml · +3 · 8 months ago

          Thanks for the link. I’ve been running some LLMs locally, and I’ve been interested in Stable Diffusion, but I’m not sure I have the specs for it at the moment.

          • TheRealKuni@lemmy.world · +3 · 8 months ago

            An iPhone from 2018 can run Stable Diffusion. You can probably run it on your computer. It just might not be very fast.

          • TheRealKuni@lemmy.world · +2 · 8 months ago

            By the way, if you’re interested in Stable Diffusion and it turns out your computer CAN’T handle it, there are sites that will let you toy around with it for free, like civitai. They host an enormous number of models, and many of them work with the site’s built-in generation.

            Not quite as robust as running it locally, but worth trying out. And much faster than any of the ancient computers I own.