• pokexpert30@lemmy.pussthecat.org · 32 up / 6 down · 7 months ago

    I kinda fail to see the problem. The GPU owner doesn’t see what workload they’re processing. The pr0n company is willing to pay for GPU power. The GPU owner wants to earn money with their hardware. There’s demand, there’s supply, nobody is getting hurt (AI pr0n is not illegal, at least for now), so let people do what they want.

    • mavu@discuss.tchncs.de · 15 up / 1 down · 7 months ago

      The problem is that they are clearly targeting minors who don’t pay their own electricity bill, and who don’t even necessarily realize they’re paying for their Fortnite skins with their parents’ money. Also: there is a good chance the generated pictures are at some point present in the filesystem of the generating computer, and that alone is a giant can of worms that can even lead to legal trouble if the person lives in a country where some or all kinds of pornography are illegal.

      This is a shitty grift, abusing people who don’t understand the consequences of the software.

      • blindsight@beehaw.org · 3 up · 7 months ago

        Agreed. Preying on children who don’t understand what they’re signing up for is shitty to begin with.

        Then, add that deepfake AI porn is unethical and likely illegal (and who knows what other kinds of potentially-illegal images are being generated…)

        And, as you point out, the files having existed on the computer could, alone, be illegal.

        Then, as an extra fuck-you, burning GPU cycles to make AI images causes CO2 emissions, GPU wear, waste heat that might trigger the AC, and other negative externalities too, I’m sure…

        It’s shit all around.

    • Amerikan Pharaoh@lemmygrad.ml · 7 up / 6 down · edited · 7 months ago

      Because most AI-generated pornography models are trained on actual nudes scraped off the internet, and not just of people who work in the corporate porn industry. This essentially falls under the same morality as nonconsensual/revenge porn, since it lets all and sundry generate images from photos whose original posters were never asked for consent.

      But I forgot, this comm is plagued with treat-hounds that meatspace kink communities would throw out for a rule 3 breach; so I don’t know why I’m inconveniencing the electrons to explain something that even the terminally pornbrained should be able to comprehend…