First and foremost, this is not about AI/ML research, only about its use in generating content that you would potentially consume.

I personally won’t mind automated content if/when it reaches the quality of current human-generated content. Some of that is probably achievable in the not-so-distant future, such as narrating audiobooks (though it is nowhere near human quality right now), or partially automating music/graphics with gen AI, which we have kind of accepted by now. We don’t complain about a low-effort minimal or AI-generated thumbnail or stock photo, and we usually don’t care about the artistic value of those either. But I’m highly skeptical that anything of a creative or insightful nature could be produced anytime soon, and we have already developed a good slop filter in our brains just by dwelling on the ’net.

So what do you guys think?

Edit: Originally I made this question thinking only about quality aspect, but many responses do consider the ethical side as well. Cool :).

We had the derivative-work model for many-to-one intellectual works (such as a DJ playing a collection of music by other artists), which had a practical credit and compensation mechanism. With gen AI trained on unethically (and often illegally) sourced data, we don’t know what produced what, and there is no practical way to credit or compensate the original authors.

So maybe reframe the question: if it is used non-commercially or via some fair-use mechanism, would you still reject content regardless of quality because it is AI generated? Or where is the boundary for that?

  • HiddenLayer555@lemmy.ml · 4 days ago

    Peel back the veneer of AI and you find the foundation of stolen training data it’s built on. They are stealing from the very content creators they aim to replace.

    Torrent a movie? You can potentially go to jail. Scrape the entire internet for content and sell it as a shitty LLM or art generator? That’s just an innovative AI startup which is doing soooooo much good for humanity.

    • jsomae@lemmy.ml · 2 days ago

Even if they were able to train these models without stealing, the threat they pose to our society would be just as problematic.

    • Sturgist@lemmy.ca · 4 days ago

Exactly. An equitable solution could be to pay royalties to the artists whose work was stolen to train these algorithms. That, however, would require the generative algorithms to be operating at a profit, which they absolutely are not.

    • serenissi@lemmy.world (OP) · 4 days ago

      Torrent a movie? You can potentially go to jail. …

      Because artists are not billion-dollar Hollywood studios with so many political lobbies and stubborn, well-paid lawyers, duh.