• jubilationtcornpone@sh.itjust.works
    2 months ago

    “Prompt Engineering”: AKA explaining to ChatGPT why it’s wrong a dozen times before it spits out a usable (but still not completely correct) answer.

    • ByteOnBikes@slrpnk.net
      2 months ago

      Knowing when to tell the AI it’s wrong is actually a valid skill.

      A few months ago, I had to tell my juniors to think critically about the shitty code the AI was generating. I was getting sick of clearly copy-pasted ChatGPT code and juniors not knowing what the fuck they were submitting to code review.

      • pfm@scribe.disroot.org
        2 months ago

        I’m trying to convince a senior developer on my team to stop using Copilot. They have committed code they didn’t understand (only tested it to verify it does what it’s expected to do). I doubt I’ll succeed…

      • Evotech@lemmy.world
        2 months ago

        Should start asking them things like: why did you do this? Why did you choose this method? Just to make them sweat :p