Wondering if modern LLMs like GPT-4, Claude Sonnet, and Llama 3 are closer to human intelligence or to a next-word predictor. Also not sure if this graph is the right way to visualize it.
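For context on what "next-word predictor" means mechanically, here's a rough sketch of the basic generation loop. It uses GPT-2 via the Hugging Face transformers library purely as a stand-in (my choice, not anything specific to the models named above, which add far more scale and training on top), and greedy decoding for simplicity: score every possible next token, pick one, append it, repeat.

```python
# Minimal sketch of the "next word predictor" loop, using GPT-2 as a
# small stand-in model. Frontier models differ enormously in scale and
# training, but the autoregressive loop has the same shape.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "Human intelligence created language because"
ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(20):                         # generate 20 tokens greedily
        logits = model(ids).logits              # scores for every vocab token
        next_id = logits[0, -1].argmax()        # take the single most likely one
        ids = torch.cat([ids, next_id.view(1, 1)], dim=-1)

print(tokenizer.decode(ids[0]))
```

Real systems sample from the distribution instead of always taking the argmax, but the "predict one token at a time" structure is the same.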

  • Todd Bonzalez@lemm.ee · 1 month ago

    Human intelligence created language. We taught it to ourselves. That’s a higher order of intelligence than a next word predictor.

    • Sl00k@programming.dev · 1 month ago

      I can’t seem to find the paper now, but there was a research paper floating around about two GPT models designing a language to use between themselves for token efficiency while still relaying all the information, which is pretty wild.

      Not sure if it was peer-reviewed, though.

    • sunbeam60@lemmy.one · 1 month ago

      That’s like looking at the “which came first, the chicken or the egg” question as a serious question.

    • CanadaPlus@lemmy.sdf.org · 1 month ago

      I mean, only to the same degree that we created hands. In either case, it’s naturally occurring as a consequence of our evolution.