• evranch@lemmy.ca · 7 months ago

    I think you’re misreading the point I’m trying to make. I’m not arguing that LLMs are AGI or that they can understand anything.

    I’m just questioning what the true use case of AGI would be that can’t be achieved by existing expert systems, real humans, or a combination of both.

    Sure, Deepseek or Copilot won’t answer your legal questions. But neither will a real programmer. Nor will a lawyer be any good at writing code.

    However, when the appropriate LLMs with the appropriate augmentations can be used to write code or legal contracts under human supervision, isn’t that good enough? Do we really need to develop a true human-level intelligence when we already have 8 billion of those looking for something to do?

    AGI is a fun theoretical concept, but I really don’t see the practical need for a “next step” past the point of expanding and refining our current deep learning models, or how it would improve our world.

    • melpomenesclevage · 7 months ago

      Those are not meaningful use cases for LLMs.

      And they’re getting worse at even faking it now.