It’s been there for a while and the article itself is a bit older, but I only saw it recently.

  • TehPers@beehaw.org · 8 months ago

    I think the reason people refer to LLMs as generative comes from the term GPT, which I believe is short for Generative Pre-trained Transformer. At its core, such a model generates new outputs conditioned on previous ones, and its purpose is to create new content. Plenty of models are not generative, like dedicated classifiers (think sentiment analyzers, models that try to identify what an object is, etc.).
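
    The distinction can be sketched with two toy functions. These are invented stand-ins, not real models: a discriminative classifier maps an input to one of a fixed set of labels, while a generative model produces new tokens conditioned on the tokens so far.

    ```python
    def classify_sentiment(text: str) -> str:
        """Discriminative: picks one label from a fixed, closed set."""
        positive = {"good", "great", "love"}
        negative = {"bad", "awful", "hate"}
        words = set(text.lower().split())
        if words & positive:
            return "positive"
        if words & negative:
            return "negative"
        return "neutral"

    def generate_next(tokens: list[str]) -> str:
        """Generative: emits a *new* token based on previous output.

        A real LLM predicts a distribution over its whole vocabulary;
        this toy lookup table just illustrates the autoregressive shape.
        """
        bigrams = {"the": "cat", "cat": "sat", "sat": "down"}
        return bigrams.get(tokens[-1], "<eos>")

    print(classify_sentiment("I love this"))  # -> positive
    seq = ["the"]
    for _ in range(3):
        seq.append(generate_next(seq))  # feed output back in as input
    print(" ".join(seq))  # -> the cat sat down
    ```

    The classifier’s output space never grows, whereas the generator extends its own input — that feedback loop is what makes a model “generative.”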