• Zron@lemmy.world
    3 months ago

    To add to this, we’re going to run into the problem of garbage in, garbage out.

    LLMs are trained on text from the internet.

    Currently, a massive amount of text on the internet is coming from LLMs.

    This creates a feedback loop: each new model is trained on a corpus that contains more and more output from older models.

    If that loop isn’t broken, the likely outlook is that LLMs get worse as the years go by, not better.
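
    The feedback loop described above can be sketched with a toy simulation. This is not a model of real LLM training — the normal distribution, sample size, and generation count are all illustrative assumptions — but it shows the general mechanism: if "training" is fitting a distribution to a corpus and "generating" is sampling from that fit, then feeding each generation's output back in as the next generation's training data makes the estimate drift away from the original human data.

    ```python
    import random
    import statistics

    def train(corpus):
        # Toy "training": fit a normal distribution to the corpus.
        return statistics.mean(corpus), statistics.stdev(corpus)

    def generate(mu, sigma, n):
        # Toy "generation": the model emits samples from its own fit.
        return [random.gauss(mu, sigma) for _ in range(n)]

    random.seed(0)
    # Generation 0: a small "human-written" corpus from N(0, 1).
    corpus = [random.gauss(0.0, 1.0) for _ in range(20)]

    sigmas = []
    for gen in range(500):
        mu, sigma = train(corpus)
        sigmas.append(sigma)
        # Next corpus is purely model output — no fresh human data.
        corpus = generate(mu, sigma, 20)

    print(f"stdev fitted at generation   0: {sigmas[0]:.4f}")
    print(f"stdev fitted at generation 499: {sigmas[-1]:.4f}")
    ```

    With a small corpus and no fresh data, the fitted spread collapses over the generations: each refit loses a little of the tails, so later "models" see an ever-narrower slice of the original distribution. Mixing even a fraction of generation-0 data back into each corpus slows the collapse, which is why the ratio of human to synthetic text in training sets matters.
    
    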