• thebartermyth [he/him]@hexbear.net · 1 year ago

    It’s so cynical and short-sighted. It’s the byproduct of referring to everything online as “content” without caring about what that content is. Their “how to bake a pie” example only makes sense if you don’t care about how to bake a pie. Short-sighted because if the “content” is LLM garbage, then Google or w/e can easily just generate the output themselves. Like, if their example were “TensorFlow optimization,” there’s no way the coders would be like “yeah, that’s perfect, the robot will teach me,” because they understand that LLMs give wrong information; they just assume that baking a pie is unworthy of real, actual instruction. Ironically, I think the coder-tech-help-blog space is actually where you can generate nonsense content and still get clicks.