Can it really get worse than most of the shit side quest lines that are already pretty standard? "Go here and fetch me this thing and I'll give you something shiny" is already the bottom of the barrel.
I finished everything in the base game for Just Cause 4 last year, and it was literally taking me more time to drive from one activity to the next than to do the activity itself. But of course there have to be 80 of them.
Yes, it can.
No, it won’t be.
People comment on LLM stuff about how it's 'soulless' after having only used basic, sanitized products built on the technology, and usually not even the SotA models. They'll spend 15 minutes using the free ChatGPT and write it off as 'soulless.'
Anyone who was around in the first few weeks of the initial closed rollout of GPT-4 for Bing knows what a less lobotomized version of what's already year-old tech can look like. In another year or two, by the time AAA games built on LLMs are just starting to enter serious production, people aren't going to believe what it can actually look like when the emotion guardrails are taken away.
The current models are so ‘soulless’ because the initial rollout of the model was so soulful that it was freaking people out.
A lot of games have crap writing, particularly for side content, and a model that emulates emotional language as if it actually were that character in that given context is going to be a big step up in quality.