db0@lemmy.dbzer0.com to TechTakes@awful.systems · English · 6 months ago

The Google AI isn’t hallucinating about glue in pizza, it’s just over-indexing an 11-year-old Reddit post by a dude named fucksmith.
milicent_bystandr · English · 6 months ago

Yes, nicely put! I suppose ‘hallucinating’ describes when, to the reader, the model appears to state a fact, but that fact doesn’t correspond to anything in the training data.