db0@lemmy.dbzer0.com to TechTakes@awful.systems · English · 1 month ago
The Google AI isn’t hallucinating about glue in pizza, it’s just over indexing an 11 year old Reddit post by a dude named fucksmith.
262 comments
milicent_bystandr · English · 1 month ago
Yes, nicely put! I suppose ‘hallucinating’ is a description of when, to the reader, it appears to state a fact but that fact doesn’t at all represent any fact from the training data.