@[email protected] posted "Huh" (image) to Science [email protected] · English · 1 year ago · 561 upvotes · 46 comments
@[email protected] · English · 79 points · 1 year ago:
It all comes down to the fact that LLMs are not AGI - they have no clue what they're saying, or why, or to whom. They have no concept of "context" and as a result have no way to "know" whether they're giving correct information or just hallucinating.
@[email protected] · English · 1 point · 1 year ago:
Hey, but if Sam says it might be AGI he might get a trillion dollars, so shut it /s