• @[email protected]
    3 months ago

    i suspect we are going to have a semantic disagreement on what “understanding” means here.

    ChatGPT is absolutely trained on the concept of calendars, on the fact that iPhones have them, and on how they work. It doesn’t need to “understand” what a calendar is on a deep epistemological level to process requests about them. If you ask ChatGPT a question about calendars, it’ll answer you. So in that shallower sense LLMs absolutely “understand” what you mean, and that’s enough for ChatGPT to help Siri.

    • @[email protected]
      3 months ago

      No, they really don’t. They answer you with words that are commonly found in conjunction with calendars. There is code in Siri that understands calendars, but the LLM part ain’t it. I have, ahem, firsthand knowledge of how Siri does this.

      • @[email protected]
        3 months ago

        There is code in Siri that understands calendars, but the LLM part ain’t it.

        yes i know, you are getting hung up on my colloquial use of “understand”. the LLM doesn’t need to “understand” it on that level because siri does. the LLM is there to parse the language and hand off to Siri. that’s all i’m saying.
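        the division of labor being described could be sketched like this — a minimal, entirely hypothetical stand-in where `llm_parse` plays the role of the model (here hard-coded to one example rather than calling a real LLM) and `siri_handle` plays the deterministic code that actually knows what a calendar is:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical sketch: the LLM only maps free-form text to a
# structured intent; the calendar logic lives in ordinary
# deterministic code (the "Siri" side of the handoff).

@dataclass
class CalendarIntent:
    action: str        # e.g. "create_event"
    title: str
    start: datetime

def llm_parse(utterance: str) -> CalendarIntent:
    """Stand-in for the LLM: turns language into a structured intent.
    A real system would call a model here; this stub hard-codes
    one example result for illustration."""
    return CalendarIntent(
        action="create_event",
        title="Dentist",
        start=datetime(2025, 3, 14, 9, 0),
    )

def siri_handle(intent: CalendarIntent) -> str:
    """Stand-in for the code that actually understands calendars."""
    if intent.action == "create_event":
        return f"Created '{intent.title}' at {intent.start.isoformat()}"
    raise ValueError(f"unknown action: {intent.action}")

print(siri_handle(llm_parse("book the dentist for 9am on March 14")))
```

        the point of the shape: nothing downstream of `llm_parse` touches natural language at all, so whether the parser “understands” calendars never matters past that boundary.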

        • @[email protected]
          3 months ago

          Fair. I think it’s important from an “Is this True AGI?” sense to distinguish these, but yes, in the colloquial sense I guess the system could be said to understand, even if it’s not strictly the actual LLM part that does it.