• Wren

    But they’re saying they do know. And they are correct.

    • @[email protected]

      I know I’m the smartest man on earth. And I’m correct.

      See how crazy that sounds? Just because someone is confident about something doesn’t make it true.

      • @[email protected]

        Please apply that to this:

        all I said was that I don’t know and neither do you.

        Because there is no evidence whatsoever that consciousness is associated with LLMs. We have ample evidence that consciousness is associated with many forms of biological life.

        I’m not even aware of a scholarly theory suggesting there might be consciousness associated with LLMs. Now, I’m not an LLM expert, but neither are you (hurr durr), and so I think if you are going to suggest that maybe consciousness exists there, it should be based on something other than “hey man you never know,” which is pretty much what it feels like. (Or you should be unsurprised when folks find that assertion unconvincing.)

        • @[email protected]

          Honestly, I’m not surprised. I obviously didn’t phrase my argument in a compelling way.

          I disagree that we don’t have evidence for consciousness in LLMs. They have been showing behavior previously attributed only to highly intelligent, sentient creatures, i.e. us. To me it seems very plausible that when you have a large network of neurons, be they artificial or biological, with specialized circuits for processing specific stimuli, some sort of sentience could emerge.

          If you want academic research on this, you just have to take a look. Researchers have been discussing this topic for decades. There isn’t a working theory of machine sentience simply because we don’t have one that works for natural systems. But that obviously doesn’t rule it out. After all, why should sentience be constrained to squishy matter? In any case, I think we can all agree something very interesting is going on with LLMs.

      • Wren

        I think you know exactly how empirically provable facts work. And I also think you’re a troll.