I asked GPT-4 for a list of the most important threats to human civilization, their likelihood, and why they were considered threats.

GPT’s output is also pasted into the comments.

  • Butt Pirate
    2 years ago

    ChatGPT doesn’t know what it’s saying. You can’t ask it for information like this; it just isn’t accurate.

      • Butt Pirate
        2 years ago

        Fair. I just see a lot of these posts where people say “well ChatGPT says this” and I’m trying to educate people that that doesn’t actually mean anything yet.

        • @[email protected]OP
          2 years ago

          Well, it means something. Maybe not that it’s true, but it tells you something about what comes out.

          I mostly assume this is what it’s read in a huge number of articles and blog posts.

          It’s regurgitating it here. Like querying a big opinion pool, using questions as the query language. It’s just rewriting passages in its own words to cobble together a report on whatever I ask it, which is fine for my purposes.

          I realize the percentages are probably just numbers various people came up with, and that’s fine with me too.

          My intention with the prompt here isn’t so much “predict the future” as it is “summarize current thinking”.

          Like when I ask it to explain organic chemistry. I know it’s just reporting what it read.