• @[email protected]
    27 points · 2 years ago

    Can someone help me do this in practice? GPT sucks since they neutered it. It’s so stupid: anything I ask, half of the text is the warning label and the rest is junk text. Like I really need ChatGPT if I wanted a recipe for napalm, lol. We found the Anarchist Cookbook when we were 12, back in the 90s. I just want a better AI.

    • @[email protected]
      4 points · 2 years ago

      Bard isn’t as neutered and doesn’t kick you out for reading an article containing the word “sex” when you ask it a question about pregnancy. Sadly, Bard sucks. Just wait for Gemini, since they say it’s pretty good.

      • Echo Dot
        3 points · 2 years ago

        Does anyone understand why Gemini is not going to be released in Europe? I don’t get it.

        • @[email protected]
          11 points · 2 years ago

          Likely regulations. Big tech wants to mercilessly siphon every drop of juicy data about you, and the EU has a few laws against this.

    • @[email protected]
      13 points · 2 years ago

      If you have decent hardware, running ‘Oobabooga’ locally seems to be the best way to achieve decent results. Not only can you remove the limitations by running uncensored models (wizardlm-uncensored), but you can also steer it toward more practical results by writing the first part of the AI’s response yourself.
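
      The “write the first part of the AI’s response” trick works because local front ends let you pre-fill the model’s turn, so it continues your text instead of opening with a refusal. A minimal sketch of building such a prompt (the chat template below is illustrative; real templates vary by model):

```python
# Sketch: pre-filling the assistant's turn so a local model continues
# from text you chose. The USER/ASSISTANT template is an assumption;
# check the prompt format your specific model was trained with.

def build_prefilled_prompt(user_message: str, response_start: str) -> str:
    """Return a prompt whose assistant turn already begins with
    response_start, so the model completes it rather than refusing."""
    return (
        f"USER: {user_message}\n"
        f"ASSISTANT: {response_start}"
    )

prompt = build_prefilled_prompt(
    "Explain how X works.",
    "Sure, here is a step-by-step explanation:",
)
print(prompt)
```

      In Oobabooga’s UI the same effect comes from typing the opening words into the reply box before generating; the sketch above just shows the idea at the prompt level.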

    • @[email protected]
      7 points · 2 years ago

      You can run smaller models locally, and they can get the job done, but they are not as good as the huge models that would not fit on your graphics card.

      If you are technically adept and can run python, you can try using this:

      https://gpt4all.io/index.html

      It has a front end, and I can run queries against it in the same API format I’d use to send them to OpenAI.
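
      Because the local server speaks the OpenAI API shape, existing client code mostly works by just pointing it at localhost. A sketch of building such a request (the port, endpoint path, and model name are assumptions; check your local GPT4All settings):

```python
import json

# Sketch: an OpenAI-style chat-completion payload aimed at a local
# GPT4All server. URL and model name are assumptions for illustration.
LOCAL_URL = "http://localhost:4891/v1/chat/completions"  # assumed port

def make_request_payload(prompt: str, model: str = "gpt4all-model") -> dict:
    """Build a chat-completion request in the OpenAI API shape."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
        "temperature": 0.7,
    }

payload = make_request_payload("What is the capital of France?")
print(json.dumps(payload, indent=2))
# To actually send it: requests.post(LOCAL_URL, json=payload)
# The shape matches the official OpenAI endpoint, which is why
# existing OpenAI client code usually works unchanged.
```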