@[email protected] to [email protected]English • 1 year ago

Snowden: "They've gone full mask-off: do not ever trust OpenAI or its products"

twitter.com

message-square
177
fedilink
  • cross-posted to:
  • [email protected]
626
external-link

Snowden: "They've gone full mask-off: do not ever trust OpenAI or its products"

twitter.com

@[email protected] to [email protected]English • 1 year ago
message-square
177
fedilink
  • cross-posted to:
  • [email protected]
x.com
twitter.com
external-link

cross-posted from: https://lemmy.smeargle.fans/post/182373

HN Discussion

  • classic • 14 points • 1 year ago

    Is there a magazine or site that breaks this down for the less tech-savvy? And is the quality of the AI on par?

    • @[email protected]
      link
      fedilink
      3•1 year ago

      Your best bet is YouTubing ollama.

    • Possibly linux • 6 points • 1 year ago

      Ollama with LLaVA and Mistral

    • @[email protected]
      link
      fedilink
      9•1 year ago

      On par? No. Good enough? Definitely. Ollama baby

    • @[email protected]
      link
      fedilink
      21•1 year ago

      Check my notes https://fabien.benetou.fr/Content/SelfHostingArtificialIntelligence but as others suggested a good way to start is probably https://github.com/ollama/ollama/ and if you need a GUI https://gpt4all.io
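The Ollama suggestion above boils down to a couple of commands. A minimal sketch (the install script URL and the `mistral` model tag come from Ollama's public documentation and model library; hardware and download sizes will vary):

```shell
# Install Ollama on Linux (see https://github.com/ollama/ollama for macOS/Windows)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model mentioned in this thread and chat with it locally
ollama pull mistral
ollama run mistral "Summarize the trade-offs of self-hosting an LLM."
```

Everything runs on the local machine; no account or API key is involved, which is the point the thread is making.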

      • @[email protected]
        link
        fedilink
        5•1 year ago

        I’m not the person who asked, but still thanks for the information. I might give this a try soon.

        • classic • 2 points • 1 year ago

          Ditto, thanks to everyone for their suggestions

      • @[email protected]
        link
        fedilink
        1•1 year ago

        You should have at least 16 GB of RAM available to run the 13B models,

        Is this gpu ram or cpu ram?

        • @[email protected]
          link
          fedilink
          English
          1•1 year ago

          Either works, but system RAM is at least an order of magnitude slower, more play by mail than chat…

        • KillingTimeItself • 2 points • 1 year ago

          Likely GPU RAM. There is some tech that can offload to system RAM, but generally it’s all hosted in VRAM. This requirement will likely fade as NPUs start becoming a thing, though.

        • @[email protected]
          link
          fedilink
          1•1 year ago

          pretty sure it can run on either, but cpus are slow compared to gpus, often to the point of being impractical
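The "16 GB of RAM for 13B models" guidance quoted above is easy to sanity-check with back-of-the-envelope arithmetic. A sketch (the bit-widths and the ~20% overhead factor are my assumptions, roughly matching common quantization levels, not figures from this thread):

```python
def model_memory_gb(n_params_billion: float, bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Rough memory needed to hold a model's weights, plus ~20%
    headroom for KV cache and activations (an assumption)."""
    bytes_per_weight = bits_per_weight / 8
    return n_params_billion * 1e9 * bytes_per_weight * overhead / 1e9

# A 13B-parameter model at different precisions:
print(round(model_memory_gb(13, 16), 1))  # fp16   -> 31.2 GB
print(round(model_memory_gb(13, 8), 1))   # 8-bit  -> 15.6 GB
print(round(model_memory_gb(13, 4), 1))   # 4-bit  ->  7.8 GB
```

The 8-bit estimate lands almost exactly on the 16 GB figure, which is why 4-bit quantized builds are popular for machines with less memory.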

[email protected]

[email protected]
Create a post
You are not logged in. However you can subscribe from another Fediverse account, for example Lemmy or Mastodon. To do this, paste the following into the search field of your instance: [email protected]

A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.

Some Rules

  • Posting a link to a website containing tracking isn’t great; if the contents of the website are behind a paywall, consider copying them into the post
  • Don’t promote proprietary software
  • Try to keep things on topic
  • If you have a question, please try searching for previous discussions, maybe it has already been answered
  • Reposts are fine, but should have at least a couple of weeks in between so that the post can reach a new audience
  • Be nice :)

Related communities

  • Lemmy.ml libre_culture
  • Lemmy.ml privatelife
  • Lemmy.ml DeGoogle
  • Lemmy.ca privacy

much thanks to @gary_host_laptop for the logo design :)
