Karna to [email protected] • 6 months ago

Meet Orbit, Mozilla's AI Assistant Extension for Firefox

www.omgubuntu.co.uk

53 comments
  • cross-posted to:
  • [email protected]
131 points
Orbit by Mozilla is a new AI-powered assistant for the Firefox web browser that makes summarising web content while you browse as easy as clicking a…
  • Jeena • 28 points • 6 months ago

    Thanks for the summary. So it still sends the data to a server, even if it’s Mozilla’s. Then I still can’t use it for work, because the data is private and they wouldn’t appreciate me sending their data to Mozilla.

    • KarnaOP • 21 points • 6 months ago

      In such a scenario you need to host your choice of LLM locally.

      • @[email protected]
        link
        fedilink
        English
        5•6 months ago

        Does the addon support usage like that?

        • KarnaOP • 7 points • 6 months ago

          No, but the “AI” option available on the Mozilla Labs tab in settings allows you to integrate with a self-hosted LLM.

          I have had this setup running for a while now.
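
          [For context, wiring the Labs AI option to a local endpoint typically comes down to a couple of about:config preferences. The pref names below are taken from recent Firefox builds and the URL is an assumption for a local Open WebUI front end; verify both against your Firefox version.]

          ```
          # about:config — values are assumptions for a local Open WebUI/Ollama setup
          browser.ml.chat.hideLocalhost = false
          browser.ml.chat.provider = http://localhost:3000
          ```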

          • @[email protected]
            link
            fedilink
            4•6 months ago

            Which model are you running? How much RAM?

            • KarnaOP • 4 points • edited • 6 months ago

              My (docker based) configuration:

              Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1

              Hardware: i5-13600K, Nvidia 3070 ti (8GB), 32 GB RAM

              Docker: https://docs.docker.com/engine/install/

              Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

              Open WebUI: https://docs.openwebui.com/

              Ollama: https://hub.docker.com/r/ollama/ollama
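
              [A stack like the one above can be brought up roughly as follows. This is a sketch based on the Ollama and Open WebUI documentation linked above; container names, ports, and volume names are assumptions, and it presumes Docker and the NVIDIA Container Toolkit are already installed.]

              ```shell
              # Ollama with GPU access via the NVIDIA runtime
              docker run -d --gpus=all -v ollama:/root/.ollama \
                -p 11434:11434 --name ollama ollama/ollama

              # Pull the Llama 3.1 model inside the container
              docker exec -it ollama ollama pull llama3.1

              # Open WebUI, pointed at the Ollama API on the host
              docker run -d -p 3000:8080 \
                --add-host=host.docker.internal:host-gateway \
                -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
                -v open-webui:/app/backend/data --name open-webui \
                ghcr.io/open-webui/open-webui:main
              ```

              [After that, the web UI is reachable at http://localhost:3000.]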

    • @[email protected]
      link
      fedilink
      1•
      edit-2
      6 months ago

      According to Microsoft, you can safely send your work-related stuff to Copilot. Besides, most companies already use a lot of Microsoft software and cloud services, so LLM queries don’t add very much. If you happen to work for one of those companies, MS probably already knows what you do for a living, hosts your meeting notes, knows your calendar, etc.

      If you’re working for Purism, RedHat or some other company like that, you might want to host your own LLM instead.

    • @[email protected]
      link
      fedilink
      12•
      edit-2
      5 days ago

      deleted by creator

[email protected]

[email protected]
Create a post
You are not logged in. However you can subscribe from another Fediverse account, for example Lemmy or Mastodon. To do this, paste the following into the search field of your instance: [email protected]

A place to discuss the news and latest developments on the open-source browser Firefox

  • 4 users / day
  • 29 users / week
  • 73 users / month
  • 969 users / 6 months
  • 2 subscribers
  • 1.09K Posts
  • 19.1K Comments
  • Modlog
  • mods:
  • @[email protected]