Meet Orbit, Mozilla's AI Assistant Extension for Firefox

Karna@lemmy.ml to Firefox@lemmy.ml · 6 months ago · www.omgubuntu.co.uk
81 points · 55 comments
Orbit by Mozilla is a new AI-powered assistant for the Firefox web browser that makes summarising web content while you browse as easy as clicking a…
  • Jeena@piefed.jeena.net · +27 / −2 · 6 months ago

    Thanks for the summary. So it still sends the data to a server, even if it’s Mozilla’s. Then I still can’t use it for work, because the data is private and they wouldn’t appreciate me sending their data to Mozilla.

    • Karna@lemmy.ml (OP) · +21 · 6 months ago

      In such a scenario, you need to host your choice of LLM locally.

      • ReversalHatchery@beehaw.org · +5 · 6 months ago

        Does the add-on support usage like that?

        • Karna@lemmy.ml (OP) · +7 · 6 months ago

          No, but the “AI” option available on the Mozilla Lab tab in settings allows you to integrate with a self-hosted LLM.

          I have had this setup running for a while now.
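
          (For anyone wanting the same thing, a minimal sketch of what that integration looks like, going by community guides rather than anything confirmed in this thread: enable the AI chatbot option, then point the sidebar at the self-hosted UI via a hidden preference in about:config. The preference name and local port below are assumptions, not this user's verified setup.

              browser.ml.chat.provider = http://localhost:3000

          With that set, the Firefox AI sidebar loads the local web UI instead of a hosted chatbot.)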

          • cmgvd3lw@discuss.tchncs.de · +4 · 6 months ago

            Which model are you running? How much RAM?

            • Karna@lemmy.ml (OP) · +4 · edited · 6 months ago

              My (docker based) configuration:

              Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1

              Hardware: i5-13600K, Nvidia 3070 Ti (8 GB VRAM), 32 GB RAM

              Docker: https://docs.docker.com/engine/install/

              Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

              Open WebUI: https://docs.openwebui.com/

              Ollama: https://hub.docker.com/r/ollama/ollama
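
              (A minimal sketch of bringing up that stack with the images from the links above; container names, volume names, and the host port are the illustrative defaults from the respective docs, not necessarily this exact setup:

                  # Ollama with GPU access (requires the NVIDIA Container Toolkit)
                  docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

                  # Pull the model named in the stack above
                  docker exec -it ollama ollama pull llama3.1

                  # Open WebUI, reaching Ollama on the host via host.docker.internal
                  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
                    -v open-webui:/app/backend/data --name open-webui \
                    ghcr.io/open-webui/open-webui:main

              Once both containers are up, Open WebUI is reachable at http://localhost:3000 and talks to Ollama on port 11434.)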

    • LWD@lemm.ee · +12 · edited · 7 days ago

      deleted by creator

    • Hamartiogonic@sopuli.xyz · +1 / −3 · edited · 6 months ago

      According to Microsoft, you can safely send your work-related stuff to Copilot. Besides, most companies already use a lot of Microsoft software and cloud services, so LLM queries don’t really add much. If you happen to be working for one of those companies, MS probably already knows what you do for a living, hosts your meeting notes, knows your calendar, etc.

      If you’re working for Purism, RedHat or some other company like that, you might want to host your own LLM instead.

Firefox@lemmy.ml

A place to discuss the news and latest developments on the open-source browser Firefox
