• @[email protected]
    link
    fedilink
    English
    12 days ago

Tonight, I installed Open WebUI to see what sort of performance I could get out of it.

My entire homelab is a single N100 mini, so it was a bit of a squeeze to add even Gemma3n:e2b onto it.

It did something. Free ChatGPT gives better performance, as long as I remember to use placeholder variables. At least for my use case: vibe coding compose.yamls and acting as a rubber duck/level 0 tech support for troubleshooting. But it did something. I'll probably re-test when I upgrade to 32 GB of RAM, then nuke the LXC and wait until I have a beefier host.
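
    For anyone curious what that stack looks like, here's a minimal sketch of a compose.yaml running Open WebUI against Ollama on a single host. The image tags, port, and `OLLAMA_BASE_URL` variable match what the two projects document, but treat the whole thing as an assumption and check upstream before deploying:

    ```yaml
    # Hypothetical minimal stack: Open WebUI fronting Ollama on one box.
    # Verify image tags and env vars against current upstream docs.
    services:
      ollama:
        image: ollama/ollama:latest
        volumes:
          - ollama:/root/.ollama      # model storage (Gemma3n:e2b lives here)
        restart: unless-stopped
      open-webui:
        image: ghcr.io/open-webui/open-webui:main
        ports:
          - "3000:8080"               # UI on host port 3000
        environment:
          - OLLAMA_BASE_URL=http://ollama:11434
        volumes:
          - open-webui:/app/backend/data
        depends_on:
          - ollama
        restart: unless-stopped
    volumes:
      ollama:
      open-webui:
    ```

    On a RAM-starved host like an N100 mini, keeping Ollama in its own service makes it easy to stop the whole stack when you need the memory back.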

    • @[email protected]
      link
      fedilink
      English
      26 hours ago

      it was a bit of a squeeze to add even Gemma3n:e2b onto it

      statements dreamed up by the utterly deranged

    • @[email protected]
      link
      fedilink
      English
      62 days ago

      case in point: you jacked off all night over your local model and still got a disappointing result