I run Ollama and Automatic1111 on my desktop whenever it's powered on.
Open WebUI runs always-on in my homelab, and it's also connected to OpenRouter.
This way I can always use Open WebUI with OpenRouter models, which is pretty cheap per query and a little more private than using a big-tech chatbot. And if I want fully local, I turn on the desktop and have local Llama and Stable Diffusion.
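For anyone wanting to replicate the setup: a minimal sketch of the always-on piece as a docker-compose service. This assumes Open WebUI's documented `OPENAI_API_BASE_URL` / `OPENAI_API_KEY` env vars (OpenRouter exposes an OpenAI-compatible API at `openrouter.ai/api/v1`) and `OLLAMA_BASE_URL` pointing at the desktop when it's up — adjust hostnames and ports for your own network.

```yaml
# docker-compose.yml — always-on Open WebUI in the homelab,
# fronting both OpenRouter (cloud) and the desktop's Ollama (local).
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # OpenRouter's OpenAI-compatible endpoint; key from your OpenRouter account
      OPENAI_API_BASE_URL: "https://openrouter.ai/api/v1"
      OPENAI_API_KEY: "${OPENROUTER_API_KEY}"
      # Hypothetical LAN hostname for the desktop running Ollama;
      # models just show as offline when the desktop is powered down
      OLLAMA_BASE_URL: "http://desktop.lan:11434"
    volumes:
      - open-webui:/app/backend/data
    restart: unless-stopped

volumes:
  open-webui:
```

The nice part of this split is that the homelab box only runs the lightweight web UI, so the GPU-hungry stuff stays on the desktop and cloud queries keep working regardless.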
I also get bugger all benefit out of it; it's a cute toy.