

Firefox has been slowly ramping up AI integration and data collection/telemetry. Fortunately there are plenty of forks that rip all that out; LibreWolf and IronFox are good examples.
The truth is Mozilla is not profitable as a company, and FF would have died a long time ago if Google didn't pay up to keep them alive and help stave off monopoly-busting suits. In a worse timeline we don't even have a modern FF to fork from, or uBlock, in 2025.








It depends on how powerful and fast you want your model. Yeah, a 500B-parameter model running at 20 tokens per second is gonna require an expensive GPU cluster.
If you don't happen to have PewDiePie levels of cash lying around but still want in on local AI, all you need is one powerful GPU inside any desktop with a reasonably fast CPU. A used 24GB 3090 was around $700 USD last I checked on eBay, and we'll say another $100 for an upgraded power supply to run it. Many people have an old desktop just lying around in the basement, but even an entry-level iBUYPOWER should be no more than $500. So realistically it's more like $1500-2000 USD to get you into comfy hobbyist status. I make my piece-of-shit 10-year-old 1070 Ti 8GB work running 8-32B quant models. I've heard people say 70B is a really good sweet spot, and that's totally attainable without a $15k investment.
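If you want to ballpark what fits on a given card, the napkin math is roughly weights ≈ parameters × bits-per-weight ÷ 8, plus some headroom for the KV cache and activations. Here's a rough sketch in Python; the exact numbers depend on your runtime, quant format, and context length, so treat it as an estimate, not gospel:

```python
# Napkin-math VRAM estimate for running a quantized local LLM.
# Real usage varies with runtime (llama.cpp, vLLM, etc.), context
# length, and KV-cache settings; the overhead term is a hand-wavy guess.

def estimate_vram_gb(params_billion: float, bits_per_weight: float, overhead_gb: float = 1.5) -> float:
    """Approximate VRAM to hold the weights plus a small fixed overhead."""
    weight_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits ~= 1 GB
    return weight_gb + overhead_gb

for name, params in [("8B", 8), ("32B", 32), ("70B", 70)]:
    need = estimate_vram_gb(params, 4.5)  # ~Q4 quant, about 4-5 bits per weight
    print(f"{name}: ~{need:.0f} GB at Q4 | fits 8GB 1070 Ti: {need <= 8} | fits 24GB 3090: {need <= 24}")
```

By that math a 70B at ~Q4 wants around 40GB, so you're looking at two 24GB cards or partial CPU offload, which is still nowhere near $15k.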