Luckily, I have local AI.
And you should too!
What for? I can’t think of a single problem I have in my life where the answer is AI.
With these RAM prices?
I'd rather live without AI.
I run it on the hardware I have, the data stays with me, offline. I run it on hardware I already have for other purposes, and I even have a portable solar panel (but I use the plug socket).
Doesn't AI need like 96 GB of RAM to be comparable in quality (or lack thereof, depending on how you view it) to the commercial options?
Qwen3 30B A3B, for example, is brilliant for its size and I can run it on my 8 GB VRAM + 32 GB RAM system at like 20 tokens per second. For lower-powered systems, Qwen3 4B + a search tool is also insanely great for its size and can fit in less than 3 GB of RAM or VRAM at Q5 quantization.
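If you want to sanity-check those memory figures yourself, here's a rough back-of-envelope estimate: weights at Q5 take about 5 bits per parameter, so size ≈ params × 5 / 8 bytes. This ignores KV cache, context, and runtime overhead, so treat it as a lower bound, not an exact number:

```python
def quantized_weight_size_gb(n_params: float, bits_per_param: float) -> float:
    """Rough size of quantized model weights in GB (1 GB = 1e9 bytes).

    Ignores KV cache, activations, and runtime overhead, so the
    real footprint is somewhat higher than this estimate.
    """
    return n_params * bits_per_param / 8 / 1e9

# Qwen3 4B at Q5: ~2.5 GB of weights, consistent with "less than 3 GB"
print(f"4B @ Q5:  {quantized_weight_size_gb(4e9, 5):.2f} GB")   # 2.50 GB

# Qwen3 30B A3B at Q5: ~18.75 GB, which is why it needs VRAM + system RAM
print(f"30B @ Q5: {quantized_weight_size_gb(30e9, 5):.2f} GB")  # 18.75 GB
```

The second number also shows why the 30B model still runs fast: only ~3B parameters are active per token (the "A3B" part), so most of the weights can sit in slower system RAM while the GPU handles the active experts.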