This reminds me of this Cyanide and Happiness short: https://youtube.com/watch?v=W09ltuxt3rI
Yoko, and Shinobu, and, er… 🤔
The people of Israel live! Glory to Ukraine! 🇺🇦 ❤️ 🇮🇱
Typical Xonotic players: https://www.youtube.com/watch?v=K8JD2g4N9JA
This would be a meme by itself:
lacks some cheese IMO
Let them fight among themselves and prove time and time again that patents are idiotic and hinder innovation.
I think it’s already removed? I checked by sorting by New and there’s nothing right now, unless you mean another community?
Yup, they already forced Google to announce that they’ll add such a choice screen for the search engine and web browser on Android: https://www.neowin.net/news/google-will-add-new-search-and-browser-choice-screens-for-android-phones-in-europe/
It’s only a matter of time before Microsoft does so too.
ollama should be much easier to set up!
ROCm is decent right now: I can do deep learning stuff and CUDA-style programming with it on an AMD APU. However, ollama doesn’t work out-of-the-box with APUs yet, but users seem to say that it works with dedicated AMD GPUs.
As for Mixtral 8x7B, I couldn’t run it on a system with 32GB of RAM and an RTX 2070S with 8GB of VRAM; I’ll probably try with another system soon. [EDIT: I actually got the default version (mixtral:instruct) running with 32GB of RAM and 8GB of VRAM (RTX 2070S).] That same system also runs CodeLlama-34B fine.
So far I’m happy with Mistral 7b, it’s extremely fast on my RTX 2070S, and it’s not really slow when running in CPU-mode on an AMD Ryzen 7. Its speed is okayish (~1 token/sec) when I try it in CPU-mode on an old Thinkpad T480 with an 8th gen i5 CPU.
PSA: give open-source LLMs a try, folks. If you’re on Linux or macOS, ollama makes it incredibly easy to try most of the popular open-source LLMs like Mistral 7B, Mixtral 8x7B, CodeLlama, etc. Obviously it’s faster if you have a CUDA/ROCm-capable GPU, but it still works in CPU-mode too (albeit slowly if the model is huge), provided you have enough RAM.
You can combine that with a UI like ollama-webui or a text-based UI like oterm.
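For anyone who wants to try this, here’s a rough sketch of the workflow described above. It assumes a recent ollama release on Linux or macOS; the install one-liner and model tags are the ones ollama documents, but check ollama.com for the current versions before running anything:

```shell
# Install ollama (Linux/macOS; see ollama.com for the official script)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with Mistral 7B (downloads a few GB on first run)
ollama run mistral

# The other models mentioned above; the bigger ones need a lot of
# RAM/VRAM, as noted in the comments (e.g. 32GB RAM + 8GB VRAM
# was enough for mixtral:instruct in one report)
ollama run mixtral:instruct
ollama run codellama:34b

# List what you have downloaded locally
ollama list
```

Without a CUDA/ROCm-capable GPU, ollama falls back to CPU inference automatically, so the same commands work either way, just slower.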
Hmm, I don’t think it’s because of that feature, because it only runs when you explicitly ask it to translate a page for you. You should probably check your extensions and see if you have some redundant ones (a mistake people make is using multiple ad-blockers/anti-trackers, when just uBlock Origin + Firefox’s defaults are usually good enough).
Yup, Firefox has it: https://browser.mt/ (it’s now a native part of Firefox)
Microsoft really wants someone to remind it of these days:
I’d usually alternate between true neutral and neutral evil
RedReader gets barely a glance in a single sentence. A single dev (with users contributing PRs) has maintained one of the best, and least known, apps for over a decade now.
RedReader is definitely a gem. Incredible app that still works despite the Reddit appocalypse.
The tone may be a bit harsh but it’s muuuuch better than how he used to be during his most toxic days. This is how he used to talk: https://www.networkworld.com/article/706908/security-torvalds-to-bad-security-devs-kill-yourself-now.html
Linus definitely got much better at handling his anger since his public apology in 2018.
But for a moment I was like: wow, 100 FPS in software rendering!
Thank you, that was exactly my point.
In the 2000s we had AdSense. So now we’re getting… AISense?