Full uBlock Origin is a mixed bag on mobile because it eats battery/performance, and (if you add all the same filter sources) integrated blockers like Orion’s are just about the same anyway.
Oh heck yeah. I’ve been using it on iOS a ton, and dying for this on Windows/Linux.
Fun trivia: what browser supports HEIF, JPEG XL, AVIF, and AV1, all with correctly rendered HDR?
Not Chrome. And not Firefox, nor anything based on them I’ve tried: https://caniuse.com/?search=image+format


I like Windows 11. But only as a thoroughly neutered, disposable “secondary” OS to dual boot with Linux, to the extent that I could wipe my Windows partition without a care.
If I had to use Windows 11 as my only OS, I’d pull my hair out. Same with desktop Linux TBH. There’s stuff that’s just painful in both ecosystems.


Apple’s media support is incredible.
I have one platform where HDR photo/video playback and editing, JPEG XL, HEIFs from my camera, and such all just work. And it’s definitely not my KDE desktop, nor Windows 11.


Puuuuurge



Yeah, probably. I actually have no idea what they charge, so I’d have to ask.
It’d be worth it for a 3090 though, no question.


This doesn’t make any sense, especially the 2x 3090 example. I’ve run my 3090 at PCIe 3.0 over a riser, and there’s only one niche app where it ever made any difference. I’ve seen plenty of benches showing PCIe 4.0 is just fine for a 5090:
https://gamersnexus.net/gpus/nvidia-rtx-5090-pcie-50-vs-40-vs-30-x16-scaling-benchmarks
A single 5090 (PCIe 5.0 x16) uses the same net bandwidth, and half the PCIe lanes, as 2x 3090 (PCIe 4.0 x16 each).
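Rough math, using approximate per-lane PCIe rates (after encoding overhead; the numbers here are mine, not from the linked bench):

```python
# Approximate usable bandwidth per PCIe lane, in GB/s, by generation.
per_lane_gbps = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

one_5090  = per_lane_gbps["5.0"] * 16       # 1x PCIe 5.0 x16 slot
two_3090s = per_lane_gbps["4.0"] * 16 * 2   # 2x PCIe 4.0 x16 slots, 32 lanes total

print(f"1x 5090: ~{one_5090:.0f} GB/s over 16 lanes")
print(f"2x 3090: ~{two_3090s:.0f} GB/s over 32 lanes")
```

Both land at roughly 63 GB/s.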
Storage is, to my knowledge, always on a separate bus from the graphics card, so that also doesn’t make any sense.
My literally ancient TX750 still worked fine with my 3090, though it has since been moved. I’m just going to throttle any GPU that uses more than 420W anyway, as that’s ridiculous and past the point of diminishing returns.
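For reference, capping the power is one command with the NVIDIA driver tools; a minimal sketch via Python’s subprocess, assuming nvidia-smi is on the PATH (420 W is just my own cutoff, and the value has to be within the card’s supported range):

```python
import subprocess

# Set the board power limit to 420 W with nvidia-smi's -pl flag.
# Needs root, and the setting doesn't persist across reboots.
subprocess.run(["nvidia-smi", "-pl", "420"], check=True)
```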
And if you are buying a 5090… a newer CPU platform is like a drop in the bucket.
I hate to be critical, and there are potential issues, like severe CPU bottlenecking or even missing instruction-set support. But… I don’t really follow where you’re going with the other stuff.


That’s a huge generalization, and it depends on what you use your system for. Some people might be on an old Threadripper workstation that works fine, for instance, and just slap in a second GPU. Or maybe someone needs more cores for work; they can just swap their CPU out. Maybe your 4K gaming system can make do with an older CPU.
I upgraded RAM and storage just before the RAMpocalypse, and that’s not possible on many laptops. And I can stuff a whole bunch of SSDs into the case and use them all at once.
I’d also argue that ATX desktops are more protected from anti-consumer behavior, like soldered price-gouged SSDs, planned obsolescence, or a long list of things you see Apple do.
…That being said, there’s a lot of trends going against people, especially for gaming:
There’s “initial build FOMO” where buyers max out their platform at the start, even if that’s financially unwise and they miss out on sales/deals.
We just went from DDR4 to DDR5, on top of some questionable segmentation from AMD/Intel. So yeah, sockets aren’t the longest lived.
Time gaps between generations are growing as silicon gets more expensive to design.
…Buyers are collectively stupid and bandwagon. See: the crazy sales of low-end Nvidia GPUs when buyers have every reason to get AMD/Intel/used Nvidia instead. So they’re rewarding bad behavior from companies.
Individual parts are more repairable. If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.
You can still keep your PSU, case, CPU heatsink, storage and such. It’s a drop in the bucket cost-wise, but it’s not nothing.
IMO things would be a lot better if GPUs were socketable, with LPCAMM on a motherboard.


Yeah, that’d be great. Peltiers would be awesome and everywhere if they were dirt cheap.


So what malware got shipped?


Awesome, thanks for the info and source.
Yeah, most of my frustration came from JXL/AVIF/HEIF and how Linux/Windows browsers, KDE, and Windows 11 don’t seem to support them well. Not a fan of packing HDR into 8 bits with WebP/JPG, especially with their artifacts, though I haven’t messed with PNG yet.


Also, we haven’t even got HDR figured out.
I’m still struggling to export some of my older RAWs to HDR. Heck, Lemmy doesn’t support JPEG XL, AVIF, TIFF, HEIF, none of it, so I couldn’t even post them here anyway. And even then, they’d probably only render right in Safari.


8K is theoretically good as “spare resolution,” for instance running variable resolution in games and scaling everything to it, displaying photos with less scaling for better sharpness, clearer text rendering, less flickering, stuff like that.
It’s not worth paying for. Mostly. But maybe some day it will be cheap enough to just “include” with little extra cost, kinda like how 4K TVs or 1440p monitors are cheap now.
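The scaling math is the nice part; a quick sketch with standard resolutions (my own arithmetic):

```python
# Pixel counts and integer-scaling ratios for common resolutions vs. 8K.
res_1440p = (2560, 1440)
res_4k    = (3840, 2160)
res_8k    = (7680, 4320)

print(res_8k[0] * res_8k[1] / (res_4k[0] * res_4k[1]))     # 4.0 -> a 4K frame maps to a clean 2x2 block of 8K pixels
print(res_8k[0] / res_1440p[0], res_8k[1] / res_1440p[1])  # 3.0 3.0 -> 1440p integer-scales 3x3 onto 8K too
```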


…I actually wouldn’t be against this.
But it isn’t even genuine. They’ve forked llama.cpp into a broken clone, like about 500 other corporations have, instead of just contributing to shit that actually works and is used, and… that’s about it.
That’s about par for the AI industry.
I hear this all the time.
Yet when I bring up features that don’t work at all on X because it’s ancient, it’s “no, that’s superfluous. No one needs that.”
I used to do this, but literally just switched to discrete Nvidia yesterday.
Zero issues so far. TBH it actually fixed issues I had with HDR and video decoding on my AMD IGP.


Who fucking cares?
Credit card companies.
And their ad buyers, maybe.


Yeah, I’m not against the idea philosophically. Especially for security. I love the idea of containerized isolation.
But in reality, I can see exactly how much disk space and RAM and CPU and bandwidth they take, heh. Maintainers just can’t help themselves.
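Case in point, this is roughly how I tally it up; a rough sketch assuming Flatpak/Snap-style install locations (the paths are an assumption, adjust for your distro):

```python
import os

# Assumed locations where containerized apps/runtimes live -- adjust for your setup.
CANDIDATES = ["/var/lib/flatpak", "/var/lib/snapd/snaps"]

def dir_size_bytes(path: str) -> int:
    """Sum the sizes of all regular files under path (symlinks skipped)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            if not os.path.islink(full):
                total += os.path.getsize(full)
    return total

for path in CANDIDATES:
    if os.path.isdir(path):
        print(f"{path}: {dir_size_bytes(path) / 1e9:.1f} GB")
```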


The latter sounds very plausible.
I think they meant background transcoding while using the browser.
I don’t even want to speculate on what’s going wrong there, heh. But I can definitely see that being a quirk.