The line between helpful tech and quiet surveillance is blurring — and our devices no longer feel fully under our control.
Because soulless ghouls can only pretend to be human.
The poor user experience is intentional. Compare FireTV to AppleTV. Everything about FireTV is carefully designed to coerce you into spending money. Easy access to the content you already have doesn’t make money, so the UX serves Amazon, not you. Apple does it, too, but with more subtlety.
Have the day you paid for.
I installed Lubuntu on my Microsoft Surface 2 and my custom PC from 2014 that couldn’t get upgraded to Windows 11 due to lack of a TPM chip. We don’t need better hardware, we need better operating systems. We need more Linux.
We need more real Linux – GNU/Linux, with compliant copyleft licensing – not Tivoized crap like they put on TVs.
Roku OS, Amazon Fire OS, Tizen (Samsung TV OS), etc. – all technically Linux, but you wouldn’t know it because they’ve systematically butchered them to destroy everything that made Linux good (the users’ freedom).
What’s the point of being so pedantic? They were obviously not advocating for more Roku installs.
Because the distinction matters. The corporate raping of Linux has to stop being tolerated or else nothing is solved. The technical details of the kernel don’t actually matter; the licensing and openness is what matters. Hell, if the Windows NT kernel got magically relicensed to AGPLv3 tomorrow it would instantly become the superior option just because of that.
Linux doesn’t fucking matter. Copyleft matters.
Sigh. This kind of nonsense is why so many people get scared away from even trying linux. Who cares which distro you use, as long as it is linux it is a step in the right direction, and a whole lot of people (including myself) have taken that step very recently, despite some arrogant linux bros doing their best to gatekeep us away from even trying.
People just need to install lubuntu or some Linux distribution on their pc. Tech companies for years have forced consumer upgrades for average pc users when it wasn’t necessary.
There’s a photo company in my town that still runs DOS off of Windows 95, has internet for email on Windows 2000, and whose point-of-sale machines are all DOS. Even dot matrix printers. I was born in 1984 and remember this. Shows you don’t need the latest tech.
That’s great for the folks who have access to decades-old pre-enshittification technology and the means to maintain it, but what about everybody else?
Continuing my smart TV OS analogy, your answer is like saying just to use a dumb TV instead. There aren’t any dumb TVs anymore! The TV manufacturer cartel colluded to quit making them!
“Just go live in the fucking woods like the goddamn Unabomber, eschewing modern technology” is not a valid solution for normal people! The law must be changed to protect them from predatory abusive corporations.
I use my TV as a secondary display on my desktop and run anything I want to watch from it.
Continuing my smart TV OS analogy, your answer is like saying just to use a dumb TV instead. There aren’t any dumb TVs anymore! The TV manufacturer cartel colluded to quit making them!
Yes there are: every smart TV becomes dumb as soon as you disconnect it from the internet. Just use it the same way you would use a monitor for your computer.
A modern TV without an internet connection is a dumb TV.
Eh, the printers should be swapped for laserjet to save money and ears. I don’t even know where one could buy paper for dot matrix printers either.
I’ve been considering using my phone only for tethering, and doing anything on the go on an ultraportable Linux laptop. If anyone is doing this already, I’d love to hear about your experience.
I tether my GOS tablet. I currently don’t use a notebook privately, only a desktop.
You need a generous data plan, or you have to hold off on system updates until you’re on WiFi.
I’m working towards something like that. I’m hoping to ultimately drop the smartphone altogether, and I’ve set my current phone’s end of life (2027ish?) as the goal.
I think the other thing that’s necessary to keep the same sense of connectedness is a device to receive notifications, and I have an open source smartwatch I want to program for that. I’ve been working on a notification server too (kind of like Gotify), but at the moment it’s a work in progress.
I’m no tech expert and I haven’t done this for a while, so I don’t know if this has changed, but there was more packet loss/errors (not sure of the proper terms, I’m not a native English speaker). For most files this isn’t an issue, but it was for more sensitive ones like programs/ISOs…
The battery also suffered more from this use; keeping the phone charged while tethering wasn’t good due to the battery management system. But things could have changed.
Last point is that bad weather does affect cellphone reception.
Do any cell phone plans allow for unlimited Hotspot data? That’s my largest issue with doing that, I use more than 50GB every single month.
Huh
Maybe it’s different in other countries, but why would there be a different allowance for tethered/hotspot data?
Surely unlimited means unlimited and it makes no difference whether the ones and zeros go to a phone or something connected to it?
I’ve never had any problems
people are experiencing innovation fatigue
What innovation? The user experience hasn’t undergone significant innovation (improvement) in the last decade.
It’s enshittification fatigue, not innovation.
Innovative data collection for the shareholders so the line goes up!
Don’t forget all the innovative ways they’ve found to make it harder to repair “your” device.
Exactly. I almost feel like many are hungry for something new and different. So much so, that you give them something completely useless like an AI widget, and they are willing to accept it to scratch an innovation itch.
Forced ‘innovation’, see:
- Windows 8/10/11
- Gnome 3
There’s no forced Gnome 3 (and it’s not been called that for a long time either), because you choose to install it, have the freedom to install anything else you want, and can customise it infinitely if you so choose.
Besides, Gnome is great. Maybe you don’t like it, but it seems odd to say that the way Linus Torvalds uses Linux is the “wrong” way to use Linux.
There has kind of been an increase in things being accessible and usable by the standard user, where previously they would need to be quite savvy or know a programming language.
But, yeah, I can’t think of much else. Not user-facing tech anyway. Just the usual insignificant increments and a bunch of bullshit no one asked for, that no one actually ends up using, but everyone has to pay for.
I think smartphones are an excellent example. Most people wouldn’t notice the differences between a second-hand $150 Samsung Galaxy from five years ago, and the latest flagship for 10× the price. The innovation is almost entirely unnoticeable.
The only difference is the lack of security updates and the latest Android, which turns phones into e-waste long before the end of the hardware’s useful life.
In many cases that accessibility is a full-on neutered replacement for a previous system that offered more user control and customizability, removing options from power users, so one man’s progress is another man’s step backwards.
As someone with a second hand Galaxy from seven years ago, yeah there’s not really much difference. Newer phones are slightly more annoying to use, actually.
You don’t even really own the hardware if it’s not user-repairable, customizable, or upgradable.
how big would a gpu need to be to be user repairable lmao
I’m talking at a device level, not a component level; think MacBook vs. Framework laptop.
Repairability isn’t about the physical realities of executing the repair - that’s a user end problem to be solved and people are often eager to tackle those.
It’s about the manufacturer not being allowed to explicitly make design decisions that make it intentionally harder to do so than is strictly necessary as a side effect of the basic design.
This is where the Linux and self hosting people chime in.
I get tut-tutted by other Linux nerds for this a lot, but I think Linux is impersonal in a different way because it simply demands more of the user. Sure, it gives freedom, but that freedom comes with responsibility, and a lot of people just are like “ain’t nobody got time for that!” Which I think is a valid way to feel.
tut-tut
I’ve been a developer for decades. I’ve contributed to FOSS code and do a lot of my own development.
I just want a desktop that works. No fuss.
Yes, I could compile my own X11 (and have), but I would rather spend my time doing my own shit than trying to stand up a new VM for some edge issue I’m having.
Just…just give me a UI I can use.
It’s why I use Ubuntu.
Linux has come a long way though, and it’s basically turnkey for some distros, even with Flatpak and software catalogs built into the GUI.
Self hosting doesn’t make you immune, though. See how Plex evolved, for example. Self hosting plus free software that isn’t abandoned or compromised is the way, but idealistic developers need to put bread on the table too.
So maybe the way is self-hosted + libre software + a non-profit supporting the project. And even that can be corrupted; see, for example, the Mozilla Foundation and Google’s influence on it.
Always be ready to migrate.
Always be ready to migrate.
or be ready to contribute to the project you use, so they don’t have to sell out to google.
This is why permissive licensing isn’t good enough; copyleft is essential. (And not just GPLv2 copyleft, but copyleft with anti-tivoization and cloud loophole protection as well, such as AGPLv3.) Every part of the system – the tech itself, the management, and the legal/business structure – has to be designed to resist being subverted against the user.
Oh yeah linux people have been building like crazy these past 10 years.
Sometimes the user experience is so slick it’s boring. But the great part of Linux is that even when the usage is simple, I can always tweak it or modify it to my exact liking.
On Mac it either works nicely or I’m fucked.
They’ve been really holding back until now.
I have a computer capable of outputting video like 5 different ways: over the internet, near-field EM, HDMI, yadda
I just want a fucking standards compatible dumb screen
I heard a talk a few days ago, and the fella said that if you want a non-smart monitor, you’ll need to pay somewhat more for what he called an ‘industrial monitor’. He said the ‘smart TV’ is cheaper because of all the data it’ll collect, and they can sell that data to make the price-to-the-user lower. (Don’t know for myself, my old Samsung monitor’s only smarts were to send data out to one URL, and I was able to change that URL to a site that doesn’t exist.)
The supreme irony of that message coming from Windows Central…
I mean, tech innovation has been stale for a long time, even with hardware. Remember how the CPU market was before Ryzen? Completely dead; Intel was sitting on it’s morals, doing nothing because they owned the market 10 to 1. But even now that I’ve got my i7-10700, I don’t see any point in upgrading.
Software side? It’s a mess; companies will always be greedy. Just today I wanted to upscale something with the cloud, because my PC is great for 90% of the things I want to do, but upscaling is not one of those. And guess what: Topaz asks for credits in order to use their servers, yes, CREDITS, so I said bye bye. I’ve also said bye bye to Adobe and moved on with DaVinci Resolve.
bye bye to Adobe and moved on with DaVinci Resolve.
This is the way. I skipped Adobe entirely due to how they conduct business. I really wish Resolve had better Linux support though. Like, it works and I use it, but having to use a third party tool (make resolve deb) is ridiculous.
Additionally, GIMP is just not on the level of Photoshop, at least from what I understand; I’ve never used Photoshop. I mostly long for smart select tools where I can, for example, just circle a person and have them selected. Also, content-aware fill would be incredibly nice to have. Of course, neither of those things is worth shoveling money out of my wallet into Adobe’s.
Affinity Photo 3 is also free and it’s kind of like Photoshop, I haven’t tried it yet but I’ve heard good things
sitting on it’s morals
Assuming that’s not a typo, the phrase is “sitting on its laurels”.
And they definitely have no morals
For upscaling, check out chaiNNer: https://github.com/chaiNNer-org/chaiNNer
And openmodeldb: https://openmodeldb.info/
It’s quite possible your PC can do it with Vulkan just fine, and if not, you can rent something online pretty cheap.
Also, for video processing, if you know any Python check out vapoursynth.
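If you want a rough idea of what a VapourSynth script looks like, here’s a minimal sketch. It only does a plain (non-AI) 2× Lanczos resample to show the plumbing, and it assumes you have the ffms2 source plugin installed and a file called input.mkv; swap in whatever source filter and file you actually use.

```python
# upscale.vpy -- minimal VapourSynth sketch.
# Assumes the ffms2 source plugin is installed; "input.mkv" is a placeholder path.
import vapoursynth as vs

core = vs.core

clip = core.ffms2.Source("input.mkv")  # decode the source video

# Plain 2x Lanczos resample -- not AI upscaling, just enough to verify the pipeline works.
clip = core.resize.Lanczos(clip, width=clip.width * 2, height=clip.height * 2)

clip.set_output()  # expose the result so vspipe can read it
```

You’d then pipe it into an encoder with something like `vspipe -c y4m upscale.vpy - | ffmpeg -i - output.mkv` (exact flags depend on your vspipe and ffmpeg versions).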
This highlights the problem: the primary obstacle to escaping a lot of software enshittification is accessibility and discoverability.
Once something like Topaz or Premiere gets SEO, it starves all the other cool efforts out there as they get buried under spam.
What people want often exists. They just don’t know it, or it’s too technically demanding to set up and no one is “in the middle” packaging enthusiast experiments to be accessible.
I don’t know how to solve this either. The open internet is getting worse, niches are moving to Discord, and it feels like people are losing the patience to really dig for cool stuff. Heck, I see some open source efforts spin up, with thousands of man hours dumped in, without even a cursory check to see whether what they want already exists and is looking for contributors.
Honestly, this is more complicated; I’m looking for something that just works. Maybe I’ll set it up later. And yeah, I’ve got an AMD GPU, so upscaling is much, much slower.
See: https://openmodeldb.info/docs/faq
The GUIs are basically just as plug-and-play as Topaz.
If it’s about AI, no, it’s not innovation at all.