I haven't thought about it in a while, but the premise of the article rings true. Desktops are, overall, disposable. GPU generations are only really significant alongside new CPU generations. CPUs are the same: real performance gains need a new chipset and motherboard. At that point you are replacing the whole system.
Is there a platform that challenges that trend?
AMD challenges that trend, but the article writer dismisses them because of Intel’s market share.
Terrible article.
I don't agree with this article. Everyone I know keeps upgrading their GPU until the CPU is heavily bottlenecking it, and that usually only happens after a few GPU upgrades.
Disposable my ass. I just did the final upgrades to my AM4 platform, which will be my main rig for the next 5 years. After that it will get a storage upgrade and become a NAS and do other server stuff. This computer, 7 years in, has another 15 left in it.
Everything in this post is wrong, actually. But if you buy shit parts to build your desktop, you’ll have a shitty desktop.
The simple answer is at the motherboard level: look at your motherboard's future expansion capability, and if you started with a good foundation you can do years of upgrades. Your computer case also needs to be big enough to fit extra stuff; full ATX motherboard size is great.
For example, I have a VR gaming rig that runs VR games well on DDR3 RAM and a Sandy Bridge CPU, because it has a decent modern GPU and enough CPU cores + RAM.
I have been Ship-of-Theseus-ing my desktop and server for 15 years. This article is fucking stupid.
Laptop CPUs are crippled garbage compared to desktop CPUs of the same generation. So there’s that.
CPUs are the same: real performance gains need a new chipset and motherboard. At that point you are replacing the whole system.
I find the quoted statement untrue. You still keep all your peripherals, including the screen, as well as the PSU and the case.
You can replace components as and when it becomes necessary.
You can add more hard drives instead of replacing a smaller one with a larger one.
Desktop mobos are usually more upgradeable with RAM than laptops.
There are probably more arguments that speak against the gist of this article.
Everything is disposable. I don't think you or the author of that article has a clue. It's a matter of getting things that will last longer than others and making financially wise purchasing decisions based on the needs of the moment.
Like, I'm not spending $5 on a toothbrush when you need to replace it every 30 days; I buy the cheapest toothbrush I can find, since they're all made about the same. But I will spend more on a computer component if I feel it will meaningfully improve my whole system. Replacing my entire system would set me back a lot and waste the components already inside that are still good. Plus, if I decide to sell the old system, I'm not going to get good value back.
The only thing I’ve yet to replace is the case. Why? Because it’s still serviceable to me.
I just don't get this stupid logic where you have to replace the entire system. For what? Just to keep up with the in-crowd of current technology trends? No thanks, I'll build my PC based on what I want out of it.
That's a huge generalization, and it depends on what you use your system for. Some people might be on old Threadripper workstations that work fine, for instance, and can just slap in a second GPU. Or maybe someone needs more cores for work; they can just swap their CPU out. Maybe your 4K gaming system can make do with an older CPU.
I upgraded RAM and storage just before the RAMpocalypse, and that’s not possible on many laptops. And I can stuff a whole bunch of SSDs into the body and use them all at once.
I’d also argue that ATX desktops are more protected from anti-consumer behavior, like soldered price-gouged SSDs, planned obsolescence, or a long list of things you see Apple do.
…That being said, there are a lot of trends working against people, especially for gaming:
- There's "initial build FOMO" where buyers max out their platform at the start, even if that's financially unwise and they miss out on sales/deals.
- We just went from DDR4 to DDR5, on top of some questionable segmentation from AMD/Intel. So yeah, sockets aren't the longest-lived.
- Time gaps between generations are growing as silicon gets more expensive to design.
- …Buyers are collectively stupid and bandwagon. See: the crazy low-end Nvidia GPU sales when they have every reason to buy AMD/Intel/used Nvidia instead. So they are rewarding bad behavior from companies.
- Individual parts are more repairable. If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.
- You can still keep your PSU, case, CPU cooler, storage and such. It's a drop in the bucket cost-wise, but it's not nothing.
IMO things would be a lot better if GPUs were socketable, with LPCAMM on a motherboard.
If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.
While throwing out working things is terrible, the cost of servicing a motherboard often outpaces the cost of replacing it. They can still charge you 200 dollars and tell you the board can't be fixed, right? I think the right balance is to use the warranty period, try to troubleshoot it yourself, and then call it a day, unless you have a 400+ dollar motherboard.
Meanwhile I've been using an AM4 board and DDR4 for… well, it's been a while now.
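On the repair-vs-replace point above, the deciding factor is basically an expected-value calculation. Here's a minimal sketch in Python with made-up numbers; the bench fee, success odds, and board prices are all hypothetical assumptions, not figures from the thread:

```python
# Rough repair-vs-replace check for a dead motherboard.
# Every dollar figure and probability below is a hypothetical assumption.

def expected_repair_cost(bench_fee: float, parts_if_fixed: float,
                         p_fixed: float, replacement_price: float) -> float:
    """Expected total spend if you pay for a repair attempt first and
    buy a replacement board only when the repair fails."""
    return bench_fee + p_fixed * parts_if_fixed + (1 - p_fixed) * replacement_price

# $200 bench fee, $50 in parts if fixable, 50/50 odds, $180 budget board:
print(expected_repair_cost(200, 50, 0.5, 180))  # 315.0 -> just buy the new board
# Same odds, but with a $450 high-end board the attempt roughly breaks even:
print(expected_repair_cost(200, 50, 0.5, 450))  # 450.0
```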
Personally I still prefer the desktop because I can choose exactly where I want performance and where I can make some tradeoffs. Parts are also easier to replace when they fail, making the machine more sustainable. You don't have that choice with a laptop, since it's all prebuilt.
Desktops also offer better heat dissipation and easier peripheral replacement, extending the life of the unit. Frankly, it can be difficult for most folks to replace even a laptop display or battery nowadays.
Let’s say that you’ve just significantly upgraded your GPU. If you were getting the most out of your CPU with your previous GPU, there’s a good chance that your new GPU will be held back by that older component. So now, you need a new CPU or some percentage of your new GPU’s performance is wasted.
There’s always an imbalance. It doesn’t mean it’s “wasted”. CPU and GPU do different things.
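To put a rough number on "wasted", you can model the bottleneck with a simple max() over per-frame CPU and GPU times. This is just a back-of-envelope sketch with hypothetical frame times, ignoring how real engines pipeline CPU and GPU work:

```python
# Crude bottleneck model: frame rate is capped by whichever of the CPU or GPU
# takes longer per frame. All millisecond figures below are hypothetical.

def effective_fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower of the two stages sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

def gpu_idle_fraction(cpu_ms: float, gpu_ms: float) -> float:
    """Fraction of each frame the GPU spends waiting (0 when GPU-bound)."""
    frame = max(cpu_ms, gpu_ms)
    return (frame - gpu_ms) / frame

# Old GPU (16 ms) with a 10 ms CPU: GPU-bound, ~62 FPS, no GPU idle time.
print(effective_fps(10, 16), gpu_idle_fraction(10, 16))
# New GPU (6 ms) with the same CPU: CPU-bound at ~100 FPS, GPU idle ~40% of the time.
print(effective_fps(10, 6), gpu_idle_fraction(10, 6))
```

Even in the second case the upgrade still nearly doubled the frame rate, which is why some idle GPU time isn't the same thing as a wasted purchase.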
except, getting a new CPU that’s worth the upgrade usually means getting a new motherboard
Also not true. AM4 came out in 2016 and they are still making modern processors for it.
Generational performance increases are too small
Wrong again.
Ask yourself this: how much of your current desktop computer has components from your PC from five years ago?
Most of it.
They’re also ignoring the concept of repairability. If my CPU dies? Buy another CPU. Maybe upgrade at the same time. CPU dies in your PS5? Fuck you, better throw the whole thing away and buy a new one.
Let’s say that you’ve just significantly upgraded your GPU. If you were getting the most out of your CPU with your previous GPU, there’s a good chance that your new GPU will be held back by that older component. So now, you need a new CPU or some percentage of your new GPU’s performance is wasted. Except, getting a new CPU that’s worth the upgrade usually means getting a new motherboard, which might also require new RAM, and so on.
This guy’s friends should keep him away from computers and just give him an iPad to play with.
I disagree that you need to upgrade your CPU and GPU in lockstep. I almost always stagger those upgrades. Sure, I might have some degree of bottleneck, but it's pretty minimal tbh.
I also think it's a bit funny the article mentions upgrading every generation. I've never done that, and I don't know a single person who does. Maybe I'm just too poor to hang with the rich fucks, but the idea of upgrading every generation was always stupid.
Repairability is a big deal too. If my GPU dies I can just replace that one card rather than buy an entire new laptop, since laptops tend to have everything soldered down.
I typically build a whole new PC and then do a mid-life GPU upgrade after a couple of generations. For example, I just upgraded the GPU I bought in late 2020. For most users there just isn't a good reason to upgrade your CPU that frequently.
I can see why some people would upgrade their GPU every generation. I was surprised at how much even two-generation-old cards are going for on eBay; if you buy a new card and sell your old one every couple of years, the "net cost per year" of usage is pretty constant.
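The arithmetic behind that "net cost per year" is just purchase price minus resale value, divided by the years you kept the card. A minimal sketch with made-up prices (actual resale values obviously vary):

```python
# Back-of-envelope "net cost per year" for the buy-new/sell-old upgrade cycle.
# All prices below are hypothetical examples, not real market data.

def net_cost_per_year(purchase_price: float, resale_price: float,
                      years_kept: float) -> float:
    """Out-of-pocket cost per year after recouping the resale value."""
    return (purchase_price - resale_price) / years_kept

# Upgrade every 2 years: buy a ~$600 card, sell it for ~$350 later.
print(net_cost_per_year(600, 350, 2))  # 125.0 per year
# Keep a ~$600 card for 5 years and sell it for ~$100 at the end.
print(net_cost_per_year(600, 100, 5))  # 100.0 per year
```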
The main benefit of a desktop is the price/performance ratio, which is higher because you're trading space and portability for easier thermal management and bigger components.