I haven’t thought about it in a while, but the premise of the article rings true. Desktops are, overall, disposable. GPU generations are only really significant alongside new CPU generations, and CPUs are the same: real performance gains need a new chipset and motherboard. At that point you are replacing the whole system.
Is there a platform that challenges that trend?
Edit: Good points were made. There is a lot to disagree with in the article, especially when focused on gaming.
Storage: For the love of your data, storage is a WEAR component, especially HDDs. Up until recently storage was so cheap it was crazy not to get new drives every few years.
Power supplies: Just because the computer still boots doesn’t mean the power supply is still good. A PSU will continue to shove power into your system long past its ability to provide clean power. Scope and test an older PSU before you put it in a new build.
I have been ship of theseusing my desktop and server for 15 years. This article is fucking stupid.
Aye.
And OP is doubling down.
This is categorically untrue with the latest generations of chipsets, CPUs and GPUs. Just look at AMD instead of Intel: AM4/5 cross-compatibility, DDR4/DDR5 combined support and so on.
If anything, today is the day when you can upgrade beyond your current gen hardware component by component.
What are you on about? Which AM5 CPU can you drop in an AM4 mobo?
🎺"The upgrade argument for desktops doesn’t stand up anymore" 🎺
of course, you can still…
hum… well, you can also…
yeah, yeah, you can do that also… but…
…and so going on.
Honestly, most people just upgrade the GPU and SSD; after 10-15 years they buy a new desktop. Also, one of the biggest reasons to get a desktop is that it is cheaper than a laptop, lasts longer, and you can change any part that breaks. I had many laptops where one component basically made the entire device useless; if it were a desktop it could easily have been fixed. Soldered RAM, for example.
This isn’t against desktops. It’s against the idea that a desktop is significantly more future proof than another form factor.
The previous comment gives a pretty clear argument for why desktops are more future proof, I think. Being more repairable is a pretty big deal for the longevity of the whole system.
Not sure what “future proof” means, but my PC still has its original case from Windows Vista times, has seen 2 mobo replacements, 1 PSU replacement, and I don’t even know how many hard drive / SSD additions / swaps. RAM extensions too. Used to have a GPU but after the 2nd mobo/CPU replacement I dropped it.
Different screens, keyboards, and mice.
None of this would have easily been possible on a laptop.
In a world where hardware is getting more expensive again you are really sending the wrong message here.
Not to speak of environmental impact & consumerism.
Your history sounds exactly like the spiral of component replacement that is being discussed. It sounds like you replaced everything multiple times, but just kept the case.
Separately, part by part. If they had a laptop they would have needed to buy at least 6 complete laptops by that time, or, more realistically, give up on upgrades.
Disposable my ass. I just did the final upgrades to my AM4 platform to be my main rig for the next 5 years. After that it will get a storage upgrade and become a NAS and do other server stuff. This computer 7 years in has another 15 left in it.
Yeah, it’s crazy that someone could have gotten, like, a Ryzen 5 1600 then upgraded to a 5800X3D around 5 years later without needing to buy a new motherboard, which usually can mean having to buy a new set of RAM too.
For a long time, back when Intel was dominant, doing a whole new build just to upgrade to a newer CPU used to be the thing.
Yeah, I usually over-spec when I build my main rig because I want it to last, and to repurpose it later down the road. I finally retired a power supply that I bought back in the mid 2000s. It can’t power modern cards anymore, unfortunately. 🫡 PC Power and Cooling single rail, take a break. You’ve earned it.
The manufacturing of consent to move your machine to the cloud has begun. We had a good run lads.
You are literally the only person saying that out of this whole exchange.
"This persistent narrative in the media trying to talk consumers out of desktops as being viable options kind of sneakily ties into the greater “you will own nothing and you will be happy” narrative being pushed by big tech.
It’s really obvious and it needs to be consistently called out for what it is."
Literally the most upvoted comment in the linked article.
I guess some frogs are just too stupid to figure out they’re being slow-boiled, and it’s up to us to carry the dead weight out of the pan…
CPUs are the same: real performance gains need a new chipset and motherboard. At that point you are replacing the whole system.
I find the quoted statement untrue. You still have all peripherals, including the screen, the PSU, and the case.
You can replace components as and when it becomes necessary.
You can add hard drives, instead of replacing a smaller one with a larger one.
Desktop mobos are usually more upgradeable with RAM than laptops.
There are probably more arguments that speak against the gist of this article.
All of the peripherals will carry over to any new system. With USB-C, basically all you need to run in your case is a GPU and NVMe storage.
Throw in Thunderbolt and networking, and even HDD-based DAS won’t be bottlenecked.
Yeah, desktops can have more RAM than laptops, and that is the one case where a desktop can really shine. Even then, the RAM ceiling is usually high enough that most people never hit it.
Laptop CPUs are crippled garbage compared to desktop CPUs of the same generation. So there’s that.
The importance of open & interchangeable hardware and software goes way beyond the upgrades you may or may not make, or even saving money & reducing e-waste.
You get better products that way. Having complete control over your system benefits you even if you never exercise that control. It is literally a constraint on enshittification.
AMD challenges that trend, but the article writer dismisses them because of Intel’s market share.
Terrible article.
This might be true for Intel, I don’t know; I use AMD. I know the limits of my CPU/GPU pairing. I bought the affordable low-end GPU for the CPU, and in 5 years I’ll upgrade to the upper-end GPU when it’s really cheap. 5 years after that, I’ll get a new computer.
My last GPU? $300.
One before that? $300.
Next one? $300.
(buy used)
I don’t agree with this article. Everyone I know usually upgrades their GPU until the CPU is bottlenecking it heavily and that is only the case after a few GPU upgrades.
Yeah, and when the CPU is the bottleneck, upgrading the CPU, mobo, and RAM but not the GPU.
This time, though, I only upgraded the CPU, since AM4 supported multiple generations of CPUs. One of the best things to happen to PC building.
Yes, desktop PCs challenge that trend. If you’re not chasing the newest of the new, you can keep using your old stuff till it dies. I’ve done one CPU upgrade, and a GPU upgrade, to my desktop in the eight years I’ve owned it, and it handles all of my games fine.
If you’re changing the motherboard, you’ll usually need a new CPU, and sometimes new RAM. As long as your mobo has a PCI/PCIe slot you can shove your old graphics card in there. Unless there’s a new RAM version, you don’t need to replace the RAM, and SATA’s been the standard storage connector for how long now?
Unless you’re going above your current PSU’s rating that thing’s good until it’s dead.
I just don’t see how this argument holds up. If your motherboard is old enough that they no longer make your CPU/RAM socket, and you’re looking to upgrade, chances are very good that thing’s lived far longer than most laptops would be expected to.

But like. When I built my current desktop 8 years ago, it had 8GB of RAM, a graphics card I don’t remember, a Pentium G-something processor, and about 1TB of storage. It has an i7 (don’t remember the generation off hand), an R9 290, 32GB of RAM, and 7TB of storage now. Same motherboard. If I replace it I will need a new processor and new RAM (the RAM is actively dying, so I haven’t been using it much), but these parts are all nearly a decade old, with the exception of the RAM. Well. One RAM stick is 8 years old, but that’s beside the point.
This just doesn’t line up with my own personal experience?
Unless you’re going above your current PSU’s rating that thing’s good until it’s dead.
Power supplies will work well past the point of providing clean, in-spec power on each rail. Lots of parts in a power supply can stop working properly before it physically stops passing power.
Unless the PSU is relatively new, it’s not a great idea to put it into a new build without testing that it is still in spec on each voltage rail under load.
I wasn’t talking about a new build at any point in my comment.
This post is about new builds, upgrades and using old parts with new ones.
This post, to my reading, is about how desktop PCs are disposable, and my comment is providing evidence to the contrary
The title of this article just doesn’t match reality. It really only (maybe) applies to very high end systems that are already pushing the limits of all components. Most people don’t have the money to waste on that and have plenty of room to upgrade their hardware for a looong time.
If you don’t need much (e.g. no gaming, 3D rendering, etc.), and especially if you don’t need a dedicated GPU, then you can upgrade for at least a decade before running into issues. To be fair, a laptop should last a decade as well in that case, but at a higher price and while being less repairable.