Idiotic tariffs, indifferent retailers, depraved flippers and AI mania are making the simple act of buying a graphics card the defining misery of PC gaming in 2025.
I’m having a good time on a laptop with no fancy graphics card and have no desire to buy one.
I also do not look for super high graphical fidelity, play mostly indies instead of AAA, and am like 5 years behind the industry, mostly buying old gems on sale, so my tastes probably enable this strategy as much as anything else.
Modern high-end iGPUs (e.g. AMD Strix Halo) are going to start replacing dGPUs in the entry-level and mid-range segments.
I’ll be honest, I have never paid attention to GPUs and I don’t understand what your comment is trying to say or (this feels selfish to say) how it applies to me and my comment. Is this intended to mostly be a reply to me, or something to help others reading the thread?
Your laptop uses an iGPU. The “i” stands for integrated, as it’s built into the same package as the CPU.
The alternative, a dGPU, is a discrete part, separate from other components.
They’re saying that your situation is becoming increasingly common. People can do the gaming they want to without a dGPU more easily as time goes by.
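If you're curious which kind your own machine has, on Linux one quick check is to list the PCI display devices. A minimal sketch, assuming Python and the standard lspci tool (from pciutils) are available:

```python
# Minimal sketch (Linux only): list the display adapters the system exposes.
# An iGPU typically shows up as an Intel/AMD "VGA compatible controller";
# a dGPU appears as a second entry, often listed as a "3D controller" on laptops.
import subprocess

out = subprocess.run(["lspci"], capture_output=True, text=True, check=True).stdout
for line in out.splitlines():
    if "VGA compatible controller" in line or "3D controller" in line:
        print(line)
```

If only one entry shows up, you're gaming on the iGPU alone.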
Thank you for explaining! I am not sure why people are reacting badly to my statement. Is knowledge of GPUs something every gamer is expected to have, and am I violating the social contract by being clueless?
Well at one point to be a computer gamer you basically needed to put together your own desktop PC.
Integrated GPUs were basically only capable of displaying a desktop, not anything a game would need, and desktop CPUs generally didn’t integrate graphics at all.
So computer-building knowledge was a given. If you were a PC gamer, you had a custom computer for the purpose.
As a result, even as integrated GPUs became better and more capable, the general crowd of gamers didn’t trust them, because it was common knowledge they sucked.
It’s a lot like how older people go “They didn’t teach you CURSIVE?” about schools nowadays. Being a gamer and being a PC builder are fully separable now, but they learned PC building back when the two weren’t, and therefore think you should have that knowledge, too.
It’s fine, don’t sweat it. You’re not missing out on anything, really, anyway. Especially given the current GPU situation, it’s never been a worse time to be a PC builder or enthusiast.
Oh boy. Thanks for the context, by the way! I did not know that about the history of PC gaming.
I did learn cursive, but I have been playing games on laptops since I was little too and was never told I had to learn PC building. And to be completely honest, although knowledge is good, I am very uninterested in doing that especially since I have an object that serves my needs.
I have the perspective to realize that I have been on the “other side” of the WHAT DO YOU MEAN YOU’RE SATISFIED, LEARN MORE AND CHANGE TO BE LIKE US side, although I’m exaggerating because I don’t actually push others to take on my decisions. I don’t spam the uninterested to come to Linux, but I do want people who get their needs adequately served by Windows to jump to Linux anyways because I want to see Windows 11, with even more forced telemetry and shoved-in AI and things just made worse, fail. Even though that would actually be more work for satisfied Windows users.
But I would not downvote a happy Windows user for not wanting to switch, and that kind of behavior is frowned upon. So is it just more acceptable to be outwardly disapproving of those who do not know about GPUs and are satisfied with what they have, with zero desire to upgrade? Do I not have Sufficient Gamer Cred, and am I being shown the “not a Real Gamer” door? I think my comment was civil and polite, so I really don’t understand the disapproval. If it is just “not a Real Gamer” I’ll let it roll off my back, though I did think the Gaming community on Lemmy was better than that… I would understand the reaction if I rolled up to c/GPUs with “I don’t care about this :)” and got downvoted. Is Gaming secretly kind of also c/GPUs and I just did not know that?
Okay I literally just realized it is probably because I hopped on a thread about GPUs and do not know about the topic being posted about. Whoops. Sorry.
Laptops (and desktops) with no dedicated GPU will become increasingly viable, and not just for older games. This was a general comment. :)
Depending on what happens with GPUs for datacenters, discrete GPUs might become so rare that nobody bothers with them anymore.
My impression right now is that, for nVidia, gamer cards are an afterthought. Millions of gamers can’t compete with every company in Silicon Valley building entire datacenters stacked with as many “GPUs” as they can find.
AMD isn’t the main choice for datacenter CPUs or GPUs. Maybe for them, gamers will be a focus, and there are some real advantages with APUs. For example, you’re not stuck with one particular amount of GPU RAM and a different amount of CPU RAM. Because you’re not multitasking as much when gaming, you need less CPU RAM, so you can dedicate more RAM to games and less to other apps. So, you can have the best of both worlds: tons of system RAM when you’re browsing websites and have a thousand tabs open, then start a game and you have gobs of RAM dedicated to the game.
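To put rough numbers on that flexibility, here is an illustrative sketch with made-up figures, comparing a unified pool (where the split can shift per workload) with the fixed system-RAM/VRAM split of a dGPU build:

```python
# Illustrative sketch with hypothetical numbers: why a unified memory pool
# (APU) is more flexible than a fixed system-RAM/VRAM split (CPU + dGPU).
TOTAL_GB = 32  # hypothetical unified pool in an APU system

def split(gpu_gb):
    """Return (cpu_gb, gpu_gb) for a unified pool of TOTAL_GB."""
    return TOTAL_GB - gpu_gb, gpu_gb

# Browsing with a thousand tabs open: give nearly everything to the CPU side.
print("browsing:", split(gpu_gb=4))    # -> (28, 4)

# Gaming: the game needs comparatively little CPU RAM, so hand most of the
# pool to the GPU side instead.
print("gaming:  ", split(gpu_gb=24))   # -> (8, 24)

# A dGPU build with 16 GB system RAM + 8 GB VRAM is stuck at (16, 8) either way.
```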
It’s probably also more efficient to have one enormous cooler for a combined GPU and CPU vs. a GPU with one set of heatsinks and fans and a separate CPU heatsink and fan.
Discrete GPUs are also a pain in the ass to manage. They’re getting bigger and heavier, and they take up more and more space in your case. Not to mention the problems their power draw is causing.
If I could get equivalent system performance with an APU vs. a separate CPU and GPU, I’d probably go for it, even with the upgradeability concerns. OTOH, soldered-in RAM is not appealing: I’ve upgraded my RAM more often than any other component on my PCs, and having to buy a whole new motherboard just to get a RAM upgrade puts me off.
Thank you for explaining!
Same here; I have never owned a graphics card in my life. When I occasionally do want to play a modern game, it doesn’t need to be 200 FPS RTX.
I’ve been using mini PCs with integrated graphics (and one with a laptop-class GPU) instead of desktops and see no reason to stop.