Forget VRAM — these 5 GPU trends are way more disturbing



The real implications of insufficient VRAM have finally started to rear their heads over the last two years. Graphics cards with 8GB of VRAM simply can't handle the latest titles, sometimes even at 1080p. However, a lack of VRAM might not be the worst problem plaguing GPUs right now. Ever since the launch of Nvidia's RTX 50 series and AMD's RX 9000 series, multiple disturbing trends have surfaced that paint a bleak future for graphics cards. These dark trends might ruin your GPU upgrade dream before VRAM even gets a chance.

Falling generational gains

The end of the line

The fact that the current generation of GPUs barely feels like an upgrade over the last one isn't lost on the community. Nvidia marketed the RTX 50 series with tall performance claims, but they fell flat once independent benchmarks arrived. Rasterization performance got only a modest bump on most Blackwell GPUs, justifiably prompting many to call this generation a glorified refresh. AMD's RX 9000 series may have delivered on the company's promises around ray tracing and upscaling, but the raw performance gains over similarly priced RX 7000 cards aren't anything to write home about.

It almost seems as if GPU manufacturers have hit the end of the line when it comes to squeezing out more performance by shrinking transistors. Without radical innovation in GPU technology, even the next-gen architectures from Nvidia and AMD (and Intel) might follow the same path. Upgrading your GPU could become genuinely pointless if gamers have barely anything to gain by doing so.

Hardware requirements outpacing GPU innovation

Maybe we don't need more realism in games

Disappointing generational gains are only one part of the equation; games are becoming more demanding with nearly every major release. This relentless rise in hardware requirements is another blow to gamers' ability to make a GPU last for 4–5 years. If you can't expect reasonable performance in the most demanding games within four years of buying a high-end card, what even is the point? Dumping $800–$1,000 into a graphics card and still feeling shortchanged is a damning indictment of the PC hardware industry.

One of the biggest wrinkles with rising hardware requirements is that very few games launch with a respectable level of optimization. Leaning on crutches like upscaling and frame generation just to edge past 60 FPS is becoming the norm rather than the exception. Unsustainable GPU requirements combined with subpar optimization are a killing blow for gaming PCs without flagship hardware. Maybe it's time to stop making games ever harder to run and instead adapt to the slower pace of GPU advancement.
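If you want a feel for just how big a shortcut upscaling is, here's a quick back-of-the-envelope Python sketch. The scale factors are the commonly cited ones for the usual quality presets, not vendor-guaranteed numbers, so treat the output as a rough estimate.

```python
# Rough estimate of how many pixels the GPU actually renders per frame
# under common upscaler presets. Scale factors are approximate,
# commonly cited values, not vendor-guaranteed numbers.

PRESETS = {
    "Native": 1.00,
    "Quality": 0.67,
    "Balanced": 0.58,
    "Performance": 0.50,
}

def internal_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    """Return the approximate internal render resolution per axis."""
    return round(width * scale), round(height * scale)

target = (3840, 2160)  # 4K output
for name, scale in PRESETS.items():
    w, h = internal_resolution(*target, scale)
    pixel_share = (w * h) / (target[0] * target[1])
    print(f"{name:<12} renders ~{w}x{h} ({pixel_share:.0%} of the output pixels)")
```

Rendering only a quarter to half of the output pixels is a massive discount on GPU work, which is exactly why leaning on it just to scrape past 60 FPS papers over poor optimization.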

Over-reliance on AI-powered frames

Fake frames are just that

To counter the falling rate of hardware improvements, GPU manufacturers have turned to AI-generated performance. AI might have been at play since the days of the RTX 20 series (DLSS 1.0), but this trend truly took shape when Nvidia fully committed to marketing AI-generated frames as real gains during the RTX 50 launch. AMD and Intel followed suit with FSR 4 and XeSS 2.1, respectively, by incorporating frame generation and more machine learning techniques into their pipelines.

Using software to bridge the gaps left by hardware is fine, but companies shouldn't deceive consumers about the nature of the innovation. AI-generated frames are not equivalent to conventionally rendered frames, which becomes especially apparent in the jump from 2x to 4x frame generation. Frame generation also needs a base framerate that many gaming PCs can't reach in the first place, so the systems that would benefit most from extra smoothness have the least use for it. The current trajectory of GPU manufacturers seems AI-first, rather than using AI to complement real innovation at the hardware level.
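To put the base-framerate problem in numbers, here's a simplified Python sketch comparing what the FPS counter shows against how often a real frame is actually rendered. The model ignores generation overhead and latency-reduction tricks, and the framerates are purely illustrative.

```python
# Simplified model of frame generation: displayed FPS scales with the
# generation factor, but responsiveness is still tied to the rendered
# (base) framerate. Numbers below are hypothetical examples.

def frame_gen(base_fps: float, factor: int) -> dict:
    """Estimate displayed FPS and real frame pacing for a given
    base framerate and frame-generation factor (2x, 3x, 4x...)."""
    displayed_fps = base_fps * factor       # what the FPS counter shows
    base_frametime_ms = 1000 / base_fps     # how often a *real* frame arrives
    return {
        "displayed_fps": displayed_fps,
        "real_frametime_ms": round(base_frametime_ms, 1),
    }

# A struggling card at 30 FPS vs. a card that already hits 60 FPS natively.
for base in (30, 60):
    for factor in (2, 4):
        stats = frame_gen(base, factor)
        print(f"base {base} FPS x{factor}: "
              f"{stats['displayed_fps']:.0f} FPS shown, "
              f"new frame rendered every {stats['real_frametime_ms']} ms")
```

Even at 4x, a 30 FPS base still means a new rendered frame only every ~33 ms, which is why frame generation feels best on the systems that needed it least.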

GPU companies are stealing from you

In case this is news to you, Nvidia has been quietly eroding the value of its GPUs by slashing the amount of silicon you get for the same (or even more) money. I'm referring to the share of CUDA cores and memory bandwidth on 60, 70, and 80-class cards relative to the flagship of each generation. For instance, the RTX 4070 and RTX 5070 feature only around 36% and 28% of the CUDA cores of the RTX 4090 and RTX 5090, respectively, whereas that figure hovered around 55% up to the RTX 30 series.

AMD has also cut down core counts: the RX 7800 XT offers only about 70% of the cores of the RX 7900 XT, whereas the RX 6800 XT packed close to 90% of the cores of the RX 6900 XT. This has been the norm since 2022; just like in any other industry, shrinkflation is well and truly here in the world of gaming GPUs. If GPU companies are quietly stealing from consumers and rendering GPU nomenclature meaningless, how is the average buyer supposed to make the right purchase decision?
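The arithmetic behind those percentages is easy to check for yourself. The short Python sketch below uses the publicly listed shader counts for these cards (treat them as approximate and double-check before quoting them), and the trend it prints speaks for itself.

```python
# Flagship-relative shader counts for a few 70-class and 800-class cards.
# Core counts are the publicly listed figures; the point is the trend,
# not the exact values.

CARDS = {
    # (card, flagship): (card cores, flagship cores)
    ("RTX 3070", "RTX 3090"): (5888, 10496),
    ("RTX 4070", "RTX 4090"): (5888, 16384),
    ("RTX 5070", "RTX 5090"): (6144, 21760),
    ("RX 6800 XT", "RX 6900 XT"): (4608, 5120),
    ("RX 7800 XT", "RX 7900 XT"): (3840, 5376),
}

for (card, flagship), (cores, flagship_cores) in CARDS.items():
    share = cores / flagship_cores
    print(f"{card:<11} has {share:.0%} of the {flagship}'s shader cores")
```

Going from roughly 56% of the flagship's cores with the RTX 3070 to 28% with the RTX 5070 is hard to write off as a naming coincidence.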

Growing supply issues and unrealistic MSRPs

Something has to give

GPU scarcity and inflated prices are common at every launch, but things have escalated to a whole new level. When the RTX 50 series launched, RTX 5080s and RTX 5090s were going for two or three times their MSRP, and no one could do anything about it. Paper launches have become normal, and that's the scariest part of all of this. GPU companies are comfortable announcing products that never make it to store shelves or online marketplaces. We saw this with the RX 9000 series, and even with non-GPU products like the Ryzen 7 9800X3D.

Things do eventually return to normal, but only 6–8 months after launch, which is an insanely long time to wait for the privilege of buying a new GPU. Stock shortages give rise to inflated prices, but GPU companies share the blame. They should either ensure stock availability or stop announcing MSRPs they can't honor. The age of buying a GPU at MSRP might well be behind us, at least for the first few months after launch.

Will 2027 turn the tide for GPUs?

Next-gen GPUs most likely won't arrive before 2027, but will things really change with the RTX 60 series and whatever AMD ends up calling its next-gen cards? Most indicators point to no, but we might get another RTX 30-style generation; one is overdue anyway. If the RTX 50 Super refresh ends up being a disappointment, 2027 is when I'm targeting a GPU upgrade and finally retiring my once-great RTX 3080.
