Performance – Nvidia does not want the x80 anywhere near the x90
Now focusing on the RTX 4090 for a brief moment
So for clarity, the RTX 3090 at release had roughly 21% more CUDA cores than the RTX 3080. The RTX 4090 will have 68% more CUDA cores than the RTX 4080 16GB and 113% more CUDA cores than the RTX 4080 12GB. Nvidia simply does not want the x80 cards anywhere near the x90 card this time around. Outside of gaming, the RTX 3080 was great for certain workloads and professional applications that could utilize CUDA\Optix without drawing a lot of wattage. Nvidia has also left themselves plenty of room to counter anything AMD tries with their RTX 4000 “Ti” series.
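As a quick sanity check, the core-count gaps above can be reproduced from the publicly listed CUDA core counts for each card (a minimal Python sketch; the counts below are the launch specs):

```python
# Publicly listed CUDA core counts at announcement/launch.
cores = {
    "RTX 3080": 8704,
    "RTX 3090": 10496,
    "RTX 4080 12GB": 7680,
    "RTX 4080 16GB": 9728,
    "RTX 4090": 16384,
}

def pct_more(a: str, b: str) -> float:
    """How many percent more CUDA cores card `a` has than card `b`."""
    return (cores[a] / cores[b] - 1) * 100

print(f"3090 vs 3080:      {pct_more('RTX 3090', 'RTX 3080'):.0f}% more cores")
print(f"4090 vs 4080 16GB: {pct_more('RTX 4090', 'RTX 4080 16GB'):.0f}% more cores")
print(f"4090 vs 4080 12GB: {pct_more('RTX 4090', 'RTX 4080 12GB'):.0f}% more cores")
```

Running this prints roughly 21%, 68% and 113%, which is exactly the gulf described above.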
x80 vs x90 – Ampere then Ada
When the RTX 3080 was being compared to the RTX 3090 there wasn’t a huge difference in specifications. Overall the RTX 3080 was roughly 20% below the RTX 3090 across all major specs, and the theoretical performance numbers were roughly 21% slower than the 3090 as well. Overclocking the RTX 3080’s memory could definitely push the performance numbers closer to a stock RTX 3090. I could easily overclock the RTX 3080’s GDDR6X memory and push the throughput to 840 GB\s, which is only about 10% less than the RTX 3090’s stock memory throughput of 936 GB\s. So for workloads that depend heavily on Optix\CUDA cores and memory throughput, you were getting very nice performance for the price in your professional apps or specific workloads. Speaking of prices, we all know how the prices went with the Nvidia 3000 series and the AMD 6000 series. It was worth calling the RTX 3080 a “flagship”. From my early speculation, theoretical performance will be roughly 70% faster on the RTX 4090 over the RTX 4080 16GB, but we will have to wait for benchmarks. Things get far worse for the RTX 4080 12GB: the RTX 4090 will be over 100% faster than the RTX 4080 12GB. So it’s not even close to the 20% difference we had with the RTX 3090 vs 3080.
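The bandwidth numbers above fall out of a simple formula: bandwidth (GB\s) = effective data rate per pin (Gbit\s) × bus width (bits) ÷ 8. A minimal sketch, assuming the stock rates of 19 Gbps (3080) and 19.5 Gbps (3090), and treating my 840 GB\s result as a +2 Gbps memory overclock:

```python
# GDDR6X bandwidth = data rate per pin (Gbit/s) * bus width (bits) / 8 bits-per-byte.
def bandwidth_gbs(rate_gbps: float, bus_bits: int) -> float:
    return rate_gbps * bus_bits / 8

stock_3080 = bandwidth_gbs(19.0, 320)   # 760 GB/s at stock
oc_3080    = bandwidth_gbs(21.0, 320)   # 840 GB/s with a +2 Gbps memory OC
stock_3090 = bandwidth_gbs(19.5, 384)   # 936 GB/s at stock

deficit = (1 - oc_3080 / stock_3090) * 100
print(f"OC'd 3080: {oc_3080:.0f} GB/s, only {deficit:.0f}% below a stock 3090")
```

The wider 384-bit bus is why the 3090 keeps a lead even against an overclocked 3080.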
Memory Bus Width
Nvidia’s Decisions Have Given Us An Interesting Predicament
Nvidia has established the ‘x80’ card as the flagship that is usually followed up with the x80 "Ti" version the following year. This time around it feels as if the x80 is no longer the flagship and is severely limited. The 4080 12GB definitely feels more like an ‘x70’ tier based on the low performance numbers compared to the RTX 4090. Nvidia has placed the company in an interesting position in the eyes of enthusiasts and gamers, but it has all come from Nvidia’s own decisions. Nvidia first introduced the Titan in 2013 with their 700 series; prior to the 700 series, the x90 was the top dual-GPU card for prosumers who wanted to game and create content. Nvidia moved away from the dual-GPU x90 cards and started the successful Titan brand with the 700 series. The Titan took over the role as the top enthusiast and content creator GPU, replacing the x90 tier from the GTX 700 series until the RTX 2000 series; however, the Titan RTX released at professional and enterprise market prices ($2,499), so we can’t really count that one. So for the 2000 series the RTX 2080 ‘Ti’ was the top-of-the-line card for enthusiasts. Multiple 3080 releases over the past few years have been very confusing due to Nvidia changing core counts, VRAM and other specifications while using the same tier name. Luckily this time we are getting the multiple 4080 variants upfront on day one.
RTX 4000 - RTX 3000 Specifications
Traditional Price Difference Between the x80 & x90
For most of recent history the x80 has been anywhere from $200 to $800 (MSRP) less expensive than the x90. From the GTX 500 series to the RTX 3000 series, the average MSRP difference between the x80 and the x90\Titan\2080 Ti is $443. The two largest price differences between the x80 and x90 were the 600 series ($600) and the 3000 series ($800). If we remove those two generations, that average drops to $340. To repeat what I stated earlier: the 2080 Ti price was used in the calculation because the 2000 series did not have an x90 variant; the Titan RTX MSRP was $2,499, aimed at an entirely different market, so I did not include that price. So with the RTX 4000 series the price increase falls in line with what we normally see if we compare the 4080 16GB to the 4090 ($400 difference in MSRP), but this number jumps to $700 if you compare the 4080 12GB to the 4090. Nvidia decided to cover the entire spread of price differences from the past decade or so with the RTX 4080\90 on release day this time around. Instead of simply making one 4080 flagship, Nvidia is trying to fill in the price gaps from the last generation and the incoming 7000 series from AMD, while leaving themselves and AIBs a large cushion to price their GPUs accordingly (4080 OCs \ RTX 4080 Ti).
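The averages above check out arithmetically. A minimal sketch using only the figures quoted in this section, and assuming seven desktop generations from GTX 500 to RTX 3000 (500, 600, 700, 900, 1000, 2000, 3000; the desktop lineup skipped an 800 series):

```python
# Figures quoted in the text above.
GENERATIONS = 7          # GTX 500 -> RTX 3000, no desktop 800 series
AVG_ALL = 443            # average x80-to-x90 MSRP gap across all 7 generations
OUTLIERS = [600, 800]    # the 600-series and 3000-series gaps

total = AVG_ALL * GENERATIONS
avg_without_outliers = (total - sum(OUTLIERS)) / (GENERATIONS - len(OUTLIERS))
print(f"Average gap excluding the two outliers: ${avg_without_outliers:.0f}")

# The 4000-series gaps at announcement MSRPs:
print(f"4090 vs 4080 16GB: ${1599 - 1199}")   # $400
print(f"4090 vs 4080 12GB: ${1599 - 899}")    # $700
```

Dropping the two outlier generations from a $443 seven-generation average lands right at the ~$340 figure, and the two 4000-series gaps bracket that historical range on day one.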
4080 Price Outpaces x80 Prices From Previous Generations
x80 Sticker Shock
Some enthusiasts might have an issue with Nvidia using the x80 branding on GPUs that are clearly inferior by a large margin to the x90. The other issue is that the prices are not what we would typically pay for an x80 GPU. The average price for a GTX\RTX x80 GPU over the past 12 years has been $599.99. So revealing that the 4080 16GB and the 4080 12GB will sell for $1,199 and $899, respectively, is a large increase for gamers. This is especially true given the fact that we have reached record inflation this year. Nvidia has slowly increased x80 prices over the years with very few decreases, and at this point you won’t see many (or any) decreases moving forward. The 2000 and 3000 series saw the release of the RTX 2080\3080 for $699 (MSRP). So seeing a $200 (4080 12GB) to $500 (4080 16GB) increase for an x80 GPU is pretty tough for the average consumer, especially knowing that it is clearly not the flagship and isn’t anywhere near the x90 this time around. Most people would probably just wait and see how the 4080 “Ti” performs when it releases. One thing we must not forget is that many people had no problem spending $1,200 to $1,800+ for an RTX 3080\3090 during the pandemic and shortages. So while it is easy to blame Nvidia, we cannot act like gamers & enthusiasts didn’t have a hand in this. There were a lot of things that happened, and are happening right now, that will dictate how companies react to the market.
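To put the sticker shock in relative terms, here is a minimal sketch using only the MSRPs quoted above (the $599.99 twelve-year x80 average and the $699 2080\3080 launch price):

```python
AVG_X80 = 599.99      # ~12-year average x80 launch MSRP, per the text above
PREV_X80 = 699        # RTX 2080 / RTX 3080 launch MSRP

for name, msrp in [("RTX 4080 12GB", 899), ("RTX 4080 16GB", 1199)]:
    over_prev = msrp - PREV_X80                 # dollar jump vs the last two x80s
    over_avg = (msrp / AVG_X80 - 1) * 100      # percent above the historical average
    print(f"{name}: +${over_prev} vs the 3080, {over_avg:.0f}% above the x80 average")
```

In other words, the 4080 12GB sits roughly 50% above the historical x80 average and the 4080 16GB roughly double it, which is why the branding argument stings so much.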
RTX 4080 12GB doesn’t use the same GPU die as the RTX 4080 16GB
The RTX 4090 uses the xx102 (AD102) GPU die, while the RTX 4080 16GB uses the smaller xx103 (AD103) die. The final straw for some gamers and enthusiasts would be the RTX 4080 12GB’s xx104 GPU die. Using the RTX 3000 series as a reference, the 3080 used the same GPU die as the 3090 (obviously the x80 die is cut down from the x90 die), while the RTX 3070 used the xx104 (GA104) GPU die. This time around with the 4000 series, Nvidia is releasing the RTX 4080 12GB with the xx104 (AD104) GPU die, which is usually reserved for the x70 GPUs (RTX 3070\2070). Nvidia is selling a GPU named as x80 tier while it uses an x70 GPU die (xx104), yet pricing it much higher than previous x80 GPUs, as I explained earlier. This wouldn’t be the first time Nvidia has tried this and gotten away with it. Several years ago Nvidia released the GTX 1080 using the xx104 die, the same GPU die the GTX 1070 used (the 1070 obviously had lower specs, lower SM count, etc.). Ten months later the GTX 1080 “Ti” was released with the larger xx102 GPU die. I remember having a few conversations about this on a small scale in the enthusiast communities, but it did nothing to stop GTX 1080s (xx104) from flying off the shelves.