RX Vega 64 Power Consumption
So far the Radeon RX Vega 64 LC has been a great upgrade over my R9 Fury X LC, but there are a few concerns. However, those concerns can easily be addressed, as I will explain below. Power consumption has always been one of AMD’s Achilles’ heels on both the CPU and GPU side. AMD has largely resolved its CPU power consumption, mostly thanks to smaller fabrication nodes and outside foundry help. GPU power consumption is much better than it used to be, but AMD is still known to push the wattage above what is truly needed. That is why undervolting became such a big thing for AMD GPUs over the years: it allowed users to save money on their electricity bill and worry less about heat output. Undervolting naturally led to underclocking for stability reasons, but most people want the “full” performance of the GPU boost clocks, so the usual approach is to undervolt to a certain point while leaving the core clock alone. With the Vega architecture AMD has focused on providing many built-in and manually accessible power saving features.
Radeon Tuning Controls and Presets
AMD's Radeon Software has three “Tuning Control” settings for the RX Vega 64: Automatic, Preset and Manual. Within the “Automatic” and “Preset” settings AMD provides an “Auto Tuning” and a “Tuning Preset” menu respectively. The “Manual” Tuning Control obviously has no preset menu, since you will be tuning the GPU, HBM, fan and power limits yourself.
For lower wattage usage, the “Preset” Tuning Control with the “Power Saving” preset is the best setup for daily use and most gaming sessions, as the charts throughout this part of the article will show.
As you can see above, the RX Vega 64 typically pulls only 3 watts during daily desktop usage (from the GPU chip only). For all of the benchmarks in this article I used the “Automatic” & “Default” settings to give a normal representation of the performance. This mode runs the GPU exactly as you would expect it out of the box, and of course you won’t get any massive power savings. However, this is where my problems began with this otherwise awesome GPU. While benchmarking Shadow of the Tomb Raider (internal benchmark tool) I watched my entire gaming rig (overclocked CPU/RAM/motherboard, etc.) hit a peak of 673 watts. That is simply ridiculous, but remember that I was running a decently overclocked CPU/RAM/motherboard and my rig isn’t the most power efficient to begin with, being 12 years old (Intel X58). During my Hitman 2 benchmark I hit a peak of 611 watts. Once again, that is crazy high. Keep in mind that these are peaks from my entire gaming rig, so you won’t be pulling that many watts on average, but it is still worth noting. I would also point out that these peak wattage results come from running unrealistically stressful in-game benchmark tools and do not represent actual gameplay scenarios; the card won’t need to run at full speed in every area or scene of most games. The RX Vega 64 LC reports pulling 265 watts (GPU chip) at stock settings with no power saving features enabled.
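To make the distinction between wall power and the GPU's self-reported chip power a little clearer, here is a rough back-of-the-envelope sketch. The PSU efficiency figure and the "rest of system" estimate are my own assumptions for illustration, not measured values from my rig:

```python
# Rough sketch: relating GPU chip power to whole-rig wall power.
# PSU efficiency and the resulting "rest of system" figure are
# illustrative assumptions, not measurements.

GPU_CHIP_WATTS = 265     # RX Vega 64 LC self-reported draw at stock
WALL_PEAK_WATTS = 673    # peak measured at the outlet (SotTR benchmark)
PSU_EFFICIENCY = 0.88    # assumed ~88% efficient PSU under this load

# Everything the wall meter sees, converted back to DC power inside the case
dc_power_total = WALL_PEAK_WATTS * PSU_EFFICIENCY

# Whatever isn't the GPU chip: CPU, RAM, board, drives, fans, VRM losses, etc.
rest_of_system = dc_power_total - GPU_CHIP_WATTS

print(f"Estimated DC power inside the case: {dc_power_total:.0f} W")
print(f"Estimated non-GPU (CPU/RAM/MB/etc.) draw: {rest_of_system:.0f} W")
```

Under those assumptions the math works out to roughly 590 watts of DC power, of which well over 300 watts is not the GPU at all. The point is simply that the 673 watt peak is not the Vega alone; an overclocked X58 platform accounts for a large chunk of it.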
From this point I knew that I couldn’t run the Vega 64 at stock settings and started exploring the power saving features that AMD touted during their Vega marketing rollout. Unlike my previous R9 Fury X 2020 review, where I used 3DMark Fire Strike, this time around I decided to use a more up-to-date benchmarking/stress tool from Unigine. So let’s take a look and see how well AMD’s power saving features work in the Superposition benchmark (v1.1).
Focusing for a moment on the stock X58 + X5660 + Vega 64 LC, Unigine Superposition v1.1 easily pulls roughly 556-560 watts without the power saving features. Simply enabling them as shown in the screenshot above (“Preset” > “Power Saving”) nets a decrease of approximately 84 watts. The peak can still reach 561 watts, and that behavior is expected since the Power Saving preset has to balance performance against power draw. What matters more is the average wattage you will be pulling from the outlet. So everything is already looking good without touching any manual settings. Now let’s take a look at an overclocked X58 + X5660 @ 4GHz.
Now we turn our focus to an overclocked X58 + X5660 @ 4GHz while using the RX Vega 64 power saving features. I ran a series of tests at 4K and 1440p using the “Extreme” preset to give a nice balance of GPU and CPU workloads. Using the 4K (3840x2160) Extreme preset, my gaming rig pulls roughly 598 watts with a peak of 604 watts. With the power saving mode enabled we see a decrease of around 108 watts, which lowers the average down to about 490 watts, a very good start. The RX Vega 64 reported that it was using 265 watts at 4K, but with power savings enabled it reported only 197 watts, so on the GPU side alone we save around 68 watts.
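Just to put those Superposition 4K numbers side by side, here is a quick calculation of the savings both at the wall and as reported by the GPU itself, using the averages quoted above:

```python
# Quick comparison of the Superposition 4K "Extreme" power savings,
# using the average figures quoted above.

wall_stock, wall_saving = 598, 490   # watts at the outlet, OC X58 rig
gpu_stock, gpu_saving = 265, 197     # watts reported by the RX Vega 64

def report(label, before, after):
    delta = before - after
    print(f"{label}: {before} W -> {after} W "
          f"(-{delta} W, {delta / before:.0%} reduction)")

report("Whole rig at the wall", wall_stock, wall_saving)
report("GPU chip reported    ", gpu_stock, gpu_saving)
```

The savings look bigger in percentage terms on the GPU side (roughly 26% versus 18% at the wall), which makes sense: the rest of the rig keeps drawing about the same power no matter which preset the card is using.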
The 2560x1440 Extreme preset shows pretty much the same results, with a drop of around 98 watts at 1440p. AMD also has a feature called “Radeon Chill”, which is essentially an FPS limiter. When I limited the FPS to 30, my gaming rig pulled far less wattage from the outlet.
I would like to point out that, for power saving purposes, using the Radeon Chill FPS limiter by itself is pointless without the Power Saving preset, at least in my experience; unless, of course, you are using the FPS limiter for other reasons such as screen tearing. If you only use the FPS limiter you will cap your frame rate as expected, but you won’t get any meaningful power saving benefit. Using them both in tandem is ideal, and at the very least you’ll want the power saving features enabled for most gaming sessions.
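For anyone wondering why an FPS cap saves power at all: the GPU gets to sit idle for part of every frame instead of rendering flat out. The sketch below is only a conceptual illustration of a frame limiter loop, not how Radeon Chill is actually implemented (Chill also adapts the cap based on mouse and keyboard input, which a plain limiter does not):

```python
import time

# Conceptual frame limiter: cap the loop at a target FPS by sleeping away
# whatever is left of each frame's time budget. Illustration only; this is
# NOT AMD's Radeon Chill implementation.

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS     # ~16.7 ms per frame at 60 FPS

def render_frame():
    # Stand-in for the real rendering work the GPU would be doing.
    time.sleep(0.005)               # pretend a frame takes ~5 ms to render

for _ in range(300):                # run for roughly 5 seconds at 60 FPS
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # The hardware idles here instead of racing ahead to the next frame,
        # which is where the power (and heat) savings come from.
        time.sleep(FRAME_BUDGET - elapsed)
```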
Now I would like to take a look at the Power Saving features and the "Radeon Chill" FPS Limiter in an actual gaming session. I will be using Resident Evil 2 for this test.
Looking at the numbers, you can clearly see a benefit across the board from the power saving features. Using the max wattage as a reference, simply enabling the Power Saving preset slashed my wall draw by 168 watts during actual gameplay. The RX Vega 64 itself went from pulling 265 watts down to 200 watts on the GPU chip, which is great even without an FPS limit. Using Radeon Chill with the FPS limit set to 65, I dropped my power consumption by another 67 watts to a maximum of only 360 watts. Limiting the FPS to 60 shaved off another 15 watts, for a peak of 345 watts.
Let’s not forget that the power saving features are not only great for GPU wattage, but for GPU heat output as well. While playing Resident Evil 2 with a 60 FPS limit, the RX Vega 64 used an average of only 76 watts on the GPU, and temperatures averaged 35°C. RE2 is an easy game to run since it is very well optimized, so the Vega obviously doesn’t have to break a sweat at a low FPS cap.
The same can be said for Wolfenstein II: The New Colossus. I used the Power Saving preset along with Radeon Chill to limit my FPS to 100. My temperatures averaged 35°C and my GPU wattage averaged only 73 watts. When gaming at 4K I limited my FPS to 65, and the RX Vega 64 used only 195 watts, down from 265 watts.
So the RX Vega 64 power consumption started off with some really wild results, but as we can see there are several simple ways to instantly save a lot of wattage and lower the RX Vega 64’s heat output. If you can live with limiting your FPS to 60-100, that helps greatly during long gaming sessions and will definitely save you money on your electric bill. You can take it a step further and manually set up undervolting and underclocking within the AMD Radeon Software Tuning settings. AMD has gone the extra step and given us more control over the HBM as well. I’ll need more time with the card to tweak settings manually and understand more clearly what this card requires. Overall I am impressed with the power consumption controls that we have access to.
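To put the electric bill claim into rough numbers, here is a quick estimate. The watts saved, hours of gaming per day and price per kWh are all assumptions for illustration; plug in your own figures:

```python
# Rough electric bill estimate for the kind of wall-power savings measured above.
# All three inputs are illustrative assumptions, not universal values.

WATTS_SAVED = 150      # conservative round figure for Power Saving + an FPS cap
HOURS_PER_DAY = 3      # assumed daily gaming time
PRICE_PER_KWH = 0.13   # assumed electricity rate in USD

kwh_per_year = WATTS_SAVED / 1000 * HOURS_PER_DAY * 365
print(f"Energy saved per year: {kwh_per_year:.0f} kWh")
print(f"Estimated yearly savings: ${kwh_per_year * PRICE_PER_KWH:.2f}")
```

Under those assumptions the savings come out to roughly 165 kWh and about $20 per year. Not life-changing money, but it adds up over the life of the card, and the reduced heat output is arguably the bigger win anyway.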
Here is a screenshot of the manual settings that you can configure.