NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector


Monday, July 15th 2024


by AleksandarK
Discuss (167 Comments)

In the run-up to NVIDIA's upcoming GeForce RTX 50 Series of GPUs, codenamed "Blackwell," a power supply manufacturer has accidentally leaked the power configurations of all SKUs. Seasonic operates a power supply wattage calculator that lets users configure their systems online and receive PSU recommendations, so its database is regularly filled with upcoming CPU and GPU SKUs to cover the massive variety of components. This time it lists the upcoming GeForce RTX 50 series, from the RTX 5050 all the way up to the flagship RTX 5090. Starting at the bottom, the GeForce RTX 5050 is expected to carry a 100 W TDP. Its bigger brother, the RTX 5060, bumps the TDP to 170 W, 55 W higher than the previous-generation "Ada Lovelace" RTX 4060.

The GeForce RTX 5070 sits in the middle of the stack with a 220 W TDP, a 20 W increase over the Ada generation. For the higher-end SKUs, NVIDIA has prepared the GeForce RTX 5080 and RTX 5090 with 350 W and 500 W TDPs, respectively, increases of 30 W and 50 W over their Ada predecessors. Interestingly, NVIDIA this time wants to unify the power connection of the entire family on the 16-pin 12V-2x6 connector, updated to the PCIe 6.0 CEM specification. The across-the-board increase in power requirements for the "Blackwell" generation is notable, and we are eager to see whether the performance gains are enough to keep efficiency in balance.
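For a quick overview, here is a minimal Python sketch that tabulates the leaked TDPs against their Ada Lovelace counterparts (the Ada baselines are derived from the deltas quoted above) and applies a naive PSU sizing rule of the sort a wattage calculator might use. The 25% headroom factor and 250 W rest-of-system budget are assumptions for illustration, not Seasonic's actual calculator logic.

```python
# Leaked "Blackwell" TDPs vs. Ada Lovelace, per the deltas quoted above.
BLACKWELL_TDP = {"RTX 5050": 100, "RTX 5060": 170, "RTX 5070": 220,
                 "RTX 5080": 350, "RTX 5090": 500}
ADA_TDP = {"RTX 5060": 115, "RTX 5070": 200, "RTX 5080": 320, "RTX 5090": 450}

def recommend_psu(gpu_tdp_w: int, rest_of_system_w: int = 250,
                  headroom: float = 1.25) -> int:
    """Naive rule: (GPU + rest of system) * headroom, rounded up to 50 W."""
    raw = (gpu_tdp_w + rest_of_system_w) * headroom
    return int(-(-raw // 50) * 50)  # ceiling to the next 50 W step

for sku, tdp in BLACKWELL_TDP.items():
    base = ADA_TDP.get(sku)
    delta = f"+{tdp - base} W vs. Ada" if base else "no Ada delta quoted"
    print(f"{sku}: {tdp} W ({delta}) -> ~{recommend_psu(tdp)} W PSU")
```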

Sources: @Orlak29_ on X, via VideoCardz

#1
qwerty_lesh

just reiterating the sentiment from the earlier discussion on the connector - RIP to all of the ATX 3.0 buyers out there.

#2
LabRat 891

500 W plus transient peaks over the 12V-2x6? :roll:

Looks like Intel is setting a precedent:
It's okay to engineer products that will fail inside warranty.

#3
Hyderz

Any ballpark on performance compared to the 40 series?

5090 - top
5080 - up to 10% faster than the 4090
5070 - 5% slower than the 4080/S?
5060 - in between the 4070 and 4070 Super?
5050 - 1-3% faster than the 4060?

#5
Legacy-ZA

More interested to see if they stopped sniffing glue; meaning, if this generation will be affordable and have more than enough VRAM.

#6
Broken Processor
Legacy-ZA: More interested to see if they stopped sniffing glue; meaning, if this generation will be affordable and have more than enough VRAM.

I don't see them reducing pricing while AI demand is so strong; that will depend on ASICs being convincing enough to justify moving away from Nvidia's software, which is currently the best and most widely used for AI.

#7
64K

The 5060 power requirement increase over the 4060 makes sense, because the 4060 wasn't really an xx60-class GPU to begin with in the normal generation stack. The 4060 was really an xx50-class GPU, labeled a 4060 to be able to overcharge customers. IMO the entire Ada stack was overpriced, except maybe the 4090, because it was the gaming flagship and you always pay a premium for that.

I, as always, look forward to the next generation of GPUs, and I hope Blackwell's pricing makes more sense than Ada's did, but we'll see.

#8
AusWolf
ratirt: Something tells me this NV card generation will not have a staggering performance increase over the 4000 series.
We will see when the cards are released, but that is my guess.

Performance increase maybe, but performance-per-watt increase definitely not. Otherwise, they wouldn't have to bump the wattage on each tier by this much.

#9
R0H1T

It will be better (perf/W) because of GDDR7, though the biggest difference would come from better "IPC," if any!

#10
Legacy-ZA
Broken Processor: I don't see them reducing pricing while AI demand is so strong; that will depend on ASICs being convincing enough to justify moving away from Nvidia's software, which is currently the best and most widely used for AI.

They have shipped far fewer GPUs this generation than any other in over a decade.

The excuse is AI demand, but in reality, people are voting with their wallets. They are artificially keeping prices inflated, and it's going to blow up in their faces.

Gamers also understood that the whole 4000-series product stack, except for the 4090, was bumped up a tier, with cards masquerading as something they are not and costing more, while frame generation had to do the heavy lifting.

Gamers in general aren't as stupid as most think. Nvidia will have to change strategies. I like their cards more than AMD's, but only because older games are less of a hassle to configure.

AMD is starting to look mighty fine, especially if they can pull off a 7900 GRE next launch at the price it's at now.

#11
Chaitanya

With the GPU alone sucking 500 W, I won't be surprised this time around to find a 1 kW PSU being the bare minimum for high-end workstation builds (with a single GPU).

#13
ratirt
AusWolf: Performance increase maybe, but performance-per-watt increase definitely not. Otherwise, they wouldn't have to bump the wattage on each tier by this much.

I'm talking about a staggering performance increase, not a performance increase at the expense of power usage.
I'm speaking generally, to be honest, but I agree. If there is a performance increase, it will come at the expense of power, and even then it won't be substantial. The increase in price, though? That is a different story, and it will be staggering. It's just my guess; it's what I think will happen.

#14
64K
Chaitanya: With the GPU alone sucking 500 W, I won't be surprised this time around to find a 1 kW PSU being the bare minimum for high-end workstation builds (with a single GPU).

Not just that, but also what a card can spike to. The MSI 4090 Gaming X reviewed here on TPU drew 430 watts while gaming but spiked to 660 watts. Insane.

#15
AusWolf

Speaking of power, why does a 100 W card need a 16-pin connector? :kookoo: :shadedshu:

#16
ratirt
AusWolf: Speaking of power, why does a 100 W card need a 16-pin connector? :kookoo: :shadedshu:

There is one standard, the 16-pin connector. The card does not have to use it fully, though. I don't think that is a problem; it will just draw what it needs to function properly.

#17
AusWolf
ratirt: There is one standard, the 16-pin connector. The card does not have to use it fully, though. I don't think that is a problem; it will just draw what it needs to function properly.

Every PSU has at least an old-school 6-pin, which is more than enough. A 16-pin requires a messy adaptor or a new PSU, and it's therefore pointless and unwanted on a low-power card.

#18
londiste
64K: Not just that, but also what a card can spike to. The MSI 4090 Gaming X reviewed here on TPU drew 430 watts while gaming but spiked to 660 watts. Insane.

That is just MSI doing bad VRM design.
Power-delivery spikes are normal; the short 10-20 ms spikes that sites including TPU measure these days typically go a good 30% over the average for a decent design. And that is OK.
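As a quick sanity check on that 30% rule of thumb, using the figures from 64K's comment above (a back-of-the-envelope sketch, not TPU's measurement methodology):

```python
# Rough transient-headroom check, using the figures quoted in this thread.
average_draw_w = 430    # MSI RTX 4090 Gaming X average gaming draw (per 64K)
measured_spike_w = 660  # measured transient spike (per 64K)

typical_spike_w = average_draw_w * 1.30  # ~30% over average for a decent design
overshoot = measured_spike_w / average_draw_w - 1

print(f"expected spike for a decent design: ~{typical_spike_w:.0f} W")
print(f"measured overshoot: {overshoot:.0%}")  # ~53%, well past the 30% rule
```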

AusWolf: Every PSU has at least an old-school 6-pin, which is more than enough. A 16-pin requires a messy adaptor or a new PSU, and it's therefore pointless and unwanted on a low-power card.

It is simply about consolidation of standards. A 6-pin is more than enough for a low-power card. It is not enough for midrange, where you would need an 8-pin, and the higher end needs 2-3 of those...
Yes, the 16-pin has all the sense stuff, and there are, or should be, limits based on what the PSU can provide, but it is still a more elegant solution.
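For context on "all the sense stuff": the 16-pin connector's sideband pins signal the cable's power budget to the card, which is how one connector can cover everything from a 100 W card to a 500 W one. A minimal sketch of that idea, assuming the published ATX 3.x sense-pin-to-wattage table (the function name is mine, and this is illustrative, not driver code):

```python
# How a card can read its allowed power budget from the 16-pin sideband.
# Mapping per the ATX 3.x / PCIe CEM sense-pin tables (max sustained power).
SENSE_TO_WATTS = {
    # (SENSE1 grounded?, SENSE0 grounded?) -> cable power budget in watts
    (True,  True):  600,
    (True,  False): 450,
    (False, True):  300,
    (False, False): 150,
}

def cable_budget_w(sense1_gnd: bool, sense0_gnd: bool) -> int:
    """Return the cable's advertised power budget from its sense-pin state."""
    return SENSE_TO_WATTS[(sense1_gnd, sense0_gnd)]

# A 100 W card fits even the lowest 150 W coding, which is londiste's
# consolidation point: the same connector scales from 150 W to 600 W.
print(cable_budget_w(False, False))  # 150
print(cable_budget_w(True, True))    # 600
```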

#19
Bwaze
ratirt: Something tells me this NV card generation will not have a staggering performance increase over the 4000 series.
We will see when the cards are released, but that is my guess.

Something tells me it won't matter. Due to the explosion in AI demand and revenue, I believe we will see a release mostly focused on how these cards can be used for machine learning, home neural acceleration, and so on. We will see a focus on all the applications AI can and could some day perform, and Nvidia will even rename the cards from gaming to something that encompasses gaming plus neural acceleration; the "Gaming" sector won't be called gaming any more. They will show that even gaming itself needs AI acceleration now, for everything from smart antialiasing and upscaling to accelerating NPCs, speech recognition and generation, etc.

And they will provide proof: all the extra "Gaming" revenue that is pushing Gaming to record heights is coming from orders for AI-related acceleration now.

So the tiers, pricing, everything is open to total change now that the market's changed.

#20
AusWolf
Bwaze: Something tells me it won't matter. Due to the explosion in AI demand and revenue, I believe we will see a release mostly focused on how these cards can be used for machine learning, home neural acceleration, and so on. We will see a focus on all the applications AI can and could some day perform, and Nvidia will even rename the cards from gaming to something that encompasses gaming plus neural acceleration; the "Gaming" sector won't be called gaming any more. They will show that even gaming itself needs AI acceleration now, for everything from smart antialiasing and upscaling to accelerating NPCs, speech recognition and generation, etc.

And they will provide proof: all the extra "Gaming" revenue that is pushing Gaming to record heights is coming from orders for AI-related acceleration now.

So the tiers, pricing, everything is open to total change now that the market's changed.

I put a like on this thought because there's an element of truth in it - and not because I actually like it.

Personally, I'd much rather see gaming and AI take separate paths, but I don't think it's gonna happen.

londiste: It is simply about consolidation of standards. A 6-pin is more than enough for a low-power card. It is not enough for midrange, where you would need an 8-pin, and the higher end needs 2-3 of those...
Yes, the 16-pin has all the sense stuff, and there are, or should be, limits based on what the PSU can provide, but it is still a more elegant solution.

It seems more like a diversification of standards to me. If you want AMD, your old PSU with 8-pin power is fine, but if you want Nvidia, even a low-power model, you need a new PSU or an ugly adapter. Why? What's elegant about that?

#21
wolf

Better Than Native

How do they perform, and what will they cost? These power figures don't faze me at all, nor does the power connector; I find that to be an enormous nothingburger.

#22
oxrufiioxo
AusWolf: Every PSU has at least an old-school 6-pin, which is more than enough. A 16-pin requires a messy adaptor or a new PSU, and it's therefore pointless and unwanted on a low-power card.

You can easily get a 2x8-pin to 16-pin adapter for almost every major PSU maker. None of my power supplies has a native 12VHPWR socket, yet all of them, down to a 750 W Seasonic GX, work fine with my 4090 using a 2x8 to 1x16 adapter.

#23
ARF
ratirt: There is one standard, the 16-pin connector.

Since when? Have you asked AMD? And how many Radeons and Intel Arcs exactly use this new low-quality power connector?

londiste: It is simply about consolidation of standards.

Wrong.

londiste: A 6-pin is more than enough for a low-power card. It is not enough for midrange, where you would need an 8-pin, and the higher end needs 2-3 of those...

6-pin and 8-pin will remain the one standard, while stupid Nvidia pushes the low-quality, cost-saving, melting, hazardous "16-pin," which can never safely transfer more than 200 watts.

#24
oxrufiioxo
wolf: How do they perform, and what will they cost? These power figures don't faze me at all, nor does the power connector; I find that to be an enormous nothingburger.

The process node isn't a huge leap, so the majority of any gains will have to come from larger die sizes, higher IPC, and clock/memory speed.

My guess is the 5090 will be 50-60% faster than the 4090 but even more expensive, with RT gains a bit higher. The 5080 will match the 4090 or exceed it by 10% or so for 1200 USD. Does anything lower really matter? Not to me, lol.

Oh, and maybe a 700-800 USD 5070 matching the 4080 with 12 GB of VRAM, lmao, cuz Nvidia gonna be Nvidia, with their fanbois defending it, drunk on that green Kool-Aid.

#25
AusWolf
oxrufiioxo: You can easily get a 2x8-pin to 16-pin adapter for almost every major PSU maker. None of my power supplies has a native 12VHPWR or ATX 3.0 socket, yet all of them, down to a 750 W Seasonic GX, work fine with my 4090 using a 2x8 to 1x16 adapter.

But why should I use a bulky 2x8-pin adaptor when a single 6-pin cable would do the job just fine?
