No Nvidia Killer? Big Navi's best could match the RTX 2080 Ti, fall short of the RTX 3080...

TechPowerUp directly measures the actual power consumption of the graphics cards it tests (not system draw or sensor-based readings) and reported the following for the RX 5700 XT:

[Attached chart: TechPowerUp's measured power draw for the RX 5700 XT and comparison cards]
(Source)

That's a 62 W difference between the two, and even if all of that electrical power ends up as heat, the 5700 XT is still producing 21.8% less heat than the 2080 Ti. What one can claim, however, is that the AMD GPU has a higher power-per-transistor value than the Nvidia one: 21.65 watts per billion transistors compared to 15.3 W/bTrans, although such power figures include DRAM consumption and board losses.

For reference purposes, the 1080 Ti is roughly the same as the RX 5700 XT at 22.6 W/bTrans, the Radeon VII is 26.5, and the Vega 64 they tested works out to 24.7. This suggests that AMD's goal was to get the die as small as possible, to improve wafer output, as well as overall performance (it's notably faster in games than the Vega 64), rather than outright power efficiency.
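
For anyone wanting to sanity-check those numbers, the arithmetic is just board power divided by transistor count. The transistor counts below are the public die specs (18.6 billion for the 2080 Ti's TU102, 10.3 billion for Navi 10); the power values are back-solved from the ratios above, so treat them as illustrative rather than TechPowerUp's exact figures:

```python
# Power per billion transistors (W/bTrans), including DRAM and board losses.
# Board power values are illustrative (back-solved); transistor counts are public specs.
cards = {
    "RTX 2080 Ti (TU102)":  {"power_w": 285.0, "transistors_b": 18.6},
    "RX 5700 XT (Navi 10)": {"power_w": 223.0, "transistors_b": 10.3},
}

for name, c in cards.items():
    print(f"{name}: {c['power_w'] / c['transistors_b']:.2f} W/bTrans")

# Absolute difference and relative heat output
delta = cards["RTX 2080 Ti (TU102)"]["power_w"] - cards["RX 5700 XT (Navi 10)"]["power_w"]
pct_less = delta / cards["RTX 2080 Ti (TU102)"]["power_w"] * 100
print(f"Difference: {delta:.0f} W ({pct_less:.1f}% less heat from the 5700 XT)")
```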

With no architectural changes whatsoever, an 80 CU 'Navi 10' would possibly be hitting 450 to 500 W, and while it wouldn't be the first graphics card AMD has released with that kind of power consumption, it would certainly be their first single-GPU card at that level. That said, larger GPUs can't and don't need to run at high clock speeds, so some of that excessive power requirement would be clawed back that way. Also, an 80 CU next-Navi chip wouldn't simply be two Navi 10s - it's not going to have double the number of ROPs and memory controllers, for example - so that will bring the consumption down again. We may still be looking at a 300-350 W 'big Navi' card, though this also assumes that (a) AMD haven't streamlined the current requirements in their design and (b) TSMC haven't refined the N7 process by then.
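
To put rough numbers on that reasoning, here's a minimal sketch. The scaling rule (dynamic power roughly proportional to unit count and to frequency times voltage squared) is standard, but the 10% clock cut, 5% voltage cut, and 10% uncore saving are illustrative assumptions, not leaked specs:

```python
# Naive power scaling for a hypothetical 80 CU Navi, from a 40 CU Navi 10 baseline.
# Dynamic power scales ~linearly with CU count and with f * V^2.
base_power_w = 223.0    # board power for the 40 CU RX 5700 XT (as above)
cu_scale     = 80 / 40  # doubling the CU count

naive = base_power_w * cu_scale
print(f"Naive 2x scaling: {naive:.0f} W")            # 446 W - the worst case above

# Larger GPUs run slower: assume a 10% clock cut enabling a 5% voltage cut
f_scale, v_scale = 0.90, 0.95
scaled = naive * f_scale * v_scale**2
print(f"With lower clocks/voltage: {scaled:.0f} W")  # ~362 W

# Not doubling the ROPs/memory controllers claws back a bit more (assume ~10%)
print(f"With shared uncore: {scaled * 0.9:.0f} W")   # ~326 W - inside the 300-350 W window
```
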
What you're saying does seem to be in line with the article. So big Navi is likely to be a 2080 Ti beater, but very hot and power-hungry. Potentially even liquid-cooled, like the Fury X perhaps? And without features like DLSS and ray tracing. But I guess they'll undercut Nvidia on price. It doesn't really interest me: I like quiet computers, and being an "enthusiast" I like features like ray tracing, so I'd rather pay the Nvidia tax and get these extras. I'm loving DLSS at the moment and I think it could be a killer feature in an upcoming budget card, say a 3050 (if it exists).
 
So big Navi is likely to be a 2080 Ti beater, but very hot and power-hungry. Potentially even liquid-cooled, like the Fury X perhaps?
That's certainly a possibility, but I suspect that it won't be 80 CUs - the 'full' chip, yes, but consumer boards will probably be equipped with 76 or fewer. It depends on whether AMD want to go for high clock speeds or not.

And without features like DLSS and ray tracing.
One doesn't need tensor cores to do temporal upscaling, which is what DLSS effectively is now, so AMD could offer something similar in the future, especially since RDNA chips already support DirectML (example of such a system by Microsoft: https://github.com/microsoft/Direct...Resolution/Samples/ML/DirectMLSuperResolution). I'd be surprised if there isn't any hardware acceleration for BVH searches or ray intersections, though, given that AMD have already created something for the XBSX/PS5 GPUs.

They do need to avoid undercutting Nvidia's prices too much, though - they've worked hard at shedding their old reputation as 'the cheap choice' with their CPUs, and it would be disappointing not to see the same happen with their consumer graphics cards.
 
That's certainly a possibility, but I suspect that it won't be 80 CUs - the 'full' chip, yes, but consumer boards will probably be equipped with 76 or fewer. It depends on whether AMD want to go for high clock speeds or not.


One doesn't need tensor cores to do temporal upscaling, which is what DLSS effectively is now, so AMD could offer something similar in the future, especially since RDNA chips already support DirectML (example of such a system by Microsoft: https://github.com/microsoft/Direct...Resolution/Samples/ML/DirectMLSuperResolution). I'd be surprised if there isn't any hardware acceleration for BVH searches or ray intersections, though, given that AMD have already created something for the XBSX/PS5 GPUs.

They do need to avoid undercutting Nvidia's prices too much, though - they've worked hard at shedding their old reputation as 'the cheap choice' with their CPUs, and it would be disappointing not to see the same happen with their consumer graphics cards.
They don't need tensor cores, no. But they don't seem to be doing anything, and personally I can't see how they will compete if they do nothing, so there must be something in the works. They could also fix their drivers; until they do, I don't think they'll have much choice other than being the "cheap choice".
 
That's a lot of theorycraft for both the pro-AMD and pro-Nvidia sides in the previous posts. But if AMD brings 2080 Ti performance for $400-500, I guarantee it will sell like hotcakes. A used 2080 Ti is, what, $800-900 today?
 
TBH I don't expect that much from AMD this coming generation (compared to Nvidia). Nvidia has had the clear architectural advantage ever since Pascal, hands down, because Nvidia's 12nm cards already offer similar perf/watt to, and compete really well against, AMD's 7nm Navi (even though AMD's die sizes are small in comparison). Let's be honest: with Nvidia getting a node shrink for Ampere, AMD needs nothing short of a miracle.
 
That's certainly a possibility, but I suspect that it won't be 80 CUs - the 'full' chip, yes, but consumer boards will probably be equipped with 76 or fewer. It depends on whether AMD want to go for high clock speeds or not.


One doesn't need tensor cores to do temporal upscaling, which is what DLSS effectively is now, so AMD could offer something similar in the future, especially since RDNA chips already support DirectML (example of such a system by Microsoft: https://github.com/microsoft/Direct...Resolution/Samples/ML/DirectMLSuperResolution). I'd be surprised if there isn't any hardware acceleration for BVH searches or ray intersections, though, given that AMD have already created something for the XBSX/PS5 GPUs.

They do need to avoid undercutting Nvidia's prices too much, though - they've worked hard at shedding their old reputation as 'the cheap choice' with their CPUs, and it would be disappointing not to see the same happen with their consumer graphics cards.


104 posts before someone mentions DirectML <--------

All those people who are so EXCITED for Nvidia and DLSS must understand that DLSS is a hoax. Why would a game developer even bother with DLSS when DirectML is more robust and superior... and is also an industry standard?

No game dev is going to waste their time on proprietary stuff unless Nvidia pays them. It's just like the rest of Nvidia's hoaxes: Jensen Huang could NOT pay enough game devs to use RTX, so now Nvidia has dumped it in favor of DXR... (only 6 RTX games plus 4 DLSS games existed over the last 20 months...)

The exact same thing will happen with DLSS, because Nvidia themselves will have to support DirectML anyway.



RDNA will own the gaming market because it does not rely on proprietary means but accelerates industry standards, better than Nvidia's bloated Turing or Ampere.

Understand: a single die the size of Navi 10, using RDNA2, would be as powerful as a 2080 Ti and still sell for $389. Many of you have zero clue what is coming. Nvidia is using a bloated architecture NOT meant for gaming... and it cannot compete.

Navi 10 vs TU106 proved this.

 
I find this hard to believe, to be honest.
In the consoles, we have one RDNA2 GPU running at 2.2 GHz and another RDNA2 GPU with 52 CUs at under 200 W.
Assuming no IPC gain, the XSX GPU should already be 30% faster than a 5700 XT. That is already in the ballpark of a 2080 Ti. So unless Big Navi has 60 CUs or fewer, I can't believe this will be true. The rumored 72 CU or 80 CU parts should be much faster than a 2080 Ti.
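
As a rough check on that 30% figure, compare peak FP32 throughput (CUs x 64 shaders x 2 ops per clock x clock speed). The Series X clock is the published 1.825 GHz; using the 5700 XT's rated game clock is an assumption, since real-world clocks vary:

```python
# Peak FP32 TFLOPS = CUs * 64 shaders/CU * 2 ops/clock * clock (GHz) / 1000
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

xsx      = tflops(52, 1.825)  # Xbox Series X: published spec
rx5700xt = tflops(40, 1.755)  # RX 5700 XT at its rated game clock (assumed)

print(f"XSX: {xsx:.2f} TFLOPS, 5700 XT: {rx5700xt:.2f} TFLOPS")
print(f"XSX advantage: {(xsx / rx5700xt - 1) * 100:.0f}%")  # ~35% at these clocks
```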
 
I find this hard to believe, to be honest.
In the consoles, we have one RDNA2 GPU running at 2.2 GHz and another RDNA2 GPU with 52 CUs at under 200 W.
Assuming no IPC gain, the XSX GPU should already be 30% faster than a 5700 XT. That is already in the ballpark of a 2080 Ti. So unless Big Navi has 60 CUs or fewer, I can't believe this will be true. The rumored 72 CU or 80 CU parts should be much faster than a 2080 Ti.

There are a good many people who do not want to discuss this in earnest. But what Dr Su has hinted at is that RDNA proper (the full uArch) should be, on average, at least 30% faster... and that a Navi 10-sized die (251 mm²) using RDNA2 would be about as powerful as a 2080 Ti... and cost the same $389.


AMD Navi 21 (505 mm²) 80 CUs
AMD Navi 22 (344 mm²) 60 CUs <-- (AKA "big navi")
AMD Navi 23 (244 mm²) 40 CUs
 
That's a lot of theorycraft for both the pro-AMD and pro-Nvidia sides in the previous posts. But if AMD brings 2080 Ti performance for $400-500, I guarantee it will sell like hotcakes. A used 2080 Ti is, what, $800-900 today?
No, it won't sell, because in the same class there will be the GeForce RTX 3070.
If this rumor is confirmed, AMD will face the same situation it faced this year, with a product that has a more solid Nvidia competitor (5700 XT vs 2070 Super).
Price difference? Yep, there is a price difference, but not so big...
 
First of all, AMD, Nvidia, and Intel Xe all build some nice CPUs, GPUs, and chipsets, but driver stability is another question. When you game, you shouldn't have to gamble with your time: install the drivers and the game, and you should be good to go. With AMD I've had instability - FreeSync errors, BSODs, and so on - whereas a 120 Hz 3D screen on Nvidia can also be used with AMD FreeSync. And with 32-bit support going away, games that ran nicely on x32 will no longer run on x64, and x64 patches are needed for many programs.
 
No, it won't sell, because in the same class there will be the GeForce RTX 3070.
If this rumor is confirmed, AMD will face the same situation it faced this year, with a product that has a more solid Nvidia competitor (5700 XT vs 2070 Super).
Price difference? Yep, there is a price difference, but not so big...
It's big enough. The 5700 XT vs 2070S is basically about 5% more performance for a 25% higher price. The main thing that made Turing more popular was RTX, which was basically unusable for gaming on anything other than the 2080 Ti. If AMD implements DXR in RDNA2, which apparently they will, and the same performance and price difference holds, the AMD card will be the better deal.

People are now talking up DLSS, so that might be another thing that drives attention away from AMD if they don't have an equivalent next generation. But AMD is slowly but surely becoming more compelling: they have their own unique features, and with the popularity of their CPUs, more people might try out their GPUs.
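
That "better deal" claim is just performance-per-price. A quick sketch using the rough figures quoted in this thread (5% more performance for 25% more money - assumptions from the posts above, not measured data):

```python
# Relative value: performance per unit price, normalised to the AMD card.
# The 1.05 / 1.25 figures are the rough numbers quoted in this thread.
amd_perf, amd_price = 1.00, 1.00
nv_perf,  nv_price  = 1.05, 1.25

amd_value = amd_perf / amd_price
nv_value  = nv_perf / nv_price
print(f"2070S delivers {nv_value / amd_value:.0%} of the 5700 XT's perf-per-dollar")
# ~84%: at that spread, the Nvidia card gives roughly 16% less performance per dollar
```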
 
I find it amusing how AMD cards are always Nvidia beaters before they release; after they release, the performance is always worse than promised, lol.

What's AMD's next big card after Navi called? I need to prepare myself for everyone telling me it will beat Nvidia, once this one gets beaten by an xx70-series GeForce.
 
It's big enough. The 5700 XT vs 2070S is basically about 5% more performance for a 25% higher price. The main thing that made Turing more popular was RTX, which was basically unusable for gaming on anything other than the 2080 Ti. If AMD implements DXR in RDNA2, which apparently they will, and the same performance and price difference holds, the AMD card will be the better deal.

People are now talking up DLSS, so that might be another thing that drives attention away from AMD if they don't have an equivalent next generation. But AMD is slowly but surely becoming more compelling: they have their own unique features, and with the popularity of their CPUs, more people might try out their GPUs.
5%? Hardly... the overall difference is 7%, but in many games it is more than 15%.
It's worth the price difference.

I own both cards (a 5700 XT paired with a 9700K, and a 2070 Super with a just-upgraded 3900X) and the Nvidia solution is definitely better.
 
5%? Hardly... the overall difference is 7%, but in many games it is more than 15%.
It's worth the price difference.

I own both cards (a 5700 XT paired with a 9700K, and a 2070 Super with a just-upgraded 3900X) and the Nvidia solution is definitely better.
In the U.K. today you can get a 2070S for £399 and a 5700XT for £379 so the price difference is only 5% here.
 
Hi Shadowboxer - not going to requote. I understand folks have problems with AMD drivers; it's just the relentless way it comes up on every AMD post when it's not central to the main point. As I stated, I'm a middle-of-the-road gamer: I have an RTX 2060 (bang for buck) and went with Nvidia for its Turing encoder (not as bad as the software-encoder crowd makes out, with good settings), so I will also be interested in Ampere - maybe get a '2660' or whatever they call their cheapy. My main comment on this thread was for AMD to leverage all their work with Microsoft and Sony on the GPUs in the upcoming consoles to get the drivers perfect and to get maximum performance - especially as Microsoft does all that DX12 API stuff - which would dovetail nicely into your point about drivers. For the V2 and V3 console revisions, Sony and Microsoft will be seeking less cooling and power draw for their slim models.
 
Hi Shadowboxer - not going to requote. I understand folks have problems with AMD drivers; it's just the relentless way it comes up on every AMD post when it's not central to the main point. As I stated, I'm a middle-of-the-road gamer: I have an RTX 2060 (bang for buck) and went with Nvidia for its Turing encoder (not as bad as the software-encoder crowd makes out, with good settings), so I will also be interested in Ampere - maybe get a '2660' or whatever they call their cheapy. My main comment on this thread was for AMD to leverage all their work with Microsoft and Sony on the GPUs in the upcoming consoles to get the drivers perfect and to get maximum performance - especially as Microsoft does all that DX12 API stuff - which would dovetail nicely into your point about drivers. For the V2 and V3 console revisions, Sony and Microsoft will be seeking less cooling and power draw for their slim models.
Sorry mate, I'm not going to stop calling out AMD's atrocious driver situation until they fix it. You are quite welcome to select "ignore" on my posts if this upsets you too much.
 
The top-end cards are only a fraction of the market anyway; as long as performance is on par and the price is right, the fact that it isn't faster than a 3080 Ti doesn't really mean much.

Can't deny that those top-end cards are excellent marketing material, though. You may move more mid-tier and low-tier cards, but the top tier helps move them in the first place.
 
RDNA1 was competitive in its different segments. The shame is that AMD has made a habit of launching their new cards way too late. If they can launch the product, with actual availability, at the same time as Nvidia, they should be good.
 
In the U.K. today you can get a 2070S for £399 and a 5700XT for £379 so the price difference is only 5% here.

The best 2070 Super price here in Canada is Newegg at $732 all-in for the Gigabyte card. The Sapphire Pulse 5700 XT would cost me $570 - a very significant difference of $162 for about a 6% increase in fps in the games I play (I wouldn't even notice that at 1440p). It's a no-brainer to get a 5700 XT... if I were not skipping this generation. The only thing that may yet convince me is a fire sale on either of these cards for at least $100 less, as I suspect the new ones, when they are released, will not be even reasonably priced, let alone cheap.
 
[...] all I've seen is people hoping AMD can get a bit more competitive at the top end to put downward price pressure on Nvidia. Most of us on here think $2000 is crazy just to play games on slightly better settings - Nvidia knows, and most of us know, these players will keep spending that dosh and more. Most of the time, I see Nvidia fans (paid trolls) coming into every thread about AMD cards and raving about how bad the drivers are - at least with the Quantum guy who keeps mentioning his great gear, it seems like a running gag.
[...]
You should refrain from using this argument in the future. That's like saying: I'm pissed that some people are able to afford a 700-horsepower Ferrari, so I want VW to make a similar car sold at a lower price. ???
VW knows that if they released a 700 hp car at 1/3 the price, it would sell like hotcakes, but they'd lose the **** ton of money they've put into the R&D to make it happen.

Actually, substitute 'demand' for 'want', because I sense some entitlement in your writing. Nothing atypical these days, tbh.
 
You should refrain from using this argument in the future. That's like saying: I'm pissed that some people are able to afford a 700-horsepower Ferrari, so I want VW to make a similar car sold at a lower price. ???
VW knows that if they released a 700 hp car at 1/3 the price, it would sell like hotcakes, but they'd lose the **** ton of money they've put into the R&D to make it happen.

Actually, substitute 'demand' for 'want', because I sense some entitlement in your writing. Nothing atypical these days, tbh.

Quantum does like to talk about his rig... has for years now. Sadly, it seems everyone with a 9900K and a 2080 Ti will let you know about it; most psychologists would have something to say about that. Don't use the word 'afford' either... it's insulting to those of us with a life/intelligence who don't spend stupid cash on video game hardware even though we can easily 'afford' it. I'd rather get what I need and a little more for my gaming 'hobby' and spend the rest on more important things... boat, cabin, etc.
 
Sadly, it seems everyone with a 9900K and a 2080 Ti will let you know about it.
it's insulting to those of us with a life/intelligence who don't spend stupid cash on video game hardware even though we can easily 'afford' it.
And you are boasting about being able to buy a 2080 Ti. Lol - pot, meet kettle.
But good on you for not buying the Ferrari even though you could afford it. After all, we all play different games. I just addressed the fact that some people think they're owed something. You're not.

Would it be great if AMD released a half-priced, top-of-the-line Ampere killer? Sure.
Will it happen? No. Why should they half-price it when they've had to play catch-up and would lose revenue while at it? Plus, they have other contractual obligations at the moment: towards Microsoft and Sony.
 