AMD Radeon RX 6000 series graphics cards revealed, feature double the performance of the...

What AMD did with RDNA 2 is nothing short of great considering they had to work with the same node as the previous gen. I believe years of expertise on console hardware and their HSA efforts from years ago contributed... Also Nvidia's "claimed" troubles with the Samsung node?
 
What AMD did with RDNA 2 is nothing short of great considering they had to work with the same node as the previous gen. I believe years of expertise on console hardware and their HSA efforts from years ago contributed... Also Nvidia's "claimed" troubles with the Samsung node?
Now that you mention the node...I find it interesting that AMD so far is not using 7nm EUV for any of their new products. Curious.
 
Now that you mention the node...I find it interesting that AMD so far is not using 7nm EUV for any of their new products. Curious.
They need to keep yields high for both their consumer GPUs and the consoles. Other factors are the volume TSMC can produce using 7nm+ compared to regular 7nm, and the cost per wafer.

As a side-note, Nvidia went with Samsung's 8nm precisely because it was the only one that could give them good enough volume (Samsung also has a 7nm EUV node).

Take this with a pinch of salt: rumors say that Samsung's 8nm is about 30% cheaper than TSMC's 7nm (which should be between $8,000 and $10,000 per wafer). High demand for TSMC's capacity drove the price up.
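
For a rough sense of what those wafer prices could mean per chip, here is a back-of-the-envelope sketch in Python. The die size, defect density and $9,000 wafer price below are placeholder guesses for illustration only, not real figures:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Common approximation for how many whole dies fit on a round wafer."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2) / die_area_mm2 - (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of dies expected to be defect-free (simple Poisson yield model)."""
    return math.exp(-die_area_mm2 / 100 * defects_per_cm2)

def cost_per_good_die(wafer_price_usd: float, die_area_mm2: float, defects_per_cm2: float) -> float:
    good_dies = dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2, defects_per_cm2)
    return wafer_price_usd / good_dies

# Placeholder inputs: a ~520 mm^2 die (roughly Navi 21 sized) and a guessed defect density.
DIE_AREA_MM2, DEFECTS_PER_CM2 = 520.0, 0.1
for label, wafer_price in [("TSMC 7nm @ $9,000/wafer (rumored)", 9000.0),
                           ("Samsung 8nm, ~30% cheaper (rumored)", 9000.0 * 0.7)]:
    print(f"{label}: ~${cost_per_good_die(wafer_price, DIE_AREA_MM2, DEFECTS_PER_CM2):.0f} per good die")
```

The point isn't the exact numbers, just that a 30% wafer discount compounds quickly on big dies once yield is factored in.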
 
They need to keep yields high for both their consumer GPUs and the consoles. Other factors are the volume TSMC can produce using 7nm+ compared to regular 7nm, and the cost per wafer.

As a side-note, Nvidia went with Samsung's 8nm precisely because it was the only one that could give them good enough volume (Samsung also has a 7nm EUV node).

Take this with a pinch of salt: rumors say that Samsung's 8nm is about 30% cheaper than TSMC's 7nm (which should be between $8,000 and $10,000 per wafer). High demand for TSMC's capacity drove the price up.
Pricing is definitely a factor and it seems the optimized standard 7nm process is good enough for Ryzen and RDNA2, but I keep thinking AMD may have saved wafers on 7nm EUV for something else.
 
Pricing is definitely a factor and it seems the optimized standard 7nm process is good enough for Ryzen and RDNA2, but I keep thinking AMD may have saved wafers on 7nm EUV for something else.
They might be skipping it completely and just using 5nm EUV for Zen 4. 7nm+ was supposed to be a stopgap node that used EUV only for select layers until 5nm was ready for mass production, but at this point I doubt it will gain any serious traction.
 
Does that translate into anything meaningful? We need the benchmarks first, but I believe it is unlikely to impact FPS in games between these two cards, not even at 4K. I could be wrong, so let's see Steve's benchmark marathon first.
Do you usually buy such expensive hardware for only 1-2 years? If so, then it should not matter to you how much VRAM it has, since it's not a normal 3-5 year investment.
Some games slow to a crawl without enough VRAM. MSFS is an example of a current game that does.
 
They need to keep yields high for both their consumer GPUs and the consoles. Other factors are the volume TSMC can produce using 7nm+ compared to regular 7nm, and the cost per wafer.

As a side-note, Nvidia went with Samsung's 8nm precisely because it was the only one that could give them good enough volume (Samsung also has a 7nm EUV node).

Take this with a pinch of salt: rumors say that Samsung's 8nm is about 30% cheaper than TSMC's 7nm (which should be between $8,000 and $10,000 per wafer). High demand for TSMC's capacity drove the price up.
Yes, I read that somewhere too (Samsung's cheaper process contract). But as I understand it, TSMC's capacity was full anyway when Nvidia eventually approached them with demands for a cheaper deal.
In any case... the Samsung deal turned out to be not so good for them due to abysmal yields... which is the REAL reason for the shortage of Ampere chips.
 
The Radeon 5700 XT heralded a very big jump in performance and unsettled Nvidia by showing how much progress Radeon has been making.

And it continues on to the Radeon 6xxx series.

Interesting times.

If all goes well, I'll continue with my planned upgrade to the 6900XT from my current 5700XT.
 
Yes, I read that somewhere too (Samsung's cheaper process contract). But as I understand it, TSMC's capacity was full anyway when Nvidia eventually approached them with demands for a cheaper deal.
In any case... the Samsung deal turned out to be not so good for them due to abysmal yields... which is the REAL reason for the shortage of Ampere chips.

Yes, I also thought Nvidia had tried to strong-arm TSMC into giving them a better deal. Given how far TSMC's capacity is booked out in advance, they understandably told Nvidia to pound sand.

Nvidia has a reputation of being a difficult company to work with, so it's not particularly surprising behaviour.
 
As I said, AMD and their fake, misleading benches.

We know now why the 5xxx series seems so fast in gaming. They used Smart Access Memory to inflate the results.

Here it is the same: also using heavy overclocking just to match the competition. Imagine Intel or Nvidia doing this, presenting OC CPUs or GPUs and comparing them with stock versions of the competitors' products, just lol.
These graphs coming from AMD seem pretty honest and straightforward to me and I see no heavy overclocking. There is only slight overclocking (named Rage Mode) applied for the RX 6900 XT benchmark. I hope you are not shilling for Nvidia here, of all places. :)
 
Initial tests of the 3070 show that even the AIB models with good cooling have very limited OC headroom, and of course at the cost of noticeably increased power consumption. If this is any indication of Ampere's OC ability, it would be reasonable to think its bigger brothers also have poor OC potential.

On AMD's side, rumors suggest that RDNA 2 may offer a very good OC potential, at least for AIB designs.

Nvidia's cards don't paint a good picture on power usage, OC headroom, or availability so far. Let's hope the RDNA cards offer good competition, forcing Nvidia to drop prices or come back with revised products. Competition is good for us.
 
The 6800 offers double the amount of VRAM for $79 more
Yes, compared to the RTX 3070, the RX 6800 is definitely a good deal. However, when compared to the RX 6800 XT, the RX 6800 loses some of its lustre. The RX 6800 XT is only $70 more than the 6800 and the performance delta is pretty big. The RX 6800 XT is definitely the price/performance sweet spot in this case.

I honestly don't know if 16GB of VRAM isn't too much for the RX 6800. Perhaps 12GB would have been more appropriate for its level of performance. Back in the day, I bought a Palit GeForce 8500 GT Super+ with 1GB of DDR2. I didn't have any illusions that it would be able to use the whole 1GB and only bought it because it was the same price as the vanilla EVGA version at the time.

In this case, 12GB would still be 50% more than the RTX 3070 and probably would have allowed the RX 6800 to have been more attractively priced while not losing any performance or longevity. I think the whole reason the RX 6800 has 16GB of VRAM is that it LOOKS better (and it does), but I don't know if it IS better. I could be wrong.
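
To put some numbers on that, here is a quick sketch of the launch-MSRP math. It deliberately leaves actual performance out; the break-even figure just says how much faster each step up has to be before its price/performance matches the tier below:

```python
# Launch MSRPs and VRAM amounts for the three cards discussed above.
cards = [  # (name, MSRP in USD, VRAM in GB)
    ("RTX 3070",   499, 8),
    ("RX 6800",    579, 16),
    ("RX 6800 XT", 649, 16),
]

previous = None
for name, price, vram in cards:
    line = f"{name}: ${price}, {vram} GB -> ${price / vram:.0f} per GB of VRAM"
    if previous is not None:
        prev_name, prev_price = previous
        breakeven = (price / prev_price - 1) * 100
        line += (f"; ${price - prev_price} over the {prev_name}, "
                 f"needs ~{breakeven:.0f}% more performance to break even on $/frame")
    print(line)
    previous = (name, price)
```

By that simple measure the 6800 XT only needs roughly 12% more performance than the 6800 to match it on price/performance, which the reviews will confirm or refute.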
 
We've been yelling for AMD to be competitive for decades and it looks like they're finally here to kick some ***. I've switched over to AMD CPUs and I might, for the first time ever, have an all-AMD system. Great job, Dr. Lisa.
It's nothing to be afraid of, I've been AMD/ATi since 2008. It works the same as any other PC. LOL
 
Some games slow to a crawl without enough VRAM. MSFS is an example of a current game that does.
 
As I said, AMD and their fake, misleading benches.

We know now why the 5xxx series seems so fast in gaming. They used Smart Access Memory to inflate the results.

Here it is the same: also using heavy overclocking just to match the competition. Imagine Intel or Nvidia doing this, presenting OC CPUs or GPUs and comparing them with stock versions of the competitors' products, just lol.

1) The overclocking you are referring to is Rage Mode, and it isn't really overclocking in the sense you are implying. GamersNexus stated that it simply increases the power budget, but ultimately the clocks are still determined by the boost algorithm. AMD has stated the performance gains from Rage Mode in the benchmarks were 1-2%. That's less of a difference in performance than typical silicon-lottery variance.

2) The 6800 XT results had neither Smart Access Memory nor Rage Mode enabled, just saying.

3) I don't really see how one can inflate results with Smart Access Memory given that it's a feature supported by the card that works at a low level. It's like saying Nvidia is inflating performance because it accelerates CUDA workloads and AMD doesn't.
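
For anyone curious what "low level" means here: Smart Access Memory is essentially AMD's branding of PCIe Resizable BAR, which lets the CPU map the GPU's whole VRAM instead of the usual 256 MB window. A minimal sketch (Linux only; the PCI address is a placeholder) that reads the BAR sizes straight from sysfs, so you can see whether a full-VRAM-sized aperture is exposed:

```python
from pathlib import Path

def bar_sizes(pci_device: str) -> list[int]:
    """Return the size in bytes of every populated BAR for a PCI device.

    The sysfs "resource" file lists one region per line as "start end flags" in hex.
    """
    sizes = []
    resource_file = Path(f"/sys/bus/pci/devices/{pci_device}/resource")
    for line in resource_file.read_text().splitlines():
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:  # unpopulated BARs are listed as all zeros
            sizes.append(end - start + 1)
    return sizes

if __name__ == "__main__":
    # "0000:0b:00.0" is a placeholder address; find your GPU's address with `lspci | grep VGA`.
    for size in bar_sizes("0000:0b:00.0"):
        print(f"{size / 2**20:.0f} MiB")
```

If one of the BARs is roughly the size of the card's VRAM rather than 256 MiB, the resizable aperture is active; it's a hardware/firmware feature of the platform, not a benchmark knob.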
 
What AMD did with RDNA 2 is nothing short of great considering they had to work with the same node as the previous gen. I believe years of expertise on console hardware and their HSA efforts from years ago contributed... Also Nvidia's "claimed" troubles with the Samsung node?
I'm sure that the 35 years of video card experience might've had a little to do with it too. When you have veteran engineers like that, all they need is a budget that isn't of the shoestring variety to work miracles. ;)
 
AMD didn't use the 6000 GPU series in those slides, so your entire argument doesn't make sense at all. They used an RTX 2080 Ti and only gave a small teaser at the end with 3 games for the 6800 XT.

FYI, Nvidia used DLSS in some of their slides; is that "cheating" too? For example, when they compared ray tracing vs the GTX 1080, they enabled DLSS to make it seem like the RT cores offer that much better performance.

The only reason Nvidia didn't compare to AMD is that AMD didn't have anything in the 2080 Ti performance range to compare to. But you can be damn sure that they would have enabled DLSS or other things if they had to.

As for Intel, they have some of the worst slides you can get. Weird settings, weird software used etc.
Steve Walton HIMSELF said that he'd take the RX 5700 XT over the RTX 2060 Super, RTX 2070 and RTX 2070 Super and I'm fairly certain that he wasn't using some mythical RX 5000 series' smart-access memory.

Some people are so butthurt by this that I have to imagine they work for nVidia, because good competition is good for all of us who aren't affiliated with either. I do want AMD and ATi to dominate, but only for the moment, only long enough to achieve parity in the markets, because although they've done really well, parity has not yet been achieved, not even close.
 
The Radeon 5700 XT heralded a very big jump in performance and unsettled Nvidia by showing how much progress Radeon has been making.

And it continues on to the Radeon 6xxx series.

Interesting times.

If all goes well, I'll continue with my planned upgrade to the 6900XT from my current 5700XT.
I wouldn't recommend that. The 6800 XT is the best value of the three and almost as good a performer as the RX 6900 XT. I think that ATi just made the RX 6900 XT to make a statement to nVidia:
"Look, I can make something that people with more money than brains will buy too!"
 
Yes, I also thought Nvidia had tried to strong-arm TSMC into giving them a better deal. Given how far TSMC's capacity is booked out in advance, they understandably told Nvidia to pound sand.

Nvidia has a reputation of being a difficult company to work with, so it's not particularly surprising behaviour.
There was that time when Jensen publicly threw TSMC under the bus and blamed them for Fermi's problems. Of course, the blame COULDN'T have been nVidia's. People don't forget things like that and I bet that the board members of TSMC were only too happy to tell Jensen to take a hike.
:p
 
There was that time when Jensen publicly threw TSMC under the bus and blamed them for Fermi's problems. Of course, the blame COULDN'T have been nVidia's. People don't forget things like that and I bet that the board members of TSMC were only too happy to tell Jensen to take a hike.
:p

There aren't many companies that walk away with a positive impression of Nvidia. Just ask Linus Torvalds.

 
The Radeon 5700 XT heralded a very big jump in performance and unsettled Nvidia by showing how much progress Radeon has been making.

And it continues on to the Radeon 6xxx series.

Interesting times.

If all goes well, I'll continue with my planned upgrade to the 6900XT from my current 5700XT.

I'm upgrading to a 5900X/6900 XT from a 3700X/5700 XT. Can't wait for the CPU/GPU performance-increase voodoo AMD is planning on implementing.
 
You may see a 10 to 20% improvement in Nvidia RTX 3000 series cards in the next few months, once they update drivers and start using an AMD Ryzen 5900X or similar CPU (or next-gen Intel when they get around to it). The biggest improvements will be for the RTX 3090. The figures from both GPU families will be superseded as time goes on. Performance at launch is not the whole picture.
 