AMD Radeon RX 6000 series graphics cards revealed, feature double the performance of the...

I think anyone buying a desktop CPU in the next year will be buying Ryzen 5000. Intel's offerings are subpar until at least Alder Lake at the end of 2021, and given Intel's track record, even that's not a guarantee. So any new desktop in the next year will have Smart Access Memory. Curious to see if it makes a difference, though.
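(For context: Smart Access Memory is AMD's branding of PCIe Resizable BAR - the CPU maps the GPU's whole frame buffer at once instead of reaching it through a fixed 256 MB window. A minimal sketch of what that changes, with illustrative numbers rather than measurements:)

```python
WINDOW_MB = 256      # legacy BAR aperture size (typical fixed window)
VRAM_MB = 16 * 1024  # RX 6800 frame buffer, 16 GB

def remaps_needed(transfer_mb, window_mb):
    """How many aperture remaps it takes to touch transfer_mb of VRAM."""
    return -(-transfer_mb // window_mb)  # ceiling division

legacy = remaps_needed(VRAM_MB, WINDOW_MB)  # walk all VRAM through 256 MB windows
sam = remaps_needed(VRAM_MB, VRAM_MB)       # one mapping covers everything
print(legacy, sam)  # 64 1
```

Whether fewer remaps translates into real frame-rate gains is exactly what independent benchmarks will have to show.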
 
The 6800 is a 3070 competitor and wants $80 more. Sure, it has 16 GB, but IMO AMD can't charge more given it's no faster than the 3070 and will have weaker RT performance. Disappointing that it's only 60 CUs, not the 64 CUs most thought it would be. I'll be grabbing the 6800 XT for sure - better value IMO.
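For reference, that $80 gap measured against the launch MSRPs (assuming $499 for the RTX 3070 and $579 for the RX 6800, as announced) works out to roughly a 16% premium:

```python
# Launch MSRPs as announced (USD) - assumed from the reveal coverage
msrp_3070 = 499  # Nvidia RTX 3070
msrp_6800 = 579  # AMD RX 6800

premium_usd = msrp_6800 - msrp_3070
premium_pct = round(premium_usd / msrp_3070 * 100)
print(premium_usd, premium_pct)  # 80 16
```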

It's interesting that all of the FUD is only occurring at the lowest end. Seems like even Nvidia realizes they're in some trouble. Don't get me wrong - I want to see independent benches before I buy anything - but it looks like AMD are about to put a card on the market that competes with the RTX 3090 and costs hundreds of dollars less. Smart. In more ways than one.
 
I hope AMD is given the same treatment as NVIDIA regarding these "selective" graphs and claims they're making.
AMD have been more upfront than Nvidia - cards like the 5700 XT ended up being faster than AMD said they were. Nvidia flat-out lied about the updated Nvidia Shield, advertising it as 25% faster when the chip was exactly the same (0% faster), and the same BS advertising was still running months after multiple reviewers showed it to be a gross lie.
AMD at the moment are far more honest than Nvidia - I think they value the community they wish to build.
 
AMD have been more upfront than Nvidia - cards like the 5700 XT ended up being faster than AMD said they were. Nvidia flat-out lied about the updated Nvidia Shield, advertising it as 25% faster when the chip was exactly the same (0% faster), and the same BS advertising was still running months after multiple reviewers showed it to be a gross lie.
AMD at the moment are far more honest than Nvidia - I think they value the community they wish to build.
Haha... why are you talking about the NVIDIA Shield? We're talking about GPUs here. Let's be honest, both companies have been caught in several lies, but if AMD is going to keep playing "underdog" then they need to be held to the same scrutiny as NVIDIA if their claims and testing turn out to be "less than true".
 
I hope AMD is given the same treatment as NVIDIA regarding these "selective" graphs and claims they're making.

All data should be reviewed by independent 3rd parties. That said, AMD isn't making crazy performance claims like Nvidia. They are showing you benchmarks vs the competition. We will soon find out exactly how representative those numbers are of the broader market.
 
My foolishness is thinking, hmm, maybe I can pick up another 5700 and SLI it for under $200... no, no. Idk why I gravitate toward headaches.

The sad part is AMD has no competition and Nvidia has no stock. So yeah, these cards pwn. Wouldn't mind a used 2080 Ti for $400 - OC it to smoke a 3070. Slapping a custom water block on the older 12 nm card would be way easier, too.
 
While this is true, I'm not sure how much of a benefit that would be. I'm sure it would help if you intend to keep the card long term, but I'm not sure about the immediate 1-2 years. I'm not saying the AMD card is worse, but I feel like more people would rather just take the savings and upgrade more frequently.
Do you usually buy such expensive hardware for 1-2 years? If so, then it should not matter to you how much VRAM it has, since it's not a normal 3-5 year investment.
 
My foolishness is thinking, hmm, maybe I can pick up another 5700 and SLI it for under $200... no, no. Idk why I gravitate toward headaches.

The sad part is AMD has no competition and Nvidia has no stock. So yeah, these cards pwn. Wouldn't mind a used 2080 Ti for $400 - OC it to smoke a 3070. Slapping a custom water block on the older 12 nm card would be way easier, too.
That's only true if AMD doesn't have stock issues early on too ;)
 
Of course Nvidia RTX 3000 series cards will offer better ray-tracing performance or quality; that's why they have increased price and power consumption. Question for the "insiders": are these AMD cards and the Nvidia RTX ones made in the same factory by the same people? Because it sure looks like these are just two departments of the same corporation.
 
Of course Nvidia RTX 3000 series cards will offer better ray-tracing performance or quality; that's why they have increased price and power consumption. Question for the "insiders": are these AMD cards and the Nvidia RTX ones made in the same factory by the same people? Because it sure looks like these are just two departments of the same corporation.
Nvidia's cards are made in Samsung's fab and AMD's in TSMC's fab.

I don't know what "quality" is in this situation, but I do agree that Nvidia will have better perf for ray-tracing. How much better? We'll just have to wait and see.
 
As I said, AMD and their fake, misleading benches.

We now know why the Ryzen 5000 series seems so fast in gaming: they used Smart Access Memory to inflate the results.

Here it's the same, also using heavy overclocking just to match the competition. Imagine Intel or Nvidia doing this - presenting OC CPUs or GPUs and comparing them with stock versions of the competitors. Just lol.
 
We've been yelling for AMD to be competitive for decades, and it looks like they're finally here to kick some ***. I've switched over to AMD CPUs, and I might have, for the first time ever, a total AMD system. Great job, Dr. Lisa.
 
As I said, AMD and their fake, misleading benches.

We now know why the Ryzen 5000 series seems so fast in gaming: they used Smart Access Memory to inflate the results.

Here it's the same, also using heavy overclocking just to match the competition. Imagine Intel or Nvidia doing this - presenting OC CPUs or GPUs and comparing them with stock versions of the competitors. Just lol.
AMD didn't use the 6000 series GPUs in those slides, so your entire argument doesn't make sense at all. They used an RTX 2080 Ti and only gave a small teaser at the end with 3 games for the 6800 XT.

FYI, Nvidia used DLSS in some of their slides - is that "cheating" too? For example, when they compared ray tracing vs the GTX 1080, they enabled DLSS to make it seem like the RT cores offer that much better performance.

The only reason Nvidia didn't compare to AMD is that AMD didn't have anything in the 2080 Ti performance range to compare against. But you can be damn sure Nvidia would have enabled DLSS or other things if they had to.

As for Intel, they have some of the worst slides you can get: weird settings, weird software used, etc.
 
As I said, AMD and their fake, misleading benches.

We now know why the Ryzen 5000 series seems so fast in gaming: they used Smart Access Memory to inflate the results.

Here it's the same, also using heavy overclocking just to match the competition. Imagine Intel or Nvidia doing this - presenting OC CPUs or GPUs and comparing them with stock versions of the competitors. Just lol.
Smart Access Memory is a feature that is available with the right mainboard + CPU combination, so yes, it does not apply to everyone. AMD is using this to entice customers to go all-AMD.

As for "heavy overclocking" - if Rage Mode only improves performance by 1-2%, I would not call that heavy, or even overclocking. It would be interesting to see if and how it affects power consumption - most importantly, whether the cards stay in the TBP budget given by AMD.

As for overclocking, one could argue that Ampere, or at least the 3080 and 3090, is already running very close to its limit. If Navi 21 has headroom, that's a plus. Note that AIB cards are rumored to offer noticeably higher clock speeds.

One more thing: the 6800 XT was also shown vs. the 3080 without Rage Mode and Smart Access Memory, and the numbers looked pretty good. Testing for the AMD and Nvidia GPUs was done on the same platform.

Still, there are many open questions wrt RT performance, their DLSS alternative, power consumption details (what does the number include - everything on the board including USB-C, or just parts? Is it a typical or a max value?), whether there have been updates to Video Core Next...

What we need is an architecture deep dive and of course third party reviews.

But as always, get what you feel is best for you.

I for one feel that if AMD's numbers are accurate, it means there will finally be a choice from top to bottom in the GPU sector, something we haven't had for a long time. And this is good.
 
Ppl that got a 3090 got rekt today

Nah. I'm still within my return window. It did cross my mind, but when I thought about the hassle of returning it, playing all the new games on my old card, waiting for the 6900 XT, then going through all the trouble of getting it and whatnot to save $500... nope. $500 just isn't that important to me these days, and I don't really know what else to spend it on. Plus, I'd be giving up features, and I don't even know if some of the driver features I rely on for my simrig exist on the AMD side of things.
 
Nah. I'm still within my return window. It did cross my mind, but when I thought about the hassle of returning it, playing all the new games on my old card, waiting for the 6900 XT, then going through all the trouble of getting it and whatnot to save $500... nope. $500 just isn't that important to me these days, and I don't really know what else to spend it on. Plus, I'd be giving up features, and I don't even know if some of the driver features I rely on for my simrig exist on the AMD side of things.
Can you give me $500 if you don't know what to spend such money on? I can think of a few things for little old me :)
 
I'd love to see the face of the NVIDIA fanboi who bought that 3090 for like $2k.

Let us know if that was worth the $1k extra over the 6900 XT.
 
A 250 W card. That is ugly - not much improvement at all. Only more cores and more power. More watts than a Vega 64... deception: noise, noise, noise, heat, noise, heat.
 
A 250 W card. That is ugly - not much improvement at all. Only more cores and more power. More watts than a Vega 64... deception: noise, noise, noise, heat, noise, heat.
Not much improvement? The Radeon RX 5700 XT is a 225 W graphics card, and for just 25 more watts (11% increase), you're getting the following with the RX 6800:

  • 160% more transistors
  • 50% more CUs, TMUs, ROPs
  • 3% higher Game clock
  • 10% higher Boost clock
  • 66% higher peak FP32 TFLOPS
  • 100% more VRAM
  • 128 MB of additional L3 cache, running at up to 1.94 GHz

No matter how good/bad it actually is in the reviews, that's a phenomenal achievement.
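The percentages in that list can be sanity-checked against the published spec-sheet figures (used here as assumptions; transistors in billions, TFLOPS derived from CUs × 128 FP32 ops/clock × boost clock):

```python
def pct_increase(old, new):
    """Percentage increase from old to new, rounded to a whole percent."""
    return round((new - old) / old * 100)

# RX 5700 XT -> RX 6800, figures assumed from public spec sheets
assert pct_increase(225, 250) == 11     # total board power, W
assert pct_increase(10.3, 26.8) == 160  # transistors, billions
assert pct_increase(40, 60) == 50       # compute units
assert pct_increase(1755, 1815) == 3    # Game clock, MHz
assert pct_increase(1905, 2105) == 10   # Boost clock, MHz
assert pct_increase(9.75, 16.17) == 66  # peak FP32 TFLOPS
assert pct_increase(8, 16) == 100       # VRAM, GB
print("all deltas match the list above")
```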
 
A 250 W card. That is ugly - not much improvement at all. Only more cores and more power. More watts than a Vega 64... deception: noise, noise, noise, heat, noise, heat.
Unlike Nvidia's 320 W cards at stock, for which they had to design a new and expensive cooler. I think they were reluctant to use water cooling like AMD honestly did with Vega 64. It looks like the tables have turned, and the central point is TSMC - remember the AMD cards that used Samsung chips. I wonder what advantage Nvidia would hold on the same node with the same technology.
 