Nvidia's quarterly earnings beat expectations thanks to gaming and hyperscale growth

midian182

What just happened? Nvidia has revealed its third-quarter financial results, and while the company did beat expectations, revenue fell 5 percent compared to the same quarter one year ago.

In the quarter ending October 27, Nvidia reported revenue of $3.01 billion. While that’s down from the $3.18 billion a year earlier, it’s a 17 percent jump over the $2.58 billion from the previous quarter. Non-GAAP earnings per diluted share were $1.78, compared with $1.84 a year earlier and $1.24 in the previous quarter.

"Our gaming business and demand from hyperscale customers powered Q3’s results," said CEO Jensen Huang. "The realism of computer graphics is taking a giant leap forward with NVIDIA RTX."

Breaking down Nvidia’s segments, its GPU business revenue was down 8 percent to $2.56 billion, while gaming revenue was down 6 percent YoY to $1.66 billion but up 26 percent from the previous quarter. The quarterly increase was spurred by growth in sales of its GeForce desktop and notebook gaming GPUs.

Data center revenue was also down YoY, to $726 million, but the company expects strong sequential growth in this segment “driven by the rise of conversational AI and inference.” Its automotive segment, meanwhile, brought in $162 million, and professional visualization hit $324 million, up 6 percent.

The figures beat expectations, with analysts looking for earnings at $1.57 per share on revenue of $2.91 billion. For the fourth quarter, Nvidia expects revenue of $2.95 billion, which is lower than Wall Street’s $3.06 billion prediction.


 
Well, for people who think AMD's Navi is choking Nvidia: Nvidia's gaming segment alone earns close to AMD's entire revenue for Q3.

AMD Q3 revenue: $1,801 million
 

Intel makes five times that just in its PC-centric unit compared to AMD, yet AMD is still taking a lot of market share in the CPU market.

FYI, no one here is implying that AMD is putting a stranglehold on Nvidia except you. You do this in every thread. A quick glance at your post history shows that you put out posts saying "people are saying such and such about Nvidia, but it isn't true!" in seemingly every Nvidia or AMD thread.
 

Well, there is a certain user here who always says AMD's Navi is killing Nvidia, and many people, including you, seem to agree.
Lol, it's nice that you noticed I kinda debunk those AMD fans who claim a lot of things about Nvidia that aren't true :).
I couldn't care less about Intel. They are a boring company that keeps recycling products, and AMD is the same with their GPU department.
 

I couldn't care less about the AMD vs Nvidia thing.

The more worrying part here is that despite the increases in GPU pricing across the board, people are still buying. It seems to me that PC gamers are willing to spend increasingly more money for the same tier of product, and at this rate I very much doubt prices will come down next gen. That's a double-edged sword, as the high prices discourage people from building or buying a PC and push them toward consoles instead. If the next consoles have RTX 2080-like performance (this is a guess, not fact), why would your regular non-savvy person drop $1,200-plus on an equivalent PC build when the console is $500 and both have similar performance in games?
 
If the next consoles have RTX 2080-like performance (this is a guess, not fact), why would your regular non-savvy person drop $1,200-plus on an equivalent PC build when the console is $500 and both have similar performance in games?
Hasn't this always been the case, though? Take the Xbox 360, for example - it was something like $400 at launch in Nov 2005. That wouldn't get you a gaming PC back then. The nearest graphics card, in terms of architecture and specification, to the GPU in the 360 would be something like the Radeon X1900. Its MSRP at launch was $300 for the XT model. Add the rest of the required PC components and you'd have a total cost 2 to 3 times greater than the console.
 

For that amount of money you got AMD's biggest chip at the time in the X1900 XT, at 384 million transistors. It had around Nvidia 7800 GTX (yes, that word order is correct) performance. Now "budget" level 2060 cards are $400+ and the 2080 Ti costs $1,200 or more (you are looking at $1,300 for a good card). My point was never that a decent PC is going to be cheaper; it was that the exorbitant cost of GPUs dissuades people from even considering a PC.

When the cost of GPUs discourages me, a PC enthusiast, from buying, I know the people who don't drop $700 on a GPU like I do are going to be even more hesitant to go PC. I can only imagine what people looking at $400 "budget" cards think about the platform.
 
I couldn't care less about the AMD vs Nvidia thing.

The more worrying part here is that despite the increases in GPU pricing across the board, people are still buying. It seems to me that PC gamers are willing to spend increasingly more money for the same tier of product, and at this rate I very much doubt prices will come down next gen. That's a double-edged sword, as the high prices discourage people from building or buying a PC and push them toward consoles instead. If the next consoles have RTX 2080-like performance (this is a guess, not fact), why would your regular non-savvy person drop $1,200-plus on an equivalent PC build when the console is $500 and both have similar performance in games?

Consoles are unsuited for many game types like MOBAs, strategy and FPS, so it's not something I even bother considering.

Nvidia has flooded the market with Turing at every price point now - 11 options from $150 to $1,200. If you already owned a Pascal GPU, then it's natural just to skip this generation and wait for Ampere. I don't know where you get the idea that building a PC is getting more expensive. Sure, price to performance is lacking for RTX Turing compared to Pascal, but that's only when you're looking at rasterization performance. In any ray-traced workload the $400 2060S will just straight up murder the $700 1080 Ti. Furthermore there are many techs in the Turing architecture that developers haven't used yet because that would alienate the majority of people who own Pascal (Concurrent FP+INT, Variable Rate Shading, Mesh Shading, Texture Space Shading). For newly released games like COD Modern Warfare and Red Dead Redemption 2, which utilize DX12 or Vulkan, the 2060S matches up with the 1080 Ti just fine. With COD you can get >60 fps with the Ultra preset + DXR shadows at 1440p, and that is fine for a single-player FPS game.

And well, for the people who spend $1,000+ on a GPU, price to performance was not an issue to begin with.

Looking at Nvidia's Q3 profit margin, an increase of 4 percentage points from 60% in Q2 to 64% in Q3 must mean Nvidia is making a killing with the 2060S and 2070S. I find it ridiculous that people compare Navi and Turing die sizes and make conjectures about profit margins, lol.
 
Now "budget" level 2060 cards are $400+ and the 2080 Ti costs $1,200 or more (you are looking at $1,300 for a good card). My point was never that a decent PC is going to be cheaper, it was that the exorbitant cost of GPUs dissuades people from even considering PC.
I would argue that a 2060 really isn't a budget-range product - just look at test results at 1440p on Ultra settings.


Yes, those results were done with a CPU+RAM configuration that isn't that of a budget machine, but 1440p helps to negate that issue. For $350 to $400 one is getting a seriously capable graphics card, especially at 1080p.

The true budget level is $100 to $120, where one is looking at the likes of a Radeon RX 550 type of product. And at 1080p on medium-ish settings, they'll perform pretty well in a budget PC too.


Add an additional $50 and RX 570/580s or GTX 1650s come into the equation, and they're really capable for the money.


To me, the issue isn't that the very best graphics cards cost over $1,200; it's about the perception that such products are necessary for PC gaming - and this problem is entirely down to the culture we've been shaped, through marketing and advertising, to accept (for all products, not just PC components): if item X costs 4 times the price of item Y, it must be 4 times better. While this is essentially true in terms of outright GPU performance, the games played on such a system aren't going to be 4 times better.

Furthermore there are many techs in the Turing architecture that developers haven't used yet because that would alienate the majority of people who own Pascal (Concurrent FP+INT, Variable Rate Shading, Mesh Shading, Texture Space Shading).
The concurrent FP+INT aspect of Turing GPUs is invisible to developers, as it's handled entirely by the SM dispatch units, so it's not something that can be "not programmed for," so to speak.

TSS is a process that can be done on DX11 hardware and is open for implementation by developers on systems other than just Turing-based machines. VRS is currently limited to Turing, but as it's now part of D3D12 functionality, it shouldn't be too long before we see it in other GPUs.
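For anyone curious what that D3D12 support looks like in practice, here's a minimal sketch of querying VRS support and setting a coarser per-draw shading rate - an illustration only, assuming the device and an ID3D12GraphicsCommandList5 have already been created elsewhere:

```cpp
#include <windows.h>
#include <d3d12.h>

// Sketch: check for Variable Rate Shading support and apply a coarse 2x2 rate
// to subsequent draws. Device and command-list creation are assumed elsewhere.
void ApplyCoarseShadingIfSupported(ID3D12Device* device, ID3D12GraphicsCommandList5* cmdList)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                              &options6, sizeof(options6)))
        && options6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_1)
    {
        // Base rate of 2x2; null combiners means only the base rate applies.
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    }
    // Draws recorded after this call inherit the shading rate until it is changed.
}
```

Tier 2 hardware adds per-primitive and screen-space-image rates on top of this, but even the tier 1 path shows how little code is involved once the runtime and driver expose the feature.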
 

The benchmarks show it performing under the 2070, which is FAR less than what you got last generation. The 1060, for example, outperformed the 980, a card two tiers above it.


It did that while costing less than the 2060. And look around the reviews: it was certainly considered a budget card, yet it provided FAR more value than the 2060 relative to its release.

If all I cared about were competent 1080p performance, I would buy an $80 RX 580 off eBay, which is essentially disposable at that price point. Like with the launch of the 10xx series, performance per dollar should increase with each generation. I don't care about performance merely in the sense of whether a card can play games competently. That is not a sign of progress, it's a sign of complacency.
 
The MSRPs for the GTX 1060 and RTX 2060, at launch, were $250 and $350 respectively - so a 40% increase. Performance-wise, it's a similar picture.
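To put that trade-off in concrete terms, here's a tiny sketch of the break-even arithmetic using only the launch MSRPs above (no benchmark figures are assumed): a 40% higher price means the 2060 has to be at least 40% faster than the 1060 for performance per dollar not to regress.

```cpp
#include <cstdio>

// Sketch: how much faster the newer card must be to keep performance-per-dollar flat.
// Only the launch MSRPs quoted above are used; no benchmark data is assumed.
int main()
{
    const double price1060 = 250.0;  // GTX 1060 launch MSRP (USD)
    const double price2060 = 350.0;  // RTX 2060 launch MSRP (USD)

    const double priceIncreasePct = (price2060 / price1060 - 1.0) * 100.0;  // 40%

    std::printf("Price increase: %.0f%%\n", priceIncreasePct);
    std::printf("Break-even: the 2060 must be at least %.0f%% faster for perf/$ to hold.\n",
                priceIncreasePct);
    return 0;
}
```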



So you're paying more, but you're getting more. However, if we consider your point of:

Like with the launch of the 10xx series, performance per dollar should increase with each generation

But why should that be the norm? If anything, the 10 series was an exception, as it was a substantial leap forward in performance. The 1060's launch price was something like $50 more than the 960's, which in turn was about the same as the 760's price.

The 760 was a physically bigger chip than the 960 (in terms of transistors and die size; they're on the same process scale) so the former would have been a little more costly to manufacture than the latter; the 1060 was smaller than both of them. All of them are tiny compared to the 2060, though - it's more than twice the size of the 1060 in transistor count and die area.

I'm certainly not arguing that the prices of the current top-end graphics cards are justifiable, but I would certainly argue that what one can get for $100 to $200 in the budget range, and $300 to $400 in the mid-range, is well worth the money.
 

Actually, the only Pascal GPU that fell outside the norm was the 1080 Ti; the 1070/1080 FE were originally priced at $450/$700 and no AIB was selling below those prices. At those prices the 1070 went up against the 980 and the 1080 against the 980 Ti, over which they offered ~30% more performance.

Now Nvidia is getting flak because they priced the 1080 Ti, which outclasses the $1,200 Pascal Titan, at $700, and that became the norm for future comparisons. Surely Nvidia could just keep on making the 1080 Ti and it would eventually become cheaper over time; after 2.5 years I'm sure Nvidia could make a profit selling the 1080 Ti at $400.

The original RTX lineup was not so special against Pascal, but right now the 1660S, 2060, 2060S and 2070S are nice choices for PC gamers.
 
Surely Nvidia could just keep on making the 1080 Ti and it would eventually become cheaper over time; after 2.5 years I'm sure Nvidia could make a profit selling the 1080 Ti at $400.
That's a good question - some of the decision will be driven by the chip manufacturer, TSMC. They're the ones who invested money in developing a process node just for Nvidia's Turing line (a modified version of their 16FF process, called 12FFN), so you'd think they'd want to keep churning out GP102 chips on their standard 16FF node to help recoup the cost.

There are a few reasons why this would be an unlikely choice to make. Firstly, the chips in 1080 Ti cards are pretty big (470 mm2), so a single 300 mm wafer isn't going to churn out all that many of them, and there's little incentive to produce such a large processor for a relatively low amount of money.
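For a rough sense of scale, here's a back-of-the-envelope sketch using the common gross-dies-per-wafer approximation - it ignores scribe lines, edge exclusion and defect yield, so the sellable count would be noticeably lower than the roughly 120 candidates it suggests:

```cpp
#include <cmath>
#include <cstdio>

// Sketch: gross dies per wafer for a ~470 mm^2 die (roughly GP102 / 1080 Ti)
// on a 300 mm wafer, using the standard approximation. Yield is ignored.
int main()
{
    const double pi = 3.14159265358979323846;
    const double waferDiameter = 300.0;  // mm
    const double dieArea = 470.0;        // mm^2

    const double waferArea = pi * (waferDiameter / 2.0) * (waferDiameter / 2.0);
    const double grossDies = waferArea / dieArea
                           - (pi * waferDiameter) / std::sqrt(2.0 * dieArea);

    std::printf("Gross dies per wafer: ~%.0f\n", grossDies);  // about 120
    return 0;
}
```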

Secondly, Nvidia has 7 different GPU designs across 42 SKUs that use the 12FFN node, and with no replacement for the 20 series appearing any time soon, there are still plenty of orders to fill on that manufacturing line.

Lastly, with Nvidia going to Samsung for their next processor, and with TSMC's 7nm order books full thanks to AMD's Ryzen and Navi, there's just no incentive to keep making a large, 3-year-old product.

To return to the thread topic, it would be nice if AMD and Nvidia released a breakdown of the GPUs they've shipped, even if it was only a relative percentage split; I don't think it would reveal anything we don't already suspect (i.e. smaller GPUs outship larger ones), but it would be good to see exactly which processor dominates TSMC's production lines.
 
There are a few reasons why this would be an unlikely choice to make. Firstly, the chips in 1080 Ti cards are pretty big (470 mm2), so a single 300 mm wafer isn't going to churn out all that many of them, and there's little incentive to produce such a large processor for a relatively low amount of money.

You kinda forgot that the $350 RTX 2060 is a 445mm2 12FFN chip, and with the cost of GDDR6 double that of GDDR5 (no idea about the GDDR5X the 1080 Ti uses), I'm pretty sure the 1080 Ti BOM would be close to the RTX 2060's after 2.5 years anyway. So yeah, with Nvidia being the lone customer for TSMC's 12nm FFN, they had to fill all the orders by themselves, thus creating the entire RTX lineup, with only the 2080 Ti making any sense of existing whatsoever (before the Super refresh).

I think the Steam Hardware Survey is a good indicator for GPU sales at the moment, since the mining era is gone for good. Nvidia is making bank off the 1650, 1660, 1660 Ti, 2060, 2060S, 2070S and 2080 Ti, while AMD only has the 5700 XT to rely on (the 2060S and 5700 XT have equal percentages). RX 570s and 580s are being resold from mining farms, so I doubt AMD is making any money from them.



Also, despite HUB's bashing of the 1650, I think the 1650 is meant for MMO games, so a verdict based on AAA game performance is not very convincing.
 
You kinda forgot that the $350 RTX 2060 is a 445mm2 12FFN chip, and with the cost of GDDR6 double that of GDDR5 (no idea about the GDDR5X the 1080 Ti uses)
Another good point, although I should have explored mine further - the GP102 chip filled 7 different SKUs (1080 Ti, Titan X, Titan Xp, Tesla P40, Quadro P6000, P102-100, P102-101), but all of them are/were very high-end and unlikely to recoup development costs by themselves. The TU106 doesn't fill many more SKUs (just 8), but its target market sectors (mid-range desktop gaming, mid-to-high-end laptop gaming) should, in theory, pull in more sales - enough to not only cover the development cost but generate a decent profit too (again, in theory). All that said, I think the appearance of the TU116/117 was in response to the issue you raised (large die).
 