cageymaru

At CES 2019, NVIDIA CEO Jensen Huang pulled no punches when voicing his opinion of the Radeon VII. When interviewed by Gordon Mah Ung of PCWorld, he called the Radeon VII "underwhelming." He said, "The performance is lousy and there's nothing new. [There's] no ray tracing, no AI. It's 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we'll crush it. And if we turn on ray tracing we'll crush it." He even suggested that AMD had only thought of the launch that morning. Jensen Huang went on to trash FreeSync as competition to G-SYNC, stating, "(FreeSync) was never proven to work. As you know, we invented the area of adaptive sync. The truth is most of the FreeSync monitors do not work. They do not even work with AMD's graphics cards."

On the subject of RTX pricing criticisms, he admitted to mistakes in rushing the first cards to market, but claimed all has been rectified with the $350 RTX 2060 release. He went on to completely dismiss Intel's graphics team as "basically AMD, right?" There are plenty more juicy quotes direct from the mouth of NVIDIA CEO Jensen Huang in the article!

"When can streaming be as good as a gaming PC? The answer is never," he said. The problem is simply physics and the speed of light. Streaming gaming can never match a graphics card in a box in terms of latency and image quality. But, he said, that doesn't mean game streaming isn't viable; it's just that the needs of an enthusiast gamer don't match those of a casual gamer.
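To put rough numbers on the speed-of-light argument, here is a back-of-the-envelope sketch; the distances and encode/decode overheads below are illustrative assumptions, not measurements from any actual streaming service.

```python
# Back-of-the-envelope added latency for cloud game streaming.
# All figures here are illustrative assumptions, not measurements.

SPEED_OF_LIGHT_IN_FIBER_KM_S = 200_000  # roughly 2/3 of c in vacuum

def streaming_latency_ms(distance_km, encode_ms=5.0, decode_ms=5.0,
                         server_frame_ms=16.7):
    """Estimate latency added on top of a local GPU: round-trip
    propagation plus video encode/decode plus the server's frame time."""
    propagation_ms = 2 * distance_km / SPEED_OF_LIGHT_IN_FIBER_KM_S * 1000
    return propagation_ms + encode_ms + decode_ms + server_frame_ms

# A data center 1000 km away costs ~10 ms in light-speed delay alone,
# before network queuing, Wi-Fi, or display overhead are even counted.
print(round(streaming_latency_ms(1000), 1))
```

Even with optimistic assumptions, the propagation term never goes away, which is the physics point being made.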
 
"(FreeSync) was never proven to work. As you know, we invented the area of adaptive sync. The truth is most of the FreeSync monitors do not work. They do not even work with AMD's graphics cards."

I think if this was the case, the true, real, actual case, someone in the tech press would have caught on before now, no? Why haven't we heard of anyone discovering that FreeSync is a scam?

In fact, I seem to remember a HardOCP blind test where people tended to pick the FreeSync monitor as the preferred experience.

 
He thinks the $350 2060 is a great deal, and I have to disagree with him. A $250 2060, that would have been a real deal. 6GB of VRAM on a $350 card is ridiculous and insulting. An 8GB 2060 at $300, that would have been awesome; $350 for an 8GB version, still good. But I'm sorry, 6GB really limits the lifespan of this card.
 
Lol. Jensen Huang... some of it's true. It's old tech shrunk. Intel does have a lot of ex-AMD'ers. "Lousy performance" is funny when it matches his second-best card, has more RAM, costs the same or less, and is running on a newer process.

AMD really does need a new or heavily updated architecture though. Like 2 years ago.
 
The only thing he got right in that article was the comment about the mainstream market being "right" about the prices of the RTX cards & how they weren't ready.
 
I think if this was the case, the true, real, actual case, someone in the tech press would have caught on before now, no? Why haven't we heard of anyone discovering that FreeSync is a scam?

In fact, I seem to remember a HardOCP blind test where people tended to pick the FreeSync monitor as the preferred experience.


I have 3 monitors on my desk right now, all of them FreeSync. My two cheaper ones (~$250) flicker, tear, and do all sorts of garbage; my more expensive primary one (~$600) works flawlessly (prices in Canadian dollars). FreeSync is a standard spec, that much is true, but the hardware used to implement it varies across the board, and with it the frequency ranges and timings each monitor supports. Some cheaper controllers and panels, for example, can't operate at frame rates that aren't divisible by 8, or can't go lower than, say, 24fps or higher than 60fps; if pushed outside that window, they do all sorts of bad things. The FreeSync spec is pretty open to interpretation, where G-Sync is very well defined in regards to operating specs, which requires more expensive parts but guarantees a higher quality of service overall.
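To illustrate why the supported refresh range matters so much, here is a small sketch of low framerate compensation (LFC)-style frame multiplication: when the frame rate falls below the panel's minimum refresh, the driver repeats each frame an integer number of times to land back inside the supported window. The refresh ranges below are hypothetical examples, not any specific monitor's spec.

```python
def lfc_refresh_hz(fps, vrr_min=48, vrr_max=144):
    """Pick a panel refresh rate for a given frame rate under VRR.
    Below the panel's minimum, repeat each frame an integer number of
    times (frame multiplication) to land inside [vrr_min, vrr_max].
    Returns None when no multiple fits -- the narrow-range case where
    a panel falls back to tearing, stutter, or flicker."""
    if vrr_min <= fps <= vrr_max:
        return fps  # natively in range
    if fps > vrr_max:
        return vrr_max  # cap at the panel's max refresh
    multiple = 2
    while fps * multiple <= vrr_max:
        if fps * multiple >= vrr_min:
            return fps * multiple
        multiple += 1
    return None

# A wide 48-144 Hz range handles 35 fps by doubling to 70 Hz;
# a narrow 48-60 Hz range has no integer multiple of 35 that fits.
print(lfc_refresh_hz(35))          # 70
print(lfc_refresh_hz(35, 48, 60))  # None
```

This is why a panel whose maximum refresh is less than roughly double its minimum can't be saved by frame multiplication at low frame rates.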

That aside, Jensen's best response to AMD's presentation would have been to pretend he didn't hear anything about it and pass it off as nothing worth mentioning. But nVidia is positioned well in this case: they priced the RTX cards high and they know it, so they can easily drop prices to match or beat AMD, having had a few months of sales under their belt. AMD, being second to market, can't do that as easily.
 
Jensen Huang speaks from experience.

 
I mean, I am a little underwhelmed as well. All AMD are doing with the VII is continuing the price curve that nVidia has started. 2080 levels of performance for slightly less than 2080 pricing isn't exactly getting me hot under the collar. Sure, 16GB is nice, but it is missing the RT and DLSS kind of value-adds. I would have been far more impressed with some disruptive pricing. Alas.

Pretty much everything else in that first paragraph is a joke though. "all has been rectified with the $350 RTX 2060 release"? In what fucking world? And I have to back up what Nexus6 said with regard to FreeSync as well.
 
I have 3 monitors on my desk right now, all of them FreeSync. My two cheaper ones (~$250) flicker, tear, and do all sorts of garbage; my more expensive primary one (~$600) works flawlessly (prices in Canadian dollars). FreeSync is a standard spec, that much is true, but the hardware used to implement it varies across the board, and with it the frequency ranges and timings each monitor supports. Some cheaper controllers and panels, for example, can't operate at frame rates that aren't divisible by 8, or can't go lower than, say, 24fps or higher than 60fps; if pushed outside that window, they do all sorts of bad things. The FreeSync spec is pretty open to interpretation, where G-Sync is very well defined in regards to operating specs, which requires more expensive parts but guarantees a higher quality of service overall.

That aside, Jensen's best response to AMD's presentation would have been to pretend he didn't hear anything about it and pass it off as nothing worth mentioning. But nVidia is positioned well in this case: they priced the RTX cards high and they know it, so they can easily drop prices to match or beat AMD, having had a few months of sales under their belt. AMD, being second to market, can't do that as easily.
It is true that FreeSync quality varies from monitor to monitor ... but that's what good ol' internet research is for.
 
AMD stands to make a lot of revenue if this performs as well as a 2070 Ti or 2080 and costs the same or less, especially if yields are good. If they also get ray tracing and something similar to DLSS working, they'd make a killing. DLSS isn't going to happen unless they license it, unfortunately. Ray tracing will depend on whether their architecture can handle it.

Huang is in denial, quite literally.
 
It is true that FreeSync quality varies from monitor to monitor ... but that's what good ol' internet research is for.
Exactly. I bought them knowing they had an issue, and really, you get what you pay for; they are 22" 1080p screens I use for Netflix and web browsing while I play on the main 32" 1440p. But with G-Sync you don't have to do that: if it is certified, it works, end of story. So you can throw a little shade at the fact they let poor implementations of FreeSync still get labelled as supporting it.

You can at least respect nVidia's dedication to their brand name. They aren't going to let some third party tarnish their reputation; they are gonna handle that shit themselves, because why have others do what you can do yourself, right?
 
Lol. Jensen Huang... some of it's true. It's old tech shrunk. Intel does have a lot of ex-AMD'ers. "Lousy performance" is funny when it matches his second-best card, has more RAM, costs the same or less, and is running on a newer process.

AMD really does need a new or heavily updated architecture though. Like 2 years ago.

I think he means it's lousy given it's using expensive HBM2 and 7nm while only matching the 2080 and missing all the features of RTX.
 
I mean, I am a little underwhelmed as well. All AMD are doing with the VII is continuing the price curve that nVidia has started. 2080 levels of performance for slightly less than 2080 pricing isn't exactly getting me hot under the collar. Sure, 16GB is nice, but it is missing the RT and DLSS kind of value-adds. I would have been far more impressed with some disruptive pricing. Alas.

Pretty much everything else in that first paragraph is a joke though. "all has been rectified with the $350 RTX 2060 release"? In what fucking world? And I have to back up what Nexus6 said with regard to FreeSync as well.

This. 100%.

I'm not all that worried about RT/DLSS as value-adds. I am seriously worried about the shifting price curve and erosion of value. I would have rather seen a card that competes with a 2060 for sub-$300 than one chasing the halo products. Unless you can claim the halo crown, that's a losing endeavor every time.
 
AMD stands to make a lot of revenue if this performs as well as a 2070ti or 2080 and costs the same or less. Especially if yields are good. If they also get raytracing and something similar to dlss working, they'd make a killing. Dlss isn't going to happen unless they license it, unfortunately. Raytracing will depend on whether their architecture can handle it.

Huang is in denial, quite literally.
I don't see a reason for AMD to work on ray tracing this cycle. There are so few games that use it, and the performance impacts are large enough that only a fraction of the people who have RTX cards will bother with it; better to wait until next cycle, when most of the major engines have worked that shit out. Same goes for DLSS. DLSS is basically still in beta: it primarily moves the AA process off the shaders to the tensor cores to provide a small performance bump, but in most implementations it internally renders at a lower resolution and upscales, so it is more of a cheat than anything else. While it does give better performance than TAA, it doesn't look as good, or it renders in ways that many find distracting (myself included). Again, this is a feature AMD can either develop a competitor to, or wait until it's finalised and implement after the fact in their next hardware cycle.
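The render-lower-and-upscale point is easy to quantify: per-pixel shading cost scales roughly with pixel count, so rendering internally below the target resolution saves a predictable fraction of shader work. The 2/3 internal-resolution factor below is an illustrative assumption, not a published DLSS parameter.

```python
def shading_savings(target_w, target_h, scale=2/3):
    """Fraction of per-pixel shading work saved by rendering at
    `scale` of the target resolution per axis, then upscaling.
    scale=2/3 is an illustrative internal-resolution factor."""
    native = target_w * target_h
    internal = round(target_w * scale) * round(target_h * scale)
    return 1 - internal / native

# Rendering 4K internally at 2/3 scale (2560x1440) shades ~56% fewer
# pixels than native 3840x2160 -- the source of the performance bump.
print(f"{shading_savings(3840, 2160):.0%}")  # 56%
```

The trade-off is that the upscaling pass has to reconstruct the missing detail, which is exactly where the image-quality complaints come from.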
 
AMD has 2 years to get ray tracing. Next-gen cards from NVIDIA will have way better RTX. Then everyone will go "it's the card that the 2080 should have been."
 
This. 100%.

I'm not all that worried about RT/DLSS as value-adds. I am seriously worried about the shifting price curve and erosion of value. I would have rather seen a card that competes with a 2060 for sub-$300 than one chasing the halo products. Unless you can claim the halo crown, that's a losing endeavor every time.

I'm personally not arsed about RT/DLSS either, I just want as much rasterisation performance as possible for my money. But objectively speaking they do add value, it's just that value ranges from "Nothing" to "Must own" depending on the person.

Frankly, as much as it hurts to say, depending on the actual retail prices (vs the MSRP), the RTX 2080 actually is better value. If this came in at $600 it would be a much more interesting launch for like-minded people. $500 and the internet would be on fire.
 
The only thing I look forward to in these interviews is what jacket he is wearing.
 
I'm still waiting on a mid-range announcement from AMD. The $699 isn't going to work with my budget. I'm wanting something for $400 or less.
 
I think he means it's lousy given its using expensive hbm2 and 7 nm while only matching the 2080 and missing all the features of rtx.
HBM may be marginally more expensive than GDDR6, but as Samsung ramps up production of HBM (it does better with real-time analysis of sensor data and other workloads), its costs are coming down, while decreasing production of GDDR6 is bringing its prices up, so the cost difference may not actually be that large. Due to the nature of the spec, though, HBM does limit memory configurations, making it harder to get into specific price segments. That said, honestly, "only matching" the second-fastest card on the market at stock speeds isn't something I would consider bad. I mean, who out there who owns a 1080 Ti is jumping up and down saying how badly they need the 2080 Ti? Compare that to the people out there running, say, 980s or older, looking for an upgrade and weeping at the cost of the 2080s. If nothing else, this will force a price shift and bring some nice bundle packages come March.
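As a quick sketch of the bus-width trade-off: peak bandwidth is bus width times per-pin data rate, divided by 8 bits per byte. The Radeon VII's four HBM2 stacks give a 4096-bit bus at a modest 2.0 Gbps per pin, while the RTX 2080's GDDR6 runs a 256-bit bus at 14 Gbps.

```python
def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: bus width in bits times the
    effective per-pin data rate in Gbps, divided by 8 bits per byte."""
    return bus_bits * gbps_per_pin / 8

# Radeon VII: 4 HBM2 stacks x 1024 bits at 2.0 Gbps
print(bandwidth_gbs(4096, 2.0))   # 1024.0 GB/s
# RTX 2080: 256-bit GDDR6 at 14 Gbps
print(bandwidth_gbs(256, 14.0))   # 448.0 GB/s
```

This is also why HBM capacities come in coarse steps: dropping a stack to save cost also removes a quarter of the bus width and the bandwidth that goes with it.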
 
I'm sure at this point anything other than fielding questions on why your stock lost half its value in a few weeks gets you hard in the pants. Not surprised he's laying it on thick; after all, this was his actual element, not AI, crypto, fabrication, and the like. Keep him in the graphics dogfight, I do enjoy the spectacle.
 
I'm personally not arsed about RT/DLSS either, I just want as much rasterisation performance as possible for my money. But objectively speaking they do add value, it's just that value ranges from "Nothing" to "Must own" depending on the person.

Frankly, as much as it hurts to say, depending on the actual retail prices (vs the MSRP), the RTX 2080 actually is better value. If this came in at $600 it would be a much more interesting launch for like-minded people. $500 and the internet would be on fire.

I can understand that with HBM and on 7nm, AMD may not be able to afford to sell it for less, so they cranked the clocks up to whatever they had to in order to hit a competitive price/performance point. I think that was a mistake, but you gotta roll with what you got back from the fab, I suppose.

I agree, $600 it would be very interesting, and $500 the world would have just blown a load.

Heck, if nVidia hadn't just announced FreeSync support, you could make a very compelling case at $700... not exciting, but you could have kept a straight face while attempting to do it. But you can't use the VRR/monitor cost argument against nVidia any longer. Now it's two cards, both costing about the same; one has double the power draw, and the other can also pretend to do ray tracing and whatever magic pixie dust DLSS ends up actually being. The power draw difference alone is significant, and that's before you get to the people who blindly buy Green because that's what they are conditioned to do. I don't think the VRAM even enters the conversation (except for a very niche crowd that actually needs it, and they were probably going to buy Pro-class cards anyway).
 
Starting to think there is something used in making the leather jackets he loves to wear that is causing him to act up LOL. Either sniffing the jackets or direct skin contact.
 
I agree that the Radeon VII (Vega 2) is underwhelming when strictly looking at performance, though as of right now the value seems to be pretty solid (this may change once third parties bench the chip). As far as G-Sync and FreeSync/Adaptive-Sync/VRR go, the only one I consider a scam is G-Sync, and I'm willing to bet most of the issues the NVIDIA cards had with FreeSync monitors can be fixed with drivers.

Also, the RTX 2060 is pretty underwhelming in every way. The move from the GTX 960 to the GTX 1060 was a way better deal than the move from the GTX 1060 to the RTX 2060 will ever be.
 
There's a lot of petty bitterness in Jensen's words. I expect that having the money to create a narcissistic bubble, one that insulates a person from all social interaction that might provide an opportunity to consider and calibrate one's personality, does wonders to regress an adult back into a child, emotionally.


Is it me or does Jensen Huang sound like a toolbag in this piece?

So far, I've only read the H topic's OP, and that comes through loud and clear.
 
The only thing that is going to get crushed further is the stock of his company. I'm pretty sure SoftBank has a good reason to get rid of all its Nvidia stock this year. There is an old saying: "The greedy lose twice"... :D
 
Once again, everyone thinks of the now and not the later. I do think this card is underwhelming, and I think AMD knows it. They have a new architecture in the works, and we all know they've invested an obscene amount of money and time to go 7nm, even if TSMC is doing the grunt work.

So they get 7nm ready now and prove they can do it. Learn from it so when the next gen is ready for production it will not be held back by the 7nm transition. Oh and by the way, they get to make at least some of their money back now with this card.

It's a proof-of-concept part that they took a little further to make some money back.

Would be nice if nvidia pulled their heads out of their asses though. I think we'd all prefer a healthy market.

If you're thinking I'm an AMD fanboy, I'm currently running an NVIDIA GTX 970 in my rig. I just buy whatever seems the best value for my needs at the time.
 
As long as the VII doesn't explode and catch fire they'll do okay.

...and I quite enjoy Freesync, and I did very little research into my monitor except look for the Freesync logo.
 