Hopefully the piss-poor Turing sales spur NVIDIA to build another GPU like Pascal that is 100% for gaming, not some cut-down pro rendering card meant to pad their margins by treating customers as suckers.

Citation needed. Everything up to the 2080 Ti was consistently sold out, and the price never dropped. If you understand how supply and demand works, you understand that prices only drop when the market no longer supports them.

Where is your "piss poor sales" data?

Anyone who is even remotely unbiased will recognize that Turing is a Turd and DLSS is hot garbage.
You sound totally unbiased. ;)
 
Citation needed. Everything up to the 2080 Ti was consistently sold out, and the price never dropped. If you understand how supply and demand works, you understand that prices only drop when the market no longer supports them.

Where is your "piss poor sales" data?

Nvidia themselves said Turing sales failed to meet expectations.
 
keep-calm-and-stay-on-topic-2.png
 
Can we get a Star Wars-themed "keep calm and stay on target"? :p (Sorry, I think that every time I see it!)
 
The final image using DLSS really is appalling when stacked up against a simple upscale.

nVidia made a huge miscalculation: people playing games today are much more savvy than they were 10-20 years ago, and nowhere near as easy to fool with empty marketing bluster and bluff. The company would have done far better to release the 2080 Ti without RTX and DLSS, and, more importantly, without the physical Tensor cores, for ~$400 or so. It would have been hard to beat had the company shown a bit better judgment. But nVidia has always been big on proprietary features (G-Sync, et al.) that are both expensive and often hardly worth it.

"AI", of course, is a marketing term in this context, it's just algorithms and programming--seems like an incredible waste of time, energy, and money--the "deep learning" should have been done by nVidia before trying to foist something like this on people. There are so many other approaches that already do a demonstrably better job than DLSS, as this informative presentation highlights.

Takes me way back to the 3dfx-nVidia wars, when 3dfx introduced FSAA. nVidia's immediate response was "We don't believe in FSAA, we believe in high resolutions," which, unfortunately, none of their GPUs at the time could adequately do. And you had your occasional person who swore up and down that his low-res image with jaggies galore looked "so much better" than 3dfx's low-res beautifully jaggie-free (98% anyway) FSAA'ed image quality at the time. Some things never change--it's kind of comforting, in a way...;)
 
I'm not sure that Nvidia could have invested enough to encourage game developers to buy into RTX and DLSS without releasing it. We could speculate either way.

It's not performing as expected and Nvidia has owned up to that. Performance in games seems to be variable so far. Taking one game where the implementation is obviously horrible and using the title DLSS = FAIL is what makes it clickbait. If you enjoy reading negative trendy BS like that, good for you!

I think it may be some time before we see an engine that can take advantage of the hardware to its fullest, and I doubt it will be a game that's been retrofitted to use the technology.

If people keep railing against it, devs may hesitate to support it, which I think is what part of the subculture wants to see.

How exactly is talking about one of the primary pillars of Nvidia's RTX marketing clickbait? FFXV's use of DLSS is bad, BFV's use of it is horrible, Metro's use seems okay but not great. As it stands right now, DLSS is a bad feature, and Nvidia put a ton of hype and marketing behind something that does not work as well as they claimed. Right now DLSS is a worse solution than FXAA, and the performance benefits are nowhere near worth the downgrade in image quality. That's not even getting into the utterly stupid way Nvidia and game devs are segregating who can use it at specific resolutions. Nvidia excuses it by saying "it will get better." If it's a bad feature now, they should have waited to roll it out until it got to that "better" point. For as dubious as the benefits of RTX are right now, at least those features work properly.
 
I'm not sure that Nvidia could have invested enough to encourage game developers to buy into RTX and DLSS without releasing it. We could speculate either way.

It's not performing as expected and Nvidia has owned up to that. Performance in games seems to be variable so far. Taking one game where the implementation is obviously horrible and using the title DLSS = FAIL is what makes it clickbait. If you enjoy reading negative trendy BS like that, good for you!

I think it may be some time before we see an engine that can take advantage of the hardware to its fullest, and I doubt it will be a game that's been retrofitted to use the technology.

If people keep railing against it, devs may hesitate to support it, which I think is what part of the subculture wants to see.

Far too often the hardware press is silent or downplays things like this. We need people like GN, HUB, or [H] to call out stuff like this. It keeps users in the know and sends a message to the hardware companies that people are aware of the lies and not happy about it. I'd love to see the entire tech press talking about DLSS and how bad its current implementation is. Nvidia owning up to it is one thing; now they need to get off their ass and prove they mean it. DLSS had a chance to be revolutionary, to be a defining reason to get Turing cards at any level. Right now it's the exact opposite. I'd rather devs not waste time working on it if Metro is the best result we're going to get right now. I haven't felt this disappointed by new graphics tech in a long time.
 
Far too often the hardware press is silent or downplays things like this. We need people like GN, HUB, or [H] to call out stuff like this. It keeps users in the know and sends a message to the hardware companies that people are aware of the lies and not happy about it. I'd love to see the entire tech press talking about DLSS and how bad its current implementation is. Nvidia owning up to it is one thing; now they need to get off their ass and prove they mean it. DLSS had a chance to be revolutionary, to be a defining reason to get Turing cards at any level. Right now it's the exact opposite. I'd rather devs not waste time working on it if Metro is the best result we're going to get right now. I haven't felt this disappointed by new graphics tech in a long time.

I understand your frustration, but I'm not sure the expectations that everyone has are realistic. In today's market, with investors demanding ever-escalating profits and no tolerance for variance, companies are hard-pressed in the R&D department. Considering that the tensor cores have other applications like machine learning, it was probably easier for NV to just launch the product and work on the software development with partners over time. It probably would have made people happier if the 1660 Ti had shipped sooner to offer a cheaper alternative, but it looks like it's right around the corner.

I also think that [H]ardOCP did a very fair job of reviewing the technology.

My primary beef is with many of the YouTube reviewers whose analysis is only surface-deep and derivative. They spice up their commentary and stir up rage and indignation to get views. It's nothing more than pandering to the lowest common denominator.
 
YouTube journalism is worse than the tabloids. In the age of the internet, anyone with a webcam can become a "Hardware Expert," and when cagey posts links on this site it does nothing other than support the general level of ignorance.

It's really sad. I'd rather go back to the days when it was just Kyle, Tom, and Anand, and Kyle called people out for posting bullshit or inaccurate data.

What you end up with these days is more videos like the one the Verge posted on building a gaming PC, but where the mistakes and inaccuracies are less evident, so people accept them as gospel.

Assuming you are not just posting for attention and to distract from DLSS, it should be noted that not all techtubers are created equal.

If the video lacks an intro and is just a bunch of bar graphs with crappy rock music in the background, you can ignore it.

However, HUB's material is good enough to be published on TechSpot, and others like Gamers Nexus publish their material as well. They are not just some geek playing games in their bedroom.
 
Assuming you are not just posting for attention and to distract from DLSS, it should be noted that not all techtubers are created equal.

If the video lacks an intro and is just a bunch of bar graphs with crappy rock music in the background, you can ignore it.

However, HUB's material is good enough to be published on TechSpot, and others like Gamers Nexus publish their material as well. They are not just some geek playing games in their bedroom.

The point that I'm trying to make here is that I think many sites, including HUB, are peddling rage porn rather than doing thoughtful hardware reviews. As Curl pointed out, HUB seems to be an equal-opportunity hater. I personally don't believe it's helpful for the industry or the consumers. What it does do is incite the community and garner clicks and cash.

LOLOL This has to be a joke. If anything, they are equal-opportunity haters. They give AMD much more hell than they ever gave NV.
 
The point that I'm trying to make here is that I think many sites, including HUB, are peddling rage porn rather than doing thoughtful hardware reviews. As Curl pointed out, HUB seems to be an equal-opportunity hater. I personally don't believe it's helpful for the industry or the consumers. What it does do is incite the community and garner clicks and cash.

It's clear you don't actually watch HUB content and are trying to judge based solely on a single video.
 
Just a heads up, they are a known biased anti-nVidia website. Not a place you want to get your information or news from.
Come on, they gave the 2060 a thumbs up and the Radeon VII a right kicking... the opinion about them being anti-Nvidia seems to stem from their video questioning whether they didn't get an early 2060 review sample because they had previously knocked another Nvidia product.
 
No, what they are is anti-bs. Hardware Unboxed doesn’t hold back on their opinions and tells it how it is.

A lot of people say that they are biased just because they haven't signed the NDA from Nvidia. Does not signing the NDA automatically mean you are biased? Seems like a nightmare 1984 scenario. Does this make Kyle biased because he doesn't want to be told how to write his reviews by the manufacturer?
 
Citation needed. Everything up to the 2080 Ti was consistently sold out, and the price never dropped. If you understand how supply and demand works, you understand that prices only drop when the market no longer supports them.

Where is your "piss poor sales" data?


You sound totally unbiased. ;)


Well, unless you've been hiding under a rock, this isn't news: https://www.fool.com/investing/2019/02/15/nvidias-q4-report-24-lower-sales-earnings-cut-in-h.aspx

  • The gaming segment suffered 45% lower sales due to weak sales of graphics cards for the gaming market plus declining shipping volumes of NVIDIA-powered Nintendo (NASDAQOTH:NTDOY) Switch gaming consoles. This core division accounted for 45% of NVIDIA's total revenues in the fourth quarter, down from 55% in the third quarter and 60% in the year-ago period.
And third, sales of certain high-end GPUs using our new Turing architecture, including the GeForce RTX 2080 and 2070, were lower than we expected for the launch of a new architecture.

Kress noted that the new cards provide a "revolutionary leap in performance and innovation" over the products that came before, but NVIDIA's potential customers appear to be waiting for lower chip prices or more widespread implementation of the new technologies in actual games.
 


Another anti-Nvidia video showing the 2080 making the Radeon VII look very bad... God, that anti-Nvidia bias is showing very badly in this video!


Lol, I wish my sarcasm meter and reading comprehension weren't broken. Watched the whole video and was like, where's the anti-Nvidia stance? Lmao. I generally just trust HardOCP reviews when I'm looking to buy, because Brent and Kyle's subjective observations usually match my own and I hate videos, but the benchmark review was detailed and objectively solid, with great subjective observations as well.
 
Lol, I wish my sarcasm meter and reading comprehension weren't broken. Watched the whole video and was like, where's the anti-Nvidia stance? Lmao. I generally just trust HardOCP reviews when I'm looking to buy, because Brent and Kyle's subjective observations usually match my own and I hate videos, but the benchmark review was detailed and objectively solid, with great subjective observations as well.

Yep, that is why I like HUB... They actually play the games, and they specifically mentioned they do NOT use in-game benchmarks, etc. That's why I like them... they tell it how it is.
 

You are wasting your breath; any second now an nvidiot is going to come in here and blame it solely on cryptocurrency.

It doesn't take a rocket scientist to know that adding $200-400 to the price of your next-gen cards is going to price you out of the market. The fact that you don't get proportional performance benefits is just icing on the cake. Instead of a Bugatti Veyron, you got last year's Lambo with a built-in microwave and a coffee maker. Sure, you can make breakfast, but it's hard to drive fast with a Hot Pocket in your mouth.
 
All this talk about RTX features did get me thinking about yesteryear... and GeForce T&L when it was introduced. I think our conclusion from 19 years ago very much applies today.

Let me close with this: don't let anyone blow so much smoke up your skirt that it clouds your vision. (No, we don't wear dresses.) You better believe that marketing companies are paid BIG BUCKS to do just that to you and me while trying to get us to buy their products. No, they are not lying to us, but sometimes there can be so much spin it might remind you of that top you had when you were a little kid.
 
You have to give new stuff some time to get implemented correctly. Like with DX12 async compute: all new AAA games use that to magnificent results these days, right? … right?

It does yield good results, just not on NV GPUs until Turing. Meanwhile, it's been in GCN since at least 2.0, if I recall correctly.
 
I think right now this DLSS and RTX/ray tracing "experiment" has shown they are not ready, and not just from a technology perspective. At least part of this is dependent on the content creators to implement it (and implement it well). I do wonder what 10 years down the line will look like. Will someone come up with something that supersedes this before it can really build much steam? Could this go the way of "3D"? Does it become standard?
 
It does yield good results, just not on NV GPUs until Turing. Meanwhile, it's been in GCN since at least 2.0, if I recall correctly.

Yeah, but my point is that almost no one uses it; most DX12 implementations are pretty meh, other than Ashes of the Singularity, which only gets used to benchmark things.
 
There is no AI on the cards.
The AI is on the servers NVidia uses to optimise the data for DLSS.

But they could still release "experimental DLSS files" and allow users to test them out. After testing them, the person chooses the "best." Have it all go through GeForce Experience. Reward the users with Battlefield V skins or whatever.
 
It wouldn't work. So many reasons why it wouldn't work. Where would you even begin? How does someone rate the scene? Are they going to stop gaming every couple of seconds to rate the last few seconds of gameplay? Those few seconds of gameplay would have to be sent to Nvidia to put into their supercomputer. There is also a flaw in your understanding of the process: the Tensor cores on your card don't "learn," they just run the algorithm that the supercomputer's learning has created.

But even if they somehow managed to implement something like this, it would still be a disaster. Think of the thousands upon thousands of subjective viewpoints of data going into training the AI. You have to train the AI to bridge the gap between a low-res image and a super-high-res image, not what thousands of people think is a good image. The AI would have a meltdown.
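For what it's worth, here's a minimal toy sketch of that split, assuming a PyTorch-style workflow. The tiny 2x upscaler and the "dlss_profile.pt" file name are stand-ins I made up, not Nvidia's actual DLSS network or delivery mechanism; the point is just that the "learning" happens offline against paired low-res/high-res frames, and the card only runs the frozen result forward.

import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    """Tiny 2x upscaler standing in for the real DLSS network."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 3 colour channels x 4 for a 2x2 upscale
            nn.PixelShuffle(2),                  # rearrange channels into a 2x larger image
        )

    def forward(self, x):
        return self.body(x)

# "Supercomputer" side: offline training on low-res / high-res frame pairs.
model = ToyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()
for _ in range(100):                             # real training would use per-game frame datasets
    low_res = torch.rand(8, 3, 64, 64)           # placeholder low-res frames
    high_res = torch.rand(8, 3, 128, 128)        # matching "ground truth" high-res frames
    optimizer.zero_grad()
    loss = loss_fn(model(low_res), high_res)
    loss.backward()
    optimizer.step()
torch.save(model.state_dict(), "dlss_profile.pt")  # roughly what would ship via a driver or game update

# Gamer's card side: inference only, no learning happens here.
shipped = ToyUpscaler()
shipped.load_state_dict(torch.load("dlss_profile.pt"))
shipped.eval()
with torch.no_grad():
    frame = torch.rand(1, 3, 64, 64)             # one rendered frame
    upscaled = shipped(frame)                    # the tensor cores just accelerate this forward pass

In other words, user ratings could only ever feed the offline half; nothing on the card itself gets any smarter.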
Just have the person rate each scene. Can't you pause the game with Ansel and take a picture? If so, then it should work the same way to rate a scene out of 5 stars, right?

Give people who rate scenes a hidden "gamer rating." If they choose scenes that the community likes, then raise their rating.

Doesn't seem too hard to me. Call it Hybrid DLSS. :)
 
Just have the person rate each scene. Can't you pause the game with Ansel and take a picture? If so, then it should work the same way to rate a scene out of 5 stars, right?

Give people who rate scenes a hidden "gamer rating." If they choose scenes that the community likes, then raise their rating.

Doesn't seem too hard to me. Call it Hybrid DLSS. :)

What? It doesn't work like that. Doesn't seem too hard to you? That's because you have no understanding of how it works. Sending screenshots of what they think is good, sheesh!! Think of the administration needed for something like this. You would have to check every image, because some people would send crap on purpose, some would have no clue, and some people think they know good picture quality but don't.

I am sorry to be blunt, but it's a completely daft idea.

All that home users could do is add their computing power to Nvidia's supercomputer to try and train the AI faster, sort of like BOINC.
 
There are people who need Tensor cores: for example, deep learning work like Google's TPU logic, and software such as AlphaZero & Leela Chess Zero.

I think NVIDIA didn't go about this release in the right way, but there is a demand for machine learning using Tensor cores.

Sure there is... But soon there will be zero demand from gamers.
 
So if DLSS is basically garbage, what else is there to do with those Tensor cores everyone is paying for?
 
NVidia rolling out the 5-year NDA subscription was the moment to avoid the upcoming card, especially with a 50% premium and a deliberately hidden agenda as to why.
It smelt bad from the start.
 
So if DLSS is basically garbage, what else is there to do with those Tensor cores everyone is paying for?

You can play chess.

The 1660 3/6 GB especially might make a cheap & good card to run Leela Chess on, if it retains the tensor cores.
 
I’ll take resolution over any AA all the time; it just looks better.

I tend to prioritize higher frame-rates where possible, but I definitely agree on resolution over AA. Resolution effectively hides jaggies anyway, as long as you're not stretching it over too large a screen.
 
What? It doesn't work like that. Doesn't seem too hard to you? That's because you have no understanding of how it works. Sending screenshots of what they think is good, sheesh!! Think of the administration needed for something like this. You would have to check every image, because some people would send crap on purpose, some would have no clue, and some people think they know good picture quality but don't.

I am sorry to be blunt, but it's a completely daft idea.

All that home users could do is add their computing power to Nvidia's supercomputer to try and train the AI faster, sort of like BOINC.

Well, how do NVIDIA engineers determine that something looks good? Do they just "trust" the AI, or do they actually inspect the work that it performs and then adjust the algorithm to correct for errors?

Have you seen how Steam Controller profiles work? All of them are submitted to the same repository. Everyone can look through them and choose the one that they want to use. The Steam Controller profile with the highest actual play time is the first one that I see when I look through them. So in that instance the metric to determine the "best" is the one that people use the most.

So what's so hard about making various DLSS profiles? Allow people to choose which one they want to try, then vote on whether they like it or not. They can even go by hours logged, like Steam, if they want.

It's not hard at all.

Also, I didn't suggest that they send in screenshots. I said pause the game like it's done with Ansel. So you play through a scene, then hit the pause button and rate the scene. Check boxes that say "too fuzzy," "picture perfect," "too sharp," etc. Ever seen an overlay in a game? Used MSI Afterburner? Free player feedback and free labor. It's just Big Data; I thought NVIDIA was all about collecting and processing data? I vote on items I purchase from Amazon all the time.

What's so hard about having multiple DLSS profiles for a game and voting on them? Maybe I'm as "daft" as you say, but I think user feedback could have saved them time and money with DLSS if they had shown the community "real DLSS" in action instead of "best case scenarios."
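Purely to illustrate the proposal (this is hypothetical; Nvidia exposes no such profile or voting API, and the profile names and weighting below are made up), ranking DLSS profiles the way Steam Controller profiles are ranked could be as simple as:

from dataclasses import dataclass, field

@dataclass
class DlssProfile:
    name: str
    hours_logged: float = 0.0
    ratings: list = field(default_factory=list)  # e.g. "too fuzzy", "picture perfect", "too sharp"

    def score(self) -> float:
        # Weight actual play time (like Steam Controller profiles), plus a bonus for good ratings.
        good = sum(1 for r in self.ratings if r == "picture perfect")
        return self.hours_logged + 10.0 * good

profiles = [
    DlssProfile("experimental-a", hours_logged=1200, ratings=["picture perfect", "too sharp"]),
    DlssProfile("experimental-b", hours_logged=300, ratings=["too fuzzy"]),
]

# Show users the highest-scoring profile first, the same way the Steam profile browser does.
best = max(profiles, key=lambda p: p.score())
print(f"Recommended profile: {best.name} (score {best.score():.0f})")

Whether the ratings themselves could be trusted is, of course, exactly the objection raised above.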
 