NVIDIA’s RTX Speed Claims “Fall Short,” Ray Tracing Merely “Hype”

They may not be, though. Both Hardware Unboxed and Gamers Nexus stated in videos posted just yesterday that not only do they not have review samples yet, but "NOBODY" has them.

A GPU review, especially one as anticipated as these are, can't be thrown together overnight. I've never done one, obviously, but I would guess that from the time a card shows up on Brent's doorstep, through all the testing, retesting, and benchmark generation, through actually writing the article, to the time Kyle has finished editing, written up his summary, and finally uploaded it to the website to go live, three weeks is probably cutting it pretty close.

That's why I said earlier that I wonder if Nvidia is doing this deliberately, to keep reviews from the public until the actual retail release, after all the pre-orders are sold out. I'm hoping that's just me being all tin-foil-hat.

Sure, it takes three weeks if everyone writing reviews is completely incompetent...

A lot of the stuff you wrote down doesn't take more than a few minutes or hours. Like "benchmark generating". That takes minutes in Excel.
 
There are rumors that, with the CUDA cores being more versatile, there could be ~50% rasterization gains.

If there were huge IPC gains, they would have mentioned it at their event. All signs point to Nvidia burning early adopters, which is not new behavior for them. I would love to be proven wrong, but I'm not optimistic.

They did share the 2080 vs. 1080 charts. There is some performance unaccounted for, but not 50%; more like 20-30%. Which would be huge in the 2080 Ti's case...
 
Yep. And the crazy thing is that only a small percentage of the chip is dedicated to this. Nvidia knows the software devs need time to make the change, so they're rationing the chip's hardware between normal rasterization and ray tracing. As more software goes this route, they'll up the dedicated die space for it. Makes perfect sense.

And I don't understand the negativity when the card hasn't even been benched yet.

The only thing I am complaining about is the price for the 2070. If it ends up being $400 all said and done shortly after launch I won't mind, but I don't see that happening.
 
Yes, Ray Tracing is the future. In order to implement any new computer technology we need hardware, software, and "consumables" which utilize both of them.

If a game (a "consumable") utilizes Ray Tracing(tm), I'm sure there's a cost involved. Now your $59.99 marquee game is gonna cost, say, $109.99. (Nvidia did the same with G-Sync. C'mon...they're gonna get their 2 1/2 pounds of flesh if someone wants a game (or other "consumable") to be "Ray Tracing(tm) Certified".)

That game will not sell unless a video card can run it.

That video card cannot run it unless there's software which can implement it.

Chickens and eggs...

Yes, this is a first step and it will not usher in a wondrous new age of gaming as soon as the card clicks into my PCIe slot. Instead, it will lay a foundation for future development.

Of course, I will forego any of these new cards due to two items:
1.) The prices are outrageous
2.) Nvidia's corporate practices are repugnant
 
If there were huge IPC gains, they would have mentioned it at their event. All signs point to Nvidia burning early adopters, which is not new behavior for them. I would love to be proven wrong, but I'm not optimistic.

Disagree. If there were huge IPC gains, the smarter move is doing the opposite and staying quiet about it so they don't cannibalize the Pascal cards they're trying to clear.

And not sure how any early adopter would get burned when returns exist.

There would be no business upside to spilling early. As I've pointed out, the performance is conveyed in the price points. They tell us where the benchmarks will fall.
 
I'm irritated by the 2080 Ti because I think that should have been the 2080. There's nothing wrong with the performance of the 2080 Ti, but that card isn't going to be leading for long at all in the new RT features. If what we have seen in benchmarks so far holds, and I understand everything's beta and early, then the 2070 is going to be a weak card for the new features.

So the now 2080 should be the 2070 and the 2080Ti should be the 2080 and they should be working on a shrunk 2080Ti. I'd be willing to put $500 on 2080 performance, but not on the 2070.
 
... there’s good reason to keep your wallet in your pocket and wait and see how this plays out...

Duh, it's called wait for the [H] review..

Don't really need some other website to tell me that...


'I hope they flop' - pretty stupid thing to say...
 
Yeah, pre-orders are kind of dumb, but you can just keep the thing if you're happy with the upgrade or return it if it's not worth it. Or wait a few months and probably get a card closer to MSRP. I'm going from a regular 1080, so I'm guessing it'll be about 2x faster at high res. Still pricey, but meh.
 
It's new. Give it a few generations.. Don't expect miracles with this generation of cards...

But people shouldn't be paying, or be charged, exorbitant amounts for what is currently a gimmick feature that isn't going to be usable for them. How many people who spend $800 - $1200 on a GPU use a 1080p monitor? And how many of them are going to want to play games at 30 FPS?

Also, the 30 - 60 FPS mentioned probably only applies to the RTX 2080 Ti. People buying 2070s and 2080s probably won't be able to use ray tracing while getting a satisfying FPS - and so, they'll simply not use ray tracing beyond a 5 - 10 minute novelty after getting their GPU or a new ray-tracing game. They'll likely get into the habit of turning off ray tracing and then getting on with playing their game.

So, the 20XX pricing is whack.
 
I would like to point out one example where the first gen of new tech did not flounder but actually succeeded in epic fashion: the 8800 GTX. It ushered in not only a completely new architecture (moving away from pixel pipelines to stream processors) but was also the first to support hardware PhysX.
The 8800 GTX was not only successful, it utterly dominated the 7900 GTX; in many cases it was 2x faster.
 
I just love the way people jump all over new gimmicks like ray tracing as though it's going to be a game changer. Every 'showcase' for ray tracing I've seen just makes things a bit shinier, not noticeably better. It won't look significantly more photo-realistic, especially if you're actually playing the game. More importantly, it won't make a game better.

There are a great many things they could focus on that would do more to improve photo-realism, but they'd never do that because it doesn't sell new hardware the same way 'groundbreaking new features' do. Hardware companies have to keep coming up with new garbage so they can sell a device that's only really a minimal improvement over current technology. They know it's a bad deal, so they'll spend a fortune trying to convince people that your life was incomplete without it and that if you don't rush out and get it straight away you'll somehow be missing out.

That said, if playing games with some ray tracing (games that were every bit as playable without it) means that much to you and you feel like ignoring common sense and not waiting for a few reviews, then go ahead and spend your money. Just remember buyer's remorse sucks. :D
 
This generation is really losing its luster already, and it's just due to this marketing BS. Between Tom's paid fluff piece (takes two to make a fluff piece), the NDA, GPP, need I go on? It's just so much shit from Nvidia at ONCE. I think they were cocky enough to think that the absence of AMD meant everyone would be OK with this attempted GPU industry takeover.
 
Nvidia's Gamescom presentation of their RTX cards with ray-tracing processing was just hot air.

AMD is defining the narrative of gaming, for now. They are producing the CPUs/GPUs for the PlayStation/Xbox systems currently and for the next gen. Ray tracing will become standard in games when it's standard on consoles. It's not economically logical to buy an overpriced RTX card to play some first-gen ray-tracing showpieces.
PC gamers are beholden to console developers' willingness to go the extra distance, which is usually limited. I don't see this getting big traction at all until it hits console level, and I don't see that happening anytime soon without Nvidia making a legitimate big-league hardware console.
 
Thank god I am happy with my 1080 Ti right now, because I sure as shit don't wanna get this card when it comes out. I will have to skip a few generations with Nvidia until this new tech matures. And that price, lol.
 
I see no reason to read, much less get excited over, these click-bait articles. They're just speculation by the ignorati.

I'll wait for the [H]ardOCP review before I even think of ordering an RTX 20X0.
It's not like I need to upgrade from my GTX 1080, it's doing fine, thanks.
But YMMV, feel free to roll the dice and order an RTX, if you'd enjoy doing so.
 
Disagree. If there were huge IPC gains, the smarter move is doing the opposite and staying quiet about it so they don't cannibalize the Pascal cards they're trying to clear.

And not sure how any early adopter would get burned when returns exist.

There would be no business upside to spilling early. As I've pointed out, the performance is conveyed in the price points. They tell us where the benchmarks will fall.
If they were worried about cannibalizing the Pascal cards, they wouldn't be releasing these at all right now, unless it's just a hype paper launch and perhaps an attempt to force retailers' hands a bit to ease up on the Pascal-generation price gouging.

Couldn't they do sort of the opposite of DSR with ray tracing: trace at a lower resolution and upscale to native, while keeping native-resolution shading/lighting off for the rasterized graphics, and blend the combined images? That would definitely have a lower performance impact than fully ray tracing at native resolution, but I would still expect a big improvement to lighting and shading over rasterization, which is pretty fake to start with, comparatively speaking.
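
Purely as a toy illustration of that blend, here's a minimal Python sketch: trace lighting at quarter resolution, nearest-neighbor upscale it, and modulate the native-resolution rasterized color with it. The buffers and helpers are made up for the example; a real engine would filter and denoise rather than upscale this crudely.

```python
# Toy sketch of the hybrid idea above: compute expensive lighting at a
# lower resolution, upscale it, then modulate the rasterized color at
# native resolution. Everything here is made up for illustration.

def upscale_nearest(img, factor):
    """Nearest-neighbor upscale of a 2D grid of floats."""
    out = []
    for row in img:
        wide = [v for v in row for _ in range(factor)]
        out.extend([wide] * factor)
    return out

def modulate(albedo, lighting):
    """Per-pixel multiply: rasterized color * ray-traced light term."""
    return [[a * l for a, l in zip(a_row, l_row)]
            for a_row, l_row in zip(albedo, lighting)]

# Native-res rasterized albedo (4x4) and quarter-res traced lighting (2x2).
albedo = [[0.8] * 4 for _ in range(4)]
lighting_lowres = [[0.2, 1.0],
                   [1.0, 0.5]]

final = modulate(albedo, upscale_nearest(lighting_lowres, 2))
for row in final:
    print([round(v, 2) for v in row])
```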
 
These days, maturity only comes with each major console generation.

So we are at least a couple of years off.

Guess you can call rich PC enthusiasts real-world guinea pigs. (They would have fun doing it, though.)
 
Can't agree. Ray tracing is the future, and has been the future for about 40 years now. I for one am excited that the future is almost here. Once cards are fast enough to run it in real time @ 4k 60 fps, it will be worth the wait. But to get to that point, we first need to take steps to get there. This is one of those steps.

But those are just visuals.

Animation fidelity has gone nowhere in recent years. While we have great-looking Frostbite games, the characters are still as robotic as ever.

And so far, we still haven't seen anything groundbreaking in this respect.
 
What version and dimensional universe are these "guys" from? Not this dimension! I have been watching eBay for 1080 Tis for the last month. I have seen maybe 3 or 4 cards at $400, and those were heavily used mining cards. Most right now are going for $500-550, period. I am crippled up on workers' comp and have nothing but all day to watch these "FABULOUS" prices everyone keeps talking about and no one sees.

Besides, if you are spending $500 on someone else's used crap, why not just save an extra $100, watch the one-day sales, and get a new one for $600? The lowest new one I missed this week was $599! I am getting my black EVGA 1080 Ti with rewards for just under $600.

P.S. Nope, the deal had expired, back to $649. Still buying it.
Most of the time they post the starting bid, which isn't remotely close to what you'll pay. Annoying that people do that.
 
If they were worried about cannibalizing the Pascal cards, they wouldn't be releasing these at all right now, unless it's just a hype paper launch and perhaps an attempt to force retailers' hands a bit to ease up on the Pascal-generation price gouging.
There's a business meta you're not considering. There's an opportunity cost to sitting on new cards beyond a carefully calculated crossover point. It's not as black and white as "wait till the channel is empty and only then release". Other problems would arise from delaying longer than necessary.

It's safe to assume that everything playing out now has been long in the planning, including not releasing detailed comparisons to Pascal during the announcement, including staggering the announcement/shipping window.

They also assumed that the vacuum created by not immediately releasing performance data, and by focusing on ray tracing, would mean weeks of naysaying and FUD articles/videos. But they went ahead. If we didn't already know the price points, their execution now could make you wonder if they were finally going to break their streak and ride in on complacency with minimal gains. I'd lay money that performance will scale in lockstep with the prices.
 
What version and dimensional universe are these "guys" from? Not this dimension! I have been watching eBay for 1080 Tis for the last month. I have seen maybe 3 or 4 cards at $400, and those were heavily used mining cards. Most right now are going for $500-550, period. I am crippled up on workers' comp and have nothing but all day to watch these "FABULOUS" prices everyone keeps talking about and no one sees.

Besides, if you are spending $500 on someone else's used crap, why not just save an extra $100, watch the one-day sales, and get a new one for $600? The lowest new one I missed this week was $599! I am getting my black EVGA 1080 Ti with rewards for just under $600.

P.S. Nope, the deal had expired, back to $649. Still buying it.

Spending $650 on a 1080 Ti right now seems like the worst possible thing to do. At best, you spent $650 on two-year-old tech. At worst, the 2080 smokes the 1080 Ti and EVGA doesn't offer a decent Step-Up card.

Also for all the doom and gloom people... The Pascal cards were announced with zero actual benchmarks and just two bullshit performance slides (and one was VR). People ripped the cards for not showing actual performance or having review samples. Also since much of the presentation was VR focused, people decided the cards must suck since they didn’t care about VR. It was all very similar to what is going on right now.

Then the Pascal review embargo lifted 10 days before launch and the 1070 was trading blows with the Maxwell 980Ti...
 
While the price sure is obscene, I have to say that when new versions of DirectX introduced new features and the next-gen cards were able to make use of them, games also still had to catch up.

So RTX taking some time to get there isn't actually news in PC gaming at all.

What is more important to me is that DXR is an API AMD can use as well, and can therefore make hardware for too.

Also, Nvidia isn't going to undercut itself, considering how well 10xx cards are doing in today's games and how much inventory they probably still have.

Also, the cards seem to sell like crazy, so I'm guessing that despite all the negative stuff, Nvidia has read the market well enough, again.
 
Nvidia is the leader alright... at parting fools with their money...

C'mon, no need to be an ass and call people names.

For all the shady and shitty things Nvidia does, they have good products. This is not Razer.
 
This generation is really losing its luster already, and it's just due to this marketing BS. Between Tom's paid fluff piece (takes two to make a fluff piece), the NDA, GPP, need I go on? It's just so much shit from Nvidia at ONCE. I think they were cocky enough to think that the absence of AMD meant everyone would be OK with this attempted GPU industry takeover.
Was there a second NDA? Link to it?
 
The performance is there, you can purchase it at your discretion.

You'd be a fool to buy something slower/hotter/louder for the same price.

The performance is there? Until I see a [H]ard review, it's nothing, and even then I would not buy it... the cost of that thing is a house payment for a lot of people...
 
This generation is really losing its luster already, and it's just due to this marketing BS. Between Tom's paid fluff piece (takes two to make a fluff piece), the NDA, GPP, need I go on? It's just so much shit from Nvidia at ONCE. I think they were cocky enough to think that the absence of AMD meant everyone would be OK with this attempted GPU industry takeover.

Too much tin foil hat stuff here..
 
If you're using a 3D light sample, why would you want to soften it with a 2D pixel average? A ray-traced pixel is way, way better. Different orders of magnitude.

Both techs are basically using AI to enhance or direct the rendering process anyway. I'll take the one with the supercomputers backing it, please.

This is a gross oversimplification and makes far too many assumptions.

Ray tracing does not automatically guarantee better renderings. When I was doing a lot of ray tracing work, I used to run a quick render just to make sure all the objects I wanted in the scene were there. A quick render took a few minutes to do. It used 1 ray and 1 bounce for the calculations. It looked like crap, but it showed all the objects.

A really good ray trace needs hundreds of rays and about 16 bounces (the number will depend on the number of lights in a scene) for each pixel. I wonder how many rays Nvidia is using. Is it hard-coded? Is it dynamic? How many bounces? Hard-coded? Dynamic? Are they changing the counts based on frame rates? Or is it up to the developer to do that? How many light sources are supported in a scene? How do they handle that?
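
To make that cost concrete, here's a bare-bones Python sketch of the rays-times-bounces loop; the "scene" is just a coin flip for hitting a light, so the numbers are meaningless, but the loop shape shows why the ray and bounce counts dominate the work per pixel.

```python
import random

# Bare-bones sketch of the rays-and-bounces cost described above: each
# pixel fires N rays, each ray may bounce up to B times, and the pixel
# is the average of all ray results. The "scene" is a fake coin flip.

RAYS_PER_PIXEL = 128   # "hundreds of rays" territory
MAX_BOUNCES = 16       # the ballpark bounce count mentioned above

def trace_one_ray(rng):
    """Follow one ray; return 1.0 if it reaches a light before B bounces."""
    for _ in range(MAX_BOUNCES):
        if rng.random() < 0.1:   # stand-in for "ray hit a light source"
            return 1.0
        # otherwise: stand-in for a diffuse bounce; keep tracing
    return 0.0                   # absorbed / ran out of bounces

def shade_pixel(rng):
    """Average the contributions of every ray fired for one pixel."""
    return sum(trace_one_ray(rng) for _ in range(RAYS_PER_PIXEL)) / RAYS_PER_PIXEL

rng = random.Random(42)
print(f"pixel ~ {shade_pixel(rng):.3f}; "
      f"worst-case work ~ {RAYS_PER_PIXEL * MAX_BOUNCES} bounce steps")
```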

The "AI" implies they are doing these things dynamically, which means from frame to frame there could be visual differences based on how many light sources and objects are moving around. The more light sources and moving objects, the less pixel caching you can do. Moving light sources add another level of complexity to a scene.

I have no doubt a dedicated video card with hardware-based ray tracing support can beat any CPU doing it. There are just too many questions and not enough answers right now.

I know one thing: if I do not see dedicated ray tracing programs throwing support at this, then that is all I need to know.
 
AA is basically over-rendering the 3D-to-2D projection, then doing a 2D average to produce a pixel.

Hybrid ray tracing is building each pixel from a top and bottom acceleration structure. It's basically sampling in 3D and averaging in 3D.

Seems wrong to do really expensive 3D calculations, then ruin them with a 2D filter.

Also remember that the AI image processor has to denoise the whole frame too.
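
For contrast, the "2D average" being criticized is essentially this (a toy 4x supersampling sketch in Python; shade() is a stand-in pattern, not a real renderer):

```python
# Toy 4x supersampling: render four sub-samples inside one pixel and
# average them in 2D after projection. shade() is a stand-in pattern,
# not a real renderer; the point is only where the averaging happens.

def shade(x, y):
    """Stand-in for rendering one sub-sample at a 2D screen position."""
    return 1.0 if (int(x * 10) + int(y * 10)) % 2 == 0 else 0.0

def ssaa_pixel(px, py, grid=2):
    """Average a grid x grid pattern of sub-samples inside one pixel."""
    step = 1.0 / grid
    samples = [shade(px + (i + 0.5) * step, py + (j + 0.5) * step)
               for i in range(grid) for j in range(grid)]
    return sum(samples) / len(samples)

print(ssaa_pixel(0.0, 0.0))  # a softened 0.5 instead of a hard 0 or 1
```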
 
If I do not see dedicated ray tracing programs throwing support at this, then that is all I need to know.


Check the SIGGRAPH stuff for that; it looks like all the major CAD/3D render software will have support for it, and it even looks like the major studios, e.g. Disney and Pixar, want it yesterday.

One demo I found:
 
One is lighting; the other is post-process AA or some type of checkerboard-rendering sort of thing. I don't see why they couldn't be used at the same time.
 
C'mon, cheap 1080 Tis. For that sweet, sweet SLI.


Unfortunately, the games that support SLI are becoming few and far between.

I'm with the group that is going to keep what I have for a while and see how the reviews play out.

I'm not all that optimistic.

This reminds me of when DX10 and Vista rolled around; it was supposed to be the next big deal, and well, that was a pile of shit.

I'm sure in a couple of years ray tracing will be commonplace.
$1000 just isn't on my radar.
 
Check the SIGGRAPH stuff for that; it looks like all the major CAD/3D render software will have support for it, and it even looks like the major studios, e.g. Disney and Pixar, want it yesterday.

One demo I found:

Actually, this is what I have been thinking about since I saw the Quadro RTX unveiling at SIGGRAPH. Bringing RTX ray tracing to the industries that have both the money and the desire to utilize the tech makes sense. As Nvidia continues to make money hand over fist from their commercial lines, why would they press so hard to put that tech into consumers' hands so fast? They literally could have waited a year to do the consumer RTX line, and I doubt it would have been that big of a deal. Make an 11xx-series line for this year while also building up real-world use of their RTX tech through commercial customers. Then in 2019, release a far more usable consumer card with the R&D cost significantly reduced via the commercial sales. This would also give devs more time to take advantage of the new tech.

Honestly, the price they are listing for the consumer cards makes sense if you factor in how much money they have probably put into R&D for the tech. At least to me it makes sense. This is a paradigm shift for GPUs; to me the cost makes sense, and the performance estimates make sense. It is the lack of options for consumers who don't want to invest in this new tech right away that has me annoyed. But when you don't have competition, there isn't really any reason to cater to customers.

Let's see how this plays out.
 
Have you guys checked out the return policy for most (if not all) NVIDIA GeForce RTX 2080 and 2080 Ti GPUs? They are non-refundable. Not to beat a dead horse here, but I would say it's an excellent idea to order after the reviews come out. No worries, NVIDIA won't run out of these.

I don't like the power consumption and the many three-slot cooling solutions that I have seen so far, so I will skip this generation. I'm looking forward to NVIDIA's 7nm refresh. By then the tech will have improved, with more support from developers, not to mention better power consumption and heat output. I've been an early adopter too many times; however, I usually got my money's worth. I don't feel like this is one of those times. These GPUs are overpriced no matter how one looks at it, and I don't feel like throwing money away.

I've read many of the comments on this forum regarding pricing. Some folks are of the opinion that the price hike is justified because of the increased performance. I disagree. The norm used to be that when a new product came out, it replaced an old one at the same price point. Price hikes used to be minimal, and usually you as a consumer got something in return (a better-quality cooler or something). So this logic that you pay more for more performance with every new release is flawed. In 5 years we'll be paying $1500 for a midrange GPU. That is not how things in tech have worked in the past, and I can't see it working in the future. If you want consumers to adopt a new product, you have to offer them bang for the buck at every price point/tier.

I preordered a 2080 from Microcenter. No money down, just an email. If the benchmarks turn out good, I'll show up to buy one. If they turn out weak I just call and cancel.

Wow, that is cool. That's how pre-ordering should work :)
 