NVIDIA’s RTX Speed Claims “Fall Short,” Ray Tracing Merely “Hype”

I'm pretty sure the cards selling for $350 on eBay are plain 1080s, not 1080 Tis.

If you find a Ti for $350, let me know, because I will buy it immediately.

The 1080s are $350 used, which should be close to 2070 performance.
The 1080 Tis are about $500 used, which should be close to 2080 performance.

We're looking at a $200 price difference for similar performance at 1440p (2070) or 4K (2080).
 
I'll be honest, I don't give a shit about the ray tracing.

If the 2080 Ti gives me 15%+ more FPS at 4K than my Titan Xp, I'll take it; if not, it's going back.
Thank goodness most people don't have shitty low standards like you...
 
Ray tracing will be nice in the future, maybe a few years from now on 3nm. But for today, the 2070/2080/2080 Ti are a rip-off at what NVIDIA is asking consumers to pay, because they want gamers to subsidize their data center efforts, which is bullshit. Anyone with a recent Pascal chip on the NVIDIA side should probably stay away, and even AMD Vega owners can probably safely skip it. I plan to hold on to my Titan Xp until 7nm at the very least, and if NVIDIA is still being greedy, I'll wait until 2020 to see what AMD and Intel have to offer.
 
More speculation... that means nothing. Please, guys, wait for the reviews; they're like 3 weeks away.

No speculation required (see below).

2070 = $600 Founders Edition = GTX 1080 performance + 10%, for $200 more
2080 = $800 Founders Edition = GTX 1080 Ti performance + 10%, for $200 more
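
If you want to sanity-check that math, here's a throwaway C++ snippet. The performance multipliers are just the assumptions above, plus my own ballpark of a 1080 Ti being ~1.35x a 1080; none of it is benchmark data.

#include <cstdio>

int main() {
    struct Card { const char* name; double priceUSD; double relPerf; };
    // relPerf is normalized to GTX 1080 = 1.00; the Turing values assume
    // the "+10%" guesses above, NOT measured results.
    Card cards[] = {
        {"GTX 1080 (used)",    350.0, 1.00},
        {"RTX 2070 FE",        600.0, 1.10},
        {"GTX 1080 Ti (used)", 500.0, 1.35},        // my ballpark vs. a 1080
        {"RTX 2080 FE",        800.0, 1.35 * 1.10}, // 1080 Ti + 10%
    };
    for (const Card& c : cards)
        std::printf("%-18s $%4.0f  %.2fx perf  %.3f perf per $100\n",
                    c.name, c.priceUSD, c.relPerf, 100.0 * c.relPerf / c.priceUSD);
    return 0;
}

The perf-per-$100 column makes the point: the used Pascal cards come out ahead on every row.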
 
More speculation... that means nothing. Please, guys, wait for the reviews; they're like 3 weeks away.
We already know for a fact that most of the games shown using ray tracing were running at 1080p, and that they still could not hold 60fps. Even the developers of Metro Exodus came right out and said they're going to target 60fps at 1080p with ray tracing, and we all know what "target" usually ends up meaning. Ray tracing is going to be a joke this generation, unless you really want to play games at 1080p on your $1,200 video card and hope to manage 60fps.
 
A friend of mine just told me that the new DLSS anti-aliasing cannot be used at the same time as RTX ray tracing in games. It's one or the other. That really deflates this launch for me.
Your friend is wrong. Per NVIDIA's own list, some titles have both enabled.
 
Given the rather average performance jumps historically, where x80 cards are roughly equivalent to the preceding Ti cards (i.e., GTX 980 vs. 780 Ti, 1080 vs. 980 Ti) and the Ti cards typically bring only decent gains, I would be very surprised if the 2080 Ti blows the 1080 Ti out of the water with, say, 50% more fps in games at the same exact settings @ 4K. Similarly for the 2080 vs. the 1080 Ti: if NVIDIA is true to form, it will essentially equal a 1080 Ti (but with a ray tracing focus). If that ends up being the case, then NVIDIA is dropping any pretense of giving a damn and is charging ridiculous prices.
 
nVidia could have avoided this bad press if they had already released the cards.

It could be intentional on NVIDIA's part. After all, they still have tons of 10-series stock to sell. They could delay further, but it probably wouldn't be wise to launch the cards right when AMD launches a 7nm card sometime next year. NVIDIA might be ahead right now, but they're not really the type to let their guard down. They're fully aware that AMD is capable of coming up with something surprising, and sometimes that's harder for them to counter.
 
I think it's a decent deal from nVidia. As much as they abuse their high end segments with price, they fill them with product.

This time they're only skewing the value 20% higher for 30-40% more product.

A few blue crystals? Maybe some real shiny luster.

Tensor cores look good. Not marketing-hype good, but more than "ehh".
 
The Nvidia presentation at Gamescom of their RTX cards with ray tracing processing was just hot air.

AMD is defining the narrative of gaming, for now. They produce the CPUs/GPUs for the PlayStation and Xbox systems, both the current and the next gen. Ray tracing will become standard in games when it's standard on consoles. It's not economically logical to buy an overpriced RTX card to play some first-gen ray tracing showpieces.

Though I imagine console makers will be very interested in having this kind of tech in their next consoles. Why does the PS4 Pro exist? Because Sony did not want to lose their existing player base to high-end PCs. So in the end AMD will be forced to develop something similar, or they might lose the semi-custom wins.
 
A friend of mine just told me that the new DLSS anti-aliasing cannot be used at the same time as RTX ray tracing in games. It's one or the other. That really deflates this launch for me.

If you're using a 3D light sample, why would you want to soften it with a 2D pixel average? A ray-traced pixel is way, way better. Different orders of magnitude.

Both techs are basically using AI to enhance or direct the rendering process anyway. I'll take the one with the supercomputers backing it, please.
 
"Riiiiiiidggeee racer!" - Sony CEO at E3 2006

"Raaaaaaaaay tracer!" - Jensen, NVidia CEO at E3 2019?! Could happen....

I honestly think even a 1.3x performance increase across a generation isn't terrible, as long as the new generation comes in at price points comparable to the previous one. 1.3x the performance for double the cost is ridiculous, though. I hope AMD mops the floor with nVidia and teaches their smug development team a lesson.

Ray tracing sounds cool, but if AMD comes up with a product priced at nVidia's current 10-series prices that performs 90-100% as well as the 2080, it'll be a hell of a deal, considering the 2080 doubled in price over the 1080. Let alone if it somehow ends up being superior because they focus on raw graphics processing power rather than new exclusive ray-tracing features.
 
I hope that RTX flops so that NVIDIA has a setback and comes to their senses. I believe the ray tracing implementation is done in software and runs on CUDA cores at the hardware level, with those cores partitioned off and not accessible to the user. If you think about it, it makes sense. Of course, I don't know that much about GPUs at the silicon level.

Still, I hope they flop. NVIDIA has been pulling all kinds of crap for a while now, and they need to stop and act accordingly. I remember when the GTX 1080 launched and I was able to order a pair shortly after from NVIDIA's own website. Why do we now have a one-to-two-month launch delay? These new cards are priced higher, so existing 10-series inventory would have sold anyway. The real reason is that these new cards don't perform that much better, and who knows how they do with ray tracing. Gamers who shell out big bucks on 4K 120Hz displays or ultrawides with G-Sync don't want to play at 30 to 60 FPS at native resolution. No one who spends that much money wants to game at 1080p.

No one asked for ray-tracing. More CUDA cores would have been fine this generation.

If that were the case, then we'd never see games aiming to be as photorealistic as they could be. At some point, somebody has to push the new thing. NVIDIA thought this was the right time to do it, while the competition has a hard time keeping up even with their 'mid-range chip'.
 
Can't agree. Ray tracing is the future, and has been "the future" for about 40 years now. I for one am excited that the future is almost here. Once cards are fast enough to run it in real time at 4K 60fps, it will have been worth the wait. But to get to that point, we first need to take steps, and this is one of those steps.

Yep. And the crazy thing is that only a small percentage of the chip is dedicated to this. NVIDIA knows the software devs need time to make the change, so they're rationing die space between normal rasterization and ray tracing. As more software goes this route, they'll increase the dedicated die space for it. Makes perfect sense.

And I don't understand the negativity when the card hasn't even been benched yet.
 
If you're using a 3D light sample, why would you want to soften it with a 2D pixel average? A ray-traced pixel is way, way better. Different orders of magnitude.

Both techs are basically using AI to enhance or direct the rendering process anyway. I'll take the one with the supercomputers backing it, please.
So you're saying that games with ray tracing won't need AA? Sorry, I'm not buying that.
 
I don't understand why this news shocks anyone. Good products from huge companies tend to sell themselves. When the marketing department goes full Leeroy Jenkins, chances are the product isn't as great as they claim. Where is that guy Vega who was trying to crap all over me in the last big thread? He needs to print a copy of this and eat it like a good boy.
 
replace this damn beautiful 980Ti

Haha, I am on a GTX 570, so pretty much any graphics card will be a substantial upgrade, even with my monitor locking me in at 1080p and FreeSync, and my wish to build AMD this time.
I really would love not to buy an RX 580, but cheaper Vega cards could help with that (the cheapest 56 & 64 cards are pretty much the same price).
 
So you're saying that games with ray tracing won't need AA? Sorry, I'm not buying that.

AA is basically over-rendering from 3D to 2D, then doing a 2D average to produce a pixel.

Hybrid ray tracing builds each pixel from a top-level and a bottom-level acceleration structure. It's basically sampling in 3D and averaging in 3D.

It seems wrong to do really expensive 3D calculations, then ruin them with a 2D filter.
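
To put that in (very) rough code, here's a toy C++ sketch of the two averaging strategies. shadeRaster() and traceRay() are made-up stand-ins for a rasterizer and a ray tracer, not any real API:

#include <cstdlib>

struct Color { float r, g, b; };

// Made-up placeholder "renderers"; a real one would sample the scene.
Color shadeRaster(float x, float y) { return {0.2f, 0.4f, 0.6f}; }
Color traceRay(float x, float y, int sample) { return {0.2f, 0.4f, 0.6f}; }

// Supersampling AA: shade extra 2D points inside the pixel and average
// them AFTER the 3D scene has already been flattened to 2D.
Color ssaaPixel(int px, int py, int n) {
    Color acc = {0, 0, 0};
    for (int i = 0; i < n; ++i) {
        float jx = px + (float)std::rand() / RAND_MAX; // 2D jitter
        float jy = py + (float)std::rand() / RAND_MAX;
        Color c = shadeRaster(jx, jy);
        acc.r += c.r; acc.g += c.g; acc.b += c.b;
    }
    return {acc.r / n, acc.g / n, acc.b / n};
}

// Ray tracing: every sample is a full 3D light-transport query through
// the acceleration structures, so the average is over actual scene
// radiance rather than over already-flattened pixels.
Color rtPixel(int px, int py, int n) {
    Color acc = {0, 0, 0};
    for (int i = 0; i < n; ++i) {
        Color c = traceRay((float)px + 0.5f, (float)py + 0.5f, i);
        acc.r += c.r; acc.g += c.g; acc.b += c.b;
    }
    return {acc.r / n, acc.g / n, acc.b / n};
}

Same loop shape in both, but in the second one the averaging happens over 3D samples, which is the whole point.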
 
Just checked: opening prices for the 2080 Ti are also just about 1600 USD here in Denmark. :eek:
I don't think I have ever paid 400 USD for a graphics card; I didn't even need to back when I was gaming seriously, because that was done at lowest settings and a modest resolution on a 22" CRT screen, at 200 Hz & 200 FPS at least.

Don't get me wrong, I think game graphics are sweet, and I always turn them up as high as I can in single-player or RTS games, but for competitive FPS games I didn't, and I assume still don't, need that.
I even wish someone would make the old-school kind of FPS game I like, but those seem to have been phased out or morphed into some kind of garbage I would never play again.
 
Just checked: opening prices for the 2080 Ti are also just about 1600 USD here in Denmark. :eek:
I don't think I have ever paid 400 USD for a graphics card; I didn't even need to back when I was gaming seriously, because that was done at lowest settings and a modest resolution on a 22" CRT screen, at 200 Hz & 200 FPS at least.

Don't get me wrong, I think game graphics are sweet, and I always turn them up as high as I can in single-player or RTS games, but for competitive FPS games I didn't, and I assume still don't, need that.
I even wish someone would make the old-school kind of FPS game I like, but those seem to have been phased out or morphed into some kind of garbage I would never play again.

They basically skipped the Titan -> six-months-later Ti transition. I think that's why Titan users shrug this off: it's a normal price for them, and they actually have more options. The people who wait for the $800 Ti cards are the ones up in arms.

Who knows... wait six months and maybe it will be $800 (I doubt it, though).
 
When NVIDIA introduced the 400-series cards with support for the new DX11 tessellation, yeah, tessellation tanked performance.

Just imagine had Microsoft done the right thing and not listened to NVIDIA crying like a little baby that AMD had the advantage (after AMD spent countless years including it in their cards, and who knows how many millions of dollars), and forced NVIDIA to use it the way AMD designed it, seeing as AMD had already done all the heavy lifting while NVIDIA got to do things their own way long enough. Microsoft, as usual, did not do this (obviously).

Hell, if Microsoft were "smart" they would have forced a level playing field for all companies, and the majority of DX12 would have been purely in hardware (which would actually force companies to really focus on design going forward instead of just leaning on fancy voltage/power controllers; if they did both, power use would be way down).

Had they done exactly that, the GTX 10 series would have fallen flat on its face in performance and power usage, big time (instead, someone like AMD with their Radeons is paying for this catering BS, because they cannot build exactly the way NVIDIA does/did, as that would be an automatic lawsuit waiting to happen).

But hey, why force a level playing field when you can cater to certain companies' whiny ways of doing business?

Just imagine how absurdly piss-poor the performance would have been had NVIDIA had no choice but to do it in hardware, like AMD had done since they implemented tessellation; instead NVIDIA got to "cheap out" and do things via software layers, and still delivered terrible performance in comparison. Then there is good old PhysX: when the Radeons were "allowed" to use it, they absolutely trashed NVIDIA performance-wise with a lower tier of GPU.

NVIDIA absolutely does not like being "upstaged", so they have gone, and continue to go, out of their way to fk everyone over by tweaking game code, etc., etc., etc.

NVIDIA is a big dumb bully. The sooner people realize how much money they're throwing at them just to be bent over, the better, IMO. They may do some "neat things", but most of this has not been for the benefit of anyone but themselves, and Jensen's expensive leather jackets and caviar ^.^
 
I will consider that when it starts raining 4K monitors and I manage to catch one. Personally, I am not that bitten by the high-res fad; hell, my TV is a 10-year-old 1080p set from LG.
I do like that an increasing number of televisions have FreeSync support; maybe I'll go that route one day. The amount of TV I watch is pretty low, as I don't do time-killer programs, only programs that make you smarter.

PS: I did try my 42" TV as a monitor when my CRT screen died a year ago. It worked fine on my huge pro computer table, though at that size, that close (2-3 feet), more than 1080p might be preferable.
 
My hope is that enough people believe the hype to drive down the price of existing cards. Current prices are a sham. A Vega 56 at $250 would make me spend some bones.
 
Haha, I am on a GTX 570, so pretty much any graphics card will be a substantial upgrade, even with my monitor locking me in at 1080p and FreeSync, and my wish to build AMD this time.
I really would love not to buy an RX 580, but cheaper Vega cards could help with that (the cheapest 56 & 64 cards are pretty much the same price).
DSR came out many years ago, so no, you're not locked in at 1080p...
 
Hehe, it would be nice if AMD could pull a rabbit out of the hat like they did on Intel back in the old days, and this time on both the CPU and GFX side at the same time :cool:
 
AA is basically over-rendering from 3D to 2D, then doing a 2D average to produce a pixel.

Hybrid ray tracing builds each pixel from a top-level and a bottom-level acceleration structure. It's basically sampling in 3D and averaging in 3D.

It seems wrong to do really expensive 3D calculations, then ruin them with a 2D filter.
It sounds like you could be onto something, but I'll believe it when I see definitive proof. Thanks, though.
 
There are rumors that, with the CUDA cores being more versatile, there could be ~50% rasterization gains.
 
More speculation... that means nothing. Please, guys, wait for the reviews; they're like 3 weeks away.

They may not be, though. Both Hardware Unboxed and Gamers Nexus stated in videos posted just yesterday that not only do they not have review samples yet, but "NOBODY" has them.

A GPU review, especially one as anticipated as these, can't be thrown together overnight. I've never done one, obviously, but I'd guess that from the time a card shows up on Brent's doorstep, through all the testing, retesting, and benchmark generation, through actually writing the article, to the time Kyle has finished editing, written up his summary, and finally uploaded it to the website to go live, 3 weeks is probably cutting it pretty close.

That's why I said earlier that I wonder if NVIDIA is doing this deliberately to keep reviews away from the public until the actual retail release, after all the pre-orders are sold out. I'm hoping that's just me being all tin-foil-hat.
 
They may not be, though. Both Hardware Unboxed and Gamers Nexus stated in videos posted just yesterday that not only do they not have review samples yet, but "NOBODY" has them.

A GPU review, especially one as anticipated as these, can't be thrown together overnight. I've never done one, obviously, but I'd guess that from the time a card shows up on Brent's doorstep, through all the testing, retesting, and benchmark generation, through actually writing the article, to the time Kyle has finished editing, written up his summary, and finally uploaded it to the website to go live, 3 weeks is probably cutting it pretty close.

That's why I said earlier that I wonder if NVIDIA is doing this deliberately to keep reviews away from the public until the actual retail release, after all the pre-orders are sold out. I'm hoping that's just me being all tin-foil-hat.
Your line of thinking holds water.
I guessed this would happen when they first tried to pull their 5-year NDA crap.
If Kyle had signed that, or the new one they demanded of him (reported last week), this thread wouldn't be up on [H].

I've read reports that the ray tracing is performed at a lower resolution and then upsampled/filtered, so it's not even really 1080p.
That would explain why the image loses colour detail and peak brightness versus non-ray-traced.
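
For what it's worth, here's what that would look like in toy C++ — trace at half resolution, then bilinearly upsample. This is purely my guess at the pipeline from those reports, nothing confirmed by NVIDIA:

#include <algorithm>
#include <vector>

// Bilinearly upsample a half-res ray-traced buffer (hw x hh, one float
// per pixel for simplicity) to full resolution.
std::vector<float> upsample2x(const std::vector<float>& half, int hw, int hh) {
    int fw = hw * 2, fh = hh * 2;
    std::vector<float> full(fw * fh);
    for (int y = 0; y < fh; ++y) {
        for (int x = 0; x < fw; ++x) {
            float sx = x * 0.5f, sy = y * 0.5f;  // position in half-res space
            int x0 = (int)sx, y0 = (int)sy;
            int x1 = std::min(x0 + 1, hw - 1);
            int y1 = std::min(y0 + 1, hh - 1);
            float fx = sx - x0, fy = sy - y0;
            float top = half[y0 * hw + x0] * (1 - fx) + half[y0 * hw + x1] * fx;
            float bot = half[y1 * hw + x0] * (1 - fx) + half[y1 * hw + x1] * fx;
            full[y * fw + x] = top * (1 - fy) + bot * fy;
        }
    }
    return full;
}

Interpolating between neighbours like that smears fine detail and pulls local peaks down toward their surroundings, which would line up with the reported loss of colour detail and peak brightness.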
 