Why does ray tracing suck so much? Is it because of Nvidia's bad hardware implementation?

I have to say that I agree.

What we're going to see, it seems, is a further deceleration of per-generation performance improvement in the midrange, and an acceleration of prices at the high end. While physics is the cause, I agree that it makes no difference to the average buyer.

It is what it is.
 
So you're looking at $50 more expensive than their previous mainstream parts right out of the gate. Granted, that's not as egregious as the high-end pricing (looking at you, $1199 2080 Ti FE), but $250-300 is really the cut-off for people looking at mainstream cards. The 1660 Ti is a much better card in that space. If the 2060 were a $299 card and the 1660 Ti a $249 card, I think it would sell a lot better.

Also, the used pricing of Pascal cards still puts a damper on sales. A $250 1080 runs circles around the 2060 and the RTX features of the 2060 aren't fast enough to actually be used in any meaningful way.

At the end of the day, I'm pro new technology. I just don't think I should have to pay a premium to be a beta tester, though...

Geez. Not only is $50 pretty paltry, but they also released a 1660Ti without RTX at the old x60 series price with a nice generational boost. No one is twisting your arm to get in now.

It's an even more bankrupt position to argue used cards. You know what? Used stuff is cheaper; it always is, or no one would buy it. Comparing against used prices is a ridiculous argument.

If the price keeps going up, eventually nobody is going to buy it.

All that might happen is people move down a tier, to pay the price they are comfortable with.
 
All that might happen is people move down a tier, to pay the price they are comfortable with.

That is what people are doing.

Nvidia re-sorted their performance tiers at the die-to-SKU level with the GTX 600 series (the GTX 680 got a midrange Gx104 die, and the Gx1x0 die wasn't seen until the GTX 780), and they appear to have done the same again with the RTX 2000 series.

The entry-level enthusiast parts are getting smaller relative to their largest consumer parts, and the prices of the largest parts are going up.
 
Geez. Not only is $50 pretty paltry, but they also released a 1660Ti without RTX at the old x60 series price with a nice generational boost. No one is twisting your arm to get in now.

It's an even more bankrupt position to argue used cards. You know what? Used stuff is cheaper; it always is, or no one would buy it. Comparing against used prices is a ridiculous argument.



All that might happen is people move down a tier, to pay the price they are comfortable with.

I don't think I explained myself correctly. My fault. Traditionally, you got more performance in a lower price tier with a new generation. This generation, you got almost identical performance in the same price tier, just with new features that arguably aren't very useful. Which is why used cards are a significantly better value, and why I brought them up.

When you move down a tier from the Pascal generation to the Turing generation, following your suggestion, you gain virtually nothing. The only card that makes any difference is at the extreme high end, and it is priced astronomically.
 
I don't think I explained myself correctly. My fault. Traditionally, you got more performance in a lower price tier with a new generation. This generation, you got almost identical performance in the same price tier, just with new features that arguably aren't very useful. Which is why used cards are a significantly better value, and why I brought them up.

When you move down a tier from the Pascal generation to the Turing generation, following your suggestion, you gain virtually nothing. The only card that makes any difference is at the extreme high end, and it is priced astronomically.

This really only matters if you are looking to upgrade every generation; otherwise you just hold on for one more generation. That isn't exactly a big deal for anyone but a fringe of the market.
 
What about when "3D accelerators" came into the picture? Hardware anti-aliasing? Hardware T&L? Bits and pieces were demoed here and there until it all became widely implemented. I don't own an RTX card and honestly, I wasn't expecting stellar ray tracing performance from a 1st-gen product. Somewhere down the line, they'll figure out how to do ray tracing even faster and more efficiently, but someone has to get the ball rolling first.

Sure, the ball is rolling, I suppose. But I guess that's part of the answer the OP was looking for: adoption is low, the tech is new, and the implementation isn't really a full ray tracing solution.
 
and the implementation isn't really a full ray tracing solution

It is 'full ray tracing'; however, there's not enough grunt available with current technologies to do 'full ray tracing' on more than the simplest games (see Quake II RTX), and even that leans on noise-reduction routines. What does also exist is hybrid raster and ray tracing, which will be the way forward for the next decade or so. We still haven't reached the point where raster graphics output on current titles can meet human acuity and responsiveness, which will be something along the lines of 4k120 on the desktop at a minimum, and far higher for VR.
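
To put rough numbers on "grunt", here is a back-of-the-envelope sketch; the samples-per-pixel and ray counts below are illustrative assumptions, not measured figures:

Code:
#include <cstdio>

// Back-of-the-envelope ray budget for "4k120".
int main() {
    const double pixels = 3840.0 * 2160.0; // 4K
    const double fps    = 120.0;           // the 4k120 target above
    const double spp    = 1.0;             // 1 sample/pixel, leaning on a denoiser
    const double rays   = 1.0 + 2.0 + 2.0; // primary + ~2 bounce + ~2 shadow rays
    printf("%.1f Gigarays/s\n", pixels * fps * spp * rays / 1e9);
    return 0;
}

That comes out to roughly 5 Gigarays/s for even this minimal budget; NVIDIA's own marketing figure for the 2080 Ti is about 10 Gigarays/s under ideal conditions, and real scenes (incoherent rays, shading, BVH rebuilds) are far from ideal.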
 
It is 'full ray tracing'; however

Maybe along that road we will see the improvements everyone is looking for.

For now, Nvidia is asking $1200 for a card that doesn't have enough "grunt" to get through a 25-year-old game demo heavily optimized for RTX at 4K. Not enough RT cores? I guess that could be why it sucks.
 
Maybe along that road we will see the improvements everyone is looking for.

For now, Nvidia is asking $1200 for a card that doesn't have enough "grunt" to get through a 25-year-old game demo heavily optimized for RTX at 4K. Not enough RT cores? I guess that could be why it sucks.

NVIDIA has two generations that can run this... AMD must be really bad then, since they cannot run it... period.
 
Maybe along that road we will see the improvements everyone is looking for.

For now, Nvidia is asking $1200 for a card that doesn't have enough "grunt" to get through a 25-year-old game demo heavily optimized for RTX at 4K. Not enough RT cores? I guess that could be why it sucks.

And AMD has nothing, so what's your point? The 2080 Ti still offers the best raster performance as well. It's not as if you are paying just for RT features and not getting a bump in raster performance. It's still 30-40% faster than a 2080, and AMD still can't match a 2080 in most titles.
 
1. I guess, but you really can't expect them to go mainstream on RTX, it's 1st gen after all
2. see above
3. It's already in the most popular engines, like Unity, Unreal, Frostbite (who uses CryEngine?), so I expect several titles to be RTX-enabled. Granted, RTX kind of sucks in many of them, but then again, it's 1st gen.
4. That's the price to pay for being an early adopter.
1+2) If you want your hardware to succeed, you need software; that is the point both 1 and 2 make, and if you want it to work, it is something you need to address. Segmented ray tracing performance does not help, nor does it help that only the top-end part does ray tracing well enough to mitigate the performance hit, and asking more money for it does not help sell hardware (features for the rich, not for the poor). This is the biggest reason it fails to deliver a better game experience for the gaming community.

3. You can implement DXR on any engine; that does not cover what I mean by having games/software actually use it. The drawback is no user base, and a segmented one at that: you need to tune something like three tiers of ray tracing for it to function across the whole platform, and if you don't, people get to enjoy ray tracing at 720p if they are lucky. It is the single most divisive issue, where you are now forced to have/use less ray tracing (the first few Battlefield V patches did this).

4. This point is disingenuous: you either bring your goods to the market and sell them, or you market them and hope people buy. I find that Nvidia has been lacking on both the software and hardware fronts in selling us ray tracing. It is not as if the GPU and ray tracing materialized out of thin air; they worked hard on it for several years. Nvidia has influence over developers when it comes to pushing things like Gameworks on them, yet with ray tracing, not so much?
 
1+2) If you want your hardware to succeed, you need software; that is the point both 1 and 2 make, and if you want it to work, it is something you need to address. Segmented ray tracing performance does not help, nor does it help that only the top-end part does ray tracing well enough to mitigate the performance hit, and asking more money for it does not help sell hardware (features for the rich, not for the poor). This is the biggest reason it fails to deliver a better game experience for the gaming community.

3. You can implement DXR on any engine; that does not cover what I mean by having games/software actually use it. The drawback is no user base, and a segmented one at that: you need to tune something like three tiers of ray tracing for it to function across the whole platform, and if you don't, people get to enjoy ray tracing at 720p if they are lucky. It is the single most divisive issue, where you are now forced to have/use less ray tracing (the first few Battlefield V patches did this).

4. This point is disingenuous: you either bring your goods to the market and sell them, or you market them and hope people buy. I find that Nvidia has been lacking on both the software and hardware fronts in selling us ray tracing. It is not as if the GPU and ray tracing materialized out of thin air; they worked hard on it for several years. Nvidia has influence over developers when it comes to pushing things like Gameworks on them, yet with ray tracing, not so much?

1+2) We have been here before: https://www.hardocp.com/article/2006/11/08/bfgtech_geforce_8800_gtx_gts/18 DXR is from Microsoft, not NVIDIA...just like DX10 was...going to be funny to watch you post when AMD starts supporting DXR... /backpedal?

3) There is only one DXR option. It's called DXR, and it is an extension of DX12: https://devblogs.microsoft.com/directx/tag/dxr/ (see the sketch at the end of this post)

4) You sound like you failed Economics 101 and are posting from the "entitled" position.

It's going to be soooooooo fun to see you run around in circles when both AMD and Intel start supporting DXR...
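
On point 3, here is a minimal sketch of the capability check an engine does; it assumes an already-created ID3D12Device and omits error handling, but note that nothing in it is vendor-specific:

Code:
#include <windows.h>
#include <d3d12.h>

// Query the DXR (DirectX Raytracing) tier through plain DX12.
bool SupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    // Any vendor reporting TIER_1_0 or above runs the same DXR code path;
    // there is no NVIDIA-specific "version" of the API.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}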
 
4. This point is disingenuous: you either bring your goods to the market and sell them, or you market them and hope people buy. I find that Nvidia has been lacking on both the software and hardware fronts in selling us ray tracing. It is not as if the GPU and ray tracing materialized out of thin air; they worked hard on it for several years. Nvidia has influence over developers when it comes to pushing things like Gameworks on them, yet with ray tracing, not so much?
Funny you mention the term 'disingenuous' with that argument, lol. Gameworks and RT are not equivalent in the eyes of developers. Gameworks is easier to push onto devs when you have an overwhelming share of the GPU market. RT only stands to benefit a very small percentage of GPU owners at the present time, so game devs are not as enthusiastic about implementing it quickly. They see its tremendous future potential, but for now short-term priorities still hold. Nvidia, it must be assumed, has been as aggressive as it possibly can be in pushing RT to the devs; THEY NEED TO for their RTX line to take off. Yet here you are suggesting otherwise in defiance of all logic. :rolleyes:
 
1+2) If you want your hardware to succeed, you need software; that is the point both 1 and 2 make, and if you want it to work, it is something you need to address. Segmented ray tracing performance does not help, nor does it help that only the top-end part does ray tracing well enough to mitigate the performance hit, and asking more money for it does not help sell hardware (features for the rich, not for the poor). This is the biggest reason it fails to deliver a better game experience for the gaming community.

This is how the 3D industry has always worked. Hardware acceleration, anti-aliasing, 32-bit color: they were all available first on the most expensive parts and then trickled down over time. Why would RT be any different, especially given it's a massive change compared to what we've had for the last 30 years?
 
This is how the 3D industry has always worked. Hardware acceleration, anti-aliasing, 32-bit color: they were all available first on the most expensive parts and then trickled down over time. Why would RT be any different, especially given it's a massive change compared to what we've had for the last 30 years?

Exactly. I'm seeing the same pattern with RT that we had when we went from software rendering to hardware-accelerated 3D rendering. It's not a new rendering technique, but we're only beginning to see it become realistically usable for gaming. I remember back in the 90s, looking at ray-traced renders and thinking how awesome it would be once we could actually see that being used in games. I can't wait until it becomes the norm and not something that developers need to "showcase". People forget how big a performance hit things we now see as "normal" were to hardware at their time of introduction to PC gaming. Things like reflective surfaces, bump maps, anti-aliasing, shadows, tessellation. It's funny, even back then a lot of naysayers were complaining about how those things weren't needed because they "can't see much of a difference."
 
It's not that it sucks, it's just that this is the first wave of hardware implementation. I assume it will get better over time or morph into something else, like hardware T&L, if some of you are old enough to remember the GeForce 2 days.

I can see why people think it sucks though. Spending all that money on a product they were led to believe was a shoo-in for the future (which it is), but not right now.

Never buy the newest model of anything.
 
It's not that it sucks, it's just that this is the first wave of hardware implementation. I assume it will get better over time or morph into something else, like hardware T&L, if some of you are old enough to remember the GeForce 2 days.

I can see why people think it sucks though. Spending all that money on a product they were led to believe was a shoo-in for the future (which it is), but not right now.

Never buy the newest model of anything.
Yeah, this isn't really any different from a number of new technologies where being an early adopter means paying through the nose for what is ultimately a sub-optimal experience. It's almost always better to wait for the second generation or later, as the tech becomes cheaper and more capable. But at the same time, you have to accept that if you want the latest and greatest new thing right now, you'll have to pay for it.
 
Cuz it's gen 1, lol. I am waiting for gen 3 or 4 before I think about anything ray tracing.
 
Regardless of how much the OP thinks Ray Tracing Sucks.

We might soon have the first Ray Tracing Killer App:

https://www.nvidia.com/en-us/geforce/news/cyberpunk-2077-nvidia-partnership-ray-tracing/
For Cyberpunk 2077, we’ve partnered with CD PROJEKT RED as an official technology partner to bring real-time ray tracing to the game.

“Cyberpunk 2077 is an incredibly ambitious game, mixing first person perspective and deep role-playing, while also creating an intricate and immersive world in which to tell this story. We believe the world of Cyberpunk will greatly benefit from the realistic lighting that ray tracing delivers,” said Matt Wuebbling, head of GeForce marketing at NVIDIA.

For a first look at real-time ray tracing in Cyberpunk 2077, check out these exclusive 4K gameplay screenshots:


[RTX On: exclusive 4K in-game screenshot]

“Ray tracing allows us to realistically portray how light behaves in a crowded urban environment,” said Adam Badowski, Head of Studio, CD PROJEKT RED. “Thanks to this technology, we can add another layer of depth and verticality to the already impressive megacity the game takes place in.”

If you happen to be at E3, swing by Booth 1023 in the South Hall to check out real-time ray tracing in action in the latest Cyberpunk 2077 E3 demo presentation.
Note: the final line doesn't show up in quotes unless you click through. Dang, someone needs to check this out...
"If you happen to be at E3, swing by Booth 1023 in the South Hall to check out real-time ray tracing in action in the latest Cyberpunk 2077 E3 demo presentation."
 
Regardless of how much the OP thinks Ray Tracing Sucks.

We might soon have the first Ray Tracing Killer App:

https://www.nvidia.com/en-us/geforce/news/cyberpunk-2077-nvidia-partnership-ray-tracing/

Note: the final line doesn't show up in quotes unless you click through. Dang, someone needs to check this out...
"If you happen to be at E3, swing by Booth 1023 in the South Hall to check out real-time ray tracing in action in the latest Cyberpunk 2077 E3 demo presentation."

I hope Nvidia plans on releasing the 3XXX series before then, otherwise no one will be able to take advantage of it.
 
I hope Nvidia plans on releasing the 3XXX series before then, otherwise no one will be able to take advantage of it.
You mean the bit the news sites keep reporting, that Nvidia is using Samsung 7nm EUV for their cards, which keeps popping up just before AMD has an announcement?
 
I was playing BFV today at 1440p, Ultra, with ray tracing on Ultra and no DLSS, on an 8086K with an RTX 2080. I got between 50 and 80 FPS while playing; most of the time I was in the 70s. My only real issues are that the denoising isn't great yet, leaving a film-grain appearance in some instances, and the price barrier. I don't normally use it for multiplayer because I like higher frame rates, but the performance isn't as crippling as people seem to think it is. Certainly fine for single-player, non-competitive games, at least for me.

And it will get better too. These are the same complaints people made when T&L engines were just becoming a thing. The same was said about 32-bit color, and Glide vs. DirectX. It's all the same.

That said, my previous video card was a 980ti. If I hadn't missed the 1080ti deals, I probably would have gotten one instead of a 2080.
 
If not tongue-in-cheek, Nvidia has been working with developers for quite some time to get new features supported. They've been significantly more effective than AMD in this regard, across the product spectrum.

No, I'm 100% serious. It's just not very popular and needs some push from Nvidia if they want it to be successful. If I had a choice, I'd buy the non-RTX variant of the 2080 Ti today.
 
No, I'm 100% serious. It's just not very popular and needs some push from Nvidia if they want it to be successful. If I had a choice, I'd buy the non-RTX variant of the 2080 Ti today.

Well, it is being pushed. It's just not easy to implement a hybrid renderer, especially when you're essentially backporting it and keeping the full raster pipeline in place at the same time.

Just switching to full ray tracing is far easier, as seen in Quake II RTX.
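
To make the hybrid-vs-full distinction concrete, here is a hypothetical frame outline; every name in it is made up for illustration and does not come from any real engine:

Code:
// Hypothetical hybrid frame. Stubs stand in for the actual passes.
struct Scene { bool dxrEnabled = false; };

void RasterGBuffer(Scene&)          {} // the raster pipeline you must keep
void RasterShadowMaps(Scene&)       {}
void ScreenSpaceReflections(Scene&) {} // ...including every non-RT fallback
void TraceReflections(Scene&)       {} // ray traced passes bolted on top
void TraceShadows(Scene&)           {}
void Denoise(Scene&)                {} // each RT pass needs denoising
void Composite(Scene&)              {}

void RenderFrame(Scene& scene) {
    // The full raster pipeline runs either way:
    RasterGBuffer(scene);
    RasterShadowMaps(scene);

    if (scene.dxrEnabled) {
        TraceReflections(scene);       // replaces screen-space reflections
        TraceShadows(scene);
        Denoise(scene);
    } else {
        ScreenSpaceReflections(scene); // the path you still have to maintain
    }
    Composite(scene);
}

int main() { Scene s; s.dxrEnabled = true; RenderFrame(s); }

A pure path tracer in the Quake II RTX mold deletes the raster passes and the else-branch entirely; maintaining one pipeline instead of two is exactly why it's "easier".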
 
Well, it is being pushed. It's just not easy to implement a hybrid renderer, especially when you're essentially backporting it and keeping the full raster pipeline in place at the same time.

Just switching to full ray tracing is far easier, as seen in Quake II RTX.

I agree with your point, but we are now, what, 8-9 months post-release of the 20 series, and how many titles actually support it? Is it even 10 games?
 
I agree with your point, but we are now, what, 8-9 months post-release of the 20 series, and how many titles actually support it? Is it even 10 games?

See: every DirectX release. Vulkan. Mantle never got that far.

The chicken-and-egg problem is real; Nvidia is being railed at for jumpstarting the process by making the investment and taking the risk.

AMD... is just going to lose market share by choosing not to keep up.
 
See: every DirectX release. Vulkan. Mantle never got that far.

The chicken-and-egg problem is real; Nvidia is being railed at for jumpstarting the process by making the investment and taking the risk.

AMD... is just going to lose market share by choosing not to keep up.

Eh, just to be clear, I'm not arguing for AMD or anything here. I'm about as neutral in that fight as one could be. I consider DX12 to be pretty much a failure as well.
 