Or else what? Nobody's going to buy next-gen consoles? Nobody's going to develop for them?
The PS5 is rumored to have a Navi GPU and a Zen CPU. From what I have read, Navi is pretty much a die-shrunk Vega. Not a redesign. So I would be shocked if it had ray tracing. This will definitely impact how quickly nVidia's new ray tracing technology catches on with developers. And it will also really differentiate PC gaming from console gaming. This could wind up being more of a blow to console gaming than it is to nVidia and their ray tracing.

https://www.trustedreviews.com/news/ps5-release-date-price-games-news-rumours-specs-2993694
 
Nvidia's half-baked RTX semi-raytracing is hardly the go-to, end-all-be-all gaming gold standard. I will wait for the tech to mature before investing hard-earned cash into any "semi ray tracing" proprietary video card. I'd be surprised if anyone in 5 years even remembers the RTX moniker.
 
This will definitely impact how quickly nVidia's new ray tracing technology catches on with developers.

The companies that make the engines that will eventually be used for those upcoming consoles have already 'caught on'.

I really don't see what you guys aren't getting about this. This is here, it's here to stay, and it's the way forward.

And there's nothing stopping AMD from having already included ray-tracing hardware in Navi. Ray-tracing is far simpler than rasterization, just also far more compute-intensive; perhaps it's something that AMD will excel at when it comes down to it, since they seem to struggle so much at getting the most out of their hardware with the current rendering schemes.
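To make that "simpler, but compute-hungry" point concrete, here's a toy C++ sketch (the single-sphere scene and every name in it are invented for illustration, not anything AMD or Nvidia actually ship): the heart of a ray tracer really is just a per-pixel intersection loop, and the cost scales directly with resolution times rays per pixel, which is exactly where dedicated hardware is supposed to help.

```cpp
// Toy illustration of why ray tracing is conceptually simple but compute-heavy:
// the core of a tracer is just "for every pixel, intersect a ray with the scene".
// The single-sphere "scene" and all names here are made up for the example.
#include <cstdio>
#include <cmath>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    double dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
};

// Ray-sphere intersection: solve |o + t*d - c|^2 = r^2 for the nearest t > 0.
bool hitSphere(const Vec3& orig, const Vec3& dir, const Vec3& center,
               double radius, double& t) {
    Vec3 oc = orig - center;
    double a = dir.dot(dir);
    double b = 2.0 * oc.dot(dir);
    double c = oc.dot(oc) - radius * radius;
    double disc = b * b - 4 * a * c;
    if (disc < 0) return false;
    t = (-b - std::sqrt(disc)) / (2 * a);
    return t > 0;
}

int main() {
    const int width = 64, height = 32;          // tiny "framebuffer"
    const Vec3 eye{0, 0, 0}, sphere{0, 0, -3};
    long raysCast = 0;

    for (int y = 0; y < height; ++y) {          // one primary ray per pixel --
        for (int x = 0; x < width; ++x) {       // real scenes add shadow/bounce rays,
            Vec3 dir{ (x - width / 2.0) / width,    // multiplying the work per pixel
                      (y - height / 2.0) / height,
                      -1.0 };
            double t;
            ++raysCast;
            std::putchar(hitSphere(eye, dir, sphere, 1.0, t) ? '#' : '.');
        }
        std::putchar('\n');
    }
    std::printf("rays cast: %ld (scales with resolution x rays per pixel)\n", raysCast);
    return 0;
}
```

Run it and you get an ASCII blob of the sphere plus a ray count; double the resolution or add bounce rays and the work multiplies accordingly.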
 
I think waiting for most, or at least half, of the games to have it would really be better. I think the next gen would be a good buy if these features take hold.
 
You seem to get really worked up and angry when people on the Internet have opinions that differ from yours, even slightly.

I don't mind pointing out when people expose their ignorance.

The flailing about by individuals that cannot fathom the leap in capability being made available is a bit amusing :ROFLMAO:
 
I think waiting for most, or at least half, of the games to have it would really be better. I think the next gen would be a good buy if these features take hold.

If you have no need for the extra performance or to experience the new technology first-hand, then you should wait, just like with every new release ;)
 
Well, I do not know if it's faster than my 1080ti Hybrid, so I might not lose any performance by not going with it. I am on 1080p 144Hz, so I doubt I will gain anything except the new features, which are not even in a handful of games yet. I think the next gen will be sweet and offer much more for me. Or I'll wait until the prices come down and then play.
 
Well, I do not know if it's faster than my 1080ti Hybrid, so I might not lose any performance by not going with it. I am on 1080p 144Hz, so I doubt I will gain anything except the new features, which are not even in a handful of games yet. I think the next gen will be sweet and offer much more for me. Or I'll wait until the prices come down and then play.

I'm at 1440p165, and the 1080Ti on water ain't enough to max it out- not even close. So yeah, I don't think the 2080Ti will offer 'enough' of a bump to make it worth the price for either of us.

But if I were on a 900-series or earlier? I'd be in line.

And if I were dying to play with ray-tracing myself? I'd be in line. I might still...
 
Meanwhile at the Hardforum talking about guys from India and Nvidia.
 
Really who gives a fuck, Nvidia doesn’t owe them money, nor did they come out and say why they cut them off (but it’s pretty obvious why they cut them off...).

If you aren’t going to sell my hardware I’m not giving you free shit!

Good on the person sticking to their guns though, I agree 100%. Pre-ordering hardware is dumb dumb dumb. But everyone is free to do what they want with their money.
 
Personally I would not buy those cards for the ray tracing feature with that performance at that resolution. As for games not supporting this feature, it remains to be seen what the performance will be, including power usage, as the TDP has been increased significantly.

I've seen lots of information in terms of game support, and that's really what has me optimistic, both that we'll see very quick uptake and that we'll see support from AMD soon too. As far as buying them for ray tracing, I agree generally if your goal is to play everything with it on, but it's certainly got that 'wow' factor. We've gone from 'that's not possible' to 'we're getting baseline playable performance in upcoming games'. That's a big leap!

All in all, I think people are sort of choleric with so little known about the cards' performance. I think that many had expectations other than what seems to be delivered, resulting in more than a little disappointment.

Anyone that was expecting ray tracing at 4k60 or 1440p144+ really just isn't paying attention. Raster performance will be the best available on the market.
 
Personally I would not buy those cards for the ray tracing feature with that performance at that resolution. As for games not supporting this feature, it remains to be seen what the performance will be, including power usage, as the TDP has been increased significantly.

All in all, I think people are sort of choleric with so little known about the cards' performance. I think that many had expectations other than what seems to be delivered, resulting in more than a little disappointment.
I’m not sure what expectations they had with the card or how they would not be met since the card isn’t released to the public yet. Most of the drama is inferred from people wanting to be overly dramatic.
 
I watched it 75% of the way through...he is an Nvidia apologist and is trying VERY hard in this video to justify Nvidia's insane pricing. BS. I stopped watching him for a reason, and this video shows that I was right and shouldn't be watching him.

Him, AdoredTV, Linus, etc. are all garbage. I don't know why anyone watches them for tech advice when there are so many good sources out there.
 
Get in bed with a scumbag company and they may do scumbag things to you.
 
If you want to be on nv's tab, yeah, you probably shouldn't tell people not to buy the product; that's just business. You've got to say it more discreetly, like "Would I pre-order right now? Well, I'm just waiting for some benchmark leaks first before jumping in." Not "DON'T PRE-ORDER!"

Like how Tom's does, "However, if you take the more-prudent approach of waiting for reviews to validate Nvidia's claims". They basically say it's best to wait for benches without dinging nvidia.
 
On topic: this guy got shafted, IF everything he said happened actually happened for those reasons.
However, losing nVidia AND other sponsors over one comment a guest made... seems like overkill, so I think we don't have the full story.

Off topic: Jesus H fucking christ. Every thread about the new cards varies between people trying to talk up AMD even though they have nothing, those of sound mind wanting to wait and see ACTUAL benchmarks, and people who are literally gobbling nVidia cock, like taking it balls deep in all orifices, almost attacking those who dare question nVidia...

y'all need to be nicer to each other.
 
Facepalm.

Realtime ray tracing has been the video game holy grail since the inception of 3D video games.

Let me settle this.

We have had real time raytracing for 20 years. It's never been unobtainable. There are TWO big misconceptions about RT that need to be addressed.

#1: Realtime large-scale or full-scene raytracing has always looked worse than faking it with raster graphics at the same level of performance. That is, take a scene that runs at 60 FPS at a set resolution on a set piece of hardware, and the rasterized scene using game engine techniques will look MUCH better. The only way to make RT look better is to slow the render down. Nvidia has not changed this.

#2: Many game engines already employ ray tracing techniques at the sub-scene or screen-space level; things like screen-space reflections, global illumination, and ambient occlusion ALL use raytracing in 2.5D and 3D. It's used in areas where the method produces better and faster results than pre-baking or using shortcut code. Nvidia hasn't changed this.

The new RTX cards won't offer anything new, just tons of die space devoted to features that don't actually make scenes render faster. Yes, they render RT faster, but they STILL don't render fast enough for most PC gamers. If the 2080 Ti can't render RT at acceptable FPS, what makes you think the 2080 or 2070 will even come close? That die area could have been used to add more CUDA cores or improve the current ones. But they didn't.
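To make #2 a bit more concrete, here's a rough toy sketch in C++ of how those screen-space effects work (the fake depth buffer, the names, and the step count are all invented for illustration; real engines run this per pixel in a shader): you march a short ray across the depth buffer and call it a hit when it dips behind the recorded depth, which is also why these effects break down for anything that isn't on screen.

```cpp
// Toy sketch of "screen-space" ray tracing (the way SSR/SSAO-style effects work):
// march a ray a few steps across the depth buffer and stop when it passes behind
// the stored depth. The depth buffer contents and all names are invented here.
#include <array>
#include <cstdio>

constexpr int W = 8, H = 8;
using DepthBuffer = std::array<std::array<float, W>, H>;

struct Hit { bool found; int x, y; };

// March from (x0, y0) in direction (dx, dy), stepping in screen space while the
// ray's depth stays in front of whatever the depth buffer already recorded.
Hit screenSpaceMarch(const DepthBuffer& depth, float x0, float y0, float z0,
                     float dx, float dy, float dz, int maxSteps) {
    float x = x0, y = y0, z = z0;
    for (int i = 0; i < maxSteps; ++i) {
        x += dx; y += dy; z += dz;
        int px = static_cast<int>(x), py = static_cast<int>(y);
        if (px < 0 || px >= W || py < 0 || py >= H)
            return {false, 0, 0};               // ray left the screen: no data, no hit
        if (z >= depth[py][px])                 // ray passed behind visible geometry
            return {true, px, py};              // -> treat that pixel as the hit point
    }
    return {false, 0, 0};
}

int main() {
    DepthBuffer depth{};
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            depth[y][x] = (x > 4) ? 2.0f : 10.0f;   // a fake "wall" on the right side

    Hit h = screenSpaceMarch(depth, 1.0f, 3.0f, 1.0f, 1.0f, 0.0f, 0.3f, 16);
    if (h.found)
        std::printf("reflection ray hit screen pixel (%d, %d)\n", h.x, h.y);
    else
        std::printf("ray missed (engine would fall back to a cubemap or similar)\n");
    return 0;
}
```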
 
Let me settle this.

We have had real time raytracing for 20 years. It's never been unobtainable. There are TWO big misconceptions about RT that need to be addressed.

#1: Realtime large-scale or full-scene raytracing has always looked worse than faking it with raster graphics at the same level of performance. That is, take a scene that runs at 60 FPS at a set resolution on a set piece of hardware, and the rasterized scene using game engine techniques will look MUCH better. The only way to make RT look better is to slow the render down. Nvidia has not changed this.

#2: Many game engines already employ ray tracing techniques at the sub-scene or screen-space level; things like screen-space reflections, global illumination, and ambient occlusion ALL use raytracing in 2.5D and 3D. It's used in areas where the method produces better and faster results than pre-baking or using shortcut code. Nvidia hasn't changed this.

The new RTX cards won't offer anything new, just tons of die space devoted to features that don't actually make scenes render faster. Yes, they render RT faster, but they STILL don't render fast enough for most PC gamers. If the 2080 Ti can't render RT at acceptable FPS, what makes you think the 2080 or 2070 will even come close? That die area could have been used to add more CUDA cores or improve the current ones. But they didn't.

And that's why I think people should wait for the RT releases before we start a massive hate train. The 2070 is only rated at 6 Giga Rays/s and the 2080 Ti at 10 (10/6 ≈ 1.67, so roughly 66% faster). nVidia is usually very calculated and I can't imagine them giving up die space for something unusable.
 