RTX is a joke

Status
Not open for further replies.
I love this.

Ray tracing doesn't make a difference.

vs.

Ray tracing looks so much better than rasterization that they must be nerfing rasterization to give RT a boost.

You're right. We need to ditch rasterization completely. The sooner, the better.
Sarcasm? Though honestly, that HL1 path tracing does look great enough to support an authentic interest.
 
I remember when MSAA came out; it made a lot of games unplayable at anything above 2x MSAA. As cards get faster, experiences improve, and MSAA runs smoothly on any card today.
There's a cycle to this, and acceptance of RT will come soon once consoles make use of it.

Give it time.
It's been over four years, man. The hype and price increases for minimal effect are real. Thanks, Jensen.
 
DLSS is that time?
I really like DLSS 3.0. It really has improved my experience in Cyberpunk.
It still has texture glitches in some areas, but the game is smooth.

I think people really want tech that is baked into the game.
Hardware-dependent tech locks people out, both price- and brand-wise.
 
It's been over four years, man. The hype and price increases for minimal effect are real. Thanks, Jensen.
Agreed, it's been quite a long time, and it has done little but harm real innovation in hardware acceleration of path (ray) tracing.

RTX came out with the 20 series in 2018. We're in 2023, and this is what we call stagnation.
 
Agreed, it's been quite a long time, and it has done little but harm real innovation in hardware acceleration of path (ray) tracing.

RTX came out with the 20 series in 2018. We're in 2023, and this is what we call stagnation.
The GPU industry, Nvidia in particular, has been engulfed in RTX/RT. Literally, that's all they think about and develop around. DLSS and frame gen are just things to try to make RTX work reasonably well, and at what cost to the industry development-wise, let alone the consumer, who never really asked for it or any of its baggage. Every video card has this crap, and you pay for it out of your pocket. Is it really any wonder video card prices have gone through the roof? It's not just the dedicated silicon but the countless man-hours of development. Yeah, you pay for that too. In the end you have to ask yourself: is this RTX hype really worth it? For this little benefit? Guess what: it's too late. We are all paying for it now, whether we want to or not.
 
What about the performance hit? Did you have to enable DLSS?

I turned on FSR 2.0 to help the frame rate, but high settings were unplayable in spots; at low settings it played well enough. Hogsmeade was where the frame rate tanked the most for me. Inside Hogwarts it did pretty well.
 
Nvidia's stock is super high right now, mostly due to the buzzword AI.

Revenue down 21%
Expenses up 27%
Operating Income down 58%
Net Income down 53%
FCF down 53%

Trading at 120x earnings
Lol 😂 😆
 
Nvidia's stock is super high right now, mostly due to the buzzword AI.

Revenue down 21%
Expenses up 27%
Operating Income down 58%
Net Income down 53%
FCF down 53%
Trading at 120x earnings

Below $100 is what you so adamantly said it would certainly hit, IIRC, in that other thread a while back, right?

Edit:

I do, and it'll be back down to $33 a share like the good old days, before the crypto boom.

Oh wait, it was all the way down to $33 you predicted.

Good thing you didn't buy those puts 👍
 
Nvidia's stock is super high right now, mostly due to the buzzword AI.

Revenue down 21%
Expenses up 27%
Operating Income down 58%
Net Income down 53%
FCF down 53%
Trading at 120x earnings
Good, considering I got in during 2020 when it hit lows... so I have seen it rise, split, and rise more... thanks nvidia for continuing to buy my video cards for me. ;)
 
Good, considering I got in during 2020 when it hit lows... so I have seen it rise, split, and rise more... thanks nvidia for continuing to buy my video cards for me. ;)

Loaded up before the split and Icarus'd myself because I didn't sell shit.

Think I was up like 130% or something ridiculous at some point. RIP
 
The GPU industry, Nvidia in particular, has been engulfed in RTX/RT. Literally, that's all they think about and develop around.
Not so sure how true that is; it could just as well be crypto yesterday, or datacenter AI, cars, and the mobile market.

Have we ever seen a GPU for which more than, say, 6% of the die was hardware specialized for RT?

Think about how much effort went into upscaling and raw rasterization speed (the 4080 is what, about 3.5 times a 1080 in that regard?).
 
People are really shorting it hard rn? 🤔 🧐
There's no reason Nvidia's stock should be doing as well as it is other than the possibility of selling more chips due to AI. AI is the new buzzword, like cloud computing, blockchain, and Meta. Oddly enough, Nvidia has been involved in all three.
 
Literally, there are reviewers justifying a $200 premium, pricing brackets be damned, because of RT. So we can't spend an extra $50 to make sure your MB has PCIe 4.0, but we can spend an extra $200 because... ray tracing (and actually more if you want a decent experience). It's bonkers. I could easily provide video after video of some pretty famous reviewers justifying insane premiums because of RT alone.

Your top-of-the-line 2080 Ti is playing this game at no more than 54 FPS @ 1080p!!!! The lows are worse. You're going to feel every RT hit while playing, and it's going to be obvious. The 3060 Ti? LOL. You top out at 34 FPS at 1080p w/ RT on.

View attachment 551279
Guess what? Ray tracing can be used in conjunction with lower-than-ultra settings and still be a net gain in visuals. Also, DLSS. EDIT: Oh yeah, lol... single-game example from a known buggy game that just released.
 
Literally, Hardware Unboxed did an entire video on it. It does. No one is moving goalposts here. You mentioned the 2080 Ti; I looked at its performance. It's not great. That's about it.
You looked at one game that has known issues with any NV GPU under 16GB of video memory, and it also runs at half the fps on Nvidia compared to AMD, with GPU utilization not reaching 50%. Yeah, very representative of RT overall.
All of this. We go from "$20 to $50 is a completely different price bracket" to, when we're talking about video cards, "a $200 difference is no big deal." It's mental. It's partly responsible for the insane GPU prices.
I don't know where the $200 number came from, but for a GPU that costs $300 it would be a different price bracket; for anyone already spending close to $2,000, $200 is no big deal.
It's been over four years man. The hype and price increases for minimal effect are real, thanks Jensen.
Do you honestly believe that if RTX didn't exist they would not have increased prices? Sure, it has nothing to do with the crypto bubble and AI buzz; it's that damn RTX that's the reason we can't have a 4090 for $700.
 
You looked at one game that has known issues with any NV GPU under 16GB of video memory, and it also runs at half the fps on Nvidia compared to AMD, with GPU utilization not reaching 50%. Yeah, very representative of RT overall.

I don't know where the $200 number came from, but for a GPU that costs $300 it would be a different price bracket; for anyone already spending close to $2,000, $200 is no big deal.

Do you honestly believe that if RTX didn't exist they would not have increased prices? Sure, it has nothing to do with the crypto bubble and AI buzz; it's that damn RTX that's the reason we can't have a 4090 for $700.

Not really worth arguing over it. Fact is, ray tracing is going to be the future, whether they want to believe it or not. It won't become standard for a long time though, not for at least another two GPU generations and the next console generation. Hopefully the next consoles are more comparable to higher-end GPUs next go-around, because games will be built around them.

So, again, to me this looks like an intentionally sabotaged raster screenshot, meant to make the saturated RT one look better, and the supposed RT effects look bad to me, like they are too perfectly reflective.

Saturate the left pic a little more, and I think I might actually prefer it.

I feel like this is similar to PhysX when it first came out: overdone particle effects and debris flying around. Was it technologically better? Yes, but it looked overdone.

I think in part it has to do with them trying to create aspects that pop out more, because most games tend to have only some things ray traced. Until we get better hardware and better consoles, games will always need to have raster available, and ray tracing will look more tacked-on and overdone.
 
Not really worth arguing over it.
It is always worth arguing with misinformation; even if I can't change the minds of those spreading it, at least I can prevent it from spreading verbatim.
Fact is, ray tracing is going to be the future, whether they want to believe it or not. It won't become standard for a long time though.
It seems to already be a standard, as many new AAA games support it. It will take more time until it trickles down to the low end, but it is a standard.
Not for at least another two GPU generations and the next console generation. Hopefully the next consoles are more comparable to higher-end GPUs next go-around, because games will be built around them.
Some console games already have some RT, so by the next console gen I think it will be the default for every game, and the ones with no RT will be the odd ones.
 
I am a graphics whore and do wet my pants over all kinds of new visual eye-candy gimmicks. Sure, today the glued-on ray tracing effects on top of raster don't offer much, especially considering how heavy they are, but the future potential is immense.
The RTX Minecraft, which uses proper path tracing, is simply stunning. Just look at the water and see how it bends the light, making gauging water depth really hard, just like real water in a swimming pool. Or how it can simulate a camera obscura effect: an object in a well-lit room casting an upside-down image through a pinhole into a dark room. Does any of this add anything to the game? Probably not, but it is amazing stuff nonetheless, and another tool in the toolbox for getting past the "video game look" uncanny valley.
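Not that anyone needs to implement it to enjoy it, but the water effect described above really is just Snell's law applied per traced ray. A minimal sketch (the function name and the 45-degree example are mine; the refractive indices are the standard air/water values):

```python
import math

def refract_deg(incident_deg, n_air=1.0, n_water=1.33):
    """Angle (degrees from the surface normal) of a ray after it
    enters water, per Snell's law: n1*sin(i) = n2*sin(t)."""
    sin_t = (n_air / n_water) * math.sin(math.radians(incident_deg))
    if abs(sin_t) > 1.0:
        return None  # total internal reflection (only going water -> air)
    return math.degrees(math.asin(sin_t))

# A ray hitting the surface 45 degrees off the normal bends to ~32 degrees,
# which is exactly why eyeballing pool depth is so hard: the floor appears
# at roughly depth / 1.33, about a quarter closer than it really is.
angle = refract_deg(45.0)
```

Every path-traced water pixel is doing this bend (plus a Fresnel split between the reflected and refracted ray), which raster water shaders only approximate.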

Sure, this is all very GPU-heavy, but it will get better over time, just like all graphics effects. Hell, I am pretty sure we had this exact same discussion on this forum over ambient occlusion when it came out. At first it was quite heavy on the GPUs of the time, not as heavy as RT is today in comparison, but many considered it a gimmick all the same. Today we cannot think of video game graphics without AO.
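For anyone who missed that AO debate the first time around, the whole idea is just asking what fraction of the sky above a surface point is blocked by nearby geometry. A toy Monte Carlo sketch (this is offline hemisphere sampling, not the screen-space depth-buffer trick real SSAO uses; the scene and names are made up):

```python
import math
import random

def ambient_occlusion(is_blocked, n_samples=2000, seed=42):
    """Estimate ambient light reaching a point: the fraction of random
    hemisphere directions that escape without hitting geometry.
    Returns 1.0 for a fully open sky, 0.0 for fully occluded."""
    rng = random.Random(seed)
    open_dirs = 0
    for _ in range(n_samples):
        # Uniform random direction on the upper hemisphere (z >= 0):
        # for a sphere, uniform z gives uniform surface area.
        z = rng.random()
        phi = 2.0 * math.pi * rng.random()
        r = math.sqrt(max(0.0, 1.0 - z * z))
        d = (r * math.cos(phi), r * math.sin(phi), z)
        if not is_blocked(d):
            open_dirs += 1
    return open_dirs / n_samples

# Toy scene: a wall blocks every direction with x > 0,
# so roughly half the hemisphere is occluded.
ao = ambient_occlusion(lambda d: d[0] > 0.0)
```

Real-time SSAO cheats by sampling the depth buffer instead of tracing rays, which is exactly the kind of approximation RT hardware is meant to make unnecessary.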
 
Guess what? Ray tracing can be used in conjunction with lower-than-ultra settings and still be a net gain in visuals. Also, DLSS.
Got the game. Not mind-blowing. Does it add some effects above plain raster? Yeah. Is it something that couldn't be replicated in raster? No.
EDIT: Oh yeah, lol... single-game example from a known buggy game that just released.
The people shilling RT were the ones who brought that game into the discussion as something amazing. Now it's "oh, don't look at that game." Make up your minds.
 
Do you honestly believe that if RTX didn't exist they would not have increased prices? Sure, it has nothing to do with the crypto bubble and AI buzz; it's that damn RTX that's the reason we can't have a 4090 for $700.
No, why would Nvidia and AMD ever try to recoup the billions of dollars spent on RTX/RT from the consumer? It's free, right? Crypto MADE these companies money; you are seriously lost about how economics works. LOL, AI buzz. Get your head out of the mainstream media's butthole and try to sense reality. Do you honestly believe that dedicated RTX silicon and millions of lines of RTX code are free? A super fast raster-only card that could do 99.9% of what a 4090 does sure as hell would have cost $700. It's the people buying $1,500 4090s and similar video cards for that RTX twinkle that created this mess. Jensen sucked you in good. You have no one to blame but yourself and like-minded RTX fans for the current GPU market and pricing.
 
I don't know where did the $200 number came from, but for a GPU that cost $300 it would be a dififerent price bracket, but for anyone already spending close to $2000 $200 is no big deal.
A mid-range card can do 1080p @ 60 FPS (actually quite a bit more), no? OK, so if I wanted 1080p @ 60 FPS w/ RT, can that happen at mid-range prices? No. How much would I have to spend? At least the cost of a 3080, right? How much is that? $800. That's just to play the game at 60 FPS w/ RT on at 1080p. Now, where does a 3080 actually stack up in raster?
[attached chart: raster benchmark comparison]

Comparable to a 6800 XT, right? How much does that cost? Top end, $600 (but the $500 range is common). So you're spending an extra $200 (at minimum) for a card that raster-wise is pretty damn close. A $200 difference is another pricing tier altogether. How much does a 3060 Ti cost? About $500, right? And where does it perform? Yup. This is why prices are crazy.

I think RT can be a good thing, definitely something to look at in the future. But is it here now? No. Is it here enough to justify these prices? Absolutely not.

Until people take off their RT-colored glasses the prices will continue to be like this, and it's not good for anyone.
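The value math being argued in this post is simple enough to write down. A sketch using the post's rough prices and a placeholder 100 fps for both cards (since the post calls their raster performance "pretty damn close"; none of these numbers are benchmark results):

```python
def dollars_per_fps(price_usd, avg_fps):
    """Crude value metric: what each average raster frame costs you."""
    return price_usd / avg_fps

# Post's rough figures: ~$800 RTX 3080 vs ~$600 RX 6800 XT,
# assumed equal raster fps for illustration.
rtx_3080 = dollars_per_fps(800, 100)
rx_6800xt = dollars_per_fps(600, 100)
premium = rtx_3080 - rx_6800xt  # the extra cost per frame attributed to RT
```

With equal raster fps, the whole $200 gap (here, $2 extra per frame) is what the buyer is paying for the RT/DLSS feature set, which is exactly the premium the post is disputing.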
 
Haters gonna hate.

Not sure what all the fuss over ray tracing is really about... Soon after the AI sexbots revolutionize pleasure, we will all get ported into a ray-traced virtual reality indistinguishable from real life (see Caprica).

So, you want that tech to get better, which requires iterative improvements over X number of generations. Pay up.
 
Not sure what all the fuss over ray tracing is really about... Soon after the AI sexbots revolutionize pleasure, we will all get ported into a ray-traced virtual reality indistinguishable from real life (see Caprica).
I can't remember the name of it, but my wife would watch this weird show where the employers of this fantasy land would bang the robots. I'd be down for that. haha
 
No, why would Nvidia and AMD ever try to recoup the billions of dollars spent on RTX/RT from the consumer? It's free, right? Crypto MADE these companies money; you are seriously lost about how economics works. LOL, AI buzz. Get your head out of the mainstream media's butthole and try to sense reality. Do you honestly believe that dedicated RTX silicon and millions of lines of RTX code are free?
*sigh*

You realize that no matter what they do they're going to do two things:
1.) Improve tech.
2.) Charge you for it.

I could swap "Ray Tracing" for "Raster" in every sentence you state and it would read the same and be just as true. I could also insert DX1-DX11, bump mapping, FSAA (all the way to modern TXAA), global illumination, hardware T&L, SSAO, HBAO, or any other rendering technology, and it would read exactly the same. All of that raster tech takes die space. Why do we need SSAO? What a waste of silicon. Why did we have all these companies write code for SSAO? You know that costs money, right? (Or insert any raster tech you want: AA, GI, whatever, it all reads the same.) None of it is free. It never has been. Have you not bought any video cards, ever?

For fun, you should scroll through all of nVidia's GameWorks SDKs. We can play a game about how many of them are for raster: https://developer.nvidia.com/gameworksdownload
What a waste of so much code.

You guys are boring. Just stop buying video cards. You no longer have an interest in computing as a hobby.

A super fast raster-only card that could do 99.9% of what a 4090 does sure as hell would have cost $700. It's the people buying $1,500 4090s and similar video cards for that RTX twinkle that created this mess. Jensen sucked you in good. You have no one to blame but yourself and like-minded RTX fans for the current GPU market and pricing.
This is all because he can. It has zero to do with RT. nVidia could easily charge $800 for the 4080 right now, RT included, with zero changes. The 4090 is a bit different, as it's literally a Titan-class card. You have this weird notion (referencing my #1 and #2 above) that other rendering technologies don't cost money in R&D + coding + manufacturing, that GPUs haven't always been bleeding-edge technology, and that they therefore wouldn't charge you for it, as if that hasn't happened for the past 20 years. All of this is an absolutely absurd assertion.
 
So are you still playing Zork I and other text-based games? I mean, why have graphics at all?

Ray tracing is still new (as far as mainstream game development goes), and it will take time for developers to utilize it efficiently. DirectX 12 launched in 2015, and AAA games released almost 8 years later still don't fully utilize it.

I upgraded from a 1080 to a 3090 right before Christmas and find that ray tracing really is wonderful eye candy, and I am excited to see what games will look like over the next several years.
I'm with you. When I got my 3090 it was cool, then I ran out of games in a hurry. Control was the only one that stayed impressive, to be honest; the rest were "OK, cool, but this would still be a blast without RT too."
 
I could swap "RTX" for "Raster" in every sentence you state and it would read the same and be just as true. I could also insert DX1-DX11, bump mapping, FSAA, global illumination, hardware T&L, SSAO, or any other rendering technology, and it would read exactly the same. All of that raster tech takes die space. Why do we need SSAO? What a waste of silicon. Why did we have all these companies write code for SSAO? You know that costs money, right? (Or insert any raster tech you want: AA, GI, whatever, it all reads the same.) None of it is free. It never has been. Have you not bought any video cards, ever?
BS! No one paid an extra $200 for hardware T&L, or FSAA or MSAA either. :ROFLMAO::ROFLMAO: I swear you guys will just make stuff up.
 
*sigh*

You realize that no matter what they do they're going to do two things:
1.) Improve tech.
2.) Charge you for it.

I could swap "RTX" for "Raster" in every sentence you state and it would read the same and be just as true. I could also insert DX1-DX11, bump mapping, FSAA, global illumination, hardware T&L, SSAO, or any other rendering technology, and it would read exactly the same. All of that raster tech takes die space. Why do we need SSAO? What a waste of silicon. Why did we have all these companies write code for SSAO? You know that costs money, right? (Or insert any raster tech you want: AA, GI, whatever, it all reads the same.)
You guys are boring. Just stop buying video cards. You no longer have an interest in computing as a hobby.

You missed the point: raster effects never enveloped, took over, and stagnated the graphics industry to the extent RTX/RT has, AND they never cost an arm and a leg. Fk new tech if it's cost- and performance-prohibitive. Yet here everybody is, complaining day after day about GPU pricing. How hypocritical.

This is all because he can. It has zero to do with RT. nVidia could easily charge $800 for the 4080 right now, RT included, with zero changes. The 4090 is a bit different, as it's literally a Titan-class card. You have this weird notion (referencing my #1 and #2 above) that other rendering technologies don't cost money in R&D + coding + manufacturing, that GPUs haven't always been bleeding-edge technology, and that they therefore wouldn't charge you for it, as if that hasn't happened for the past 20 years. All of this is an absolutely absurd assertion.

Tell me how RTX development, with dedicated hardware and millions of lines of code and millions of hours of development, compares to an equivalent raster effect. Keep believing your fantasy because it fits your narrative and not reality.
 
BS! No one paid an extra $200 for Hardware T&L, or FSAA, MSAA either. :ROFLMAO::ROFLMAO: I swear you guys will just make stuff up.
I didn't assign any dollar value to any of this. But how do you know how much you paid for any given piece of tech? To a large degree you did, if you ever bought a flagship card.
The difference between "ultra" settings and not, at every resolution, has always been a cost. Maxing out Quake 3 took years. Running Crysis (implied: at max settings) was literally a meme. I suppose you could conveniently forget every flagship game for 20+ years too.

You definitely skipped over people buying 2x-4x CrossFire/SLI in order to run triple monitors. Back then, folks were paying literally thousands extra to increase resolution and keep ultra settings. I don't see how the situation we're in now is any different. And to directly contradict you, FSAA had a massive penalty originally. Everything you're complaining about with RT was said about FSAA, to the letter: not worth it, not enough of a visual enhancement, too much of a frame rate penalty. The irony is you either weren't around for those techs or you have very selective memory about them as they came out. T&L required specific cards when first launched; that's what the T&L was.

You missed the point: raster effects never enveloped, took over, and stagnated the graphics industry to the extent RTX/RT has, AND they never cost an arm and a leg.
What the hell are you talking about? I remember when the 4000 series cards came out and nVidia pushed the pricing to $500 and we all thought it was insane.

I guess you were never around for the nVidia FX 5800 cards either. Now that was a generation of truly "stagnated" graphics. In contrast, there is significant visual benefit to RT (which you disagree with, but whatever) and performance gains in that particular graphical improvement over the last three series of graphics cards.
Fk new tech if it's cost- and performance-prohibitive. Yet here everybody is, complaining day after day about GPU pricing. How hypocritical.
Again, are you disconnected from reality? You're blaming RT, but that isn't what caused this. You're focused on entirely the wrong thing. If we could magically make RT never have existed, nVidia would still have pushed their cards to these prices after the last two years, because they're dicks. Them being dicks is completely independent of RT technology.
They saw a way to sell more cards and inflate pricing, so they did it. That has nothing to do with RT. I literally stated in the last post that they could sell the current 4080 for $800 if they wanted, but they have chosen not to. You've conveniently ignored that statement too.
Tell me how RTX development, with dedicated hardware and millions of lines of code and millions of hours of development, compares to an equivalent raster effect. Keep believing your fantasy because it fits your narrative and not reality.
Okay, I will, because it's the truth? Or are you saying we could remove the entire portion of the GPU that handles traditional raster rendering and be fine? We have had 20+ years of raster and maybe 5 of RT. I can tell you with 100% certainty which has cost the industry more over time, even if we don't adjust for inflation.
 
I didn't assign any dollar value to any of this. But how do you know how much you paid for any given piece of tech? To a large degree you did, if you ever bought a flagship card.
The difference between "ultra" settings and not, at every resolution, has always been a cost. Maxing out Quake 3 took years. Crysis was literally a meme. I suppose you could conveniently forget every flagship game for 20+ years too.

Anybody with a sense of the GPU market and development can see that Nvidia's RTX investment over the past five years far exceeds any ONE EQUIVALENT RASTER EFFECT. Stop comparing the entire raster development stack to RT; that's not a valid comparison.



What the hell are you talking about? I remember when the 4000 series cards came out and nVidia pushed the pricing to $500 and we all thought it was insane.

Jensen called; he wants his RTX development money back. Oldest marketing trick in the book: the first couple of hits are discounted. Once you're hooked, jack up the price.

I guess you were never around for the nVidia FX 5800 cards either. Now that was a generation of truly "stagnated" graphics. In contrast, there is significant visual benefit to RT (which you disagree with, but whatever) and performance gains in that particular graphical improvement over the last three series of graphics cards.

Significant visual benefit from RT? Fantasy land again. Please stop.
Again, are you disconnected from reality? You're blaming RT, but that isn't what caused this. You're focused on entirely the wrong thing. If we could magically make RT never have existed, nVidia would still have pushed their cards to these prices after the last two years, because they're dicks. Them being dicks is completely independent of RT technology.
They saw a way to sell more cards and inflate pricing, so they did it. That has nothing to do with RT.

It has EVERYTHING to do with RTX. That is Nvidia's only selling point over AMD cards, and everyone, including you, ate it up. When a company owns 80% of the market, yeah, they have some pricing control, to say the least. This is what happens.


Okay, I will, because it's the truth? Or are you saying we could remove the entire portion of the GPU that handles traditional rendering and be fine? We have had 20 years of raster and maybe 5 of RT. I can tell you with 100% certainty which has cost the industry more over time, even if we don't adjust for inflation.
Again, you are comparing the development cost of one lame sprinkle-lighting effect (that really only works well in a couple of games) over 5 years to the entire raster development stack over 20 or more. WTF?
 