Metro Exodus and Ray Tracing...

Just did a quick look at Steam. If NVIDIA's RTX is a failure...then AMD is a total catastrophe judging by the numbers:

RTX 2060 - 0.68% (gained 0.14%)
RTX 2070 - 0.95% (gained 0.1%)
RTX 2080 - 0.64% (gained 0.04%)
RTX 2080 Ti - 0.35% (gained 0.05%)

That gives us a sum of 2.62% RTX-enabled GPUs on Steam, with growth of 0.29% last month.
The top 3 AMD GPUs represented are these:

AMD RX580 - 1.25% (gained 0.1%)
Radeon R7 - 0.90% (lost 0.08%)
Radeon R5 - 0.73% (lost 0.04%)

That gives AMD's three best-selling GPUs a combined Steam share of 2.88%, with growth of -0.02% last month.
So what...next month we can add AMD's 4th best-selling GPU to the list...and count down to when NVIDIA's "failing" RTX line has outsold the 4 best-selling AMD GPUs.
It's all a matter of perspective, right? ;)
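The sums quoted above are easy to sanity-check with a few lines of Python (the shares and monthly deltas below are copied straight from the figures in this post; nothing else is assumed):

```python
# Steam survey share (%) and monthly change (%), as quoted in the post above
rtx = {"RTX 2060": (0.68, 0.14), "RTX 2070": (0.95, 0.10),
       "RTX 2080": (0.64, 0.04), "RTX 2080 Ti": (0.35, 0.05)}
amd = {"RX 580": (1.25, 0.10), "Radeon R7": (0.90, -0.08),
       "Radeon R5": (0.73, -0.04)}

rtx_share  = round(sum(share for share, _ in rtx.values()), 2)
rtx_growth = round(sum(delta for _, delta in rtx.values()), 2)
amd_share  = round(sum(share for share, _ in amd.values()), 2)
amd_growth = round(sum(delta for _, delta in amd.values()), 2)

print(rtx_share, amd_share, amd_growth)  # 2.62 2.88 -0.02, matching the post
print(rtx_growth)                        # 0.33, not the 0.29 stated
```

Worth noting: the four RTX monthly gains as listed actually sum to 0.33%, not the 0.29% stated, so one of the quoted figures is slightly off; the 2.62% and 2.88% share totals do check out.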


So if we extrapolate the whole unneeded NV vs AMD stuff, it also tells us this...

At best, the ray tracing crowd is about ~3%, and this "gigantic" percentage is split into 4 performance brackets, and the top bracket, well, it doesn't excel at it... this sounds like a developer's worst nightmare.

Raytracing might be worth paying for in the next generation or the one after that... right now? Not so much.
 
So if we extrapolate the whole unneeded NV vs AMD stuff, it also tells us this...

Not needed...people are calling it a failure, but the facts tell a different story...bottom line is:
You don't matter, sunshine...no matter how butthurt you are ;)

At best, the ray tracing crowd is about ~3%, and this "gigantic" percentage is split into 4 performance brackets, and the top bracket, well, it doesn't excel at it... this sounds like a developer's worst nightmare.

Funny...I linked to a Metro Exodus video running ray tracing...on an RTX card....don't lie so obviously...it makes you look like a fool ;)
Besides, butthurt fanboys trying to claim developers don't want this are funny...try following some developers on social media...they don't agree with all the butthurt; in fact they are exploring ray tracing like there is no tomorrow.
Why are you butthurt over facts?

Raytracing might be worth paying for in the next generation or the one after that... right now? Not so much.

People don't seem to care about your whining...and the numbers are going up, unlike AMD's numbers...try to stop posting blatant lies....no one cares that you're butthurt...
 
Not needed...people are calling it a failure, but the facts tell a different story...bottom line is:
You don't matter, sunshine...no matter how butthurt you are ;) - uhhmm, what? So people calling it a failure is a fact??



Funny...I linked to a Metro Exodus video running ray tracing...on an RTX card....don't lie so obviously...it makes you look like a fool ;)
Besides, butthurt fanboys trying to claim developers don't want this are funny...try following some developers on social media...they don't agree with all the butthurt; in fact they are exploring ray tracing like there is no tomorrow.
Why are you butthurt over facts?

I took the data from your Steam chart - you delivered it, but thank you for calling me a retarded butthurt fanboy - it does wonders to bring your side of the argument out into the open. As I said to begin with - cue the zealots!!! :)



People don't seem to care about your whining...and the numbers are going up, unlike AMD's numbers...try to stop posting blatant lies....no one cares that you're butthurt... - Well, it's pretty obvious that you cared..

But in essence... I care about as much about your opinions as it seems you do about mine.. keep on trucking, big fanboy :)
 

Which part of "the RTX series is overtaking AMD's numbers" does not compute?
It's not rocket science, it's not even ray tracing...it's maths 101.

Data and facts beat butthurt every day ;)
 
I don't give a shit about RTX, I bought the 2080 Ti because it's 25-35% faster than a 1080 Ti. I'm almost done with Metro and haven't even bothered turning on RTX. I got 75 FPS in Quake II RTX and played around with it for about 5 minutes before saying "meh". At worst it's a gimmick and at best it's a novelty glimpse into the future of PC gaming. Yes, I would have liked to see a lower price on the 2080 Ti, who wouldn't? It's still a relatively inexpensive hobby we have here. Upgrading to the top-dog card every gen only sets me back the difference after selling the "old" card. Not exactly bank-breaking.
 
I don't give a shit about RTX, I bought the 2080 Ti because it's 25-35% faster than a 1080 Ti. I'm almost done with Metro and haven't even bothered turning on RTX. I got 75 FPS in Quake II RTX and played around with it for about 5 minutes before saying "meh". At worst it's a gimmick and at best it's a novelty glimpse into the future of PC gaming. Yes, I would have liked to see a lower price on the 2080 Ti, who wouldn't? It's still a relatively inexpensive hobby we have here. Upgrading to the top-dog card every gen only sets me back the difference after selling the "old" card. Not exactly bank-breaking.

And that's all I was trying to say.. RTX has the potential to be great one day..
Which part of "the RTX series is overtaking AMD's numbers" does not compute?
It's not rocket science, it's not even ray tracing...it's maths 101.

Data and facts beat butthurt every day ;)

Sure it does, especially on your scoreboard.. it's just, well, I don't have a copy - so only you know the winner... :) and well, I was not competing in the AMD vs NVIDIA fight you started.. so you won again :)

I was only commenting on the 3% of RTX hardware owners - nothing else.. reading can be tough when done through an nV-biased filter..
 
I don't give a shit about RTX, I bought the 2080 Ti because it's 25-35% faster than a 1080 Ti. I'm almost done with Metro and haven't even bothered turning on RTX. I got 75 FPS in Quake II RTX and played around with it for about 5 minutes before saying "meh". At worst it's a gimmick and at best it's a novelty glimpse into the future of PC gaming.

Careful, Factum is labelling anybody who thinks ray tracing is 'meh' as an AMD butthurt fanboy.
 
Let's list the lies/FUD so far...from several threads, but this is the gist of the butthurt:

  • Doesn't work - Blatant lie, it's working fine...just not on AMD (and therein lies the problem....butthurt)
  • Not worth it - Yet RTX cards rose 0.29% this month on Steam; their rasterization/DXR performance is clearly enough to drive sales.
  • Developers don't want this - Again a blatant lie, developers want this, and game engines are chiming in with support..I guess the developers are not butthurt over DXR.
  • Raytracing is taking up 30-50% of the die, which would have been better used on CUDA cores - Again a lie...RT cores take up a MAXIMUM of 15% of the die space.
  • RTX CUDA cores show no progress in performance - Again a lie..they are ~40% better than Pascal CUDA cores, but the butthurt over DXR has drowned this out.

So the butthurt is making these claims pop up...several threads have been started with what I would call "fake news"/"alternative facts".
Can you come up with more fake posts? :D
 
Well, why buy this generation then - for the expensive and slow experience?

You mean besides the fact that RTX cards are also the fastest cards at everything non-RT?

There’s also the fact that expensive is relative. Your expensive will be somebody else’s pocket change. Obviously people are buying the cards.

Yes RT is hard and performance is much lower than rasterization but it’s far from unplayable. You may not see the value of raytraced games at 1080p but guess what? A lot of people do.
 
You mean besides the fact that RTX cards are also the fastest cards at everything non-RT?

There’s also the fact that expensive is relative. Your expensive will be somebody else’s pocket change. Obviously people are buying the cards.

Yes RT is hard and performance is much lower than rasterization but it’s far from unplayable. You may not see the value of raytraced games at 1080p but guess what? A lot of people do.

Well, for Pete's sake then, just buy it for the performance..

And you are also right about your second point...

But these points do not change my view - and that is the only thing I have written.

I believe RTX is poor value atm, and I personally would wait for the next generation if I were in the market. I also, again on a personal level, feel like NVIDIA sold me a turd; had I known then what I know now, I would have skipped this generation (but that is only my own personal AMD Butthurt Edition opinion.)
 
And again, the OP needs to go see a doctor, because I can see the difference...but the OP (like most other "I doooon't like NVIDI*cough*Raytracing" posters)...has made a hit-and-run: post the thread, leave it, and don't come back...because defending FUD is way harder than just posting it:

 
Well, for Pete's sake then, just buy it for the performance..

And you are also right about your second point...

But these points do not change my view - and that is the only thing I have written.

I believe RTX is poor value atm, and I personally would wait for the next generation if I were in the market. I also, again on a personal level, feel like NVIDIA sold me a turd; had I known then what I know now, I would have skipped this generation (but that is only my own personal AMD Butthurt Edition opinion.)

I bet you see no value in DXR...until AMD supports it.
Prove me wrong.
 
The purpose of RTX cards is only to set the stage for wider RT game adoption down the road, regardless of how shitty performance is initially. It took balls for Nvidia to do this, knowing how pathetic performance is and the flak they would receive over it, but it needed to be done. Anyone who bought RTX cards, myself included, was under no illusions. We ALL knew RT performed horribly, but the main fact remains that with normal rendering these are the fastest GPUs available. And for now that is ALL that matters. I waited to see what the Radeon VII could do, but got a 2080 for about the SAME price, and it's faster, cooler, uses less power... and most importantly no vacuum-cleaner dB levels when I'm gaming. These are the areas that matter to gamers currently, not some half-baked features that are not yet practically feasible. RT at this stage is not even worth mentioning. Everyone knows this, but some shit stirrers want to make it the only thing to focus on regardless of how good the RTX cards are in traditional rendering.
 
The purpose of RTX cards is only to set the stage for wider RT game adoption down the road, regardless of how shitty performance is initially. It took balls for Nvidia to do this, knowing how pathetic performance is and the flak they would receive over it, but it needed to be done. Anyone who bought RTX cards, myself included, was under no illusions. We ALL knew RT performed horribly, but the main fact remains that with normal rendering these are the fastest GPUs available. And for now that is ALL that matters. I waited to see what the Radeon VII could do, but got a 2080 for about the SAME price, and it's faster, cooler, uses less power... and most importantly no vacuum-cleaner dB levels when I'm gaming. These are the areas that matter to gamers currently, not some half-baked features that are not yet practically feasible. RT at this stage is not even worth mentioning. Everyone knows this, but some shit stirrers want to make it the only thing to focus on regardless of how good the RTX cards are in traditional rendering.

Well, dollar for dollar, the new generation seems to be performing the same as the old one..

"We ALL knew RT performed horribly" - cool thought, bro :)

The big problem here, as I see it: I have an RTX card - I tried the games.. and I still think it's meh - it has NOTHING to do with AMD at all..

It's actually ok to be meh about something I bought.. next time I will wait for the software before buying the hardware.
 
The purpose of RTX cards is only to set the stage for wider RT game adoption down the road, regardless of how shitty performance is initially. It took balls for Nvidia to do this, knowing how pathetic performance is and the flak they would receive over it, but it needed to be done. Anyone who bought RTX cards, myself included, was under no illusions. We ALL knew RT performed horribly, but the main fact remains that with normal rendering these are the fastest GPUs available. And for now that is ALL that matters. I waited to see what the Radeon VII could do, but got a 2080 for about the SAME price, and it's faster, cooler, uses less power... and most importantly no vacuum-cleaner dB levels when I'm gaming. These are the areas that matter to gamers currently, not some half-baked features that are not yet practically feasible. RT at this stage is not even worth mentioning. Everyone knows this, but some shit stirrers want to make it the only thing to focus on regardless of how good the RTX cards are in traditional rendering.

The shit stirrers are people like the OP and their ilk.
If they stopped posting blatantly false claims...we wouldn't be having the fake-news threads...

Spot the pattern, it is really easy:
1. Post an anti-NVIDIA RT video making false claims
2. Leave thread.
3. ???
4. Profit.

This is not the first thread to be posted...and it won't be the last.

It might stop when AMD supports DXR...and we will have some fun foot-in-mouth moments though.
 
It gets even more fun when looking at which GPUs are the biggest "increasers" ^^
[attached screenshot: Steam survey table of month-over-month GPU share increases]


Looks like RTX is doing well...
 
I went all in for a Strix 2080 Ti ($1500), coming from one rig with a 1080 Ti ($750) that could sometimes, maybe, do some 4K stuff at 60 fps if it wasn't too demanding. The other rig, made for 4K/60 fps, had 2x 1080s ($1100) in SLI and mostly achieved the goal as long as the game supported SLI and didn't exceed the 8GB ceiling. The 2080 Ti pretty much holds that sweet 60 fps in just about everything without RT, and in less demanding titles manages upwards of 80-90 fps in 4K. Before those I had 2x 970s ($700) in SLI for 1440p and some 4K. For RT it's the usual balancing act of settings I've seen with every generation since the mid 2000s. So I went from $700 to $750 to $1100 to $1500 to get this performance level. Yeah, the price sucks, but it's obvious that a performance-vs-monetization model has been in action long before Turing. Those who want cheap prices for max performance need to look beyond the days of Maxwell for those polished memories.

Back to the topic (more or less). My experience with Metro has been subtle, but I also find the Ultra setting gives the most dramatic effects. I found this to be true with SOTTR too. It's the downside of most audio and visual upgrades I've done over the years: once you get used to the newer, more maxed-out version of something, going back doesn't look or sound right. I'm glad they got DLSS figured out, as it was rough in the beginning. HDR is still screwed, so anyone trying to test this game needs to know that. It probably won't ever get fixed at this point. That patch a couple of months ago was a smoke screen; do enough testing turning off MS HDR and in-game HDR and you'll see it's doing nothing when toggled on/off in game. Ultra uses RT to do a lot more in the game than High does, but even then it's still using RT in limited capacities. There are some great reviews (https://www.thefpsreview.com/2019/06/08/quake-ii-rtx-performance-review/) on the Quake II RTX demo out there explaining the differences in various implementations of RT that I'd recommend for anyone wanting to learn more.
 
I bet you see no value in DXR...until AMD supports it.
Prove me wrong.

LOL you are one sad panda. Show me on the doll where AMD touched you?

There is no value in DXR until every company has it and more importantly that the mainstream market can use it. Until then it's only a feature on overpriced cards that a lot of people don't seem to care about at the moment. And that's coming from owners of said cards, not AMD butthurt fanboys.

It gets even more fun when looking at which GPUs are the biggest "increasers" ^^
[attached screenshot: Steam survey table of month-over-month GPU share increases]

Looks like RTX is doing well...

More laughs, you are funny!! RTX is doing so well that the 1070 increased by a higher percentage than any of the RTX cards in that table. In fact, if you compare the percentage increase of the non-RTX cards to that of the RTX cards, you can see that the non-RTX cards have increased by more. Which is quite an achievement considering that there are so many more Pascal cards out there.

If you are using this table to show that RTX is doing well, then your definition of 'doing well' is a lot different than mine.
 
Yeah, blame me again for not sharing your "vision". I'm getting used to it. You just ignore everything I write. On this matter I will tell you again: it is near impossible to tack things on late in GPU development, so Nvidia knew they were doing ray tracing 3 years ago, with plans for ray tracing. And people writing engines do know about ray tracing, because it has been around for more than a few decades.

All you are doing is digging a deeper hole, revealing more of your ignorance. The long existence of ray tracing as a concept/algorithm and in programs has ZERO to do with incorporating it in games. Incorporating new technology in games is about learning new APIs that didn't previously exist and how to work them well into your game. It takes experience to do that well, and experience takes time.

Dice also showed you that you are wrong: they patched Battlefield V to not ray trace as many things as they started with, just to accommodate better frame rates. If the software were so complex, they would not be able to fix it that fast.

Actually, that reinforces what I wrote in the very post you quoted. I can try to explain it to you, but you seem so (willfully) ignorant of the realities of SW development that you will likely fail to get why.

.. because you really have no prior experience with the kind of bugs you will encounter, or what kind of performance expectations to have, leading to much more trial and error to figure things out.

What you are seeing with DICE is that trial and error working itself out in public, as they release patches to improve performance while they gain experience with the new technology.

In your strange fantasy where experience/time aren't needed for new technology, why has Dice needed to patch their RTX work to improve performance? Shouldn't it have been perfect from the beginning?

Really, you should stick to arguing something defensible. This weird stance that incorporating new technology takes no time or experience is just pure ignorant nonsense.

That you want to keep drawing attention to this ignorant nonsense is just baffling.
 
LOL you are one sad panda. Show me on the doll where AMD touched you?

AMD didn't touch me...because they are absent from the DXR race ;)

There is no value in DXR until every company has it and more importantly that the mainstream market can use it. Until then it's only a feature on overpriced cards that a lot of people don't seem to care about at the moment. And that's coming from owners of said cards, not AMD butthurt fanboys.

Yet it is the butthurt fanboys whining...making threads with false claims.
And DXR does have value...it's not NVIDIA DXR....it's Microsoft DXR...you seem to keep forgetting that ;)
And there's a lot more to it than what you see...go talk to developers...the guys making the next games...you'll find their view does not align with yours ;)

Butthurt fanboys in one corner....devs in another...I know which corner I will go sit in ;)



More laughs, you are funny!! RTX is doing so well that the 1070 increased by a higher percentage than any of the RTX cards in that table. In fact, if you compare the percentage increase of the non-RTX cards to that of the RTX cards, you can see that the non-RTX cards have increased by more. Which is quite an achievement considering that there are so many more Pascal cards out there.

If you are using this table to show that RTX is doing well, then your definition of 'doing well' is a lot different than mine.

If they are doing "bad"...what is AMD then doing?
Think before you post next time ;)
The 1070 can btw also do shader-based DXR...did you just cite a shader-DXR-capable card as proof that DXR is a failure...lol
 
What you are seeing with DICE is that trial and error working itself out in public, as they release patches to improve performance while they gain experience with the new technology.
In your strange fantasy where experience/time aren't needed for new technology, why has Dice needed to patch their RTX work to improve performance? Shouldn't it have been perfect from the beginning?
Really, you should stick to arguing something defensible. This weird stance that incorporating new technology takes no time or experience is just pure ignorant nonsense.
That you want to keep drawing attention to this ignorant nonsense is just baffling.
What I am seeing is that Battlefield V is not able to get enough frame rate for the very limited amount of ray tracing that is done. This is probably also true of cards in the lower spectrum; it needs better hardware for it to work. In other words, the things that were being ray traced take too much processing power, which is another problem. Dice patched it by ray tracing less :) this has nothing to do with experience in programming. It tells you the hardware ray tracing is just not as good as Nvidia claims it to be.

What Nvidia needed to do was use the same hardware across their lineup, thus having fewer of these problems where frame rate drops are "killing" the gameplay.

Keep calling me names; it works well for others to see that you just can't handle other people's opinions, nor make yours valid.
 
What exactly is the real ray tracing that Nvidia's offering is not capable of?


Who exactly knew Nvidia will do ray tracing 3 years ago?
What is the source of this information?

EDIT://
Which game companies receive money from AMD to implement AMD's ray-tracing solution?
Limited usage of ray tracing: if you have a whole scene of objects (10 to 20) and only 1 or 2 are ray traced, then what is the value?
Nvidia would know they are making hardware ray tracing, I hope? That means if they were serious, they could have started sharing this knowledge with many of their developers (internal & external).

I have no idea what AMD does with their money, but as it is now I would definitely not be spending it on ray tracing in games.
 
What I am seeing is that Battlefield V is not able to get enough frame rate for the very limited amount of ray tracing that is done. This is probably also true of cards in the lower spectrum; it needs better hardware for it to work. In other words, the things that were being ray traced take too much processing power, which is another problem. Dice patched it by ray tracing less :) this has nothing to do with experience in programming. It tells you the hardware ray tracing is just not as good as Nvidia claims it to be.

As usual, you didn't answer the question. If no time/experience is needed, why didn't Dice get it right the first time? Why have they needed to patch it twice so far?

You can continue to ignore reality, but they are patching as they learn more and gain experience. That should just be common sense, not even needing knowledge of SW development.

Keep digging that hole.
 
Limited usage of ray tracing: if you have a whole scene of objects (10 to 20) and only 1 or 2 are ray traced, then what is the value?
What are you talking about? What do those numbers represent and where did you get them from?

Nvidia would know they are making hardware ray tracing, I hope? That means if they were serious, they could have started sharing this knowledge with many of their developers (internal & external).
Provide a source showing they did not share this knowledge with game developers, or STFU

I have no idea what AMD does with their money, but as it is now I would definitely not be spending it on ray tracing in games.
And why would Nvidia spend their money on this?
 
What was the exact Nvidia claim that you are referring to?
real time ray tracing?
"Limited usage of ray tracing: if you have a whole scene of objects (10 to 20) and only 1 or 2 are ray traced, then what is the value?"

You can claim there is support for ray tracing, but reality tells you it is pretty limited.
What are you talking about? What do those numbers represent and where did you get them from?




And why would Nvidia spend their money on this?
I am talking about the fact that if everything is ray traced, you don't get any frame rate. The example I gave is pretty striking: when you have a scene that you need rendered with ray tracing, and of those 20 objects you can only ray trace 2 because of the limited hardware support there is right now, what value has ray tracing as a whole?

You and others say real time ray tracing, but the reality is different. It is limited because the hardware is not able to handle it (beyond the point where frame rates become an issue).

Try reading some posts and then wonder why I am saying these things. The patches regarding Battlefield V are documented; they show that everything patched was related to doing less ray tracing rather than anything else.

Provide source of information they did not share this knowledge with game developers or STFU
Ehm, I said they would have shared knowledge 3 years ago when they started the RTX project. And you shut the fuck up?
 
What was the exact Nvidia claim that you are referring to?

I suspect it only exists inside his skull...he should read up on what is done in DXR, as he now claims it's done on a per-object basis :confused:

I suggest he starts here:
https://www.nvidia.com/en-us/geforce/news/geforce-gtx-dxr-ray-tracing-available-now/

Code examples can be found here:
https://developer.nvidia.com/rtx/raytracing/dxr/DX12-Raytracing-tutorial-Part-1

And here:
https://developer.nvidia.com/rtx/raytracing/vkray


But I will almost bet he will not educate himself...he'll just keep whining.
 
real time ray tracing?
"Limited usage of ray tracing: if you have a whole scene of objects (10 to 20) and only 1 or 2 are ray traced, then what is the value?"

You can claim there is support for ray tracing, but reality tells you it is pretty limited.
RT acceleration is just additional hardware that calculates ray intersections. What can be done with that is up to developers. How many objects can be in a scene is limited by available memory.
Full scene path tracing can be used (as was proven with Quake II RTX), and all effects known from any 3D rendering software can be implemented, because DXR is fully programmable and RTX cards are fully compliant with DXR.

The only limiting factors on how it can be used in games are performance and available memory. The hybrid approach is a game developer choice; they could just as well implement full scene path tracing if they wanted...
Real time ray tracing does not mean one could do full scene path tracing in place of rasterization and get the same frame rates.
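For anyone wondering what "additional hardware that calculates ray intersections" boils down to: the core primitive is a ray-geometry hit test. Below is a toy analytic ray-sphere version in Python, purely for illustration; real RT cores instead test rays against triangles organized in a BVH, and none of the names here come from any real API:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest hit, or None if the ray misses.
    This is the kind of intersection query RT cores answer in hardware
    (against triangles in a BVH, rather than analytic spheres)."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None  # only hits in front of the origin count

# A ray fired down +z from the origin hits a unit sphere centered at z=5 at t=4
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

The hardware's job is to answer millions of such queries per frame; everything else (which rays to cast, what to do with each hit) stays programmable, which is why DXR leaves the actual effects up to the developer.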

Performance improvement with RTX is clear:
[attached chart: Quake II RTX performance, RTX hardware vs. shader-based DXR]


You might be dissatisfied with a more than 7x performance improvement over the shader-based approach, but that does not mean Nvidia did not deliver what they promised. There is full scene path tracing and it works. It works pretty well, I might add.

I am talking about the fact that if everything is ray traced, you don't get any frame rate. The example I gave is pretty striking: when you have a scene that you need rendered with ray tracing, and of those 20 objects you can only ray trace 2 because of the limited hardware support there is right now, what value has ray tracing as a whole?

You and others say real time ray tracing, but the reality is different. It is limited because the hardware is not able to handle it (beyond the point where frame rates become an issue).
What 20 objects? What are you blabbering about?
Provide link to source that supports your 20 ray traced objects claim.

Try reading some posts and then wonder why I am saying these things. The patches regarding Battlefield V are documented; they show that everything patched was related to doing less ray tracing rather than anything else.
What posts? And what about the "less ray tracing" in these patches being significant? Does it prove anything you say?
No one claimed RTX hardware can do unlimited rays with cinema-like quality at 4K 120fps. It should be obvious that ray tracing is computationally expensive, and even with hardware acceleration it won't be as fast as a simple effect like the screen space reflections it replaces in BFV, so performance will be reduced.
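A back-of-the-envelope ray budget shows why even hardware-accelerated ray tracing costs so much relative to a screen-space effect (the rays-per-pixel figure below is an illustrative assumption, not a number taken from BFV or any other game):

```python
width, height = 1920, 1080   # 1080p render target
fps = 60                     # target frame rate
rays_per_pixel = 2           # assumed: e.g. one reflection ray + one shadow ray

# Total primary ray budget per second, before any secondary bounces
rays_per_second = width * height * fps * rays_per_pixel
print(f"{rays_per_second / 1e9:.2f} gigarays/s")  # 0.25 gigarays/s
```

And that is with zero bounces; full scene path tracing multiplies the budget by bounce depth and samples per pixel, which is exactly why the hybrid approach (ray trace a few effects, rasterize the rest) is the current compromise.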

BTW, most posts on the internet are from angry people who complain about things. They do not get the same performance with RTX effects as they would without them, and then they complain and accuse Nvidia of lying to them and causing global warming and other bullshit. The internet is full of idiots whose only source of information is other angry idiots.

Ehm, I said they would have shared knowledge 3 years ago when they started the RTX project. And you shut the fuck up?
Where does this 3-years-ago figure come from? You like to throw random numbers around...
Please provide a source that NV started the RTX project 3 years ago, and for how they withheld information about it.
The world heard about DXR a few months before the RTX cards' release (link: https://devblogs.microsoft.com/directx/announcing-microsoft-directx-raytracing/) and official DXR support appeared AFTER that, in the Windows 10 October 2018 update.

As far as I know, Nvidia is supporting developers who want their support, and it seems to be going well. We already have DXR support in Unreal Engine, and Unity support is coming. A few games can already take advantage of DXR.
In the meantime, what does the competition do? Release a GPU without hardware DXR for the same price?

From what I gathered, in your mind it is Nvidia's responsibility to not only write all DXR routines (even before DXR was finalized!!! and regardless of having source code for these games or not), then pay developers to include them, and then make sure that these effects do not degrade performance in any way; otherwise they are liars and a generally evil company. On the other hand, AMD does not have to support anything software- or hardware-wise, or support game developers in any way, and they are generally the good guys whose complete inactivity is the greatest virtue the world has ever seen. Did I miss anything? :)
 
AMD didn't touch me...because they are absent from the DXR race ;)



Yet it is the butthurt fanboys whining...making threads with false claims.
And DXR does have value...it's not NVIDIA DXR....it's Microsoft DXR...you seem to keep forgetting that ;)
And there's a lot more to it than what you see...go talk to developers...the guys making the next games...you'll find their view does not align with yours ;)

Butthurt fanboys in one corner....devs in another...I know which corner I will go sit in ;)





If they are doing "bad"...what is AMD then doing?
Think before you post next time ;)
The 1070 can btw also do shader-based DXR...did you just cite a shader-DXR-capable card as proof that DXR is a failure...lol


What the hell? Your post is basically a rant against AMD. Hint: this isn't an AMD vs Nvidia thread, it's an "is ray tracing worth it at present" thread.

I know it's not Nvidia DXR, I never said it was. Suggest you try reading my post again.

Yes, you are completely and utterly an Nvidia diehard and are defending them to the last, I get that. But this thread isn't about what ray tracing will become; that is something everybody wants. It's about what ray tracing is now, and there are a lot of people who think it's not worth it at present.

Calling Archaea an AMD butthurt fanboy, LOL, you are clueless. I take it you missed all his posts complaining about AMD. He has actually been accused of being an Nvidia shill.

And bringing up shader-based DXR on a 1070 as proof that DXR is doing well, lol. Nobody is buying a 1070 to do ray tracing. What kind of fantasy world are you living in?

All that table shows is that in 9 months the total percentage of RTX cards on Steam doesn't even match the 1070. And that said card, the 1070, increased in percentage more than any of the RTX cards last month.

There is a logical conclusion to make: that the high prices of RTX cards are making more people buy the cheaper GTX cards. Or, your absurd conclusion, that people are buying 1070s in droves to try out DXR.
 

I said the OP needed to go see an eye doctor, get your "facts" straight....and again, RTX GPU's are selling, DXR games are coming...and there is nothing you can do about it :)
 
So in essence, wait for it...

Uh, well, yes? Wait for it to be implemented from bottom up. You can also enjoy what is available now that has been back-patched into games late in their development cycles (including after release). And you can run the fastest GPUs available!

What's not to like?
 
And again, the OP needs to go see a doctor, because I can see the difference...but the OP (like most other "I doooon't like NVIDI*cough*Ray tracing" posters)...has made a hit-and-run thread...post it, leave it and don't come back...because defending FUD is way harder than just posting it:



The shit stirrers are people like the OP and the like.
If they stopped posting blatantly false claims...we wouldn't be having the fake-news threads...

Spot the pattern, it is real easy:
1. Post an anti-NVIDIA RT video making false claims
2. Leave thread.
3. ???
4. Profit.

This is not the first thread to be posted...and it won't be the last.

It might stop when AMD supports DXR...and we will have some fun foot-in-mouth moments though.

I have 9000+ posts on this forum and you call my opinion a hit and run?

No, I just don’t like squabbling like junior high boys.

I made my point after trying the card and the best example of ray tracing currently available. My opinion was that I couldn’t tell the difference. This wasn’t about AMD, it wasn’t about the price. It was about the fact that in 1/2 to 1 hour of gameplay and switching back and forth time after time I couldn’t tell the difference except on one scene.

My post was clear. I didn’t say the tech was dead, I didn’t say I was mad I bought the card, I didn’t say screw NVidia. I didn’t say that tech would never get better. I have nothing more to add until my opinion changes based on something much more exciting than what I’ve seen (or rather didn’t see) with Metro 2033 Exodus. My eyes don’t need checked. There’s nothing to see...yet.
 
I see only one game that is ray traced and that is Quake II. All the other games are virtually 100% rasterized in the traditional sense. I have yet to see an example where I say "wow!" that is consistent across a significant part of a game. If one has to hunt for a reason, then that is no reason at all but a reason to ignore the feature. As the number of objects increases, so does the number of rays, calculations, interactions, colors etc. For something like Quake II the 2080 Ti is sufficient at 1080p - take that to 1440p and the performance tanks. Go into a modern game and there is just not enough processing power yet to do the lighting ray traced.

Most games will use a very, very limited hybrid approach, such as reflections, GI, shadows etc., due to performance reasons; just one effect like reflections in BFV tanks the performance. That is not a ray-traced game, only the reflections are when turned on. To call that a ray-traced game shows no real understanding at all.

Still, it is valid that developers may do some fabulous things with DXR on current hardware, and hopefully it will start justifying the extra cost, well, at least for those games. Maybe some would rather drink the Kool-Aid instead. Of course Nvidia will have a good reason to compare how sucky the current generation is to the next generation's ray-tracing ability, gotta keep those sales up - watch.
 
It gets even more fun when looking at which GPU's are the biggest "increasers" ^^
View attachment 170470

Looks like RTX is doing well...

So a whopping 2.62% of Steam is doing really well for all RTX cards. Damn, that's a pretty low bar for success, especially when you consider the 1080 Ti is at 1.73% all by itself and the lowly 1060 is at 16.30%. Reality is the market has spoken: they are not upgrading for low RT performance at a massive price tag. Those 2000 series cards have been out long enough to see a trend - people are not upgrading.
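As a quick sanity check on the arithmetic being argued over here, a throwaway sketch using only the Steam survey percentages quoted in this thread (the card names and figures are taken from the posts above, nothing else is assumed):

```python
# Steam Hardware Survey shares (percent) as quoted earlier in this thread.
rtx_shares = {
    "RTX 2060": 0.68,
    "RTX 2070": 0.95,
    "RTX 2080": 0.64,
    "RTX 2080 Ti": 0.35,
}
gtx_shares = {"GTX 1080 Ti": 1.73, "GTX 1060": 16.30}

# Sum every RTX card's share to get the total RTX footprint on Steam.
rtx_total = sum(rtx_shares.values())

print(f"All RTX cards combined: {rtx_total:.2f}%")              # 2.62%
print(f"GTX 1060 alone:         {gtx_shares['GTX 1060']:.2f}%")  # 16.30%
print(f"RTX total vs 1080 Ti:   {rtx_total / gtx_shares['GTX 1080 Ti']:.2f}x")
```

So both sides of the argument are reading the same numbers: the four RTX cards together do beat the 1080 Ti's share, but remain a small fraction of the 1060's.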
 
Limited usage of ray tracing: if you have a whole scene of objects (10 to 20) and only 1 or 2 are ray traced, then what is the value?
Nvidia would know they are making hardware ray tracing, I hope? That means if they were serious they could have started sharing this knowledge with many of their developers (internal & external).

I have no idea what AMD does with their money, but as it is now I would definitely not be spending it on ray tracing in games.
Uh, well, yes? Wait for it to be implemented from bottom up. You can also enjoy what is available now that has been back-patched into games late in their development cycles (including after release). And you can run the fastest GPUs available!

What's not to like?

hmm, what's not to like - the price mostly.. and the fact that i bought it, when i could have waited.. and missed out on nothing.. at all - but that's on me for listening to mr. Jensen.. i should have known better :)
 
So a whopping 2.62% of Steam is doing really well for all RTX cards.

...that's hundreds of thousands of GPUs.

And you're also arguing 'all of Steam', which is absurd. If you had any intention of making a point, you'd be comparing the discrete marketshare, and doing so by performance level.
 
I have 9000+ posts on this forum and you call my opinion a hit and run?

No, I just don’t like squabbling like junior high boys.

I made my point after trying the card and the best example of ray tracing currently available. My opinion was that I couldn’t tell the difference. This wasn’t about AMD, it wasn’t about the price. It was about the fact that in 1/2 to 1 hour of gameplay and switching back and forth time after time I couldn’t tell the difference except on one scene.

My post was clear. I didn’t say the tech was dead, I didn’t say I was mad I bought the card, I didn’t say screw NVidia. I didn’t say that tech would never get better. I have nothing more to add until my opinion changes based on something much more exciting than what I’ve seen (or rather didn’t see) with Metro 2033 Exodus. My eyes don’t need checked. There’s nothing to see...yet.

You can tell the difference in the video I linked to? Yes/No...appealing to "post count" doesn't alter what you wrote..
 
...so you failed to do your research? That sounds like a personal problem ;).

it sure is, don't think i ever blamed anybody else, i just admitted that i made a poor purchase.. should have stayed away from RTX, could have saved sweet dollars for hookers and blow (or other ill-fated ideas).. and if you feel otherwise about your purchase, cool - am not trying to change your mind.

But that is not the OP's question... now is it?

i for one, as an owner - think it's highly overrated - and that it will probably be cool down the road.. but the current implementation is lackluster and mostly buzz-words. - now i know that goes against the NV zealots' way of thinking.. and so be it.. it does not change how i see it, at all.


it's just my 2 cents.
 