Metro Exodus and Ray Tracing...

So we're still waiting for a real showcase title for RTX a full year after release now. Bravo, Nvidia, well played....

In what parallel universe does it take less than a year to develop showcase titles? How long did it take for showcase DX9 titles to show up? DX11?

Come on guys at least try to pretend to be objective. Raytracing wasn’t invented by nvidia so no need to shit all over it just because your favorite team isn’t in the game yet.
 
Is that the future where you still have a spare kidney that you would like to sell to be able to afford your future Nvidia purchase, you know, for those rays that need to be traced?

Note that I said when RT cards are common from NVidia/AMD/Intel. That's three choices, but sure, use it as another excuse to slam NVidia.
 
In what parallel universe does it take less than a year to develop showcase titles? How long did it take for showcase DX9 titles to show up? DX11?

Come on guys at least try to pretend to be objective. Raytracing wasn’t invented by nvidia so no need to shit all over it just because your favorite team isn’t in the game yet.

What you're missing the boat on is that they are selling it as a "feature" and part of the reason for the price increase. If there was a moderate price increase and a feature that might be useful in the future, no harm no foul. If you sell something as a "feature" and then have nothing to use with it and are charging an arm and a leg for it? That's a whole different ballgame.

This has absolutely nothing to do with one "team" or the other. This is a critique of the extravagant prices Nvidia thinks their cards are worth.
 
This has absolutely nothing to do with one "team" or the other. This is a critique of the extravagant prices Nvidia thinks their cards are worth.

But it's totally OK that AMD prices theirs just as extravagantly, but without the extra feature.

It looks to me, like that feature you incessantly complain about, is essentially free.
 
What you're missing the boat on is that they are selling it as a "feature" and part of the reason for the price increase. If there was a moderate price increase and a feature that might be useful in the future, no harm no foul. If you sell something as a "feature" and then have nothing to use with it and are charging an arm and a leg for it? That's a whole different ballgame.

This has absolutely nothing to do with one "team" or the other. This is a critique of the extravagant prices Nvidia thinks their cards are worth.

Everyone is free to make their own value judgement when spending their money. That has nothing to do with the technical merits of the tech though so I don’t see how the two are related.

If your main gripe is that nvidia is asking people to pay a premium for features that won’t be useful until some time in the future well let me be the first to welcome you to the tech industry. It’s always been like that and it’s never going to change.
 
But it's totally OK that AMD prices theirs just as extravagantly, but without the extra feature.

It looks to me, like that feature you incessantly complain about, is essentially free.

Basically, "AMD is better for not having something its competition has" seems to be the mantra here as of late. Not to mention the fact that they are still late on everything else GPU-related.

Sorry, but I lost faith after the Radeon VII.
 
In what parallel universe does it take less than a year to develop showcase titles? How long did it take for showcase DX9 titles to show up? DX11?

Come on guys at least try to pretend to be objective. Raytracing wasn’t invented by nvidia so no need to shit all over it just because your favorite team isn’t in the game yet.


Nvidia pays developers for GameWorks and other things, so are we supposed to believe that all of a sudden no one in the industry is accepting cash/programmers from Nvidia these days? I think you've got your answer right there.


Everyone is free to make their own value judgement when spending their money. That has nothing to do with the technical merits of the tech though so I don’t see how the two are related.

If your main gripe is that nvidia is asking people to pay a premium for features that won’t be useful until some time in the future well let me be the first to welcome you to the tech industry. It’s always been like that and it’s never going to change.
That was not the case when Nvidia decided to use tessellation: no segmented performance, and it worked well on every card.
AMD did pretty well with Mantle, if you need something comparable.

No one I take seriously has a problem with ray tracing from any company, regardless of how it is implemented. But get this straight: ray tracing is the selling point for these cards and the money people spend on them, yet the feature loses out on the lower-tier RTX cards, and that is the biggest drawback, by design.

You can't blame people for doubting Nvidia's intentions with ray tracing when there is no software, and the software that does exist requires a lower resolution to function with ray tracing, which undercuts the thing Nvidia is so proud of: more frames per second...

And to come back to the first part of the quoted message: now you know why development is slow.
 
And that's fine, when they are powerful enough to be useful in 5-10 years. The issue is that they aren't right now, and I think many people are a little miffed at the price hike for what is more or less a useless feature at this point. Now I fully realise that it wasn't a causal thing like "raytracing happened so price jumped" but nVidia certainly tries to use it as a selling point to justify the higher price. They also hyped it hard, and continue to hype it. It isn't like they added the feature and noted it was new and talked about possible applications for high end computation (as they did when they first added FP64), no they went on about how amazin' it would be for games and all that.

Also you have to understand some skepticism from people when it comes to ray tracing as we have been hearing about it for a loooooong time. Ray tracing as we think of it now came about in 1979 and it seems like ever since then you've had poorly educated CS students go on about how it is O(log n) and thus the One True Way. Likewise real-time raytracing has been talked about for a long time. Intel got real big on it in 2008 as a kind of "you should want more CPU cores, not more GPU" thing.

Combine that with the fanboyism you see and it shouldn't be a surprise that some people, like me, shit on it a bit. Not because I hate the concept of ray tracing but because I know how intense it is to do right. That goes double if you want the ideal of a really easy solution where shadows, caustics, occlusion, indirect lighting, and such aren't things you have to think about but are just handled automatically by the rendering engine. Traditional eye-based ray tracing doesn't do a good job with that; you need something like photon mapping, which is of course much more intense and thus even further off from a realtime implementation.
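
To make the "direct effects are cheap, everything else isn't" point concrete, here is a minimal, purely illustrative C++ sketch of classic eye-based ray tracing (one hard-coded sphere, one point light, ASCII output; every name and constant here is made up for the example). One intersection test per pixel gets you visibility, and simple shading toward the light falls out almost for free, but nothing in this loop follows light outward from the source, which is exactly why caustics and indirect lighting push you toward photon-mapping-style methods:

```cpp
// Minimal eye-based ("backward") ray tracer: one sphere, one point light,
// primary rays plus Lambert shading at the hit point. Illustrative only.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec norm(Vec v) { double l = std::sqrt(dot(v, v)); return {v.x / l, v.y / l, v.z / l}; }

struct Sphere { Vec c; double r; };

// Nearest positive ray parameter t, or -1 if the ray misses the sphere.
static double hit(Vec o, Vec d, const Sphere& s) {
    Vec oc = sub(o, s.c);
    double b = 2.0 * dot(oc, d);
    double c = dot(oc, oc) - s.r * s.r;
    double disc = b * b - 4.0 * c;            // d is unit length, so a == 1
    if (disc < 0.0) return -1.0;
    double t = (-b - std::sqrt(disc)) / 2.0;
    return t > 1e-4 ? t : -1.0;
}

int main() {
    const Sphere ball{{0, 0, -3}, 1.0};
    const Vec light{2, 2, 0};
    const int W = 40, H = 20;

    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Primary (eye) ray through this pixel.
            Vec dir = norm({(x - W / 2) / (double)H, (H / 2 - y) / (double)H, -1.0});
            double t = hit({0, 0, 0}, dir, ball);
            if (t < 0.0) { std::putchar('.'); continue; }

            // Hit point, surface normal, then simple Lambert shading.
            // With more objects, one extra shadow ray from p toward the light
            // gives hard shadows cheaply; caustics and diffuse bounces are NOT
            // captured by this camera-outward loop, which is why light-side
            // methods like photon mapping cost so much more.
            Vec p{dir.x * t, dir.y * t, dir.z * t};
            Vec n = norm(sub(p, ball.c));
            double diffuse = dot(n, norm(sub(light, p)));
            std::putchar(diffuse > 0.5 ? '#' : (diffuse > 0.0 ? '+' : '-'));
        }
        std::putchar('\n');
    }
    return 0;
}
```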

The prices suck (and I say this as a 2080 ti owner) and Nvidia's marketing is going quite overboard with the RTX stuff. Sadly, there is nothing to motivate Nvidia to lower prices. AMD is going along with Nvidia's pricing tiers and Navi doesn't really look like something that is going to cause Nvidia to do anything. So we're stuck.

A little skepticism is fine. Nvidia is talking this up as some "magic" solution and using a ton of misleading (or outright untrue) marketing to hype it up. Nvidia's marketing and promises around the tech are just going to make people wary about it and that is totally understandable.
 
Nvidia pays developers for GameWorks and other things, so are we supposed to believe that all of a sudden no one in the industry is accepting cash/programmers from Nvidia these days? I think you've got your answer right there.

The primary thing NVidia gives developers is technical aid, testing, driver support, and marketing, straight from one of the Devs:
https://steamcommunity.com/app/234630/discussions/0/613957600528900678/


What can I say but he should take better stock of what's happening in his company. We're reaching out to AMD with all of our efforts. We've provided them 20 keys as I say. They were invited to work with us for years.
Looking through company mails the last I can see they (AMD) talked to us was October of last year.

Categorically, Nvidia have not paid us a penny. They have though been very forthcoming with support and co-marketing work at their instigation.
We've had emails back and forth with them yesterday also. I reiterate that this is mainly a driver issue but we'll obviously do anything we can from our side.

Money or no money, you need to get to a developer early enough in their design cycle for Ray Tracing technology to be better integrated, and you also have to factor in developer resources. Many games are in crunch mode chasing deadlines. Trying to incorporate Ray Tracing at that point would blow target dates.

Also, developers need to gain experience with Ray Tracing. The first-generation games with RT are going to be a shadow of what comes later, when developers tackle RT from the ground up after they have some RT experience under their belt.

Even first-generation console games pale compared to those a couple of years down the road, when developers are more familiar with the hardware, and that isn't even a real fundamental technology change, just more of the same.

Ray Tracing is one of the most fundamental shifts ever to take place in how GPUs create visuals. It's going to take time to incorporate that change. Pretending otherwise is just biased fanboy rhetoric. Naturally it will magically be the right time for Ray Tracing GPUs when AMD releases them. :rolleyes:
 
Because, once the hardware power is there for it, it should be easier on the developers. Even if it is near identical if one way is easier and the other is harder then the easier solution is what should be used. "Work smarter, not harder" as the saying goes. Ray-tracing was a HUGE boon to the movie industry. Even with all the raw power needed to render scenes, it made things so much easier on CG studios. I've been saying since these cards were announced that we're probably 5-10 years from it being more than a "gimmick". Someone had to get the ball rolling or it would take even longer for RT to be viable in games, but anyone expecting miracles right now is in for some serious disappointment.
That is a solid argument if the hardware can get from point A to point B in doing all the lighting, shadowing, reflections, etc. without having to bake (ray trace offline) the textures and light maps for different parts of a game, each room, etc. Except even the 2080 Ti struggles with just one aspect of DXR, such as reflections or just shadows, in modern games. I'd hate to see what would happen if a whole modern game were 100% ray traced; having Quake 2 as an example is nice but utterly shows the current limitations. Even if the ray tracing ability were doubled, it would pale compared to what would be needed for developers not to have to worry about lighting. To me it looks like a factor of 6x-8x of the 2080 Ti's RT ability before developers could start ignoring current game rendering methods, and at what cost, and how low an end would sufficiently support that ability? Just having $6000 video cards that can ray trace 4K images at 60 fps in real time, with a few tricks, will not do game developers much good. To be clear: full ray-traced lighting, shadows, GI (bounce light, color bleed, etc.), sub-surface scattering, caustics, refractions . . .

How much can software improve? How much can hardware improve, with the limited process nodes available now and in the future? I don't see Nvidia's architecture scaling well here in its current implementation. I would like to see how Navi, or really Navi 2, does ray tracing: whether it is more integrated into the pipeline, or whether it uses a more dedicated chiplet design producing lightmaps, shadow maps, etc. for the rasterizer to render with.

The other avenue is enhancements like shadowing and reflections, which may make things a little easier for developers, but I don't see that being a game changer in the end. I think Nvidia utterly fell short of the promise and initial hype they promoted. The results speak for themselves. Will Cyberpunk or Doom Eternal show a more mature software approach to RTX? I sure hope so; so far it is not that impressive for RTX.
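
For what it's worth, a rough back-of-envelope sketch (purely illustrative; the sample and bounce counts below are assumptions, and the ~10 gigarays/s number is Nvidia's own quoted peak for the 2080 Ti, not a measured in-game figure) shows how quickly a fully ray-traced 4K60 pipeline outruns that budget even at low, heavily denoised quality settings. Factor in that real workloads with incoherent rays fall well short of a best-case peak, and a multiple in the 6x-8x range stops looking far-fetched:

```cpp
// Back-of-envelope ray budget for fully path-traced 4K @ 60 fps.
// All per-pixel counts are assumptions chosen for illustration.
#include <cstdio>

int main() {
    const double width  = 3840, height = 2160, fps = 60;

    const double samplesPerPixel     = 4;  // assumed: low spp + heavy denoising
    const double bounces             = 2;  // assumed: two indirect bounces
    const double shadowRaysPerVertex = 1;  // assumed: one light sample per hit

    // Path segments per sample (primary ray + bounces), plus a shadow ray at
    // each of those hit points.
    const double raysPerPixel =
        samplesPerPixel * ((1 + bounces) + (1 + bounces) * shadowRaysPerVertex);

    const double raysNeeded = width * height * fps * raysPerPixel;
    const double quotedTuringPeak = 10e9;  // Nvidia's ~10 gigarays/s marketing peak

    std::printf("Rays needed : %.1f gigarays/s\n", raysNeeded / 1e9);
    std::printf("Quoted peak : %.1f gigarays/s\n", quotedTuringPeak / 1e9);
    std::printf("Ratio       : %.1fx of the quoted best case\n",
                raysNeeded / quotedTuringPeak);
    return 0;
}
```

Even these modest settings already exceed the quoted best case, before accounting for the gap between a marketing peak and real incoherent-ray throughput.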
 
Nvidia pays developers for GameWorks and other things, so are we supposed to believe that all of a sudden no one in the industry is accepting cash/programmers from Nvidia these days? I think you've got your answer right there.

Even if nvidia was bribing devs to be early adopters, money is no substitute for time and experience.

That was not the case when Nvidia decided to use tessellation: no segmented performance, and it worked well on every card. AMD did pretty well with Mantle, if you need something comparable.

Tessellation took forever to become useful and is still missing from big budget titles today. Not a good comparison. Mantle even less so.

You can't blame people for doubting Nvidia's intentions with ray tracing when there is no software, and the software that does exist requires a lower resolution to function with ray tracing, which undercuts the thing Nvidia is so proud of: more frames per second...

Their intention is to separate you from your money. Just like every other company that sells something. What other intention do you need?
 
Even first-generation console games pale compared to those a couple of years down the road, when developers are more familiar with the hardware, and that isn't even a real fundamental technology change, just more of the same.

Ray Tracing is one of the most fundamental shifts ever to take place in how GPUs create visuals. It's going to take time to incorporate that change. Pretending otherwise is just biased fanboy rhetoric. Naturally it will magically be the right time for Ray Tracing GPUs when AMD releases them.

I would say that these things are not true. When you segment the market you segment ray tracing performance, and that does not square with what you are saying or with what is done with ray tracing today: the performance is only there on the top-end hardware, and that does not sell as much as the lower-end RTX cards. That means the adoption rate is so slow that Nvidia, the company that could not stop funding GameWorks titles, has to pay developers to get support. And GPU development is not like Lego bricks where you just add cores that work somewhere just before launch...

I would say the only thing I find more compelling in how graphics are rendered today is High Dynamic Range, and that does not need much work at all.

Telling people that if they do not agree with you they are fanboys is rather silly; calling them "biased fanboys" doubles down on that. But enjoy your name calling when you can't win an argument ;).
 
I would say that these things are not true.

Which things are not true? What are the actual points you disagree with? Because you addressed ZERO of them, and instead went off on a tangent.

Developers need to gain experience with Ray Tracing to do it well. You disagree?

Adding Ray Tracing to games that were already well into development without it will push out deadlines. You disagree?

The bottom line is that this is new technology, and a large shift in how visuals are created; it will take time for developers to learn it and it will take time to implement it, and obviously that extra time would delay time to market.

Which is why many developers with games in progress, won't interrupt current production to introduce new delays.

What you disagree with is obviously how the world actually functions. AKA reality denial.
 
Which things are not true? What are the actual points you disagree with? Because you addressed ZERO of them, and instead went off on a tangent.

Developers need to gain experience with Ray Tracing to do it well. You disagree?

Adding Ray Tracing to games that were already well into development without it will push out deadlines. You disagree?

The bottom line is that this is new technology, and a large shift in how visuals are created; it will take time for developers to learn it and it will take time to implement it, and obviously that extra time would delay time to market.

Which is why many developers with games in progress, won't interrupt current production to introduce new delays.

What you disagree with is obviously how the world actually functions. AKA reality denial.

It's tacked on eye candy and will remain that way until developers see lots of people with the tech to run something more. The masses will be quite fine living without it at the current pricing level, so better strap in for a long wait.
 
It's tacked on eye candy and will remain that way until developers see lots of people with the tech to run something more. The masses will be quite fine living without it at the current pricing level, so better strap in for a long wait.
Well, if AMD incorporates the ability into the next-generation consoles and developers use it, then we should see a larger game base after the new consoles launch at the end of 2020. So maybe 2021 will be the era of ray-traced gaming, where a large number of folks can use it.
 
Well, if AMD incorporates the ability into the next-generation consoles and developers use it, then we should see a larger game base after the new consoles launch at the end of 2020. So maybe 2021 will be the era of ray-traced gaming, where a large number of folks can use it.

I suspect we'll see movement sooner than that. We know where the vector points, and every engine developer is working out how best to handle varying levels of hw offload right now. Some will do it better than others. Some will have it ready sooner than others.

Personally, I want to see what id / Tiago do with current gen hw in their upcoming titles. I think that data point will be informative for near-term expectations.
 
Which things are not true? What are the actual points you disagree with? Because you addressed ZERO of them, and instead went off on a tangent.

Developers need to gain experience with Ray Tracing to do it well. You disagree?

Adding Ray Tracing to games that were already well into development without it will push out deadlines. You disagree?

The bottom line is that this is new technology, and a large shift in how visuals are created; it will take time for developers to learn it and it will take time to implement it, and obviously that extra time would delay time to market.

Which is why many developers with games in progress, won't interrupt current production to introduce new delays.

What you disagree with is obviously how the world actually functions. AKA reality denial.
I deny being stuck in your version of reality, yes :).

Development of GPU hardware takes three years, as I clearly stated before, so Nvidia knew three years ahead of time that they were doing ray tracing. It is not Lego bricks where you just tack ray tracing bricks on at the end of the process, so yes, in that time developers would have known about this.
That means the adoption rate is so slow that Nvidia, the company that could not stop funding GameWorks titles, has to pay developers to get support. And GPU development is not like Lego bricks where you just add cores that work somewhere just before launch...
See, I explained it before; you just don't want to read what I post or take it seriously. That is a direct reply to one of your posts.

As for experience with ray tracing, it comes down to the same problem I explained before: hardware requires software, and the level of the hardware determines how much software there is to take advantage of it. When there are four tiers of the same hardware, the software has to account for all of them.

This is a problem of Nvidia's making, and it does not make development easier; it hinders it, because of their own mantra that fps means everything in gaming.

The current reality is that fps is king, and something like hardware ray tracing lowers fps unless you bought the 2080 Ti, and even then you may still need to drop to a lower resolution sometimes...
Well, if AMD incorporates the ability into the next-generation consoles and developers use it, then we should see a larger game base after the new consoles launch at the end of 2020. So maybe 2021 will be the era of ray-traced gaming, where a large number of folks can use it.

And that is on an even basis where all the hardware is the same. I would not be surprised (_not_ the EA surprise mechanic) if it is still basic stuff and doesn't amount to a whole lot of ray tracing extravaganza.
 
I think that Nvidia did just 'tack on' RTX at the last minute. I think they were developing it, but it was never meant for 12nm Turing; it was supposed to be for their 7nm cards, and that's why they had nothing ready at launch.

I think Turing was supposed to be released earlier in 2018 but, with the mining boom still going, Nvidia kept producing Pascal cards. When the bubble burst, Nvidia realised that they would need to sell off a lot of inventory, so they needed something to make Turing stand out, and that was the addition of the RTX hardware.

Remember all the developers complaining that they only had access to Turing hardware for a couple of weeks before launch. That's why games are so slow in coming. RTX was due to arrive in 2019, not 2018. Metro Exodus was supposed to be a launch game and it was always slated for a 2019 release. That they had no DLSS game at launch is another reason I think the RTX hardware was early. After all the talk of how easy it is to implement in games, that they couldn't have one game ready at launch is extremely suspect, considering their strong developer relations. DLSS is something Nvidia do on their supercomputer; there was no reason for them not to have one game available at launch.

It just felt rushed to me and maybe my reasoning is way off, but, I don't think so, especially because of DLSS.
 
See, I explained it before; you just don't want to read what I post or take it seriously. That is a direct reply to one of your posts.

No it isn't. It's changing the subject, and refusing to answer the questions, again, even when more directly stated.

I'll take you seriously when you stop arguing in bad faith.
 
No it isn't. It's changing the subject, and refusing to answer the questions, again, even when more directly stated.

I'll take you seriously when you stop arguing in bad faith.

Look, don't waste your time arguing with people here. For most of the guys here, whatever AMD does is good and whatever Nvidia does is evil, that's all. If it were AMD pushing ray tracing through dedicated hardware in their GPUs at increased prices, because it just costs more, then it would be glorious: pushing the envelope, leading the graphics industry, a revolution of new technology, bringing innovation to the market. But because it is Nvidia, there's always something wrong with it, it's evil, there are always more efficient ways to do what Nvidia does, and so on. Just look at Navi: for AMD people, Navi is perfect in every sense, including price, because it's 7nm and 7nm is expensive and they are not a charitable company, and that's fine because it's just AMD.
 
The other avenue is enhancements like shadowing and reflections, which may make things a little easier for developers, but I don't see that being a game changer in the end. I think Nvidia utterly fell short of the promise and initial hype they promoted. The results speak for themselves. Will Cyberpunk or Doom Eternal show a more mature software approach to RTX? I sure hope so; so far it is not that impressive for RTX.
Where did Nvidia promise unchanged performance with ray-tracing? Or 4K 60fps? Or immediate massive software support?

Making it work in real-time at all was their promise.
 
Look, don't waste your time arguing with people here. For most of the guys here, whatever AMD does is good and whatever Nvidia does is evil, that's all. If it were AMD pushing ray tracing through dedicated hardware in their GPUs at increased prices, because it just costs more, then it would be glorious: pushing the envelope, leading the graphics industry, a revolution of new technology, bringing innovation to the market. But because it is Nvidia, there's always something wrong with it, it's evil, there are always more efficient ways to do what Nvidia does, and so on. Just look at Navi: for AMD people, Navi is perfect in every sense, including price, because it's 7nm and 7nm is expensive and they are not a charitable company, and that's fine because it's just AMD.
...and apparently, after all the ranting is done, they go to the shop and buy a GeForce XD
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
 
It's amazing how the "poor selling" RTX 2070 already outranks every single AMD card except the RX 580.

That’s insane. Though it is funny how forum gerbils assume they know better than the companies who are actually slinging these products. Nvidia obviously knows how much product they’re moving at current prices and will drop prices if cards are sitting on shelves.
 
In what parallel universe does it take less than a year to develop showcase titles? How long did it take for showcase DX9 titles to show up? DX11?

Come on guys at least try to pretend to be objective. Raytracing wasn’t invented by nvidia so no need to shit all over it just because your favorite team isn’t in the game yet.
You must have a poor memory, DX10:
https://m.hardocp.com/article/2006/11/08/bfgtech_geforce_8800_gtx_gts/18

That was the last time we made a big jump in render tech (from pixel, vertex and triangle setup to unified shaders).

DX9 had the whole debacle of DX9A, DX9B and DX9C (besides moving away from the assembler-like DX8.1 and aiming at more high-level programming).

Now we have the latest "offspring" of DX12...RTX...and you want it to run from the start?

It's running faster than DX10 did from launch already...
 
You must have a poor memory, DX10:
https://m.hardocp.com/article/2006/11/08/bfgtech_geforce_8800_gtx_gts/18

That was the last time we made a big jump in render tech (from pixel, vertex and triangle setup to unified shaders).

DX9 had the whole debacle of DX9A, DX9B and DX9C (besides moving away from the assembler-like DX8.1 and aiming at more high-level programming).

Now we have the latest "offspring" of DX12...RTX...and you want it to run from the start?

It's running faster than DX10 did from launch already...

You obviously didn’t read my post before replying ....
 
look don't waste your time arguing with ppl here.. for most of the guys here what AMD do is good and Nvidia do it's evil, that's all. if it were AMD pushing RayTracing through a dedicated hardware in their GPUs at increased prices because it just more expensive then that would be glorious, pushing the envelope, leading the graphics industry, a revolution of new technologies and innovative bringing everything new to the market.. that's the truth, but just because it is nvidia, there's always something wrong with it, it's evil, there are always more efficient methods of do what nvidia do, and so on.. just look at NAVI.. navi for AMD people it's perfect in every sense and price.. you know because it's 7nm and 7nm it's expensive and they are not a charitable company and AMD it's fine because it's just AMD.

I just wish these people would stop taking what companies do so personally.

When AMD does well, I'm an 'Intel / Nvidia hater', and when Intel or Nvidia do well, I'm an 'AMD hater', apparently.
 
Anyway - turned off Ray Tracing while turning a desk lamp on and off and lo and behold...
COULD
NOT
TELL
THE
DIFFERENCE

If you could keep the lamp on while waving it around the room, you would. The promise of ray tracing at its base is to allow for the very best pre-baked lighting to be done dynamically in real-time. It goes significantly further than that, and we'll get there, but that's more or less step one.
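
As a toy illustration of "pre-baked lighting done dynamically", here is a small self-contained C++ sketch (all names and numbers are invented for the example, not any engine's API): the baked value is computed once with the lamp in its original spot and never changes afterwards, while the traced path re-evaluates the lamp every frame, so the lighting follows the lamp as it moves around the room:

```cpp
// Toy contrast between baked and per-frame traced lighting for a moving lamp.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static double len(Vec v) { return std::sqrt(dot(v, v)); }

// "Baked" path: the lamp's contribution was computed offline and frozen into
// a lightmap texel, so moving the lamp at runtime changes nothing on screen.
static double shadeBaked(double bakedTexel) { return bakedTexel; }

// "Traced" path: re-evaluate the lamp every frame. A real renderer would also
// fire a shadow ray toward the lamp here to get dynamic shadows.
static double shadeTraced(Vec point, Vec normal, Vec lampPos) {
    Vec toLamp = sub(lampPos, point);
    double d = len(toLamp);
    double nDotL = dot(normal, {toLamp.x / d, toLamp.y / d, toLamp.z / d});
    return nDotL > 0.0 ? nDotL / (d * d) : 0.0;   // simple distance falloff
}

int main() {
    const Vec point{0, 0, 0}, normal{0, 1, 0};
    // Bake once with the lamp directly overhead, then never update it.
    const double bakedTexel = shadeTraced(point, normal, {0, 2, 0});

    // Player drags the lamp across the room; only the traced result notices.
    for (double x = 0.0; x <= 4.0; x += 1.0) {
        Vec lamp{x, 2, 0};
        std::printf("lamp at x=%.0f  baked=%.3f  traced=%.3f\n",
                    x, shadeBaked(bakedTexel), shadeTraced(point, normal, lamp));
    }
    return 0;
}
```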
 
No it isn't. It's changing the subject, and refusing to answer the questions, again, even when more directly stated.

I'll take you seriously when you stop arguing in bad faith.
That is your own version of things, and none of it is valid. You are doing what most people do when they lose an argument: plainly blame others. You keep calling people names; for what reason?

You want me to agree with stuff that is not only questionable but wrong; you come to conclusions no one shares with you. Your assessment of the situation is that developers need time to implement ray tracing. This is false, as I explained to you: segmentation is the most important reason for developers not to do it. Your game would look so cool to everyone in the $1200 segment of video cards, which does not appeal to developers. That means ray tracing does not need time, it needs funds from Nvidia.
The people who promised ray tracing.

I simply do not agree with how you keep painting the one-sided view that game development only needs time. I am saying that game development needs hardware for the masses to allow such features to become mainstream; this is the single most important reason why there is no sign of ray tracing taking off.

You don't like what I say, and that is fine by me, but don't turn this into a kindergarten squabble.

Look, don't waste your time arguing with people here. For most of the guys here, whatever AMD does is good and whatever Nvidia does is evil, that's all. If it were AMD pushing ray tracing through dedicated hardware in their GPUs at increased prices, because it just costs more, then it would be glorious: pushing the envelope, leading the graphics industry, a revolution of new technology, bringing innovation to the market. But because it is Nvidia, there's always something wrong with it, it's evil, there are always more efficient ways to do what Nvidia does, and so on. Just look at Navi: for AMD people, Navi is perfect in every sense, including price, because it's 7nm and 7nm is expensive and they are not a charitable company, and that's fine because it's just AMD.

This is pretty hardware agnostic; it does not matter which side introduces it. It is about segmentation of hardware, which makes it impossible (without Nvidia funds) for this feature to become mainstream.

If you simply replace the word Nvidia with AMD to win the argument, you are wrong again. I am saying that it is up to the manufacturer to provide mainstream hardware for it to work; this approach does not work.
It does not work if AMD does it.
It does not work if Intel does it.
It does not work when Nvidia does it.

And yes, preferably hardware that functions the same across all levels, from high end to low end.
 
ray tracing does not need time, it needs funds from Nvidia.
What Nvidia funds?
If DXR were an NV-only feature then it might make some sense to pay companies to implement it, but it is a standard extension of the DX12 API.

The people who promised ray tracing.
And the last time I checked, they delivered real-time ray tracing on consumer-level hardware just fine.
 
You want me to agree with stuff that is not only questionable but wrong; you come to conclusions no one shares with you.


No, I didn't want you to agree with me. I wanted you to clarify your position, instead of snidely sidestepping and implying it.

This is why I asked repeatedly, clearly stating a condition and asking if you disagree, and you just refused to answer.

This time you kind of state it. I will bold the issue:

Your assessment of the situation is that developers need time to implement ray tracing. This is false, as I explained to you: segmentation is the most important reason for developers not to do it.

If you think developers don't need time, then you don't have a flipping clue what you are talking about; I spent >15 years developing software. Every new technology has a learning curve. Adding new, unplanned technology in the middle of a project is an absolute nightmare; doing that all but guarantees blowing the schedule.

Even starting a new project with a new technology would make planning extremely difficult, because you really have no prior experience with the kind of bugs you will encounter or what kind of performance expectations to have, leading to much more trial and error to figure things out.

It is difficult to have any kind of rational discussion about this with you when you so obviously don't understand SW development. You seem to think you just insert money and out pops whatever complex SW you want. It doesn't work that way.

Now if you want to ignore that obvious problem, then you can argue developers don't have a huge incentive, but that is a separate issue from the time it will take.
 
What a load of nonsense. I have been ray tracing ever since I had an Amiga 500; the software is not new, the methodology is not new, it is just marketed that way. Where you used to need workstations or a render farm to get anywhere with ray tracing, the hardware is now smaller, but in no way is that the same ray tracing as what Nvidia is offering.

Better put, it has limits to what it is capable of; people who claim that Nvidia is offering real-time ray tracing are kidding themselves. You might even call it a sneak peek into ray tracing.

It is difficult to have any kind of rational discussion about this with you when you so obviously don't understand SW development. You seem to think you just insert money and out pops whatever complex SW you want. It doesn't work that way.

Yeah, blame me again for not sharing your "vision". I'm getting used to it. You just ignore everything I write. On this matter I will tell you again: it is near impossible to tack things on in GPU development, so Nvidia knew they were doing ray tracing three years ago, and people building engines do know about ray tracing, because it has been around for more than a few decades.

DICE also showed that you are wrong: they patched Battlefield V to not ray trace as many things as they started with, just to get better framerates. If the software were so complex, they would not have been able to fix it that fast.

Unless of course you mean that the really gifted programmers, the ones who work at Microsoft, have to reinvent the wheel, so to them it is quite new; before Nvidia they did not even know about it, and it will certainly take them a good amount of time to catch up.
 
What a load of nonsense. I have been ray tracing ever since I had an Amiga 500; the software is not new, the methodology is not new, it is just marketed that way. Where you used to need workstations or a render farm to get anywhere with ray tracing, the hardware is now smaller, but in no way is that the same ray tracing as what Nvidia is offering.

Better put, it has limits to what it is capable of; people who claim that Nvidia is offering real-time ray tracing are kidding themselves. You might even call it a sneak peek into ray tracing.

Funny, I started with an Amiga 500...and if you think 24 fps at 4096 colors is impressive, I really have to laugh...let us take a look at what you are talking about:



I'd say NVIDIA's solution is not only more powerful...it also does way more.

Now if it is "so simple"...why is AMD absent?



Yeah, blame me again for not sharing your "vision". I'm getting used to it. You just ignore everything I write. On this matter I will tell you again: it is near impossible to tack things on in GPU development, so Nvidia knew they were doing ray tracing three years ago, and people building engines do know about ray tracing, because it has been around for more than a few decades.

Unless of course you mean that the really gifted programmers, the ones who work at Microsoft, have to reinvent the wheel, so to them it is quite new; before Nvidia they did not even know about it, and it will certainly take them a good amount of time to catch up.

Intel has been talking raytracing for years:


But the first one to deliver a working gaming solution (supporting Microsoft's DirectX 12 DXR extension) is NVIDIA...no matter your claims about it not being "real".

So you need to learn two things:
No one cares what you think about GPU prices.
NVIDIA don't care, the VAST majority of the world don't care...the market will dictate the price...economics 101.
No matter how butthurt you are over the prices...no one but you cares.
Get over it.

The second thing is that being butthurt over NVIDIA supporting a new DirectX 12 feature while AMD is playing catch-up means that you are digging your own grave.
Your "arguments" will apply with the same "validity" to AMD when they support Microsoft's DX12 DXR feature...do you want to backpedal now, or shall we wait until AMD steps up to the plate?


For the OP:
If you cannot see the difference between rasterization and ray tracing in Metro Exodus...buy a console or go see an eye doctor:

This is another useless "BAaaaWHHHHAAA...NVIDIA has DXR and AMD hasn't so I am mad and make up stuff" thread!
 
What a load of nonsense. I have been ray tracing ever since I had an Amiga 500; the software is not new, the methodology is not new, it is just marketed that way. Where you used to need workstations or a render farm to get anywhere with ray tracing, the hardware is now smaller, but in no way is that the same ray tracing as what Nvidia is offering.

Better put, it has limits to what it is capable of; people who claim that Nvidia is offering real-time ray tracing are kidding themselves. You might even call it a sneak peek into ray tracing.
What exactly is the real ray tracing that the "Nvidia offering" is not capable of?

Yeah, blame me again for not sharing your "vision". I'm getting used to it. You just ignore everything I write. On this matter I will tell you again: it is near impossible to tack things on in GPU development, so Nvidia knew they were doing ray tracing three years ago, and people building engines do know about ray tracing, because it has been around for more than a few decades.
Who exactly knew Nvidia would do ray tracing three years ago?
What is the source of this information?

EDIT://
Which game companies receive money from AMD to implement AMD's ray-tracing solution?
 
Just did a quick look at Steam; if NVIDIA's RTX is a failure...then AMD is a total catastrophe judging by the numbers:

RTX 2060 - 0.68% (gained 0.14%)
RTX 2070 - 0.95% (gained 0.1%)
RTX 2080 - 0.64% (gained 0.04%)
RTX 2080 Ti - 0.35% (gained 0.05%)

That gives us a sum of 2.62% RTX-enabled GPUs on Steam, with growth of 0.29% last month.
The top 3 AMD GPUs represented are these:

AMD RX580 - 1.25% (gained 0.1%)
Radeon R7 - 0.90% (lost 0.08%)
Radeon R5 - 0.73% (lost 0.04%)

That gives AMD's best-selling GPUs a combined Steam share of 2.88%, with growth of -0.02% last month.
So what...next month we can add AMD's 4th best-selling GPU to the list...and count down to when NVIDIA's "failing" RTX line has outsold the 4 best-selling AMD GPUs.
It's all a matter of perspective, right? ;)
 
In what parallel universe does it take less than a year to develop showcase titles? How long did it take for showcase DX9 titles to show up? DX11?

Come on guys at least try to pretend to be objective. Raytracing wasn’t invented by nvidia so no need to shit all over it just because your favorite team isn’t in the game yet.

Well, why buy this generation then - for the expensive and slow experience?
 
If you could keep the lamp on while waving it around the room, you would. The promise of ray tracing at its base is to allow for the very best pre-baked lighting to be done dynamically in real-time. It goes significantly further than that, and we'll get there, but that's more or less step one.

So in essence, wait for it...
 