ShuttleLuv
Supreme [H]ardness
- Joined
- Apr 12, 2003
- Messages
- 7,295
Still need to factor in overclocking, general tweaking, driver updates, game patches, and SLI...as all of these things can and should help RT performance.
Gamechanger when it can be achieved at 4K at 60 fps. They have to start out somewhere, so I'm not really surprised this first iteration of it is on the slow side.
So 3 out of 4 developers are struggling to get 60 fps at 1080p, including one developer who has been working on RTX since January, was involved in the demo in March, and is aiming to have their game running at 60 fps @ 1080p with ray tracing enabled for the release next February.
And you have one developer who after 3 days is getting 90-130 fps at 4K.
One of these results isn't using real-time ray tracing.
To be sure, we just have to wait for benchmark testing, not Nvidia's claims. It looks good in video shot at 1080p, but what about 4K UHD, where it will push the card to its max? Then, oh boy, it looks beautiful... then CTDs because of the bugs?
How complacent have gamers become when real time raytracing is considered "on the slow side"?
Been gaming on 4K for about a year. Even with a 1080 Ti, 4K max settings is not possible in every game (Deus Ex, for example).
Funny because many people were previously saying that 4K wasn't worth it, now all of a sudden people only care about 4K.
The hardest thing for me is grasping the trade-offs made to call this ray tracing. The whole bottom-level/top-level acceleration structure makes me feel this is still in the environmental-light arena compared to recursive sampling from the camera's point of view.
Tracing a ray from one source to another is technically ray tracing; this just may be a bit different from a camera-centric algorithm.
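For contrast, the "camera-centric" recursion mentioned above can be sketched in a few lines. The single-sphere scene, the function names, and the flat shading here are purely illustrative assumptions, not how DXR's acceleration structures actually work:

```python
# A minimal sketch of "camera-centric" recursive ray tracing, as contrasted
# with DXR's top/bottom-level acceleration structures. The single-sphere
# scene, names, and flat shading are illustrative only.
import math

def hit_sphere(center, radius, origin, direction):
    """Return the distance t along the ray to the sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 1e-4 else None

def trace(origin, direction, depth=0):
    """Recursively follow one ray from the camera into the scene."""
    if depth > 2:               # cap the recursion
        return 0.0
    t = hit_sphere((0, 0, -3), 1.0, origin, direction)
    if t is None:
        return 0.2              # background brightness on a miss
    # A fuller tracer would compute the hit point and surface normal here,
    # then recurse along the reflection: trace(hit_point, reflected, depth + 1)
    return 1.0                  # flat shading for any hit

# One primary ray straight down the camera's -z axis hits the sphere:
print(trace((0, 0, 0), (0, 0, -1)))
```

The two-level structure works differently: geometry goes into bottom-level acceleration structures that a top-level structure instances and bounds, so each ray can skip most of the scene instead of testing every surface the way this naive loop-free sketch implies.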
If Nvidia has been working on this for 10 years, I'm a little worried about if or when AMD can catch up. Doing that for the next consoles seems overly optimistic.
Everybody has been working on ray tracing for years. Remember, Intel's big thing with Larrabee was supposed to be ray tracing. I believe ATI 2900 XT cards were used to draw scenes from the first Transformers movie in real time.
They also have been working on Radeon Rays and Radeon Rays 2.0 was released last March.
Lastly, AMD has traditionally been strong in Compute performance which should help with Ray Tracing.
Not as much of a fail as not having raytracing hardware... the train is starting to move now... and again, I amuse myself with the influx of new posters at every launch... always very opinionated, in an almost predictable manner.
Makes me wonder about that Intel 2020 Dedicated GPU.
I think everyone is wondering about that!!
Sure - us new posters can't have an opinion if it doesn't mirror the old posters... makes sense.
When you start talking about 'Nvidia gospel', you've already outed your bias.
It ain't because you're new.
So, the 2080 Ti has 21% more cores than the 1080 Ti, and the 2080 has 15% more cores than the 1080.
Logic says, going by Nvidia's 2080-vs-1080 charts, the 2080 Ti should be ~50% faster than the 1080 Ti (the 2080 was 47% faster than the 1080), and more than 2.1x with DLSS.
That's my story and I'm sticking with it, since we have no other data. Bahahhaha
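As a back-of-envelope sketch, that extrapolation can be written out; every input below is the poster's claimed figure, not measured data, and under a strict linear core-scaling assumption the estimate actually lands a bit above ~50%:

```python
# Back-of-envelope sketch of the scaling logic in the post above.
# All inputs are the poster's claimed figures, not measured data.
cores_2080_vs_1080 = 1.15        # 2080: 15% more cores than the 1080
speedup_2080_vs_1080 = 1.47      # Nvidia chart claim: 2080 ~47% faster

# Implied uplift per core (clocks + architecture), assuming performance
# scales linearly with core count
per_core_uplift = speedup_2080_vs_1080 / cores_2080_vs_1080   # ~1.28

cores_2080ti_vs_1080ti = 1.21    # 2080 Ti: 21% more cores than the 1080 Ti
est_speedup = per_core_uplift * cores_2080ti_vs_1080ti        # ~1.55

print(f"Estimated 2080 Ti over 1080 Ti: +{est_speedup - 1:.0%}")
```

Of course, real performance rarely scales linearly with core count (memory bandwidth and clocks matter too), which is exactly why benchmarks are needed before anyone should trust these numbers.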
You're comparing apples to skyscrapers.

Well, considering they're pushing 4K gaming at over 60 fps on the same card, then yes, the current ray tracing implementation would be considered slow. That's just the way it is.
Moving to raytracing in video games is a natural evolution and, in the absence of a third alternative, inevitable. There has been talk of it for a long time now, but no hardware for it. Aside from performance (for the time being), raytracing is superior to rasterization in every way; just look at CG in movies compared to video games. And I'm not talking about fully animated movies, I'm talking about the stuff you don't even realize is there because it's so realistic. Eventually the tech will be doing that in real time. It doesn't look all that impressive right now, but a lot of first-gen hardware technologies don't.

Movies are vastly different from video games. The CGI you don't realize is there is only the background decoration, which is usually made through photogrammetry. If it were a game and you could actually go close to those backgrounds, it would stand out like a sore thumb. The methods that work in movies would be useless for gaming in that form. So there is no point in doing comparisons with movies.
He has a point, though. You can't daily-drive a show car that looks pretty but does 20 mph flat out.
People actually underestimate how much movies (well, action movies) are nearly entirely CG these days. See this for example, including many foreground elements.
It makes the games look better? But I don't think it will be a game changer. The graphics will be prettier, if you even stop to notice, that is, with bullets flying everywhere and explosions.
I never said there wasn't CGI; I said it looks awful, and I wish they had used practical effects instead for foreground objects and characters. Iron Man's suit looks exactly like it came out of the game Mass Effect: plastic and fake. Or Ultron in the previous Avengers movie, a plastic, fake-looking robot with over-exaggerated human-like animations. It was just painful to watch. CGI in these movies is like taking my immersion out the back door and shooting it in the face, and then in the neck.
I can see this in some movies. An example for me is the original Clash of the Titans. I loved it, and when the new ones came out, they tried to substitute graphics and effects for story line.