Coreteks talks Traversal Coprocessors

FrgMstr

Coreteks goes way out on a limb with some very interesting ideas about what we are going to see with NVIDIA's next-gen.

There are a whole lot of interesting predictions in this video, and I can assure you that most of what he says is on point. Exactly when we will see it may not be.


A REVOLUTION in graphics is coming

 
When he says "I don't think anyone would go so far as to pull a lie" - has he ever met the internet? :p

All the other stuff he talks about would be crazy. I was gonna ask 'what about PhysX', but then, speak of the devil. It would be cool to see it utilized heavily alongside ray tracing in a title.

I'm not sold on his controller idea - at all. They already have that: the SHIELD TV. Everything he said could just be done on the current SHIELD TV (including the cheaper vape-pen model), along with Netflix and the other stuff that apparently wouldn't be in this controller.

I guess I could see it being a SHIELD TV v2 (or v3, depending on how you count the 2019 models), but I don't know - it's not such a radical reimagining one year after a just-released 'new' one. And again, how would you navigate apps? It would have to do GeForce Now on PC only, unless the entire Android TV OS session is cloud hosted - but then TVs would still need a thin client device plugged in, unless it's going to target existing Chromecasts - but yet again, they already have the SHIELD TV that does that. Doesn't fit neatly in my mind at all. Messing with my vibe.

The VR thing is too far out of left field for me to have an opinion one way or another, lol. I wouldn't like OLED, though. Burn-in will always be a concern, no matter how rare or unlikely. MicroLED would be better, the sacrifice in black levels aside. This part of the debate is highly subjective, though.
 
Most of that was really interesting. I still take pre-launch rumors with a grain of salt (I learned that much from Bulldozer), but it's interesting stuff.

One thing he said that made me doubt some of it, though, was that Nvidia is considering using OLED for the front chroma layer of the dual-layer screen. That doesn't seem like it would make any sense if what he said just prior (a high-res luminance layer in the back and a lower-res chroma layer in the front) is accurate. Based on my (albeit layman's) knowledge of how these things work, LCD would be a perfect technology for adding color to a light source; that's how it has always worked. OLED would be a great technology for the higher-res luminance layer in the back.
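
The arithmetic backs that up: a dual-layer panel effectively multiplies the two layers per pixel, so the front layer can only filter whatever light the layer behind it puts out. A toy sketch of that composition, purely illustrative and with all the names made up here, not taken from any actual panel design:

```cpp
#include <array>
#include <vector>

// Illustrative sketch only -- nothing to do with whatever panel may actually be used.
// Two stacked layers, each at its own resolution: a greyscale luminance layer and an
// RGB chroma layer. The chroma layer can only attenuate the light behind it, which is
// why an LCD in front of a bright light source is the natural fit for that job.
struct LumaLayer   { int w, h; std::vector<float> t; };                  // transmittance, [0,1]
struct ChromaLayer { int w, h; std::vector<std::array<float, 3>> rgb; }; // per-channel filter, [0,1]

// Nearest-neighbour index into a layer at normalized coordinates u, v in [0, 1).
static int texel(int w, int h, float u, float v) {
    return static_cast<int>(v * h) * w + static_cast<int>(u * w);
}

// Light leaving the panel at (u, v):
//   backlight * luminance-layer transmittance * chroma-layer filter, per channel.
std::array<float, 3> composePixel(float backlightNits,
                                  const LumaLayer& luma,
                                  const ChromaLayer& chroma,
                                  float u, float v) {
    float lum = backlightNits * luma.t[texel(luma.w, luma.h, u, v)];
    const std::array<float, 3>& c = chroma.rgb[texel(chroma.w, chroma.h, u, v)];
    return { lum * c[0], lum * c[1], lum * c[2] };
}
```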

Interesting stuff, though. As always, I wish they'd write it down in a damned article instead of making a video, but I guess that ship has sailed with the financial compensation shifting to video :/
 
Interesting as it maybe was for the 30 or 60 seconds before I could no longer stand that monotonous voice and the string of irrelevant video clips padding time for extra monotony. I agree, just write a damn article about culling rays or whatever the completely lost point was. Even if he eventually gets to some point, it doesn't count, because I was driven to quit watching before it happened...
 
I checked that out yesterday. It's pretty interesting. I was thinking the coprocessor was something we'd likely see on 5nm when Hopper comes out, since that's supposed to be MCM. However, if it comes this gen as he is speculating, then Ampere is going to really rock.
 
I agree, just write a damn article

I miss the days when random speculation was text that you could easily scan and move on from. Now they pad it out longer and longer so they can stuff more ads into their clickbait.

But video clicks pay a LOT more than text clicks, so we aren't going back to that time.

I assume he stumbled on this patent and let the speculation clickbait flow.
https://patents.google.com/patent/US10580196B1/en/
 
Some people see a patent and jump to the conclusion that it's coming soon, but it's common for patents to never be implemented.

In Google Patents you can click on the "type" of a patent and get a breakdown of ray tracing patents:
https://patents.google.com/?q=G06T15/06

At the bottom of the page you can see graphs of the number of RT patents over time. 2007 seems to have been peak RT patents. There are also graphs and stats on assignees.

IBM actually has the most RT patents in the last 30 years. See a lot of IBM RT gear? In the past 5 years it looks like most of them are from Imagination Tech.

NVidia is a small player in RT patents in the listings at the bottom of the page (AMD isn't listed at all).
 
All we have are some mockups, not officially related tech papers or patents. The heatsink design doesn't make sense in some cases unless a very complex stacked-fin production process is used. For some designs I don't see how they could be easily mass produced, so I'm curious why they are testing the mockup.
The dual-board design is plausible, and it solves the die-size issue, since 7nm defect rates mean no 800mm² dies; however, it's cheaper to use one board. The problem is latency whenever the two do need to communicate - internal NVLink?
If they do something like that, I'd consider it for the sheer novelty, let alone the tech jump.
Also, there has been just enough time since the patent to implement it in a real product, if they were already working on the processor beforehand, so this might be plausible, but I'd say release is still a while off; late Q3 or Q4 at best.
 
I miss the days when random speculation was text that you could easily scan and move on from. Now they pad it out longer and longer so they can stuff more ads into their clickbait.

But video clicks pay a LOT more than text clicks, so we aren't going back to that time.

I assume he stumbled on this patent and let the speculation clickbait flow.
https://patents.google.com/patent/US10580196B1/en/

I hope that’s not where he’s getting it from. In that patent the coprocessor is on-chip; you could consider the TMU a coprocessor too. It has nothing to do with add-in boards.

Talk about clickbait for real.
 
I hope that’s not where he’s getting it from. In that patent the coprocessor is on-chip; you could consider the TMU a coprocessor too. It has nothing to do with add-in boards.

I am not sitting through his clickbait to find out, but from other discussions I've read about this video, he seems to think BVH traversal is currently done in software, when NVIDIA's whitepaper makes it explicit that it's done on the RT cores. So he doesn't even know what is currently done, and seems to assume that BVH traversal in hardware is a new thing...
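
For anyone who hasn't looked at what that actually involves: BVH traversal is just a loop that walks a tree of bounding boxes, skips any subtree the ray misses, and only tests triangles in the leaves it reaches. A rough software sketch below, with every type and helper invented for illustration; this is not NVIDIA's code, just the general technique the Turing whitepaper says the RT cores run in fixed-function hardware:

```cpp
#include <algorithm>
#include <cfloat>
#include <cmath>
#include <utility>
#include <vector>

// Illustrative types only -- none of this is NVIDIA's code.
struct Vec3 { float x, y, z; };
static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b)       { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

struct Ray  { Vec3 origin, dir; };
struct AABB { Vec3 lo, hi; };
struct Tri  { Vec3 v0, v1, v2; };

struct BVHNode {
    AABB bounds;
    int  left = -1, right = -1;      // child node indices; -1 means "this is a leaf"
    int  firstTri = 0, triCount = 0; // leaf only: range into the triangle array
    bool isLeaf() const { return left < 0; }
};

// Slab test: does the ray pass through this bounding box at all?
static bool rayHitsBox(const Ray& r, const AABB& b) {
    const float ro[3] = {r.origin.x, r.origin.y, r.origin.z};
    const float rd[3] = {r.dir.x, r.dir.y, r.dir.z};
    const float lo[3] = {b.lo.x, b.lo.y, b.lo.z};
    const float hi[3] = {b.hi.x, b.hi.y, b.hi.z};
    float tmin = 0.0f, tmax = FLT_MAX;
    for (int a = 0; a < 3; ++a) {
        float inv = 1.0f / rd[a];
        float t0 = (lo[a] - ro[a]) * inv, t1 = (hi[a] - ro[a]) * inv;
        if (inv < 0.0f) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
    }
    return tmin <= tmax;
}

// Moller-Trumbore ray/triangle test; writes the hit distance into t.
static bool rayHitsTri(const Ray& r, const Tri& tri, float& t) {
    Vec3 e1 = tri.v1 - tri.v0, e2 = tri.v2 - tri.v0;
    Vec3 p = cross(r.dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return false;        // ray is parallel to the triangle
    float inv = 1.0f / det;
    Vec3 tv = r.origin - tri.v0;
    float u = dot(tv, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = cross(tv, e1);
    float v = dot(r.dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    t = dot(e2, q) * inv;
    return t > 0.0f;
}

// Software BVH traversal with an explicit stack. This loop -- box tests,
// triangle tests, stack bookkeeping -- is the work the Turing whitepaper
// describes the RT core taking off the shader cores.
float closestHit(const Ray& ray,
                 const std::vector<BVHNode>& nodes,
                 const std::vector<Tri>& tris) {
    float closest = FLT_MAX;
    int stack[64], top = 0;
    stack[top++] = 0;                                // start at the root node
    while (top > 0) {
        const BVHNode& node = nodes[stack[--top]];
        if (!rayHitsBox(ray, node.bounds)) continue; // ray misses the box: cull the whole subtree
        if (node.isLeaf()) {
            for (int i = 0; i < node.triCount; ++i) {
                float t;
                if (rayHitsTri(ray, tris[node.firstTri + i], t) && t < closest)
                    closest = t;                     // keep the nearest intersection
            }
        } else {
            stack[top++] = node.left;                // otherwise descend into both children
            stack[top++] = node.right;
        }
    }
    return closest;                                  // FLT_MAX means the ray hit nothing
}
```

The hardware does the same walk; the argument worth having is where that logic physically sits (on-die next to the SMs, or on some separate chip), not whether it exists.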
 
If nVidia does make this a reality then it would be a boon, especially for me since RT doesn't interest me.
 
Then you should quit gaming...because the future entails more and more raytracing...

I have mixed emotions on this.

First gen raytracing is kind of meh.

Unless you were looking at the raster version and the raytracing version side by side it was difficult to tell the difference.

I'm all for improvements in quality, but the performance hit for that minor quality improvement just didn't make it worth it to me. I'd rather use higher resolution.

Now, the rumor is that this traversal coprocessor design reduces the performance impact. If that is the case, I might take it.

I still don't feel like real time raytracing is the "wow" technology Nvidia seems to think it is, though.
 
Personally, I would much rather have GPUs get into some serious frame rate amplification tech than more ray tracing tech. We see a little bit of that with DLSS 2.0 and with Oculus' new tech. But it would be great to be able to put out 240fps with a load like it was 60fps now.
https://blurbusters.com/frame-rate-...es-frat-more-frame-rate-with-better-graphics/
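
For reference, the reprojection half of those techniques (as opposed to the DLSS-style upscaling half) is conceptually simple: take the last rendered frame plus the per-pixel motion vectors the engine already has, and push each pixel forward to where it should be at the next display interval. A toy sketch of that warp step; the types, names, and crude hole handling are all made up for illustration, not taken from Oculus' or NVIDIA's actual implementations:

```cpp
#include <cstdint>
#include <vector>

// Illustrative data layout: a packed RGBA frame plus per-pixel motion vectors
// (in pixels per frame), the kind of thing an engine already produces for TAA.
struct Frame {
    int width = 0, height = 0;
    std::vector<uint32_t> rgba;   // width * height pixels
};
struct MotionField {
    std::vector<float> dx, dy;    // width * height entries each
};

// Forward-reproject the last rendered frame by `alpha` of a frame interval
// (alpha = 0.5 synthesizes a frame halfway to the next real one). Disocclusion
// holes are simply left holding the old pixel, which is roughly why these
// techniques smear around fast-moving edges.
Frame extrapolateFrame(const Frame& last, const MotionField& motion, float alpha) {
    Frame out = last;                                   // start from the old frame
    for (int y = 0; y < last.height; ++y) {
        for (int x = 0; x < last.width; ++x) {
            int i  = y * last.width + x;
            int nx = x + static_cast<int>(motion.dx[i] * alpha);
            int ny = y + static_cast<int>(motion.dy[i] * alpha);
            if (nx >= 0 && nx < last.width && ny >= 0 && ny < last.height)
                out.rgba[ny * out.width + nx] = last.rgba[i];  // scatter the pixel forward
        }
    }
    return out;
}
```

Everything hard is what this skips: disocclusion, transparency, and shading changes, which is why synthesized frames tend to fall apart around fast-moving edges.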

I know there are differing opinions on the usefulness of crazy high framerates, but come on. Haven't we beaten this one to death yet? Surely there is very little if any difference at all beyond - say - 165hz?

I wouldn't mind being able to reliably get 4k ultra up to 120hz though.
 
Personally, I would much rather have GPUs get into some serious frame rate amplification tech than more ray tracing tech. We see a little bit of that with DLSS 2.0 and with Oculus' new tech. But it would be great to be able to put out 240fps with a load like it was 60fps now.
https://blurbusters.com/frame-rate-...es-frat-more-frame-rate-with-better-graphics/

You can just lower resolution or settings to get higher fps if that's your priority. There's no switch we can flip to get better IQ.
 
Surely there is very little if any difference at all beyond - say - 165hz?

Human vision doesn't go much past 30Hz, actually. You can still see tearing, obviously, but if you can do v-sync at 60Hz there's no benefit, no.
 
Human vision doesn't go much past 30Hz, actually. You can still see tearing, obviously, but if you can do v-sync at 60Hz there's no benefit, no.

I used to believe this, but I have been convinced otherwise over the years.

It's not that I SEE much of a difference. I can watch others game at 60hz and it looks the same as much higher framerate to me. Mouse motion does feel smoother to me up to about 90-100hz somewhere. I personally wouldn't bother with anything above 120hz, but to each their own I guess.

This is in typical FPS games.

There are corner cases where the framerate differences are more noticeable, like in that UFO test, but in actual games I find the difference going from 60hz to 90hz to be relatively small, and the difference going from 90hz to anything above 90hz (up to the max of my screen, 120hz) to be unnoticeable.
 
I think the difference between 60Hz with v-sync and 90 or 120Hz with v-sync is that you're more likely to notice frame drops, lagging, and other instances where the refresh rate dips.

I actually talked about this with a brain and vision specialist, and there are times when your eyeballs see something above 30Hz and the information gets overlaid with other information (persistence-of-vision style), but the chemistry and physics of brains and vision don't really keep up with super-high frame rates.

The persistence of vision effect may trick your eyes into seeing more smooth motion, but that's nothing that couldn't be achieved with other effects like motion blur at a lower refresh rate.

Same thing goes for super high resolution; at some point (and it's below 8K I'm sure) at distance you can't make out individual pixels.
 
I think the difference between 60Hz with v-sync and 90 or 120Hz with v-sync is that you're more likely to notice frame drops, lagging, and other instances where the refresh rate dips.

I actually talked about this with a brain and vision specialist, and there are times when your eyeballs see something above 30Hz and the information gets overlaid with other information (persistence-of-vision style), but the chemistry and physics of brains and vision don't really keep up with super-high frame rates.

The persistence of vision effect may trick your eyes into seeing more smooth motion, but that's nothing that couldn't be achieved with other effects like motion blur at a lower refresh rate.

Same thing goes for super high resolution; at some point (and it's below 8K I'm sure) at distance you can't make out individual pixels.

I'd agree with your comments regarding V-Sync, but various forms of adaptive refresh have been on the market for almost a decade now, and they not only eliminate tearing but also reduce input lag by displaying a rendered frame as soon as it is available.

Anyone who enjoys games in 2020 really ought to have a G-Sync or FreeSync screen.
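
The latency part is easy to see with some arithmetic: under fixed-refresh v-sync a finished frame waits for the next refresh boundary, while idealized adaptive refresh scans it out as soon as it's ready. A toy simulation of that difference, with made-up frame times and no attempt to model scanout duration or the panel's minimum refresh interval:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Fixed-refresh v-sync: a frame finished at time t (ms) waits for the next
// refresh boundary before it can be scanned out.
double vsyncDisplayTime(double t, double refreshMs) {
    return std::ceil(t / refreshMs) * refreshMs;
}

// Idealized adaptive refresh (G-Sync / FreeSync): the display starts a new
// refresh as soon as the frame is ready, so display time ~= completion time.
double adaptiveDisplayTime(double t) {
    return t;
}

int main() {
    const double refreshMs = 1000.0 / 60.0;                  // a 60 Hz panel
    // Made-up completion times (ms) for a game that isn't holding a locked 60 fps.
    const std::vector<double> frameDone = {14.0, 31.5, 52.0, 70.2, 95.0};

    for (double t : frameDone) {
        double vs = vsyncDisplayTime(t, refreshMs);
        double ad = adaptiveDisplayTime(t);
        std::printf("frame done %6.1f ms -> v-sync shows it at %6.1f (+%4.1f ms), "
                    "adaptive at %6.1f (+%3.1f ms)\n",
                    t, vs, vs - t, ad, ad - t);
    }
    return 0;
}
```

On a 60Hz panel a frame that just misses a refresh boundary can sit for nearly a full 16.7ms before v-sync shows it; adaptive refresh mostly erases that wait.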
 