AMD Is Developing a DXR Capable GPU and More Third Generation AMD Ryzen News

InquisitorDavid

[H]Lite
Joined
Jun 27, 2016
Messages
91
Okay, since AMD just announced the Radeon VII on 7nm with performance around the GTX 1080 Ti for $699, what does that mean for Navi? Navi will be 7nm too, and while a new architecture is expected to bring performance gains, I can't imagine it doubling shader power without another shrink.

So GTX 1080 performance at $250 (which is what a lot of the buzz around Navi suggests) seems like a stretch. Assuming an increase in clocks and efficiency, even a generous 40% more performance per core would still require quite a lot of cores to reach GTX 1080 performance, and more cores will mean a bigger die (especially with DXR support incoming), so selling Vega 64-like performance for $250 without taking losses is going to be hard. Unless they're overpricing the Radeon VII, in which case it's entirely possible.
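To put rough numbers on that (a minimal back-of-the-envelope sketch; Vega 64's 64 CUs landing at roughly GTX 1080 level is the only public data point here, everything else is assumption):

```python
# Back-of-the-envelope sketch of the scaling argument above.
# All figures are rough public specs or outright assumptions, not Navi leaks.
vega64_cus = 64          # Vega 64 compute units, roughly GTX 1080-class performance
perf_per_cu_gain = 1.40  # the "generous 40%" per-CU uplift assumed above

cus_needed = vega64_cus / perf_per_cu_gain
print(f"CUs for ~GTX 1080 performance at +40% per CU: {cus_needed:.0f}")
# ~46 CUs: fewer than Vega, but still a sizeable die once extra
# fixed-function hardware (e.g. DXR) is added, which is the cost concern.
```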
 

smarenwolf

[H]Lite
Joined
May 7, 2018
Messages
109
and more cores will mean a bigger die
No. While the RTX series of cards is so damn expensive because of its astoundingly huge dies, AMD is moving to chiplet designs. Navi _could_ very well be a chiplet-design GPU, meaning you can use essentially all of the silicon yield instead of throwing away defective dies or selling a 2080 Ti die at a markdown as a 2070.
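A rough illustration of why that matters for cost (a minimal sketch using the classic Poisson yield model; the defect density and die sizes below are purely hypothetical, not AMD or TSMC figures):

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Poisson yield model: probability that a die has zero killer defects."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

D0 = 0.2                  # hypothetical defects per cm^2
monolithic = 750          # mm^2, roughly a TU102-class monolithic die
chiplet = monolithic / 4  # same total silicon split into four chiplets

print(f"Monolithic die yield: {poisson_yield(monolithic, D0):.0%}")  # ~22%
print(f"Per-chiplet yield:    {poisson_yield(chiplet, D0):.0%}")     # ~69%
```

With chiplets, a defect only scraps one small piece of silicon instead of forcing a huge die to be binned down or thrown away.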
 

Riccochet

Off Topic Award
Joined
Apr 11, 2007
Messages
22,927
Smart move by AMD: help mature and develop the ray tracing market by working with partners before releasing a product that would currently be useless and underpowered.

For all we know, AMD could have something up their sleeve that will compete, rasterization-wise, with the 2080 Ti and release it with Navi. They're not that far off with the Radeon VII.
 

M76

[H]ardForum Junkie
Joined
Jun 12, 2012
Messages
10,273
Nvidia has been doing this for years.

RIVA TNT and 32 bit 3D color
Geforce 256 and T&L
Geforce 3 and Shaders
Geforce FX and Shader Model 2.0

All of the above were overkill for games at the time of release, yet each was in wide use by the time the more popular next-gen card arrived.

Nvidia always plays a long game when it comes to features. By the time the 3080 Ti comes out, most AAA games will be using the tech. AMD is betting that their implementation will be out by then, and that their game console lead will keep developers from widely adopting the tech in the meantime, since the PS4 and Xbox One can't do raytracing either.
Overkill or not, they ended up as industry standards, as opposed to Radeon's pixel shader "0.5" or the Savage2000's T&L, which never achieved DX-standard compatibility.
 

InquisitorDavid

[H]Lite
Joined
Jun 27, 2016
Messages
91
No. While the RTX series of cards is so damn expensive because of its astoundingly huge dies, AMD is moving to chiplet designs. Navi _could_ very well be a chiplet-design GPU, meaning you can use essentially all of the silicon yield instead of throwing away defective dies or selling a 2080 Ti die at a markdown as a 2070.
That's assuming they'll use chiplets.
 
Joined
Jan 27, 2015
Messages
520
We have to see how they handle this: with specialized computational cores or with generalized hybrid cores.
 

steen

n00b
Joined
Mar 24, 2018
Messages
9
Nvidia has been doing this for years.

RIVA TNT and 32 bit 3D color
Geforce 256 and T&L
Geforce 3 and Shaders
Geforce FX and Shader Model 2.0
Overkill or not, they ended up as industry standards, as opposed to Radeon's pixel shader "0.5" or the Savage2000's T&L, which never achieved DX-standard compatibility.
Recidivist revisionism...

RIVA TNT and 32 bit 3D color -> Rendition Verite
Geforce 256 and T&L -> Rendition Pinolite et al -> DX7
Geforce 3 and Shaders -> You mean register combiners? -> DX8
Geforce FX and Shader Model 2.0 -> You're shitting me? Register combiners #2 with abysmal performance, and it came after the R300 with DX9 SM2.0

A bit like Jen-hsun & all his "inventions", as though no prior art existed. NV make some great stuff, but let's keep our knickers on...
 