Nvidia Killer

Status
Not open for further replies.
With Raja gone I can almost believe it, but we all know Nvidia's just sitting on their next GPU, waiting for AMD to release something so they can launch a new card that's a little faster than it..
 
They kinda need it:
[Chart: Add-in-board GPU market share, 2002 to Q1 2019]
 
The wildcard is going to be RT performance, imo. I think we can presume that Big Navi is going to be very close to the 2080 Ti in traditional performance if RX 5700 vs RTX 2070 is any indication. But since AMD is seemingly going to use a slightly different approach to DXR acceleration than NV, it's going to be quite interesting to see how the approaches compare, given that RT effects are the future whether anyone likes it or not.
 
AMD's reportedly working on a GPU they're calling "Nvidia Killer"

Apparently AMD is working on RDNA 2 on a 7nm+ process with hybrid ray tracing, supposedly faster than anything Nvidia has today.

Coming in 2020

https://www.overclock3d.net/news/gp...OlAZpgoywuzZuyaM8MFvDaVVacaz_ULS0skSwIc_sV218

GamerX is gonna love this :D:D
Good to see that AMD is now just one generation behind NVIDIA instead of two :cat:.
Poor Ampere
Have heard that one before.

 
The wildcard is going to be RT performance, imo. I think we can presume that Big Navi is going to be very close to the 2080 Ti in traditional performance if RX 5700 vs RTX 2070 is any indication. But since AMD is seemingly going to use a slightly different approach to DXR acceleration than NV, it's going to be quite interesting to see how the approaches compare, given that RT effects are the future whether anyone likes it or not.

And let's not forget that Big Navi's real competitor will be Ampere.
I don't know if AMD can do better than first-gen RTX on their first try, and then there's Intel. So who knows.
Hopefully Nvidia learned a lot from Turing, and Ampere should be a lot better.

2020 sounds like a great year for gaming, no matter who comes out on top.
 
In other news, NVidia leaks indicate they are working on something they internally call "The AMD killer".

I have no doubt AMD is working on their next big thing. It will as always be a question of how well it competes and for how long. I think we're all hoping for another R300-esque display of power, but we are all of course quite skeptical until the rubber hits the road.
 
Big Navi is what I have been waiting for, to see what it can do, and I will look to see what Nvidia comes up with next year as well. More concerned about the pricing levels than the possible performance, tho.
 
Entirely possible and likely true. Radeon VII did match or sometimes beat the 1080 Ti, but it was too little, too late.

Could AMD make a card to reach the 2080 Ti? I think they can. But that assumes Nvidia does nothing in the next year.

One wild card would be the ray-tracing performance. If AMD's method is vastly faster than Nvidia's, that could be a big deal.
 
If AMD actually puts out an Nvidia killer, who's to say Nvidia doesn't have something sitting on their dusty shelf that would smash AMD again? Nvidia isn't even using 7nm tech yet.
 
If AMD actually puts out an Nvidia killer, who's to say Nvidia doesn't have something sitting on their dusty shelf that would smash AMD again? Nvidia isn't even using 7nm tech yet.

Well, you just said it... no, they don't have anything on the shelf, unless they have another 2000-series firmware update. They do however have Ampere coming in 2020. What the Navi 20 series represents is AMD's second-gen Navi coming earlier than expected, because Dr. Su is annoyed she doesn't have high-end GPUs to hold in the air.

Yes, the Navi 20s are going to battle Nvidia's first-gen 7nm part, Ampere. The question really becomes: what is NV doing with Ampere? Is it another monolithic beast chip designed to go in both gaming and AI cards, or are they going to start with an easier-to-produce mass-market chip?

Next year is going to be interesting in the GPU wars. AMD is probably going to release a second-gen Navi before first-gen Navi is a year old, NV has a first-gen 7nm part planned... and Intel also has a first-gen Xe launch coming. I would hate to be the NV chip and product designers right now. If they go big and monolithic they bleed margin, and take a huge risk that 7nm issues could delay the launch or create a dud. If they go small they may end up with a great price-to-performance ratio and make consumers happy, but end up losing that high-end king-of-the-hill marketing flag. I expect AMD is going to refocus on the GPU business now that the CPU business (at least from a design perspective) is squared away for a few years.
 
I'd venture to guess that nVidia has had working silicon of their next gen, in some way, shape, or form, in their engineering lab since RTX was released.

It's typical MO for any tech company to do so.

Develop
Test
Refine
Rinse, repeat
This. To think that a company can fully research, develop, produce, and bring to market a new chip every 15-24 months is asinine. Which is why I scoff every time someone claims whatever Intel does is a direct response to a new AMD release.
 
This. To think that a company can fully research, develop, produce, and bring to market a new chip every 15-24 months is asinine. Which is why I scoff every time someone claims whatever Intel does is a direct response to a new AMD release.

Yeah, if we go back, NVIDIA stated that the G80 (GeForce 8800 GTX) was four years in development.
That means they started development on it while they were selling the NV25 (GeForce4 Ti 4600).
 
You can design a chip years in advance, but working engineering samples require a fab to produce them. NV just signed a contract with Samsung for their 7nm process a few months ago, and Samsung's 7nm EUV has only been ready for production for a few weeks now. I would expect NV has Ampere samples coming sometime this fall.
 
That would be nice; I just hope it's not a one-off thing that gets a repeat every 10-12 years.
And hopefully something filters down to the lesser cards, which are what I'm interested in.
Of course, if anyone gives me a sweet 4K screen, that will change.
 
Yeah, if we go back, NVIDIA stated that the G80 (GeForce 8800 GTX) was four years in development.
That means they started development on it while they were selling the NV25 (GeForce4 Ti 4600).

RTX was 10 years in the making. Which is more or less when Intel announced Larrabee. Coincidence?
 
Too many variables to guess on Nvidia's side: Samsung's 7nm process needs to go smoothly, Nvidia's design has to have no major issues or bugs, and then it has to be able to run at the designed clocks. I think Nvidia is a year out, at least, before they could have product to sell for Ampere. Nvidia is likely just now getting some silicon to test; if it goes smoothly, we'll likely start hearing rumors in a couple of months.
 
Well, you just said it... no, they don't have anything on the shelf, unless they have another 2000-series firmware update. They do however have Ampere coming in 2020. What the Navi 20 series represents is AMD's second-gen Navi coming earlier than expected, because Dr. Su is annoyed she doesn't have high-end GPUs to hold in the air.

Yes, the Navi 20s are going to battle Nvidia's first-gen 7nm part, Ampere. The question really becomes: what is NV doing with Ampere? Is it another monolithic beast chip designed to go in both gaming and AI cards, or are they going to start with an easier-to-produce mass-market chip?

Next year is going to be interesting in the GPU wars. AMD is probably going to release a second-gen Navi before first-gen Navi is a year old, NV has a first-gen 7nm part planned... and Intel also has a first-gen Xe launch coming. I would hate to be the NV chip and product designers right now. If they go big and monolithic they bleed margin, and take a huge risk that 7nm issues could delay the launch or create a dud. If they go small they may end up with a great price-to-performance ratio and make consumers happy, but end up losing that high-end king-of-the-hill marketing flag. I expect AMD is going to refocus on the GPU business now that the CPU business (at least from a design perspective) is squared away for a few years.
I meant they had tech collecting dust that they haven't even shown yet.
 
You can design a chip years in advance, but working engineering samples require a fab to produce them. NV just signed a contract with Samsung for their 7nm process a few months ago, and Samsung's 7nm EUV has only been ready for production for a few weeks now. I would expect NV has Ampere samples coming sometime this fall.
https://www.tomshardware.com/news/nvidia-ampere-gpu-graphics-card-samsung,39787.html

No leading-edge company in this era is so stupid as to put all of their chips in one fab, especially after what happened to Intel.
 
In the past I wouldn't have taken this seriously, but with the huge success of their recent CPU line, I'm hoping it bleeds over to their GPU division.
 
Now I am wondering if "hybrid raytracing" is fancy PR wording for not having full DXR support?

Unlikely that's what it means: it's all hybrid ray tracing right now, both in the sense that in-game RT effects are used alongside conventional rendering, and in the sense that, in hardware, DXR is partially accelerated by dedicated HW and partially computed on the shader cores.

A "non-hybrid" DXR approach where all rendering is done via pathtracing and all parts of the pathtracing are accelerated by dedicated hardware isn't something we're likely to see soon if ever.
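To make that split concrete, here's a toy Python sketch. This is purely conceptual, not real DXR code: the ray-triangle test stands in for the part dedicated RT hardware accelerates (alongside BVH traversal), while the hit/miss shading is the part that stays on the ordinary shader cores.

```python
# Toy "hybrid" ray tracing split. The intersection step is what RT cores
# accelerate in hardware; the shading step runs on the shader cores.
# Purely illustrative -- not real DXR code.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def intersect(origin, direction, tri):
    """Moller-Trumbore ray/triangle test -- the fixed-function part.
    Returns hit distance t, or None on a miss."""
    v0, v1, v2 = tri
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < 1e-9:          # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > 1e-9 else None

def shade(t):
    """The 'hit shader' -- in a real hybrid pipeline this runs on the
    ordinary shader cores, not the RT units."""
    return min(1.0, 1.0 / t)    # fake distance-based brightness

tri = ((-1.0, -1.0, 3.0), (1.0, -1.0, 3.0), (0.0, 1.0, 3.0))
t = intersect((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), tri)
color = shade(t) if t is not None else 0.0  # a 'miss shader' would run here
```

The point of the split: the intersection math is fixed and regular (good for dedicated silicon), while shading is arbitrary program logic, so every current design keeps it on the programmable cores.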
 
I'll believe it when I see it. AMD has yet to impress me with anything they've done with video cards.

Even if they do drop something huge, I'm almost certain that Nvidia has been drip feeding GPU upgrades. I wouldn't be surprised if they have some major improvements shelved until something threatening comes up.
 
I'll believe it when I see it. AMD has yet to impress me with anything they've done with video cards.

Even if they do drop something huge, I'm almost certain that Nvidia has been drip feeding GPU upgrades. I wouldn't be surprised if they have some major improvements shelved until something threatening comes up.

If they did, they would have released it rather than the Super upgrade they did. No company holds back a better product that is ready to be sold, waiting to see whether the competitor will catch up or not. Nvidia is doing what most companies do in a leading position: they try to use it in other markets by tweaking the base design to make more money off it.
 
Unlikely that's what it means: it's all hybrid ray tracing right now, both in the sense that in-game RT effects are used alongside conventional rendering, and in the sense that, in hardware, DXR is partially accelerated by dedicated HW and partially computed on the shader cores.

A "non-hybrid" DXR approach where all rendering is done via pathtracing and all parts of the pathtracing are accelerated by dedicated hardware isn't something we're likely to see soon if ever.

You're completely right, but it may still be weasel-wording for not accelerating very many of the operations needed in most raytracing scenarios. Or at least, not doing so very effectively.
 
You're completely right, but it may still be weasel-wording for not accelerating very many of the operations needed in most raytracing scenarios. Or at least, not doing so very effectively.

I think it’d be tough to do that if it’s using DXR. Now, there may be some slight differences between nVidia and AMD. In the past devs basically said they’d have to optimize a title for a new vendor. RTX for RT is basically toolkits that help devs implement RT much faster and optimized for nVidia’s hardware. AMD should have their own toolkits.
 
I think it’d be tough to do that if it’s using DXR. Now, there may be some slight differences between nVidia and AMD. In the past devs basically said they’d have to optimize a title for a new vendor. RTX for RT is basically toolkits that help devs implement RT much faster and optimized for nVidia’s hardware. AMD should have their own toolkits.

RTX is really a hardware set which can accelerate the DXR API calls. While applications can ask hardware "can you do this", they cannot ask anything like "how WELL can you do this".

AMD and NV can accelerate this path to varying degrees, based upon the hardware available. Right now, NV's "RTX" hardware does this considerably better - even if not to the standards many would like.

Edit: And yes, vendors may supply libraries to make this work better on their system, or not.
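For concreteness: on the D3D12 side, the "can you do this" question is literally a `CheckFeatureSupport` call that reports a raytracing tier, with no performance information attached. The sketch below mocks that idea in Python; the class and tier values are invented stand-ins for illustration, not the real Windows API.

```python
# Conceptual mock of a DXR capability query. Real code would call
# ID3D12Device::CheckFeatureSupport and read a raytracing tier, which only
# says *whether* DXR is supported, never *how fast* it runs.
TIER_NOT_SUPPORTED = 0
TIER_1_0 = 10  # values invented for this mock

class MockDevice:
    """Hypothetical stand-in for a graphics device."""
    def __init__(self, tier):
        self._tier = tier

    def check_feature_support(self):
        # The API answers "can you do this?" ...
        return self._tier

def supports_dxr(device):
    # ... but nothing in the answer says how WELL it does it: a GPU tracing
    # rays on shader cores and one with dedicated RT units can both
    # report the same tier.
    return device.check_feature_support() >= TIER_1_0

fast_gpu = MockDevice(TIER_1_0)  # dedicated RT hardware
slow_gpu = MockDevice(TIER_1_0)  # shader-core fallback, identical answer
```

This is why benchmarks, not capability flags, will decide how AMD's DXR approach stacks up against NV's.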
 
RTX is really a hardware set which can accelerate the DXR API calls. While applications can ask hardware "can you do this", they cannot ask anything like "how WELL can you do this".

AMD and NV can accelerate this path to varying degrees, based upon the hardware available. Right now, NV's "RTX" hardware does this considerably better - even if not to the standards many would like.

Edit: And yes, vendors may supply libraries to make this work better on their system, or not.

Yeah, IIRC it was basically all under the RTX umbrella: the libraries, toolkits, and hardware. I'm remembering interviews with the devs from before launch, so I could be slightly off.

RTX has more than DLSS and RT under it, but the rest isn't applicable to gamers.

Regardless... I am very interested in how AMD will approach RT. Whether it be chiplets or a completely different path. Brute force with shaders won’t end well.
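A quick back-of-envelope calculation shows why brute force struggles: even a modest hybrid workload needs an enormous ray throughput. The figures below are rough illustrative assumptions, not benchmarks.

```python
# Back-of-envelope: ray budget for modest hybrid RT at 4K/60.
# All figures are rough assumptions for illustration, not measurements.
width, height, fps = 3840, 2160, 60
rays_per_pixel = 2  # e.g. one shadow ray + one reflection ray per pixel

rays_per_second = width * height * fps * rays_per_pixel  # ~1 gigaray/s

# Nvidia marketed Turing's RT cores at around 10 gigarays/s; BVH traversal
# and intersection on general shader cores are generally reckoned to be far
# slower, which is why a pure shader brute-force approach struggles even at
# this modest ray count -- before any denoising or shading work is counted.
print(f"{rays_per_second / 1e9:.2f} gigarays/s")
```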
 
I may be alone here, but I care extremely little about Raytracing. And I don't care so much about who has "the fastest" card - I care about who has the fastest card that fits my budget.

I care about HDMI 2.1 and DP 2.0 much more than I do Raytracing.
 