AMD might detail new RDNA2 graphics card on March 5

Hopefully some games that go along with it actually work, plus any other DXR-enabled game made in the last year or so. Well, if AMD is going to talk about it, then hopefully it will be out by June. In this case, with Big Navi, AMD will not be cannibalizing any of their own sales since they don't have anything at the very high end. So it will mostly just hit Nvidia, with people holding off on purchases. June is also the time frame where TSMC 7nm fab capacity really opens up as Apple moves to 5nm. A very interesting year for both AMD and Nvidia.
 
Always excited about new stuff. Let's hope some fun things are coming soon, no matter the vendor.
 
Check my sig: I'm =desperate= for a decent card (multi 1080 screens or single 1440) with decent thermals at a decent price by a decent company with decent drivers.

:)

Stop me from buying a 5600xt.
 
Check my sig: I'm =desperate= for a decent card (multi 1080 screens or single 1440) with decent thermals at a decent price by a decent company with decent drivers.

:)

Stop me from buying a 5600xt.

Your Vega 56 is just fine for what you need, and it is paid for. Just wait until something better comes out for the price. Unless, of course, you want to replace the other three and then, 3 x 5600XT is good.
 
Not holding my breath. Things companies "might" do aren't really news. It just gets the hype up.
Been in tech long enough to know things come when they come.

So much is just perception. What will it be? When will it be? When does it ship? Can I actually have it in my hands (is it a paper launch)?
There are way too many variables not to be cynical about this process. Being patient and taking a very long view is the only way to keep any amount of sanity. Unless you enjoy being on pins and needles for half a year or a year at a time, all the time.
 
Your Vega 56 is just fine for what you need, and it is paid for. Just wait until something better comes out for the price. Unless, of course, you want to replace the other three and then, 3 x 5600XT is good.

Yeah, Rig 1 is due. After the GPU upgrade, I may swap a Ryzen 3700x in for that i7 4790. You know, because. :) (And Rig 2 would get the cast offs. )
 
Yeah, Rig 1 is due. After the GPU upgrade, I may swap a Ryzen 3700x in for that i7 4790. You know, because. :) (And Rig 2 would get the cast offs. )

:) I have no place to set up a fourth rig or I would build another one. I have a Vega 56, Vega 64, and RX 5700, so I will be set for at least a couple more years. That said, RDNA2 cards, whenever they come out, may be significantly better, but the new Xbox will be out later this year as well. Oh well, I do not need to spend money at this time and I have no reason for upgrades, so save, save, save........ :) :D
 
AnandTech has the Q&A session from the Q4 earnings call:
https://www.anandtech.com/show/15445/amds-fy2019-financial-report

What I found interesting is the following:
Q: Lisa, can you give us some idea of what new GPUs you're expecting to launch through the rest of 2020, for PCs and for data center?

LS: Yes. In 2019, we launched our new architecture in GPUs, it's the RDNA architecture, and that was the Navi based products. You should expect that those will be refreshed in 2020 - and we'll have a next generation RDNA architecture that will be part of our 2020 lineup. So we're pretty excited about that, and we'll talk more about that at our financial analyst day. On the data center GPU side, you should also expect that we'll have some new products in the second half of this year.

So next-gen RDNA will be talked about (revealed?) March 5. It is clear the new data center GPUs come in the 2nd half, which to me implies gaming next-gen RDNA in the 1st half, with a refresh of 1st-gen products sometime in 2020. Is this not about the same time as Nvidia will reveal their Ampere cards? AMD launching first, I am thinking, maybe leading the next generation for the first time in a long while.
 
I hope AMD has a home run in Big Navi and is able to successfully compete with Ampere...
Really want to upgrade my RX 580 for a reasonable price. The 5700 XT and everything from Nvidia are too expensive right now.
If Big Navi can "beat" Ampere for less than $1000, then maybe we can have $250 cards with 2080/5700xt performance
 
If AMD can put out a Big Navi / RDNA2 GPU that outperforms the RTX 2080 Ti (so there should be raytracing available to match that particular feature) & manage to offer it for US$899.99 that would be great...

If it could also equal the (forthcoming) Nvidia RTX 3080 Ti GPUs but still come in at the US$899.99 price point, that would be even better...!
 
If AMD can put out a Big Navi / RDNA2 GPU that outperforms the RTX 2080 Ti (so there should be raytracing available to match that particular feature) & manage to offer it for US$899.99 that would be great...

If it could also equal the (forthcoming) Nvidia RTX 3080 Ti GPUs but still come in at the US$899.99 price point, that would be even better...!
Only problem I see is that AMD's current precedent with Navi is slightly lower cost for better performance. If AMD has a 2080 Ti-competitive part with better performance and no current competition coming, it does not look like AMD will be selling it way cheaper, more like $100 less, maybe. Unless they market it to compete with whatever the RTX 3080 will be, presumably around 2080 Ti performance, at $899. Then yes, that would be very cool, a great introduction into the very high end. Then again, AMD may have a very special higher-end card, with HBM etc., to compete with the Titan Ampere and sell it over $1000, like $1399 or something. If that is the case, I hope it is not another FE driver-limited nightmare card but a card where you decide if you want the Pro drivers or the gaming drivers and both are well supported. Hell, Nvidia does such a great job on their Pro drivers that they run games virtually as fast as their gaming drivers to begin with.

80 CU, 16 GB HBM2e -> $1399???
72 CU, 12 GB GDDR6, 384-bit bus -> $899??? Beats the 2080 Ti
64 CU, 8 GB GDDR6 -> $699??? Beats the 2080 Super
56 CU, 8 GB GDDR6 -> $499??? Beats the 2070 Super
 
LOL, watch these "specs" show up as verified rumors on WCCFTECH soon. :p
I HOPE not, wild-ass guess :D. AMD pricing will be affected by how soon Ampere comes out, as well as when AMD can deliver and the performance of each. Well, considering a 5700 XT (40 CU) is not far behind a 2070 Super, a 56 CU version should slaughter it and most likely beat the 2080 Super. So I am being highly optimistic here. 64 CU should be close to a 2080 Ti, and then 72 CU and 80 CU should be going against the Ampere high end. Then again, it would be better to see how everything actually performs versus putting on a tin foil hat or two.
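A quick back-of-envelope on that CU scaling (my own sketch in Python, assuming performance scales linearly with CU count at the same clocks, which real cards never quite manage because of bandwidth and power limits):

# Treat the 40 CU RX 5700 XT as the 1.0x baseline and scale linearly by CU count.
# Real scaling is sublinear, so treat these as upper bounds, not predictions.
baseline_cus = 40  # Radeon RX 5700 XT

for cus in (56, 64, 72, 80):
    print(f"{cus} CU: ~{cus / baseline_cus:.2f}x a 5700 XT")
# 56 CU ~1.40x, 64 CU ~1.60x, 72 CU ~1.80x, 80 CU ~2.00x

Even with realistic losses, that naive math is roughly where the guesses above place each part relative to Nvidia's stack.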
 
...then again, AMD may have a very special higher-end card, with HBM etc., to compete with the Titan Ampere and sell it over $1000, like $1399 or something. If that is the case, I hope it is not another FE driver-limited nightmare card but a card where you decide if you want the Pro drivers or the gaming drivers and both are well supported. Hell, Nvidia does such a great job on their Pro drivers that they run games virtually as fast as their gaming drivers to begin with.

80 CU, 16 GB HBM2e -> $1399???

I will take one of those, configured with a hybrid cooling solution, a taller PCB to allow a larger axial fan on the card for the VRMs & a 120mm AIO for the GPU/HBM2 (HBM3?) please...

And an onboard BIOS switch, Gaming or Workstation mode...

Oh, and would it REALLY be too much to ask for some rock solid stable drivers (both Gaming & Workstation)...!?!
 
Yes, I would take 1 or 2 as well. As for drivers, I hope it is not hobbled like the Vega FE by not allowing pure gaming drivers to be installed (a registry modification does now allow the Vega FE to use current gaming drivers, but with no official support). Their workstation drivers have no OCing ability and prevent utilities from OCing them either, combined with very poor gaming support/performance. Nvidia's Pro drivers are just miles ahead for gaming while supporting Pro software stacks, plus the huge CUDA ecosystem as well as AI.

Some folks forget that AMD has had some of the best-cooled cards out there, like the Vega 64 LC, which easily handled the card consuming over 400W of power and is relatively quiet. Frankly, I got up to almost 500W supplied to my Vega 64 LC and the cooler was able to keep up. So a greater-than-300W card, maybe via a BIOS switch position, is not out of the realm of what AMD can do again if it benefits a use case, hopefully more viable than the Vega 64 LC, where yes, it did perform better but was utterly restricted by memory bandwidth.

As for AMD pricing, it will be very tempting to release early and price as high as possible. An 80 CU HBM2e configuration in 16 GB/32 GB, their top model which would also go into professional cards, smashing Nvidia's current solutions by miles, may be a $3000-plus card. Which for the hardware may be OK, but on the software end it may just be rejected at that price unless it is purely for specialized usage. Nvidia won the Big Red supercomputer bid for Indiana University while AMD won the CPU side (meaning AMD did not have a better solution on the GPU side). It may be more prudent for AMD to make their 40%-45% margin, sell them much lower than the competition to get back into that segment at a very fast rate, and work on software and developer support. All conjecture and critical thinking. We don't know what AMD has and how high the stack will go. It could be a high-end 60 CU part first with a 52 CU part, all using GDDR6 on a 384-bit/320-bit bus. Even those should perform well. Does AMD make one bigger chip and have many smaller-CU SKUs? Or have a Pro GPU plus a gaming GPU version?

It has been said/rumored that AMD has a GPU with both HBM and GDDR6 memory controllers built in. If so, that would to me indicate a die that can be used in more configurations/SKUs; lower-yielding or bigger dies could still be viable if they are functional. Navi's goal was to be scalable with next-gen memory support, so having fewer die designs that can be used in multiple ways would be very interesting and could give an advantage in usable yields and simplify manufacturing and board designs.

We just have to wait and see what really is going to be available, how it performs, and whether it is worth it. Could be another letdown moment as well.
 
I would be more than happy with less than 80CU, but I would like to see a LC version with HBM2e (now I gotta go look up what the 'e' signifies)...

I specify HBM2e because then we can have a LC like the Frontier Edition, where the GPU, HBM, & VRMs were all under water, no need for a fan on the shroud...

One thing I would like to see on a hypothetical LC version of an as-yet-to-be-announced GPU; two 4-pin PWM fan headers, so I can push-pull (two Noctua NF-A12x25 PWM fans) on the 120mm AIO...!

And I would not even mind the housing to have the same ribbed & dented look as the reference 5700 XT...!
 
I would be more than happy with less than 80CU, but I would like to see a LC version with HBM2e (now I gotta go look up what the 'e' signifies)...

I specify HBM2e because then we can have a LC like the Frontier Edition, where the GPU, HBM, & VRMs were all under water, no need for a fan on the shroud...

One thing I would like to see on a hypothetical LC version of an as-yet-to-be-announced GPU; two 4-pin PWM fan headers, so I can push-pull (two Noctua NF-A12x25 PWM fans) on the 120mm AIO...!

And I would not even mind the housing to have the same ribbed & dented look as the reference 5700 XT...!
Samsung recently announced 3rd gen HBM2e:
https://videocardz.com/press-release/samsung-announces-3rd-generation-16gb-hbm2e-memory
A single stack is 16 GB, with bandwidth for that single stack at 410 GB/s and a maximum speed of 538 GB/s. These are probably meant more for very high-end compute/AI cards. Their gen-2 HBM2E (Enhanced) was 8 GB per stack, probably more suitable for a high-end desktop GPU with two stacks and bandwidth around 900 GB/s; Hynix also makes HBM2e. It will be interesting if AMD will have some HBM2e versions for desktop gaming; I'm pretty sure there will be professional desktop cards using it to get the memory density/capacity up.
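Those per-stack figures fall straight out of the bus width and per-pin speed. A quick sketch of the arithmetic (my own back-of-envelope in Python; the 3.2 and 4.2 Gbps pin speeds are from the Samsung press release above, and the two-stack case is just the same math doubled):

# Memory bandwidth in GB/s = bus width (bits) * per-pin speed (Gbps) / 8.
# Each HBM2e stack has a 1024-bit interface.
def bandwidth_gb_s(bus_width_bits, pin_speed_gbps):
    return bus_width_bits * pin_speed_gbps / 8

print(bandwidth_gb_s(1024, 3.2))  # one stack @ 3.2 Gbps -> ~410 GB/s
print(bandwidth_gb_s(1024, 4.2))  # one stack @ 4.2 Gbps -> ~538 GB/s
print(bandwidth_gb_s(2048, 3.2))  # two stacks @ 3.2 Gbps -> ~819 GB/s (a ~900 GB/s figure implies slightly faster pins)
print(bandwidth_gb_s(384, 14))    # for comparison, 384-bit GDDR6 @ 14 Gbps -> 672 GB/s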
 
Was hardware DXR support ever confirmed by AMD?
Was any DXR support ever confirmed by AMD?
 
Was hardware DXR support ever confirmed by AMD?
Was any DXR support ever confirmed by AMD?

The latest leaks/info say that they are including hardware DXR, but only because everything is pretty much going to need it moving ahead. Unfortunately, they also acknowledged that it's pretty much just a gimmick right now due to the massive performance hit for the tiny image upgrade that is currently available. If that's the case, there is no way this console life cycle is going to have enough hardware ray tracing to deliver the wow factor people are looking for, the way ray tracing was intended to just solve every problem. We'll continue to see it a little bit in some reflections and such that don't have quite as much of a performance penalty. By the time the next gen of consoles is ready (if we haven't figured out game streaming by then), the PS6 will hit the ground running with full-scene ray tracing from hardware to software. Until then, honestly, it's just a gimmick, so I wouldn't make any buying decisions based on it until it is close to or on par with current GPU performance.

DLSS, however, with the exponential rise in AI processing power, will get better and eventually achieve parity with native image quality. That would be a massive coup for Nvidia considering the uptick we see in performance. By the time full-scene ray tracing is ready for prime time, it might just be the difference between 30 fps at 4K on RTG and 60 fps at 4K on Nvidia.
 
What these companies say depends only on what they have and where the competition is.

I would not expect AMD to spend a large amount of transistor budget on ray tracing.
 
The biggest game changer I see is Nvidia DLSS that actually works now, if Wolfenstein is not some sort of fluke where all the planets aligned for DLSS. Basically a 30%-40% performance increase for the same or better than native-resolution quality. It does not stop there: less training is required, and it can potentially be driver-driven.
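Most of a gain that size comes from the lower internal render resolution. A rough back-of-envelope (my own sketch in Python, assuming DLSS renders internally at a reduced resolution and reconstructs to the output resolution; the exact scale factor varies by game and quality mode):

# Pixel-count arithmetic behind DLSS-style upscaling: shade fewer pixels,
# then reconstruct the image at the target resolution.
def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)  # 8,294,400 pixels
internal = pixels(2560, 1440)   # 3,686,400 pixels, a typical internal resolution for 4K output

print(internal / native_4k)     # ~0.44: only ~44% of the pixels get shaded each frame
# The measured gain (~30-40%) is smaller than the raw pixel saving because
# geometry, post-processing, and the reconstruction pass still cost time.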

AI is making its first big splash in games. Combine that with Nvidia's Ampere generational improvements and it will be a tall order for AMD, unless AMD has something brewing to counteract it. It took Nvidia years to figure out Tensor cores, with hardware support. AMD took a back seat on AI hardware designs and software, relying on traditional hardware and open source software development.

AI has great potential to transform just about everything in computing, software development, and gaming, and I don't see AMD there. As for ray tracing, we may not need it; AI could figure out the lighting way better and faster in the end. Some huge quantum leaps may be coming for everything dealing with games.
 
Right now, if AMD can't fix their graphics card drivers, the new launch isn't going to play well regardless of the uptick in speed. I'd love a plug-and-play 144 fps 4K card, but not if it's constantly black-screening/crashing on the desktop.
 
A high end AMD GPU will sell well for its intended audience, the AMD faithful. Regardless of driver issues.

It also doesn't have to beat the top Nv cards to sell within expectations. There are enough AMD focused people out there to make it worth their while.
 
A high end AMD GPU will sell well for its intended audience, the AMD faithful. Regardless of driver issues.

It also doesn't have to beat the top Nv cards to sell within expectations. There are enough AMD focused people out there to make it worth their while.

I'm not "AMD faithful", I'm just sick of Nvidia's bullshit. I have quite a few 10 series cards, but when I replace my 1080Ti, it will be with an AMD card.
 
Even if nvidia has a better performing option in the same price range?

That won't happen. More likely it will be similar performance for $50 more because it's Nvidia... that's Nvidia's MO.

I'm not "AMD faithful", I'm just sick of Nvidia's bullshit. I have quite a few 10 series cards, but when I replace my 1080Ti, it will be with an AMD card.

Exactly. I would have gladly bought a 2080 Ti at $800 at launch. At $1199? Get bent, Nvidia. A year and a half later you still have to spend $1199 unless you want a poorly binned, poorly boosting card. $300 for a mid-range 6GB card AFTER a price drop? Ridiculous... I'm not spending $300 for a 1080p card.

Right now, if AMD can't fix their graphics card drivers, the new launch isn't going to play well regardless of the uptick in speed. I'd love a plug-and-play 144 fps 4K card, but not if it's constantly black-screening/crashing on the desktop.

Whining about AMD drivers unless you're using them with each regular release is getting about as old as me whining about Nvidia's pricing ;). I've actually had more problems with Nvidia's drivers than AMD's in the past year with crashing to the desktop and having to reinstall games and delete config folders just to get them to work with each new Nvidia driver release. I have had zero issues with the newest 20.2.2 driver.
 
Only mentally deficient people would pick an AMD gpu over nVidia in the same price/performance segment.

While I agree, I don't think we'll see this situation. Nvidia cards are going to cost more per FPS than whatever AMD brings out. You can all but guarantee it. As the market leader, Nvidia can get away with it.

And while I know it comes up constantly and I'm not looking to turn this into another AMD driver thread, I do agree it's a problem worth factoring in. Every time I've given AMD a shot in the last ~10 years, I've regretted it. Most recently with a Vega 64. I bought it because I got it on sale for a much better price than the equivalent Nvidia card. I did not like owning that card. I'm keeping an eye on what they are doing, cautiously. I'm in need of a real 4K capable card, and at this point i'm not paying up for a 2080 Ti. If AMD comes out with a part that matches or bests the 2080 Ti in performance and massively undercuts it on price (under $800), I might give them a shot. If AMD really tried to match Nvidia in price/performance, be it with a 2080 Ti or whatever Nvidia has coming this year, I won't give them a second thought. I'd go as far as to say that at equal performance, I'd pay up to 10% more for an Nvidia GPU before I'd start seriously considering something from AMD. That's not because I'm a shill for Nvidia, but because I don't recall ever having a complaint about a Nvidia card I've owned. I can't say the same for AMD. It's a question of risk, and with what these high end cards cost, I don't like taking a risk.

All that said, I'm eager to see if AMD has anything to show off today. Whether it's a card I'd consider buying or not, it'll still play into whatever Nvidia does this year... and I definitely agree that Nvidia also needs to be put in check before commas in GPU prices become standard.
 
I always pick amd because nvidia's performance is _always_ garbage because their open source graphics driver is crippled and hardly functional for 3d acceleration and I refuse to be hamstringed to a specific kernel because of a proprietary driver.

At the rate they're going, intel will have a more viable option before nvidia does ...until then it's amd and only amd.
 
Coincidentally, I had a dream last night that I replaced my 2080 Ti with a 5950 XT. Hopefully that means we should expect great things.
 
I always pick amd because nvidia's performance is _always_ garbage because their open source graphics driver is crippled and hardly functional for 3d acceleration and I refuse to be hamstringed to a specific kernel because of a proprietary driver.

At the rate they're going, intel will have a more viable option before nvidia does ...until then it's amd and only amd.
Windows 10 is the solution to your problem.
 
I have a hard time apt dist-upgrading all my software from console in windows 10. Maybe one day it'll catch up. Until then I'll use linux like I have for the past 20+ years.
You could dual boot and use Windows exclusively for gaming and nothing else.
 
Back
Top