Radeon 7 (Vega 2, 7nm, 16GB) - $699 available Feb 7th with 3 games

In the Scott Herkelman interview, AMD has already stated that their direct sales will be at MSRP. Soooo..

Will AMD be directly selling them on their website? As it sits right now, all of their stock is sold through other retailers. That's where the price hike will happen.
 
Yes, AMD sold directly to consumers in the past, and they want to do it again with the Radeon VII. Hopefully they limit purchases by name, address, credit card, and/or other means to prevent scalpers and bots from buying them out two screen refreshes after someone adds one to their cart.
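
Purely as illustration, a one-per-customer limit could be as simple as keying orders on normalized identity fields. This is a toy sketch (the names are made up; a real storefront would use a shared datastore and a payment-processor token, never raw card data):

```python
import hashlib

# Hypothetical limiter: one card per normalized (name, address, card) key.
# In production `seen` would live in a shared store, not process memory.
seen = set()

def order_key(name: str, address: str, card_token: str) -> str:
    # Normalize so trivial variations ("J. Smith " vs "j. smith") collide.
    normalized = "|".join(part.strip().lower() for part in (name, address, card_token))
    return hashlib.sha256(normalized.encode()).hexdigest()

def allow_order(name: str, address: str, card_token: str) -> bool:
    key = order_key(name, address, card_token)
    if key in seen:
        return False  # this household/card already bought one this launch
    seen.add(key)
    return True
```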
 
One thing I'd like to see in Radeon VII reviews is whether reaching the VRAM limit affects VRR, including on Nvidia. I believe it did with the Fury X in Forza Horizon 3 (i.e., running 1440p with high settings, I would see stutter even well within VRR range; the stutter went away after lowering settings to reduce VRAM usage). Not sure whether it was the game or AMD, but it led me to assume VRR can still stutter even well within VRR range if the game's video memory is maxed out or the game is CPU limited (pretty rare for me, I think, since I run native 1440p or above).
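
If reviewers (or anyone at home) want to check this, here is a minimal sketch that logs VRAM usage on an Nvidia card once a second, assuming the pynvml package (NVML bindings; AMD cards would need a different tool), so stutter can be correlated with a maxed-out framebuffer:

```python
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

# Poll GPU 0's memory use once a second; run alongside the game and watch
# whether stutter coincides with usage pinned near 100%.
nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(handle)
        pct = 100 * mem.used / mem.total
        print(f"VRAM: {mem.used / 2**20:6.0f} / {mem.total / 2**20:6.0f} MiB ({pct:5.1f}%)")
        time.sleep(1)
finally:
    nvmlShutdown()
```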
 
Yes. Jesus, read the fucking interview. Or even read my post: I used the phrase “their direct sales,” as in sales directly from AMD to end users.

We'll see. All of their product right now is sold through other retailers. I can't find anything on their website that's sold directly by them.
 
We'll see. All of their product right now is sold through other retailers. I can't find anything on their website that's sold directly by them.
https://m.hardocp.com/article/2019/01/14/amd_radeon_vii_interview_scott_herkelman (it's the 4th question). Sure, maybe they'll change their mind in three weeks, but as of now the stance is that AMD.com will sell the fucking reference Radeon VII card at MSRP.


[Image: photo of the Radeon VII card]
 
That card looks THICC. I was hoping for something I could fit in my NCASE that would also let me afford a FreeSync monitor so I don't have to pay the G-Sync tax.

The specs look awesome, though, and I am super unconvinced by RTX in real use after seeing the HUGE performance hit on my friend's system.
 
That card looks THICC. I was hoping for something I could fit in my NCASE that would also let me afford a FreeSync monitor so I don't have to pay the G-Sync tax.

The specs look awesome, though, and I am super unconvinced by RTX in real use after seeing the HUGE performance hit on my friend's system.

You won't have to worry about the G-Sync tax for much longer. Nvidia is only hours away from shipping support for adaptive sync monitors. How well that works, we will see; it should be the same as on AMD cards, though.
 
AMD probably only released the Radeon VII to satisfy its fan base.

I mean, if you have $700 to spend and don't have any brand preference, why would you get the Radeon VII over the GeForce RTX 2080?

Now, if you are an AMD fanboy and always wanted AMD's version of the GeForce GTX 1080 Ti, there you have it.
 
Really don't see them having control over it. If the demand is there, they're either going to have to raise prices or deal with shortages.
Well, I suspect AMD will keep prices set at least until all the reviews go through. I remember the initial Vega reviews were less than stellar due to their general lack of availability at MSRP.
 
I remember the initial Vega reviews were less than stellar due to their general lack of availability at MSRP.

Well, AMD insists on rolling HBM with consumer cards; this Vega 2 / Radeon VII is going to be more of the same, for the same reason. They're literally choosing not to compete with Nvidia by not making larger gaming-focused GPUs using more widely available parts (GDDR).
 
Well, AMD insists on rolling HBM with consumer cards; this Vega 2 / Radeon VII is going to be more of the same, for the same reason. They're literally choosing not to compete with Nvidia by not making larger gaming-focused GPUs using more widely available parts (GDDR).

I'd imagine that has to do with R&D. HBM is great for professional workloads, and the bandwidth it affords translates nicely to gaming. I'd imagine it would be very difficult for AMD to retool Vega to work with GDDR (maybe not, since it seems Vega CUs are used in APUs, though I've also read those are more similar to Polaris), and then you'd lose the power efficiency gained by using HBM, making an already inefficient card more inefficient.
 
Well, AMD insists on rolling HBM with consumer cards; this Vega 2 / Radeon VII is going to be more of the same, for the same reason. They're literally choosing not to compete with Nvidia by not making larger gaming-focused GPUs using more widely available parts (GDDR).
Choosing not to compete by literally competing with the 2080? Oh, but "muh 1080 Ti five years ago for the same price"... well, can you buy a new 1080 Ti now for the same price? With 16GB? And longer driver support? No.
They are selling them through their own store now so that they can offer MSRP to end users. Cost isn't as much of an issue either, as these are cut/binned MI50s... they'd still profit on them at around $400, which is current Vega 64 pricing. Pricing has a long way to go, thanks to Ngreedia.
 
But they're not. They're missing features and using a high-cost process and memory for a consumer product.

But they are. RT is in how many games? Oh.

It runs how well? Oh.

It's how noticeable? Oh.

Don't get me wrong, RT is awesome and will likely be great in the next few years, but calling it a missing feature is a bit of a stretch at this point in time.

Who cares if they're using a high-cost process and memory for a consumer product if it's competitive on price?
 
We know HBM/HBM2 pricing is bad, but wasn't there just a news article here on [H] saying how expensive GDDR6 is compared to GDDR5? So at this point, which is worse? Is HBM2 still the most expensive?

With that in mind, I think it was simply too late in the design/production process for AMD to change course. Nvidia dropping Tensor/RT cores on the market was, as far as I know, totally unexpected and came out of nowhere. AMD already had the Radeon VII in the bag and Navi wasn't finished yet, so they had to release something, even if it was just "kinda okay," versus nothing at all.
 
I'd imagine that has to do with R&D. HBM is great for professional workloads, and the bandwidth it affords translates nicely to gaming. I'd imagine it would be very difficult for AMD to retool Vega to work with GDDR (maybe not, since it seems Vega CUs are used in APUs, though I've also read those are more similar to Polaris), and then you'd lose the power efficiency gained by using HBM, making an already inefficient card more inefficient.

I never bought the power efficiency bit, since the HBM is huddled right next to the die. It actually makes heat density way worse.

Which does make me curious how much a custom loop would help on one of these. Might be able to tweak a good bit out of it.
 
Who cares if they're using a high-cost process and memory for a consumer product if it's competitive on price?

Everyone who hopes that AMD has the ability to withstand any price competition ;).

We know HBM/HBM2 pricing is bad, but wasn't there just a news article here on [H] saying how expensive GDDR6 is compared to GDDR5? So at this point, which is worse? Is HBM2 still the most expensive?

We don't, and realistically can't, know what the price delta is. More important is volume: HBM isn't inherently expensive; it actually should be cheaper, if yields were not an issue for the whole part. You can have pristine GPU and memory silicon and still wind up with a broken assembly, etc. If you cannot make enough good final assemblies, that gets priced into your margins.

Nvidia uses HBM too; they're just smart enough not to try to use it for consumer products, which have thin margins from the outset and depend on volume.
 
Which does make me curious about how much custom loop cooling one of these would help. Might be able to tweak a good bit out of it.

Generally not much, unfortunately. Not because it's a bad idea, but because AMD parts are usually already pushed past the top of their efficiency curve. They're usually reported to be great at undervolting, which can alleviate heat/noise complaints; however, they usually just don't have stable headroom regardless of how much juice is pumped in and heat is pumped out.
 
HBM2 is a far more elegant solution. It's just four stacks around the GPU for a total of 16 GB. If you look at the 1080 Ti or 2080 Ti, there are 11 chips for a total of 11 GB, increasing the chances of not only failure but also heat output, especially in the case of a huge GPU like the 2080 Ti's.
A Vega 64 will clock its memory all the way up to 1200 MHz if you get lucky. HBM2 is simply the better solution for GPUs; that is why AMD co-developed it.
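
For what it's worth, the interface-width arithmetic behind that claim (per the public specs: each HBM2 stack is 1024 bits wide, each GDDR5X/GDDR6 chip is 32 bits wide):

```python
# Fewer packages, far wider aggregate bus: the HBM2 trade-off in one line each.
radeon_vii_bus = 4 * 1024   # 4 HBM2 stacks x 1024-bit = 4096-bit, 16 GB total
gtx_1080ti_bus = 11 * 32    # 11 GDDR5X chips x 32-bit = 352-bit, 11 GB total
rtx_2080ti_bus = 11 * 32    # 11 GDDR6 chips  x 32-bit = 352-bit, 11 GB total
print(radeon_vii_bus, gtx_1080ti_bus, rtx_2080ti_bus)  # 4096 352 352
```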
 
HBM2 is a far more elegant solution. It's just four stacks around the GPU for a total of 16 GB. If you look at the 1080 Ti or 2080 Ti, there are 11 chips for a total of 11 GB, increasing the chances of not only failure but also heat output, especially in the case of a huge GPU like the 2080 Ti's.
A Vega 64 will clock its memory all the way up to 1200 MHz if you get lucky. HBM2 is simply the better solution for GPUs; that is why AMD co-developed it.

This is all meaningless if they cannot produce them in volume.
 
Volume is a problem of the past. What is meaningless is the shit you keep posting in this thread.
 
This is all meaningless if they cannot produce them in volume.

I think HBM2 volume issues are overstated now. I mean, we haven't seen any shortages for a while. For high-end cards, volume is just fine: Vega 64s have been selling for $399-ish for a while now, and if HBM2 supply were an issue, that wouldn't be the case. It has been over 18 months since the Vega series launched, so I doubt there is a volume production issue.
 
But they're not. They're missing features and using a high-cost process and memory for a consumer product.
GDDR6 isn't exactly cheap either, and these are 'waste cards': cores that didn't make the cut as full-fat MI50/MI60 parts. So if they can still make a few hundred on them, great. Remember, AMD doesn't have retail gouging to contend with now that they sell them direct, thank fuck.
Missing features? Nvidia hasn't got DLSS anywhere bar one game and a demo, and RT is only found in a single game. I'd say the 20 series is also missing a much more critical feature that impacts more forward-looking games: the VRAM to support RT, let alone emerging titles in raster. 8GB is already becoming a limit at 4K.
Most smart people buying the 2080 Ti buy it for raster, and even say so here many times, so AMD focusing on actually viable technology (raster) instead of more unusable Vega BS like last time is the right choice. No one wants 1080p/1440p/60 RT for an FPS. No one wants promised features that turn out to be practically unusable or of dubious effect (Vega 1, ahem). On that point, it's not even real RT; it's pseudo, partial-scene RT, and it doesn't look that good because they're up against hardware limitations. Yawn. I'll stick with raster like most here until the tech is ready for the mainstream, which will take consoles adopting it and game programmers gaining more experience. A generation or two from now, maybe after the 3000 series/Navi, it'll be actually usable; I have a feeling AMD's 'next generation' architecture will focus on this. Same goes for Nvidia. Even with the usual 25-35% performance bump, the top-end 3000 series cards will still be mediocre for RT.

Remember, many on [H] really want a 4K/120+ experience in the latest games. With RT, that's 3-5+ years off for decent 'most of the scene' usage... for now, RT will remain a niche/sponsored effect, nothing more. Why? Because less than a percent of all users can actually make use of RT at any meaningful performance level. Right now is not the time to be in that 1% who can RT; it makes VR look like a support wonderland. Sorry, but I have been shooting video in 1080p since 2007 or so and certainly am not going to step down to that on a gaming rig that has been running 1440p since 2011. Downgrades because of vapour-features are just stupid.
 
AMD probably only released the Radeon VII to satisfy its fan base.

I mean, if you have $700 to spend and don't have any brand preference, why would you get the Radeon VII over the GeForce RTX 2080?

Now, if you are an AMD fanboy and always wanted AMD's version of the GeForce GTX 1080 Ti, there you have it.
It's AMD's version of the 2080. The 1080 Ti will not enjoy driver support as long as the 2080 or the Radeon VII will, and it's not available new at a comparable price anymore, so I don't know why you Nvidia guys keep bringing it up. Is the 2080 just Nvidia's 1080 Ti with RT? Lol, you don't see AMD people saying that.
But here's what really counts: twice the VRAM, and faster VRAM at that. Long term, VRAM is almost always the limitation as long as the GPU can drive it, and this has held true in my experience since I was running an X800 XT in Doom 3, lol. Try running Ultra with 256MB... fail. So with this in mind, 8GB vs 16GB is no competition. Within a year or two that will be a major longevity issue, and [H] even tested this recently, showing it's already an issue in one title. I think the 11GB in the 2080 Ti will last longer, though; it's really the 'minimum' the 2080 should have had too.
Partial-scene RT in one game that barely scrapes around 60fps at 1080p/1440p depending on game mode and map? Super meh.
DLS-what? Meh. Cool-looking tech if it works well, but again, practically nowhere to be seen. Not very compelling features when you look at the elephant in the room: half the VRAM, and an asshole of a CEO.
 
Real quick thought, my friend brought this up:

WHY is 7nm Vega able to compete with the 2080/1080 Ti when 14nm Vega could not, with four more CUs at the same clock speed? The Vega 64 Liquid hit 1700MHz on occasion, overclockers could push it there reliably, and it barely beats a Founders Edition 1080. You're saying the 60 CU 7nm version manages it while being only ~20% faster, with a clock boost of AT MOST 100MHz?

Where is this extra performance coming from!?
 
Real quick thought, my friend brought this up:

WHY is 7nm Vega able to compete with the 2080/1080 Ti when 14nm Vega could not, with four more CUs at the same clock speed? The Vega 64 Liquid hit 1700MHz on occasion, overclockers could push it there reliably, and it barely beats a Founders Edition 1080. You're saying the 60 CU 7nm version manages it while being only ~20% faster, with a clock boost of AT MOST 100MHz?

Where is this extra performance coming from!?


The Radeon VII likely sustains a higher clock. When I had a Vega 64, overclocking the memory for more bandwidth actually helped a lot. Vega 64 seemed to be really limited by memory bandwidth, and now with four stacks of HBM2 doubling the bandwidth, I believe that really helped the Radeon VII; on top of that, it's probably sustaining its boost clock much higher.

It was actually a very well-known fact that Vega 64 really liked higher memory bandwidth, but they couldn't deliver that with its two stacks.
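
If you want to put numbers on that, here's a quick back-of-the-envelope using the published spec-sheet figures (bus widths and per-pin data rates; treat the exact clocks as approximate):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # GB/s = (bus width / 8 bits per byte) * per-pin transfer rate in Gb/s
    return bus_width_bits / 8 * data_rate_gbps

vega_64    = bandwidth_gb_s(2048, 1.89)  # 2 HBM2 stacks -> ~484 GB/s
radeon_vii = bandwidth_gb_s(4096, 2.0)   # 4 HBM2 stacks -> 1024 GB/s
print(f"{vega_64:.0f} GB/s vs {radeon_vii:.0f} GB/s")  # roughly a 2.1x jump
```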
 
Real quick thought, my friend brought this up:

WHY is 7nm Vega able to compete with the 2080/1080 Ti when 14nm Vega could not, with four more CUs at the same clock speed? The Vega 64 Liquid hit 1700MHz on occasion, overclockers could push it there reliably, and it barely beats a Founders Edition 1080. You're saying the 60 CU 7nm version manages it while being only ~20% faster, with a clock boost of AT MOST 100MHz?

Where is this extra performance coming from!?

For literally all the answers, watch the Buildzoid video on the previous page covering the architecture and the architecture changes.

Alternatively, GamersNexus also broke it down while at CES.
 
AMD probably only released the Radeon VII to satisfy its fan base.

I mean, if you have $700 to spend and don't have any brand preference, why would you get the Radeon VII over the GeForce RTX 2080?

Now, if you are an AMD fanboy and always wanted AMD's version of the GeForce GTX 1080 Ti, there you have it.

There is a lot more bandwidth; have you ever heard of an HBM-based solution that did not like higher bandwidth?
The 16GB will come in handy as long as you insist on using as many framebuffer functions as you can at 4K.
I would say it beats the RTX 2080, minus the empty promises, hands down...
 
We know HBM/HBM2 pricing is bad, but wasn't there just a news article here on [H] saying how expensive GDDR6 is compared to GDDR5? So at this point, which is worse? Is HBM2 still the most expensive?

With that in mind, I think it was simply too late in the design/production process for AMD to change course. Nvidia dropping Tensor/RT cores on the market was, as far as I know, totally unexpected and came out of nowhere. AMD already had the Radeon VII in the bag and Navi wasn't finished yet, so they had to release something, even if it was just "kinda okay," versus nothing at all.

You hit the right Nvidia marketing segment talking about Tensor and RT: unless you loop BF5 24/7, talking about them is all you actually get from having Tensor and RT, which is a great feature :).

The problem is that there is no redesign of the Radeon VII, because AMD can't alter the design in a way that makes sense for selling into both the compute and gaming markets. If they altered the design, it would only work for one of the markets, not both. And if the consumer version doesn't sell, those chips will end up in the professional compute market.
 
Has anybody heard the rumors that AMD is only going to do a limited production run of these cards? TweakTown, and I think Red Gaming Tech, are saying it will be fewer than 5,000 units and that AMD is losing its ass on the production of these cards. Not to mention the ROP count is 64, not 128 (I think AMD confirmed this).
 
Has anybody heard the rumors that AMD is only going to do a limited production run of these cards? TweakTown, and I think Red Gaming Tech, are saying it will be fewer than 5,000 units and that AMD is losing its ass on the production of these cards. Not to mention the ROP count is 64, not 128 (I think AMD confirmed this).


It's just that: rumours. One site said 5,000 cards; another said 20k cards with another 40k in the works. I wouldn't pay much attention to it. Just the usual clickbait shite to get people talking.
 
AMD probably only released the Radeon VII to satisfy its fan base.

I mean, if you have $700 to spend and don't have any brand preference, why would you get the Radeon VII over the GeForce RTX 2080?

Now, if you are an AMD fanboy and always wanted AMD's version of the GeForce GTX 1080 Ti, there you have it.

Because I already have Space Invaders and don't need a second copy. ;)
 