Radeon 7 (Vega 2, 7nm, 16GB) - $699 available Feb 7th with 3 games

See, this is what I'm talking about. You guys are making up BS without any facts.

How about applying the same standard to your own posts? If it's that easy, why is there only one game, still in beta, after 5 months? I have time as a fact; the rest you and I will never know. If it were that easy, it would be in every game. Plus you ignored my entire comment and only zeroed in on one thing lol.

You are telling me I had no point to anything? Do you not agree that had Nvidia added more CUDA cores instead of more tensor cores to do DLSS, they wouldn't have to upscale and could easily have gotten another 15% or so in performance without upscaling? Anything that is not automatic is as good as not having it. That is why every bit of rasterization performance counts.
 
You are telling me I had no point to anything? Do you not agree that had Nvidia added more CUDA cores instead of more tensor cores to do DLSS, they wouldn't have to upscale and could easily have gotten another 15% or so in performance without upscaling? Anything that is not automatic is as good as not having it. That is why every bit of rasterization performance counts.
The same could be said after each and every major feature upgrade when new GPUs launched. Wouldn't e.g. the GeForce3 have been faster at 'rasterization' if the die space assigned to pixel and vertex shaders had been allocated to more texture mapping units?

Market adoption of DXR is not very good, and that is the only real issue. AMD releasing cards without it does not help anything.
 
The same could be said after each and every major feature upgrade when new GPUs launched. Wouldn't e.g. the GeForce3 have been faster at 'rasterization' if the die space assigned to pixel and vertex shaders had been allocated to more texture mapping units?

Market adoption of DXR is not very good, and that is the only real issue. AMD releasing cards without it does not help anything.

I wasn't even talking about RTX, I was talking about DLSS. DLSS is where the space is wasted. Those tensor cores do upscaling, when Nvidia could have added more shaders and not needed that upscaling, or they could have dedicated that area to more ray tracing. Either way it would be guaranteed performance.
 
I wasn't even talking about RTX, I was talking about DLSS. DLSS is where the space is wasted. Those tensor cores do upscaling, when Nvidia could have added more shaders and not needed that upscaling, or they could have dedicated that area to more ray tracing. Either way it would be guaranteed performance.
LOL
Tensor cores are a killer feature.

And besides, tensor cores are required for ray tracing because you need to do tons of advanced denoising for it to look reasonable and not like you lost reception on your TV set.
 
I don't think people understand that they had to release something. They have MI50 dies just sitting there and couldn't go another 6-9 months with only the Vega 64 as their fastest card. This will be it until they make the console makers happy; then we will see some attempt at more performance, maybe Q4. They will be behind Nvidia because of bad decisions under Raja; the question is how far they are behind what Nvidia is holding back now. The 20 series was released because they were expected to release something, because we asked for it. It wouldn't surprise me in the least if they moved the 20-series SKUs up one tier because they had very little competition (the 2080 Ti was supposed to be the 2080, the 2080 the 2070, the 2070 the 2060, and the 2060 the 2050).
 
LOL
Tensor cores are a killer feature.

And besides, tensor cores are required for ray tracing because you need to do tons of advanced denoising for it to look reasonable and not like you lost reception on your TV set.

Yeah, tensor cores would be killer if they used the whole thing for RTX. You see what I mean? I am not arguing about tensor cores. I am arguing the fact that DLSS is not automatic upscaling: it requires work on Nvidia's side, and GeForce Experience then downloads updates from Nvidia for DLSS to work. So it's not plug and play. At no point am I arguing tensor cores are bad. Either use them for RTX or add more CUDA cores. What are they doing? Upscaling, right? So would you rather have automatic performance with more cores, or something that is not automatic? That is my entire point. If tensor cores were doing the work without requiring game-by-game implementation, it would be a different story. I would rather have them use them all for DXR to up the ray tracing.

There is no way to tell if Nvidia is using all the tensor cores for ray tracing. They call their dedicated ray tracing cores RT cores, so I'm not sure how their implementation works.
 
I don't think people understand that they had to release something. They have MI50 dies just sitting there and couldn't go another 6-9 months with only the Vega 64 as their fastest card. This will be it until they make the console makers happy; then we will see some attempt at more performance, maybe Q4. They will be behind Nvidia because of bad decisions under Raja; the question is how far they are behind what Nvidia is holding back now. The 20 series was released because they were expected to release something, because we asked for it. It wouldn't surprise me in the least if they moved the 20-series SKUs up one tier because they had very little competition (the 2080 Ti was supposed to be the 2080, the 2080 the 2070, the 2070 the 2060, and the 2060 the 2050).

No, they moved it up because those big dies are costing them a lot, and they could because people were desperate; when Nvidia charges more, people justify it one way or another.

As far as AMD's next gen goes, it's coming. Arcturus is the code name, and it has actually been in the works since 2015-2016. That is the brand-new architecture after GCN. The console part is likely already done, as Navi for desktops is being tested in the labs already. 2020 is where it's going to get exciting on the high end. Nvidia is not going to have any newer card until 2020 either, I think, since they are on an 18-month to 2-year cycle now. Plus it seems they are going with Samsung for the 7nm parts. So 2020 is when they will likely replace the Turing cards as well.

And you are absolutely right about the MI50 part. It's a pipe-cleaner card and a 7nm test subject lol. Why not get it out if they have it ready, to make some AMD fans happy so they have something to buy, along with content creators looking for the large frame buffer and killer bandwidth.
 
That is my entire point. If tensor cores were doing the work without requiring game-by-game implementation, it would be a different story.
It is a very good point.
It should be possible to make universal DLSS-like scaling that would just upscale the game to the monitor's resolution. There are also other uses for those cores that NV actually investigated, e.g. using neural networks to smooth motion by adding (predicting) frames. With features like these there would be no one to say they should have put more shaders in instead of tensor cores.
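For what it's worth, the "just upscale to monitor resolution" idea is basically spatial resampling, something like this rough sketch (Python/NumPy, purely illustrative; the function name, the ~77% render scale, and the bilinear filter are my own assumptions, not how DLSS or any driver actually works):

```python
import numpy as np

def upscale_bilinear(frame, out_h, out_w):
    """Bilinearly resample an (H, W, C) frame to (out_h, out_w, C)."""
    in_h, in_w, _ = frame.shape
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]          # vertical blend weights
    wx = (xs - x0)[None, :, None]          # horizontal blend weights
    top = frame[y0][:, x0] * (1 - wx) + frame[y0][:, x1] * wx
    bot = frame[y1][:, x0] * (1 - wx) + frame[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Render at roughly 77% of 4K, then hand the monitor a native 3840x2160 frame.
rendered = np.random.rand(1662, 2954, 3)            # stand-in for a game frame
native = upscale_bilinear(rendered, 2160, 3840)
print(native.shape)                                  # (2160, 3840, 3)
```

A neural upscaler would replace that fixed filter with a learned one, which is presumably why Nvidia's current approach needs per-game training; the plumbing, render low and scale up to native, is the same idea.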
 
I agree with your posts. I am also skeptical of any data put out by AMD since they have a piss-poor track record. I'll wait for reviews... but I was more interested in the $250 Navi rumors, which I am still skeptical of, especially after this.
Actually, you shouldn't, because Nvidia's track record only applies when they can completely cripple performance on competitors' cards; then the patches are already there at launch. When Nvidia has to implement their own premium features, they're suddenly short on developers?
Why is 'just enough' hard?

RAM is a 'just enough' thing. Not enough is bad and tanks performance, but since it is expensive (and with HBM, that expense compounds with the interposer), too much RAM means more cost for no gain. Hell, with DDR more RAM usually means looser possible timings.

Putting 16GB of HBM on a 'gaming card' makes little sense.
Because you have no idea what you are talking about to begin with. 4K uses larger frame buffers, which means that when you turn on more features you never have to swap outside the 16GB you already have, and it allows better throughput as well.
They're supporting Freesync as well as AMD ever has: half-assed implementations, born of zero specification guidance, have resulted in a multitude of very poorly thought-out experiences regardless of the brand of GPU used.

Nvidia fixed all of these before they introduced the world to VRR five years ago, and Freesync is still behind.
Nvidia just proved that they are not supporting Freesync; they are supporting their own branding as 'G-Sync Compatible'. Huge difference, apparently, because what AMD does, Nvidia cannot?

They're far from useless; they are game-changing and bring something revolutionary to the market. I'm not a huge fan of ray tracing, but DLSS, when it gets going, will be huge: https://www.kitguru.net/components/...rt-nvidias-ray-tracing-and-dlss-rtx-features/ The games will come and the hardware is there; what does AMD have for the same price? 2017 hardware that's pushed to its limits with nothing new to offer.
You don't have to be a fan of anything, but the features those games get pushed are pretty trivial. You can claim you're getting your money's worth, but there is no value there. You can claim faster frame rates, but those go away when you enable ray tracing. If these were features that would change gaming, fine, but they're not. The features Nvidia has to implement are the only ones using the hardware you paid a premium price for, and that is it.
 
It is a very good point.
It should be possible to make universal DLSS-like scaling that would just upscale the game to the monitor's resolution. There are also other uses for those cores that NV actually investigated, e.g. using neural networks to smooth motion by adding (predicting) frames. With features like these there would be no one to say they should have put more shaders in instead of tensor cores.

Maybe you should revisit this after 3 or 4 generations and then wonder why they could not do it :).
 
Brent and Kyle already ran into 8GB VRAM limitations on the 2080 and 2070 in BFV, causing stutter and sporadic low frame rates.

HDR uses 10-bit light maps, textures, etc., which add to VRAM usage. In BFV up to 12GB of VRAM can be seen in use, keeping more assets ready for smooth gameplay, and that is without RTX.

The 6GB RTX 2060 is a bad joke and, as far as I am concerned, DOA, with another $100 added on top. Seriously, will it be able to really use RTX in anything? RAM is what will make this card pointless and a disappointment, a second Fury.

As for Freesync, there are many happy folks using it. My Freesync 2 monitor works flawlessly and is HDR. There is another aspect that perturbs me: HDR on the 1080 Ti makes the desktop look like utter crap. On the same monitor, the Vega 64 with an HDR desktop correctly converts SDR content over, videos etc., yet you see zero analysis or review of this aspect.

I will see how VRR works with Nvidia.
 
Brent and Kyle already ran into 8GB VRAM limitations on the 2080 and 2070 in BFV, causing stutter and sporadic low frame rates.

HDR uses 10-bit light maps, textures, etc., which add to VRAM usage. In BFV up to 12GB of VRAM can be seen in use, keeping more assets ready for smooth gameplay, and that is without RTX.

The 6GB RTX 2060 is a bad joke and, as far as I am concerned, DOA, with another $100 added on top. Seriously, will it be able to really use RTX in anything? RAM is what will make this card pointless and a disappointment, a second Fury.

As for Freesync, there are many happy folks using it. My Freesync 2 monitor works flawlessly and is HDR. There is another aspect that perturbs me: HDR on the 1080 Ti makes the desktop look like utter crap. On the same monitor, the Vega 64 with an HDR desktop correctly converts SDR content over, videos etc., yet you see zero analysis or review of this aspect.

I will see how VRR works with Nvidia.

I agree. Nvidia has some fucking nerve launching the RTX 2060 at $100 more than the 1060 and still with 6GB of RAM. It's amazing how people justify anything Nvidia does. I refuse to pay $350 for a card with less than 8GB of RAM. It's just hilarious to me. It just can't be justified.
 
The last cards I owned from the red team were a couple of XFX 5870s in CrossFire. Good performance for the money. I'd like to see AMD up their game and release a competitive product. Nvidia needs a strong rival. Or, at least, we as consumers need Nvidia to have a strong rival.
 
Actually, you shouldn't, because Nvidia's track record only applies when they can completely cripple performance on competitors' cards; then the patches are already there at launch. When Nvidia has to implement their own premium features, they're suddenly short on developers?

Because you have no idea what you are talking about to begin with. 4K uses larger frame buffers, which means that when you turn on more features you never have to swap outside the 16GB you already have, and it allows better throughput as well.

Nvidia just proved that they are not supporting Freesync; they are supporting their own branding as 'G-Sync Compatible'. Huge difference, apparently, because what AMD does, Nvidia cannot?


You don't have to be a fan of anything, but the features those games get pushed are pretty trivial. You can claim you're getting your money's worth, but there is no value there. You can claim faster frame rates, but those go away when you enable ray tracing. If these were features that would change gaming, fine, but they're not. The features Nvidia has to implement are the only ones using the hardware you paid a premium price for, and that is it.

At the end of the day some of joker's points are the same as my thoughts.

For the same price AMD offers worse VR (if the past holds true, and I don't see why it wouldn't) and no DLSS/RT. Vega II is DOA at this price point for me. The only factor might be if a FreeSync panel comes out that Vega II supports and Nvidia does not. That's basically their only advantage.

For my relatives, if Navi can match the 2070 rasterized, no DXR, at $250-300, I would choose that over the $500 2070. Plus reliability matters more for their rigs... I'd be in for a couple of those.
 
At the end of the day some of joker's points are the same as my thoughts.

For the same price AMD offers worse VR (if the past holds true, and I don't see why it wouldn't) and no DLSS/RT. Vega II is DOA at this price point for me. The only factor might be if a FreeSync panel comes out that Vega II supports and Nvidia does not. That's basically their only advantage.

For my relatives, if Navi can match the 2070 rasterized, no DXR, at $250-300, I would choose that over the $500 2070. Plus reliability matters more for their rigs... I'd be in for a couple of those.

You have a 2080 Ti, no shit Vega II is not on your radar. Joker wants to believe that DLSS is game changing because ray tracing is garbage and barely usable even on a 2080 Ti. I see the greater amount of RAM as more useful than a promised feature that not one company has brought forward in their game, despite Nvidia insisting it would be easy to do. Heck, [H] showed just recently in an article that games are chewing up more than 8 gigs, and using HDR eats even more RAM. I'd rather have more than not enough. The only thing I see is that all the high-end cards are overpriced these days, and that's the bigger issue. Hell, they are so overpriced I can go to my local Best Buy and they have 10 2080s in stock collecting dust. We're also not even touching the fact that the 2000 series has a bad habit of failing. Price is the problem for both sides, but touting useless features to try to justify the price on one side is just silly.
 
You have a 2080 Ti, no shit Vega II is not on your radar. Joker wants to believe that DLSS is game changing because ray tracing is garbage and barely usable even on a 2080 Ti. I see the greater amount of RAM as more useful than a promised feature that not one company has brought forward in their game, despite Nvidia insisting it would be easy to do. Heck, [H] showed just recently in an article that games are chewing up more than 8 gigs, and using HDR eats even more RAM. I'd rather have more than not enough. The only thing I see is that all the high-end cards are overpriced these days, and that's the bigger issue. Hell, they are so overpriced I can go to my local Best Buy and they have 10 2080s in stock collecting dust. We're also not even touching the fact that the 2000 series has a bad habit of failing. Price is the problem for both sides, but touting useless features to try to justify the price on one side is just silly.
Actually, the price problems are caused by people buying Nvidia hardware. The last time I bought a card it was an R9 290X for 550 euros, and that was high end.
But if you look at the price of the Radeon VII with 16GB of HBM2, you can at least understand why it is higher.
 
You have a 2080 Ti, no shit Vega II is not on your radar. Joker wants to believe that DLSS is game changing because ray tracing is garbage and barely usable even on a 2080 Ti. I see the greater amount of RAM as more useful than a promised feature that not one company has brought forward in their game, despite Nvidia insisting it would be easy to do. Heck, [H] showed just recently in an article that games are chewing up more than 8 gigs, and using HDR eats even more RAM. I'd rather have more than not enough. The only thing I see is that all the high-end cards are overpriced these days, and that's the bigger issue. Hell, they are so overpriced I can go to my local Best Buy and they have 10 2080s in stock collecting dust. We're also not even touching the fact that the 2000 series has a bad habit of failing. Price is the problem for both sides, but touting useless features to try to justify the price on one side is just silly.

Cheese and rice, you spent half your post ranting about my 2080 Ti. I have other rigs, one with a failing 1080 Ti, and relatives/friends I help.

I also indirectly mentioned failures when I said, "plus reliability matters more for their rigs". So you are right: on top of broader FreeSync compatibility, reliability is a concern based on Kyle's experiences. It's unfortunate we don't have a large dataset on that.

Also, ray tracing runs BFV at 75fps ultra/low on a 3440x1440 60Hz monitor for me. No impact, for a better experience. I will agree it's a strange title to start with.
 
Because you have no idea what you are talking about to begin with. 4K uses larger frame buffers, which means that when you turn on more features you never have to swap outside the 16GB you already have, and it allows better throughput as well.

Why do you talk like you just discovered frame buffers? 4K is 3840x2160x32 bits, or 265,420,800 bits, or 33,177,600 bytes, or 31.64MB. That's <32MB! And I need another 8GB for that? Nope.
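Spelled out, that back-of-the-envelope math looks like this (a quick sketch; the triple-buffering and 10-bit/FP16 HDR rows are added comparisons for context, not figures from the post above):

```python
# Back-of-the-envelope frame buffer sizes at 4K (3840x2160).
WIDTH, HEIGHT = 3840, 2160
PIXELS = WIDTH * HEIGHT                      # 8,294,400 pixels

def buffer_mib(bits_per_pixel, buffers=1):
    """Size in MiB of `buffers` frame buffers at the given bit depth."""
    return PIXELS * bits_per_pixel * buffers / 8 / 1024**2

print(f"4K @ 32 bpp, single buffer: {buffer_mib(32):.2f} MiB")      # ~31.64
print(f"4K @ 32 bpp, triple buffer: {buffer_mib(32, 3):.2f} MiB")   # ~94.92
print(f"4K @ 64 bpp (FP16 HDR):     {buffer_mib(64):.2f} MiB")      # ~63.28
```

Even several swap-chain and HDR targets together stay well under a gigabyte; the real VRAM pressure comes from textures, geometry, and intermediate render targets, which is where the 8GB-vs-16GB argument actually lives.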

Nvidia just proved that they are not supporting Freesync; they are supporting their own branding as 'G-Sync Compatible'. Huge difference, apparently, because what AMD does, Nvidia cannot?

Nvidia is supporting Freesync as well as AMD ever has. They're calling it 'G-Sync Compatible', and they're using the same VESA Adaptive-Sync protocol. Even better, Nvidia is providing a rubric for support, so we know which monitors actually come close to G-Sync out of the multitudes that fall very short!
 
No, they moved it up because those big dies are costing them a lot, and they could because people were desperate; when Nvidia charges more, people justify it one way or another.

As far as AMD's next gen goes, it's coming. Arcturus is the code name, and it has actually been in the works since 2015-2016. That is the brand-new architecture after GCN. The console part is likely already done, as Navi for desktops is being tested in the labs already. 2020 is where it's going to get exciting on the high end. Nvidia is not going to have any newer card until 2020 either, I think, since they are on an 18-month to 2-year cycle now. Plus it seems they are going with Samsung for the 7nm parts. So 2020 is when they will likely replace the Turing cards as well.

And you are absolutely right about the MI50 part. It's a pipe-cleaner card and a 7nm test subject lol. Why not get it out if they have it ready, to make some AMD fans happy so they have something to buy, along with content creators looking for the large frame buffer and killer bandwidth.

I agree on the whole pricing thing, Nvidia is driving up prices & then we have folks here that think "This is fine" as their PCs burst into flames...

Vega II is a solid stop-gap until Navi is ready, and the real excitement is Arcturus...

That last bit I highlighted is exactly the real audience for the Radeon VII...

Radeon VII (aka Vega II) is AMD's Titan...
 
I agree on the whole pricing thing, Nvidia is driving up prices & then we have folks here that think "This is fine" as their PCs burst into flames...


this x1000000

It's hysterical how many people on this forum don't consider $500+ video cards to be high end. They claim $500 is the new mid-range; it's ludicrous.
 
I agree on the whole pricing thing, Nvidia is driving up prices & then we have folks here that think "This is fine" as their PCs burst into flames...

Vega II is a solid stop-gap until Navi is ready, and the real excitement is Arcturus...

That last bit I highlighted is exactly the real audience for the Radeon VII...

Radeon VII (aka Vega II) is AMD's Titan...

It's not a solid stop gap at $699. At $500, yes. But the prices of the RTX series and the new Vega are over the top.
 
Well, Vega 64 is selling for around $450 - $500 on the lower end, so a $600 - $650 price for the VII would make more sense, but $700 is not that crazy considering.
 
It's not a solid stop gap at $699. At $500, yes. But the prices of the RTX series and the new Vega are over the top.

Honestly, I would love for it to be $500, but it has a solid cooler and a 16GB HBM2 memory buffer. I doubt AMD can sell it for anything less right now. Maybe in 6 months you see it drop another $100 or so. Also, AMD will probably have these in limited quantities; this is just to get them by until next gen. You don't hear many rumors or distractions coming out of RTG. Looks like they are busy trying to get the next gen out in a timely manner.

AMD is expecting not only gamers to buy it but content creators as well. I think this thing is a steal for $699 with that frame buffer. It's much better than the original Frontier Edition that launched at $1000. I wouldn't be surprised if AMD is just losing money on this at $699, since it's a new process and all that RAM. Regardless, I think it's going to sell, and it gives AMD some breathing room to concentrate on Navi for midrange and Zen 2, and then the true next gen next year. Nvidia won't have 7nm until 2020 anyway, and AMD has the most to gain from next gen since they haven't really had a brand-new architecture since GCN. It served its purpose, but I'm even surprised they managed to squeeze 25-30% more performance out of Vega II lol.
 

Yeah, but AMD can't sell that card for less than $700 right now. It is what it is. But I can bet they will likely sell all of them since it appeals to more than gamers. All that frame buffer and memory bandwidth comes in handy for content creation, and it will push Nvidia to increase their frame buffers. Can't believe the 2060 still gets 6GB for 350 bucks. Nvidia needs to get with the times.
 
Can't believe the 2060 still gets 6GB for 350 bucks.

I'd like to see it actually be an issue before supporting the detractors. 6GB seems like it'd be more than enough for 1440p and higher settings, depending on the application and what the 2060 can actually handle.

AMD can market the Radeon 7 for non-gaming workloads to the handful of people that could use it for that, but really, it's a mismatched product for gaming.

What's frustrating is that they have the technology to produce a GPU that would be faster than the 2080 Ti at pure raster rendering in games. They simply choose not to put their efforts there. Gaming remains a second-class citizen in AMD land.
 
I'd like to see it actually be an issue before supporting the detractors. 6GB seems like it'd be more than enough for 1440p and higher settings, depending on the application and what the 2060 can actually handle.

AMD can market the Radeon 7 for non-gaming workloads to the handful of people that could use it for that, but really, it's a mismatched product for gaming.

What's frustrating is that they have the technology to produce a GPU that would be faster than the 2080 Ti at pure raster rendering in games. They simply choose not to put their efforts there. Gaming remains a second-class citizen in AMD land.
Always better to be a second-class citizen than just a run-of-the-mill cow kept for milking ;)
 
I'd like to see it actually be an issue before supporting the detractors. 6GB seems like it'd be more than enough for 1440p and higher settings, depending on the application and what the 2060 can actually handle.

AMD can market the Radeon 7 for non-gaming workloads to the handful of people that could use it for that, but really, it's a mismatched product for gaming.

What's frustrating is that they have the technology to produce a GPU that would be faster than the 2080 Ti at pure raster rendering in games. They simply choose not to put their efforts there. Gaming remains a second-class citizen in AMD land.

They do, and it's likely coming next gen. Focus went into CPUs, as that is the bread and butter for them for now and will likely lift them up and let them invest more in R&D. But I think it's taken them longer due to more budget going into the CPU side. Arcturus has been in the works since 2015 or so, shortly after Lisa Su became CEO. So instead of 2-3 years it's taking them 4-5 years to bring the architecture around, but we will have it soon. If AMD hadn't worked on the CPU side, we may not even have an AMD. I expect them to make a comeback on the GPU side as well. It may not seem like it, but as AMD keeps coming up in the CPU market, it will enable them to make better and better GPUs. Nvidia probably knows that as well. I am sure they are going to capitalize as much as they can for now, because AMD will have a 7nm+ next-gen card next year around the same time Nvidia comes out with its 7nm stuff. Only this time it won't be GCN. So I am expecting the big leap in performance there; they have the most to gain from an architectural and rasterization standpoint.
 
I'd rather them do better on the CPU front anyways if I had a choice.

Yep, same here. Some people don't realize how much that will contribute to better and better GPUs down the road. They can make a killing with better CPUs. I think 2020-2021 is going to be a sweet stretch for AMD when it comes to CPU market share.
 
#waitforNavi

Navi is just going to be mid range though.

I'm pretty sure that's only initially. I'm sure I read somewhere there will be a high-end version coming late 2019 / early 2020.

The rumored Radeon 3K series of cards are supposed to be mid-range, built on Small Navi; these should be showing up in the middle of 2019...

There is supposed to be a Big Navi, which I would guess is the replacement for Radeon VII (and should go up against the RTX 2080Ti), and this is supposed to be for late 2019...

Then we get Arcturus in 2020... This is all new "Next-Gen" architecture, probably on 7nm+...

#Arcturus4King
 
The rumored Radeon 3K series of cards are supposed to be mid-range, built on Small Navi; these should be showing up in the middle of 2019...

There is supposed to be a Big Navi, which I would guess is the replacement for Radeon VII (and should go up against the RTX 2080Ti), and this is supposed to be for late 2019...

Then we get Arcturus in 2020... This is all new "Next-Gen" architecture, probably on 7nm+...

#Arcturus4King

There were rumors big Navi will come in 2020, but I'm not sure that makes sense. I mean, if they can release Vega 20 on 7nm already, I suspect it will likely be the later half of the year, not too late, maybe a September release, with midrange Navi being a Computex launch, and then Arcturus coming out Q3 2020. I don't see Navi coming any later than that if Arcturus will come out in 2020. It could be one or the other; if Navi is performing much better, then we may see Arcturus pushed back to add some ray tracing to it? Who knows. But midrange Navi is coming for sure I guess lol.
 
There were rumors big Navi will come in 2020, but I'm not sure that makes sense. I mean, if they can release Vega 20 on 7nm already, I suspect it will likely be the later half of the year, not too late, maybe a September release, with midrange Navi being a Computex launch, and then Arcturus coming out Q3 2020. I don't see Navi coming any later than that if Arcturus will come out in 2020. It could be one or the other; if Navi is performing much better, then we may see Arcturus pushed back to add some ray tracing to it? Who knows. But midrange Navi is coming for sure I guess lol.

Yeah, at least we got the general confirmation that both Zen 2 CPUs & Navi GPUs are headed our way in the summer of 2019...
 
The benchmarks look good.

[Attached benchmark chart: RadeonVII-vs-Vega64.jpg]
 