Radeon VII (Vega 20, 7nm, 16GB) - $699, available Feb 7th with 3 games

That is why there are so many ray tracing titles: because of Nvidia's market share. Rather contradictory, isn't it?

Not really. Typically, the hardware has to exist first, and then software implementations follow. It's because Nvidia has the market share, as well as the lead in performance, that we're seeing developer movement. It's not something shiny, like what ATI did with tessellation before it was fully implemented in DirectX, but rather the next standard, and all engines are being updated to support it. AMD will catch up at some point.
 
At 2016 prices it is way overpriced. Nobody who is a gamer (whom this card is intended for) gives a shit about HBM2. Both the 2080 and this card cost $100 too much. At least the 2080 can command a bit more of a premium than this thing, since it does have more features.

Useless features really don't matter. DLSS is MIA despite Nvidia saying it would be easy for devs to use, and ray tracing takes more power than these cards have to use properly. I do agree that both cards cost too much, but I think with the end of easy die shrinks it's to be expected now.
 
LOL, people bitch at not having more RAM, and when you get more RAM, people bitch at that as well.

Why is 'just enough' hard?

RAM is a 'just enough' thing. Not enough is bad and tanks performance, but since it is expensive (and with HBM, that expense compounds with the interposer), too much RAM means more cost for no gain. Hell, with DDR more RAM usually means looser possible timings.

Putting 16GB of HBM on a 'gaming card' makes little sense.
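To put rough numbers on 'just enough': here's a minimal back-of-the-envelope sketch (the 4 bytes per pixel and the six render targets are illustrative assumptions, not measurements from any game). The point is that raw framebuffers are tiny; it's texture and asset residency that decides whether 8GB is enough.

[CODE]
// Illustrative arithmetic only: size of full-resolution render targets at 4K.
// Assumptions (not measured): 4 bytes per pixel, six targets for a typical
// deferred pipeline (G-buffer layers + depth + HDR output).
#include <cstdio>

int main() {
    const double width = 3840.0, height = 2160.0, bytes_per_pixel = 4.0;
    const double one_target_mib = width * height * bytes_per_pixel / (1024.0 * 1024.0);
    const int assumed_targets = 6;  // hypothetical count, for scale only

    std::printf("One 4K RGBA8 target: %.1f MiB\n", one_target_mib);  // ~31.6 MiB
    std::printf("%d such targets:     %.1f MiB\n", assumed_targets,
                assumed_targets * one_target_mib);                    // ~190 MiB
    // Even a generous pile of render targets is a rounding error next to 8GB;
    // what actually fills VRAM is streamed textures and geometry, which is why
    // 'just enough' gets argued in whole gigabytes rather than megabytes.
    return 0;
}
[/CODE]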
 
Card looks nice and all, but it's pretty much the equivalent of a 1080 Ti imho. Same feature set, and judging by the power consumption of Vega 64 and the 1080 Ti, it looks like NV could easily compete with the Vega VII using GP102 if they really wanted to...
Quoting Jensen Huang, it is "underwhelming" and "lousy".
Especially now that NV is going to support Adaptive Sync, it doesn't really make much sense to get this card.

At least they were right on the money with
[image attachment]


...without major architecture changes there was very little to expect, really...
 
Card looks nice and all, but it's pretty much the equivalent of a 1080 Ti imho. Same feature set, and judging by the power consumption of Vega 64 and the 1080 Ti, it looks like NV could easily compete with the Vega VII using GP102 if they really wanted to...
Quoting Jensen Huang, it is "underwhelming" and "lousy".
Especially now that NV is going to support Adaptive Sync, it doesn't really make much sense to get this card.

At least they were right on the money with
[image attachment]

...without major architecture changes there was very little to expect, really...

Right, Nvidia is throwing a well-worn bone and claiming they support Freesync, barely. But hey, let's all run out and buy that 2080 with the free Space Invaders game. :D
 
Why is 'just enough' hard?

RAM is a 'just enough' thing. Not enough is bad and tanks performance, but since it is expensive (and with HBM, that expense compounds with the interposer), too much RAM means more cost for no gain. Hell, with DDR more RAM usually means looser possible timings.

Putting 16GB of HBM on a 'gaming card' makes little sense.

Because 8GB is no longer "just enough". But hey, you buy that high-end 8GB card and have to replace it because 8GB was "just enough". LOL :D
 
Right, Nvidia is throwing a well-worn bone and claiming they support Freesync, barely. But hey, let's all run out and buy that 2080 with the free Space Invaders game. :D

They're supporting Freesync as well as AMD ever has: half-assed implementations resulting from zero specification guidance have produced a multitude of very poorly thought-out experiences, regardless of the brand of GPU used.

Nvidia fixed all of these before they introduced the world to VRR five years ago, and Freesync is still behind.
 
Whatever... all I know is that I want this card. Why the propaganda? Do you feel that it is needed? Do you have buyer's remorse?
RTX is dead in the water...
And also, this does Freesync over HDMI (the one killer feature for me), and I don't think NV will support that...

Ah, do not worry, he is an Nvidia fanboy and true to form, at least from the way I have normally seen him post. Anything AMD is DOA for him; that is just the way he rolls.
 
I for one am interested in this card.

I also love how all the NV fangirls are calling Freesync "Adaptive Sync". They fail to realize G-Sync lost to Freesync.
 
I also love how all the NV fangirls are calling Freesync "Adaptive Sync". They fail to realize G-Sync lost to Freesync.

New G-Sync monitors are still shipping, G-Sync is still a superior technology, Freesync monitors still have mostly poor VRR implementations...

Hell, I'm just happy that Nvidia is setting a decent standard for Freesync, after AMD failed to do even that ;)
 
New G-Sync monitors are still shipping, G-Sync is still a superior technology, Freesync monitors still have mostly poor VRR implementations...

Hell, I'm just happy that Nvidia is setting a decent standard for Freesync, after AMD failed to do even that ;)


Take your threadcrapping elsewhere. There's an Nvidia forum for you to hang out in. Go there.
 
New G-Sync monitors are still shipping, G-Sync is still a superior technology, Freesync monitors still have mostly poor VRR implementations...

Hell, I'm just happy that Nvidia is setting a decent standard for Freesync, after AMD failed to do even that ;)

Freesync works flawlessly on my setup. Hell, Kyle did a blind test at his house and people preferred the Freesync setup to the G-Sync one. I am definitely detecting some predefined bias in your post.
 
Take your threadcrapping elsewhere. There's an Nvidia forum for you to hang out in. Go there.

"This is the AMD forum, where we prefer to hide from facts!"

Please note the AMD GPU in the sig; it's powering a Freesync monitor right now ;)
 
Useless features really don't matter. DLSS is MIA despite Nvidia saying it would be easy for devs to use, and ray tracing takes more power than these cards have to use properly. I do agree that both cards cost too much, but I think with the end of easy die shrinks it's to be expected now.

They're far from useless; they are game-changing and bring something revolutionary to the market. I'm not a huge fan of ray tracing, but DLSS will be huge when it gets going: https://www.kitguru.net/components/...rt-nvidias-ray-tracing-and-dlss-rtx-features/ The games will come and the hardware is there. What does AMD have for the same price? 2017 hardware that's pushed to the limits, with nothing new to offer.
 
They're far from useless; they are game-changing and bring something revolutionary to the market. I'm not a huge fan of ray tracing, but DLSS will be huge when it gets going: https://www.kitguru.net/components/...rt-nvidias-ray-tracing-and-dlss-rtx-features/ The games will come and the hardware is there. What does AMD have for the same price? 2017 hardware that's pushed to the limits, with nothing new to offer.

They are useless, as none of the consoles will support them, and no dev is going to waste resources implementing a tech that 1% of gamers will use without buckets of money from NV. And NV can do that indefinitely, right?
 
Why is 'just enough' hard?

RAM is a 'just enough' thing. Not enough is bad and tanks performance, but since it is expensive (and with HBM, that expense compounds with the interposer), too much RAM means more cost for no gain. Hell, with DDR more RAM usually means looser possible timings.

Putting 16GB of HBM on a 'gaming card' makes little sense.

AMD is not marketing the R7 strictly as a gaming card.

"According to AMD's own benchmarks, the Radeon VII excels in real-time 3D and compute applications and gaming.
AMD is touting improvements up to 27 percent in Blender and DaVinci Resolve 15, 29 percent in Adobe Premiere, and 62 percent in LuxMark OpenCL in comparison to the aging Radeon RX Vega 64 graphics card.
In terms of gaming, the Radeon VII is 35 percent faster in Battlefield V, 42 percent in Strange Brigade, and 25 percent in Fortnite."

https://www.tomshardware.com/news/amd-radeon-vii-7nm-gpu-specs,38400.html

A
 
They're far from useless; they are game-changing and bring something revolutionary to the market. I'm not a huge fan of ray tracing, but DLSS will be huge when it gets going: https://www.kitguru.net/components/...rt-nvidias-ray-tracing-and-dlss-rtx-features/ The games will come and the hardware is there. What does AMD have for the same price? 2017 hardware that's pushed to the limits, with nothing new to offer.

By the time DLSS is widely adopted and a real thing, there will probably be a whole host of GPU alternatives that can handle it besides the RTX line.

A
 
AMD is not marketing the R7 strictly as a gaming card.

"According to AMD's own benchmarks, the Radeon VII excels in real-time 3D and compute applications and gaming.
AMD is touting improvements up to 27 percent in Blender and DaVinci Resolve 15, 29 percent in Adobe Premiere, and 62 percent in LuxMark OpenCL in comparison to the aging Radeon RX Vega 64 graphics card.
In terms of gaming, the Radeon VII is 35 percent faster in Battlefield V, 42 percent in Strange Brigade, and 25 percent in Fortnite."

https://www.tomshardware.com/news/amd-radeon-vii-7nm-gpu-specs,38400.html

A

No surprise; the Radeon VII to me is essentially a rebrand of the Instinct MI50. Nice to have a workstation card without the workstation price.
 
They're far from useless; they are game-changing and bring something revolutionary to the market. I'm not a huge fan of ray tracing, but DLSS will be huge when it gets going: https://www.kitguru.net/components/...rt-nvidias-ray-tracing-and-dlss-rtx-features/ The games will come and the hardware is there. What does AMD have for the same price? 2017 hardware that's pushed to the limits, with nothing new to offer.

I agree with your posts. I am also skeptical of any data put out by AMD, since they have a piss-poor track record. I'll wait for reviews... but I was more interested in the $250 Navi rumors, which I am still skeptical of, especially after this.
 
By the time DLSS is widely adopted and a real thing, there will probably be a whole host of GPU alternatives that can handle it besides the RTX line.

A

That seems to be the patently false line parroted by a lot of types lately on these forums. Especially the noobie suspect posters.


They are useless, as none of the consoles will support them, and no dev is going to waste resources implementing a tech that 1% of gamers will use without buckets of money from NV. And NV can do that indefinitely, right?

Gets linked a list of devs supporting the features and goes on to rant uselessly about consoles and no devs supporting NVIDIA. SMH.

P.S. The console argument has been shown, time and again, to hold no correlation with the PC. People thought AMD's win with the Xbox and PS4 would shift everything in its favor, from the Mantle hype to DX12, and none of that materialized.
 
I asked someone who works in RTG why I (not anyone else, but me personally) should get a Radeon VII instead of a GeForce RTX 2080.

He couldn't give me an answer.

...must be depressing
 
That seems to be the patently false line parroted by a lot of types lately on these forums. Especially the noobie suspect posters.

Suspect?

Why so paranoid?

I also don't think it's patently false at all, as that would mean clearly and obviously false. And please note that I said "probably".

A
 
They're far from useless; they are game-changing and bring something revolutionary to the market. I'm not a huge fan of ray tracing, but DLSS will be huge when it gets going: https://www.kitguru.net/components/...rt-nvidias-ray-tracing-and-dlss-rtx-features/ The games will come and the hardware is there. What does AMD have for the same price? 2017 hardware that's pushed to the limits, with nothing new to offer.

It's a doorstop until you can actually use it and see what DLSS can do. You've been showing me the same list since the cards launched, and still no DLSS, which is funny since Battlefield V is out. You take Nvidia at its word that it will be great; if it were so great, I think a few developers would have used it by now in current games.
 
from the Mantle hype to DX12, and none of that materialized.
Well, to be fair, Mantle eventually became Vulkan, and the performance improvements have been proven with DOOM and Wolfenstein II.

It's just sad that most other studios have not invested in the technology and/or are not as skilled as id Software at utilizing it.
 
Unfortunately, this card won't be able to run real-time ray tracing at 1080p... THEN AGAIN, NEITHER CAN THE 2080!!! ZIIING
I for one am interested in this card.

I also love how all the NV fangirls are calling Freesync "Adaptive Sync". They fail to realize G-Sync lost to Freesync.
AMD was calling Adaptive Sync "Freesync" and monitor manufacturers followed.
Now NV isn't really calling Adaptive Sync "Adaptive Sync" either, but "G-Sync Compatible" or something like that. Let's see how monitor manufacturers react to the GPU market leader... will they start sticking a "G-Sync" logo on their monitor boxes and brochures or not? ;)

BTW, does this new card have HDMI 2.1? If yes, then it's the one single reason I can find to recommend the Radeon VII over the RTX 2080.
 
AMD was calling Adaptive Sync "Freesync" and monitor manufacturers followed.
Now NV isn't really calling Adaptive Sync "Adaptive Sync" either, but "G-Sync Compatible" or something like that. Let's see how monitor manufacturers react to the GPU market leader... will they start sticking a "G-Sync" logo on their monitor boxes and brochures or not? ;)

BTW, does this new card have HDMI 2.1? If yes, then it's the one single reason I can find to recommend the Radeon VII over the RTX 2080.

It's just a sticker on a box/webpage/product description. I doubt they will make much hay over it.
 
It's a doorstop until you can actually use it and see what DLSS can do. You've been showing me the same list since the cards launched, and still no DLSS, which is funny since Battlefield V is out. You take Nvidia at its word that it will be great; if it were so great, I think a few developers would have used it by now in current games.

You act as if game devs can shit out features in a few days. Talk about damage control.
 
Which games need more than 8GB of VRAM today?
[Apologies if anyone else mentioned this, but I've yet to get through the other 3 pages, and it's 1am already]
Fallout 76 does. That's why the increase in performance in that game is 68% @ 4K! lol

I'm serious, too :\ My R9 390 8GB, at NOT EVEN maxed settings @ 1080p, ends up exceeding the card's 8GB of VRAM and swapping to system RAM :( I've been fighting to figure out a setting to address this, but it seems to be something that changed in a patch sometime in December.
(Note: I'm basing this on GPU-Z and the fact that I get heavy performance drops at times, where I generally run >65 FPS. GPU-Z shows 7844MB max used for Dedicated, and then 200-800MB max for Dynamic. I'm assuming that implies it's allocating to my system RAM, and it's in those instances that my framerate tanks to the mid-20s. That being said, I'm not saying >8GB is needed, just that in this game the optimization is so shitty that in this case it kind of is... heh)
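For anyone who wants to cross-check what GPU-Z is reporting, below is a minimal sketch (not the poster's actual tooling, and the local/non-local split only roughly corresponds to GPU-Z's "Dedicated"/"Dynamic" counters) of querying per-adapter video memory usage through DXGI on Windows 10:

[CODE]
// Minimal DXGI video-memory query sketch (Windows 10+, MSVC).
// Prints current usage vs. OS-granted budget for on-card VRAM ("local")
// and for the system RAM the driver can spill into ("non-local").
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) continue;  // adapter doesn't support this query

        DXGI_QUERY_VIDEO_MEMORY_INFO local{}, nonLocal{};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

        std::printf("Adapter %u: local %llu / %llu MiB, non-local %llu / %llu MiB\n", i,
                    (unsigned long long)(local.CurrentUsage >> 20),
                    (unsigned long long)(local.Budget >> 20),
                    (unsigned long long)(nonLocal.CurrentUsage >> 20),
                    (unsigned long long)(nonLocal.Budget >> 20));
    }
    return 0;
}
[/CODE]

If the non-local usage climbs at the same moments the framerate tanks, that would back up the "spilling into system RAM" theory.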
 
You act as if game devs can shit out features in a few days. Talk about damage control.

You think a tech NV has been babying for 10 years wasn't shopped around to devs? They likely got a "not interested", which caused NV to come back with a bucket of money that EA (BFV) and Square Enix (FFXV) took to patch the features into their games after launch. NV got ahead of themselves; they should have waited until the ground was laid with DXR shipping in Microsoft's Windows 1809 build, waited until the entire ecosystem was ready for it, and then launched a 7nm RTX.
 
You act as if game devs can shit out features in a few days. Talk about damage control.


Damage control? I've never seen Nvidia repeat a whole damn presentation; that seemed more like damage control to me. You do realize Nvidia does the heavy lifting on DLSS, right? Not game developers. Game developers implement RTX. Now here is another thing: I can bet that if AMD had DLSS and it had no games after 5 months, you would be throwing a fit. Proprietary crap is hit and miss. DLSS requires way too much work. AI is nice, but Nvidia could have slapped in more CUDA cores instead of tensor cores and they wouldn't have to upscale. Or they should have dedicated more cores to RT; that would have made much more sense. Swap the tensor cores used for DLSS for actual CUDA cores or more RT cores and you give users a much better experience. I'd rather take a more powerful card out of the box than cores and die space dedicated to upscaling.
 