Guys, let's support AMD GPUs and boycott Nvidia, Asus, Gigabyte, and MSI

I'm all for supporting AMD. Thing is, these retailers are gouging the crap out of us. $500 for an RX 580? Fuck you.

The ole boycott list is getting long.
 
Says the people griping about Nvidia not releasing new cards.
Where have I complained about Nvidia not releasing new cards? My 1080 Ti is still tearing through anything I throw at it without even working up a sweat.
 
If by "Optimized" they mean "not hobbled by Gameworks" I will completely agree :)

lol, your pro-AMD, anti-Nvidia bias is pouring from your veins and it's getting out of control. Do you know that Far Cry titles have always been something of a "neutral" series, always trying to use the best technology available? Far Cry 4, even with GameWorks, heavily favored AMD GPUs, with an R9 280X/HD 7970 outperforming the GTX 780 and coming close to GTX Titan levels. Crazy, right? Some gimping. And that game used a LOT of GameWorks features: HairWorks, HBAO+, PCSS shadows, TXAA, and NV God Rays. Guess what? It played wonderfully on AMD cards, even with "GimpWorks," as you put it in your edit ;). Far Cry Primal didn't use many GameWorks features, and guess what? It ran equally well, not better, not worse. That's exactly what happens when a developer takes care to optimize a game properly, regardless of which optional libraries (GameWorks DLLs) it uses. Most games where GameWorks is used and doesn't favor AMD come down to developer laziness, since the libraries are of course pre-optimized for Nvidia's tessellation. Far Cry 3 also used top-end Nvidia (pre-GameWorks) features, and guess what? It ran beautifully on both AMD and Nvidia GPUs. Far Cry 2 wasn't much different, with the HD 4870 performing close to the GTX 280.

The Witcher 3 is another game using GameWorks that doesn't favor one vendor over the other. Are you going to cry about that one too? It's sad to see a [H] staff member talking shit about things he doesn't understand just because they don't favor his preferred GPU vendor. Go keep posting biased news and poorly made cooling reviews, since that's the best you can do for [H]OCP, and stop making embarrassing statements.
 
Seems like you're butt-hurt about him having his own opinion of a company. Also, HairWorks did nothing good for The Witcher 3 other than tanking performance; I have a 1080 now instead of a 290X and I still leave it off. If you're tired of the Nvidia bad press, then maybe Nvidia should stop acting like an asshole and actually engage the press instead of telling everyone to mind their own business. Between the math-error issues with their Tesla GPUs, the possible issues with their automated-car tech, and the GPP thing, their plate is a bit full, and they still won't talk about anything other than "the more you buy, the cheaper it is." Yep, sounds like they have gamers in mind. I have owned cards from both companies, and right now Nvidia has earned every bit of its recent bad press. Last I checked, a staff member is allowed to have opinions.
 
I was going to say something, but this is a much better response, especially in my sleep-deprived and head-cold state of mind. :)
 
Push this stuff all you want; GimpWorks was a real thing and did real damage, at least to the AMD camp. Oh well, I'm not going to buy a GimpWorks card for my personal builds anyway. Good for you. :)
 
Now you've done it; they will tell you how that game was optimized for AMD and thus not representative of what you'll get in other games.

And that would be true. Not many games utilize AMD's Rapid Packed Math and shader intrinsics unless AMD heavily sponsors the title.
 
Nah, it's just a non-lazy game developer not caving to Nvidiot pressure, that's all.
 
So AMD didn't heavily sponsor Far Cry 5? How many other games without AMD money support Rapid Packed Math and shader intrinsics? I bet you could count all of them on one hand. Even with all this Vega optimization, it only brings Vega up to parity with the two-year-old GTX 1080. Pretty poor showing for AMD in general, but that's typical for them.
 
Vega 56 is ahead of the 1080, and the 64 is 6 frames off the Titan X at 1080p. At 1440p the 56 is still ahead of the 1080, and the 64 and the Titan are within 1 fps of each other. At 4K the 56 is 1 frame slower than the 1080 and the 64 is 1 frame slower than the Titan. It's simply a matter of using all the power of the card instead of just running on the card. Also, please show me where AMD cut them some huge check? More likely AMD reached out to them, since their games come out on consoles and need all the performance they can get. The plus here is that it runs fine on Nvidia hardware, unlike what GameWorks did as a whole. I will fully support any company working with a developer to improve the experience on its hardware as long as it doesn't sabotage performance on my machine running different hardware.
 
It's misleading. The graph in the article cuts off at Vega to make it look like all of AMD is doing well, but in reality, Ubisoft used some Vega-specific features to make it run better. The tech being a year and a half+ newer than everything else on the market should give it the edge, frankly. It's just been such a disappointment in every other aspect that we actually forget it can be competitive when the stars align.

The game supports several advanced Radeon RX Vega-specific features such as Rapid Packed Math, Shader Intrinsics

... Which is great (for Vega) and if EVERY game used this, the high-end landscape would look a lot different. It's a shame nobody except a handful of miners actually bought the GPUs.
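
For anyone wondering what Rapid Packed Math actually buys you: it is, roughly, packing two FP16 values into each 32-bit register so one instruction does two operations, doubling throughput for shader work that tolerates half precision. Below is a minimal sketch of the same packed-FP16 idea, using CUDA's half2 intrinsics as a stand-in; this is an illustration only, not Ubisoft's or AMD's actual shader code.

```cuda
// Minimal illustration of packed FP16 math (the idea behind Rapid Packed Math):
// two half-precision values ride in one 32-bit register, so a single fused
// multiply-add processes both at once. Requires a GPU with sm_53 or newer.
#include <cuda_fp16.h>

__global__ void scale_bias_fp16x2(const __half2* in, __half2* out,
                                  __half2 scale, __half2 bias, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        // One __hfma2 = two multiplies + two adds, versus one of each in FP32.
        out[i] = __hfma2(in[i], scale, bias);
    }
}
```

The catch, as noted above, is that a game has to ship FP16 shader paths for any of this to matter, which so far mostly happens in AMD-sponsored titles.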

The rest of the AMD line-up does about average: FineWine poster children 390 & 390X are getting trounced by 970 & 980 -- strangely, nobody mentions that. Fury X loses to 980 Ti. 580 beats the 1060 6GB but the 480 isn't tested. With regards to "AMD Optimized" the results aren't notably impressive for anything other than Vega.

Also, to my memory, AMD has won every Far Cry benchmark since 3. Or at least performed 'better than usual'. Perhaps engine specific (Dunia engine?) or some other graphical tech the Far Cry devs like to use in their games. This may actually be the worst Far Cry result for AMD since 3.

Plus, it doesn't look like TechSpot used the GameReady driver (unless they updated the results at some point).

edit:
See: Far Cry 4. 290X beating 780 Ti, 290 beating 970, HD7970 beating 780, HD7870 beating 680. AMD basically jumped up 2 tiers on this one. The R9 290/390 and 970 have flipped positions.

 
If by "Optimized" they mean "not hobbled by Gameworks" I will completely agree :)

Now you did it, you blew the horn and lit the signal in the sky that brought out the NDF! LOL! Ah well, it's a shame, but hey, it will never change... :)
 
Not only did TechSpot not use the latest driver (I don't think it makes a meaningful performance difference), but ComputerBase and PCGamesHardware, who do NOT use the built-in benchmark and instead test performance with a custom gameplay loop, found results that did not look the same at all.


1440p btw



Having said that, as TaintedSquirrel said, this is an AMD-aligned game with AMD-centric features included. Can you guarantee AMD has the traction to ensure these features continue to be supported in every game? No.

ManofGod, the Holocaust was a real thing and did real damage, and there's evidence for that. GameWorks doing damage? Evidence?

Don't like HairWorks? Disable it. I don't see the issue.

"Let's simulate thousands of strands of hair and fur"

Ok.

"It runs like shit"

Ok, turn it off

"It runs worse on AMD"

Wait, you're saying the GPUs with a fraction of the front-end throughput of NV GPUs are struggling with geometry-heavy effects? The fuck?

I genuinely don't understand this. AMD boasts 13 TFLOPS on a card that barely performs like a 9-TFLOP GTX 1080. They try to introduce a bunch of hacks and workarounds to use that huge shader throughput to accelerate the functions that actually lack performance relative to NV, and they fail because nobody will take the time to develop around those constraints and use their libraries to offset the geometry problems, which is reasonable given their market share. But if NV dares promote libraries that capitalize on its huge advantage in this area, it's heresy, and whiny kids on forums with AMD cards feel the need to shit on the whole library because they can't run it at acceptable framerates, therefore it must be bad.
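
To put rough numbers on that front-end argument (these figures are entirely hypothetical, just to show the shape of the problem): a heavily tessellated hair or fur effect is limited by how many triangles the fixed-function front end can set up per second, not by shader TFLOPS, so a GPU with roughly half the setup rate spends roughly twice the share of every frame on the same effect. A back-of-envelope sketch, plain host-side arithmetic with made-up round numbers:

```cuda
// Back-of-envelope sketch with made-up round numbers: why a tessellation-heavy
// effect like simulated hair stresses the geometry front end rather than FLOPS.
#include <cstdio>

int main()
{
    // Hypothetical hair effect: 100k guide strands, tessellated ~64x into
    // segments, each segment expanded to ~2 triangles.
    const double strands        = 100e3;
    const double tess_factor    = 64.0;
    const double tris_per_seg   = 2.0;
    const double tris_per_frame = strands * tess_factor * tris_per_seg; // ~12.8M

    const double fps          = 60.0;
    const double tris_per_sec = tris_per_frame * fps;                   // ~768M/s

    // Hypothetical peak setup rates: GPU A does 6 triangles/clock at 1.5 GHz,
    // GPU B does 4 triangles/clock at 1.2 GHz. Real sustained rates are lower;
    // the ratio is what matters.
    const double setup_a = 6.0 * 1.5e9;   // 9.0G tris/s
    const double setup_b = 4.0 * 1.2e9;   // 4.8G tris/s

    printf("hair triangles needed: %.0fM per second\n", tris_per_sec / 1e6);
    printf("share of each frame spent on setup, GPU A: %4.1f%%\n",
           100.0 * tris_per_sec / setup_a);
    printf("share of each frame spent on setup, GPU B: %4.1f%%\n",
           100.0 * tris_per_sec / setup_b);
    return 0;
}
```

With those made-up numbers, the effect eats about 8% of frame time on the stronger front end and about 16% on the weaker one before a single pixel is shaded, which is why turning an effect like HairWorks off recovers so much more on the geometry-weak card.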



I am CERTAIN that if the tables were turned and it were in fact an NV-sponsored game whose built-in benchmark produced far more positive results than actual gameplay, there would have been another LET'S BOYCOTT NVIDIA thread, rofl.
 
Some of you really need to put the "AMD can't compete in performance" koolaid down.


https://www.hardocp.com/news/2018/03/27/far_cry_5_benchmarked_on_50_gpus


Umm, based on their history, Far Cry titles are not the best way to find out which cards perform better than others across the whole spectrum. They have a history of using tech that gives AMD cards a huge boost in performance, which misleads AMD card owners once they step out of the realm of those games. Meanwhile, games that offer no advantage to either brand typically show Nvidia murdering AMD across the board.

Sorry, but AMD has NO answer for 4K@60 at high/ultra, and they won't for a long while. AMD simply doesn't have the technical/engineering know-how to keep up with NV and relies very heavily on pity from AMD buyers. Buyers who think that options that can be completely disabled (HairWorks, for example) are somehow designed to intentionally gimp AMD in games (without any proof, of course).

AMD's problems with not being able to keep up, whether from shitty drivers, drivers not ready for big game releases, or a weak architecture, all of which lead to poor performance, are an AMD problem, not Nvidia's.

This would be like going to a racetrack with a Dodge Challenger and a Mustang, and when the Mustang won (because it's a better track car than the Challenger), Dodge fanboys saying that Ford sabotaged the Dodge. (I drive a Challenger, btw, lol.)

AMD made their cards; Nvidia made theirs better and more powerful. When people don't buy the best on the market, they aren't going to get best-on-the-market performance.
 
If by "Optimized" they mean "not hobbled by Gameworks" I will completely agree :)

I'd like to ask honestly: can GameWorks not be disabled? And if it is disabled, does it still hamper AMD cards?

I don't have an AMD card; I haven't since the original Eyefinity Super card (I loved that card and setup).
 
I'm not really a branding guy. It has very little sway in what I purchase. This is probably true for us older guys that know it's all marketing and bullshit to start with.

The real branding behind a video card is what's under the hood.

I can't even tell you what MSI's branding is. I just recently owned, I guess, their GTX 1080 Ti Gaming X... is "Gaming X" the brand name on that?

Asus Strix is mid-tier branding so I've NEVER been excited about that.

I will say, I do prefer to own "ROG" products ... mostly the "Hero" branded motherboard line.

The truth is, it's all BS. There is nothing stopping these guys from introducing new branding.

I mean, who really fcking cares?

None of this branding shit matters at the end of the day.
 
It's funny that you say you have no interest in the ASUS STRIX brand as it's "mid-tier" branding and yet you like the "Hero" branded stuff which is the bottom of the barrel in the standard ROG family. Hero stuff is stripped down compared to the Formula or Extreme offerings.
 
Why is that funny?

Hero->Strix

ROG because the warranty is handled better / differently than Strix.

Also, I resell shit. Kids it up ROG.

But I always buy my Motherboards from Microcenter so I can return the board when there are issues.
 
AMD has nothing great in VR. That's my main issue with the GPU side.
 
Why is that funny?

Hero->Strix

That's debatable. The differences are minimal at best. ROG STRIX and Hero motherboards are remarkably similar and certainly don't match up to proper ROG motherboards with Formula or Extreme monikers associated with them.

ROG because the warranty is handled better / differently than Strix.

Maybe. I haven't experienced ASUS customer service in years.

Also, I resell shit. Kids it up ROG.

But I always buy my Motherboards from Microcenter so I can return the board when there are issues.

You do realize that it says both ROG and STRIX on the same boxes right?
 
Indeed, my lowly 1070 is branded as both ROG and STRIX. Here's the marketing blurb from my graphics card's home page:

ROG Strix is the newest recruit into the Republic of Gamers. A series of specialized gaming gear designed for the rebel in all of us, Strix exemplifies ROG's premier performance, innovative technology, and leading quality, but with its own confident and dynamic attitude. Featuring bold designs and bright colors, this exciting new series possesses a spirit of fierce individualism that charges every gaming experience with thrilling energy. ROG Strix equips players with the necessary speed and agility to dominate their game. A new generation of force has arrived. Join the Republic and experience the power of ROG Strix.

Source link: https://www.asus.com/us/Graphics-Cards/ROG-STRIX-GTX1070-O8G-GAMING/
 
Strix is the top-end for ASUS video cards, Maximus for motherboards. All are part of ROG.
Why are there no Maximus video cards? Who knows.
 
Negative. Maximus is the series, not the level of the motherboard. For example, there is a Maximus VI Gene, Extreme, Impact, and Formula. "Maximus" indicates that it's a mainstream part; Rampage motherboards are HEDT. You can tell what level they are by the last part of the name, not the first. A Rampage V Extreme indicates that it's probably top of the stack, although Formula and now APEX motherboards are roughly equal to the Extreme. They have different feature sets which emphasize performance and a less-is-more approach; they aren't stripped the way Hero or Ranger boards are. What makes it even harder to follow is that Extreme, APEX, or even Formula aren't always the top offering either. Sometimes you have Black Editions.

Budget Oriented ROG boards:
Hero
Ranger

M-ATX
Gene

Mini-ITX
Impact

Performance
Formula
APEX

Shit ton of Features and Performance
Extreme

*Special
Black Edition (These are HEDT only and aren't available in all generations. There is no Rampage VI Black Edition, but there is a Rampage III and IV Black Edition.)
Edition 10 (This probably won't ever happen again, as it was a 10th-anniversary board. We may one day see an Edition 20 or something if ROG sticks around that long.)

Maximus or Rampage would indicate family, not product level. Here are examples:

Maximus
Rampage
Crosshair
Zenith

There are other families, but those are the main ones in the motherboard line. It breaks down like this:

[Family Name] + [Iteration] + [Level or Type]

Therefore we get something like this: Rampage III Black Edition. Rampage tells us that it's a family. Specifically, an HEDT motherboard as all Rampage motherboards have been HEDT offerings. The "III" indicates that it's the third motherboard generation from that family. Lastly, "Black Edition" indicates that it's a premium part that's above all other models. Usually Black Editions are late in a chipset life cycle and are the culmination of everything ASUS learned in that processor and chipset generation. The Rampage III Black Edition is the finest X58 chipset based motherboard ever produced.

STRIX family boards are part of ROG, but they fall close to the bottom of the pyramid with overlap. These are more or less along the lines of the Hero boards but with extra bling. The naming convention is much easier to understand as it's a blend of ROG and channel boards. ROG STRIX + [Chipset] + [Suffix]. So a ROG STRIX Z370-I would be a Z370 Express based mini-ITX motherboard. STRIX for motherboards isn't the top of the stack. Hero and Ranger boards are sure as shit cut way back from their Formula, APEX and Extreme counterparts. They aren't even close to the top of the ROG product stack.

(Attached: ASUS ROG motherboard range slide, 2017 revision)


When you look at ASUS' own slide for the ROG family, it appears that ROG STRIX falls underneath all ROG motherboards, and that's basically true. However, some motherboards in the STRIX line aren't functionally that far off their Hero counterparts "above" them. I'm not sure what the fuck ASUS was thinking when they made "STRIX" the top end for graphics cards and made it mid-range for motherboards.
 
If you don't know WTF they were thinking with Strix... trust me, I don't know what the heck they did to my beloved TUF Sabertooth series, sending it down to the bottom for "first-time builders/entry gamers." TUF was always my favorite motherboard line until Z170; the Z77 Sabertooth was probably the best mainstream motherboard I've ever bought.
 
I have done all the support I can: I own an R9 380 and a Vega 56, my CPUs and mainboards are Ryzen-based, and even my little computer has an Athlon 5350 that just keeps chugging along. :) I am not going to CrossFire because most games don't support mGPU anymore. :( And there's nowhere to buy a second reference Vega 56 at a good price either.
 
Nah, it's just a non-lazy game developer not caving to Nvidiot pressure, that's all.

Damn, I might as well be a prophet, LOL! They are now stepping all over themselves to try to one up each other. Now we are talking about motherboards when Intel was not even mentioned.
 
You're right, I forgot about Rampage.


I should be more specific (ROG non-Strix): there are tons of Strix Z370 boards, for example, that are sub-$200. The Hero is around $250.

https://www.newegg.com/Product/Product.aspx?item=N82E16813119035

ASUS' confusing schemes... They use the same name label for two different tiers.

ASUS' current naming scheme is confusing. Whenever they throw a new name at me I'm always like "WTF? I just learned the last scheme."

If you don't know WTF they were thinking with Strix... trust me, I don't know what the heck they did to my beloved TUF Sabertooth series, sending it down to the bottom for "first-time builders/entry gamers." TUF was always my favorite motherboard line until Z170; the Z77 Sabertooth was probably the best mainstream motherboard I've ever bought.

I know. I hadn't reviewed any TUF boards since that happened. However, I installed one for my girlfriend's father and I couldn't believe the crappy looking budget board that sat in front of me carried the name "Sabertooth." Those motherboards used to be every bit as good as anything in ROG, they just had a different configuration and appealed to a slightly different market.

Asian company, bad English. Maybe it makes more sense in Chinese, or whatever they speak in Taiwan.

Global marketing definitely comes into play. I can't recall which name it was, but there was some product Intel skipped a sequence on a few years back as it translated to something sexual in one of the Asian languages.
 
Damn, I might as well be a prophet, LOL! They are now stepping all over themselves to try to one up each other. Now we are talking about motherboards when Intel was not even mentioned.

Neither was Ryzen, yet here you are...
 
This is stupid.

Buy the GPU that does better in the applications you're running. Don't reward failure.

Fanboys and their bullshit... "Wait, wait, this time AMD will be competitive..."

The last AMD/ATI GPU I owned was a 290X. All I've heard for years has been whining and "Wait, wait, Polaris will be awesome. It'll overclock so high..."

"Wait, wait. We didn't mean Polaris. Vega! Vega is gonna be awesome..."

"Navi... Navi is gonna be the shit..."

I game at 4k. I am a heavy user of VR. I like not needing a window AC unit in my office. AMD will get my money once they've earned it.

Till then, I'll keep buying Nvidia.
 