Did the crazy, out with Titan Xp, in with Vega 64 Liquid...

nEo717

Pre-GPP I was a solid Nvidia fanboy, I guess. Post-GPP, I did the crazy and pulled my Titan Xp (AIO) for the RX Vega 64 Liquid...

Currently (on this PC) I mostly play COD WWII, BF1 Alpha and CTE, and a couple other games to come soon. And I have to say, after seeing so many posts and reviews hammer on the Vega 64 for being slow, not really a gaming card, etc., I expected to have to put up with some lag and low frame rates, but that has not proven to be the case.

It's not only fast, it's smooth. I do feel the images are definitely different from what I'm used to with my Titan Xp. In my opinion (old eyes), colors seem to have more depth (LG 850 G-Sync 165Hz monitor, 2560x1440). The AMD Radeon Settings panel seems a bit foreign to me, though it's growing on me, and I don't miss G-Sync (this monitor with the Vega 64 is working great at 165Hz).

Gameplay: once multiplayer maps load (WWII), it's almost always hitting 167FPS (I have it capped at 165). What I've noticed is that this Vega 64 seems to be getting better minimum frame rates than my Titan Xp, though the Titan Xp (when I turn frame limits off) will pump out 3 to 5 higher peak frames here and there. I've found this very interesting; in my mind it opens up the debate of whether, for an FPS game, minimum frames or average frames are more desirable...
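For anyone who wants to put numbers on that min-vs-average debate, here's a minimal Python sketch of how the usual "1% low" metric is computed from frame times. The numbers below are made-up examples, not the OP's data; capture tools like OCAT or FRAPS export frame-time logs you could feed in instead.

```python
# Minimal sketch: average FPS vs. "1% low" FPS from frame times (ms).
def fps_stats(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000.0 / (sum(frametimes_ms) / n)
    # 1% low: average FPS over the slowest 1% of frames.
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct_fps

# Example: mostly ~6ms frames (~166FPS) with a few 12ms hitches.
times = [6.0] * 990 + [12.0] * 10
avg, low = fps_stats(times)
print(f"avg: {avg:.0f} FPS, 1% low: {low:.0f} FPS")  # avg ~165, 1% low ~83
```

Two cards with the same average can feel very different if one has worse lows, which is exactly the trade the OP is describing.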

I've just figured out how to get to (and found) Global WattMan, lol. I don't care about noise, so I pumped the fans up to a 2800 RPM minimum with a 3300 target.

Power limit: it's at 25%, and it looks like I can go up to 50% if I turn custom on... I don't care about using less power, but is bumping the power limit up a good or bad idea?

I also notice the max memory frequency is currently at 965MHz. On the Liquid Vega 64, how far up can one safely bump this (what's a good baseline), or should I just leave it alone?
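WattMan is the tool for this on Windows, but for reference, the same power-limit knob exists on Linux through the amdgpu driver's sysfs interface. A minimal sketch, assuming a card0/hwmon0 path (yours may differ) and root privileges:

```python
# Minimal sketch, Linux/amdgpu: read and raise the GPU power cap.
# The card0/hwmon0 indices are assumptions - check your own /sys tree.
from pathlib import Path

HWMON = Path("/sys/class/drm/card0/device/hwmon/hwmon0")

def get_power_cap_watts():
    # hwmon reports power values in microwatts.
    return int((HWMON / "power1_cap").read_text()) / 1_000_000

def set_power_cap_watts(watts):
    board_max = int((HWMON / "power1_cap_max").read_text()) / 1_000_000
    if watts > board_max:
        raise ValueError(f"{watts}W exceeds the board limit of {board_max}W")
    (HWMON / "power1_cap").write_text(str(int(watts * 1_000_000)))

print(f"current cap: {get_power_cap_watts():.0f}W")
```

As for whether to bump it: the driver won't let you past the board's own limit, and the usual understanding is that raising it mostly lets the card hold its boost clocks longer, at the cost of more heat and power - which the Liquid card's AIO has the headroom for.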
 
I'm sorry for your loss.

As noted above, it's actually not a loss... I'm very impressed, actually (for the limited world I play in). I have no doubt there are many things (that I don't do) where the Titan Xp would be missed, but so far I'm pleased (though it's only been a few days)...

I'm guessing by the looks of it most everyone here bleeds Nvidia (I was one of you). For any AMD fans, I'd love to hear thoughts on the power and memory settings in the opening post.
 
Sounds like you didn't need the Titan in the first place.

The 1080 before it hardly ever topped the frame rates I was looking for, and the Titan Xp has its hands full staying above 165 frames in most of the games I play. Hence my surprise: with all the talk about how bad AMD is, I expected the worst, and to be frank even many of the reviews barely put the Vega 64 above the 1080, which fell short for me. Maybe I have a lucky Liquid Vega 64... but it's actually OK.
 

I was just ribbing you a bit.

You've already bought deeply into Nvidia's world by buying both a Titan and a G-Sync screen. Your 'moral' decision does nothing to their bottom line, and it cost you a stack of cash, a little top-end performance, and VRR.

And yeah, I really don't give a shit who made the GPU I play on. All that matters is that the drivers don't crash and the frames come smoothly.
 
There is one note from nEo717 that I somewhat experienced going from Nvidia to AMD and back to Nvidia: the colors seemed different in a good way on the AMD cards I had, then changed slightly in a not-so-good way when going back to Nvidia. Not sure how to explain it, but I noticed a difference in colors. This is without any sort of color calibration or anything of the sort, more of an out-of-box thing, and I've just left it that way over the years. Otherwise, I'd never go from a Titan Xp to a Vega 64, but that's just me and my two cents...
 


You nailed it... It's been so long since I've had ATI in my main system, and with the GPP crap I figured why not play around. I had a massive widow-maker heart attack, and post-recovery, combined with getting a little older, I look for all the smooth frame rates I can get...
 


I'm seeing what you mean... I'm so used to Nvidia now, though; I've logged hours and hours with every Titan so far. The Vega 64's color clarity or something of the like is different, and I think my old eyes may favor the Vega as well.
 
If it's a placebo, a lot of us are seeing it. There is an ever so slight saturation difference. On my NV systems I usually end up setting the Digital Vibrance to 55% or 60% to get the same pop I see on AMD cards.

I have no clue if it's accurate. In fact, it probably isn't. This is far, far different from the old days, when the analog circuitry on ATI and Matrox cards was obviously superior to Nvidia and 3dfx.
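For what it's worth, a "vibrance" control is essentially a saturation-style boost. NVIDIA hasn't published the exact Digital Vibrance math, so this is only a toy Python sketch of the general idea: pushing each channel away from the pixel's gray value.

```python
# Toy sketch of a vibrance/saturation-style boost on one RGB pixel.
# Illustrative only - not NVIDIA's actual Digital Vibrance algorithm.
def boost_saturation(rgb, amount=1.2):
    r, g, b = rgb
    # Rec. 709 luma approximation serves as the "gray" anchor.
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    clamp = lambda v: max(0, min(255, round(v)))
    return tuple(clamp(luma + (c - luma) * amount) for c in (r, g, b))

print(boost_saturation((180, 120, 60)))   # (190, 118, 46): channels spread
print(boost_saturation((128, 128, 128)))  # (128, 128, 128): gray untouched
```

That also squares with the calibration comments below: if one vendor's defaults sit a touch more saturated, a calibrated profile pulls both toward the same target and the difference disappears.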
 

With default settings, i.e. before calibration, AMD/RTG cards are more, I think the most accurate term would be, vibrant. Once calibrated, this difference goes away. Even nudging up the brightness is usually enough to negate the perceived difference.
 
Calibrate your display. I have both AMD and nVidia cards in my workstations for video work (I'm a video engineer) and I can tell you factually that when they're calibrated they display the same.

Pre-Paxwell, Nvidia were cunts and didn't give their cards 10-bit functionality outside of DirectX windows. That's why AMD cards always looked a bit better: they gave you 10-bit on the desktop and in games. For those that can take advantage of it, you never want to go back to banded 8-bit bullshit once you've seen the difference. More monitors have 10-bit LUTs now, so it really makes sense to do this.
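The banding claim is simple math. A minimal sketch of why, counting the distinct steps that survive quantizing the same gray ramp at each bit depth:

```python
# Why 10-bit tames gradient banding: quantize a 0..1 gray ramp
# at 8 and 10 bits per channel and count the surviving steps.
def distinct_steps(bits, samples=4096):
    levels = (1 << bits) - 1  # 255 for 8-bit, 1023 for 10-bit
    return len({round(i / (samples - 1) * levels) for i in range(samples)})

print(distinct_steps(8))   # 256 levels
print(distinct_steps(10))  # 1024 levels - steps four times finer
```

On a 1440-row gradient, 256 levels means each band is five or six rows tall and plainly visible; at 1024 levels the steps shrink toward invisibility.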

The negative salt from the usual players in this thread was quite hilariously predictable. How dare you do such a thing! It's great to play around with different hardware sometimes...
 
Congrats on your purchase. Wish I could swing a V64; I'll probably end up with a V56.

As far as tuning core speed, I think the trick is to up the power limit and leave the clocks alone. A lot of Vega overclocking comes down to maxing the power limit, undervolting, and then raising clocks.
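That recipe lives in WattMan on Windows; as an aside, the same idea can be scripted on Linux through amdgpu's overdrive file. A rough sketch, assuming overdrive is enabled via amdgpu.ppfeaturemask and a card0 path - and note the clock/voltage values below are examples, not recommendations, since every chip undervolts differently:

```python
# Rough sketch, Linux/amdgpu with overdrive enabled.
# Vega's pp_od_clk_voltage accepts "s <state> <MHz> <mV>" for core
# P-states, "m <state> <MHz> <mV>" for memory states, "c" to commit,
# and "r" to reset to defaults.
OD_PATH = "/sys/class/drm/card0/device/pp_od_clk_voltage"

def apply(cmd):
    with open(OD_PATH, "w") as f:  # needs root
        f.write(cmd + "\n")

apply("s 7 1630 1050")  # top core state: 1630MHz at 1050mV (example values)
apply("m 3 945 1050")   # top HBM2 state: stock 945MHz (example values)
apply("c")              # commit the table
```

The logic of the recipe: Vega ships power-limited, so shaving voltage frees up power budget, and raising the limit lets the card actually sit in its top states instead of throttling.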
 
Why would you use an AMD card with a new G-Sync display? You paid $200 for a VRR module you can't even use. Makes zero sense, but whatever.
For the games they play, they get over the monitor's frame cap, so it doesn't matter.
They can always pick up a cheap VRR TV in the coming year and keep the G-Sync screen in case they get another Nvidia card in the future...
 
Sorry, but this is not 'crazy'. 'Crazy' implies some kind of intelligent plan, however crazy, to achieve a desired objective. :woot:

All that's happened here is, well, there is no word for it, because there exists no sane context from which to judge it. When the only positive thing you can say is, "This self-inflicted downgrade is not as bad as I thought it might be," well, that's just pure madness! :wacky:
 
"This self inflicted downgrade is not as bad as I thought it might be"

0AXlBTh.gif
 
I get wanting to play with AMD (I have a secondary system with a FreeSync display myself), but I wouldn't gimp my main system to do it.

:ROFLMAO:


[And yes, as games' performance demands increase, the Vega will show its limitations before the Titan Xp would have, while G-Sync is still useful at 165Hz.]
 
My friend has a Vega 64 Sapphire Nitro (the best Vega 64 on the market) that isn't cutting it with his 38" FreeSync monitor at 3840x1440. He gets 40fps in Vermintide 2 at max settings. He recently bought a 1080 Ti to try. Had your paths crossed, I should have pointed him your direction so you could have swapped cards. You gave up AT LEAST 30-35% performance and picked up the occasional driver bug.

Bad move IMO, since you already had the best hardware on the market, and as the other poster said, Nvidia already had your money. A next-gen swap to AMD over your frustration with GPP I'd understand. A same-gen downgrade, I can't.

I've used both over the last few months:

Vega 64 on HP 32" 1440p 75Hz Omen monitors with FreeSync, and now the 1080 Ti with a 34" Alienware 3440x1440 at 120Hz with G-Sync.

I will agree that the AMD card, without any other adjustments, has a better look, and I think it's the black levels that are better. That might be the pop you are seeing. Specifically, on my Panasonic AE8000U home theater projector, the AMD 285, Fury X, and Vega cards had better black levels than the Nvidia 670 I had before and the 1080 Ti I have now.

Also I think the 10 bit monitors help.

As to FreeSync vs. G-Sync: in range, it doesn't matter, they are identical, but the G-Sync range is generally significantly larger.
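One concrete reason the range matters: low framerate compensation can only repeat frames to stay inside the VRR window if the panel's max refresh is at least roughly twice its min. A quick Python sketch of that rule of thumb (the example ranges are made up):

```python
# Rule of thumb: LFC needs max_refresh >= ~2x min_refresh so the
# driver can double frames when FPS drops below the VRR floor.
def supports_lfc(min_hz, max_hz):
    return max_hz >= 2 * min_hz

print(supports_lfc(48, 75))   # False: a 48-75Hz panel drops out of VRR
print(supports_lfc(30, 144))  # True: a wide range keeps VRR working
```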

Drivers are better on Nvidia - no question.

Performance of the 1080ti is very significantly better than the Vega 64. Very significantly.
 
I get driver bugs on Nvidia too, so please don't start that driver crap. Performance, sure; drivers, no.
 

I think the 1080 Ti was the clear winner in the higher end gaming category this generation. Though lately I've seen benches that are better than I would have expected out of the Vega 64. But Vega is always behind.

AMD has done some pretty solid work with desktop Ryzen, and Vega is a great mining part, but AMD has dropped the ball on high-end gaming for at least two years now.
 


When’s the last time you used an AMD card?

As I said

My last few gaming cards have been:


< 2015 - all Nvidia, 100%, back to the 3dfx Voodoo days.

2015 - GTX 670, AMD 285, AMD 380 (I turned to AMD for PLP support, which Nvidia didn't and still doesn't support - https://hardforum.com/threads/can-someone-clarify-for-me-which-amd-cards-support-plp-gaming.1861756/)

2016 - AMD Fury X

2017 - AMD Vega 56, AMD Vega 64, and then back to Nvidia with a 1080ti


*If you want to count mining card experience, add just about every card from either vendor since the RX 480 to this list, and the 1060 and up for Nvidia.


I play 20-30 hours of PC games a week and have since grade school. I’m 38 years old. I’ve hosted and organized at least 50 LAN parties.

I'm telling you AMD GPU drivers are more buggy. There is no question. You know the reason I was 100% Nvidia before the AMD 285? Because of all those LAN parties I hosted where the AMD GPU owners were fighting bugs in their drivers. At the worst of it, in the early days, the AMD owners kept older drivers around to reinstall to make certain games work, and would regularly swap drivers multiple times during the LAN party to play each different game. (UGH!!!!)

I suspect I have more experience with recent cards than most people. Feel free to back up your statement with evidence of your experiences.

If Nvidia drivers are a solid "A" for reference, then AMD 285 drivers were mostly good - I'd grade them a "B+". AMD Fury X drivers were a grade "A-" by the time I picked one up in Jan. 2016, and the issues I encountered were minimal, but not absent. Vega 56 and 64 drivers were an absolute "D" at launch: FreeSync didn't work, Crossfire didn't work, Eyefinity bezel correction didn't work, the Chrome browser didn't work, and FPS ranges were terrible (in PUBG, for instance, I'd get 90FPS, then 8FPS in the same match; in Wolfenstein: The New Order a 90-30FPS range was common playing through the game). Overclock settings didn't stick in Wattman, there were Red Screens of Death, and my CPU overclock that had been stable for many months suddenly became unstable. When I went back to Nvidia, I was able to restore my original CPU overclock with not one issue. The Vega drivers for the first few months after launch were nothing short of junk. I'VE NEVER ENCOUNTERED AN NVIDIA DRIVER EVEN CLOSE TO A SUBJECTIVE "D" RATING in 20-some years of using Nvidia.

Subjectively, I'd say it's a 3:1 ratio of AMD bugs to Nvidia driver bugs AFTER AMD drivers are mature. Before they are mature, it's not even worth owning the AMD card. I imagine Vega is approaching a mature driver by now, and most of the bugs are probably defeated, but certainly not all. I was over at my friend's house last night, looking at his Vega 64 and his new 38" monitor. He loaded up Path of Exile: the textures were blurred in town (???), and he got 72FPS in town but 150FPS in the game world even though he set his frame rate target to 72FPS (neither issue happens with Nvidia). He loaded up Vermintide 2 and his mouse cursor wouldn't disappear off the screen (doesn't happen on Nvidia). He loaded up the FreeSync windmill demo; FreeSync didn't work. He tried to set a game profile in the AMD drivers with a target FPS, and it wouldn't stick when he launched the game; only the global config would stick. We tried to play a half dozen 4K videos on YouTube in the Edge browser to experience HDR; Edge wouldn't play above 1440p even if you selected 4K, it would always just pick 1080p or 1440p. We couldn't figure it out. We loaded up Firefox and it played the 4K videos just fine as expected, so it must be some Edge incompatibility with Vega drivers preventing 4K playback in Edge (doesn't happen with Nvidia). Get the idea? These were issues we just discovered in casual use. I can only think of one single issue I have with my 1080 Ti: when I play Kingdom Rush (a little casual tower defense game) at 1100FPS, I get blinking. I've encountered NO other glitches on my 1080 Ti since last summer.
 
Any examples post-2015? Not saying there aren't any, but you could've backed up your rant with some recent examples.

I've had mixed experiences with both. Nvidia was generally worse for HTPC usage, while AMD had worse desktop bugs. Cards burning up was an Nvidia exclusive, and there were some installation blunders recently with some new cards, so I would say neither is significantly better.
 
Haha, the children have awakened. It really does not matter, as power consumption during gameplay is pointless; you don't game on 100 GPUs at once. Factoring in FreeSync vs. G-Sync makes it even less important which card is faster in any game. Neither Vega nor the 1080 Ti is 4K capable. Can the 1080 Ti return 43MH/s in ETH or over 2000H/s in XMR at 160 watts? No. Which one of these cards is better? Draw your own conclusions. He made a great choice. I do own 1080 Tis as well, and a fair amount of Vega 64s.
Given a choice either for gaming or mining, I would take a Vega over a 1080 Ti every day, all day :D
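Whatever you make of the gaming take, the mining arithmetic is easy to check against the figures quoted above (the poster's own claimed numbers, not verified benchmarks):

```python
# Efficiency math on the figures claimed in the post above.
claimed_eth_mhs = 43.0   # claimed ETH hashrate, MH/s
claimed_watts = 160.0    # claimed power draw, W

print(f"{claimed_eth_mhs / claimed_watts:.3f} MH/s per watt")  # ~0.269
print(f"{claimed_watts * 24 / 1000:.2f} kWh per day")          # 3.84
```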
 

The 1080 Ti isn't 4K capable? :rolleyes: I think a Vega 64 or a 1080 Ti would do alright. I DSR to 4K in the vast majority of games.

What kills AMD for me is having to drive VR.
 
I've updated my post: four issues with the Vega 64 were from just last night.
 
Not the way I want to play. They need to do at least 100FPS on ultra settings, or a little less if it's FreeSync or G-Sync.
 
Ignorance is bliss, then.

Hmmmm... Clearly there is no ignorance here; he is sharing his exact experiences, not some internet agenda like some will post here. OP, glad you enjoy what you enjoy and that your experiences are positive. Don't worry about the bashers; they'll never be convinced, even if they saw it with their own eyes. ;)
 
Ah, forget it. Wish I could have afforded to get one, and had a place to install the radiator.
 

Yeah, I noticed a distinct color difference as well; good to see my experiences confirmed.

Yeah, it's called the placebo effect. OP, not the wisest decision, but if it helps you sleep better at night, then enjoy the Vega. I'm sure whoever bought your Titan Xp at a discount is enjoying it.

Yeah, it's called you being wrong; many have experienced the exact same thing. Must be a group thing, eh?
 
Let's get realistic here. The OP's monitor, an LG 32GK850G-B according to his signature, is a 2560x1440 monitor. This is not a demanding resolution. At that resolution, the difference between a Titan Xp and a Vega 64 is going to be academic at best. I wouldn't rip my NVIDIA cards out of my system given that NVIDIA already got my money. I suppose you're supporting AMD by buying their card, but it's a colossal waste of money given that you are technically taking a performance hit to do it. I think the switch and the perceived differences are mostly a placebo or imagined effect. I've compared NVIDIA and AMD products side by side, and it's very hard to tell the difference between the two in cases where one card doesn't have a huge advantage over the other.
 
Again, it's not just about the raw performance difference between the cards. He is also losing VRR entirely by switching to AMD, because the monitor is G-Sync only. Modern demanding games like BF1 are not getting >165FPS minimums at that resolution with a Vega 64 or a 1080 Ti, so yes, having VRR would provide the smoothest experience.

If the OP is sticking with the Vega, he should return or sell that monitor and get a FreeSync display. He will save money and have VRR, plus be out of the nVidia ecosystem entirely, as per his own stated desire.
 