Some GOOD News

What, they were DX12 titles.....

It's the same engine as Forza 7, man. Forza 3 and 6 use the same engine; Forza 7 has been updated a bit, but it's the same engine nonetheless.

The game devs did something specific: they put the load on pretty much one core. They have done this on previous Forzas too, and every time they do it, it fucks with nV's driver until it gets fixed later on. Same thing happened here.

Odd, very odd. And it all got fixed by a driver? Usually you would need to change it in the game code. If all it did was send data to one core, why did that not happen for AMD?
 


You can still change the way the driver functions and what it calls for on the CPU via the driver; it's just a matter of finding where the problem is and replacing it. Takes more time, but it can be done. LLAPIs don't stop that, man.

Just because the devs have lower-level access doesn't mean the drivers lose their access; better yet, drivers can see what the CPU is doing now (a rough sketch of the idea is below).
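To make that concrete, here is a toy Python sketch, purely my own analogy with made-up names (submit_draw_calls is hypothetical; this is not how a real GPU driver or either vendor's stack actually works): a layer underneath replaces a function the application calls, so the same work gets rescheduled across threads without the application changing.

```python
# Illustrative shim analogy only (hypothetical names, not real driver code):
# a lower layer swaps out a function the application calls, changing how the
# work is scheduled on the CPU without any change to the application itself.
from concurrent.futures import ThreadPoolExecutor

def submit_draw_calls(calls):
    # "Application" path: run everything on the calling thread.
    return [c() for c in calls]

_pool = ThreadPoolExecutor(max_workers=4)

def _patched_submit_draw_calls(calls):
    # "Driver-side" replacement: fan the same work out across worker threads.
    return list(_pool.map(lambda c: c(), calls))

# The swap happens below the application; callers keep using the same name.
submit_draw_calls = _patched_submit_draw_calls

if __name__ == "__main__":
    work = [lambda i=i: i * i for i in range(8)]
    print(submit_draw_calls(work))  # same results, different CPU scheduling
```

The real situation is far messier than this, but the principle that a layer underneath can reroute CPU-side work without touching the game is the same.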
 
Hmm, did you see that Forza 3 and Forza 6 had the same issue when they came out too?

Just because you are getting Vega doesn't mean you sit here and talk BS.
I have both so that means I can talk BS :D

Razor1, you just have to break down and buy a Vega, you know it is in your heart to get one. That way, on new releases, you will be running before you hit the ground ;). Kidding aside, AMD is kinda slapping Nvidia a little with these new releases - which is fun to watch. Nvidia is quick, though, in getting the driver out (way quicker than I ever remember RTG/AMD being in the past).
 


Won't get it unless it's useful to me, lol.

Well, this problem was really easy to fix; it was the same problem as before, so they knew what to do.
 
In other words, Nvidia has no idea what they are doing and that is our fault? LOL!

Please explain the "logic" that made you post this?
I am really interested in reading that "line-of-logic".

Don't go full retard...this thread will still be here in a few weeks...most likely with a new perspective on the performance as bugs are eliminated....just saying.

Funny sidenote....the OP made his account to post this thread...and then "disappeared".
 

Well, it is a simple Nvidia DX12 driver problem (seemingly unable to break the work up across multiple threads), and according to Razor1 it is not the first time Nvidia has had the same DX12 driver problem. Yeah, well, most if not all of your statements in this thread are devoid of any substance, apart from the name-calling and complaining about an account.
 

I think you confuse Forza's game-engine with DX12...how about that for context.
 


Look, the Forza engine developers said the engine is not the problem, and the Forza game developers said they did something for a specific reason: to reduce input lag. Yes, using more cores can cause input lag; it's a side effect of multi-threaded applications. So from a dev point of view, when they thread an application they need to ensure the threads come in when they should so they don't run into this. It's a hard task: the developer first has to know how to thread based on the application's needs, then understand how the driver will react, and on top of that understand how and when input devices respond, which is a bit more complex than the traditional 'put something up on the screen.' Graphics drivers are built to maximize throughput on a given architecture, not to sit and analyze what the CPU is doing and whether the input drivers and hardware are lining up with what is being rendered. So if the developer is focused on making sure input lag isn't there on Xbox, that will not carry over well to PC games on another architecture.

This is the problem with LLAPI programming: what's good for one thing is not good for another.

We are actually going around in circles here. Both AMD and nV agreed at GDC 2016 that developers have to be aware of the challenges different architectures bring to LLAPI programming. These are the same issues that ASM and low-level programming had in the past. The current CS curriculum does require ASM classes (it's a 300-level course), but that by no means implies a person without extensive experience understands the architectural concepts of different GPUs. Most companies that port games over use their less experienced teams to do the port, because the main team is put on a new project, which definitely needs the more experienced people. This is why many ports come out crappy until they get a few updates.

If you watched that concurrent vs. parallel talk by the Google employee I linked in another thread, many programmers don't understand what parallel vs. concurrent workloads are, yet we have people here without an ounce of programming experience insisting that serial vs. parallel is the difference between the architectures (there's a small sketch of the distinction at the end of this post).

If it's confusing for programmers, what makes a non-programmer so sure of themselves that they know what's right? Because they were told so by AMD, or by someone else who has no experience in programming?
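For anyone who skipped that talk, here is a small Python sketch of the distinction, purely my own toy example (it has nothing to do with Forza or either vendor's driver): a thread pool gives you concurrency, but for CPU-bound work Python's GIL keeps it effectively serial, while a process pool actually runs the chunks in parallel on separate cores.

```python
# Rough illustration of concurrent vs. parallel (toy example, not driver code):
# concurrency is juggling many tasks by interleaving them; parallelism is
# executing tasks at the same instant on separate cores.
import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def spin(n: int) -> int:
    # CPU-bound busy work standing in for one chunk of frame setup.
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(executor_cls, jobs):
    start = time.perf_counter()
    with executor_cls(max_workers=4) as ex:
        results = list(ex.map(spin, jobs))
    return time.perf_counter() - start, results

if __name__ == "__main__":
    jobs = [2_000_000] * 4
    t_threads, _ = timed(ThreadPoolExecutor, jobs)   # concurrent, GIL-serialized
    t_procs, _ = timed(ProcessPoolExecutor, jobs)    # parallel on separate cores
    print(f"threads (concurrent): {t_threads:.2f}s")
    print(f"processes (parallel): {t_procs:.2f}s")
```

On a multi-core machine the process-pool run finishes noticeably faster for the same work, which is the whole point: concurrent and parallel are related but not the same thing.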
 
Are we going to have a celebration every time Vega 64 passes GTX 1080? What about the scores of new games where GTX 1080 is ahead of Vega 64?


Project Cars 2
http://www.pcgameshardware.de/Proje...Specials/Benchmark-Test-Grafikkarten-1238952/
https://www.computerbase.de/2017-09...afikkartenbenchmarks_von_full_hd_bis_ultra_hd
http://www.benchmark.pl/testy_i_recenzje/project-cars2-test/strona/28643.html
http://pclab.pl/art75527-5.html
https://www.purepc.pl/karty_graficz...jnosci_kart_graficznych_i_procesorow?page=0,7

Dishonored Death of the Outsider
http://gamegpu.com/action-/-fps-/-tps/dishonored-death-of-the-outsider-test-gpu-cpu
https://www.purepc.pl/procesory/tes..._death_of_the_outsider_bywalo_gorzej?page=0,8

F1 2017
https://www.computerbase.de/2017-08/f1-2017-pc-benchmark/2/#diagramm-f1-2017-1920-1080

Ark Survival
http://gamegpu.com/action-/-fps-/-tps/ark-survival-evolved-test-gpu-cpu

Recore: Definitive Edition
http://gamegpu.com/action-/-fps-/-tps/recore-definitive-edition-test-gpu-cpu

Divinity 2
http://gamegpu.com/rpg/ролевые/divinity-original-sin-2-test-gpu-cpu

Call Of Duty WW2 beta
http://www.pcgameshardware.de/Call-of-Duty-WW2-Spiel-59925/Specials/Technik-Test-1239989/
http://gamegpu.com/action-/-fps-/-tps/call-of-duty-wwii-beta-test-gpu-cpu

Total War Warhammer 2
http://gamegpu.com/rts-/-стратегии/total-war-warhammer-ii-test-gpu-cpu
http://www.pcgameshardware.de/Total...60823/Specials/Benchmark-Test-Review-1240653/

Destiny 2 Beta
http://gamegpu.com/action-/-fps-/-tps/destiny-beta-test-gpu-cpu
 
Looks at the top of the forums, yep, this is the AMD section, wonders why NVidia players are here whining about things that mean nothing to them. Oh yeah, they are trying to justify their purchase, never mind. :D Me, on the other hand, I am going to be lovin me some RX Vega 56 starting tonight and over the weekend. :)

Oh well, I prefer a sharp, clear, bright 10-bit display and hardware for everything I do over the 8-bit washed-out look that Nvidia gives. :)
 
You mean you can only make posts if your graphics card is fast and "cheap"?

Did not know that you had to get committee approval for posting here about AMD hardware?

Where did I say anything about who can post?
 

So the fact that AMD managed to reach 1080 performance (not even close to NVidia's flagship, the 1080 Ti) a year late and at a price $100 higher isn't something that bothers you?

P.S. As for your comment about NV participants in AMD's section: personally, my participation here comes as a reaction to all the hype I have suffered from AMD all these years (before the release of the Fury X and since).
I used to respect ATI's GPUs in the past, some of them were formidable opponents, but in recent years all AMD has been doing is 1) hyping and 2) making us wait!! (and waiting....and waiting.....and..........:yawn::yawn: )
 


You want us to talk about AMD products elsewhere?

ManofGod, you need to check yourself: Pascal does 10-bit HDR just fine.

You literally spent 100 bucks more on a Vega 56 than you should have, and for that 100 bucks you could have gotten a 1080, which is faster in pretty much all games and uses less power. You just supported a company in keeping on making products that are a year late and that can't keep up at the price they are selling for right now.

You were one of the people complaining that Intel's pricing sucks on their CPUs, yet you are willing to spend 100 bucks over the SEP to get a Vega 56? What do we call that? Hypocritical? You spent more than you should have, and you complain about Intel pricing?
 

Hey, I have an RX Vega 56; not sure where you are getting $100 more in price than a 1080. Also, I used to own a 980 Ti and, for me, it was a complete ripoff: the colors were always washed out on the desktop and probably in games, while all the AMD cards I used on the same setup, the R9 380, R9 290X and R9 Fury, never once exhibited that problem. No amount of tweaking ever fixed the NVidia issue; it was not a monitor issue.
 

Oh wait, so you are saying I should have spent $600 more to get what AMD already gave me on top of the already over $600 I spent previously on a 980 Ti? LOL! No thanks, burn me once........
 

Well, sure, you might have an RX Vega 56, but the OP wrote in his first post about the Vega 64 ("The AMD Vega 64 seems to be making a name for itself as this is the third time that it is beating Nvidia at its game.").
So I had the OP's original subject of discussion in mind.
 


So what, you just spent 100 bucks more for a graphics card that is a year late and sucks down power like there is no tomorrow compared to its direct competitor. Why are you complaining about money now?

At least if you had gone with a G-Sync HDR monitor before, you could have been gaming on it every generation and upgrading to exactly what you wanted every year and a half, instead of just sitting around with midrange cards.

There weren't even games that could take advantage of 10-bit HDR monitors back when the 980 Ti was current! WTF man, the first game was ME: Andromeda! And please show me those links where HDR monitors were demonstrated, because side by side they do show the washed-out colors, but if you look deeper, the settings were changed on the washed-out monitor!

I think you need to check your settings if you were getting washed-out colors, because even without an HDR monitor I can get the colors to be very close! Yeah, I lose a bit of blackness in transitions, but outside of that it's damn close. If I did what you did, for me it would be a sidegrade from a 980 Ti to a Vega 56 and 500 bucks wasted! That extra hundred bucks would actually buy an upgrade over the 980 Ti.
 

Look, you can't ever argue and win with an AMD evangelist who goes out of his way to show the rest of the world how little he knows and how wrong he is about GPUs, one post at a time... According to his own claims, he will never own anything Nvidia or Intel, because AMD is perfect, they don't make mistakes, and everything they make is out of goodwill.

Don't waste your time.
 

Dude, I sold that 980 Ti, with its washed-out colors, over a year ago, no thanks. No, it was not the settings of the monitor but the Nvidia hardware itself, nothing more and nothing less. Now, the person I sold it to has not said he had any issues, so his monitor may have worked better with it, but mine? Nope, nope, nope........

I am not sorry to say I will not pay Nvidia prices just so I can get "close" after months of tweaking and hoping. Also, good luck with your 980 Ti outperforming my Vega 56, but then again, I do not base my purchases on what you own or want. :D
 

Because I do not agree with your NVidia point of view? Dude, I know and am aware of a lot more than you give me credit for, but hey, you keep evangelizing that Nvidia hardware and I will keep giving you facts.

Yep, that is what I said, AMD is perfect. LOL! If you are not happy with my point of view, there are always the Nvidia sub-forums; I am sure they will welcome you there with open arms. Oh, and I said I will not own anything Intel or Nvidia in my personal builds because AMD is what I prefer; I have no issue with others using what they want to use, like some around here seem to have.
 


That is why I never tell people what to buy, but when you complain that Intel CPUs are too expensive compared to AMD Ryzen and then spend 500 bucks on a card that should be only 400, that's a problem! Then, to boot, the washed-out colors, which is BS; it's just one button on the monitor and bumping digital vibrance to +55%. It makes it look like a complete troll post. So difficult it would take months of work.... yeah, OK.

Again, you just brought up Nvidia prices: Nvidia gives you more for the 500 bucks you just spent on your Vega 56, ~20% more performance at 20% less power consumption! Another fallacy.
 

I never said that Intel is too expensive, but that what I spent for what I got was a waste of money. (The money I spent was a waste at that time because I was not benefiting from it at all, and I ended up happier with money in the bank to boot.) Now I am happy with what I have and with greater performance for what I do on both computers. (Although I will admit that I wish I had not upgraded my work computer so quickly; I wish I had waited, since what I already had was doing pretty well.)

Going from an 8-core processor that did everything I needed to a 4-core, 8-thread processor that was not needed and cost significantly more at the time was not worth it. The only fallacy here is your own and nothing else; you seem to have issues with what I am saying and keep defending your purchases as though what I think even matters. LOL! :D
 


Dude, I haven't bought anything other than the best of the best for the past 20 years or so. Doesn't matter what you say, I only get the best. Sorry. Your Vega 56 is not the best; it's not even the best in its tier, so I wouldn't even look at it. That's why I sold my AMD mining rigs: they aren't the best anymore. The 1070s and 1080s cream them over a one-year period of power consumption; I save 12k just on power alone (or make 12k more with the 1080s). 12k is how many cards? 48 cards!

See, you say you know what you are doing, but you didn't even know about the digital vibrance settings, did you? It has been like this since the 8700 vs. the GF3: AMD's cards by default have their digital vibrance set just a bit higher than nV cards!

Your reasoning is sophomoric. If you just stated, "I love AMD," we could leave it at that. Instead you make up excuses that show how little you know about your hardware settings, things that have been around for two decades!
 

Of course I did not know about the digital vibrance; how could I possibly have known anything about that? /s The fact that you do not like the reality of the situation does not change that reality, and the problem was far more significant than "a bit", but hey, what do I know, I am just a person with sophomoric reasoning. :D:eek::oops::rolleyes::confused:

Oh well, I know far more than you give me credit for, but then again, you do not have to give me credit at all. I am cool with that; it does not change anything. *Shrug*
 


Really, you are the only person I know in the past decade who has stated that colors are washed out because they use an nV card! Pretty much everyone knows that by default AMD's digital vibrance is set higher. So if you are having problems with your colors, it's either your monitor or your settings. Simple, isn't it? Instead you blamed the card!

When I play games I always increase my digital vibrance by 5%. Why? Because it looks better in games; the artwork tends to go with it better (I can't use it when making textures, though, because it will create problems on different graphics cards, so I need to set it back to default). But for watching movies and such it doesn't look as good; it gives hot color spots a touch of a neon, garish look.
 
Look, you can't ever argue and win with an AMD evangelist who goes out of his way to show the rest of the world how little he knows and how wrong he is about GPUs, one post at a time... According to his own claims, he will never own anything Nvidia or Intel, because AMD is perfect, they don't make mistakes, and everything they make is out of goodwill.

Don't waste your time.

It is a little bit pretentious of you to claim things on that level, as if any of you lot were superstar engineers working the forums as a hobby to enlighten "us".
 


That's why we get paid at the higher end of six figures; we can multitask very, very well ;) Unlike some chips, see what I did there?

Have you ever heard this: a person gets paid more because they are capable of doing that work and much more?

It's true, ya know that, right?

Being good or great at one thing, be it sports, business, whatever, translates over to everything they do; they just need to put their mind to it. They might not become the best at everything, that never happens, but they get damn good at whatever they want to spend time on.
 

I am, but not in GPUs. I can still smell bullshit real well even if it's not my field. :)

It's generally a good idea to try and be somewhat humble though...

I thought everyone here made six figures and banged tens? Or is that just GenMay?
 

Well, the bullshit you smelled is from that other thread called "Some Bad News"; maybe you are a little off your game ;)
 
I hooked up a 1060 to the 4K 10-bit IPS monitor and it looked like washed-out shit. Since I was just configuring that system, I figured the 8-bit to 10-bit conversion was not being handled accurately. I didn't bother to adjust the monitor or the card, since on the 8-bit monitor it looked fine. In the past I've seen Nvidia cards that were blurry as hell (owned one, a GF2 MX400; anything over 1024x768 was a fuzz show). In the past ATI had superior image quality in my experience; today I just do not notice that much difference between Nvidia and RTG. I think I will take an extra look now. I wonder if that may be the reason RTG won over Nvidia in that Doom test here at [H]ardOCP?
 
Back in the past both companies traded back and forth on IQ. It wasn't until the rv680 or rv780 and the G80, when they pretty much stopped all filtering optimizations, that they started producing the same output on screen.

I don't think that caused the Doom win; people were saying AMD's Vega felt smoother, and there's nothing empirical about it, just something they felt. I've used G-Sync before; I don't see much need for it with Fast Sync as long as the game can run over 60 fps. I'm sure the same goes for FreeSync and its new triple-buffering sync mode.

I got a GF2 MX, can't remember if it was a 400 or something, as a gift (think it was Christmas) from my parents; they don't know tech at all, lol. I turned around and gave it away, lol; I already had a GF4 Ti. Apparently at the store they bought it from, the guy said it's the BEST! Yes, marketing for ya, lol.

Yeah, it's just a setting or two and you should be fine; the main one is digital vibrance. As for the monitor setting, I set my monitor to "cool", which gives it a slight blue hue, so whites look a bit whiter and blacks look a bit darker.
 
Hey, I have an RX Vega 56; not sure where you are getting $100 more in price than a 1080. Also, I used to own a 980 Ti and, for me, it was a complete ripoff: the colors were always washed out on the desktop and probably in games, while all the AMD cards I used on the same setup, the R9 380, R9 290X and R9 Fury, never once exhibited that problem. No amount of tweaking ever fixed the NVidia issue; it was not a monitor issue.
I'll agree, but in a different context. I have a pretty decent home theater. I could never get the black levels right with my Nvidia cards. Had frustrations with that for years. Posted here and at AVSForum and on the JRiver forum and chased the issue for years. HDMI full/limited, color space 0-255 or 16-235, custom tweaking, config files --- you name it, I tried it (there's a quick sketch of the full-vs-limited range issue at the end of this post). Even over three different projectors. I upgraded from a GTX 670 to an AMD 285 and boom --- problem gone. Blacks looked fantastic. Colors looked slightly better and more natural, but the biggest difference was the black levels. Fury X, same thing - fantastic. It was an unexpected, welcome side effect for me. I had 15 years of Nvidia previously. I didn't anticipate the better blacks; I bought the AMD 285 so I could use three Dell desk monitors in portrait, landscape, portrait mode (20"/30"/20") --- something AMD started supporting in hardware that Nvidia still doesn't support.

I recently bought the 1080 Tis, but I'm mining with them. I haven't tried them on the projector. But I gotta agree with you that the AMD card just worked out of the box and had clearly superior black levels on my home theater projector --- and, as we all know, strong blacks can make colors pop!

My last two projectors for reference.
Epson 8350
Panasonic ae8000u

In my case I don't think it was as noticeable on my monitors. Maybe the IPS glow of the monitors negates the deeper blacks I was seeing on the projectors???
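For what it's worth, the symptom described above, grey raised blacks that no amount of tweaking fixes, is exactly what a full-vs-limited RGB range mismatch looks like on any vendor's card: limited ("video") range puts black at 16 and white at 235, so if the display expects full range (0-255) and nothing expands it, black sits at 16/255 and looks grey. A minimal sketch of the expansion (illustrative only, not either driver's code):

```python
def limited_to_full(v: int) -> int:
    """Expand one 8-bit limited-range (16-235) value to full range (0-255)."""
    scaled = (v - 16) * 255 / (235 - 16)
    return max(0, min(255, round(scaled)))

if __name__ == "__main__":
    for v in (16, 64, 128, 235):
        print(f"limited {v:3d} -> full {limited_to_full(v):3d}")
    # limited 16 -> 0 (true black), limited 235 -> 255 (true white)
```

Which is also why forcing the output color range to Full, as mentioned a few posts further down, is usually the first thing to check.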
 
I bought a https://www.newegg.com/Product/Product.aspx?Item=N82E16824025164, replacing a smaller 120 Hz monitor without FreeSync. FreeSync is great and the 1 ms response time kicks ass. This thing also supports HDR, but from what I gather there are only three games supporting it. I had no idea BF1 supported it.
Will try it out this evening. RE7 supposedly has a great implementation of it, and it's on my to-play list. One thing I am sure of: FreeSync or G-Sync is the way to go.
 
Wow! That is a nice-looking monitor! Surprised at the cost given that it supports HDR and FreeSync 2 (doesn't look like that added anything to the cost either), and it's 144 Hz too. I almost want to buy one for the Vega 64.
 
Do it. You'll love it. I wanted an ultrawide, but the 1 ms response time won me over.
 
Oh, forgot one more setting, guys: in the Change Resolution panel, set your output color range to Full! Extremely important.
 