Some GOOD News

What happened when AMD stopped supporting it? Remember how poorly the same Mantle games ran on the next generation of GPUs? Once game devs stop supporting two- or three-year-old games, do you really think the same thing won't happen to DX12 or Vulkan games?

People who wanted this have to swallow the pill. In the end it will make everyone more money, and we are the ones who have to spend for it. That's fine, I'm cool with that part, as long as the games that come out are better than before, right?

http://aras-p.info/blog/2014/05/31/rant-about-rants-about-opengl/

You guys should read this. There's an interesting tidbit at the end, too.



http://www.redgamingtech.com/ice-te...apis-sonys-libgcm-first-modern-low-level-api/


Now, do you guys think it's good or bad that PCs have LLAPIs? It's good in one sense: the full potential of the PC hardware can be utilized. But it's bad in a business sense, because the same problems consoles have will start showing up on PCs. Backward compatibility, or the lack thereof: on consoles, even going from one generation of AMD hardware to another, there are issues when the GPU generation changes.... and even though the PC environment is more open than consoles, those same problems will come up. With MS's abstraction layers you got older game support on newer hardware; without them you're kinda screwed there, and devs really don't update older games unless they remain popular......

All the same problems consoles have will show up in the PC space now, which is everything MS and OpenGL were trying to avoid in the first place. Granted, the abstraction layers were getting too thick, so trimming them down was a good thing.

Everything else for the IHVs stays the same, no changes there. The IHVs don't give a shit whether it's DX11, DX12, Vulkan, whatever, because it's just a damn API. For them it's just marketing BS. AMD pushed it because they didn't have the resources to make the necessary improvements in their DX11 drivers, and we know about their abysmal OpenGL performance. That means nothing in the context of hardware, though. nV didn't want to go toward it because they had an advantage with DX11 and OpenGL which they wanted to exploit further.

All of us sitting here running our posts about this and that: at the end of the day it's all about money, and everything else doesn't matter.

You succeeded at moving the goalposts so far it is impossible not to score a goal. But all in all, thank you for showing that the gaming industry should be renamed to "a truckload of stupid, greedy people." Focus on games, not money; that would be so much better.
 

A business that does not focus on money will soon stop being a business altogether...
 

Which has nothing at all to do with staying on topic. The topic was how well the AMD cards were doing in certain games out of the gate, not anything to do with what you just mentioned. In fact, I will just leave this here: https://forums.guru3d.com/threads/rx-vega-owners-thread-tests-mods-bios-tweaks.416287/page-12 Read posts 224 to 226 and see how much better AMD is actually doing in the graphics department. :)
 


So what do you think these people are in business for, then?

That is not moving the goalposts; that is why businesses do what they do!

If you don't understand that, stop talking about them altogether.

AMD is in business to make money.
nV is in business to make money.
MS is in business to make money.
Sony is in business to make money.
ALL game companies are in business to make money.

If you think that is moving the goalposts, when you don't know why AMD did what they did with LLAPIs or why they brought out Mantle, then sorry, you don't know the game industry and how it revolves around change and obsolescence.

LLAPIs don't give you anything more graphically than DX11. Why is that? Because the shader array is already programmable enough to do all the effects under DX11; it's just that one particular vendor had so much DX11 overhead that it cut down on their ability to move forward. Something they didn't want to put money toward fixing! So they want everyone else to spend money instead......

When a company does something, all you have to do is ask yourself: does it benefit them in any way? Answer that question and it will come down to money, either the lack of it or making more of it.

PS: you were also talking about Jaguar cores. They blow, and no API is going to help them, because if an API helps them it will be through better use of multithreading, right? But the uplift they get with all cores fully used is very low, because consoles already had LLAPIs before, LOL. To your point, Mantle and DX12 do less on Xbox than they do on PCs because Xbox already had an LLAPI before.... it was called DX11 Next. What, you think AMD was the first company to do LLAPIs? They weren't; they weren't even the second. Not only that, since the very first consoles, developers have had low-level access via SDKs, available for years before LLAPIs.
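To put the multithreading point in concrete terms, here's a toy C++ sketch (the DrawCall/CommandList/recordRange names are made up, this is NOT real D3D12 or Vulkan code): the expensive per-draw recording work gets split across worker threads, and only the final hand-off to the GPU queue stays serial. On a console that already exposed low-level submission, that parallelism was already being used, which is the point about the smaller uplift there.

```cpp
// Toy illustration of "LLAPIs help via multithreading" -- NOT real D3D12/Vulkan code.
// Every type and function here (DrawCall, CommandList, recordRange) is made up.
// Idea: each worker thread records its own command list in parallel; only the final
// hand-off to the GPU queue is serialized. A DX11-style immediate context forces all
// of the per-draw CPU work through a single thread instead.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct DrawCall    { int mesh; int material; };      // stand-in for real draw state
struct CommandList { std::vector<DrawCall> cmds; };  // stand-in for a command buffer

// The per-draw recording cost (state translation, validation) would live here.
static void recordRange(const std::vector<DrawCall>& scene,
                        std::size_t begin, std::size_t end, CommandList& out) {
    for (std::size_t i = begin; i < end; ++i)
        out.cmds.push_back(scene[i]);
}

int main() {
    std::vector<DrawCall> scene(10000);
    for (std::size_t i = 0; i < scene.size(); ++i)
        scene[i] = { static_cast<int>(i), static_cast<int>(i % 32) };

    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> threads;

    // Parallel part: each thread records a slice of the scene into its own list.
    const std::size_t chunk = (scene.size() + workers - 1) / workers;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = std::min(scene.size(), std::size_t(w) * chunk);
        const std::size_t end   = std::min(scene.size(), begin + chunk);
        threads.emplace_back(recordRange, std::cref(scene), begin, end,
                             std::ref(lists[w]));
    }
    for (auto& t : threads) t.join();

    // Serial part: submit the pre-built lists to the queue in order.
    std::size_t submitted = 0;
    for (const auto& cl : lists) submitted += cl.cmds.size();
    std::printf("recorded %zu draws across %u threads\n", submitted, workers);
    return 0;
}
```

That's all the API change buys on the CPU side; it does nothing for how fast each Jaguar core actually is.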
 

Really, didn't I say someone would come up with that excuse?

Yeah that was 2 weeks ago I think.

It looks like trilinear filtering is off on the nV cards in the Prey video, or it's not at the same setting. Either way, that won't change frame rates, and it only happens in the helicopter scene, so it's most likely a bug or something.

Now you want an explanation of the frame rates of that video?

Outside of two scenes, both of which use fewer polygons and fewer shaders overall, where an overclocked, liquid-cooled Vega 64 actually does better than a 1080 and gets close to the 1080 Ti, in the other scenes (there are like 5 or 6 of them in that video) the same water-cooled, overclocked Vega 64 ties the 1080 and gets creamed by the 1080 Ti. You will also notice that in those two scenes where Vega does better, the frame times for the nV cards are worse too. And this is Prey; AMD should do better! It's their game, the one they used to show off Vega.

And if you read the entire thread,

Wrong,
unless you have proof you're just spreading nonsense.

Regarding that video, NVIDIA/AMD setups are using different settings.
You can confirm that by viewing comments.
NV is using FXAA for one which causes significantly more blurring than SMAA


BigBoom Boom:
Settings are NOT the same.

WOW, there you go: you want to go to other forums, find people of like mind to yourself, and post their crap here? Not a good idea....
 

And that's exactly how FUD is spread on the internet by trolls. If you are asking others to take the time to check other posts and forums to see *how great* AMD is doing, then you should at least *check* your facts. First, the video in post 222 IS FAKE: he took an already-compressed video from another guy on YouTube, *THIS VIDEO*, at 1080p quality, and compared it with his own video recorded at 1440p. So yeah, of course there's going to be a difference in graphics quality when a video scaled from 1080p to 1440p is put side by side with a native 1440p recording. The guy was also caught disabling SSDO to gain over 25% performance, while the original video he took the GTX 1080 and GTX 1080 Ti results from had it enabled.

So, next time verify things yourself before spreading more FUD and misinformation. Your reputation as a troll is growing at an alarming rate; stay lucky and keep your threads merely closed, not deleted ;).
 

lol, what's worse is the fact that the OP of that video not only used GTX 1080 and GTX 1080 Ti footage from another guy on YouTube at 1080p, scaled up to 1440p, but also got caught using different settings.
 


I don't understand why people don't take a second look at something that breaks the norm, especially something as dramatic as one card getting more performance than usual while the competitors' IQ drops at the same time. That is something that has to be looked into two or three times before concluding AMD released wonder drivers, because those drivers shouldn't be affecting the IQ of the other cards.......
 

The single fact of using ANOTHER YouTube video to make a NEW YouTube video is enough to smell fishy; that alone will produce a huge graphical difference, because the footage gets compressed yet another time. That by itself should be called out, without even considering that a different system, even one similar in specs (just the CPU, it seems), still counts as a different system, with different drivers, settings, etc., so the results *shouldn't* be compared against each other.
 

Confirmation bias...simple as that

 
razor1 : Are you saying that AMD created Mantle (and, thus, Vulkan) because of their disadvantage in DX11? I could believe this, but how do you explain DX12 then? Surely MS was developing DX12 at around the same time. To me this just seems like the direction the industry as a whole was moving in.
 


AMD sped things up. They knew their driver work was way behind in DX11, and catching up on that while also working on drivers for whenever DX12 arrived was just more money than they were willing to spend. It was better for them to push something that required less work on their end, even though in the long run it's going to hurt them anyway, because the company that can spend more resources will get the upper hand regardless. DX12 also negated the GameWorks libraries that used tessellation, like HairWorks, because they weren't made to run on DX12 at the time. So as an added benefit, the first few DX12 games (or DX12 paths) wouldn't have GameWorks working in them. Actually, we knew AMD had issues with DX11 driver overhead a year before that (there were two or three games that showed it prominently), but it only became obvious just how poorly AMD's DX11 driver performed after Mantle was released. The CPU bottleneck was pretty much visible with the 290-series products.

If we could see it just from playing games, AMD must have known about it well before that. So they had plenty of time to fix it, but they didn't. Why? Well, it comes down to money: is it better to invest in something that will take quite some time to become competitive, or to push for something else that requires less work from them and levels the playing field? The latter. They got a level playing field and didn't need to spend the resources.

Also, when we look back at the work of the ATi and AMD driver teams, ATi/AMD pushed out drivers for new APIs quicker than nV for DX9 and DX10. DX11 was the first time nV took the lead on driver development for a new API, and that was because their architecture was better suited to what DX11 offered; the major feature was improved tessellation, and nV pushed for it because they had a huge advantage there that AMD couldn't match. With LLAPIs there is no equivalent advantage for AMD the way there was for nV with DX11, because with tessellation nV's architecture was and still is better suited; it simply does more, no special code needed. But at least in the short term both companies are on equal footing when it comes to driver issues.
 
The combative and condescending attitude being displayed in this thread is not because the person needs help. No, said person has already divested himself of the card and is now claiming that nVidia doesn't display colors correctly. (As an aside, as a video engineer who uses both AMD/RTG and nVidia cards in multiple systems, I don't see 'washed out colors' on videos on nVidia cards and vibrant colors on AMD/RTG cards. Then again, part and parcel of the title is being able to successfully calibrate my tools to do the job I need to do.) Since the time has passed for the forum to help said member, and said member has a different experience than what most people on this forum experience, there's no real point in trying to help now, is there? He's not going to go out and get his 980 Ti back and, based on his signature and multiple forum posts, he makes an abundant point of never buying an nVidia card ever again.
Put the 1080 Ti on the 10-bit IPS FreeSync monitor (Intel system) and yep - WASHED OUT COLORS! No doubt about it, as obvious as could be. Went into the Nvidia driver panel and changed it from 8-bit to 10-bit - NICE, FULL, RICH-LOOKING COLORS. Enough said on that.

Now I did notice something which I did not expect, even though another forum member noticed it too. This is a 4K 27" IPS monitor and I normally use Windows to scale 1.25x to 1.5x. When scaled, the 1080 Ti does look blurrier than the two AMD cards I had on this monitor (Nano and Vega 64). I will have to re-verify by hooking both cards to the same monitor and flipping back and forth. I finally have some time coming up to put the custom loop on the CH6 system, so the comparison will be upcoming.

Anyway, I did GPU compute tests between the Vega 64 LC and the 1080 Ti - Vega did better in virtually all of them. Will do a write-up once I get a chance.
 

Interesting results, and thanks for sharing! A buddy of mine just bought a WC V64/1800X rig (got it for a steal - an ex three-day show demo - and I'm seriously having to hide my credit card..), so it will be fun to have a play with it.

This 8- vs 10-bit discussion really blew up, lol. Paxwell has 10-bit desktop support; all Nvidia dGPUs prior didn't support it outside of DX apps, and this is where the banding/inferior colour appearance came from. Sure, you can enable 'full range', which will help, but it's not true 10-bit. Banding will often also be reduced by the screen itself if it has a 10-bit LUT..
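To put rough numbers on why the extra bits matter (just generic arithmetic, nothing vendor-specific), here's a tiny C++ snippet:

```cpp
// Shades available per channel at each output depth, and the brightness jump
// between adjacent shades -- the bigger the jump, the more visible the banding
// on a smooth gradient.
#include <cstdio>
#include <initializer_list>

int main() {
    for (int bits : {8, 10, 12}) {
        const long levels = 1L << bits;              // 256, 1024, 4096
        const double step = 100.0 / (levels - 1);    // % of full scale per step
        std::printf("%2d-bit: %4ld levels/channel, ~%.3f%% per step\n",
                    bits, levels, step);
    }
    return 0;
}
```

So 8-bit means roughly 0.4% jumps between shades versus about 0.1% at 10-bit; whether banding is actually visible also depends on the panel and any dithering or LUT work the monitor does.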
 
So with the current performance, one can say that AMD has now dropped the ball?

Perhaps on a hardware/uarch level yes. I guess this serves as an excellent case of 'AMD drivers are not stuck in 2011 any more and have not been for many years'.
 

That is an odd conclusion to reach based on an anomaly...
 
Heck, my Samsung TV supports 12 bits per channel. Occasionally a Radeon driver update sets it back to 10 bits, and I can tell the difference just by looking at the Windows desktop after the reboot. I would not want to go back to a card that only supports 10 or even 8 bpc.
 

What is your model? I have seen many TVs advertised as "12 bit"... even when they use 8-bit panels (Sony and Samsung).
 
That is an odd conclusion to reach based on an anomaly...
I bet you don't use AMD GPUs currently.
I have for over a decade, and their driver game is far better than it used to be. This isn't the only time they have done this; another big one was when Nvidia had borked drivers for Win10, ffs, and AMD didn't. Not just one time.
 

Again, that is a strange conclusion to reach based on an anomaly.

Again, it seems AMD dropped the ball here... just as in Forza 7.

But it is nice that you are happy about a 1-2 week performance lead being followed by a loss from then on :)
 

And you prove that you don't understand anything I was talking about. Consoles prove that outdated hardware, thanks to the API it uses, can reach performance that is not possible under an older API unless you skew the hardware comparison. Then you go on about making money, so what is your point here? That people doing things in DX12 or Vulkan or Mantle don't make money, that they do it for charity? You confirm my bias with these statements, that the gaming industry is run by idiots who have no business being there in the first place.

You list all these companies that are in business to make money, yet every single time they needed to, they used a low-level API when their hardware appeared in consoles. You successfully argued against your own point. Good god, what an accomplishment.
If IHVs do not care, why does this exist, and why are they taking part in Khronos? Once again you disprove what you are claiming with your own words; this is not funny any more....

Where are all the DX11-capable engines running 4K on the PC platform, since that is one of your next claims? Find me one that runs 4K on a PC with comparable specs to a console. Good luck with that one.

Back to Forza, which began all of this. It seems that when developers can do their stuff in DX12, the only time they have to go back and forth now is when it is their own fault, rather than a driver problem they have no control over. That back-and-forth would otherwise be a time waster; guess what happened, win-win for the consumer....
 


Consoles require highly specialized code which takes years to create; experience is what gets the performance out of consoles.

Yeah, I do know what you are talking about, and it's BS; it doesn't work in the PC realm! PC components change too fast for this!

The rest of your gibberish is just that.

I have worked on console games, so yeah, there you have it. It takes time, years of experience with a specific console, to get the most performance out of it, and with every game or engine a programmer works on, the more he can get out of that console. If you look at the first games on a console vs. end-of-life console games you will see a remarkable difference in performance and effects; that is because by EOL the programmers are so experienced with the routines for that specific console that they get every ounce of performance out of it. And you are attributing that programming experience to what, LLAPIs? LOL, great. What else can LLAPIs do, make a quadriplegic take the world record in the 100-meter dash?

Great, some people will attribute the sun's ability to send radiation to the earth to God; same shit.

http://www.valvesoftware.com/publications/2008/GDC2008_CrossPlatformDevelopment.pdf

Take it from Valve, who created a document that says exactly what I stated: console games differ from PC games because of the resource limitations. The code base is vastly different based on the architecture, too. Now, 4K gaming on a GTX 1060: sure, at 30 FPS it's easy, it can do that with most games. And that is what most console games target, 30 fps, not 60 fps. Different targets for PC and consoles; why? Consoles don't have the horsepower.

Guess you can teach every console developer something! They are all doing it wrong!

Resolution has nothing to do with the performance of an API. As above, start reading and learning: assets are different on PC vs. console, and not only that, there are many other differences in the console version of a game!

This is why Sony requires a team to have three AAA titles under their belt to even be considered for buying a dev kit. MS has restrictions too; they must see the game or product of a new team before approving a dev kit. They DON'T want inexperienced programming teams, because console development has much stricter requirements and is much more difficult, since you've got to do a lot of tricks to get the performance out.

If you were around in the late 80's and early 90's you will know exactly why they do this now: because of the bad games that Atari-era devs pushed out, which crashed the entire gaming market. Inexperienced programmers with crappy publishers pushing out shit.
 

Just to add that for console games to hit 4K30, they often have lower visual fidelity compared to their PC counterparts. On top of that, they have to use a lot of visual-sacrificing performance optimizations, such as dynamic IQ, dynamic resolution, or upscaling from a lower internal resolution.
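For anyone wondering what "dynamic resolution" actually does under the hood, here's a generic C++ sketch (the numbers and names are made up, not any particular engine's implementation): if the last frame blew its 33 ms budget, the internal render scale drops; if there's headroom, it creeps back up, and the result is upscaled to native 4K.

```cpp
// Generic dynamic-resolution controller sketch -- thresholds and names are invented.
#include <algorithm>
#include <cstdio>

struct DynamicRes {
    double scale     = 1.0;    // fraction of native resolution per axis
    double target_ms = 33.3;   // frame budget for 30 fps
    double min_scale = 0.7;
    double max_scale = 1.0;

    void update(double frame_ms) {
        if (frame_ms > target_ms)            scale -= 0.05;  // missed budget: shrink
        else if (frame_ms < 0.9 * target_ms) scale += 0.02;  // headroom: grow back
        scale = std::clamp(scale, min_scale, max_scale);
    }
};

int main() {
    DynamicRes dr;
    const double frame_times_ms[] = {30.0, 36.0, 38.0, 34.0, 31.0, 28.0};
    for (double ms : frame_times_ms) {
        dr.update(ms);
        std::printf("frame took %.1f ms -> render %.0f%% of 3840x2160 (%.0f x %.0f), "
                    "then upscale to native\n",
                    ms, dr.scale * 100.0, 3840.0 * dr.scale, 2160.0 * dr.scale);
    }
    return 0;
}
```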
 


Yep, they have to use tricks or optimizations to match PC performance; no way around it, consoles are limited in every regard.

https://www.quora.com/How-different...pared-to-programming-for-the-Windows-platform

Here are more examples from people with experience on the difference between PC game programming and console programming. LLAPIs are a MUST on consoles; otherwise they can't get anything good out of them to be competitive with properly made PC games, even ones using high-level APIs.
 

Where did you set the NVidia card to 10-bit?

I just swapped from the Vega 64 to the 1080 Ti and yes - Nvidia looks washed out (and has worse black levels) compared to AMD.

There's no question. Anyone who would argue otherwise is half blind.

I don't see the 10-bit toggle in the Nvidia control panel. I'm really disgusted with Vega's buggy drivers right now - so if I can get the same look, I'm going to move back to Nvidia.



EDIT - never mind - easy to find with a Google search.


trying it now.
 
The mining craze seems to finally be dying down; nowinstock is showing RX 580s at >$300 prices.
 
Yeah, it's pretty bad now. My Ethereum rig was pulling over $600/month when I built it. Now, with the increased difficulty and lower value, I'm only getting like $150/month, which probably won't even cover electricity.
 


That should easily cover your electric bill, even at 10c per kilowatt-hour.

Right now each of my 6-card rigs is getting 12 bucks a day (1070s), and my last RX 580 rig is still getting 11 bucks a day. After power, my 1070 rigs are netting just above 10 bucks a day, and the RX 580 rig 8 bucks a day. I don't know why yours went down so much, from 600 bucks to 150; something seems to be wrong. The Byzantium patch for ETH came online on the 16th, which should have increased your returns by around 30%.
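Back of the envelope: assuming a 6x 1070 rig pulls somewhere around 0.8-0.9 kW at the wall, that's roughly 20 kWh a day, or about 2 bucks a day at 10 cents/kWh, which lines up with the ~$2/day gap between the $12 gross and ~$10 net numbers above. Even a rig that's only grossing $150 a month has plenty of room to cover that kind of power bill.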
 
Sorry, I should have clarified (but I don't want to get too off-topic). Basically my power plan starts at 12 cents, but goes as high as 30 cents per kWh if you use "too much."

I had a pretty aggressive overclock on the 6x 1060s I have, meaning I quickly reached the 30c level on my power bill. With the OC, I was producing around $300/month, but the power bill jumped by $250. Still a $50 gain, but almost not worth it.

So now I have set the power limit on the cards to 50% of what it was before, and I've been running for a few weeks to see if I can stay under my power company's "too much" tier. But that means I'll only pull in $150/month, so it remains to be seen whether this will be better than overclocking and eating the cost.
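Rough math on that: if most of those extra kilowatt-hours land in the 30-cent tier, a $250 jump is in the ballpark of 830 kWh for the month, a bit over 1 kW around the clock, counting both the overclocked rig and whatever normal household usage it pushed into the top tier.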
 
Wow, that is really high! I'm paying 10 cents max. I've been looking at industrial places to rent, where power prices drop down to around 4 cents. But to cover the rent and insurance, of course, you need a good number of rigs.
 

10c is unusually cheap; in California it's 20 cents at the first tier and then 27 cents at the second tier. That's why I never did mining, as the electric costs are stupid out here, and also why I slapped up solar panels a few years ago.
 
Damn man, that sucks. In NYC 12 cents is the max, Long Island is 10 cents, upstate goes as low as 6.5, and industrial parks are 4 cents. That's after taxes, fees and whatnot.....

Maybe it's the type of power? Most of the power here is coal/natural gas and hydro.

But that is why, when I looked into solar, it wasn't worth it: I get power so cheap that the cost of the solar panels would never be covered; it would take like 20 years lol.
 

Yeah, for me in CA, the payback is 6 years. Panels it is.
 
Florida starts out at 8 cents and maxes out at 12 cents. I always hit the 12-cent range except in the winter, when I sometimes fall below it. A/C eats up a lot of power, and adding more heat via mining does not help in the hotter months, which for Florida is most of the year, it seems.
 
End results? Tomorrow I will post a screenshot of where it is changed; it is under the display section of the drivers, for anyone else that may have an Nvidia card hooked up to a 10-bit FreeSync monitor. AMD will automatically configure 10-bit if the monitor supports it.
 
Yes, it's better -- to the point where now I'm wondering whether any remaining difference is placebo or not - but there is a shadow of a doubt that the AMD card still looks a bit better than the 1080 Ti at 10-bit ---- though it's not nearly as drastic as it was with the NVidia running 8-bit. If there's a difference now, it's tolerable.

Would those types of differences show up in a screenshot taken with each card? Probably not, eh?
 


AMD cards still have a little bit more color (more contrast than anything else); that's why I turn up the digital vibrance to 55%.
 