Intel Wants You to Use Vulkan

The predictable red/green tribal bickering doesn't apply here, since this is not Intel taking a shot at Nvidia - Nvidia already embraces Vulkan (notice the Vulkan Runtime in the list of installed programs after installing the Nvidia driver). However, it's also a zero-sum game for Nvidia to push and promote Vulkan over DX development, so they don't.

Intel, on the other hand, has a big vested interest in Vulkan, since their IGPs (as well as AMD's APUs) run games better on Vulkan than on DX. It's a boon for lower-end hardware, which also happens to outnumber high-end gaming dGPUs worldwide by about 100:1.

This is good news for PC gamers. This is also good news for Linux gamers. Viva la Vulkan.
 
For their GPUs. Have you run Vulkan games on an Intel HD 4400?
Do you have trouble reading? Are you just being obtuse? Your original comment said that Sandy Bridge, Ivy Bridge, and Haswell CPUs do not support Vulkan in Windows. Not a damn thing was said about the integrated graphics when you mentioned that, so I had no idea what you were even talking about. And if you had bothered reading, someone else already suggested that maybe you were referring to the integrated graphics, and I said that makes sense, but even the fastest integrated graphics in those CPUs is too slow to run the games anyway.
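For anyone who would rather check than argue: the Vulkan loader will tell you whether a given GPU (integrated or not) actually exposes a Vulkan driver. A minimal C sketch, assuming the Vulkan SDK headers and runtime loader are installed (link against vulkan-1 on Windows or libvulkan on Linux); the SDK's vulkaninfo tool reports the same thing in far more detail:

    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void)
    {
        /* Create a bare-bones instance; failure here usually means no
           Vulkan-capable loader/driver is installed at all. */
        VkApplicationInfo app = { 0 };
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.apiVersion = VK_API_VERSION_1_0;

        VkInstanceCreateInfo ci = { 0 };
        ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        ci.pApplicationInfo = &app;

        VkInstance instance;
        if (vkCreateInstance(&ci, NULL, &instance) != VK_SUCCESS) {
            printf("No Vulkan loader/driver available on this system.\n");
            return 1;
        }

        /* List every physical device the installed drivers expose. */
        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, NULL);
        if (count > 8) count = 8;
        VkPhysicalDevice devices[8];
        vkEnumeratePhysicalDevices(instance, &count, devices);

        for (uint32_t i = 0; i < count; ++i) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(devices[i], &props);
            printf("%s (Vulkan %u.%u)\n", props.deviceName,
                   VK_VERSION_MAJOR(props.apiVersion),
                   VK_VERSION_MINOR(props.apiVersion));
        }

        vkDestroyInstance(instance, NULL);
        return 0;
    }

Run that (or vulkaninfo) on the machine in question and the Haswell iGPU argument settles itself without anyone having to take it on faith.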
 
The Vega64 comes a lot closer than people want to admit in most cases. I just bought one, despite the performance deficit between it and a 1080ti, because 1) F*** Nvidia, 2) monitors are SIGNIFICANTLY less expensive, 3) my gaming will be at 3440x1440, and 4) Samsung's support of Freesync on 2018 TVs.

This summarizes as 'I want less performance because feels'.

Vega64 is closer to the 1080 than some want to admit, and the cheaper Freesync displays are crappier than some want to admit (I have one :) ). 3440x1440 is going to stress anything out today, too. I consider my 1080Ti with a fixed +33% overclock to be barely fast enough for 2560x1440, finding it regularly in the 60-80FPS range.

And I have no interest in Samsung's fake-HDR TVs; I watch TV on my TV and prefer the black depth of OLED. And Vega would still be too slow for 4k :ROFLMAO:.
 
A 33% overclock on a 1080ti? What fantasy world are you living in? Even the reference 1080ti holds right around 1800MHz for GPU boost, so that would mean you are overclocked to 2400MHz...
 

Base is ~1500MHz, I run it at 2000MHz, ergo +33%.

[it's the closed-loop watercooled version, it does the above 24/7 at <60C :D ]
 
Base speed is irrelevant as that's not where the starting point actually is. The starting point is the actual GPU boost you have under load. For instance, my MSI Gaming X 1080 Ti is over 1900 under load right out of the box, and the reference cards typically hold pretty close to 1800.
 
Base speed is irrelevant as that's not where the starting point actually is.

It's where your card is boosting from, so quoting boosts and overclocks from base is the only way to have a point of reference. Picking what your card boosts to at a particular time or some random 'these cards usually boost to x' is irrelevant.
 
You have clearly learned nothing about GPU boost. The advertised base clock is 100% irrelevant when talking about overclocking, and this has been discussed to death for years. Your GPU was never running at 1500, so it's stupid to use that as the base for your overclock percentage. Again, if you know what you're talking about, then you base your overclock percentage on what your GPU was actually boosting to in the first place. Maybe bother to actually look at the reviews right here on this very site, as what I am saying is correct and what you're saying is not.
 
Source?

It absolutely has.

It's pretty stupid to make an illogical argument and, when called on it, to start making personal attacks.
Thanks for continuing to prove that you have no idea what the hell you're talking about. And apparently you can't read either, as I just told you that all you have to do is look at the reviews right here on this site. When Kyle or anyone else here tests a video card, they look at what the average boost speed is and use that as the baseline when they give the percentage of how far they can overclock the card. That's the way you always give the percentage when you're dealing with a card that has GPU boost. It's not my fault that you are ignorant about how that works.
 

And their average boost speed is based on their copy in their testing environment...

It's not ignorance to understand that copies and testing environments vary. That's why there's a baseline. That's why I quoted from the baseline.
 
And you are not comprehending that the base speed is not the baseline when you're calculating the percentage of an overclock. Again, look at the reviews here, because they know what they're talking about, just like I do. The baseline is the average boost that you're actually seeing in games under load. Yes, that can vary a little bit from card to card, but that has nothing to do with anything.

If the average boost you are seeing in games out of the box is 1775 or so, and you can overclock it to an average boost of 1950 or so, then that's what you base your percentage on, because that's where the actual improvement is. You do not base it on the advertised clock, especially not the advertised base clock. Everyone who follows GPUs should know that's the way it's been ever since GPU boost was introduced many, many years ago.

Anyway, I'm done here, as you're never going to admit that you're mistaken and you're just going to be stubborn about it instead of accepting how things are supposed to be done.
 
So, let's start: for the reference 1080Ti, The [H] has it at 1480MHz base and 1582MHz boost. There's no mistake. That's the baseline speed, per [H], on the Nvidia reference card.

Quoting anything else is quite useless for everyone except the user of that specific card, as such a quote has no frame of reference across all such cards. I quoted the overclock from the base because that pins a reference point anyone can use, which was the goal of the post above.
 
It is hard to believe that you are actually this ignorant and lack even basic reading comprehension skills. Again look at the fucking reviews right here on this fucking site as they say the same thing that I do.
 

I linked you a [H] review, quoting the part that I'm referencing, which you just quoted :).
 

Relevant quote from TFA:
We did try higher clock speeds of course and found that our video card actually froze in games at around 2038MHz. By setting the offset to +150 we start out at 2025MHz, just under the frequency it froze at. However, as you will see below the GPU frequency doesn’t stay there for long, it does drop into the mid to upper 1900s for long periods of gaming. That is still a good boost over the default 1721-1781MHz before the overclock. It represents around a 250MHz overclock or around 14% overclock.
14% relative to 1721, not 1480 or 1582. That said, this argument is silly and you should both stop.
 
I linked you a [H] review, quoting the part that I'm referencing, which you just quoted :).
Are you really even serious at this point? I'm just on my phone right now, but later on I'll link you to the actual overclocking section in any of their reviews, as they don't fucking use the base clock as the baseline for the overclocking percentage difference. Nobody does that except for idiots who do not understand GPU boost at all.
 
14% relative to 1721, not 1480 or 1582.

Absolutely, but again, this is for their card in their testing environment. To use their clock range, I'd have had to reference their review, or someone else's review, none of which would be useful to another [H] reader with a different card in a completely different environment.
 
My god, you still do not comprehend it. You are comparing the advertised base clock to your actual in-game boost after you overclocked, and stupidly thinking you have a 33% overclock. If you are going to look at the base clock as a reference, then also use the base clock speeds after you overclock, for an apples-to-apples comparison. Let me break it down for you, since you have so much trouble with this simple concept.

base clock to base clock
boost clock to boost clock
actual in-game boost under load to actual in-game boost under load

Those are the clocks that would be apples to apples before and after you overclock, and only an idiot could fail to comprehend that. You are foolishly using the advertised base clock as your baseline for an OC, yet you can't get it through your head that you were NEVER at the base clock under full load in the first place. You were at whatever GPU boost took you to, and whatever GPU boost took you to is the baseline for calculating how much you overclocked your card.

This is how any reviewer, including right here on this very site, does it:

"We found on the previous page that the clock speed hovers right around 1897MHz while gaming. This will be the default frequency we compare our overclock with."

https://www.hardocp.com/article/2016/10/12/msi_geforce_gtx_1070_gaming_x_8g_video_card_review/5

If you still do not get it and can't admit you are wrong, then there is really something wrong with you.
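To make the arithmetic concrete, here is a minimal C sketch using only the reference 1080 Ti numbers already quoted in this thread (1480MHz advertised base, ~1721MHz stock in-game boost, ~1950MHz sustained after the overclock):

    #include <stdio.h>

    int main(void)
    {
        /* Figures quoted earlier in this thread for the reference GTX 1080 Ti:
           advertised base clock, typical out-of-the-box in-game boost, and the
           sustained clock after a manual overclock. */
        const double base_clock  = 1480.0; /* MHz, advertised base        */
        const double stock_boost = 1721.0; /* MHz, observed in-game boost */
        const double oc_clock    = 1950.0; /* MHz, sustained after the OC */

        /* The percentage gain depends entirely on which baseline you divide by. */
        printf("vs advertised base clock: +%.0f%%\n",
               100.0 * (oc_clock - base_clock) / base_clock);   /* ~32% */
        printf("vs stock in-game boost:   +%.0f%%\n",
               100.0 * (oc_clock - stock_boost) / stock_boost); /* ~13% */
        return 0;
    }

Same final clock, two very different percentages. The ~14% figure in the review quoted earlier is the second kind of number, measured against what the card actually boosts to out of the box.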
 
The only time Pascal is at base clock is in between idle states on desktop. The base clock is meaningless.
 
This will be the default frequency we compare our overclock with.

For their card, for their review, in their testing environment...

I don't have their card, and it's not in their testing environment. Or anyone else's other than my own.

base clock to base clock

Once overclocked, my card runs at a single speed under load. This is an artifact of the cooler setup, which has significant thermal headroom, and it was a key part of my original post.

So the new overclock is the new base speed.
 
My god, you still do not comprehend it. You are comparing the advertised base clock to your actual in-game boost after you overclocked, and stupidly thinking you have a 33% overclock. [...]

Do yourself a favor and just put him on ignore; he won't stop despite being wrong and will continue to derail the thread. Personally, I hope Vulkan support takes off; it seems like it's been a really good open standard.
 
Lol, why do some people think Intel's push for Vulkan will spell a "shit storm" for Nvidia? OpenGL is also an open standard like Vulkan, and Nvidia has no problem with that. Some people might say Vulkan favors AMD hardware (and soon Intel's) more, but that might be limited to the id Tech engine, with many of the extensions it uses favoring AMD hardware. I've seen a game in the past that actually ran faster on AMD under DX11, but when running under Vulkan the same game was actually faster on Nvidia. And I've seen some benchmarks for Linux-based games using Vulkan where Nvidia's performance is better than AMD's.
 
Freesync vs. G-Sync comparisons get even funnier when you consider the fact that Freesync monitors are popping up left and right, while G-Sync is limited to a few select models in comparison... and they cost more.

Having tried both, I would say that they are equal.

As an owner of a 1080ti, I really hope to be able to buy from team red soon. Just give me something close to it, because that opens up a wide list of monitors and televisions to choose from.
 