Boil
[H]ard|Gawd
- Joined
- Sep 19, 2015
- Messages
- 1,439
Not all FreeSync is "cheap".
See C32HG70, costs the same as XB271HU.
G-Sync "tax" aside, you are comparing two different sizes of monitor with different feature sets...
My OC'd GTX 1070 (reference card, but with the ACX 3.0 cooler; stays below 65C while fan RPM stays below 1400) scores above 20K. No extra voltage, and not even the max overclock.
VEGA 56 3DMark Benchmark
Just off the top of my head, these seem like OC'd 1070 scores; correct me if I'm wrong.
That's the closest I could get them to equate one another. Because the G-Sync tax for the C32HG70 is "it doesn't exist".
Something like this:
https://www.amazon.com/AOC-G2460PF-24-Inch-Gaming-Monitor/dp/B01BV1XBEI/ref=as_li_ss_tl
$200 with a 35-144Hz range
That's the kinda "budget" I'm talking about....
For $600 total (incl. a Vega 56), ain't a bad deal!
(Especially if you're more inclined towards e-sports/crazy fps gaming etc)
*As you can see from my sig, I'm not one who is prone to dropping $700 on a monitor.
It all boils down to filling a need in the marketplace, and I believe AMD might have hit the nail on the head with the Ryzen 1600X, Vega 56, and a FreeSync monitor.
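As an aside on that 35-144Hz range: FreeSync Low Framerate Compensation (LFC) generally requires the panel's max refresh to be at least about 2.5x its min refresh, so a 35-144Hz panel qualifies. A tiny sketch of that check (the 2.5 factor is AMD's published rule of thumb; the function name here is made up):

```python
def supports_lfc(min_hz: float, max_hz: float, factor: float = 2.5) -> bool:
    """AMD enables LFC roughly when max refresh >= ~2.5x min refresh."""
    return max_hz >= factor * min_hz

# 35-144Hz: 144 >= 2.5 * 35 (= 87.5), so frame doubling can cover dips below 35 fps
print(supports_lfc(35, 144))   # True
# A narrow 48-75Hz budget panel would not qualify
print(supports_lfc(48, 75))    # False
```

Without LFC, dropping below the monitor's minimum refresh means tearing or stutter returns, which is why the bottom of the range matters as much as the 144Hz top.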
That C32HG70 has quantum dots (new color-enrichment tech), and it's 32" and 1440p.
This is disappointing... an OC'd 1070 can reach 20K.
If people are on a budget, then they may be best off waiting for HDR monitors rather than buying Vega and FreeSync now.
I say "may" because no one can really say how much impact it will have on developers and the monitor market, but it will add to the cost of a monitor, making it even more painful to buy FreeSync now and then replace it with another FreeSync monitor at the end of the year or very early next year.
If you already have FreeSync, sure, it may be a no-brainer; but in the context you raise of value concerns combined with VRR, I do not think it is so clear-cut, especially if HDR does (big if) live up to what it is meant to offer. And this is ignoring the possible shadow of the GV104 launch, more specifically the '2070', in 4-5 months.
I guess one needs to weigh how much they want VRR now, when they would then be able to get onto HDR, or whether they are happy to accept that there will be another performance leap between now and Navi (which, given AMD's record, will probably launch late, especially with them now adding AI core functionality into it).
Point being, it is not that clear to me unless one can say HDR is going to be a flop with minimal impact and minimal ...
"I feel like a 1070 needs to be OC'd pretty heavily to hit those scores. Not a bad showing then."
No, they can easily do that, and without extra voltage.
Your statements are reasonable. But how did I know you would shill for Nvidia eventually?
HDR is in the very early stages, like 4K was. Remember when you needed two cables to display it? And not every 4K refresh/depth/bit-format combination is supported yet (even if you leave out 3D formats). Your box may support the 21mbit format and your monitor may not.
I predict HDR will fall out the same way. All the monitors now have compatibility problems.
Eventually one upgrades everything...
RX Vega 56 & low cost FreeSync monitor now...
Navi & HDR FreeSync2 monitor later...
I had a feeling my context would be missed.
As much as I have bashed AMD for Vega: the TweakTown benches weren't 3DMark runs, and 3DMark isn't a game. Yeah, it will give you a general ballpark, but games can be a totally different story.
It's a fair prediction. I'm an even more avid enthusiast in the home theater forums, and the high-end displays with HDR and top-shelf 4K players with HDR are rife with growing pains, mismatched standards, and frustrating incompatibilities.
And to be honest, they all lack proper dynamic contrast.
That, unfortunately, is the competition between two standards pushing for dominance, and they did jump in early 2016.
"Finally some confirmation primitive shaders are used everywhere. Not a huge surprise considering they're programmable."
Which is actually worse than before, because their effectiveness is rather limited right now, i.e. they don't offer any tangible performance increase in current games. Vega appears locked at 4 tri/cycle. At least with developer intervention there was hope for it to offer more with proper handling. Now it won't even do that.
Uh oh... we need details! But he isn't really allowed to post any. Can he even post what he did without breaking the NDA?
Does seem a bit of a grey area/tightrope he is walking.
Eh?
Nothing in my post is about shilling for Nvidia.
The only one showing bias is you, with your deliberate insulting/slighting of me.
If I wanted to shill, I would have weighed much more heavily on Volta in my post than I actually did, along with being highly critical of AMD on the GPU side with a rather large breakdown.
In fact, earlier I actually supported the leaked Firestrike results as being pretty reasonable, both in general and in the specific context where some were ignoring the MSI 1080.
People suggesting "go ahead and buy Vega AND FreeSync now and save yourself cash" without pointing out the possibility of HDR and the looming shadow of the Volta '2070' are the actual shills, and they may actually cost those people more money as they pass on the simple recommendation of "do both now! it's great value".
Who is the actual shill here: me, or you, who is already discounting HDR when no one can yet draw any conclusions, although consoles will use it heavily?
There are actual members here who went FreeSync but ended up having to replace their 390/390X with an Nvidia GPU, so their lovely FreeSync monitor was not doing much.
Meaning a repeat cycle could happen: some buy the Vega 56 with FreeSync and, before they know it, end up in the very same situation as those with the 390 + FreeSync in the past, and probably with a repeat upgrade cycle for the very reason I hinted at of Navi now incorporating AI core development (considering how late Vega is, this should give you some indication that Navi will run late with such R&D complexity added).
Like I said, it is a tough call, and THERE IS NOT A SIMPLE RECOMMENDATION.
"I'm not here to get into a pissing match. The point being, it doesn't matter if you pick Nvidia's Volta (when it does come out) or Vega; you're still going to get the Nvidia tax. And to be honest, you should really wait a year until the HDR compatibility issues settle out."
Sure, there is a 'tax' for G-Sync, and when matching identical models from the same manufacturer the cost is between £120 and £160 unless you are looking at the most extreme G-Sync models; I doubt anyone has ever disputed that, and it is not something I argue about.
But there is much more at play than just that, as I outlined without being a shill about it.
And tbh, I doubt people will replace their monitors 12 months down the line, as it still does not make good value sense to do that. (People who upgraded from a 390/390X actually preferred to keep the monitor even though they upgraded to Pascal, showing there is a reluctance to upgrade monitors with any kind of frequency, which plays into possibly waiting on the monitor side with regard to the factors I raised.)
But from a cost perspective, I gave clear real-world examples where Vega + FreeSync may not be an actual value proposition.
It is not as clear-cut as some make it out to be, even from a cost perspective; likewise, there will be enthusiasts who will not be able to wait for Navi, or will not want to wait 1.5-2 years for HDR if it does take off, because they just recently purchased their FreeSync monitor with Vega.
If one must buy Vega, I would seriously consider doing it but waiting on the monitor side until it becomes clearer what is happening, unless one just cannot live without VRR and, importantly, is happy to ignore the other looming factors I mentioned.
To reiterate, my posts come back specifically to the value proposition raised and the factors one should be considering/weighing that can change it.
Cheers
How will AMD make a profit when miners, scalpers, or stores sell them at $300+ over MSRP? It's not like they turn around and give any of that money back to AMD.
These cards will most likely sell out thanks to people who hate green, love the underdog, or just because it's the only available card on the market for a short time.
Now, I am not saying AMD will not make any profit; whatever the gap between the cost of the card/production (unknown to me) and MSRP is, they will get. But everyone else listed above will make far more profit than AMD if they can snatch them up and sell them highly overpriced.
AMD has to do the work, and guess what, that is why you won't see more than Polaris's geometry throughput most of the time; I have already confirmed this. So unless the application is changed to take advantage of primitive shaders, we will not see any improvements in Vega when it comes to geometry throughput.
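Back-of-envelope for what a 4 triangles/clock front end (the rate Vega appears locked at, same as Polaris) means in absolute terms; the clock speed here is an assumed round number for illustration, not a figure from this thread:

```python
tris_per_clock = 4      # peak geometry rate claimed above (Polaris-level)
clock_hz = 1.5e9        # assumed ~1.5GHz core clock, illustrative only

# Peak front-end throughput scales linearly with core clock
peak_tris_per_sec = tris_per_clock * clock_hz
print(f"{peak_tris_per_sec / 1e9:.1f} billion triangles/sec peak")
```

So any geometry uplift from primitive shaders would have to come from raising the per-clock rate via the application/driver path, not from clocks alone.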
Remember when the 7970s came out and people said the same things about them not competing?
My 7970 can play at 1440p, over 45 fps, in every game I have tried it with.
I don't think there is anything from Nvidia since the 9800 that had this kind of life in it.
"Navi, wait for Navi!!!!!!!"
Well, I'm waiting for Kyle's review. Then I'll decide. You can always wait for the next best thing, but then you'll have nothing. But with new tech, you should always wait a couple of gens until all the standards settle and work nicely.
Apparently even the reviewer's packaging says a minimum 1000 watt PSU, at least for the one that got the AIO version.
I mean... even if it's 400 watts of power draw, the rest of my computer does NOT use more than 200-300 watts under load...
"A high end Intel processor with a high OC can easily pull that much, just for the processor. I've gotten up to 388W on my 5960X at 4.8GHz."
With a Threadripper 1950X running OC'd, my systems have busted 500W with zero GPU load.
It's not bad advice if you're OCing everything.
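A rough sizing sketch for why that 1000W recommendation isn't crazy with everything OC'd; the 400W GPU figure is the claim from the thread, while the other wattages and the 80% loading rule of thumb are assumptions, not measurements:

```python
# Hypothetical worst-case sustained system load, in watts
loads = {
    "gpu_overclocked": 400,    # Vega draw claimed in the thread
    "cpu_overclocked": 300,    # heavily OC'd HEDT CPU (assumption)
    "board_ram_ssd_fans": 75,  # everything else (assumption)
}

total = sum(loads.values())
# Keep sustained draw near ~80% of the PSU rating for headroom and efficiency
recommended_psu = total * 1.25
print(total, recommended_psu)
```

With those guesses the system lands around 775W sustained, and adding headroom puts the suggested rating just under 1000W, which lines up with the packaging.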
Low standards?
I've been saying for months, tens of times, that geometry throughput is exactly the same as Polaris. I said several times that Polaris is to Vega what Tonga was to Fiji: they released the same architecture revision first in the form of a mid-range part to study, learn, improve, and optimize drivers and tech before the high end.
lol, you must have a very low standard for settings, or be playing Minecraft, ancient games, or modern games on low settings, to keep a 7970 playing at 2560x1440. I have an R9 280X at 1200MHz with memory at 6500MHz, and it still does a decent job at 1080p; that machine is used by my nephews every week when they come to visit, and every time I have to optimize the hell out of modern games at 1080p to keep playable framerates.
Using similar "logic"...