In my experience there are basically two types of reps (with a grey area in between): ones that are truly helpful and concerned about their products and reputations, and ones who think their shit don't stink and are doing you a favor every time they talk to you. Look at the posts. It's easy to...
Looking at Samsung's site, they have released 3 models. All are 144Hz, HDR, Freesync 2, 10-bit, VA (178° viewing), and 1ms g2g: a 27" and a 32" at 2560×1440 (looks like the 27" isn't out yet), and a 49" at 3840×1080. Just trying to address your original statement that you couldn't currently buy one...
HDR is part of the Freesync 2 spec. AMD didn't want to put too many conditions on the original spec that might have hindered adoption. Now that the spec is widely supported, though, they can require higher performance to get the Freesync endorsement.
I actually think including the monitor in the original decision is better than doing it ahead of time or making it an afterthought. There's no denying, though, that you can get a more powerful PC with a "high-end" Freesync monitor than with a similar G-Sync setup.
[H] acknowledged that this comparison isn't statistically significant, so don't read too much into it.
There is far more that could be done to improve the analysis. I do find it interesting that overall the two systems (and you can't finger the cards specifically) playing this game perform...
What was the false hype? I missed it.
AMD has much lower overhead than nVidia. Also, they are using these cards, Fiji and Vega, as test beds for HBM. The payoff is still down the road.
Well, one thing Ryzen's release did was show us that everyone who has a high-end video card is CPU bound. I seem to recall most people who dislike AMD thought that current rendering APIs were only beneficial to the crap CPUs that AMD made. It was some sort of conspiracy bankrolled by AMD for...
And unfortunately there are review sites who will continue to do that forever. Ryzen could end up better by a double-digit percentage and you'd never know it reading those sites.
So now CERN is hiring experts on the physicality of ghosts? Has he published a paper on the subject? Taken appropriate courses? Where did he get his degree in "spiritology"?
It's quite common on a new node. Remember the 4770? 1st on 45nm, IIRC. Had stock for a bit and then... good luck! Been pretty much the same since on new nodes.
If you go by past iterations of DX, why should they have believed it wouldn't be replaced for 8 years? Before that, a new DX version was pretty much an annual event.
Anytime I've seen a site reinvent the wheel by changing their review process for a product it's because they want to highlight something, good or bad, about said product. If you want to be fair then test it the same way you always have. And this card is targeted at 1080p gamers. Not testing...
SemiAccurate is ad-free. Besides that, all of his articles are behind a paywall, and the forum is closed to new subscribers. Any kind of clickbait reasoning is completely impossible to substantiate and is merely sour grapes.
You are deflecting the real issue. nVidia lied about the card's specs, and not just the VRAM either. They used deceptive advertising with inaccurate specifications. Yet some people think this is OK. I can't understand it, personally.
If that's the best part, what's the worst?
Not if the gains are from Async Compute, I'd imagine. They should still see an increase because of more efficient CPU usage though.
That would be an ill-informed opinion, anyway. Not only are they different uarchs, they aren't on the same process either. There's also no reason to think AMD's design needs to run at the same clocks for similar performance.
You mean something like Pro Duo? ;)
Seems like they missed a chance to make it look like a supreme value and instead look like they are justifying the price. It really is an awesome card if looked at from the pro angle. I'd love to have one in a graphics workstation.
They've had an MGPU advantage for a while now, so much so that nVidia has lessened the importance of it. I think this is their way of getting devs to concentrate on it more. More shaping of the ecosystem. Quite clever, actually.
It wasn't you I quoted, was it? Then you take what I said and exaggerate it (again proving the weakness of your position — and "exaggerate" is putting it mildly; "lying" might be closer). I NEVER said it only exists to hurt performance. That would depend on how it's used.
AOTS runs faster in DX11? Not...
While there are always a vocal few, it is far from everyone. You are exaggerating, trying to fabricate a situation that doesn't exist. Sure, there are people, and I'm one, who believe that AMD has an advantage in DX12. I never believed that all DX12 games were going to be well optimized...
I see more nVidia supporters trying to devalue it than AMD supporters touting it. So what's the big concern, then, if it's not all that?
Also, comparing over-tessellation to async shaders is ridiculous. One improves performance for the hardware that supports it. The other hurts performance for...