I'd prefer ULMB over G-Sync, but from what I've seen those tend to come together. And 144 Hz, too. I haven't seen a 32" 1440p 144 Hz display with a VA panel (let alone IPS).
Ah yes, the inevitable "CRTs are still better" BS. Sorry, that ship sailed a long time ago.
An X34 is unquestionably a better monitor than a fat, smallish, flat CRT that causes massive eyestrain. Let's put that pro-CRT BS to death.
Come CES, I want to see a 1440p 32" VA...
Not sure if trolling? Three years ago, the high-end single-GPU card for NV was the 680 and for AMD it was the 7970. Take a look at the Fury X/980 Ti today and they are wiping the floor with the old cards.
I'm honestly laughing at you.
The 2 GB 960 was/is a terrible card. NV discontinuing it shows that it understands that.
As for drivers, I have a 980 now that has significant driver issues with G-Sync. I didn't have a field day with AMD drivers either, but the supposed "NV has better drivers" meme is total BS.
Your opening statement is too broad. You have to be more specific about what kind of games you'll play, at what resolution, and when you expect your next upgrade. Also, what is the budget?
Finally, ignore the NV trolls in this thread. The 2 GB 960 was DOA, yet a bunch of people (including the...
Oh, it's not just SweC. TechReport shows similar differences:
http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/8
It's more games than Civ, of course.
So there does seem to be a divide. Sites that show some discrepancy vs sites that show little to none.
http://cdn.sweclockers.com/artikel/diagram/10350?key=0a416d7eb565e02c673d198a107ba606
Sweclockers' benchmarks do show a difference, certainly if you compare i7 Sandy to i7 Skylake. There was a substantial difference in Dragon Age: Inquisition, GTA V, etc., as well.
Also, look at i5...
Except that in many cases the increases have been half that. Skylake's main draw, aside from the token IPC gains, will be a lot of PCIe lanes for all those SSDs.
I'll borrow this thread for a while.
I was wondering: I have a 24" 1080p monitor now. If I upgrade to a 27" 1440p, the PPI is around 19% higher (even though the pixel count is 77% higher, the sharpness gain is smaller because the area being covered is larger too).
So would it be worth it? Anyone made the 24" 1080p to 27"...
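For anyone who wants to check the math, the ~19% figure works out like this (a quick Python sketch; the sizes and resolutions are the ones from the post):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

ppi_24 = ppi(1920, 1080, 24)   # ~91.8 PPI
ppi_27 = ppi(2560, 1440, 27)   # ~108.8 PPI

print(f'24" 1080p: {ppi_24:.1f} PPI')
print(f'27" 1440p: {ppi_27:.1f} PPI')
print(f"PPI increase: {ppi_27 / ppi_24 - 1:.0%}")                   # ~19%
print(f"Pixel-count increase: {2560*1440 / (1920*1080) - 1:.0%}")   # ~78%
```

So the panel pushes nearly 78% more pixels, but because they're spread over a larger area, the perceived sharpness only goes up about 19%.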
I have used all three OSes. I've used WP for about a year.
I can tell you that more or less every app you need is there right now. Some third-party apps are better than the standard ones. For example, TouchTube, a third-party YouTube app, lets you download HD videos over Wi-Fi for offline viewing and it keeps them...
You're not going to see cynicism from me. This is a great sign.
It's the right form factor, resolution, everything.
Yeah, it's not going to be available to consumers, but the fact that it is even being made in those areas is awesome. 4K monitors fell in price from 3000 dollars to 500...
Now more stores have it. Most list a late-March delivery date. Inet is often quite optimistic about its delivery times.
I've ordered one. I want a monitor with very low input lag and this will be a lot cheaper than the BenQ Freesync model. I'll see if I'm happy with it, the worst thing to happen...
So we're seeing OLED screens in a lot of tablets these days, not just Samsung's (like Dell's recent Venue 8). Ditto in mobile phones, like Microsoft's L930.
It's becoming more and more mainstream. Is it easier to scale a small screen up to a large one, or a large one down to a smaller one? (I'm...
Just as a refresh (pun unintended), here's the monitor:
Here's techpowerup's story. They don't have a price, but I took that from Inet (Sweden has 25% VAT plus import duty; take that away and you get about 500 dollars).
So the Acer XG270HU was originally a G-Sync model, but now Acer is announcing a Freesync one. The price is 500 dollars. This is interesting, because it undercuts the BenQ's identical specs by 150 dollars.
The specs are basically the same as the ROG Swift's but with thin bezels for that stylish...
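The back-of-envelope conversion behind that Inet-based dollar figure is just: strip the 25% VAT, then convert at the going exchange rate. A sketch, where the 5500 SEK listing and the 8.8 SEK/USD rate are illustrative assumptions of mine, not actual quotes:

```python
def ex_vat(price_sek: float, vat_rate: float = 0.25) -> float:
    """Price excluding VAT, given a VAT-inclusive price."""
    return price_sek / (1 + vat_rate)

# Hypothetical 5500 SEK listing, assumed exchange rate of 8.8 SEK/USD:
price_usd = ex_vat(5500) / 8.8
print(f"~{price_usd:.0f} USD ex VAT")  # ~500 USD
```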
That is a strange argument. 4K on 24" requires scaling to be readable and quite significant scaling too.
But your second argument vs 27" and 32" is "well you have to deal with scaling". Well, newsflash, scaling on 24" is going to be a bigger issue than on 27". And if it isn't, that's because the...
If your definition of "high end" remains anchored to a greedy proposition called G-Sync, then I'm not surprised you're disappointed. I'm only surprised you weren't from the start.
With smartphone profits falling, Samsung is diversifying its production. It is now looking for more external clients, primarily for smartphones and tablets, Reuters reports.
At first the focus will be internal, on Samsung's own products as more and more of them go OLED, but it is also...
It will effectively have Freesync, since Freesync is essentially Adaptive Sync but with an AMD-guaranteed gamut.
http://www.pcper.com/news/Displays/CES-2015-ASUS-MG279Q-27-2560x1440-IPS-120-Hz-Variable-Refresh-Monitor
I disagree. The LCD market is innovating rapidly. We were in stagnation a few years ago. Now we're seeing 1440p go mainstream, 4K is affordable, Adaptive Sync is becoming the norm, the era of 100+ Hz monitors is just beginning, and OLED is around the corner (which of course isn't about...
Really embarrassing that you don't get info on whether it is TN or IPS. Why are they hiding that? They state 178° viewing angles, which implies IPS, but not stating it outright is suspect at best.
Also, any monitor you're going to game on that doesn't have G-Sync/Freesync in 2015 is a pointless purchase...
So Digitimes is reporting that we'll likely see 4K mobile phones this fall.
This is consistent with earlier reports. Such as this Samsung slide from last year.
(Source)
My reactions:
1. Why? Totally unnecessary and stupid
2. Please go on, not because I think that we need that in mobile...
Nice contribution, Rafall! I have a few questions. Did you calibrate your screen after you received it (I assume you did), and if so, by how much did you have to tweak it? Was the factory calibration decent?
What are your (average) FPS in other games that are better optimized than newly...
How common is the flickering with DP 1.2? How bad is it generally? I mean, without DP 1.2 you lose a lot of bandwidth. Disabling DP 1.2 is a nonstarter for a 144 Hz version of this kind of monitor, like the Acer one.
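Rough numbers on why disabling DP 1.2 is a nonstarter at 144 Hz (a sketch; the ~10% blanking overhead is my approximation, and the effective link rates are the standard 4-lane figures after 8b/10b coding):

```python
def data_rate_gbps(w: int, h: int, hz: int, bpp: int = 24,
                   blanking: float = 0.10) -> float:
    """Uncompressed video data rate in Gbit/s, with ~10% blanking overhead."""
    return w * h * hz * bpp * (1 + blanking) / 1e9

need = data_rate_gbps(2560, 1440, 144)
dp11 = 8.64    # DP 1.1 / HBR, 4 lanes, effective Gbit/s after 8b/10b
dp12 = 17.28   # DP 1.2 / HBR2, 4 lanes, effective Gbit/s after 8b/10b

print(f"1440p @ 144 Hz needs ~{need:.1f} Gbit/s")  # ~14 Gbit/s
print(f"Fits in DP 1.1 bandwidth: {need <= dp11}")  # False
print(f"Fits in DP 1.2 bandwidth: {need <= dp12}")  # True
```

So falling back to DP 1.1-level bandwidth leaves roughly half the data rate 1440p at 144 Hz needs.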
Why not just return it and get a new one? It sounds like a problem with the panel. Most people in the 4065UC thread haven't had this problem.
Seiki is coming out with a similar model in a few months.
That's because very few games use more than 4 GB of VRAM; Shadow of Mordor is one of the very few exceptions that prove the rule. And at any rate, Nvidia's double-VRAM cards are typically hilariously overpriced. I still can't believe people buy those cards, but then again I laughed at people who...
So Sweclockers is reporting that the first few batches are arriving in early February. Again, the price hasn't changed: about 50 dollars (excl. VAT) less than the ROG Swift.
Pretty sure the etailers have formed a cartel. Sweden doesn't have Amazon because we have a lot of small, niche...
The first prices for the BenQ XL2730Z have arrived in Sweden and it's a bummer. It's only 50 dollars cheaper than the Swift (using Swedish prices ex VAT, not Newegg). Then again, there are only two resellers; for the Swift, we're talking about 10+. As more competition comes in, I'm guessing the prices...
It isn't confirmed to have Freesync, but it's important to note that Freesync is basically very close to Adaptive Sync (the VESA standard that Freesync builds on).
The monitor has DP 1.2a and AMD has stated that the same video cards that will work with Freesync will work with that...
Was that matte film ever a real problem, though? I make a distinction between actual real-world issues and pet-peeve issues. Not all matte films are created equal, just as not all TN panels are. A typical 1080p TN panel and the Swift's are miles apart.
The difference between 1080p and 1200p is negligible, especially as 1440p becomes the new mainstream and soon 4K. To pretend otherwise is to delude oneself.
16:9 was inevitable, primarily because of TVs as already pointed out. It's better for gaming too, for the same reason why 21:9s are so...