ASUS Announces ROG SWIFT PG278Q Premium Gaming Monitor

Ugh, full matte screen. An "extensive majority" of feedback prefers matte? They must be talking to quite a few noobs. ;)

I guess my matte film removal service will get a little bump when this panel releases lol.

Ugh, hopefully BenQ/Samsung/ViewSonic jump in to fill the void. Too bad you can't offer your services internationally; I'd be very interested if not for the prohibitive shipping costs.
 
I've shipped to the EU. It costs around $110 per monitor but some people are willing to pay. :)
 
I was watching the latest "WAN Show" on Linustechtips, and at the 1:22:57 mark they began talking about the whole G-Sync vs. FreeSync thing; Linus mentioned the PG278Q and went off on a tangent describing what he thought of it from what he saw at CES. He said that the display is indeed "much better looking than most TN panels", which is comforting coming from someone who swears by his high-end IPS models. He said the panel is not as good looking as an IPS/PLS panel, that's a given, but did reiterate that it's high quality. So that's good.

He also stated that while at CES, when no one was looking, he went to check whether it could run at 144 Hz, and indeed it can; however, he did not specify whether it was running at 144 Hz at 2560x1440. I gather that this is in fact the case, as it wouldn't make sense to laud such a thing if it were only 1080p, since we already have that resolution at 144 Hz (logical deduction and all that :D). So basically we are 99.9% certain this thing can run 2560x1440 at 144 Hz.

So yay.

http://youtu.be/cmuxVKCG5ws?t=1h22m57s
 
Blue, my only worry is that he said the XL240TE was a good looking TN. I found it rather average.

As for flicker-free, sadly I doubt it, as that would definitely be something they would market (which they aren't).
 
I was watching the latest "WAN Show" on Linustechtips, and at the 1:22:57 mark they began talking about the whole G-Sync vs. FreeSync thing; Linus mentioned the PG278Q and went off on a tangent describing what he thought of it from what he saw at CES. He said that the display is indeed "much better looking than most TN panels", which is comforting coming from someone who swears by his high-end IPS models. He said the panel is not as good looking as an IPS/PLS panel, that's a given, but did reiterate that it's high quality. So that's good.

He also stated that while at CES, when no one was looking, he went to check whether it could run at 144 Hz, and indeed it can; however, he did not specify whether it was running at 144 Hz at 2560x1440. I gather that this is in fact the case, as it wouldn't make sense to laud such a thing if it were only 1080p, since we already have that resolution at 144 Hz (logical deduction and all that :D). So basically we are 99.9% certain this thing can run 2560x1440 at 144 Hz.

So yay.

http://youtu.be/cmuxVKCG5ws?t=1h22m57s

DP 1.2 doesn't have the bandwidth to run 1440p at 144Hz. It probably gets close, and I assume does 144 Hz at lower resolutions.
 
Ugh, full matte screen. An "extensive majority" of feedback prefers matte? They must be talking to quite a few noobs. ;)

I guess my matte film removal service will get a little bump when this panel releases lol.


I called them out on this claim, to which they retorted with this:

Thanks again for your feedback. Our community polling and interaction ranges from community sites like Anandtech, HardOCP, Widescreen Gaming, PC Gamer, Steam communities and dedicated game forums, and much more. We take pride in the fact that the feedback is sourced from users across all kinds of usage models and from all kinds of communities (most of the ones noted above are gaming only, not content creation or specialized usage models).

With that noted, it is a reality that some users like yourself will have a difference in opinion, and that is important and appreciated. If I, along with the rest of our collective team, get enough feedback that we should shift to using glossy displays, rest assured we will move in that direction; at this time, though, our collective feedback mirrors what users are looking for. Feel free to send any feedback you would like to [email protected]. Enjoy the rest of your Friday and have a great weekend.

I was not aware of any polling or interaction concerning this product. At least not here.


*sigh* PR.


I seem to have their attention, so I am going to throw some of your more technical questions at them and see if I can get a bite.
 
DP 1.2 doesn't have the bandwidth to run 1440p at 144Hz. It probably gets close, and I assume does 144 Hz at lower resolutions.

More likely they can't advertise specs that exceed the standard, but the hardware will do it, since you usually have a good margin of tolerance.
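
For anyone curious, here's a rough back-of-the-envelope check of how close 2560x1440 at 144 Hz comes to DP 1.2's payload. This is just a sketch: it assumes CVT-R2-style reduced blanking and 24 bpp, and the panel's real timings may differ.

```python
# Rough DP 1.2 bandwidth check for 2560x1440 @ 144 Hz (illustrative only).
# Assumes CVT-R2-style reduced blanking (~80 px horizontal, ~41 lines vertical)
# and 24 bpp colour; the monitor's actual timings may differ.

H_ACTIVE, V_ACTIVE, REFRESH_HZ, BPP = 2560, 1440, 144, 24
H_TOTAL = H_ACTIVE + 80      # assumed horizontal blanking
V_TOTAL = V_ACTIVE + 41      # assumed vertical blanking

pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH_HZ
required_gbps = pixel_clock_hz * BPP / 1e9

# DP 1.2 (HBR2): 4 lanes x 5.4 Gbit/s, 8b/10b encoding leaves ~80% for payload
dp12_payload_gbps = 4 * 5.4 * 0.8

print(f"needs ~{required_gbps:.1f} Gbit/s vs ~{dp12_payload_gbps:.2f} Gbit/s available")
# -> needs ~13.5 Gbit/s vs ~17.28 Gbit/s available (fits, on these assumptions)
```

So with reduced blanking it looks like it squeezes in under the limit, though it is close; with looser timings it could tip over, which may be why the advertised specs stay conservative.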
 
Well, I don't think going full glossy is the best answer either. Really, the new semi-gloss films used on the new Dells and the Eizos are the best route IMO. They can make virtually everyone happy.
 
Ugh, full matte screen. An "extensive majority" of feedback prefers matte? They must be talking to quite a few noobs. ;)

If you go into any non-enthusiast forum or chat channel, people are going to say they prefer matte, and they will even complain loudly about how companies like Apple have dumped the matte option on their MacBooks/screens. That's because most people are used to experiencing LCDs in environments where they can't control the lighting: often businesses with very bright fluorescent lighting that is never at the right angle to avoid reflections, laptops in public spaces, etc. Reflections and glare ruin a screen a lot faster than the side effects of matte do.

It's only on forums like this where people prefer glossy because they are used to using the monitors in a darker environment where they control the lighting, and in that environment glossy is much better.


Well, I don't think going full glossy is the best answer either. Really, the new semi-gloss films used on the new Dells and the Eizos are the best route IMO. They can make virtually everyone happy.

Yeah, I agree with this; the new semi-gloss coatings seem really good. Unfortunately, I would bet money that the way Asus asked the question was "would you prefer glossy or matte?", and given the above, it's no surprise that the majority prefer matte. It just has to do with the environments they're used to experiencing monitors in.

If Asus did something like "here are 3 monitors, which one do you prefer?" (where one is gloss, one is matte, and one is semi-gloss) in a moderately lit environment, I bet you wouldn't see the significant bias towards matte.
 
We'll just have to wait and see.

I prefer glossy, as I have complete control over the lighting of the room in which I game.

I am of the pragmatic mind that if one can afford (and has the gumption) to spend $800 on a display to play games on, they can afford blackout curtains, or perhaps to rearrange the room in a way that is unobtrusive to the viewing experience.

And if perchance wifey-poo doesn't like the blackout curtains or the flow of the furniture, well, I'm sure she won't appreciate the price tag on that fancy new screen we bought to play our funny little games on.

I mean this is some enthusiast gear, why bother half-assing it for the lowest common denominator of the buying public? (rhetorical question)

Unfortunately they say they cannot afford to pump out variant SKUs of this model to appease all tastes (some tastes are better than others), which I don't buy for a second. I mean, hell, if they can do it in Korea they can do it in Taiwan.

Oh well, hopefully we can find a solution. I'm pretty dead set on purchasing this, as I've waited several years for it; however, I don't plan on making the purchase until the end of the year, after I've completed my Haswell-E/Maxwell build. Perhaps by then they may have a revision on the market or on the horizon, preferably with glossy screens and DP 1.3 ports if the GPUs will allow.

.....We'll just have to wait and see.
 
Well, I don't think going full glossy is the best answer either. Really, the new semi-gloss films used on the new Dells and the Eizos are the best route IMO. They can make virtually everyone happy.
Virtually, unless there's an eyelash in the coating, right?

I was not expecting the Eizo coating to be quite as reflective as it is (not that it's overwhelming) but I'm sold on it, because it's clearer and more natural looking. Was pleased to hear you didn't bother removing the films from your Foris array.
 
We'll just have to wait and see.

I prefer glossy, as I have complete control over the lighting of the room in which I game.

I am of the pragmatic mind that if one can afford (and has the gumption) to spend $800 on a display to play games on, they can afford blackout curtains, or perhaps to rearrange the room in a way that is unobtrusive to the viewing experience.

And if perchance wifey-poo doesn't like the blackout curtains or the flow of the furniture, well, I'm sure she won't appreciate the price tag on that fancy new screen we bought to play our funny little games on.

I mean this is some enthusiast gear, why bother half-assing it for the lowest common denominator of the buying public? (rhetorical question)

Unfortunately they say they cannot afford to pump out variant SKUs of this model to appease all tastes (some tastes are better than others), which I don't buy for a second. I mean, hell, if they can do it in Korea they can do it in Taiwan.

Oh well, hopefully we can find a solution. I'm pretty dead set on purchasing this, as I've waited several years for it; however, I don't plan on making the purchase until the end of the year, after I've completed my Haswell-E/Maxwell build. Perhaps by then they may have a revision on the market or on the horizon, preferably with glossy screens and DP 1.3 ports if the GPUs will allow.

.....We'll just have to wait and see.

...
You do not need to sit in the dark with a glossy monitor. The problem with lighting is that people typically set up their "computer studio" with their desk against the wall like a bookshelf, which acts as a catcher's mitt for direct light pollution no matter what type of coating they have. Computers have often been seen as something to stuff away somewhere, as opposed to how some people set up a nice TV "theatre" specifically for lighting, seating, and surround audio, for example, or how people set up a photo studio, etc.
.
I set my corner desk away from the corner, taking over the corner almost like a cubicle or command-deck type of thing. The room itself has plenty of lighting from floor lamps and a window, but none of it is above or in front of the monitor faces and desk. I even have a small lamp at each end of my long desk, in line with/adjacent to my monitors, but they aren't in front of them where they would have an angle of reflection.
.
Direct light pollutes any monitor and its color space, no matter what the coating.
[Image: LCD glare comparison, anti-glare vs. glossy]


Variable lighting conditions completely alter the way our eyes and brains perceive brightness, contrast and saturation, so if you don't maintain the same lighting conditions, your settings are completely off when the room lighting changes. I keep lamps so that the daytime window lighting levels are maintained at night, so there is no major fluctuation in lighting level. In my living room, I keep 3 sets of settings on my TV for different lighting conditions/times of day for the same reason.
Even hardware monitor calibration is usually done right up against the screen in a dark room. It is a good baseline, but once you change the lighting and use the monitor in your actual viewing environment, the way your eyes and brain see that calibrated state differs from what you calibrated, so you really should tweak it further to suit the way your eyes see it in that lighting environment. Any direct light hitting the panel also pollutes the color space, as I stated. And if you vary the lighting conditions in the room, your perceived settings fluctuate greatly (most notably, a brighter room yields a paler screen and poor contrast, while a darker room yields a brighter, more saturated image, i.e. it can be too much brightness and saturation).
.
So IMO you don't need to black out the room, just have a proper "studio" layout. For the ultimate "blackout", though, the Oculus Rift is coming. ;)
 
From the Blur Busters site, regarding someone who had a dev-model G-Sync monitor:

It actually gets its own hardware button on the kit's monitor as well, which makes it very convenient to enable/disable; it's just to the left of the power button. I generally preferred G-Sync on, which meant ULMB wasn't available, but on older games where I hold a solid, huge FPS (or for all of my 2D usage) it was quick and easy to enable. Hopefully a future update or revision allows for both G-Sync and ULMB, as that would truly be the best of both worlds.

ULMB is noticeably better than the LightBoost hack on the same monitor; it's still somewhat dimmer, but it doesn't wash out the colors or give any kind of tint. I don't have a high-speed camera handy, or I'd take some sample pictures that could be compared against LightBoost; however, I'm sure Blur Busters will jump in with these once the media ban is lifted. I'm quite impressed with the improvements they made over straight LightBoost, especially considering the kit has no settings for color tone or contrast.
.
So there are G-Sync monitors and kits that have a ULMB mode, superior to LightBoost, that you can turn on. I don't know why this Asus PG278Q does not support it at the outset and forces you to use the LightBoost hack (according to the PCDIY Asus rep in the YouTube link).
http://www.youtube.com/watch?v=vMMtsnEfWeY
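
To put some rough numbers on why strobed modes like ULMB/LightBoost come out dimmer than sample-and-hold, here's an illustrative duty-cycle calculation; the pulse widths are made-up example values, not the PG278Q's actual figures.

```python
# Illustrative strobe-backlight duty-cycle calculation (example values only,
# not actual PG278Q/ULMB figures).

refresh_hz = 120
frame_period_ms = 1000 / refresh_hz              # ~8.33 ms per refresh

for pulse_ms in (frame_period_ms, 2.0, 1.0):     # sample-and-hold vs two strobe lengths
    duty = pulse_ms / frame_period_ms
    # With the backlight at a fixed intensity, average brightness scales with duty cycle.
    print(f"pulse {pulse_ms:4.2f} ms -> {duty:5.1%} of full-persistence brightness")

# Roughly: an 8.33 ms pulse is sample-and-hold (full brightness, maximum blur),
# a 2 ms pulse is ~24% brightness, and a 1 ms pulse is ~12% brightness.
```

That's why strobed modes typically drive the LEDs harder during the pulse, and part of why ULMB still comes out somewhat dimmer, as the quoted impressions above mention.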
 
From the Blur Busters site, regarding someone who had a dev-model G-Sync monitor:


.
So there are G-Sync monitors and kits that have a ULMB mode, superior to LightBoost, that you can turn on. I don't know why this Asus PG278Q does not support it at the outset and forces you to use the LightBoost hack (according to the PCDIY Asus rep in the YouTube link).
http://www.youtube.com/watch?v=vMMtsnEfWeY

Yeah, that's disappointing and utterly baffling. It's still up in the air, though. A few reps and spokespeople on the ROG forum have retracted certain statements about the product in question, citing that their information contradicts the actual specifications, and will not make any further claims until the hard data on the PG278Q comes to light. One can only hope that this is also true for the LightBoost feature, or lack thereof.
 
I thought I would paste this excerpt of the Oculus Rift interview from Road to VR on YouTube, since it shows how important low persistence/blur elimination is as a feature in first/third-person-perspective CGI gaming worlds.

From the CES 2014 interview with Palmer Luckey and Nate Mitchell:
http://www.youtube.com/watch?feature=player_embedded&v=3YoUV7uty40
Interviewer (Ben): Why don't you start out with a quick explanation of what low persistence is, why you are using it, and why it is better.

Oculus Devs: "I'll start back to front. We are using low persistence because it allows us to eliminate motion blur, reduce latency, and make the scene appear very stable for the user. The best way to think about it is: with a full-persistence frame, you render a frame, you put it on the screen, it shows on the screen, and then it stays on the screen until the next frame comes. Then it starts all over again. The problem with that is a frame is only correct, in the right place, when it's right here <motions with hands together to indicate a short middle period>. For the rest of the time, it's kind of like garbage data. It's like a broken clock - you know how a broken clock is right occasionally, when the hands move to the right place - most of the time it's showing an old image, an old piece of data. What we're doing with our low-persistence display is rendering the image, sending it to the screen, showing it for a tiny period of time, and then blanking the display. So it's black until we have another image. So we're only showing the image when we have a correct, up-to-date frame from the computer to show. If you do that at a high enough frame rate, you don't perceive it as multiple discrete frames, you perceive it as continuous motion, but because you have no garbage data - you know, nothing for your retina to try to focus on except correct data - you end up with a crystal-clear image.
And part of that - one of the missing features that was required to do low persistence is pixel switching time. We needed a sub-millisecond pixel switching time, which we get from OLED technology, to allow us to do all of this.
- And to be clear, pixel switching time is a big factor in motion blur. In fact, we used to think it was an even bigger factor, and we drove pixel switching time down, down, down... and once we started experimenting with displays that allowed us to switch almost instantly, getting completely rid of the pixel switching time, it turned out that there are a lot more artifacts, like judder, that look like motion blur even when the panel is perfect. When you put our panel under a high-speed camera, every single frame is perfectly crystal clear, whereas on an LCD you would see a smeared, blurry image because the pixels are switching. For us, it's always crystal clear... it's all in your brain, this motion blur.

That's probably the biggest update we've made to this prototype. It's a major breakthrough in terms of immersion, comfort, and the actual visual stability of the scene. Now you can actually read text, not only because of the high resolution, but because before, with text in the world, even if you were moving your head just a little bit, which most of us naturally do as we look around a scene, the text would just smear, very heavily. Now with low persistence, all of the objects feel a lot more visually stable and locked in place.
It's worth noting that this technology will continue to be important for VR for a very long time. It's not a hack that gets around some issue we have right now. Until we get to displays and engines that can render at 1000 frames a second and display at 1000 Hz, basically displaying full-persistence frames that are as short as our low-persistence frames, there's going to be no other way to get a good VR experience. It's really the only way that's known. And Valve's Michael Abrash has a blog post from about a year ago talking about the potential for low persistence to solve these issues. Right now there is no other way that we know of."
.
Interviewer: "And although - everyone is talking about the positional tracking, and that's awesome, and everybody's been looking forward to that, you guys were telling us earlier that you think low persistence is perhaps a bigger, more important breakthrough for now that positional tracking".

A: "I mean, position tracking it's really good and it's important but it's something we've always known we needed to have and so we were going to have to build it, it was an expected. That's obvious for any VR system. Any VR system where you're trying to simulate reality, you want to simulate motion as accurately as possible. We weren't able to do it in the past, but we knew it was going to happen for consumers. So low persistence is a breakthrough. In that it was unexpected..it was, we did not expect to see the kind of jump in quality that we saw - where we said this isn't just one of those "every little bit helps" , it is a killer - it completely changes the way that, it completely changes the experience.. fundamentally."
.
Interviewer: "Now I want to backtrack slightly to low persistence. Earlier I think you guys had mentioned that lower persistence, in addition to bringing up the visual fidelity, reduces latency is that correct?"

A: "Kindof. Well, it that they all work together. You can't do low persistence without really fast pixel switching time, and fast pixel switching time also allows us to have really low latency. Um, because as soon as the panel gets the frame, we're displaying it and it's instantly showing the correct image."
"So I think if you look at the motion to photons latency pipeline that we've talked about alot, pixel switching time has always been one of the key elements in there, in that there is this major delay as the pixels change color. Now that we've eliminated the pixel switching time because of the oled technology - it's not that low persistence is getting us even lower latency, but all together - I think what's interesting is that at E3 when we showed the HD prototypes, those demos were running between you know, 50 to 70 ms of latency for the ue4 elemental demo. Here at ces 2014 we're showing the epic strategy VR demo and E valkyrie and both of those demos are running between 30 and 40ms of latency. So that's a pretty dramatic reduction, you know, in terms of the target goal which is really delivering consumer V1 under 20ms of latency. "
"That is a goal we'll be able to pull off."
 
I thought it kind of funny when he said "low persistence is a breakthrough". Oh, you mean what CRTs have been doing for 70 years? lol

OK, granted it takes flat-panel displays for VR, but for it to take this long for people to wake up to circuit delay, pixel persistence, and the sample-and-hold travesty is laughable.
 
As they say, better late than never... but I understand being disgruntled about it, believe me.
 
I thought it kind of funny when he said "low persistence is a breakthrough". Oh, you mean what CRTs have been doing for 70 years? lol

OK, granted it takes flat-panel displays for VR, but for it to take this long for people to wake up to circuit delay, pixel persistence, and the sample-and-hold travesty is laughable.

Yeah, for those of us who stuck with CRTs long after LCDs took over the display market, the importance of low persistence is pretty obvious. People were talking about this stuff years ago... but it's great that low persistence has finally been recognized as critically important.
 
I thought it kind of funny when he said "low persistence is a breakthrough". Oh, you mean what CRTs have been doing for 70 years? lol

OK, granted it takes flat-panel displays for VR, but for it to take this long for people to wake up to circuit delay, pixel persistence, and the sample-and-hold travesty is laughable.

Now, with low persistence - if you shift your view and there's some text... It won't smear! I couldn't believe what I was hearing.
 
From JJ at Asus regarding PWM:

Depends on the operating mode for primary functionality it operates in Direct Control as opposed to PWM. For UMLB operation it is a suto PWM operation to control the backlight. More information on that mode will be release later on.

When he says "suto" PWM operation, I assume he meant "auto":confused:. Anyway, what do you guys think of this?
 
Or 'pseudo'. He probably is just saying that ULMB is a form of flickering, which it is. Good news on regular mode! Hopefully that becomes a standard for more monitors this year.
 
From JJ at Asus regarding PWM:



When he says "suto" PWM operation, I assume he meant "auto":confused:. Anyway, what do you guys think of this?

Sounds like good news to me. If not in strobe-backlight mode, it's direct DC voltage control and not PWM, and of course it strobes in ULMB mode. If that's what he means, it's a very good thing.
 
Or 'pseudo'. He probably is just saying that ULMB is a form of flickering, which it is. Good news on regular mode! Hopefully that becomes a standard for more monitors this year.

That is what I was thinking. Probably just a misspelling.
 
I'm just adding my conversation with JJ to this topic. Some of the things in here will be repeated information, but there are one or two things that have yet to be outright stated in this topic.

Due to the disclaimer in JJ's e-mail I'll paraphrase the responses.

My first e-mail to JJ:
Good morning,

I just wanted to add my voice to the pile. Two things. First, Asus should offer these monitors in both matte and glossy, as some of us do like a glossy screen.

Secondly, I've seen cable bandwidth calculations, and wouldn't it be best to have the monitor be DisplayPort 1.3 as opposed to 1.2? Furthering this thought, how are we supposed to put these monitors in Surround when no dual-DisplayPort GeForce graphics card exists yet?

His reply
Cost of production is limiting them to matte-only screens. The majority of people they talked to want matte, so they're doing that first. Glossy or semi-glossy they have to be careful with, as everyone has different preferences about just how glossy something should be; however, doing that in the future, after the matte screens, is a possibility.

DisplayPort 1.3 currently brings no benefit over 1.2 to these screens (again, paraphrasing his words here). The internal display chain, with regard to the whole electronic package, has to be taken into account; it's not as easy as just phasing in the specification.

Mentions daisy chaining as an option.

My reply
While that's true of daisy-chaining DP, if you want more than 60 fps you can't daisy chain.

It would be nice to see a Surround setup supported with 3 of these monitors all running at 120 Hz. I know Nvidia says G-Sync doesn't work with Surround [source here is talking directly with an Nvidia rep], but if you build a card with Surround-capable DisplayPort outputs, I'm sure you'll get even more people snatching up these monitors. Besides, technology only gets better, and I have a feeling that if cards can support Surround over DisplayPort, Nvidia will get G-Sync working with a Surround setup in no time.

His last reply
He's right about the limitations in DP 1.2, but it's a spec limitation, not a panel limitation.

DP 1.3 will be possible later, when the spec is available for all internal components and is further refined for implementation.

ASUS has their own monitors and graphics cards, so they like to align where they can. Enthusiasts already asked for more DP ports per card, so they did that with AMD products. Nvidia needs to support multiple DP outputs through the card design, drivers, etc. And finally, ASUS values all feedback from enthusiasts and PC gamers.

Final thought(s)
So it looks like we won't see DP 1.3 with this version (year) of the monitor, and until Nvidia and ASUS get together to bring out a multi-DP card, Surround isn't happening. Pure speculation on my part here, but unless Nvidia comes out with a dual-DP-capable Kepler card and Asus takes advantage of it in the yet-to-be-released (rumored?) 790 or 6GB 780 Ti, we won't see a dual-DP-capable card until Maxwell, and even then I have no confirmation of that.
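
On the daisy-chaining point, the rough arithmetic (same reduced-blanking and 24 bpp assumptions as the earlier sketch) shows why three 1440p panels at high refresh can't share a single DP 1.2 link, while one panel per port is fine:

```python
# Rough check: can three 2560x1440 streams share one DP 1.2 link? Illustrative only;
# assumes CVT-R2-style reduced blanking and 24 bpp.

def stream_gbps(h_active=2560, v_active=1440, refresh_hz=60, bpp=24,
                h_blank=80, v_blank=41):
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz * bpp / 1e9

dp12_payload_gbps = 4 * 5.4 * 0.8   # HBR2, 4 lanes, 8b/10b -> ~17.28 Gbit/s usable

for hz in (60, 120):
    total = 3 * stream_gbps(refresh_hz=hz)
    verdict = "fits" if total <= dp12_payload_gbps else "exceeds DP 1.2"
    print(f"3 x 1440p @ {hz} Hz: ~{total:.1f} Gbit/s ({verdict})")

# At 60 Hz the three streams total ~16.9 Gbit/s (just barely fits, hence 60 Hz daisy chaining);
# at 120 Hz they total ~33.8 Gbit/s, roughly double what DP 1.2 can carry.
```

So the "more than 60 fps means no daisy chaining" limitation above is really just a bandwidth ceiling, independent of whatever G-Sync does or doesn't support in Surround.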
 
Limited DisplayPort outputs, then, unless someone does an AMD hack for ULMB mode; that's if this monitor even ends up having a true ULMB mode and not just a LightBoost hack.
.
However, it's worth noting that if you want to play any sort of slightly demanding game on three 2560x1440 monitors, you would need at least two GPUs if not three, so if you can use one DP port per card it would still work out. Hell, running a single 2560x1440 at 120 fps is super demanding and requires at least two powerful cards for any moderately demanding game at high to ultra settings.
 
Limited DisplayPort outputs, then, unless someone does an AMD hack for ULMB mode; that's if this monitor even ends up having a true ULMB mode and not just a LightBoost hack.
.
However, it's worth noting that if you want to play any sort of slightly demanding game on three 2560x1440 monitors, you would need at least two GPUs if not three, so if you can use one DP port per card it would still work out. Hell, running a single 2560x1440 at 120 fps is super demanding and requires at least two powerful cards for any moderately demanding game at high to ultra settings.

The way Nvidia Surround works is that you can have all cables in one card, or two cables in card #1 and one cable in card #2, which is in SLI mode. You can't run Surround with any other configuration.

Because these are DP-only and Nvidia to date only has one DP per card, yes, you'd need 3 video cards to even run all 3. However, you can't do Surround, because 3 cards with one DP cable each breaks the rule in the previous paragraph. That, and G-Sync only works over DP currently.

On your topic though, an Nvidia rep pretty much said that if I have 3x 780 Tis NOT SLI'd together, they should run these monitors to near full capacity in most games.
 
http://us.hardware.info/reviews/463...li-and-3-way-sli-metro-last-light---5760x1080

It would require turning settings down in some games, but everything has tradeoffs. Even x1080 is super demanding across three displays, especially if you are shooting for 120 fps to feed 120 Hz.

Far Cry 3 w/ 8x MSAA on ultra, 1920x / 2560x / 5760x1080:

770 SLI: 90 fps / 50.6 fps / 9 fps
Single 780 SC: 64 / 37 / 23

780 tri-SLI, Far Cry 3, 5760x1080, medium = 89.1 fps

780 tri-SLI, Far Cry 3, 5760x1080, ultra 4xAA = 51.4 fps

Tomb Raider, 780 tri-SLI, 5760x1080, ultra 4xAA = 54.9 fps (195 fps on normal, though)

Metro Last Light, 780 tri-SLI, 5760x1080, very high = 44 fps, medium = 60 fps
 
And for those of us who play a few older games, no problem :p.

TBH if I do end up getting these monitors, I'll be waiting to see what Maxwell can bring to the table.

I'm running two 1080p screens on a single GTX 580 (4GB memory version) and it's working well enough. Once Haswell-E is out and Maxwell cards are out, time for an upgrade! Hopefully Maxwell will push those fps numbers even higher.
 
Once Haswell-E is out and Maxwell cards are out, time for an upgrade! Hopefully Maxwell will push those fps numbers even higher.

Oh, you won't have to worry about that. Just be on the lookout for whatever "Super Ultra Hyper Championship Edition" cards Nvidia will no doubt be hiding up their sleeves.

For me, well, I'm waiting for a pair of EVGA Classified 880 Tis, or whatever Ti equivalent they have in store for us.

Welcome to the forum by the way.
 
I didn't understand one important thing:

Does this PG278Q have a native 8-bit panel, or the usual 8-bit achieved with 6-bit + dithering?

And when is it coming out in shops?
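
Since the 8-bit question hasn't been answered at this point in the thread, here's a quick conceptual sketch of what 6-bit + dithering (FRC) actually means, for anyone unfamiliar: the panel flips between its two nearest 6-bit levels over successive frames so the time-average approximates the requested 8-bit level. This is just an illustration of the idea, not anything specific to this panel.

```python
# Conceptual sketch of temporal dithering (FRC): a 6-bit panel approximates an
# 8-bit level by alternating between its two nearest 6-bit levels over time.

def frc_frames(level_8bit, n_frames=4):
    """6-bit level shown on each of n_frames to approximate a target 8-bit level."""
    low6 = level_8bit // 4                 # nearest 6-bit level at or below the target
    fraction = (level_8bit % 4) / 4        # how far toward the next 6-bit level
    # Show the higher level on roughly that fraction of the frames.
    return [low6 + (1 if (i / n_frames) < fraction else 0) for i in range(n_frames)]

target = 130                               # an 8-bit grey level with no exact 6-bit match
frames = frc_frames(target)
avg_in_8bit_terms = sum(frames) / len(frames) * 4
print(frames, "->", avg_in_8bit_terms)     # [33, 33, 32, 32] -> 130.0
```

Whether the PG278Q is native 8-bit or 6-bit + FRC is exactly the kind of spec that will hopefully get confirmed once real reviews land.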
 
I have nothing against less demanding games. I still play L4D2 and TF2. However, from what I could understand of your post, you claimed that three 780s would run most games at full capacity on not one but three 2560x1440 monitors, and that is not true, especially if you are trying to get the most out of 120 Hz by feeding it very high framerates (~120 fps). Most non-demanding games, perhaps.

The 800 series won't be out for 9 to 12 months from what I've heard. It's still something to look forward to.
With different bandwidth/output standards upcoming, G-Sync dynamic refresh, low persistence/ULMB becoming a priority, maybe AMD Mantle unknowns, and also Oculus Rift performance capability on different GPU setups considering any of the aforementioned factors and other unknowns, it would be hard for me to dump a lot of money into GPUs at this juncture, since we seem to be at a pretty large turning point this year or by the end of it. Twelve months can seem like a long time, but it goes fast compared to dumping $720 / $1,500 / $2,100 or more on GPUs early versus those imminent tech releases and unknowns, especially at the more expensive end of that scale.
 
To clarify, 'most games' refers to 90% of all games (give or take), which includes any and all games that can run on a PC, from the dawn of time until now.

And that's what the Nvidia rep I talked to added. I know you won't get 3 instances of BF4 all running at 120 fps at 1440p, of course, but for the older or less demanding games, no problem!
 