Archaea....that's the price....deal with it or wait for a sale, or buy a cheaper CF791, or buy an even cheaper microboard.

Surround displays suck balls compared to one nice widescreen solution sitting on your desk.

Well if you say so.

I've had a 34" curved Acer and much preferred my three 32" Omens --- when games support it.

Shadow of Mordor in triple-screen Eyefinity with Flawless Widescreen running would convince even the most skeptical here that Eyefinity (or Nvidia Surround) is really pretty exceptional when it works well. That's 81.5" of horizontal screen real estate at my desk with the three Omens.

I'm not saying triple monitor gaming works for everything - we both know it doesn't. I was disappointed to find out Witcher 3 doesn't work well enough to even bother. Everything is stretched so badly on the outer screens that I play with just a single Omen 32. But when triple monitor does work with the FOV - it's really impressive. Another game that is super impressive in Eyefinity is Star Wars Battlefront - specifically the flying modes --- or basically any racing game.
 

Attachments

  • IMG_1097.JPG (165.4 KB)
Got my Omen in. Can say I am pretty darn impressed so far. Reminds me of an Eizo monitor. Build quality is great, as is the quality control on my sample. Zero defects so far, no dead/stuck pixels, height stays put at all settings, no paint peeling from stickers, 100 Hz rock steady right out of the box.

Will post a summary after a day or two of gaming.
 
Got my Omen in. Can say I am pretty darn impressed so far. Reminds me of an Eizo monitor. Build quality is great, as is the quality control on my sample. Zero defects so far, no dead/stuck pixels, height stays put at all settings, no paint peeling from stickers, 100 Hz rock steady right out of the box.

Will post a summary after a day or two of gaming.
Video!!!
 
Dammit Vega! You're not supposed to like this stupid Omen...cause that is gonna cause me to buy one ;-/
 
I am trying real [H]ard to understand why this panel is causing so much commotion [H]ere:

- 100Hz 34x14 is two-year-old tech.
- The CF791 has the same resolution, refresh rate and contrast, but a higher color gamut.
- Neither has an LMB mode.
- The 1080 Ti made Gsync irrelevant at this resolution/refresh rate, as you will be above 100fps more often than not.
 
I am trying real [H]ard to understand why this panel is causing so much commotion [H]ere:

- 100Hz 34x14 is two-year-old tech.
- The CF791 has the same resolution, refresh rate and contrast, but a higher color gamut.
- Neither has an LMB mode.
- The 1080 Ti made Gsync irrelevant at this resolution/refresh rate, as you will be above 100fps more often than not.
None of the BLB or IPS glow that's plagued every other ultrawide, and the best aesthetics of all the ultrawides. A native 100Hz panel vs. an 'overclockable' one. Yes, Gsync... try it before you knock it. A 1080 Ti is still not enough to keep you capped at 100Hz in most games.

What's not to like? It costs money, but this is [H]. I'm still waiting for mine, should be here Friday. Plan on keeping it if there's no noticeable BLB like all the other samples I've read about so far. Only other monitor on the horizon I'm personally interested in is the Asus 4K/144hz/HDR, but we probably won't even see that this year, and it's rumored to be in the ballpark of $2000 when it does land.
 
A 1080 Ti is still not enough to keep you capped at 100hz in most games.

If by capped you mean playing every F game with every F image setting maxed at 100fps, that is true.
If by capped you mean that you will frequently drop below 60Hz at sane image settings, that is BS.
Gsync is useless at a resolution as low as 3440x1440. You will never be below 60Hz with an adequate GPU, and the gaming experience will be better with image quality settings that keep you above 100fps all the time.
Better to SLI than to waste money on Gsync.
Your faith in the green side is misplaced.

Side note: CF791 is VA.
 
If by capped you mean playing every F game with every F image setting maxed at 100fps, that is true.
If by capped you mean that you will frequently drop below 60Hz at sane image settings, that is BS.
Gsync is useless at a resolution as low as 3440x1440. You will never be below 60Hz with an adequate GPU, and the gaming experience will be better with image quality settings that keep you above 100fps all the time.
Better to SLI than to waste money on Gsync.
Your faith in the green side is misplaced.

Side note: CF791 is VA.
You don't like and/or cannot afford Gsync. Cool story. Go buy a CF791?
 
If by capped you mean playing every F game with every F image setting maxed at 100fps, that is true.
If by capped you mean that you will frequently drop below 60Hz at sane image settings, that is BS.
Gsync is useless at a resolution as low as 3440x1440. You will never be below 60Hz with an adequate GPU, and the gaming experience will be better with image quality settings that keep you above 100fps all the time.
Better to SLI than to waste money on Gsync.
Your faith in the green side is misplaced.

Side note: CF791 is VA.

I have a Titan X....and BF1 frequently drops to the 80s on my CF791 and then rockets past 100. Capping FPS at 100 does not eliminate screen tear. Turning Vsync on adds lag; turning on adaptive sync is OK, but you still have those minor stutter issues when fluctuating from the 80s to the 100s..... Gsync makes things buttery smooth and I like that. I love my CF791, but it's primarily a productivity display for me....will probably pick up an Omen 35 soon
 
This is not about liking Gsync or not.
It is about needing it at lower resolutions at such a price premium.
The CF791 costs less and has a better gamut.
Neither has a TFTCentral review yet.
Be ready to eat crow when they do.
Better to spend money on SLI or on a useless Gsync logo?
 
geok1ng,

have you personally tried gsync or freesync?

I thought it was a marketing gimmick too until I tried it. I use Freesync and think it's worlds better than capped FPS or vsync. I think it's one of those things you wouldn't know you were missing until you spent some time with it, but after you spend some time with it --- you can IMMEDIATELY tell when it's not present. No placebo. My brother had gsync and loved it, and I pooh-poohed it until I got freesync; then I understood. You can't appreciate it by seeing a store demo. You have to experience it in your games, and realize you never see frame tears, and just as importantly - when the frame rate dips off the max FPS cap on occasion -- it still feels buttery smooth somehow.

My freesync range is 48-75 on my monitor --- as long as the FPS stays in that range - it's buttery smooth. As soon as it steps below 48 - I know immediately. 48 without freesync feels like stutter. 49 with freesync feels perfectly smooth - it's that type of arrangement. Not intuitive, but it works.
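The 48-vs-49 fps cliff described above is really just a range check. Here is a minimal sketch of that decision; the 48-75 Hz range comes from the post, but the function and its labels are invented for illustration (no real driver exposes anything like this):

```python
def presentation_mode(fps, vrr_min=48, vrr_max=75):
    # Classify how a hypothetical display with a 48-75 Hz FreeSync
    # range presents a frame. Below the range the panel falls back to
    # fixed refresh, which is where the visible stutter comes from;
    # inside it, refresh tracks the frame rate exactly.
    if fps < vrr_min:
        return "fixed refresh (stutter/tearing possible)"
    if fps <= vrr_max:
        return "refresh matched to frame rate (smooth)"
    return "capped at max refresh (tearing unless fps is limited)"
```

Which is why 47 fps feels broken while 49 fps feels smooth: they land on opposite sides of `vrr_min`.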

I actually think you're completely off. A single GPU with gsync or freesync will present a more 'perfect' experience than dual GPUs trying to hold some monitor's Hz cap. Speaking from experience (as it relates to freesync and dual cards as well).
 
I have a Titan X....and BF1 frequently drops to the 80s on my CF791 and then rockets past 100. Capping FPS at 100 does not eliminate screen tear. Turning Vsync on adds lag; turning on adaptive sync is OK, but you still have those minor stutter issues when fluctuating from the 80s to the 100s..... Gsync makes things buttery smooth and I like that. I love my CF791, but it's primarily a productivity display for me....will probably pick up an Omen 35 soon

Now that is a better point. :)
How about choosing IQ settings that do not drop below 100fps?
How about capping frames at 80fps?
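A frame cap like the 80fps one suggested above is trivial to sketch: render, then sleep off whatever is left of the per-frame budget. Everything here is illustrative (`render_frame` is a placeholder callback, not any real engine API; real games use the driver's limiter instead of `time.sleep`):

```python
import time

def frame_cap_loop(render_frame, target_fps=80, frames=3):
    # Toy frame limiter: after rendering each frame, sleep off the
    # remainder of the 1/target_fps budget so frametimes stay even
    # and never overrun the display's refresh window.
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # placeholder for the game's render work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)
```

The point of an even cap below max refresh is consistent frame pacing, which is also what Gsync/Freesync smooth over when the rate varies.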

Owning a Titan X kind of sucks in this scenario:oops:: not many people are willing to pay a good price for it, and you need SLI to enjoy your new display.

The AOC Agon AG352UCG costs $899, is available on Amazon, uses the same panel, and also has Gsync.
The Omen 35 is vaporware on both Amazon and HP's site as we speak.:mad:
I am sure some people paid the full $1300 for it:barefoot:. I am also sure that they would have been better off getting 1080 Ti SLI and a lower-cost 34x14:rolleyes:


I wish you good luck on your quest for a better display; personally I believe it is too soon to decide which monitor games better at 34x14. I hit F5 every 10 minutes on 5 monitor review sites looking for in-depth information on both panels.:nailbiting:
And I fear that the final answer will be: neither monitor is 100% adequate for gaming. :banghead:
 
This is not about liking Gsync or not.
It is about needing it at lower resolutions at such a price premium.
The CF791 costs less and has a better gamut.
Neither has a TFTCentral review yet.
Be ready to eat crow when they do.
Better to spend money on SLI or on a useless Gsync logo?

It's all moot, everything is a placeholder right now and I really, really like Gsync.....I am waiting on the 4K 144Hz with FALD....will probably get two of those and ditch my CF791, S2417DG and OLED C6 when those come out.

Also, the AOC Agon AG352UCG is vaporware....did you not see the "ships in 1-2 months" on Amazon lmao....that is like a lifetime away

Also, I have a second Titan XP for SLI. SLI is TRASH.....HOT GARBAGE. Even with FPS maxed and on a Gsync panel, my bionic eyes can still see the very, very minor microstutter. I HATE MULTI-GPU....HATE IT! Ended up putting the second Titan XP in my HTPC
 
I actually think you're completely off. A single GPU with gsync or freesync will present a more 'perfect' experience than dual GPUs without. Speaking from experience (as it relates to freesync anyway).

Thank you for sharing. I do not doubt that when fps drop, *sync matters.
I also agree that there are a number of situations where SLI will not improve things; in my example, 11 out of 25 games showed no improvement with SLI.
But we are talking about a $400 price premium over a CF791 (a panel with 125% sRGB coverage), twice the price of other 34" 34x14 100Hz VA displays. For that kind of money, I would seriously consider either going SLI or lowering IQ settings
 
It's all moot, everything is a placeholder right now and I really, really like Gsync.....I am waiting on the 4K 144Hz with FALD....will probably get two of those and ditch my CF791, S2417DG and OLED C6 when those come out.

Also, the AOC Agon AG352UCG is vaporware....did you not see the "ships in 1-2 months" on Amazon lmao....that is like a lifetime away

Also, I have a second Titan XP for SLI. SLI is TRASH.....HOT GARBAGE. Even with FPS maxed and on a Gsync panel, my bionic eyes can still see the very, very minor microstutter. I HATE MULTI-GPU....HATE IT! Ended up putting the second Titan XP in my HTPC

Do you experience microstutter with gsync?
I can't feel any microstutter with my dual Fury X cards in crossfire with my freesync monitors. None whatsoever. So either I'm not sensitive to it, or freesync takes care of it?
 
Do you experience microstutter with gsync?
I can't feel any microstutter with my dual Fury X cards in crossfire with my freesync monitors. None whatsoever. So either I'm not sensitive to it, or freesync takes care of it?

I'm sensitive to it and I hate multi-GPU...I really enjoy the CF791 and its 125% sRGB, and its curve...but the lack of Gsync kills gaming for me with it....screen tearing totally drives me nuts and Vsync lag drives me nuts....like Rain Man & toothpicks-on-the-ground kinda nuts!!!
 
Pot, meet Kettle. Kettle, meet Pot.

Use a Fury X to drive your CF791 :cool:

I have a Titan XP...... AMD cards are gross!
Why replace the goodness of my Titan XP when I could simply go with the Omen X35?

Honestly, I am maddest at Nvidia....freakin' support Freesync already, Nholes!
 
I am trying real [H]ard to understand why this panel is causing so much commotion [H]ere:

- 100Hz 34x14 is two-year-old tech.
- The CF791 has the same resolution, refresh rate and contrast, but a higher color gamut.
- Neither has an LMB mode.
- The 1080 Ti made Gsync irrelevant at this resolution/refresh rate, as you will be above 100fps more often than not.

If by capped you mean playing every F game with every F image setting maxed at 100fps, that is true.
If by capped you mean that you will frequently drop below 60Hz at sane image settings, that is BS.
Gsync is useless at a resolution as low as 3440x1440. You will never be below 60Hz with an adequate GPU, and the gaming experience will be better with image quality settings that keep you above 100fps all the time.
Better to SLI than to waste money on Gsync.
Your faith in the green side is misplaced.

Side note: CF791 is VA.

Those are pretty blanket statements that suggest you don't know what you are talking about when it comes to SLI, ultrawides, and Gsync.

You start by saying 3440x1440 @ 100Hz is 2-year-old tech...and? Even now, upper-tier cards have a problem running that res @ 100Hz. So you must really look down upon those playing 1080p at 144Hz, right? Old-tech-playing fools.

Next you said the 1080 Ti made Gsync "irrelevant" because you will be "above 100fps more often than not." The BabelTechReviews site you then mention in your next post completely contradicts this statement. Even the 1080 Ti cannot maintain above 100FPS at ultrawide 1440p. Only 4 games crossed 100fps, and their min frames still dropped below 100fps. This is what will give you tearing and why Gsync kicks in. That is also why you see 144Hz Gsync 1080p screens. Gsync shines when you get a sudden drop in FPS, which occurs in all games. 1080 Ti SLI, again from the review you referenced, cannot stay above 100 FPS at all times. This is mainly due to poor SLI support.

I was glad to see the second post I quoted of yours corrected your prior statement about the 1080TI and 100 FPS.

Saying Gsync is useless at ultrawide res is asinine. Even the 1080 Ti will have minimum frames under 60FPS. The review you posted clearly shows this. You won't get 100FPS in all new titles unless you drop the IQ to silly levels. In fact, in several games you will have issues keeping 60.

Point is, you are wrong that a 1080 Ti will keep you at 100 at all times. It won't. You will get frame drops that will introduce tearing. Trying to drop the IQ to maintain 100FPS will be hard due to many of the new AAA titles having substantial min-frame drops. This again will cause tearing, and depending on how bad and how often the drops are, Vsync won't help. I had tearing/stuttering on a regular 60Hz 1440p screen with GTA 5 even with Vsync. This is where Gsync steps up.

Even 1080 Ti SLI won't keep you at 100FPS 100% of the time. In some cases, as your referenced review showed, SLI performs worse than a single card.

I just pulled my second 1080 for this reason. The games I am currently playing did not see an increase and in some cases, like Wildlands, my min drops were lower than a single card. GTA 5 was the same.

The amount you have to lower the IQ on several newer games to hold a constant 100FPS is ridiculous and IMO hurts the gaming experience. I know this from testing my X34's 100Hz mode, lowering several games' IQ until I could maintain 100FPS at all times with the 1080. I would rather run high or ultra settings at 75-80 with mins in the 50's and have the game look enjoyable and be very playable thanks to Gsync.

If you want competitive-gaming FPS then you need to drop to 1080p at 144Hz. BTW, several of those top monitors run the Gsync module to prevent the lag associated with triple buffering and Vsync.

I would be amazed if a gamer invested $550-750 on a NV GPU and $950 on a CF791 to just crank all the settings down to your optimal 100fps 100% of the time. Just a waste of cash. For $50.00 more you can grab a X34 or PG34Q or $200-250 more the X35. Your last comment about trying to find 2 Fury X's ($1100-1200 for both if you can find them) so you can use freesync on the $950 CF791 and hopefully get 1080ti's performance has to be a joke.

You have to remember that Gsync-enabled ultrawides are halo products for people with a certain desire when it comes to gaming. Just like 4K and 1080p @ 144Hz.

It is clear you have never owned or used SLI or an ultrawide Gsync screen. You are actually doing a disservice to those wanting to know about these ultrawide screens by spreading your FUD. A simple "yeah, I don't get the X35, and I like the Sammy cause of the gamut" is all you really had to say. Not the nonsense you did say.

As an actual owner of and gamer on both SLI and a UW Gsync screen, I can easily say: SLI is nice, but not for the money. A Gsync-equipped screen is 1000% worth the money over SLI. If you can do both, then it is a great combo in certain situations with decent SLI profiles.

Until AMD becomes competitive FPS-wise at ultrawide res, freesync is useless IMO.

I went through 6 ultrawides in the last few months trying to find a new gaming monitor. I didn't understand Gsync either until I had hands on gaming on a PG348Q and X34. It was night and day compared to non-gsync. Both screens were late 2016 builds and neither had BLB or quality control issues. BLB is comparable to my other IPS screens.

The X35 is really nice; I used one at a recent HP demo. The tighter curve is not a tremendous change, and I like the stand way more than the Asus/Acer competitors'. The only issue I personally had with the HP was the VA panel. After IPS I cannot go back. I will only look at Gsync screens for gaming from here on out until AMD does something competitive.


My advice to you geok1ng is don't comment about tech unless you have actual experience with what you are commenting on.
 
This man speaks the Truth^^^^^

Gsync & Freesync are game changers IMHO.....they are the ONE thing modern displays have bested CRTs on, and I love the buttery smoothness of Gsync
 
Cr4ckm0nk3y

The same thing you are ranting about (badmouthing something you haven't tried) applies to you. You are negative on freesync and crossfire without having tried them. Many sites have compared gsync and freesync and said they are very comparable - almost undetectably different, apart from the greater FPS range for gsync.

So if you like gsync @ 75Hz (you said you do), why do you think you wouldn't like freesync at 75Hz?

I can tell you that a pair of crossfired Fury X's will run ultrawide easily into freesync range. At which point, as you know from your gsync experience, the actual frame rate doesn't matter. It feels silky smooth, and frame rate variance is not a distraction.

Furthermore, I use a pair of Fury X's in crossfire at 7680x1440 (eyefinity and freesync) and have little problem keeping games in freesync range on my Omen 32s. If I encounter a problem, I can simply lower a setting or drop AA and I'm back into freesync range. I've not had to drop out of crossfire mode a single time since last summer for poor game performance.

Since 7680x1440 works on my pair of Fury X's (at high or above on every game I've tried), ultrawide 3440x1440 will too, obviously.

With freesync or gsync you no longer have to have the fastest card/absolute highest FPS, you just have to get performance into the technology's sync range and you're in for a fantastic gaming experience.
 
I have a Titan XP...... AMD cards are gross!
Why replace the goodness of my Titan XP when I could simply go with the Omen X35?

Honestly, I am maddest at Nvidia....freakin' support Freesync already, Nholes!

That was a fine example of how to flame bait. :p

The BabelTechReviews site you then mention in you next post completely contradicts this statement

Games were reviewed at insane quality levels, and minimums were above 60fps quite often.

I would be amazed if a gamer invested $550-750 on a NV GPU and $950 on a CF791 to just crank all the settings down to your optimal 100fps 100% of the time. Just a waste of cash. For $50.00 more you can grab a X34 or PG34Q or $200-250 more the X35.

Or they could get an Agon 35 with gsync for the same cost as the CF791. l88 is a fine example of a Titan owner that invested in the freesync CF791. Does he amaze you?

It is clear you have never owned or used SLI or a ultrawide Gsync screen.

Owned the first, used the second. Loved both; considered neither a "prime gaming experience" - that echelon is reserved for LMB displays powered by GPUs at IQ settings that never drop below the refresh rate. High-resolution gaming is quite immersive, but cannot compare to low-motion-blur gaming.

Until AMD becomes competitive FPS-wise at ultrawide res, freesync is useless IMO.

I totally agree with you, but some who have actually used AMD GPUs for gaming may disagree.

My advice to you geok1ng is don't comment about tech unless you have actual experience with what you are commenting on.

as long as you are willing to follow your own advice...

I can't feel any microstutter with my dual Fury X cards in crossfire with my freesync monitors. Non whatsoever.

Based on l88's and Cr4ck's posts, I hereby declare your testimony biased and false, and advise you to never again speak anything good about AMD in a thread owned by Gsync NVIDIA users. :LOL:

sarcasm: OFF

Funny how someone can be bashed for calling SLI not such a good idea in 2017 and days later be bashed for suggesting SLI. Once again I wish good luck to all of us searching for a better monitor. Hope is not lost: NVIDIA filed a patent for using a strobing backlight on a variable-refresh-rate monitor.
 
I am happy with my Titan XP....not gonna dump it for an inferior Fury X crossfire setup (and Vega GPUs are looking weak IMHO)...I love myself too much to put myself through the hell that is AMD multi-GPU driver support. I ran 4 MSI 7970 Lightnings in Quadfire on a 5x1 portrait setup a few years ago, so I don't wanna hear any whiners say I am Nvidia biased.....cause Nvidia can suck it too for tryin' to hoard Gsync support and ignore the deafening cry of the market for Freesync support from their cards.

So no, I am not gonna drop my outstanding GPU just so I can play the CF791 with Freesync, because I am gonna sell this CF791 as soon as 4K 144Hz displays come out....the CF791 is nice, very nice with its 125% sRGB, delicious curve, and excellent contrast...however, it's not my end-all-be-all holy grail display.
 
Archaea

I ran Fury X's on my 2500 box shortly after launch. That is how I can attest to the min-frame drops from the high 90's to the 30's. I didn't say the Furys are crap.

Flipping the X's for my 980 Ti's + cash was totally worth it. Plus, dealing with 2 AIO coolers was a PITA. My experience with Fury is also why I am super critical of AMD and their marketing. Furys were great, but for the cash, and compared to 980 Ti's, NV had the better product.

But I don't think you got why I mentioned the X's in the first place. geok1ng recommended, whether with or without sarcasm, to grab a Fury X so l88bastard can use freesync. Having owned AMD cards in the past compared to current gen NV I would never recommend grabbing Fury's now. If you can find them new it will be 1080 pricing and used is still $350+. It is clear they are EOL since no major retailer is stocking them. That is why it was silly to recommend one.

And my comment on freesync stands. I didn't say FS is trash or not a good tech. I like the fact there is no premium added on for the tech and sammy ultrawide was one of the 6 I tried. The lack of Gsync was annoying and made the screens not worth the investment.

But as I said, "Until AMD becomes competitive FPS-wise at ultrawide res, freesync is useless IMO." AMD's current-gen cards are not competitive at ultrawide res compared to NV, unless you want to play at lower frames or drop IQ. You can attest more frames is still better, and it's nice to have Gsync/Freesync tech there to fix tearing/stutters/min-frame issues. Add to that, Gsync ultrawide prices have dropped. The X34 is $1000.00 at several retailers. Right now the best gaming experience with a sync module still goes to NV. Maybe/hopefully AMD brings it back with Vega.

Or they could get an Agon 35 with gsync for the same cost of the CF791. I88 is a fine example of a Titan owner that invested on the freesync CF791. Does he amazes you?
l88 even stated he doesn't like the fact it doesn't have Gsync and added it's a placeholder screen. It is the same reason I didn't keep the Sammy or the LG I played with. Gsync was worth the extra premium, especially compared, price-wise, to SLI. Now the X34 is within $50 of the CF791 and cheaper than the Omen.

I agree with you on the Agon. It seems like a great alternative to the high priced Gsync screens and it is about time.

geok1ng , my issues with your posts had more to do with coming into a Gsync screen forum, where people are obviously asking if the screen is any good, and adding nothing but nonsensical FUD about Gsync being DOA cause of SLI and Gsync not being needed. My replies to you highlighted where you were wrong.
 
geok1ng , my issues with your posts had more to do with coming into a Gsync screen forum, where people are obviously asking if the screen is any good, and adding nothing but nonsensical FUD about Gsync being DOA cause of SLI and Gsync not being needed. My replies to you highlighted where you were wrong.
I mean it's really to each his/her own.

Basically this; not sure why Geo has a vendetta against everyone who likes or is looking into a panel he or she doesn't like.
 
Adding some fuel to the discussion of how a single 1080 Ti behaves at 3440x1440:
[attached image: average-FPS benchmark chart for a single 1080 Ti at 3440x1440]
You do realize this is reinforcing the opinion of everyone that's been calling you out, right? Do you have any idea how Gsync functions? Why not post the next graph in the video, the minimums?

Dips below 100hz in every game (BF1 being the exception) according to that next graph. Do you know what gsync does when that happens? Prevents screen tearing/stuttering. Do you like screen tearing? No? Me neither. In fact, I've paid money for a monitor that prevents it!

Give it a rest already dude, go jerk off with the Samsung monitor.. this is an X35 thread and, personally, I'm more interested in seeing discussions/reviews of the X35 over the trolling. At least go learn what you're talking about before posting more bologna.
 
Just to clarify: I was flame baiting l88.:cool: Not even an insane person would get a Fury and leave a Titan to collect dust. :LOL:

I am middle age, live in my parents South Florida Basement and have a midget named Kenneth chained to my bed. I have Chihuahuas that eat a strict diet of Cheetos & synthetic apple treats from China. My hobbies include; long walks on the beach, body shaving, mirror squatting, and hiring freelance artists from around the world to make me XXX rated animations of myself modeled in 3D.
 
I read that this panel is only 6-bit + FRC. Makes it hard for me to be excited for a $1300 6-bit panel.
 
I read that this panel is only 6-bit + FRC. Makes it hard for me to be excited for a $1300 6-bit panel.
It's an 8-bit panel, same as the 35" AOC that's been reviewed here: https://pcmonitors.info/reviews/aoc-ag352ucg/. I read the only difference is that the HP panels were classified as borderless.

Mine showed up today. The giant black Omen box the UPS guy left on the deck was intimidating. First impressions are good (no peeling stickers, heh). The stand is a little large. No noticeable backlight bleed at all; happy in that regard. Games feel great, but I need to play a bit more this weekend. I'll report back w/ my thoughts and some pictures.
 
It's an 8-bit panel, same as the 35" AOC that's been reviewed here: https://pcmonitors.info/reviews/aoc-ag352ucg/. I read the only difference is that the HP panels were classified as borderless.

Mine showed up today. The giant black Omen box the UPS guy left on the deck was intimidating. First impressions are good (no peeling stickers, heh). The stand is a little large. No noticeable backlight bleed at all; happy in that regard. Games feel great, but I need to play a bit more this weekend. I'll report back w/ my thoughts and some pictures.

Congrats!!! Let us know how you like it!

The stands on these ultrawides are huge.

I immediately removed my stand on the X34 and grabbed one off amazon for $30 w/ shipping. I grabbed the Vivo single monitor stand (stand-v001C) and it has worked like a charm. I needed one with a grommet due to the sides of my desk.

I don't miss the huge stand at all.
 
TFTCentral review of the AOC version is up.

http://www.tftcentral.co.uk/reviews/aoc_agon_ag352ucg.htm#panel

Pretty much mirrors what I am seeing with the Omen on viewing angles, motion clarity, BLB, etc. Although mine doesn't appear to have as dark a top-left corner.

It is interesting that if you remove some minor black-to-grey transitions, TFTCentral came up with an average GTG of 6.6ms. That would actually make it faster than the IPS 21:9 monitors. I have also semi-confirmed this, as I am getting an average of about 8ms in the MPRT test. This is faster than the Samsung CF791 and the 21:9 IPS displays. Through extensive testing, I would only recommend the Level 3 (default) response time setting. It seems to have the best overall response across the most transitions.
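The effect of excluding those slow black-to-grey transitions on the average is easy to see with made-up numbers. The matrix below is invented purely for illustration; TFTCentral's actual measurements differ:

```python
# Hypothetical grey-to-grey response times in ms, keyed by
# (start level, end level). Black-to-grey transitions are the slow
# outliers on VA panels, which drags the plain average up.
gtg = {
    (0, 128): 18.0, (0, 255): 9.0,
    (128, 0): 5.0, (128, 255): 4.5,
    (255, 0): 6.0, (255, 128): 5.0,
}

def avg_gtg(matrix, exclude_from_black=False):
    # Average all transitions, optionally dropping those that start
    # from black (level 0), the way the review's trimmed figure does.
    vals = [ms for (src, _dst), ms in matrix.items()
            if not (exclude_from_black and src == 0)]
    return sum(vals) / len(vals)
```

With these invented values the plain average is ~7.9ms but drops to ~5.1ms once the black-start transitions are excluded, which is why a trimmed GTG figure can look faster than the headline one.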

The build quality on the Omen is definitely on the higher end. Use of anodized aluminum and other features such as a minimalist style thin bezel and an industrial rear housing greatly appeal to me.

My display has zero pixel issues, minimal back-light bleed (almost identical to TFT's picture), runs 100 Hz stable with G-Sync and zero physical or cosmetic flaws. I would rate it a 9 out of 10, just slightly ahead of the 21:9 IPS competition. If you have an AMD GPU, the CF791 would be the best buy.
 

I would point out that the input lag is very low on the Agon. With luck the Omen will have low lag as well.

Using pixel transition times gives an objective measure, but I do like that some review sites offer pursuit-camera images for comparison. Looking at sites where both the older 34" VA and this new 35" VA were analysed, the older 34" performed better in such images.
 
Not sure why anyone would want a 1440 monitor in 2017... Or spend a fortune for one
Different strokes for different folks. I upgraded from the Wasabi Mango UHD400 and I'm loving it so far. UHD monitors aren't there yet IMO if you're looking for low input lag and/or gsync/freesync - maybe Q3/4, if the Asus actually lands... and then it's only 27", which is a bummer.

For productivity, yeah UHD is nice.
 
Not sure why anyone would want a 1440 monitor in 2017... Or spend a fortune for one

1- because it is 3440x1440, not 2560x1440
2- because it is 34x14 at 100Hz
3- because Gsync
4- it is 35", not 34" - feel free to work out what % bigger that is.
5- HP costs $400 more than AOC for the same panel and specs. HP is sold out everywhere. Logically, HP is better. :rolleyes:
6- going up the price ladder, you have a $1700 38" 38x16 with LMB and $2000+ for OLED.

Sarcasm off:
27" 2560x1440 Gsync+LMB displays are better for most gaming scenarios, except those where a 55" OLED TV beats both the 27" and the 34".
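For the size comparison in point 4 above: at the same aspect ratio, screen area scales with the square of the diagonal, so the 35" vs 34" difference works out to roughly 6%:

```python
# 35" vs 34" diagonal at the same 21:9 aspect ratio: area grows
# with the square of the diagonal, so the gain is (35/34)^2 - 1.
ratio = (35 / 34) ** 2
print(f"{(ratio - 1) * 100:.1f}% more area")  # ~6% more area
```

A visible but hardly dramatic difference, which is the point being made sarcastically in the list.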
 
I would point out that the input lag is very low on the Agon. With luck the Omen will have low lag as well.

Using pixel transition times gives an objective measure, but I do like that some review sites offer pursuit-camera images for comparison. Looking at sites where both the older 34" VA and this new 35" VA were analysed, the older 34" performed better in such images.

All G-Sync displays have really low input lag, as the G-Sync chip is the image processor/transmission controller.
 
All G-Sync displays have really low input lag, as the G-Sync chip is the image processor/transmission controller.

Blur Busters said that gsync actually adds 2-3ms of lag, but this is more than compensated for by better image flow. One cannot assume beforehand that gsync will have more or less lag than Freesync or no-sync:
[attached image: input lag comparison chart]


The 3415W is no-sync, the 38UC99 is Freesync, and both are outliers.
The gsync Predator X34 is slower than the Freesync XR34CK on the same panel.
On the other hand, the gsync XB270HU is faster than the Freesync XG270HU.
The last pair is in a class of its own when it comes to gaming. EDIT: and they still cost an arm and a leg almost 2 years after launch!
 