Achieva Shimian QH270 | $400 IPS 2560x1440 Korean Monitor

eBay fees accumulate when you relist repeatedly, and the "amount sold" counters are important to these guys. Taking it off the market and then re-adding it is something they would like to avoid, so they simply change the price and make sane people go buy competitors' products for a while.

Well that makes more sense. Thanks
 
Not trash.

There are a limited number of rejected panels, guys. Before, the sellers could cherry-pick the best ones; now all that are left are really crappy ones. I've spoken to sellers who raised the prices themselves. They said it was to prevent sales, as they have literally no good panels left and are waiting on more - could take a month.

So not trash? Just "really crappy"? I guess I wouldn't distinguish the difference.

So it's a waste of time and money to even buy one of these now? Even if it's "pixel perfect"?

I want 1440p with a scaler and I don't want to spend the money for a 2713 considering all the bad feedback. It's starting to seem like I have literally zero options.
 
WOW. Just got my new Achieva Shimian 27" hooked up, and I am blown away at how amazing the image quality is. I haven't been this happy since the days of my old Sony CRT. The colors are amazing, the dot pitch is so tight that the pixels are hard to even distinguish, and the panel I received is simply great, with no dead pixels and only a slight amount of light bleed in the top left of the panel that is only noticeable when all the lights are out and you are staring at a pure black screen. Compared to my old Dell 2405 it is like night and day. I was concerned about blacks moving from a PVA to an IPS, but they are inky compared to my older Dell, and not having any image smearing in fast-moving scenes or while gaming is a joy. I am SOOOO happy. :) Also, picked up a new GTX 670 and an SSD drive for my machine, so these too should give a bit of extra life to ol' Compy. Best $350 I have spent, and thanks much to the folks at OTV Computers here in town for finding me a "pixel perfect" for no extra charge.
 
So not trash? Just "really crappy"? I guess I wouldn't distinguish the difference.

So it's a waste of time and money to even buy one of these now? Even if it's "pixel perfect"?

I want 1440p with a scaler and I don't want to spend the money for a 2713 considering all the bad feedback. It's starting to seem like I have literally zero options.

Okay.

For the month of September? Yes, they're trash.

More panel stock is expected in October.
 
So not trash? Just "really crappy"? I guess I wouldn't distinguish the difference.

So it's a waste of time and money to even buy one of these now? Even if it's "pixel perfect"?

I want 1440p with a scaler and I don't want to spend the money for a 2713 considering all the bad feedback. It's starting to seem like I have literally zero options.

Got a Microcenter nearby? Check out their 27" Auria.

Or, at www.OverlordComputer.com, grab a pixel-perfect ME edition.
 
That's so lame.

To be honest, I think the ship has kind of sailed. There are only so many rejected panels they can get hold of that still have decent quality.

I want a Samsung 970D, but I also want to VESA mount the fucking thing. Seems my hands are tied at an ACD.

//killsself
 
Thousands of purchases, and everybody who gets a bad egg runs to internet forums to complain/get return advice, while everyone who gets a good one keeps lurking away happily. I wouldn't call them trash at all, but the risk has always been there from the very beginning. And it is definitely a risk.

@zazzn, that doesn't necessarily mean anything. The connectors always have all the holes, even if they are incapable of using them, so that any cable will at least plug in without bending the pins. Check your spec sheet.

This is true, but not all connectors have all the holes, which really pisses me off when I get a DVI monitor that can't take my dual-link or DVI-I cable.
 
To be honest, I think the ship has kind of sailed. There are only so many rejected panels they can get hold of that still have decent quality.

I want a Samsung 970D, but I also want to VESA mount the fucking thing. Seems my hands are tied at an ACD.

//killsself

I wanted a U2711, but I got a lame panel and it was way too finicky with DP to even use. The 2713 looks great on paper, but apparently sucks. The HP 2740 isn't an option because there's no scaler. Ugh.
 
I had all kinds of trouble getting it to wake from sleep when connected via DP. I sometimes had trouble getting it linked with the machine to begin with. I'd have to power cycle the monitor and plug it back in before it'd work.
 
Just got mine... It's Fing amazing... I got the one with the tempered glass for 280 from bigcloth... Amazing service, got the monitor in like 2 days, which is insane considering I'm in SF, CA and I ordered Monday at like 3 am. Only problem is there is dust behind the glass... Have no dead pixels but plenty of dust spots... Any way to blow air in there to get rid of the dust? I know I saw the glass removal on a site before, but I don't want to remove the tempered glass. I love the look even if it reflects everything...
 
Sorry to be a pain, but wtf, where is the ACD profile for Windows? I can't find it anywhere in this thread. I know it's here, but it's 300 pages long!
 
Well I got my monitor from Green Sum today and there are some issues.

There are a few dead pixels. I haven't done a completely thorough search, but I found a few, and based on their positions (and if I find more) they might be enough to send it back... although they are so hard to find, since they are tiny, that if they were the only issue I wouldn't send it back.

The big problem is that the monitor makes noise based on what is on the screen. Usually it is faint but annoying (like the whine of a CRT TV when it is warming up), but I found a post on overclock.net that suggested trying this picture:
http://img238.imageshack.us/img238/3503/200006resolutionprimertab3.png
When I open that picture full screen, it makes a very loud noise, audible over fans/air-conditioner/music/etc.
I made a video where you can hear the sound:
http://www.youtube.com/watch?v=ka5JC8nKqx0&feature=youtu.be

What is my course of action here? In that same overclock.net thread, there was a post about opening up the screen and putting epoxy on one of the chips which can vibrate and make noise. I am not sure that it is the same chip in my case but I don't really want to open up a brand new monitor if I can return it. The power supply seems to be working fine (even though it doesn't say 110v) so I don't know that that is the issue.
I'm going to try harder to locate the dead pixels and then send that video to Green Sum and see what happens.

edit: to be clear, this noise is NOT coming from the adapter. The adapter is silent except for a 60hz hum that I can only hear if I press it hard against my ear. The sound is coming from behind the monitor.
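For anyone who wants to reproduce the test without hotlinking that image, here's a quick sketch that writes a similar 1-pixel checkerboard to a PPM file. This is just my guess at what that pattern does: alternating pixels maximize switching activity on the panel's driver board, which seems to be what excites the whine. Resolution and filename here are arbitrary.

```python
# Write a 1-pixel black/white checkerboard test image as a binary PPM (P6).
# Alternating adjacent pixels is the worst case for the panel's drive
# electronics, which is (presumably) why that test image provokes the noise.

def checkerboard_ppm(width, height):
    """Return a binary PPM (P6) image of a 1-pixel black/white checkerboard."""
    header = f"P6\n{width} {height}\n255\n".encode("ascii")
    white, black = b"\xff\xff\xff", b"\x00\x00\x00"
    # Rows alternate between two fixed patterns, so build each once.
    even_row = b"".join(white if x % 2 == 0 else black for x in range(width))
    odd_row = b"".join(black if x % 2 == 0 else white for x in range(width))
    body = b"".join(even_row if y % 2 == 0 else odd_row for y in range(height))
    return header + body

if __name__ == "__main__":
    with open("checkerboard.ppm", "wb") as f:
        f.write(checkerboard_ppm(2560, 1440))
```

Open the result full screen and listen; if the whine tracks the pattern, it's the panel electronics and not the brick.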

Update on this -- got my replacement screen from Greensum yesterday and brought it home this evening. The loud noise is not present (still maybe a tiny noise, but I can't hear it without my ear on the screen).

I found one dead pixel (looks like a hole in the pixel grid...can only see it on black from close up at exactly the right angle) but otherwise it looks clean and there is nothing that is visible from reading distance.

A little bit of backlight bleeding in the same place as the previous one (bottom edge on the left) but no big deal except on pure black.

Overall super excited to have it back (no extra charges for the return shipping or anything)

I'm re-thinking getting one of these. I'm not happy with this generation of 27 inch IPS displays, so I figure I might as well go as cheap as I can. The thing is, I need a scaler and multiple inputs. So that brings me to the 2720MDP. Does anyone have this specific display? How is it? Bleed? Tint? Defective pixels? Should I get another one?

Why do you *need* a scaler? I have an ATI card, and with hardware scaling switched on I have noticed zero difference compared to my old monitor. Games and stuff work just fine at funny resolutions (and I think the card is actually better at it than most monitor scalers are).
 
People are talking about backlight bleed and image retention. I don't want to pay $800 for backlight bleed and image retention.
Oh yeah, I remember reading about that a couple weeks ago... This is really too bad even the high end monitors are suffering from very poor qc and other issues :(
 
Why do you *need* a scaler? I have an ATI card, and with hardware scaling switched on I have noticed zero difference compared to my old monitor. Games and stuff work just fine at funny resolutions (and I think the card is actually better at it than most monitor scalers are).

You wouldn't for a computer. If you want to hook up anything else, though - a DVD or Blu-ray player, Wii, Xbox 360, PS3, etc. - you need one with more inputs and a scaler.
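Worth noting: 2560x1440 is exactly 2x 1280x720 in each dimension, so a scaler can map a console's 720p output with plain pixel doubling and no interpolation blur. A toy sketch of that nearest-neighbor doubling (tiny frame, not a real video path):

```python
# Toy nearest-neighbor 2x upscale: each source pixel becomes a 2x2 block.
# This is how 1280x720 maps losslessly onto a 2560x1440 panel.

def upscale_2x(frame):
    """frame: list of rows, each a list of pixel values. Returns the 2x frame."""
    out = []
    for row in frame:
        doubled = [p for p in row for _ in range(2)]  # duplicate each pixel horizontally
        out.append(doubled)
        out.append(list(doubled))  # duplicate the whole row vertically
    return out

src = [[1, 2],
       [3, 4]]
print(upscale_2x(src))
# → [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Real monitor scalers do fancier filtering, but for 720p sources on a 1440p panel the integer ratio means even the dumb version looks clean.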
 
I am pretty sure I received one of the models with a later "2D or 2E" PCB, as my colors were quite cool looking, even after trying out some of the color calibration profiles others had uploaded. I just picked myself up a Spyder 4 Pro calibration device, and colors look so much more natural now. Once again, I am so incredibly impressed with this monitor, and it is no doubt the best purchase I have made for my PC in some time. Here is my Spyder 4 Pro color profile if you are using one of the later 2D or 2E versions.

https://dl.dropbox.com/u/18394826/Achieva Shimian QH270 Spyder 4 Pro.icm
 
I am pretty sure I received one of the models with a later "2D or 2E" PCB, as my colors were quite cool looking, even after trying out some of the color calibration profiles others had uploaded. I just picked myself up a Spyder 4 Pro calibration device, and colors look so much more natural now. Once again, I am so incredibly impressed with this monitor, and it is no doubt the best purchase I have made for my PC in some time. Here is my Spyder 4 Pro color profile if you are using one of the later 2D or 2E versions.

https://dl.dropbox.com/u/18394826/Achieva Shimian QH270 Spyder 4 Pro.icm

Thanks. I'm currently using the .icm on my hackintosh and it really makes the skin tones look more natural. I like it.

I have the shimian with edge to edge glass.
 
So what's the consensus on ordering one of these at this point? I just found out about these today and I'm looking to get a good deal on a new 27-inch-plus monitor for movies and gaming. From what I can surmise from perusing the threads, these things are cherry-picked from a factory lineup of imperfect displays, re-branded, and sold on eBay; do I have that right? Could anyone suggest a model/seller and give me some info on input lag? Someone mentioned eye strain, and coupled with talk of the funky A/C adapter, it has me a bit worried to pull the trigger on one.
 
I think they are fine to order; some people get bad ones, but that has been happening since the start and is part of the risk. There are more options now than ever. This is a typical glossy display; eye strain / whatever is an individual problem.
 
192 pages on [H] is definitely a market trend. Prices on 1440p LED monitors have broken the US$300 barrier and are now mainstream.
The implications are tremendous for Intel, AMD and NVIDIA:
-The increased cost of DisplayPort means there will be sustained demand for DL-DVI capable motherboards. These monitors are cost-effective for stock-market business and need cheap PCs to drive them; here AMD will delay the withdrawal of DVI, and Intel will be forced to introduce DL-DVI capable mobos.
-Again, business users will want low-end graphics cards with 2 DL-DVI outputs, and I cannot name a graphics card without PCI-E auxiliary power that has 2 DL-DVI ports.
-Finally, gamers have a real chance to game at insane resolutions on a reasonable budget.
-If the monitor makers want to sell more of these, they should focus on lowering the cost of a DisplayPort interface, which may take a few quarters, and on making monitors with scalers for HDMI connection capable of bypassing the scaler when using DL-DVI for gaming, something that may hit the market soon enough.
I don't see anything wrong with up to five dead pixels: at this size it will be a non-issue for business and gamers, especially considering that a flawless panel from Apple costs 3 times as much. Come on guys: panels with 1-5 dead pixels shouldn't go to the trash; let them be used in stock-market and multi-monitor gaming setups.
Finally, a market trend for Microsoft to address: most triple-monitor setups use portrait mode, and ClearType does not work in portrait mode (at least last time I checked). MS must work around this limitation, which impacts the use of these monitors in business.
It is a new era, and now more than ever there is real pressure on graphics to improve performance per watt: the "usual" gaming resolution will increase 6x, and dual-card solutions will be a requirement.
 
192 pages on [H] is definitely a market trend. Prices on 1440p LED monitors have broken the US$300 barrier and are now mainstream.
The implications are tremendous for Intel, AMD and NVIDIA:
-The increased cost of DisplayPort means there will be sustained demand for DL-DVI capable motherboards. These monitors are cost-effective for stock-market business and need cheap PCs to drive them; here AMD will delay the withdrawal of DVI, and Intel will be forced to introduce DL-DVI capable mobos.
-Again, business users will want low-end graphics cards with 2 DL-DVI outputs, and I cannot name a graphics card without PCI-E auxiliary power that has 2 DL-DVI ports.
-Finally, gamers have a real chance to game at insane resolutions on a reasonable budget.
-If the monitor makers want to sell more of these, they should focus on lowering the cost of a DisplayPort interface, which may take a few quarters, and on making monitors with scalers for HDMI connection capable of bypassing the scaler when using DL-DVI for gaming, something that may hit the market soon enough.
I don't see anything wrong with up to five dead pixels: at this size it will be a non-issue for business and gamers, especially considering that a flawless panel from Apple costs 3 times as much. Come on guys: panels with 1-5 dead pixels shouldn't go to the trash; let them be used in stock-market and multi-monitor gaming setups.
Finally, a market trend for Microsoft to address: most triple-monitor setups use portrait mode, and ClearType does not work in portrait mode (at least last time I checked). MS must work around this limitation, which impacts the use of these monitors in business.
It is a new era, and now more than ever there is real pressure on graphics to improve performance per watt: the "usual" gaming resolution will increase 6x, and dual-card solutions will be a requirement.


Probably because there was no need. I doubt last-gen onboard GPUs could reasonably drive dual 1440p doing much of anything, and you don't need DL-DVI for anything until you go above 1920x1200.

Has anyone PLP'd their Shimian/Catleap with something that measured up well, pixel-size- and physical-size-wise?
 
I think they are fine to order; some people get bad ones, but that has been happening since the start and is part of the risk. There are more options now than ever. This is a typical glossy display; eye strain / whatever is an individual problem.

Any chance I could trouble you to link me to the best deal going at the moment? I'm really ignorant about what to look for specifically and can't really tell these hundreds of Catleap and Shimian listings apart.
 
Probably because there was no need. I doubt last-gen onboard GPUs could reasonably drive dual 1440p doing much of anything, and you don't need DL-DVI for anything until you go above 1920x1200.

Has anyone PLP'd their Shimian/Catleap with something that measured up well, pixel-size- and physical-size-wise?

Yet oddly, 2 x DL-DVI has been par for the course for a while with nVidia at least, and it has over and over won purchases for them from me, be it 120Hz displays or higher-resolution displays. Also, Eyefinity could push 6 x 30-inch displays 3 generations ago in the 5xxx-series Radeons.
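To put numbers on the 1920x1200 (19x12) cutoff mentioned above: single-link DVI tops out at a 165 MHz pixel clock. A back-of-the-envelope check, using rough approximations of CVT reduced-blanking figures (a 160-pixel horizontal blank and ~3% vertical blanking - not the exact spec formulas):

```python
# Rough pixel-clock estimate for a video mode, to see whether it fits
# single-link DVI. Blanking figures approximate CVT reduced blanking.

SINGLE_LINK_MAX_HZ = 165_000_000  # single-link DVI pixel clock ceiling

def approx_pixel_clock(hactive, vactive, refresh_hz):
    htotal = hactive + 160        # CVT-RB uses a fixed 160-pixel horizontal blank
    vtotal = int(vactive * 1.03)  # ~3% vertical blanking, rough approximation
    return htotal * vtotal * refresh_hz

for w, h in [(1920, 1200), (2560, 1440)]:
    clk = approx_pixel_clock(w, h, 60)
    link = "single link ok" if clk <= SINGLE_LINK_MAX_HZ else "needs dual link"
    print(f"{w}x{h}@60: ~{clk / 1e6:.0f} MHz -> {link}")
```

1920x1200@60 lands around 154 MHz (just under the single-link ceiling), while 2560x1440@60 needs roughly 240 MHz, which is why these panels are dual-link only.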
 
Any chance I could trouble you to link me to the best deal going at the moment? I'm real ignorant about what to look for specifically and can't really tell these hundreds of catleap and shimian listings apart.

Do you want to be able to pivot the monitor? _ > |
Are you willing to pay extra for a metal or sturdier stand?
Do you need lots of inputs like DisplayPort or VGA (will you be hooking up a low-resolution console like the PS3 or Xbox)?
 
Do you want to be able to pivot the monitor? _ > |
Are you willing to pay extra for a metal or sturdier stand?
Do you need lots of inputs like DisplayPort or VGA (will you be hooking up a low-resolution console like the PS3 or Xbox)?

I wouldn't mind having the option of using a 360 on it, but it's not a priority. As far as the stand goes, I could always just buy a better one later on.
 
Yet oddly, 2 x DL-DVI has been par for the course for a while with nVidia at least, and it has over and over won purchases for them from me, be it 120Hz displays or higher-resolution displays. Also, Eyefinity could push 6 x 30-inch displays 3 generations ago in the 5xxx-series Radeons.

yeah AMD moved to mDP
 
yeah AMD moved to mDP

mDP would be a better product IF the cards came with free mDP to DL-DVI adapters.:D
When 2560x1600 was truly high end there was no need for a low-cost solution.;)
Now AMD must address this market or lose even more sales to NVIDIA.
And NO, active mDP-DLDVI adapters are not cheap, costing around US$90, and they are not without problems, since Eyefinity setups using mDP adapters have all kinds of trouble, including but not limited to the monitor not waking up after going into sleep mode:eek:
These monitors may be the death of low-cost TN panels for gamers and are a groundbreaking event in multi-monitor gaming.
 
Naw, monitors just need to come with DisplayPort inputs. Dual-Link DVI is actually a crummy standard for a variety of reasons. However, you work with what you have. In the not-very-near future, Scribby is hoping to make DisplayPort-only 120Hz Overlords like these monitors. That would be ideal. Then you could actually do 5x2560x1440p@120Hz off of one convenient setup without adapters etc.
 
mDP would be a better product IF the cards came with free mDP to DL-DVI adapters.:D
When 2560x1600 was truly high end there was no need for a low-cost solution.;)
Now AMD must address this market or lose even more sales to NVIDIA.
And NO, active mDP-DLDVI adapters are not cheap, costing around US$90, and they are not without problems, since Eyefinity setups using mDP adapters have all kinds of trouble, including but not limited to the monitor not waking up after going into sleep mode:eek:
These monitors may be the death of low-cost TN panels for gamers and are a groundbreaking event in multi-monitor gaming.

I agree.

I'm trying to plan out my road map for multi-monitor gaming. I have my sights on getting a 670 right now and doubling or tripling them down the road when I get a 3rd monitor. I plan on the third monitor being one of these, as I want the ability to do KVM on one of the monitors, and VGA is what I currently have for a KVM interface. Also, having one with a scaler to hook up anything that doesn't scale properly would be a nice thing to have in a pinch. And I can cope with any added lag inherent in a display with a scaler.

I've been doing a lot of reading, though, and it appears that two 670s will not really be enough power to run three 1440p monitors at native res. I suppose I will have to upgrade my whole platform to run 3... oh well, I guess I'm just about due anyway... although it's hard to justify, as my 860 @ 4GHz runs anything without issue.
 
Naw, monitors just need to come with DisplayPort inputs. Dual-Link DVI is actually a crummy standard for a variety of reasons. However, you work with what you have. In the not-very-near future, Scribby is hoping to make DisplayPort-only 120Hz Overlords like these monitors. That would be ideal. Then you could actually do 5x2560x1440p@120Hz off of one convenient setup without adapters etc.

What are the variety of reasons DL-DVI is crummy? From my experience it just works, it never falls out, and it works with everything at just about any resolution, and can do it all with cheap, usually included adapters. Thus far no display standard has pulled that off. I can't think of anything I hated about DVI, but I can certainly do that for HDMI, and DP is nonexistent on just about anything I own; it seems DP is set to be replaced by Light Peak / Thunderbolt before it even gains any traction.
 
DVI Cons:

- Enormous plug. Larger than anything since old printer cables.

- 5 different implementations of DVI: single-link DVI-A, DVI-D and DVI-I, as well as dual-link DVI-D and DVI-I. This is immensely confusing to many people who only know "It's DVI", especially since there is substantial ambiguity in how the connectors are made. A port that LOOKS like dual-link DVI-I (aka all the holes) might only support single-link DVI-I. Or a dual-link DVI-I cable might not plug into a dual-link DVI-D port, due to missing holes.

- The connector is simply out of date. It has bendable pins, which obviously can ruin the entire cable, and it is difficult to plug in blind (eg. behind a PC which you can reach but not see). Plus you have to screw it in :/

- Uses 2 data channels. This restricts the amount of DVI-outs a graphics card can have. I don't know for Nvidia, but AMD 7000 series cards have a max of 6 data output channels, so you can't have 2 DL-DVI, 2 mDP, 1 HDMI, for example. However, you CAN have 6 Displayport/HDMI outs since they only take 1 channel each.

- No defined maximum bandwidth. As such, you DO NOT KNOW what your cable will support because there is no standard max. Also, this leads to interesting situations with drivers. Nvidia recently capped their dual-link DVI outputs to a 330MHz pixel clock via drivers. Before that, we hadn't found a max. However, Nvidia can do this because there is no "max" they have to meet. AMD cards are similar, although depending on the series there were (and are) different ways of working around this.

All that said, I'm well aware that DVI is pretty functional for most people. I just think it's an old, outdated standard that needs to be done away with. I'm sure I've overlooked a few things though.
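The channel-budget point above can be made concrete. Assuming the 6-channel figure from that post (I'm taking it at face value, not from a datasheet), with DL-DVI costing 2 channels versus 1 for DisplayPort/HDMI, a quick feasibility check looks like:

```python
# Sketch of the output-channel budget described above: DL-DVI takes 2 of the
# card's data channels, DisplayPort/HDMI take 1 each. The 6-channel total is
# the figure quoted for AMD 7000-series cards - assumed here, not verified.

CHANNEL_COST = {"dl-dvi": 2, "dp": 1, "hdmi": 1}

def fits_channel_budget(ports, budget=6):
    """ports: list of port-type strings. True if the mix fits the budget."""
    return sum(CHANNEL_COST[p] for p in ports) <= budget

# The mix the post says is impossible: 2 DL-DVI + 2 mDP + 1 HDMI = 7 channels.
print(fits_channel_budget(["dl-dvi", "dl-dvi", "dp", "dp", "hdmi"]))  # False
# But six single-channel DP/HDMI outs fit exactly.
print(fits_channel_budget(["dp"] * 6))  # True
```

This is why a card bristling with DVI ports scales worse than one with a row of mini-DP connectors, even if the silicon behind them is the same.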
 
DVI Cons:

- Enormous plug. Larger than anything since old printer cables.

- 5 different implementations of DVI: single-link DVI-A, DVI-D and DVI-I, as well as dual-link DVI-D and DVI-I. This is immensely confusing to many people who only know "It's DVI", especially since there is substantial ambiguity in how the connectors are made. A port that LOOKS like dual-link DVI-I (aka all the holes) might only support single-link DVI-I. Or a dual-link DVI-I cable might not plug into a dual-link DVI-D port, due to missing holes.

- The connector is simply out of date. It has bendable pins, which obviously can ruin the entire cable, and it is difficult to plug in blind (eg. behind a PC which you can reach but not see). Plus you have to screw it in :/

- Uses 2 data channels. This restricts the amount of DVI-outs a graphics card can have. I don't know for Nvidia, but AMD 7000 series cards have a max of 6 data output channels, so you can't have 2 DL-DVI, 2 mDP, 1 HDMI, for example. However, you CAN have 6 Displayport/HDMI outs since they only take 1 channel each.

- No defined maximum bandwidth. As such, you DO NOT KNOW what your cable will support because there is no standard max. Also, this leads to interesting situations with drivers. Nvidia recently capped their dual-link DVI outputs to a 330MHz pixel clock via drivers. Before that, we hadn't found a max. However, Nvidia can do this because there is no "max" they have to meet. AMD cards are similar, although depending on the series there were (and are) different ways of working around this.

All that said, I'm well aware that DVI is pretty functional for most people. I just think it's an old, outdated standard that needs to be done away with. I'm sure I've overlooked a few things though.

Here's a big one... There is no DVI consortium. It disbanded some time ago, so the standard will never again be updated. What we have in DVI now is all that DVI will ever be.
 
Here's a big one... There is no DVI consortium. It disbanded some time ago, so the standard will never again be updated. What we have in DVI now is all that DVI will ever be.

That too.

Also, with respect to Thunderbolt: enjoy paying a $250 starting price for a Thunderbolt mobo. And having Thunderbolt on the mobo isn't even helpful really, since I don't think GFX cards will output to displays via the motherboard. Besides, Thunderbolt really is just a glorified DisplayPort. The plugs are the same. It's a cool idea and all (external PCIe), but I fail to see how it is gonna replace DisplayPort. I mean, when used with a monitor it IS DisplayPort, period.
 
That's why I love it the [H] way!
A simple post about a cheap Korean monitor made from rejected LG panels becomes a debate about the cons of DVI.
My 2 cents on the matter:
-Costs are everything. If we want the cheapest possible working monitor, resorting to an old, outdated standard that works without problems is almost mandatory.
-If AMD cards have 6 data channels, they can manage 3 x DL-DVI outputs, or AMD should provide free mDP to DL-DVI active adapters. Or watch the users move to NVIDIA, where 2 US$110 GTX 650s can handle 4 of these DL-DVI monitors, a dream for stock-market users.
-DVI pins bend and break. So do HDMI ports, but with much less force than DVI. There are thousands of dead HDMI ports in TVs around the world, mostly thanks to electrical issues. Expect similar complaints once DP and Thunderbolt become popular.
-If all this talk results in cheaper DisplayPort monitors it will be better for us all. I just don't see that coming for the next 4 quarters.
 