Asus 39 inch 3840x2160 60Hz panel.

9 gigabit? I mean, the desktop alone takes up at least 200MB at 1440p, and what you have there is over 13x the resolution. 9 gigabit = 1 1/8 GB = 1152MB.
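For what it's worth, the arithmetic above checks out. A quick sketch (assuming the "over 13x" figure compares six 4K UHD panels against a single 1440p screen, per the "6x 4K UHD" comment later in the thread):

```python
# Sanity-check the numbers quoted above (taken from the post, not measured).
GIB = 1024**3

bits = 9 * GIB             # "9 gigabit" of memory, read as 9 Gib
mib = bits / 8 / 1024**2   # bits -> bytes -> MiB
print(mib)                 # 1152.0 MiB, i.e. 1 1/8 GiB

pixels_6x4k = 6 * 3840 * 2160    # six 4K UHD panels
pixels_1440p = 2560 * 1440       # one 1440p screen
print(pixels_6x4k / pixels_1440p)  # 13.5x the pixels of a single 1440p screen
```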
Obvious troll is obvious.
What are you talking about? I run all settings on low to get the maximum framerate in every game. High and Ultra game settings are so useless; all they do is eat up VRAM with textures that can't be distinguished from low by 99.9% of the population. Resolution is the key to great gaming, not useless settings.
 
I run my 4K UHD screen off my GTX 570 1.25GB, using 450MB with not a single thing loaded except Precision.
I refrain from AA and such myself, but to say 3GB or 6GB of VRAM is enough while keeping games at full resolution is a big no.

I mean 6x 4K UHD is massive
 
I've been looking at the ASUS video and it looks like a single 780 might suffice at medium settings.
 
Sweet can't wait for your opinion on it.

I've been jonesin' to replace the dual monitors (24" Samsung + 20" Dell) I use at work with one of the Seiki 39's so I may just pull the trigger on this one too...

I can add hardware to my work machine as I see fit (upgraded it to 8GB and an M4 120GB SSD already)

I have two remaining issues...

I have to figure out a way to power the Seiki. The crappy HP 6300 I use at work only has an i3-3220 powering it (no video card), and it lacks the room to install one of my spare 7970s (BTX form factor, and the video card slot is partially obstructed by the CPU heatsink).

Then I'd have to figure out how to keep it from being stolen, lol.
 
The Seiki came in today. I'll mess around with it a bit over the next few days.
You might be interested to try some motion tests on it using the TestUFO motion tests (a web-based PixPerAn) -- www.testufo.com.

For example, try the ghosting test, especially at 1920x1080 @ 120Hz.
Try the response-time test, too. Make sure to use Chrome, since it is currently the only release web browser that can reliably do 120fps @ 120Hz with VSYNC.
 
Man I am itching for Seiki or one of the Korean brands to get the 39" panel and DP 60hz input together for a night of passion. Sell it for ~$800 and rake in the sales.
 
Not true. He will run out of GPU horsepower, and 2GB isn't enough video memory now.

http://anandtech.com/show/7120/some-quick-gaming-numbers-at-4k-max-settings

There is about ZERO evidence of a VRAM limitation in that review's numbers. Titan loses to CF 7950 and SLI 680, and in both cases the 6GB of VRAM gave no benefit over the 3GB and 2GB setups.

4K has fewer pixels than 3x1440p, a resolution that we have known for many years rarely, if ever, needs more than 2GB of VRAM in any game. It's been almost two years since the 4GB 680 launched, and no review has ever shown any advantage of 4GB over 2GB at any resolution below 3x1440p.
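The pixel counts behind that comparison are easy to verify:

```python
# Pixel counts behind the claim that 4K is smaller than a 3x1440p surround setup.
uhd = 3840 * 2160           # single 4K UHD panel
surround = 3 * 2560 * 1440  # triple-1440p surround
print(uhd)             # 8,294,400 pixels
print(surround)        # 11,059,200 pixels
print(uhd / surround)  # 0.75 -- 4K is only 75% of 3x1440p
```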
 

I guess you didn't read that review well enough, then.
 

The Titan only loses to SLI 680 in Dirt 3. This quick benchmark review actually does show significant memory bottlenecking with 2GB, as both Sleeping Dogs and Metro 2033 show the Titan beating SLI 680 by a significant margin. However, it's notable that 3GB does seem to be more than enough, because the 3GB HD 7950 and the 6GB HD 7990 both perform very well; the 7990 even beats SLI Titans in Sleeping Dogs!

So, 2GB is borderline for 3840x2160, but 3GB seems to be more than enough.

That said, this review is tainted by its insistence on maxing out all settings even when that doesn't really make sense, given that they were using at least 4xSSAA, which means scenes are actually being rendered at 7680x4320. I can't imagine most people really want quad-SLI Titans just so they can use 4xSSAA in everything, and if you use a more sensible type of AA you will at least double those framerates. So 3840x2160 doesn't look all that insane to power as long as you use some common sense with your settings. SSAA was born from an absolutely excessive amount of GPU power being available at 1080p.
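The supersampling math above can be checked directly; 4x SSAA doubles each axis:

```python
# 4x supersampling renders the scene at twice the width and height,
# then downsamples -- so the GPU really shades a 7680x4320 frame.
target_w, target_h = 3840, 2160
ssaa = 4                    # 4x SSAA = 2x scale per axis
scale = int(ssaa ** 0.5)
render_w, render_h = target_w * scale, target_h * scale
print(render_w, render_h)                   # 7680 4320
print(render_w * render_h / (1920 * 1080))  # 16x the shading work of plain 1080p
```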
 
Anyone who uses 4xSSAA on a 4k monitor has absolutely no idea what they are doing.
 

I think 4k will finally let post-processing effects such as FXAA shine, as they'll have 4x the data to work with. Those are the benchmarks we should be seeing.
 
Ya, FXAA/SMAA work better at resolutions greater than 1080P.
 

I don't think AA will be an issue going forward, really. The last generation of drivers and game engines has proven that there are effective ways of dealing with aliasing, starting with the very rough early implementation of FXAA: techniques that reduce aliasing artifacts and crawl without 'blurring' textures or introducing other artifacts, all with a very minimal impact on overall performance, say <5%.

At this point, I'd expect them to build 'soft AA' into the engines and turn it on by default. I mean, why do we ever use less than 16x AF?
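As an illustration of the kind of cheap post-process AA being described, here is a toy sketch (not any shipping implementation) of the luma-contrast edge detection that FXAA-style filters start with. The luma weights are the standard Rec. 601 coefficients; the threshold value is an illustrative assumption:

```python
# Toy sketch of FXAA-style edge detection: flag pixels whose local luma
# contrast is high. Real FXAA then blends along the detected edge; this
# only shows the detection step, on hand-made pixel values.
def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b  # Rec. 601 luma weights

def is_edge(center, neighbors, threshold=0.1):
    # High min-to-max luma range in the neighborhood = likely aliased edge.
    lumas = [luma(p) for p in neighbors] + [luma(center)]
    return max(lumas) - min(lumas) > threshold

black, white = (0.0, 0.0, 0.0), (1.0, 1.0, 1.0)
print(is_edge(white, [black, white, white, white]))  # True: hard silhouette edge
print(is_edge(white, [white, white, white, white]))  # False: flat region, left alone
```

Only the flagged pixels get the (slightly blurring) filter applied, which is why the cost and the texture blur both stay low.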
 
Only on the Sharp/ASUS DP version, though, which is lame. Not fond of their implementation at all.

Of course every driver on linux supports this and has for just about forever.
 
nVidia users!!!!!!!!!!!!

Good news.
4K60Hz now possible in videogames.
Two-monitor spanning is now in the GeForce 326.19 beta for consumer cards.

http://www.anandtech.com/show/7153/nvidia-geforce-32619-beta-drivers-available

What are the consequences of using a "spanning" implementation for 4k@60 rather than a traditional method of display (i.e. like it is for 1080p@60)?

I don't really understand how the "spanning" thing works and what it means in terms of usage for games/movies etc.
 

We're just talking about using two of the graphics card's outputs to output one half of the video signal each, say left on one output and right on the other. 'Spanning' is needed for the drivers to be able to see the two as one display, when previously they could only see one or three (in the case of Nvidia). Now, they can render everything to the two outputs so that a single 4k display can be fully supported at full refresh rates for all features including games.
 

That makes sense. Except, isn't a single DP output from a GPU capable of delivering 4k@60Hz? So is spanning only needed for HDMI? I think most current GPUs have only a single DP and HDMI....
 
It acts like two screens. The new update makes it so it still does this, only you can no longer see that it does, nor do you have to set it up.
 

Yeah, I wasn't really sure about the 'why' part either. Making it smoother to operate is quite nice, I'd think; I'm looking forward to seeing a Seiki set up like this!
 
I just realized that a monitor of this size and resolution nearly perfectly matches the pixel pitch of a 27" 2560x1440 monitor; it should make for an enormous, if not quite perfect, PLP setup.
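A quick pixel-density check bears this out (panel sizes taken from the thread; `ppi` is a throwaway helper):

```python
# Pixel-pitch check for the PLP idea above: 39" 4K panel vs 27" 1440p panel.
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels along the diagonal divided by the diagonal length in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27), 1))  # ~108.8 PPI
print(round(ppi(3840, 2160, 39), 1))  # ~113.0 PPI -- close, but not an exact match
```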
 
Seriously all the 1080P screens in the vicinity looked like absolute dog shit in comparison in terms of sharpness, vibrance and color reproduction.

This is nonsense, as resolution has nothing to do with vibrance or color reproduction.
The five-year-old Kuro 5080HD at 720p still outperforms most 1080p LCDs in almost every way.
 
nVidia users!!!!!!!!!!!!

Good news.
4K60Hz now possible in videogames.

Kind of. If you have an ASUS PQ321Q and are driving it via DisplayPort MST then yes.
If you have an ASUS PQ321Q and are driving it via dual HDMI then no.
If you have a Sharp PN-K321 then no, it won't work regardless of DP or HDMI.
This is because NVIDIA did not add 2x1 Surround support, they just put in a driver hack for the ASUS PQ321Q specifically for DisplayPort.

So much for "Adds support for tiled 4K displays."

Of course every driver on linux supports this and has for just about forever.

True. Only Windows suffers from crippled drivers. Linux has had Surround 2x1, 3x1, 4x1 and 2x2 support for a long time.

That makes sense. Except, isn't a single DP output from a GPU capable of delivering 4k@60Hz?

Yes...

So is spanning only needed for HDMI? I think most current GPUs have only a single DP and HDMI....

No. Spanning is needed when using a single DP cable or two HDMI cables. Currently there are no timing controllers that support 4K@60p, so to drive the ASUS/Sharp at 4K@60p, two separate TCONs are used. Even though a single DisplayPort cable is used, there are actually two separate 1920x2160@60p streams going over the cable.
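Back-of-the-envelope rates for those two streams (raw 24-bit pixel data only, ignoring blanking intervals and link-coding overhead, so real link traffic is somewhat higher):

```python
# Rough pixel-data rates behind the MST/spanning explanation above.
def gbps(w, h, hz, bpp=24):
    # Pixels per second times bits per pixel, in gigabits per second.
    return w * h * hz * bpp / 1e9

tile = gbps(1920, 2160, 60)  # one of the two tiled streams
full = gbps(3840, 2160, 60)  # the whole 4K@60 frame
print(round(tile, 2))  # ~5.97 Gbps per tile
print(round(full, 2))  # ~11.94 Gbps total -- within DP 1.2's 17.28 Gbps payload
```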
 
This is non sense as resolution has nothing to do with vibrance and color reproduction.
The 5 year old Kuro 5080HD 720p still outperforms most LCD 1080p in almost every way.

Not vibrance or color reproduction exactly, but there are definitely more pixels available to show much more detailed color gradients. That detail should be readily apparent.
 

Greatly depends on what you use it for; for colour-critical work, gamut > number of pixels.
 

Sure; I'm just pointing out that more pixels can help with the perceived color quality of a screen.

For color critical work, I have my 30"; but there's something about a super-high-resolution panel that is very attractive.
 
Very informative post. Thanks.
 
Anandtech recently posted a review of the PQ321Q, here. Looks like some minor issues with color and response time, but the reviewer loved the screen.
 
I like how the Chrome address bar is unreadable.

In what way? It would be unreadable even on a 1080p monitor if it's a picture of the monitor downscaled to 678x452, like that screenshot is. In that case I don't really get the point.
 