sharp 32" 4k monitor - who else is planning on getting it?

that said, 30hz isn't bad.. it isn't like it is 30fps. yeah some tearing - but I never played with vsync on in the past.. so I've always seen stuff like that.


For gaming 30 Hz is pretty bad. I will admit it's not the same as limiting a game to 30 FPS, as it seemed to be a lot less laggy (input-wise), but I played quite a bit of FPSes back in the day on my vp2290b running at 33Hz and I did pretty badly. It's amazing how much things improved just going from 33 -> 41Hz.

Have you tried getting a higher refresh rate out of the monitor with a custom modeline/upping the pixelclock?
 
sadly it goes only to 31 (it really isn't that bad... just tearing when I move around quickly, but that doesn't affect how I play - like low fps would) :p but games are amazing looking @ 4k - I don't think I could go back to 2560x1600.... on the other side.. 1080p looks even more like ass now :)
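For reference, the custom-modeline ceiling is basically pixel-clock math: required clock ≈ (active + blanking) width x height x refresh. A rough sketch below, assuming CVT-reduced-blanking-style timings and a ~300 MHz single-link pixel clock (roughly HDMI 1.4 class) - these numbers are illustrative guesses, not the Sharp's actual timings; where the real cap lands depends on what the GPU and the monitor's EDID accept, but 60Hz needs roughly double what a single link can carry.

Code:
# Rough pixel-clock estimate for custom modes at 3840x2160.
# H_BLANK/V_BLANK are CVT-RB-style guesses; CLOCK_LIMIT_MHZ is an assumed
# single-link ceiling (roughly HDMI 1.4 class), not the Sharp's real spec.
H_ACTIVE, V_ACTIVE = 3840, 2160
H_BLANK, V_BLANK = 160, 60
CLOCK_LIMIT_MHZ = 300.0

for refresh in (30, 31, 33, 60):
    clock_mhz = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * refresh / 1e6
    status = "fits" if clock_mhz <= CLOCK_LIMIT_MHZ else "over the limit"
    print(f"{refresh:>2} Hz -> ~{clock_mhz:6.1f} MHz pixel clock ({status})")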


far cry 3 looks AMAZING (I just got around to playing it - I play a bunch o' games :) ) http://steamcommunity.com/id/zurv


what we need are some rts.. those shouldn't need a high hz :) oh... CoH 2 is almost out...
 
look at my sig.. I also spent over 3k on titans.. (also the monitor was closer to 4k.. but yes... still enough for your example) :)

I just moved from a bunch of 7970s (5 are sitting under my desk) but they SUCK!!

they don't have the power to run games @ 4k. if crossfire works (that is a big IF), it gets slower the more cards I add... 2 cards in crossfire is faster than 3 cards in crossfire (other than in bf3).

worthless crap.

i'd rather have 60-100fps+ @ 30hz than sub 40 @ 60hz

the fucked up thing is NVidia HAS it working, they just refused to put it on their Geforce line.

I am currently gaming on a titan and am in the process of doing a 3x1 2D lightboost setup like Vega....BUT I was running a 5x1 portrait setup with a much bigger resolution than your 4k, with quadfire 7970 lightnings, and they were pushing that big ass 5400x1920 res pretty damn sweet. Quadfire is finicky, but trifire 7970s will handle that 4k fine until Nvidia gets their head out of their ass. It will certainly be a better experience than 30hz refresh rates on the titans.



 
...BUT I was running a 5x1 portrait setup with a much bigger resolution than your 4k, with quadfire 7970 lightnings and they were pushing that big ass 5400x1920 res pretty damn sweet. ...

Eh, it's not that much more than 4K. Literally only 1920x1080 more.
 
I'm probably switching to 7970s. Can't stand 4k @ 30hz. So for people who have crossfire radeons, is the frame-rating issue from this article actually very visible? Did anyone even notice an issue before this article came out?
 
I'm probably switching to 7970s. Can't stand 4k @ 30hz. So for people who have crossfire radeons, is the frame-rating issue from this article actually very visible? Did anyone even notice an issue before this article came out?

Yes and no. I have an issue with v-sync. If I play with v-sync at 7680x1600, I get cut to 30 FPS regardless of whether I get more than 60 FPS without v-sync. Though without v-sync, everything is smooth. So... yeah... Idk. I could do some testing at 4K if you want (3x1280x2160). I've been meaning to do this for a while, but haven't had a real reason to.
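That 30 FPS cliff looks like the classic double-buffered v-sync quantization: with v-sync on, a frame that misses the 60Hz deadline waits for the next vblank, so the presented rate snaps to 60, 30, 20... FPS. (If it happens even when you would otherwise be above 60, something else - crossfire frame pacing, perhaps - is probably involved.) A minimal sketch of the quantization effect only, not AMD's actual swap logic:

Code:
import math

def effective_fps(raw_fps: float, refresh_hz: float = 60.0) -> float:
    """Toy double-buffered v-sync model: each frame is held until the next
    vblank, so the presented rate snaps to refresh_hz / n for integer n."""
    vblank = 1.0 / refresh_hz
    intervals = math.ceil((1.0 / raw_fps) / vblank)
    return refresh_hz / intervals

for fps in (75, 61, 59, 45, 31):
    print(f"{fps} FPS unsynced -> {effective_fps(fps):.0f} FPS with v-sync on")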
 
I'm probably switching to 7970s. Can't stand 4k @ 30hz. So for people who have crossfire radeons, is the frame-rating issue from this article actually very visible? Did anyone even notice an issue before this article came out?

I very much noticed it (I have 5 7970s that I'm not using and bought them @ $600 a pop when released)

other than in a few games.. crossfire doesn't work - I found myself shutting cards off down to only 2 because it was slower with more cards. ("slower" not counting ati's fake frame-rating numbers)


while a cool setup - playing COD isn't really pushing anything. a mouse running on a wheel can power that game. :)
I couldn't do it tho... those screens are still 1080p, and while there are a lot of them, it is still eye cancer.

see if you can borrow a friend's card if you are going ATI. personally, if you are going that way - get the 7990? (that is the dual card, right?)


for example, hitman (the game whose rage made me replace all my 7970s) was almost unplayable @ 2560x1600 with 3 cards in crossfire, but fine with 2.. and it wasn't the only game. I have a g19 keyboard with the LCD and it shows card usage. most of the time - even when the game was slow - the cards weren't used more than 50%. RAGE! (let's not even talk about the crazy non-stop speed changes that would break audio on my htpc's receiver.)
All better with NVidia SLI.
 
Eh, it's not that much more than 4K. Literally only 1920x1080 more.

You have to take into account the FPS/refresh rate. 4K screens are 60 Hz. These multi-monitor setups that bastard and I run are 120 Hz.

Pushing 5x1 at 120 Hz / 120 FPS is tremendously more intensive than a 4K 60Hz/60 FPS demand. Heck, even my 3x1 120 Hz setup is quite a bit more demanding than a 4K screen - 6 mil pixels running 120 FPS versus 8 mil pixels running 60 FPS.

That doubled FPS is also way more demanding on the other parts of your system, whereas 4K 60 FPS really just puts pressure on the GPUs.
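The raw pixel-throughput arithmetic backs that up. A quick comparison - the 3x1 layout is assumed here to be 3 x 1920x1080 landscape, per the ~6 million pixel figure above, and real load also depends on AA, settings and CPU work:

Code:
# Pixels per frame and pixels per second for the setups discussed above.
setups = {
    "5x1 portrait 5400x1920 @ 120Hz": (5400, 1920, 120),
    "3x1 5760x1080 @ 120Hz":          (5760, 1080, 120),
    "4K 3840x2160 @ 60Hz":            (3840, 2160, 60),
    "4K 3840x2160 @ 30Hz":            (3840, 2160, 30),
}

for name, (w, h, fps) in setups.items():
    mpx = w * h / 1e6
    print(f"{name:32s} {mpx:5.1f} Mpx/frame, {mpx * fps:7.0f} Mpx/s")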
 
omegaw666:

make sure your problem is not low fps. One card, no matter how powerful, isn't going to cut it for games at this rez (well.. some.. COD or skyrim might be able to). Far Cry 3 (the game I've been playing for the last few days) - I had to lower AA because I was dropping below 60fps. (all titans were fully used)
You only have one titan, right? What games are you playing? what fps are you getting?


60hz or not - low fps bloooooow

I made a very short fail video from my phone: yes, random babble (oh internet video)

http://youtu.be/xRUVN14hj20

but it shouldn't be a nightmare to play @ 30hz (clearly still far from ideal) - the issue is the tearing, not playing like shit. Omegaw66.. if you only have one titan, that isn't going to be fun.
ARGH! FU both ati and nvidia (and sharp) - ati has the eyefinity but you don't have the power because crossfire sucks balls (and is totally worthless after 2 cards - on most games*) and NVidia has the power, SLI works, but limited span. *sigh*.
...wtf sharp! you very much hid the fact that you'll be running 2 screens to get 60hz - so the biggest FU is yours.


Vega:
it is my understanding (which might be wrong) that refresh rate has no impact on performance
 
I would return that waste of 4 grand if I were you, zurv, and get my money back. Being limited to only one gpu brand sucks, and my eyes would bleed out of my head if I tried to game at 30hz
 
15% restock fee... *sigh*

the monitor is great for work :) I'm just hope'n another screen with a single input comes out this year.

I can deal with the tearing for a bit. The performance is still there.

I had saved a bunch of money to get an OLED TV this year, but they are too small. 55" my ass :p
so I still have room in the budget to get another screen.

I also might be able to mod the titans to get access to nview and mosaic - then I can span the "two" monitors and still have sli working
 
Don't be hasty. Nvidia are rumoured to be launching the 7xx series at Computex.
 
they aren't going to start supporting spanning two monitors (they could do it now if they wanted to)

I don't think I'll need the 7xx as I have 3 titans
 
link? they are adding mosaic and nview to the 7xx GeForce line?
 
Hey Guys,
I just registered on this forum because I saw you were the first English-speaking people to get a PN-K321, besides Reduser.
I would love it if you could run some tests with your display and tell me the results. I would like to know whether you could use the monitor for gaming. So I would really appreciate it if you could download the Valley Benchmark (http://unigine.com/products/valley/download/) or just use a game of your own and test it at UHD resolution. You mentioned the problem with 60Hz and the internal multi-display mode, but in windowed mode it should work, shouldn't it? And with an extra tool, you can use borderless windowed mode with every program that supports windowed mode (http://forums.steamgames.com/forums/....php?t=2675769). When you test your monitor, could you mention how it looks overall, whether you can see any tearing or ghosting, and whether you can feel any input lag?
Thank you in advance.


Greetings from Germany
Voigt
 
I can't really say for gaming because it can only play at 30hz.. which is total ass.
but desktop perf is great.

Are you running the desktop @ 30Hz as well, or are you running it at 60Hz as a virtual multi-monitor setup?

I ask because this just came out and it seems like a better deal than a 3 or 4 panel setup. It doesn't seem capable of handling multiple simultaneous inputs over HDMI (doesn't even have displayport), so it's strictly limited to 30Hz at 3840 x 2160. I don't do a lot of gaming, but I could really use that kind of screen real estate for work stuff. I've gotten a lot of punters claiming "30Hz is terrible and will suck for everything!", but nobody who has actually tried it for productivity purposes. My current panel doesn't even support 30Hz, so I'm hoping you could give me some feedback on whether or not this panel would be a workable option.
 
multi-display on one screen sucks. I don't need a stink'n extra task bar. "full screen" is only 1/2 the monitor too.

30hz on desktop doesn't matter. same for watching video as that will normally be 23 or 29fps.
the ass comes in when playing games.
 
the thought of getting a few 7990s makes me sick.. but maybe to get the 60hz..

then what the hell do I do with 3 titans? put them in my htpc connected to a 1080 screen :p or do I need to get 4k for that now too!

then what to do with a few months old evga 680 classifieds...


oh the dark hole i'm falling in...
 
the thought of getting a few 7990s makes me sick.. but maybe to get the 60hz..

then what the hell do I do with 3 titans? put them in my htpc connected to a 1080 screen :p or do I need to get 4k for that now too!

then what to do with a few months old evga 680 classifieds...


oh the dark hole i'm falling in...

[attached image: tumblr.jpg]
 
Seeing that nobody else is decent enough to jump in here and lend a helping hand...Zurv I will take one of those Titans off your hands and provide it with a loving home. Please no need to thank me, helping people in need is thanks enough!
 
lol, l88

I, too, will extend a helping hand, though my hands are bigger than l88's so I expect two cards plzkthx
 
30 hz works fine for internet and office applications.

I have connected my laptop HDMI 1.4 to Dell 3007wfp 2560x1600 DVI-D Single Link at 30 hz, and it works fine.

If you visit the Notebookreview forums, a lot of laptop users connect their HDMI to DVI-D single link 2560 monitors at 30 hz, since very few laptops come with Displayport or DVI-D outputs.

Fortunately for me, my laptop supports 4 total displays. I have the laptop screen at 1920x1080 at 60 Hz, Displayport output to a T221 3840x2400 DVI-D Single link at 31 Hz, Displayport to a Dell u3011 Displayport at 60 Hz, and HDMI 1.4 to a Dell 3007wfp 2560x1600 DVI-D Single link at 30 Hz. Everything works great for videos and work. I do not play games, so I do not really need a higher refresh rate. My laptop comes with two Displayport outputs and one HDMI.
 
I don't think anyone's disputing whether or not it WORKS, but if you're actually going to be doing work at 30hz then you're in for some sore eyes and/or headaches.
 
why would the refresh rate on an LCD give you sore eyes or headaches? this isn't a CRT.
are the 2-3 fps you get from desktop apps too laggy? :)
the problem is only with gaming.. not video, not work apps.
 
What does CRT vs LCD have to do with refresh rate?

Low refresh rate causes more eye strain, period.

EDIT: my bad, misread your post. I agree that for pure text a high refresh rate is unnecessary, but there are too many times when multimedia is necessary for work purposes.
 
on a crt you could see the flicker of the refresh rate...

and only if the multimedia is more than 30fps AND has a lot of movement - but I'm not sure about headaches; it would just be annoying tearing.

some more random 4k content... dishonored new dlc with me failing to remember how to play :p
http://www.youtube.com/watch?v=jDJRh0DcwWo

YouTube must down rez the "original" - any idea what they are dropping it to?
 
Tell me, is the unit two physical panels? Or have Sharp simply been truly stupid and wired one screen as two panels?
 
Then surely they should recombine it to one display before presenting it?
 
Tell me, is the unit two physical panels? Or have Sharp simply been truly stupid and wired one screen as two panels?

there are two - but you don't really notice it unless you set it up as 2 screens.

they didn't need to have windows see it as two (unless I didn't read the mst spec correctly).
it could be because of the hdmi "dual" mode option. I don't think they expect people to play games on it, BUT.. full screen.. video.. er.. well.. anything.. sucks when extending the desktop (vs span)
 
So let me get this straight, you cannot use this 4K panel at 60 Hz via one DP connection with a PC?
 
I have never had a fondness for Sharp products because of their unreliability, but.. what corner of the market was this an attempt to canvass?
 
So let me get this straight, you cannot use this 4K panel at 60 Hz via one DP connection with a PC?

DisplayPort 1.2 allows for up to 4 lanes of data in a single connector. The Sharp 4K splits the panel into two MST streams (one per half) to do 60Hz. The graphics driver could hide this from users, but unfortunately it doesn't. This two-stream setup is roughly analogous to dual-link DVI, although DL-DVI puts even pixels through one link and odd ones through the other.
 
I thought the issue was GPUs not wanting to run the display as a single screen?

Also, DP1.2 bandwidth is almost maxed out with a single 4K resolution at 60 Hz.
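Roughly, yes. A back-of-the-envelope check, assuming 8 bits per colour channel, CVT-reduced-blanking-style timings and an HBR2 x4 link with 8b/10b coding overhead - the numbers are illustrative, and 10-bit colour or bigger blanking pushes the stream closer to the ceiling:

Code:
# Rough DisplayPort 1.2 budget vs. a single 3840x2160 @ 60Hz stream.
LANES = 4
HBR2_GBPS_PER_LANE = 5.4                                   # raw line rate per lane
link_payload_gbps = LANES * HBR2_GBPS_PER_LANE * 8 / 10    # after 8b/10b coding

pixel_clock_hz = (3840 + 160) * (2160 + 60) * 60           # CVT-RB-style timing guess
stream_gbps = pixel_clock_hz * 24 / 1e9                    # 24 bits per pixel

print(f"DP 1.2 usable payload: ~{link_payload_gbps:.1f} Gbps")
print(f"4K60 @ 8 bpc stream:   ~{stream_gbps:.1f} Gbps "
      f"({stream_gbps / link_payload_gbps:.0%} of the link)")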
 