New Samsung 4k for everyone.

A lot of people just can't be happy, relaxed and content. Chaos is always swirling around them. I mostly attribute this to the people making a huge deal out of PWM, size and latency. Don't worry, there are just a few of you. Most of you, from what I've read, do not have an issue with any of those three things and have gone on to say you LOVE the display.

To prove my theory I asked a friend a few simple questions.

1) Is my screen too big? A) For me? Yes.

2) Do you like more than one monitor on your desk? A) Yes.

3) Do you like them side by side or on top of one another or a combination of both? A) Combination.

4) What size of monitors do you like? A) 24".

5) So 3 or 4 maybe? And a combination of side by side or on top of one another? A) Yes.

6) And you feel these would fit comfortably on your current desk? A) Yes.

Here is where I paused and then asked my final question.

7) So is it fair to say that if you had these monitors, arranged side by side and possibly on top of one another, they would fit on your desk and you would greatly enjoy them? A) Yes.

I then said: you see my monitor on my desk, the one you just said was too big? Guess what? The fucking monitor setup we just made in our imagination is just as big, if not larger, and we have the same desk!

Then: silence.

The point I am trying to make is that people, even when faced with something good, will automatically retreat into their comfort zone and start having issues with something they really don't have a problem with if it's presented a different way. I've never made sense of this. I see people all the time sabotage themselves out of something good because their brains are hardwired to go into default mode and start making something out of nothing, seeing the negatives instead of the positives.

Latency, PWM, Screen size, etc.

I can find 10x more positives (unbiased) than the few people here pointing out the few quirks with latency, screen size and PWM.

These new Samsung 4K displays are fantastic! Price, warranty, availability, picture quality, latency, immersive size, etc., etc.
 
When is your 7500 coming? Super curious to read your opinion on the A/B testing. You mentioned you're sensitive to PWM. Has it affected you?

Gets here Thursday according to tracking.
I'm mainly sensitive to flicker. No 60hz plasmas, CRTs etc.
While the PWM on the 6700 isn't a show stopper, I can definitely tell it's there. I can deal with the PWM, I think, but I cannot deal with both the PWM and the blur at the same time. An hour of BF4 in the pitch dark last night gave me a nice ol' headache for about an hour. Sometimes I'd get that with my old PVA Samsung 305T because of the blur (no PWM). My IPS Z30i has PWM, but per NCX's review of it, the frequency is so high it's imperceptible. And its pixel overdrive feature makes it basically the fastest, most blur-free non-TN panel available. HP classifies it as an enterprise productivity monitor, so fatigue is not an option.

Anyway, hope to see good results with the 7500. Not quite sure what I'll do if the difference from the 6700 is negligible. Send both back and try again in 2016? Ugh. You know 4K will mature very quickly; all the technology already exists, while it didn't years ago before 4K. Now it's just a matter of getting the manufacturers on board with it. I mean, this is the first real, quality, widely available 4K 4:4:4 option that has come out that I personally can take seriously. The best is yet to come for sure.

The point I am trying to make is that people, even when faced with something good, will automatically retreat into their comfort zone and start having issues with something they really don't have a problem with if it's presented a different way. I've never made sense of this. I see people all the time sabotage themselves out of something good because their brains are hardwired to go into default mode and start making something out of nothing, seeing the negatives instead of the positives.

Latency, PWM, Screen size, etc.

I can find 10x more positives (unbiased) than the few people here pointing out the few quirks with latency, screen size and PWM.

What an absolutely ignorant post.
 
My 40JU6500 just arrived. Haven't had a chance to play around with it or set up UHD color on HDMI 1, just enough time to make sure it starts up and has no damage/bad pixels.

That start-up video looks amazing! Going from the AG coating of the U3011 to this semi-gloss display makes it seem like a huge jump. (And then there's the size).
 
JU7100 vs JU7100 with Motion Interpolation vs JU7100 with LED Clear Motion Enabled
JU6500 vs JU6500 with Motion Interpolation
PWM Free 2014 Sony vs PWM Free 2014 Sony with Motion Interpolation

When MI is disabled, the Samsungs appear to have multiple stacked letters since they use PWM. PWM makes moving content, like the UFO in the TestUFO Ghosting Test, appear stuttery, or like it's moving side to side rapidly while also moving in a set direction. Rtings' tests are quite similar to the Blur Busters comparison I posted. The Crossover 44k (OCN thread) I mentioned earlier has been confirmed to be PWM free, support 4:4:4 and look much better than the new 4K Seikis, but it does drop frames when overclocked, as expected. It likely stretches content like the Philips too.
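For anyone curious why PWM produces those stacked copies, here's a rough back-of-envelope model; the PWM frequency, refresh rate and motion speed are illustrative assumptions, not measurements of these panels:

```python
# Rough sketch: with a PWM backlight, each pulse within a refresh "stamps" a
# separate copy of a moving object onto the retina while the eye tracks it.
# All numbers below are illustrative assumptions.

def pwm_ghost_copies(pwm_hz: float, refresh_hz: float) -> float:
    """Approximate number of distinct image copies visible per refresh."""
    return pwm_hz / refresh_hz

def copy_spacing_px(speed_px_per_s: float, pwm_hz: float) -> float:
    """Approximate spacing, in pixels, between successive copies."""
    return speed_px_per_s / pwm_hz

# A hypothetical 120 Hz PWM backlight on a 60 Hz panel, content moving 960 px/s:
print(pwm_ghost_copies(120, 60))   # 2.0 copies per refresh
print(copy_spacing_px(960, 120))   # 8.0 px between copies
```

The higher the PWM frequency relative to the refresh rate, the more copies there are and the closer together they sit, until they blend into ordinary blur; that's why very-high-frequency PWM reads as imperceptible.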

I can see a slight difference in the basic 6500 and 7100 images, but it's hardly definitive to me that one has better motion than the other without additional processing. I'd definitely be interested in hearing more personal accounts from people who have gamed on both.

Does enabling features like MI and LED Clear Motion introduce more input lag, and is it possible to do so in PC and Game modes? 60Hz might be more palatable with this LightBoost-like tech, as the image above with it enabled is the clearest of the bunch.

Can anyone comment on how these screens handle 1080p PC gaming?

I'm keeping an eye on the Crossover 44k thread as well, but I'm hesitant about that company given their blatant false advertising (it does NOT do 144Hz 1080p or 120Hz 1440p as promised, although there is a shot of 70Hz not dropping frames at some unknown rez) and the lack of warranty. I count myself lucky that my 39" Seiki has lasted over a year without issue, but I also know I'd be fine if it died tomorrow, given what I paid for it.
 
My 40JU6500 just arrived. Haven't had a chance to play around with it or set up UHD color on HDMI 1, just enough time to make sure it starts up and has no damage/bad pixels.

That start-up video looks amazing! Going from the AG coating of the U3011 to this semi-gloss display makes it seem like a huge jump. (And then there's the size).

Congrats! I came from Dell u3011's and I love my new Samsung. Good luck and hope you love it as much as a lot of us here do!
 
Several of us have confirmed that the 6ft and 10ft versions of this cable work just fine. Also - several of us have confirmed that the 15ft version does NOT work. It is a cable length/signal issue.

Earlier in the thread the Mediabridge 15' HDMI 2.0 cable was confirmed to work, but only on the 40" JU6700 and not the 48". I hope others can confirm 15' cables that work and which model they tested it on.
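As a rough illustration of why cable length matters here: the data rate at 4K60 sits near the top of what HDMI 2.0 can carry, so marginal cables fail over longer runs. A back-of-envelope calculation (active pixels only; real links add blanking and coding overhead on top):

```python
# Back-of-envelope active-pixel data rates for 4K60 over HDMI. Real links add
# blanking intervals and coding overhead, pushing the full signal toward
# HDMI 2.0's 18 Gbps ceiling; these figures are just the raw pixel payload.

def active_rate_gbps(width: int, height: int, fps: int, bits_per_px: int) -> float:
    return width * height * fps * bits_per_px / 1e9

full_444 = active_rate_gbps(3840, 2160, 60, 24)  # 8-bit RGB / 4:4:4
sub_420 = active_rate_gbps(3840, 2160, 60, 12)   # 4:2:0 halves the chroma data
print(round(full_444, 1))  # 11.9
print(round(sub_420, 1))   # 6.0
```

At these rates there is very little margin for attenuation, which is consistent with the 6ft and 10ft cables working while the 15ft version doesn't.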
 
hannibl. Read the reviews on it, especially the Amazon reviews.

Apparently it's a bit of misleading wordplay on Crossover's side. The Crossover can accept a 144Hz/120Hz signal but not actually display it.
 
Does enabling features like MI and LED Clear Motion introduce more input lag, and is it possible to do so in PC and Game modes? 60Hz might be more palatable with this LightBoost-like tech, as the image above with it enabled is the clearest of the bunch.

Can anyone comment on how these screens handle 1080p PC gaming?

I believe MI does increase input lag. With LED Clear Motion you will lose half your light output; it's a form of black frame insertion, which I believe is only available on the 7100 and up. For those who want a dimmer image, it can effectively cut light output in half without lowering the backlight setting. On the other hand, I cannot seem to enable it in PC mode, as it's part of the MI menu, which is disabled under PC mode. I'll have to play around with it some more to confirm.
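The brightness hit follows directly from LED Clear Motion being black frame insertion: the backlight is dark for part of each frame, so time-averaged output scales with the duty cycle. A tiny sketch, where the nit values and the 50% duty cycle are assumptions for illustration, not measurements of the 7100:

```python
# Why black frame insertion (BFI) dims the image: perceived brightness is the
# time average of the backlight, so it scales with the on-time duty cycle.
# Peak nits and duty cycles here are illustrative assumptions.

def effective_nits(peak_nits: float, duty_cycle: float) -> float:
    """Time-averaged luminance when the backlight is lit for duty_cycle of each frame."""
    return peak_nits * duty_cycle

print(effective_nits(300, 1.0))  # 300.0 (steady backlight)
print(effective_nits(300, 0.5))  # 150.0 (50% BFI: half the light, as described)
```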

1080P is great on this. It's close to native in clarity. This is scaling done right. Do not assume any 4K TV/monitor can do this just because it's a 4-to-1 pixel mapping, because it's not true at all. Go through the 4K monitor reviews on Amazon and you'll find that just about every single one fails at 1080P scaling. I play exclusively at 1080P as I wait to upgrade my GPU. However, do not game at 4K yet, because it will be hard to go back to 1080P after (you'll end up paying for twin Titan Xs to scratch that itch :)). 4K quality is extremely noticeable on a screen this big.
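The "4-to-1" point can be sketched in a few lines: 3840x2160 is exactly 2x 1920x1080 on each axis, so an ideal scaler can map every source pixel to a clean 2x2 block with no fractional filtering. (Real TV scalers apply their own processing; this just shows the ideal integer case.)

```python
import numpy as np

# Ideal 1080p -> 4K integer upscale: each source pixel becomes a 2x2 block,
# so no fractional filtering (and no softness) is required. Real TV scalers
# may still filter; this shows why a clean result is at least possible.

def integer_upscale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Repeat each pixel into a factor-by-factor block (nearest neighbor)."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame = np.array([[1, 2],
                  [3, 4]])
print(integer_upscale(frame))
# [[1 1 2 2]
#  [1 1 2 2]
#  [3 3 4 4]
#  [3 3 4 4]]
```

Non-integer ratios (e.g. 1440p on a 4K panel) can't do this, which is part of why 1080p is the comfortable fallback resolution on these sets.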
 
I believe MI does increase input lag. With LED Clear Motion you will lose half your light output; it's a form of black frame insertion, which I believe is only available on the 7100 and up. For those who want a dimmer image, it can effectively cut light output in half without lowering the backlight setting. On the other hand, I cannot seem to enable it in PC mode, as it's part of the MI menu, which is disabled under PC mode. I'll have to play around with it some more to confirm.

1080P is great on this. It's close to native in clarity. This is scaling done right. Do not assume any 4K TV/monitor can do this just because it's a 4-to-1 pixel mapping, because it's not true at all. Go through the 4K monitor reviews on Amazon and you'll find that just about every single one fails at 1080P scaling. I play exclusively at 1080P as I wait to upgrade my GPU. However, do not game at 4K yet, because it will be hard to go back to 1080P after (you'll end up paying for twin Titan Xs to scratch that itch :)). 4K quality is extremely noticeable on a screen this big.


So... I have the 48JU7500 coming in tomorrow, along with a Titan X. I seriously debated just gaming on an Xbox One on this TV versus a high-end PC.

In all seriousness, is 4K native gaming really that apparent over 1080p? I ask because a lot of other people in other forums like NeoGAF etc. state 4K isn't worth it and there isn't enough power in PC GPUs to push 4K.

Now I know that there is enough power in GPUs; the Titan X in SLI would definitely suffice in that department. But I can't help but think that most people saying 4K isn't worth it are those who haven't actually seen 4K native gaming, especially on larger monitors like these.

In other words, I would really love some detailed opinions on how you feel playing 4K native games on here...
 
hannibl. Read the reviews on it, especially the Amazon reviews.

Apparently it's a bit of misleading wordplay on Crossover's side. The Crossover can accept a 144Hz/120Hz signal but not actually display it.

That Crossover is garbage... HOT garbage. I feel sorry for anyone foolish enough to think they're somehow getting a $1000 panel for $600 with it. In reality that panel is actually a $300 unit being upsold at $600, with the quality and reliability of a Changhong or another vendor of that class. Fact.

Gets here Thursday according to tracking.
I'm mainly sensitive to flicker. No 60hz plasmas, CRTs etc.
While the PWM on the 6700 isn't a show stopper, I can definitely tell it's there. I can deal with the PWM, I think, but I cannot deal with both the PWM and the blur at the same time. An hour of BF4 in the pitch dark last night gave me a nice ol' headache for about an hour. Sometimes I'd get that with my old PVA Samsung 305T because of the blur (no PWM). My IPS Z30i has PWM, but per NCX's review of it, the frequency is so high it's imperceptible. And its pixel overdrive feature makes it basically the fastest, most blur-free non-TN panel available. HP classifies it as an enterprise productivity monitor, so fatigue is not an option.

You're doing great work Brahmzy, definitely understand where you're coming from. I've been fortunate to never suffer from eye-fatigue, headaches, or some of the other maladies that my friends often get with low-grade displays, but I definitely see others suffering.

Keep it up. But go easy on SixFootDuo; he wasn't talking to you specifically, and his point is true too. There are a lot of clowns out there: clowns like the one he described perfectly (hilariously, as well), and clowns like NCX, who will still be on the sidelines with some "elite and maximum everything" micro-small 24-inch display two years from now because they can't find something that checks all 100+ of their boxes. Feel sorry for them also; they've missed out on SO much fun.

Looking forward to the 7500 eval, thanks again.

So... I have the 48JU7500 coming in tomorrow, along with a Titan X. I seriously debated just gaming on an Xbox One on this TV versus a high-end PC.

In all seriousness, is 4K native gaming really that apparent over 1080p? I ask because a lot of other people in other forums like NeoGAF etc. state 4K isn't worth it and there isn't enough power in PC GPUs to push 4K.

Now I know that there is enough power in GPUs; the Titan X in SLI would definitely suffice in that department. But I can't help but think that most people saying 4K isn't worth it are those who haven't actually seen 4K native gaming, especially on larger monitors like these.

In other words, I would really love some detailed opinions on how you feel playing 4K native games on here...

I did the same; I had determined a few months ago that my XB1 would be the primary gaming machine while waiting for GPUs to catch up to 4K properly. But the Titan X changed that: it's the first DX12 single GPU that can effectively run most current 4K titles at frame rates actually higher than what the XB1 struggles to put out at 900p, especially if overclocked. And once it's not enough, it's not an issue: SLI with a second card.

To me, the 4k gaming experience absolutely is noticeable and dramatically better than at 1080p. Plenty of comments in the thread already on this issue from people playing Cities, BF4, and other games in 4k, and absolutely feeling it's more than worth it. And besides, you get your rig tomorrow, and will be able to see for yourself whether it's worth it or not.

Best of luck, and let us know how you like it.
 
If you guys want another very dramatic, beautiful, stunning and excellent 4K gaming experience, take my word for it... play The Vanishing of Ethan Carter.

The Vanishing of Ethan Carter is incredible at 4K but please bring at least an SLI setup to get good FPS.
 
In all seriousness, is 4K native gaming really that apparent over 1080p? I ask because a lot of other people in other forums like NeoGAF etc. state 4K isn't worth it and there isn't enough power in PC GPUs to push 4K.

I read Neogaf as well and you will quickly notice that it's a different crowd over there. Guys there will be saving tax refunds to purchase stuff. Those guys play games on 24" monitors. You cannot tell the difference between 1440P and 4K at that size. Additionally, the matte screens have awful clarity on the individual pixels. My old Dell, everything is just fuzzy and yellow... ugh. Once you see 4K on a 40" (or 48" yikes!) screen that's three feet in front of you, everything just looks clearer and smoother the more pixels/polygons are added. You will see jagged edges on 1080P, even with AA. 4K without AA is pretty damn nice. It's sad because 900P is the max for Xbox One and 1080P the max for PS4, and here we're talking how awful 1080P gaming is. The goalpost has moved, 4K is the new 1080P.

People here are sporting SLI and Tri SLI of Titan/980's, so there is definitely enough GPU power to push 4K. The question is, how much money ya got? :cool:
 
I read Neogaf as well and you will quickly notice that it's a different crowd over there. Guys there will be saving tax refunds to purchase stuff. Those guys play games on 24" monitors. You cannot tell the difference between 1440P and 4K at that size. Additionally, the matte screens have awful clarity on the individual pixels. My old Dell, everything is just fuzzy and yellow... ugh. Once you see 4K on a 40" (or 48" yikes!) screen that's three feet in front of you, everything just looks clearer and smoother the more pixels/polygons are added. You will see jagged edges on 1080P, even with AA. 4K without AA is pretty damn nice. It's sad because 900P is the max for Xbox One and 1080P the max for PS4, and here we're talking how awful 1080P gaming is. The goalpost has moved, 4K is the new 1080P.

People here are sporting SLI and Tri SLI of Titan/980's, so there is definitely enough GPU power to push 4K. The question is, how much money ya got? :cool:

Good stuff Cyph.
 
I read Neogaf as well and you will quickly notice that it's a different crowd over there. Guys there will be saving tax refunds to purchase stuff. Those guys play games on 24" monitors. You cannot tell the difference between 1440P and 4K at that size. Additionally, the matte screens have awful clarity on the individual pixels. My old Dell, everything is just fuzzy and yellow... ugh. Once you see 4K on a 40" (or 48" yikes!) screen that's three feet in front of you, everything just looks clearer and smoother the more pixels/polygons are added. You will see jagged edges on 1080P, even with AA. 4K without AA is pretty damn nice. It's sad because 900P is the max for Xbox One and 1080P the max for PS4, and here we're talking how awful 1080P gaming is. The goalpost has moved, 4K is the new 1080P.

People here are sporting SLI and Tri SLI of Titan/980's, so there is definitely enough GPU power to push 4K. The question is, how much money ya got? :cool:

Yup, totally agree! Just reading threads on there about 4K gaming made me think it wasn't worth it, but I have to consider who is telling me it's not worth it: most have probably never seen PC games at native 4K. I know I haven't, only 1440p at 27 inches. So 4K at 48 inches, 36 inches away from it, will probably be intense.

I haven't been this excited for gaming in a long time!!
 
Yup, totally agree! Just reading threads on there about 4K gaming made me think it wasn't worth it, but I have to consider who is telling me it's not worth it: most have probably never seen PC games at native 4K. I know I haven't, only 1440p at 27 inches. So 4K at 48 inches, 36 inches away from it, will probably be intense.

I haven't been this excited for gaming in a long time!!

Honestly where 4k actually doesn't make much sense is in the living room, unfortunately.

Unless your view distance is less than 7-8 feet, and your display is 75" or higher, it's difficult to really perceive a quality difference in the image. Not as easy as it was going from say 240i, or whatever broadcast television was in the 90s, to true 1080p the last time, which is dramatic and shocking even at longer view distances.

But once you start getting close, and the closer the better, 4K really pops and has that wow factor again. At Fry's, watching the feed demos on the Samsung 98" and LG 105" 4K sets from view distances as close as 4-5 feet was incredible. Now THAT is IMAX in the home, haha. But in most living rooms, you're not sitting under 7 feet away, you're not buying a 75"+ display, and you're truthfully not benefitting from 4K at all.
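The view-distance argument can be sanity-checked with simple trigonometry. The eye resolves roughly 1 arcminute; if a set's 1080p-sized pixels are already near that limit from your seat, quadrupling the pixel count buys little. A sketch, where the ~1 arcmin figure and plain 16:9 geometry are the only assumptions:

```python
import math

# Angular size of one pixel for a 16:9 display; the eye resolves roughly
# 1 arcminute, so pixels below ~1 arcmin are effectively invisible.

def pixel_arcmin(diag_in: float, horiz_px: int, dist_in: float) -> float:
    width_in = diag_in * 16 / math.hypot(16, 9)   # 16:9 width from diagonal
    pitch_in = width_in / horiz_px                # physical pixel pitch
    return math.degrees(2 * math.atan(pitch_in / (2 * dist_in))) * 60

# A 75" set showing 1080p-sized pixels, at 8 ft (96") vs 5 ft (60"):
print(round(pixel_arcmin(75, 1920, 96), 2))  # ~1.2 arcmin: barely resolvable
print(round(pixel_arcmin(75, 1920, 60), 2))  # ~1.95 arcmin: clearly resolvable
```

At 8 feet, 1080p pixels on a 75" set are already near the resolving limit, so 4K is hard to appreciate; move to 5 feet and they become obvious, which is where 4K starts to pop.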
 
Honestly where 4k actually doesn't make much sense is in the living room, unfortunately.

Unless your view distance is less than 7-8 feet, and your display is 70" or higher, it's difficult to really perceive a quality difference in the image. Not as easy as it was going from say 240i, or whatever broadcast television was in the 90s, to true 1080p the last time, which is dramatic and shocking even at longer view distances.

But once you start getting close, and the closer the better, 4K really pops and has that wow factor again. At Fry's, watching the feed demos on the Samsung 98" and LG 105" 4K sets from view distances as close as 3-4 feet was incredible. Now THAT is IMAX in the home, haha. But in most living rooms, you're not sitting under 7 feet away, you're not buying a 75"+ display, and you're truthfully not benefitting from 4K at all.

I imagine a 48" display at 36 inches away will be pretty intense, all things considered. I will post once I have it all running in a couple of days. For fun I will first try games in 1080p, then 4K, to see how much of a difference it really is.
 
I imagine a 48" display at 36 inches away will be pretty intense, all things considered. I will post once I have it all running in a couple of days. For fun I will first try games in 1080p, then 4K, to see how much of a difference it really is.

For sure, especially curved, which wound up being way cooler than I thought it would be.
 
So 4k at 48 inches at 36 inches away from it will probably be intense.

Oh, it will be, trust me. :cool:

coolhandm3 said:
I haven't been this excited for gaming in a long time!!

I know what you mean. Between getting my Samsung, and SLI 980s, and GTA V releasing yesterday, I am on Cloud 9. Stuff like this makes me giddy and reminds me why I am a PC gamer and put up with little frustrations from time to time.

After seeing how well a single 980 does at 4K for the past couple of weeks, I can say with absolute certainty that the "current GPUs can't push 4K" claim is complete and utter BS. If you see those posts, check the date, because that was undoubtedly the case at one point and I can see someone saying it a year and a half ago. But if people are still saying it now, they are likely just parroting old info or trying to justify their 1080p/1440p displays.
 
Oh, it will be, trust me. :cool:



I know what you mean. Between getting my Samsung, and SLI 980s, and GTA V releasing yesterday, I am on Cloud 9. Stuff like this makes me giddy and reminds me why I am a PC gamer and put up with little frustrations from time to time.

After seeing how well a single 980 does at 4K for the past couple of weeks, I can say with absolute certainty that the "current GPUs can't push 4K" claim is complete and utter BS. If you see those posts, check the date, because that was undoubtedly the case at one point and I can see someone saying it a year and a half ago. But if people are still saying it now, they are likely just parroting old info or trying to justify their 1080p/1440p displays.

Can't wait! I will be getting GTA V too. I haven't played a GTA since Vice City; I hope the music is just as good, as that made it for me with Vice City.

It will be interesting to see if my Titan X will be good enough for 4K. I do not plan to use AA, so I would think I should be good with one Titan X. If not, I can always get another, but I am uneasy about SLI given its frame-pacing issues and the extra input lag it causes. We shall see.
 
Just got my 40" curved in last night and still trying to work out all the kinks. Put it on UHD at 60FPS and I got faint, ghostly (?) white lines going down the screen every few seconds or so in some games. Took it to 4K non-UHD at 30FPS and it helped, but the lines were still there in some games, while in others it stopped. Turned off the ECO and energy saving stuff already. Updated the TV firmware/software also.
Using SLI 780 Tis atm.
Anything else I can try? The video cards worked fine at 4K in the games that didn't have the problem, so I don't think it's them. Elder Scrolls: Skyrim WAS amazing at 40"! :D
Thanks!
 
You're doing great work Brahmzy, definitely understand where you're coming from. I've been fortunate to never suffer from eye-fatigue, headaches, or some of the other maladies that my friends often get with low-grade displays, but I definitely see others suffering.

Keep it up. But go easy on SixFootDuo, he wasn't talking to you specifically, and his point is true too. There are a lot of clowns out there, clowns like the one he described perfectly (hilariously as well), and clowns like NCX also, who still will be on the sideline with some "elite and maximum everything" micro small 24-inch display two years from now because they can't find something that checks all 100+ of their boxes...feel sorry for them also, they've missed out on SO much fun.

Looking forward to the 7500 eval, thanks again.

Yeah, I kinda realized that a bit after I posted. :eek: And I have had multiple run-ins with said clown criticizing my inferior monitor choices. As someone in this thread said earlier to him: size matters. Plain and simple. You make some compromises, and that looks different for each of us depending on what our tolerances and priorities are; there's zero reason to bash somebody for their choice of display. Immersion and pixel real estate are at the top of my list. I've been waiting for 4K for a long time now.
 
I'm going to be returning my 48" JU7500 this weekend for one of the 40"ers. The 48" is just too big for my desk for productivity (sitting 2-3' away). I think a 40" would be much more comfortable; now I'm just trying to decide if I should go with the 6700 series or not. I've only used 3D a couple of times and actually ended up turning it off after a while because I felt it made playing Battlefield more difficult. I'm also not really convinced the curve adds enough extra immersion for the $ difference. Thoughts?
 
I'm going to be returning my 48" JU7500 this weekend for one of the 40"ers. The 48" is just too big for my desk for productivity (sitting 2-3' away). I think a 40" would be much more comfortable; now I'm just trying to decide if I should go with the 6700 series or not. I've only used 3D a couple of times and actually ended up turning it off after a while because I felt it made playing Battlefield more difficult. I'm also not really convinced the curve adds enough extra immersion for the $ difference. Thoughts?

I think the curve is good for the 48". For the 40" I wouldn't bother.
 
Just got my 40" curved in last night and still trying to work out all the kinks. Put it on UHD at 60FPS and I got faint, ghostly (?) white lines going down the screen every few seconds or so in some games. Took it to 4K non-UHD at 30FPS and it helped, but the lines were still there in some games, while in others it stopped. Turned off the ECO and energy saving stuff already. Updated the TV firmware/software also.
Using SLI 780 Tis atm.
Anything else I can try? The video cards worked fine at 4K in the games that didn't have the problem, so I don't think it's them. Elder Scrolls: Skyrim WAS amazing at 40"! :D
Thanks!

Try a different, shorter cable. Sounds like the cable can't handle the high bandwidth.
Edit: I don't think the 780s can do 4K/60 4:4:4 over HDMI.
 

Here's the deal: in a lot of "enthusiast" pursuits there's a lack of much technical competence, so anything which appears to fill that void is treated as some kind of gospel.

For example, relevant to this case: a CG image sequence (e.g. games, not videos) by its very nature lacks any persistence (i.e. blur in time), which is an unrealistic fault of rendering an instantaneous moment instead of an average. Let's repeat that: lack of motion blur is bad because it lacks the sense of continuity that the real world possesses in our eyes. The sequence of discrete, stepped frames is an inherent property of the source and of display technology which displays in frames.

Given this drawback, blur (i.e. persistence) introduced by the display is actually a good thing, because it simulates motion blur by happy coincidence of how the hardware happens to work.

Ironically, PWM can be bad because its sharp cutoffs on the light source display the original source for what it is: a bunch of disconnected frames with no sense of motion continuity between them. What's being called an "artifact" is a problem with the source itself.

These are all very basic image-display fundamentals, easily verifiable by anyone who'd bother: look at frames of motion shot on a camera vs. rendered. Then study why shutters are held open on video cameras longer than they need to be. Finally, look at backlight persistence: those with the sharpest cutoffs display the flawed source with the highest fidelity.
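A first-order way to quantify the persistence trade-off described above: for eye-tracked motion, perceived smear is roughly tracking speed times the time each frame stays lit. The speeds and persistence times below are illustrative assumptions:

```python
# First-order persistence model: when the eye tracks motion, each frame smears
# across the retina for as long as it stays lit, so
#   blur width = tracking speed * persistence time.
# The speeds and persistence times below are illustrative assumptions.

def blur_width_px(speed_px_per_s: float, persistence_ms: float) -> float:
    return speed_px_per_s * persistence_ms / 1000.0

print(blur_width_px(960, 16.7))  # full-persistence 60 Hz: ~16 px of smear
print(blur_width_px(960, 2.0))   # short strobe/PWM pulse: ~1.9 px, sharp but discrete
```

This is exactly the trade described in the post: long persistence smooths the discrete frames into continuity, while sharp cutoffs (PWM, strobing) reproduce each flawed frame faithfully.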
 
I read Neogaf as well and you will quickly notice that it's a different crowd over there. Guys there will be saving tax refunds to purchase stuff. Those guys play games on 24" monitors. You cannot tell the difference between 1440P and 4K at that size. Additionally, the matte screens have awful clarity on the individual pixels. My old Dell, everything is just fuzzy and yellow... ugh. Once you see 4K on a 40" (or 48" yikes!) screen that's three feet in front of you, everything just looks clearer and smoother the more pixels/polygons are added. You will see jagged edges on 1080P, even with AA. 4K without AA is pretty damn nice. It's sad because 900P is the max for Xbox One and 1080P the max for PS4, and here we're talking how awful 1080P gaming is. The goalpost has moved, 4K is the new 1080P.

People here are sporting SLI and Tri SLI of Titan/980's, so there is definitely enough GPU power to push 4K. The question is, how much money ya got? :cool:

The guys playing games on 24" monitors tend to sit close.

Pushing 4x more pixels to get AA and a bit of sharpening is an incredibly inefficient way of doing things.

The numerous image quality problems with CG games need to be solved in the engine, not by marketing numbers. This is trivially demonstrated by the fact that any decent film at 720p and 24fps looks far better than any game.
 
Honestly where 4k actually doesn't make much sense is in the living room, unfortunately.

Unless your view distance is less than 7-8 feet, and your display is 75" or higher, it's difficult to really perceive a quality difference in the image. Not as easy as it was going from say 240i, or whatever broadcast television was in the 90s, to true 1080p the last time, which is dramatic and shocking even at longer view distances.

But once you start getting close, and the closer the better, 4k really pops and has that wow factor again. At Frys, watching the feed demos on the Samsung 98" and LG 105" 4k sets from view distances of as close as 4-5 feet were incredible. Now THAT is IMAX in the home haha. But for most living rooms, you're not sitting under 7 feet away, you're not buying a 75"+ display, and you're not benefitting from 4k at all truthfully.

I don't disagree that bigger for 4k in the living room is where it is at. That being said, I saw quite a difference in detail level when I switched from my 65in Panasonic plasma to my current Toshiba 65in 4k. This is with the same content being played. The upconverter makes a difference. Now, it isn't the jump that standard def to 1080p was, but it is a richer experience.
 
Interesting from BlurBusters:
Severe ghosting — This GtG artifact often looks like this when you have little or no overdrive, and/or when the LCD panel is very cold (cold VA LCDs create much longer ghost trails than when warmed up; the common advice to warm up your display before testing applies to LCDs too).

I guess I never realized temperature had anything to do with it, which may explain different results I've seen. My "good day" I had a few days ago was after like 6 hours straight using the 6700. Last night I fired up an hour of BF4 cold. Who knows... And who knows how long the panel takes to warm up - could happen in just a few minutes.
 
The numerous image quality problems with CG games need to be solved in the engine, not by marketing numbers. This is trivially demonstrated by the fact that any decent film at 720p and 24fps looks far better than any game.

This comparison is bizarre. My Apple looks better than your orange.
 
I think the curve is good for the 48". For the 40" I wouldn't bother.

I took his post to mean "I'll be returning my 48JU7500 for one of the 40" models, but not sure if I should stick with the 7500 series or go with the 6700." I don't think he's necessarily debating whether to go flat or stick with curved.

But I agree with what you're saying; although I really like the curved 40JU6700 I think the flat 40JU6500 would be a great choice too. I just went curved because ever since owning my U3415W I do slightly prefer a curved display (for monitors, not necessarily TVs).
 
Just got my 40" curved in last night and I'm still trying to work out all the kinks. Put it in UHD mode at 60FPS and I got faint, ghostly white lines going down the screen every few seconds in some games. Dropping to 4k non-UHD at 30FPS helped, but the lines were still there in some games while in others it stopped them. Already turned off the ECO and energy saving stuff, and updated the TV firmware/software.
Using SLI 780 Ti's atm.
Anything else I can try? The video cards worked fine at 4k in the games that didn't have the problem, so I don't think it's them. Elder Scrolls: Skyrim WAS amazing at 40"! :D
Thanks!

The 780 Ti can do 4:2:0 at 4k/60Hz, which is fine. I am running my Titans at this and I'm happy with it for now until I upgrade.
 
I don't disagree that bigger is where it's at for 4k in the living room. That being said, I saw quite a difference in detail level when I switched from my 65" Panasonic plasma to my current 65" Toshiba 4k, with the same content being played. The upconverter makes a difference. It isn't the jump that standard def to 1080p was, but it is a richer experience.

The source tends to make a much bigger difference than anything on the display side, especially as displays improve only marginally. A sharp, saturated, meticulously shot photo is going to look a lot better on a mediocre 1080p display than a poorly shot photo on anything. Same goes for quality video sources, even at 480p.

Sometimes people tend to focus on, even fixate on, things that are perceivable but aren't necessarily consequential. More resolution is good for very fine detail, but only a limited amount of material outside of text/UI/etc. warrants it.
 
This comparison is bizarre. My Apple looks better than your orange.

The point is the problem has hardly anything to do with resolution.

Sure, if you tune out everything but whatever arbitrary definition of "blur", there may be molehills of differences; but the largest of those stem from the fact that 4k CG images without motion blur will inevitably look quite flawed.

There was some complaint above about "over-analysis". Well, analyzing the more consequential features is relatively important.
 
Interesting from BlurBusters:
Severe ghosting — This GtG artifact often looks like this when you have little or no overdrive, and/or when the LCD panel is very cold (cold VA LCDs create much longer ghost trails than when warmed up; the common advice to warm up your display before testing applies to LCDs too).

I guess I never realized temperature had anything to do with it, which may explain different results I've seen. My "good day" I had a few days ago was after like 6 hours straight using the 6700. Last night I fired up an hour of BF4 cold. Who knows... And who knows how long the panel takes to warm up - could happen in just a few minutes.

Funny you saw this today too; I literally just read that same piece a few seconds ago and wondered how "warm" some of our blurring examples were.

But either way, warm or not, you do see slight blurring in the rtings shots, though not as bad as it was just a few short years ago in terms of "gaming on a television".

Looking at them side by side (7100 / 6500), you can definitely spot the differences... but those differences are so slight that it's really going to be a personal decision, unfortunately. Wish it were night and day, but it's just not; it's another input-lag-style preference: some will be sensitive to it, others won't.
 
Just got my 40" curved in last night and I'm still trying to work out all the kinks. Put it in UHD mode at 60FPS and I got faint, ghostly white lines going down the screen every few seconds in some games. Dropping to 4k non-UHD at 30FPS helped, but the lines were still there in some games while in others it stopped them. Already turned off the ECO and energy saving stuff, and updated the TV firmware/software.
Using SLI 780 Ti's atm.
Anything else I can try? The video cards worked fine at 4k in the games that didn't have the problem, so I don't think it's them. Elder Scrolls: Skyrim WAS amazing at 40"! :D
Thanks!

The 780 Ti does not support 4:4:4 chroma at 4k/60Hz, so you are going to see chroma-subsampling artifacts. You need a 970/980 video card.
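The 4:2:0-vs-4:4:4 limit comes down to link bandwidth: the 780 Ti's HDMI 1.4 output tops out around 8.16 Gbps of effective payload (that exact figure is an assumption of this sketch), which can't carry 4k/60 at full chroma. A rough calculation, counting active pixels only and ignoring blanking intervals:

```python
# Color samples per pixel under each chroma subsampling scheme:
# 4:4:4 keeps full chroma, 4:2:0 averages chroma over 2x2 pixel blocks.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def data_rate_gbps(width, height, fps, bits_per_component, chroma):
    """Approximate video data rate in Gbps (active pixels only, no blanking)."""
    bpp = bits_per_component * SAMPLES_PER_PIXEL[chroma]
    return width * height * fps * bpp / 1e9

HDMI_1_4_GBPS = 8.16  # assumed effective HDMI 1.4 payload after 8b/10b coding

full = data_rate_gbps(3840, 2160, 60, 8, "4:4:4")  # ~11.9 Gbps: doesn't fit
sub  = data_rate_gbps(3840, 2160, 60, 8, "4:2:0")  # ~6.0 Gbps: fits
```

HDMI 2.0 on the 970/980 roughly doubles the usable link rate, which is why those cards can drive 4k/60 at full 4:4:4 while the 780 Ti has to fall back to 4:2:0.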
 
I'm going to be returning my 48" JU7500 this weekend for one of the 40" models. The 48" is just too big for my desk for productivity (sitting 2-3' away). I think a 40" would be much more comfortable; now I'm just trying to decide if I should go with the 6700 series or not. I've only used 3D a couple of times and actually ended up turning it off after a while because I felt it made playing Battlefield more difficult. I'm also not really convinced the curve adds enough extra immersion for the $ difference. Thoughts?

3D on the 7500 is fairly worthless for PC/Gaming. So if you are not going to watch 3D movies you definitely do not need it.

A flat 40" is a good choice, especially for non-gaming applications. I personally prefer curved, but at 40" flat should work.
 