New Samsung 4K for everyone.

Well, after a few days of thinking about it I decided to go ahead and order a 48JU7500. I still have a few weeks with my 48JU6700 before the return window ends, so I'll be among the few here who are able to directly compare the two.

I'm actually really happy with the 6700, but I started noticing some minor blur here and there after the 7500 owners called attention to it, and I'm wondering how much the 7500 can do to minimize it. It's really not a big deal, meaning I could keep the 6700 and be very satisfied with it, but considering the amount of time I spend using it and how long I will likely have this display, I figure the money will be well spent if the 7500 is as good as the others are saying. Before I owned either of these I hesitated to spend the extra $450 for what might not have been a noticeable difference, but Crutchfield's 60-day return policy encourages me to try it out for myself and see.
 
How I feel:

Extra $450 for 120Hz and G-SYNC? Yes
Extra $450 to reduce blur? No

Ha, I know. It almost frustrates me to order it because I'm perfectly content with my 6700. Something about doing it in the name of science I guess.

Trust me, if I don't feel that it's worth it, it's going back! But the three or so people who said it was a big difference have piqued my curiosity. And chances are, when the next great 40"+ TV comes out that's even better as a PC monitor than the Samsung (as hard as that is to imagine), I can always get it and relegate my current monitor to TV duty. And at least on paper, the 7500 does seem like it would be superior in that regard.

I just wish the difference was more like $200...
 
As I've already said, a 980 Ti for 4K is a non-starter. We already have games using more than 6GB of VRAM at 4K with 4xAA; it would be foolish to bet on cards limited to just 6GB being a workable formula moving forward.

The smart play...really the only play...is 390X 8GB or Titan X, both of which should be dual cards ideally. To ignore VRAM limits is to do so at your own peril.
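To put rough numbers on the VRAM argument, here's a back-of-envelope sketch (illustrative assumptions only, not measured data) of what just the render targets cost at 4K with 4x MSAA; real games pile gigabytes of textures, shadow maps, and G-buffers on top of this:

```python
# Back-of-envelope VRAM math for a 4K render target with 4x MSAA.
# Illustrative assumptions: RGBA8 color (4 bytes/pixel) and a 32-bit
# depth/stencil buffer, both multisampled.

WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4
MSAA_SAMPLES = 4

pixels = WIDTH * HEIGHT
color_mb = pixels * BYTES_PER_PIXEL * MSAA_SAMPLES / 1024**2
depth_mb = pixels * 4 * MSAA_SAMPLES / 1024**2

print(f"4x MSAA color target: {color_mb:.0f} MB")   # ~127 MB
print(f"4x MSAA depth target: {depth_mb:.0f} MB")   # ~127 MB
print(f"Framebuffers alone:   {color_mb + depth_mb:.0f} MB")
```

Those two targets alone are a quarter gigabyte before a single texture loads, which is why VRAM pressure climbs so fast at 4K with MSAA on.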
 
How I feel:

Extra $450 for 120Hz and G-SYNC? Yes
Extra $450 to reduce blur? No

Agreed. If I could grab 120Hz and G-Sync on top of it, that might be worth it... though so far there's no tearing at all in any game I'm playing. V-sync and adaptive seem to have taken care of it nicely.

So I look at it this way: I can put $450 towards another Titan X now! :D

The blur issue is overblown imo, playing on LCDs ten years ago...now THAT was blur! Looked like a bad acid trip haha. ;)

But this is why it's so nice to have choices: money can solve some of the blur issue for those who are overly sensitive to it, which is awesome.
 
Has anyone played with PIP? So far I only see that you can do TV (or DirecTV) in the PIP window. I was really hoping to be able to have HDMI2 (from my receiver with AppleTV/BluRay player/Amazon FireTV) playing in it.

The manual says:

"PIP cannot be used while Smart Hub or 3D is active"
"The PIP function is only available under the following conditions: 1) The main screen's source is a Component or HDMI connection. 2) The input resolution is less than FHD (Full HD)"
"The PIP window supports digital channels only and does not support UHD channels"

Hmm...
 
As I've already said, a 980 Ti for 4K is a non-starter. We already have games using more than 6GB of VRAM at 4K with 4xAA; it would be foolish to bet on cards limited to just 6GB being a workable formula moving forward.

The smart play...really the only play...is 390X 8GB or Titan X, both of which should be dual cards ideally. To ignore VRAM limits is to do so at your own peril.

I don't have a dog in that fight. I've seen a lot of people arguing lately about what cards provide an enjoyable experience at 4K, and all I know is this...I upgrade when I feel that my hardware no longer provides an acceptable experience (sometimes sooner if I run across a great deal). I'm not going to run out and buy two Titan X cards because on paper, some games use more than 4GB or dip below 60fps. All I know is that I just upgraded from one GTX 970 to two GTX 980s and what I currently have is an extremely enjoyable experience at 4K. I play with settings max or near max, with no or low AA (usually a maximum of 4x if I do use it). By the time this setup begins to choke, the Titan X will be much much cheaper and the 390X will be old news. And even at that time, people will be saying "The 390X can run 4K, but for an optimal experience you really need the L337 990XXT 24GB card."

I agree that if someone is on the cusp of upgrading NOW, they might as well wait to see what the 390X brings and weigh their options against the Titan X at that point. But if a single 980 is all someone can afford, they can still get by. Will it be the elite experience that Titan X SLI is on the newest games? No, but we shouldn't act like it's unplayable either. With new hardware coming out all the time, the current stuff is always going to look like a dead-end road.
 
The blur issue is overblown imo, playing on LCDs ten years ago...now THAT was blur! Looked like a bad acid trip haha. ;)

But this is why it's so nice to have choices: money can solve some of the blur issue for those who are overly sensitive to it, which is awesome.

Yup. I'm kind of stuck in the middle, because on one hand we have Brahmzy and Cyph and someone else IIRC saying that the difference is huge and worth the money. Then on the other hand we have those who think the issue is overblown. I happen to fall into the latter category, because I'll only see blur on certain objects within the game world, and even then it's not like it's a game-breaking issue. Like you said, compared to LCDs of olde, this is very impressive!

But to reference my previous post, as well as yours, I agree that it's great to have choices. I play a lot of older games, and my experience in current games is such that I know it will be viable for quite some time. I do not see the value in the Titan X at current prices. But, I'm not concerned with maintaining more than 60fps with max AA in all titles, either. Someone else will choose to pay that huge premium to do so and I might view it as a waste while that person might view the blur reduction on the 7500 as a waste. Doesn't mean either of us is in the right or in the wrong. We do with our money as we please and all that matters in the end is if we're satisfied with our choices - not what a few hundred random strangers think. :)
 
Yup. I'm kind of stuck in the middle, because on one hand we have Brahmzy and Cyph and someone else IIRC saying that the difference is huge and worth the money. Then on the other hand we have those who think the issue is overblown. I happen to fall into the latter category, because I'll only see blur on certain objects within the game world, and even then it's not like it's a game-breaking issue. Like you said, compared to LCDs of olde, this is very impressive!

But to reference my previous post, as well as yours, I agree that it's great to have choices. I play a lot of older games, and my experience in current games is such that I know it will be viable for quite some time. I do not see the value in the Titan X at current prices. But, I'm not concerned with maintaining more than 60fps with max AA in all titles, either. Someone else will choose to pay that huge premium to do so and I might view it as a waste while that person might view the blur reduction on the 7500 as a waste. Doesn't mean either of us is in the right or in the wrong. We do with our money as we please and all that matters in the end is if we're satisfied with our choices - not what a few hundred random strangers think. :)

Plus, some games actually build motion blur IN, on purpose, to simulate movement, which is crazy but whatever.

The blur fetish is a nit, and some want to pick that nit haha. I'm just glad/lucky I'm not one of them.

The Titan X is a value play, not a premium play, because I'm talking specifically about one resolution: 4k with 4xAA. And you're not getting anywhere near 60fps at that res/AA without at least one, and preferably two of them.

The 390X 8GB variant will MSRP at $749, so the card-price game has changed a bit. All top cards will be at that price or above moving forward, which is awful for consumers, but unfortunately expected. Sucks, but we're powerless to stop it. I remember when I thought $399 for the halo card was extortion; now I'm willing to pay more than double that... awful.

The 980 Ti is the non-value, high-premium play of the three. Worst VRAM, least future-proof, and the same $749 MSRP. Not smart. The Titan X is a few bucks more, which is nothing when you're talking about what we've already dumped on the game machine and the display itself; that's the core point.

For those on a budget, of course they'll have to suffer through with 970 SLI or something; we all have to make sacrifices to fit our budgets, unfortunately. But if you have the means, choosing a $1500 VRAM-limited 980 Ti SLI setup over a $2000 Titan X SLI setup is the worst mistake of the three.

390X 8GB, or Titan X, preferably in Crossfire/SLI. Only options for 4k @ 4xAA that make sense.
 
Well, after a few days of thinking about it I decided to go ahead and order a 48JU7500. I still have a few weeks with my 48JU6700 before the return window ends, so I'll be among the few here who are able to directly compare the two.

I'm actually really happy with the 6700, but I started noticing some minor blur here and there after the 7500 owners called attention to it, and I'm wondering how much the 7500 can do to minimize it. It's really not a big deal, meaning I could keep the 6700 and be very satisfied with it, but considering the amount of time I spend using it and how long I will likely have this display, I figure the money will be well spent if the 7500 is as good as the others are saying. Before I owned either of these I hesitated to spend the extra $450 for what might not have been a noticeable difference, but Crutchfield's 60-day return policy encourages me to try it out for myself and see.

Well, let us know what you think. Most seem to equate the extra $450 with blur reduction alone. However, the screen itself is in fact better IMO. Image quality is pretty high on my list when I purchase monitors/TVs, hence I usually purchase mid to top tier.

AA, 4K, blur, contrast, colors, etc. all contribute to image quality. Different individuals prioritize different aspects. Some purchase dual Titans for the AA, which I don't really care for. Different strokes for different folks.
 
I'm not overly sensitive to motion blur, but tearing and jagged edges bother me. So I want as much AA as I can get away with while still maintaining a good frame rate.
 
I'm not overly sensitive to motion blur, but tearing and jagged edges bother me. So I want as much AA as I can get away with while still maintaining a good frame rate.

This is my take also. Tearing in particular is 100x worse than any blur, not even close... can't stand it, and it factors into every GPU/display-related purchase decision for me.
 
Plus, some games actually build motion blur IN, on purpose, to simulate movement, which is crazy but whatever.

Sooo...if you play a game with built-in blur, on an LCD that blurs it even more, are you getting double blurred? Haha. There has to be a term for that! :p

But yeah, I know exactly what you mean. This is not that, though. Like if I make my character run past a static object (like a statue or a poster), I can sometimes see the trailing image just like you'd see on the UFO Test image that's frequently used.

Someone earlier posted comparison images between the 6xxx and the 7xxx, and the 7xxx moving image did seem much clearer. I didn't know how that would translate into real world gaming, but there was a definite difference looking at those pics that has been backed up by 7xxx series owners.

Mako360 said:
The blur fetish is a nit, and some want to pick that nit haha. I'm just glad/lucky I'm not one of them.

Definitely know where you're coming from. I have never felt that the blur has significantly detracted from my experience, and I could keep rolling as is without any major complaints, especially for the price. The 6500/6700 series is an outstanding value, there is no arguing that. Some of us are just always in search of the next best thing, or getting closer to perfection, cost be damned. Just like with video cards. I feel glad/lucky that I do not feel like I have to maintain 144fps with 8x AA at 4K in order to enjoy a game!

Mako360 said:
The 390X 8GB variant will MSRP at $749, so the card price game has changed a bit. All top cards will be at that price or above moving forward, which is awful for consumers...

+1, FTL

Mako360 said:
The 980 Ti is the non-value, high-premium play of the three. Worst VRAM, least future-proof, and the same $749 MSRP. Not smart. The Titan X is a few bucks more, which is nothing when you're talking about what we've already dumped on the game machine and the display itself; that's the core point.

For those on a budget, of course they'll have to suffer through with 970 SLI or something; we all have to make sacrifices to fit our budgets, unfortunately. But if you have the means, choosing a $1500 VRAM-limited 980 Ti SLI setup over a $2000 Titan X SLI setup is the worst mistake of the three.

This right here. Although I will say that while VRAM is especially important at high resolutions with high graphical settings applied, it may not be everything. The 8GB 290X, for example, showed little to no improvement over the fastest 3GB and 4GB cards it competed with. We'll have to see how the 6GB 980 Ti benches out, but yeah, I don't see it being a superior option at the moment.

As far as suffering through with lower-mem cards, I still think that the experience is not quite as gimped as it's made out to be. At least on current titles. If I may, I'd like to reference something that Dan D said in the GTA V thread, which is very relevant here:

"At 4K I can see the VRAM issue being more of a concern but the 980's are still great cards. Despite all this VRAM talk I had pretty good experiences with GeForce GTX 780 Ti's at 4K despite the 3GB of VRAM."

Like I said earlier, it's important for prospective buyers that we don't act as if anything less than a $1500 GPU setup is unplayable. But we should also inform them that it might not be the optimal experience moving forward. Then they can decide what they're comfortable spending.

Well, let us know what you think. Most seem to equate the extra $450 with blur reduction alone. However, the screen itself is in fact better IMO. Image quality is pretty high on my list when I purchase monitors/TVs, hence I usually purchase mid to top tier.

I had actually forgotten about some preferring the actual image quality (blur aside) on the 7xxx; there was some talk about that when you guys received yours.

Cyph said:
AA, 4K, blur, contrast, colors, etc. all contribute to image quality. Different individuals prioritize different aspects. Some purchase dual Titans for the AA, which I don't really care for. Different strokes for different folks.

Agree with you 1000%.
 
Got the new monitor in. Again, it seems a bit hard to find the optimal settings for the JS9000 series. Anyone else pick one up?
 
Plus, some games actually build motion blur IN, on purpose, to simulate movement, which is crazy but whatever.

IMO it's a bit crazier that people pay severalfold more for GPUs that at best deliver marginally better realism in sharpness or detail, yet opt to reduce realistic motion blur (i.e., increase animation jerkiness).

The game in business is to part consumers from their money, and the best way to do this is to provide justification in the guise of reasoning.

For example, low-frequency PWM can be bad, but "LightBoost" 60Hz PWM is good; just make sure to avoid the italicized part in the ad print. Given the current situation, I'm pretty sure there are plenty of consumers warning about the perils of PWM while praising the LightBoost they paid extra for.

If the natural state of things were that realism came for free and it took some tweaking of bits to get games to look like Quake1, some brilliant marketeer would find a way to push artistic cubism in gaming.
 
The game in business is to part consumers from their money, and the best way to do this is to provide justification in the guise of reasoning.

What a game it is... all of this nonsense is too expensive, but the GPU side's price inflation has been out of control over the past decade.
 
...yet opt to instead reduce realistic motion blur (ie. increase animation jerkiness)....

'Realistic motion blur'. Put the pipe down son!

There is NOTHING realistic about staring at a computer screen. Stop with the nonsense already. That is NOT how our eyes/brains process motion or the images we see.
 
I'm not buying the sky-is-falling mentality concerning VRAM and 6GB cards. I rocked my 2GB 670 FTWs @ 1600p with 2x and 4x AA in BF4 and many other games at a buttery-smooth 60FPS when, according to the VRAM-clan posters, I shouldn't have even been able to play them. I'm not buying it.
 
I have been watching this forum like a hawk for over a month. I have been stuck between using the 28" Acer 4K G-Sync (only because of G-Sync), then trying the Acer 32" 4K IPS (garbage latency), then onto the current BenQ BL3201PH which fit the bill pretty well except for IPS glow and lack of G-Sync. Size is amazing, and I love contrast... so after reading about half this thread, I ordered one of these TVs to try out the same day. Right now I am going to try the 6700 from Crutchfield. But I am right there wondering about the differences with the 7500 model. I'll compare it as best I can to my BL3201PH before it gets returned.

I asked a Samsung and Crutchfield rep about it. The Samsung rep said the "Ultra Clear Pro" and "Ultra Clear" panel difference is just that - a naming difference only. Apart from this, the 7500 appears to have slightly different dimming/brightness technology, but likely that is irrelevant to PC usage.

I have questions though. I've been all over the tail and front ends of this thread, and haven't really dove into the middle yet. Please try to help me out and not bash me if I'm asking something already said.

  • What is color going to be like? Close to IPS levels? Even AT IPS levels?
  • Does anyone have a true contrast ratio available, or better yet, minimum and maximum luminance levels?
  • When talking about firmware updates, this thread seems to lump the 6500, 6700, 7100, and 7500 together. Are they receiving the same-ish updates? The latest one with 4:2:2 gaming mode chroma seems like a big deal!
  • Is color shift noticeable? I HATE this about TN monitors, and worry about it with a VA panel (which this is, correct?)

Well, after a few days of thinking about it I decided to go ahead and order a 48JU7500. I still have a few weeks with my 48JU6700 before the return window ends, so I'll be among the few here who are able to directly compare the two.

I'm actually really happy with the 6700, but I started noticing some minor blur here and there after the 7500 owners called attention to it, and I'm wondering how much the 7500 can do to minimize it. It's really not a big deal, meaning I could keep the 6700 and be very satisfied with it, but considering the amount of time I spend using it and how long I will likely have this display, I figure the money will be well spent if the 7500 is as good as the others are saying. Before I owned either of these I hesitated to spend the extra $450 for what might not have been a noticeable difference, but Crutchfield's 60-day return policy encourages me to try it out for myself and see.

Please please do this. I am on the fence with this myself, but to me the 7500 seemed to differ only in 3D capability. But if color, motion, etc. are better, it will sit on my desk!

Well, let us know what you think. Most seem to equate the extra $450 with blur reduction alone. However, the screen itself is in fact better IMO. Image quality is pretty high on my list when I purchase monitors/TVs, hence I usually purchase mid to top tier.

AA, 4K, blur, contrast, colors, etc. all contribute to image quality. Different individuals prioritize different aspects. Some purchase dual Titans for the AA, which I don't really care for. Different strokes for different folks.

'Realistic motion blur'. Put the pipe down son!

There is NOTHING realistic about staring at a computer screen. Stop with the nonsense already. That is NOT how our eyes/brains process motion or the images we see.

Can either of you comment on why you believe the 7500 is superior to the 6700 as imyourzero mentioned? I'm trying to sift through this thread, but at 99 pages it is daunting... :(
 
I'm not buying the sky-is-falling mentality concerning VRAM and 6GB cards. I rocked my 2GB 670 FTWs @ 1600p with 2x and 4x AA in BF4 and many other games at a buttery-smooth 60FPS when, according to the VRAM-clan posters, I shouldn't have even been able to play them. I'm not buying it.

Buy it or not, the benchmarks tell the tale, period.

And the majority of them, performed by multiple sites (including this one) at 4K / 4xAA, show fps returns falling disproportionately as VRAM counts go lower. And that's in today's games, which are all 2+ year old engines. Tomorrow's engines will easily chew up 8GB, 10GB, and likely even max out the 12GB the Titan X has when opting for the optional high-res texture packs and max settings. And those optional graphics features are why we're doing this at all... right?

To risk $1500 instead of $2000 over what could be crippling a year from now, for just a $500 difference (and a much better card, btw), is crazy.

But aren't you one of the 7xxx people Brahmzy? So you'd blow $450 on that upgrade but would gamble crazily on limited VRAM killing the entire experience for the same amount? Doesn't make sense to me.
 
I'm not buying the sky-is-falling mentality concerning VRAM and 6GB cards. I rocked my 2GB 670 FTWs @ 1600p with 2x and 4x AA in BF4 and many other games at a buttery-smooth 60FPS when, according to the VRAM-clan posters, I shouldn't have even been able to play them. I'm not buying it.

I didn't listen either when they said 3 GB was enough for 1440p a year ago when I picked up 780s and 780 Tis. What a fool I was when I tried playing Evolve and it was a stuttery mess at 1440p, let alone 4K. 6 GB may be enough for some time to come (yes, games do cache more textures than needed when given the VRAM, but you can never have too much). In Shadow of Mordor, I can hit 7 GB easily. Watch_Dogs and GTA 5 are surprisingly only in the 5 GB range. Evolve likes to hit 6 GB.

I decided to play it safe and just grab a TITAN X (two actually). I don't regret my purchase... yet. Hopefully Pascal doesn't ruin that.
 
I have been watching this forum like a hawk for over a month. I have been stuck between using the 28" Acer 4K G-Sync (only because of G-Sync), then trying the Acer 32" 4K IPS (garbage latency), then onto the current BenQ BL3201PH which fit the bill pretty well except for IPS glow and blah blah blah blah blah...

Based on the details in your question, it's an easy answer: There are none.

It sounds like snark, but it's meant in a positive way. It's just that no one can answer this to your specific preferences; only you can, as you've had multiple bad experiences with what most of us would consider amazing, near-perfect displays.

So just buy a 7500, try it out and then take advantage of the very liberal return policy everyone offers if it isn't to your standards.

It's the only way to know, and who knows you might actually like it enough to deal with the compromises.

I decided to play it safe and just grab a TITAN X (two actually). I don't regret my purchase... yet. Hopefully Pascal doesn't ruin that.

It won't. Mainly because it's further away than Nvidia's roadmaps suggest (fab problems, Samsung now desperately being turned to as TSMC shits the bed on high volume sub-20nm production, etc).

Plus, for whatever reasons the Titans have the best resale value in the game. Shockingly to me even the original Titan still sells for a pretty penny second-hand these days whereas I would have thought they'd have zero value. Go figure.
 
Plus, for whatever reasons the Titans have the best resale value in the game. Shockingly to me even the original Titan still sells for a pretty penny second-hand these days whereas I would have thought they'd have zero value. Go figure.

That could be because the original Titan had great DP performance for compute, which you could only otherwise get on Nvidia's very expensive workstation cards. The same is not true for the Titan X, however, as it's purely a gaming card.
 
That could be because the original Titan had great DP performance for compute, which you could only otherwise get on Nvidia's very expensive workstation cards. The same is not true for the Titan X, however, as it's purely a gaming card.

That could be, but it's likely not the only reason or even the primary factor.

In this particular case however the Titan X being a DX12 card with 12GB will retain value extremely well as it won't be perceived as being hardware limited...ever. While 780 Ti, 970, and both the 980 and 980 Ti will all be perceived by the masses (right or wrong) as being VRAM limited, and will suffer on the secondary market as primary new cards all ship with 8GB+, just as the 390X is doing.

Either way, it's a $500 difference between Titan X SLI and 980 Ti SLI, and absolutely not worth gambling over. Hell we're not even sure that the 980 Ti will ship with full cores/clocks, which is another strike potentially.

If you can afford $3000 (48JU7500 + 980 Ti SLI) then you definitely can afford $3500 (48JU7500 + Titan X SLI) and it makes no sense to go with the inferior option.
 
Are these reps the guys you contacted via the phone or online? They are reading off the sheets and are giving you the wrong answer. These are the same guys who told an AVSForum member that the 7100 has a Nano crystal display (aka Quantum Dots). They clearly have no clue.

Ultra Clear Pro is glossy; Ultra Clear is semi-gloss. The names are made up, but the differences are real. The Pro is used on the 7100 all the way up to the JS9500. One reflects more if there's a light source behind it; the other reflects less direct light, yet reflects more ambient. When off, the glossy has a more mirrored reflection than the semi, but when the two are turned on and the screen is not black or a solid dark color, the Pro (glossy) reflects less ambient light. That means you see less reflection from your surroundings as long as there are no direct light sources (windows, lamps, etc.). The blacks are blacker, and colors have more punch on the glossy. It also allows more light through, which is why peak brightness is higher and contrast is slightly superior. It's probably the same panel with a different surface. As for motion blur, I have no idea how they achieve it, but it's probably not related to the surface.

I asked a Samsung and Crutchfield rep about it. The Samsung rep said the "Ultra Clear Pro" and "Ultra Clear" panel difference is just that - a naming difference only. Apart from this, the 7500 appears to have slightly different dimming/brightness technology, but likely that is irrelevant to PC usage.

  • What is color going to be like? Close to IPS levels? Even AT IPS levels?
  • Does anyone have a true contrast ratio available, or better yet, minimum and maximum luminance levels?
  • When talking about firmware updates, this thread seems to lump the 6500, 6700, 7100, and 7500 together. Are they receiving the same-ish updates? The latest one with 4:2:2 gaming mode chroma seems like a big deal!
  • Is color shift noticeable? I HATE this about TN monitors, and worry about it with a VA panel (which this is, correct?)

Please please do this. I am on the fence with this myself, but to me the 7500 seemed to differ only in 3D capability. But if color, motion, etc. are better, it will sit on my desk!

Can either of you comment on why you believe the 7500 is superior to the 6700 as imyourzero mentioned? I'm trying to sift through this thread, but at 99 pages it is daunting... :(

The rest are subjective. Sorry, but you may have a top-of-the-line pro IPS that will be clearly superior to a run-of-the-mill IPS. IPS panels come in all flavors, so it's impossible to say unless you see it yourself.

There are color shifts due to the huge size, but it's much better than TN. You will notice it in the corners with solid colors (screen all the same color, or a gradient); but if you have a game running, it's very difficult to see the shift. You trade slight color shift for minimal blooming, no IPS glow, much better contrast, etc. Don't fall into the trap of thinking 5000:1 contrast is twice as good as 3000:1; like all things, as the numbers increase, it's more difficult to make out the difference. My plasma has a 10k contrast ratio, and it's difficult to say that one is clearly more contrasty than the other. You should check out the RTINGS review for the numbers.
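The "twice the number isn't twice as good" point can be made concrete with a rough Weber-Fechner style comparison: perceived dynamic range tracks the logarithm of the contrast ratio, so it's fairer to compare ratios in "stops" (doublings) than linearly. A quick sketch, illustrative numbers only:

```python
import math

# Perceived dynamic range roughly follows the log of the contrast
# ratio (Weber-Fechner), so compare ratios in stops (doublings)
# rather than linearly. Numbers are illustrative.

def stops(contrast_ratio):
    return math.log2(contrast_ratio)

print(f"3000:1  -> {stops(3000):.1f} stops")
print(f"5000:1  -> {stops(5000):.1f} stops")
print(f"10000:1 -> {stops(10000):.1f} stops")
```

On that scale, 5000:1 buys you roughly three quarters of a stop over 3000:1, not a 67% improvement, which squares with how hard the difference is to see in practice.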

You're asking for another subjective opinion on superiority. The difference in price for one person is not much, whereas for another it's rent money. You should buy both and return the one you don't like.
 
If you can afford $3000 (48JU7500 + 980 Ti SLI) then you definitely can afford $3500 (48JU7500 + Titan X SLI) and it makes no sense to go with the inferior option.

Afford? Yes. Be able to explain to the wife why two video cards is better than one? Not so sure.

DirectX 12 is also an X-factor. It supposedly will increase FPS once games are ported to it. If one Titan X can hit 60fps with DX12, it may not make sense to get two right off the bat.
 
Afford? Yes. Be able to explain to the wife why two video cards is better than one? Not so sure.

DirectX 12 is also an X-factor. It supposedly will increase FPS once games are ported to it. If one Titan X can hit 60fps with DX12, it may not make sense to get two right off the bat.

I'm hoping for the same, but whatever fps it adds (if it adds anything at all) will just get chewed right back up by developers through more overhead, which is what always seems to happen haha.

Single Titan X is enough for most of us at the moment, particularly overclocked. But my comments are addressing Brahmzy, who says he's inexplicably going 980 Ti SLI instead of Titan X SLI, despite just buying a 7500.

Personally I don't mind the small jaggies that 4k non-AA produces so far. For those who must have 4xAA and 60fps+, SLI Titan X, or shortly 390X Crossfire 8GB, awaits.
 
Are these reps the guys you contacted via the phone or online? They are reading off the sheets and are giving you the wrong answer. These are the same guys who told an AVSForum member that the 7100 has a Nano crystal display (aka Quantum Dots). They clearly have no clue.

Ultra Clear Pro is glossy; Ultra Clear is semi-gloss. The names are made up, but the differences are real. The Pro is used on the 7100 all the way up to the JS9500. One reflects more if there's a light source behind it; the other reflects less direct light, yet reflects more ambient. When off, the glossy has a more mirrored reflection than the semi, but when the two are turned on and the screen is not black or a solid dark color, the Pro (glossy) reflects less ambient light. That means you see less reflection from your surroundings as long as there are no direct light sources (windows, lamps, etc.). The blacks are blacker, and colors have more punch on the glossy. It also allows more light through, which is why peak brightness is higher and contrast is slightly superior. It's probably the same panel with a different surface. As for motion blur, I have no idea how they achieve it, but it's probably not related to the surface.



The rest are subjective. Sorry, but you may have a top-of-the-line pro IPS that will be clearly superior to a run-of-the-mill IPS. IPS panels come in all flavors, so it's impossible to say unless you see it yourself.

There are color shifts due to the huge size, but it's much better than TN. You'll notice it in the corners with solid colors (screen all one color, or a gradient), but with a game running it's very difficult to see the shift. You trade slight color shift for minimal blooming, no IPS glow, much better contrast, etc. Don't fall into the trap of thinking a 5000:1 contrast ratio is twice as good as 3000:1; like all things, as the numbers increase it's more difficult to make out the difference. My plasma has a 10k:1 contrast ratio, and it's difficult to say that one is clearly more contrasty than the other. You should check out the Rtings review for the numbers.
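To put a number on the diminishing-returns point, perceived differences in dynamic range track roughly logarithmically, so the jump from 3000:1 to 5000:1 is far smaller than the raw spec-sheet numbers suggest. A quick back-of-the-envelope sketch in Python (the log2 "stops" framing is my own illustration, not something from the reviews):

```python
import math

def contrast_gain_stops(old_ratio, new_ratio):
    """Rough perceived gain between two contrast ratios, expressed
    in photographic stops (doublings of dynamic range)."""
    return math.log2(new_ratio / old_ratio)

# 3000:1 -> 5000:1 looks like a big spec-sheet jump...
print(round(contrast_gain_stops(3000, 5000), 2))   # ~0.74 stops

# ...but it's less than the single full stop from 5000:1 -> 10000:1,
# and well under the ~1.58 stops you'd see going 1000:1 -> 3000:1.
print(round(contrast_gain_stops(5000, 10000), 2))  # 1.0 stop
print(round(contrast_gain_stops(1000, 3000), 2))   # ~1.58 stops
```

In other words, 5000 vs 3000 is under one "doubling" of range, which matches the experience that a 10k plasma and a 5k VA panel are hard to tell apart on contrast alone.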

You're asking for another subjective opinion on superiority. The difference in price for one person is not much, whereas for another it's rent money. You should buy both and return the one you don't like.

That is a point I just noticed. I thought both panels were glossy, but the 6000 series is semi-gloss. They were chat reps. At first I figured since the rep knew what I was talking about, they had good information... but obviously that wasn't the case.

Glossy is a huge deal, and wins every time in my eyes. This may be worth the extra price tag alone. I am tempted right now to buy the 7500 and try them side-by-side. Better color, clarity, contrast, and motion blur may make it all worth it. But the question is, how much are they different... if it is negligible, then perhaps not so.

I don't have an issue using this BenQ at full brightness (350 cd/m2). I love it actually... daytime games feel like you're right there in the sun, LOL. It may be a different story with a 40" screen though. I'm not sure how the measurement works... is cd/m2 the total light output of the display, or just light over a measured portion? Because if those numbers are very close between the Samsung and the BenQ I have now, the two could be either similar or very different in total output (32" vs 40").
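For what it's worth on the cd/m2 question: it's luminance, i.e. light per unit of screen area (candela per square metre), not total output. Two displays at the same cd/m2 look equally bright patch-for-patch, but the larger one puts more total light into the room. A rough sketch (assuming 16:9 panels; the function names are mine):

```python
import math

def screen_area_m2(diagonal_inches, aspect=(16, 9)):
    """Panel area in square metres for a given diagonal and aspect ratio."""
    w, h = aspect
    diag_m = diagonal_inches * 0.0254          # inches -> metres
    scale = diag_m / math.hypot(w, h)          # metres per aspect unit
    return (w * scale) * (h * scale)

LUMINANCE = 350  # cd/m2, same setting on both displays

for size in (32, 40):
    area = screen_area_m2(size)
    # Total output toward the viewer scales with panel area:
    print(f'{size}": {area:.2f} m^2, ~{LUMINANCE * area:.0f} cd total')

# At the same cd/m2 setting, the 40" emits (40/32)^2 ≈ 1.56x the light
# of the 32", which is why the same "brightness" can feel more intense
# on a bigger screen.
```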

imyourzero, are you receiving your 7500 on Thursday? Please please please, do a great comparison! Image quality is most important, with motion blur being second (in my opinion). I can't wait to see your findings :D.

EDIT: Cyph, Brahmzy, imyourzero, are any of you able to do comparisons of the motion blur with both screen either duplicated and taking a picture, or run the UFO blur tests and compare pictures of each display individually?
 
imyourzero, are you receiving your 7500 on Thursday? Please please please, do a great comparison! Image quality is most important, with motion blur being second (in my opinion). I can't wait to see your findings :D.

You don't need imyourzero, or anyone else, to do anything. It's already been done in the thread by 7500 owners who have returned their 6700s and are happy.

You know you're a display elitist; you prefer perfection over slop, and every little thing has to be exactly right or you're not happy... fair to say, no? So why screw with the 6700 knowing this? Why waste your time... it's not going to work, knowing your tastes... shoot for the best from the start.

Suck it up, pay the extra $450, and order a 7500 to eval for yourself. Just for reference, I believe we've had 4 people in the thread do in-home side-by-side comparisons between the 6xxx models and the 7xxx models, and all of them returned the 6xxx model. That should tell you something.

But still, only YOU can determine if you'll be happy with a display in person, in your own environment, running your own tests, and internet "image comparisons" aren't going to make a difference. Best of luck.
 
^^ Agreed - just get the 7500 up front. The differences are 'enough.'

Now you have me thinking about a step-up on my 980 to a Titan X darnit!
 
The 7500 might be worth the extra $450 but I'd rather invest the extra money in a GPU upgrade or in an upcoming VR device. I'm perfectly happy with the 6700, and hopefully they'll improve on a few things (like game mode response time, etc) with upcoming firmware updates.
 
'Realistic motion blur'. Put the pipe down, son!

There is NOTHING realistic about staring at a computer screen. Stop with the nonsense already. That is NOT how our eyes/brains process motion or the images we see.

So why do you bother paying so much for a graphics card, if not for greater realism such as motion blur, which surely everyone can verify exists? This is not a rhetorical question.
 
^^ Agreed - just get the 7500 up front. The differences are 'enough.' Now you have me thinking about a step-up on my 980 to a Titan X darnit!

The 7500 might be worth the extra $450 but I'd rather invest the extra money in a GPU upgrade or in an upcoming VR device. I'm perfectly happy with the 6700, and hopefully they'll improve on a few things (like game mode response time, etc) with upcoming firmware updates.

I'm in the same boat with the second GPU...I know I'm going to need it, it's only a matter of time. Trying to hold off the massive temptation Brahmzy and the others are providing for swapping the 6700 for a 7500 because of the $450 I can then put towards another Titan lol...
 
I'm in the same boat with the second GPU...I know I'm going to need it, it's only a matter of time. Trying to hold off the massive temptation Brahmzy and the others are providing for swapping the 6700 for a 7500 because of the $450 I can then put towards another Titan lol...

If you're happy with the 6700, keep it and don't look back. I wasn't happy - I wish I would've been, I could've saved $450!
 
That could be, but it's likely not the only reason or even the primary factor.

In this particular case, however, the Titan X being a DX12 card with 12GB means it will retain value extremely well, as it won't be perceived as being hardware limited...ever. Meanwhile the 780 Ti, 970, and both the 980 and 980 Ti will all be perceived by the masses (right or wrong) as VRAM limited, and will suffer on the secondary market as new flagship cards start shipping with 8GB+, just as the 390X is doing.

Either way, it's a $500 difference between Titan X SLI and 980 Ti SLI, and absolutely not worth gambling over. Hell, we're not even sure the 980 Ti will ship with full cores/clocks, which is potentially another strike against it.

If you can afford $3000 (48JU7500 + 980 Ti SLI) then you definitely can afford $3500 (48JU7500 + Titan X SLI) and it makes no sense to go with the inferior option.

I've heard DX12 may enable stackable VRAM, so 980 Ti SLI would essentially be a 12GB setup. Of course there's no 100% guarantee that this will happen, so you're right that Titan X SLI is the safer investment.
 
What a game it is... all of this nonsense is too expensive, but the GPU side has been out of control in terms of price inflation over the past decade.


The problem is that with any given kind of rendering (a given set of low-hanging fruit and tradeoffs) there's an approaching law of diminishing returns. There's only so much that can be done with the way current graphics pipelines draw a representation of the world.

So despite Moore's law and the parallelized nature of CG, the hardware can't really keep up with the expectations of the top end of the market. That last part is key, because cheaper hardware like the 960 renders the latest games perfectly fine and looks better with every generation of software; it's just incredibly expensive to look much better than the 960.
 
I read that with DX12, multiple GPUs will no longer be processing the same data at any given time, but will work more like a single card.
Today, if you have two 4GB cards, you essentially don't get a usable 8GB, but only 4GB, since they are both mirroring their workload.

But DX12 will apparently change that, so two 4GB cards will truly mean 8GB.

So those of you who are concerned about the amount of VRAM, there will soon be no need for that. Even two GTX 970/980 cards, or two of the upcoming 380X cards, should do wonders with DX12 games in 4k.
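The mirrored-vs-pooled distinction boils down to simple arithmetic: today's AFR SLI/Crossfire duplicates every asset on each card, so effective VRAM equals one card's framebuffer, while DX12's explicit multi-adapter mode can (if the engine does the work; it isn't automatic) treat the cards as one pool. A toy sketch of the difference, with function and parameter names of my own:

```python
def effective_vram_gb(per_card_gb, num_cards, pooled):
    """Usable VRAM across a multi-GPU setup.

    Mirrored (today's AFR SLI/Crossfire): every card holds a full copy
    of the working set, so capacity doesn't grow with card count.
    Pooled (DX12 explicit multi-adapter, engine permitting): capacity
    adds up across cards.
    """
    return per_card_gb * num_cards if pooled else per_card_gb

# Two 4GB cards today vs. under a pooled DX12 engine:
print(effective_vram_gb(4, 2, pooled=False))  # 4
print(effective_vram_gb(4, 2, pooled=True))   # 8

# The 980 Ti SLI argument in the thread: 6GB mirrored now,
# potentially 12GB if a game pools memory under DX12.
print(effective_vram_gb(6, 2, pooled=True))   # 12
```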
 
Again, your post makes no sense. What does one have to do with the other?

I edited the post to make it more clear. You clearly pay a lot for hardware to run these latest games. Most people do so because both the latest software and hardware render the world with greater realism; otherwise they'd just play Quake 1 (or a derivative of it), or just get a 960. Since you shun realism, perhaps you have other reasons.
 
I've heard DX12 may enable stackable VRAM, so 980 Ti SLI would essentially be a 12GB setup. Of course there's no 100% guarantee that this will happen, so you're right that Titan X SLI is the safer investment.

I read that with DX12, multiple GPUs will no longer be processing the same data at any given time, but will work more like a single card.
Today, if you have two 4GB cards, you essentially don't get a usable 8GB, but only 4GB, since they are both mirroring their workload.

But DX12 will apparently change that, so two 4GB cards will truly mean 8GB.

So those of you who are concerned about the amount of VRAM, there will soon be no need for that. Even two GTX 970/980 cards, or two of the upcoming 380X cards, should do wonders with DX12 games in 4k.

Plus the bullet point about being able to mix/match different card models in SLI or Crossfire...or even between different vendors (yeah right, Nvidia would probably install firmware code that requires a $99 unlock fee just to add an old AMD card haha).

DX12 is exciting, but the old guys on the board have been through DirectX launch PR campaigns all the way back to...DX7? Was that the first "go to the press and hype" DirectX release where MS's marketing department finally thought to promote these?

I can't remember, but with every DX generation they promise the moon and we're left a bit underwhelmed. Maybe this time it's different, very much hope so, good stuff.

That could also be why AMD seems to be pursuing a higher-end 395X model with just 8GB total for the Fall (dual Fiji GPUs on one PCB, with just a 4GB framebuffer for each GPU): they know the stackable VRAM concept actually works, and that $1250-$1500 card will have longer legs than it would seem today (meaning no one would ever spend that much on a 4GB card without stackable VRAM).

But who knows, hard to call all this as early as it is.

Personally, I believe a pair of used Titan Xs will resell a bit higher than a pair of 980 Ti cards a few years from now, so the net cost of paying a premium for Titan X now, while enjoying the perks it brings, isn't as bad as the $500 hit it appears to be at the cash register. Might lose say $250 total on both cards when all is said and done, possibly even less... which, amortized over that much time, is nothing.

Fun to speculate though.
 