New Samsung 4k for everyone.

They're so stupid. 4K came out of their lips about a hundred times during the PR campaign for the Fury X, yet it ships with 4GB and HDMI 1.4. :p

Just...wow.

4K in the living room, even. The living room format is automatically HDMI 2.0.
 
Rtings updated their test methodology: they now measure 12 pixel transitions per TV, with a method similar to TFT Central's.
They have retroactively updated the 2015 TV reviews with the corresponding images!

JU6500:
http://www.rtings.com/images/reviews/ju6500/ju6500-response-time-large.jpg
JU6700:
http://www.rtings.com/images/reviews/ju6700/ju6700-response-time-large.jpg
JU7100:
http://www.rtings.com/images/reviews/ju7100/ju7100-response-time-large.jpg

On average, the 7 series is about 3-4ms faster in pixel response times than the 6 series (it appears to use a stronger overdrive impulse).
Strangely, the 0%-20% (0 -> 50, black to dark grey) rise time is faster on the JU6500 than on the other two.
I should add they tested only the 55'' version; there might be slight differences at other panel sizes.
They also added backlight graphs that show PWM modulation and duty cycle (it looks like all Sony TVs this year are flicker-free).
Maybe they will eventually test the 7500 and 9 series as well.
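For anyone curious what measurements like that involve: a photodiode is aimed at the panel, the transition is captured oscilloscope-style, and the rise time is read off the trace. Here's a minimal sketch of the common 10%-90% convention in Python; the exact thresholds and filtering rtings uses aren't published, so treat the details as assumptions.

import numpy as np

def rise_time(t, lum, lo=0.1, hi=0.9):
    """Rise time of one pixel transition from a sampled luminance trace,
    using the common 10%-90% convention."""
    lum = np.asarray(lum, dtype=float)
    start, end = lum[0], lum[-1]
    norm = (lum - start) / (end - start)   # 0.0 = start level, 1.0 = end level
    t_lo = t[np.argmax(norm >= lo)]        # first sample crossing 10%
    t_hi = t[np.argmax(norm >= hi)]        # first sample crossing 90%
    return t_hi - t_lo

# Toy trace: an exponential 0 -> 50 (black to dark grey) transition.
t = np.linspace(0, 40, 400)        # 40 ms capture window, timestamps in ms
lum = 50 * (1 - np.exp(-t / 5))    # time constant of 5 ms
print(f"rise time: {rise_time(t, lum):.1f} ms")   # ~11 ms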
 
Zarathustra[H];1041674011 said:
Yeah,

I'm usually one to wait for official information before making any decisions, and I was following the "coming soon, few weeks" thread since January. I was hoping to support AMD and get a top end AMD GPU this time around, but apparently that isn't happening.

Usually I would have waited for the final reviews, but since this came from an actual AMD rep, I decided it was more reliable than most rumors. I didn't want to be stuck on the flipside, fighting for available 980 Ti inventory once the Fury X reviews drop, so I went for it.

Now I'm trying to decide if - for the older titles I play - a single 980ti will be sufficient for 3840x2160, or if I really ought to get another one...

It's difficult to find good figures on it. How much faster - on average - would you guys say a 980ti is than the original Titan, specifically in Unreal Engine 3 titles if possible? I can't find numbers on this at all.

I'm going on the assumption that since 3840x2160 has double the pixels of my 2560x1600 screen, if everything scales linearly I'll need double the GPU power for the same frame rates (this may be a bad assumption, though).

The GTX 980 Ti seems to deliver 47% to 49% more fps than the original Titan, at least in these titles @1080p...
http://imgur.com/2oitcl1
http://imgur.com/Lc7i3dQ
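Running those numbers under the linear-scaling assumption gives a rough feel for it. The 60 fps baseline and the perfect 2.0x SLI scaling below are hypothetical, purely for illustration:

# Back-of-the-envelope estimate under the linear pixel-scaling assumption.
old_pixels = 2560 * 1600   # 4,096,000
new_pixels = 3840 * 2160   # 8,294,400
pixel_ratio = new_pixels / old_pixels
print(f"pixel ratio: {pixel_ratio:.2f}x")   # ~2.03x, so "double" is about right

titan_fps = 60.0    # hypothetical Titan fps at 2560x1600
ti_uplift = 1.48    # ~48% faster, per the charts above
one_ti = titan_fps * ti_uplift / pixel_ratio
two_ti = one_ti * 2.0   # assumes perfect SLI scaling, which real games won't hit
print(f"one 980 Ti @ 4K: ~{one_ti:.0f} fps")   # ~44 fps
print(f"two 980 Ti @ 4K: ~{two_ti:.0f} fps")   # ~88 fps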
 
Zarathustra[H];1041674541 said:
I'm just trying to decide if I should grab a second 980ti now, so I can have two identical cards, while they are still in stock, rather than waiting for the first one to decide whether I need it or not...

This is starting to get into The Vacation Budget™, which may imply some nights of Sleeping on the Couch™.

:p

My suggestion is to wait on a second card. Not all games require the same amount of video processing, so I would play the games you want and watch your frame rates. If they aren't what you want or need, then get a 2nd card. This is my methodology...!
 
My suggestion is to wait on a second card. Not all games require the same amount of video processing, so I would play the games you want and watch your frame rates. If they aren't what you want or need, then get a 2nd card. This is my methodology...!

Same for me. I thought I would "need" a second Titan X for sure, but after three months and half a dozen games played, I haven't needed to add one yet.

Part of that is the card OCing so well; it's stable at a 1500MHz boost clock (modded BIOS and water cooling), which adds a lot of fps versus the stock card. The 980 Ti should be able to hit that or get real close too.
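If you want a rough number on what the overclock contributes: assuming the ~1075 MHz reference boost clock and fps that scales one-to-one with core clock (optimistic; memory bandwidth and other limits usually eat into it), the math looks like this:

# Upper-bound estimate of the fps gain from the overclock mentioned above.
reference_boost = 1075   # MHz -- Titan X reference boost clock (assumed)
oc_boost = 1500          # MHz -- the modded-BIOS, water-cooled result
gain = oc_boost / reference_boost - 1
print(f"best-case fps gain: ~{gain * 100:.0f}%")   # ~40%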
 
In regards to the Samsung promotion: I just purchased a new JS9000 from Crutchfield and have initiated a return of my current JS9000 to Amazon. It's a shame to send back a good unit (i.e., no obvious pixel anomalies or display defects), but Amazon wouldn't honor the promotion - they even recommended I return the display.

At least for a brief period, I'll have two JS9000s in the house. What a splendid sight it will be. ;)
 
Zarathustra[H];1041674541 said:
I'm just trying to decide if I should grab a second 980ti now, so I can have two identical cards, while they are still in stock, rather than waiting for the first one to decide whether I need it or not...

This is starting to get into The Vacation Budget™, which may imply some nights of Sleeping on the Couch™.

:p

I actually value the benefit of two cards, especially now that I'm driving a 4k display, but this is largely dependent on the games I enjoy (e.g., IL-2: BOS, ArmA 3, etc.). You should evaluate your list of favorite games and determine if the cost/benefit ratio is favorable for you.

For reference, I have two vanilla GTX 980s.
 
They're so stupid. 4K came out of their lips about a hundred times during the PR campaign for the Fury X, yet it ships with 4GB and HDMI 1.4. :p

Just...wow.

Holy shit, if they actually shipped Fury with HDMI 1.4 they are fucking retarded. No excuse. It's hard to believe they are that stupid.
 
In regards to the Samsung promotion: I just purchased a new JS9000 from Crutchfield and have initiated a return of my current JS9000 to Amazon. It's a shame to send back a good unit (i.e., no obvious pixel anomalies or display defects), but Amazon wouldn't honor the promotion - they even recommended I return the display.

At least for a brief period, I'll have two JS9000s in the house. What a splendid sight it will be. ;)

With it being such a hassle to get a JS9000 without a pixel issue, I'd gladly give up the new promotion if I already had one without an issue.
 
In regards to the Samsung promotion: I just purchased a new JS9000 from Crutchfield and have initiated a return of my current JS9000 to Amazon. It's a shame to send back a good unit (i.e., no obvious pixel anomalies or display defects), but Amazon wouldn't honor the promotion - they even recommended I return the display.

At least for a brief period, I'll have two JS9000s in the house. What a splendid sight it will be. ;)

Definitely hold onto the Amazon one until you can validate the Crutchfield one for pixel issues, etc.
 
With it being such a hassle to get a JS9000 without a pixel issue, I'd gladly give up the new promotion if I already had one without an issue.

I've considered that. If the new display has any problems, I can just return it to Crutchfield. My Amazon return won't activate until after I have the new display, so I'll have a brief period where I can evaluate both of them.

Definitely hold onto the Amazon one until you can validate the Crutchfield one for pixel issues, etc.

Definitely!
 
I made that decision three weeks ago. I wanted to 'want' the 9000, honestly, but the truth was it wasn't dramatically different from the 7500 in my tests. A small jump, in other words.

I did feel the 6700 to 7500 jump was worth it, but not the 7500 to 9000 jump.

All of them are great truthfully.

I also felt the jump from the 7500 to the 9000 was not enough for me to keep the 9000. Granted, the 9000 I had had some bad backlight bleed, so that helped make my decision. But the 9000 didn't have any stuck pixels, which I was worried about.
My 7500 is (almost) perfect panel-wise. It has very, very little backlight bleed that you really have to look for. It also doesn't have any stuck pixels.
I haven't had any time with the 6700, so I can't comment on that.
This is just my opinion, but the 7500 (48") was the best fit for me.
 
I actually value the benefit of two cards, especially now that I'm driving a 4k display, but this is largely dependent on the games I enjoy (e.g., IL-2: BOS, ArmA 3, etc.). You should evaluate your list of favorite games and determine if the cost/benefit ratio is favorable for you.

For reference, I have two vanilla GTX 980s.

Yeah, I ordered a second one, but I didn't notice that it said "limit one per customer", so they canceled my order for the second one. Now I have to wait 28 more hours for the second and hope they are still in stock so I can get two of the same model.
 
I know it's somewhat irrelevant, but I'm loling at how much money AMD lost within 24 hours to this thread alone. Probably at least 5-10 of us who were Fury-leaning are now for sure getting Tis.
 
I know it's somewhat irrelevant, but I'm loling at how much money AMD lost within 24 hours to this thread alone. Probably at least 5-10 of us who were Fury-leaning are now for sure getting Tis.

If I were Su, I would find out who, or which group of people, made that decision and terminate them. That's brutal, but that's the magnitude of this mistake (at the high end).

It's nothing personal; we just can't have mistakes like that when we're struggling to survive and losing market share by the second. Pack your desks and GTFO.
 
They're just targeting that core demographic of people who will spend the big bucks and be satisfied with 4K at 30 fps. You know, console gamers three years from now.
 
Zarathustra[H];1041675556 said:
Did you guys find your One Connect boxes to be loud at all?

No, but I do hate that thing. Can't seem to tuck it out of the way properly.

What sort of noise is yours making?
 
So I now have the JS9500 65" and the JS9000 55" sitting side by side.

Before I hit my main question: how do I check to ensure it's a perfect monitor? I.e., zero dead pixels, no pixel bleed, accurate colors?

My issue is that I am using this as a computer monitor. My personal preference would be the 9500 at 55". Unfortunately, that is not to be had....

Was there a reason they did not make it at 55"? When I compare the sheer size of the 65" JS9500 to the 55" JS9000, it's not just a larger screen; the sides/thickness of the panel are about 3x larger. It's a massive increase in size compared to the slimmer JS9000 profile.

Computer-wise, will I gain that much more at 65"? I do not think so, especially when you consider it's in a smaller room, being mounted to a wall with an OmniMount, which may or may not work over 60".

Have any of you here had any experience with the two?
 
Is AMD really banking on DP connectivity that much, assuming that many people won't be connecting their card to a display with HDMI inputs, or at least not ones that would benefit from the bandwidth that 2.0 provides?

I just cannot think of one good reason to go with the older version on your cutting-edge card. It seems like more than just an oversight. I mean, you have an R&D team full of experts who are obviously aware of HDMI 2.0, and surely they had to consider, at some point, that people will want to use these with 4K displays over HDMI. It's like AMD assumed that the customer base who would want to connect Fury to HDTVs wasn't worth bothering with. I have to wonder what percentage of sales people like us make up. Even if it's small, the money AMD loses by removing their card as an option has to be greater than what it would have cost to put a freakin' HDMI 2.0 output on there.

Even if you're going to use the card with a traditional PC monitor, many people prefer HDMI to DP anyway, so this screws them too, since they'd be locked to 1.4a if they wanted to use HDMI.

What a boneheaded move.
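The bandwidth math makes the limitation concrete. A quick sketch, assuming the standard CTA-861 4K timings (4400 x 2250 total, including blanking intervals) and 8-bit RGB; HDMI's 8b/10b encoding leaves 80% of the raw link rate for pixel data:

# Why HDMI 1.4 tops out at 4K30 while HDMI 2.0 handles 4K60.
def required_gbps(h_total, v_total, hz, bits_per_pixel=24):
    """Video bandwidth needed, counting blanking intervals."""
    return h_total * v_total * hz * bits_per_pixel / 1e9

# Effective data rates after 8b/10b encoding (80% of the raw link rate).
hdmi_1_4 = 10.2 * 0.8   # 8.16 Gbps effective
hdmi_2_0 = 18.0 * 0.8   # 14.4 Gbps effective

for hz in (30, 60):
    need = required_gbps(4400, 2250, hz)   # CTA-861 4K total timings
    print(f"4K@{hz}: needs {need:.2f} Gbps | "
          f"fits 1.4: {need <= hdmi_1_4} | fits 2.0: {need <= hdmi_2_0}")
# 4K@30: needs 7.13 Gbps  | fits 1.4: True  | fits 2.0: True
# 4K@60: needs 14.26 Gbps | fits 1.4: False | fits 2.0: True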
 
So I now have the JS9500 65" and the JS9000 55" sitting side by side.

Before I hit my main question: how do I check to ensure it's a perfect monitor? I.e., zero dead pixels, no pixel bleed, accurate colors?

My issue is that I am using this as a computer monitor. My personal preference would be the 9500 at 55". Unfortunately, that is not to be had....

Was there a reason they did not make it at 55"? When I compare the sheer size of the 65" JS9500 to the 55" JS9000, it's not just a larger screen; the sides/thickness of the panel are about 3x larger. It's a massive increase in size compared to the slimmer JS9000 profile.

Computer-wise, will I gain that much more at 65"? I do not think so, especially when you consider it's in a smaller room, being mounted to a wall with an OmniMount, which may or may not work over 60".

Have any of you here had any experience with the two?

Sad to hear... but probably necessary due to the full-array backlighting. That is the only difference, correct?

I'd actually really like to have full-array backlighting. I use Smart LED, and surely it would make darker areas look better in games, and especially in movies.

But is it worth the price difference? Probably not...
 
Updated my JS9000 to 1217.

So far it feels like lag in all modes went down. Maybe placebo, I'm not sure... might need to test. Anyone else update yet?!
 
Sad to hear... but probably necessary due to the full-array backlighting. That is the only difference, correct?

That and the camera.

But is it worth the price difference? Probably not...

My thoughts as well, plus the JS9500 series was offered in sizes that were just too darn big for me.

Updated my JS9000 to 1217.

So far it feels like lag in all modes went down. Maybe placebo, I'm not sure... might need to test. Anyone else update yet?!

Thinking about it... :)
 
Officially confirmed: no HDMI 2.0 for AMD. It's not possible via firmware or an add-on; it's fundamentally incompatible with the hardware architecture.
 
No, but I do hate that thing. Can't seem to tuck it out of the way properly.

What sort of noise is yours making?

I haven't received mine yet. Probably Monday or Tuesday, but in the meantime I am reading as much as I can, and I've seen people complaining about fan noise in it.

How long is the proprietary cable that goes from the TV to the One Connect box? I'm trying to figure out where behind my desk I might be able to hide it.
 
Officially confirmed: no HDMI 2.0 for AMD. It's not possible via firmware or an add-on; it's fundamentally incompatible with the hardware architecture.

Hilarious. It's like they try to screw things up on purpose; how else can you explain an oversight this simple?? :D

Too bad, too; that Fury X card does look solid in the leaked benchmarks.
 
Apparently not an oversight; probably done to release it sooner. It's not a ground-up architecture; I believe they said in the podcast it's an adaptation of Tonga. Yet they keep touting how the Fury beats the GTX 980 Ti in 4K. I wonder how they could even benchmark the Fury beyond 30fps in 4K...

[Image: AMD Radeon Fury X 4K benchmark slide (amd-radeon-fury-x-4k-qcq6i.png)]
 
Apparently not an oversight; probably done to release it sooner. It's not a ground-up architecture; I believe they said in the podcast it's an adaptation of Tonga. Yet they keep touting how the Fury beats the GTX 980 Ti in 4K. I wonder how they could even benchmark the Fury beyond 30fps in 4K...

DisplayPort.

Even if those benchmarks are legit, i.e. it beats the 980 Ti in every game, it doesn't beat it by enough of a margin to get me excited. Especially considering AMD's driver issues, historically higher power consumption and noise, lack of PhysX and other features... did I mention driver issues?
 
If AMD had a bigger budget, I'm sure they would have invested in developing HDMI 2.0 support. As the underdog, AMD did what they could with limited time and budget to bring a competitive GPU to market. At least they succeeded in making it slightly faster. It's good for the market; otherwise, price per performance from Nvidia would be through the roof.
 
Oh, I totally agree. Competition is essential; I can't imagine nVidia or Intel standing alone in their segments. That's why the lack of HDMI 2.0 is so disappointing and frustrating... because people in this thread WOULD want it for its price/performance ratio. But it's just not an option for the Samsungs and other DisplayPort-less devices unless you want to be crippled.

They did a good job with the performance, they just limited who can/will buy it.
 
Before I hit my main question: how do I check to ensure it's a perfect monitor? I.e., zero dead pixels, no pixel bleed, accurate colors?

I'd like to know this as well.

I was planning on filling the screen with an all-white window and looking for dead pixels, followed by an all-grey window to check for uniformity, but I am not sure if this is sufficient.

Any recommendations are very appreciated.
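For what it's worth, a minimal full-screen pattern cycler along those lines can be done with just Python's standard library; the color list below is arbitrary (the primaries help catch stuck subpixels too). Click or press any key to cycle; Esc exits.

import tkinter as tk

# Solid fills: white (dead pixels), greys (uniformity), primaries (stuck subpixels).
COLORS = ["#ffffff", "#808080", "#404040", "#000000",
          "#ff0000", "#00ff00", "#0000ff"]
idx = 0

root = tk.Tk()
root.attributes("-fullscreen", True)
root.configure(bg=COLORS[idx])

def next_color(event=None):
    """Advance to the next solid fill color."""
    global idx
    idx = (idx + 1) % len(COLORS)
    root.configure(bg=COLORS[idx])

root.bind("<Button-1>", next_color)              # mouse click cycles
root.bind("<Key>", next_color)                   # any key cycles
root.bind("<Escape>", lambda e: root.destroy())  # Esc exits (overrides <Key>)
root.mainloop()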
 
If AMD had a bigger budget, I'm sure they would have invested in developing HDMI 2.0 support. As the underdog, AMD did what they could with limited time and budget to bring a competitive GPU to market. At least they succeeded in making it slightly faster. It's good for the market; otherwise, price per performance from Nvidia would be through the roof.

Definitely true. There is no doubt in anyone's mind that Nvidia wanted to price the 980Ti at $749-$799. No question, those rumors a few months back were absolutely true.

But the chip that eventually became the Fury X was too strong; there was no way they could risk that high a price, hence the "surprise" $649 price point.
 