New Samsung 4k for everyone.

And one more question: do you guys use HDMI 2.0 cables for 4K?
Are there any advantages vs. HDMI 1.4 cables @ 4K?

The truth is that there is no difference in the overall cable design between HDMI 1.4 and 2.0. There is no such thing as an HDMI 1.4 cable vs. an HDMI 2.0 cable. The overall cable specification remains the same, EXCEPT for the bandwidth requirement.

Before the launch of the HDMI 2.0 spec, cables were simply tested to the old spec (10.2Gbps). Since the release of the 2.0 spec, many manufacturers have simply taken the same cables coming off the same production line and instead now test them to the new spec (18Gbps).

The difference is they simply throw away a few more cables at the end of the line, so their yield goes down (or maybe they just bin them and label them as 10.2Gbps cables, who knows).

Many of these same cables previously labeled at 10.2Gbps are fully capable of 18Gbps; there is simply no guarantee.


Now, what do you need for 4k?

It depends on your display.

If you are looking to output 4k to a TV over HDMI, you should ideally have a cable capable of 18Gbps.
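For anyone wondering where that 18Gbps figure comes from, here is a rough back-of-the-envelope calculation (a sketch using the standard 4K60 timing and the TMDS 8b/10b overhead; treat the numbers as approximate):

```python
# Rough HDMI bandwidth estimate for 4K60 at 4:4:4 (RGB), 8 bits per channel.
# Uses the standard 3840x2160@60 timing (4400x2250 total including blanking)
# and the 10/8 TMDS encoding overhead. Approximate, not a spec quote.
h_total, v_total, refresh = 4400, 2250, 60
pixel_clock = h_total * v_total * refresh        # ~594 MHz
bits_per_pixel_444 = 3 * 8                       # RGB / 4:4:4
tmds_overhead = 10 / 8                           # 8b/10b encoding

needed_444 = pixel_clock * bits_per_pixel_444 * tmds_overhead
print(f"4K60 4:4:4 needs ~{needed_444 / 1e9:.1f} Gbps")   # ~17.8 Gbps -> the 18Gbps spec

# 4:2:0 averages 12 bits per pixel, which is why 4K60 can squeeze into a
# 10.2Gbps-class link at the cost of chroma resolution.
needed_420 = pixel_clock * 12 * tmds_overhead
print(f"4K60 4:2:0 needs ~{needed_420 / 1e9:.1f} Gbps")   # ~8.9 Gbps
```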

If the cable (or gpu) is not capable, one of three things will happen depending on the auto-negotiation between the GPU and TV.


1.) You will max out at 30hz instead of 60hz when at 4:4:4 chroma. (I tried this, even on the desktop it sucks)

2.) You will drop down to a lower chroma level and it will not look as good, especially on colored text (blue/red/purple appears to be the worst)

3.) You will get no signal at all.


If you have a 4k monitor with DP inputs, just keep using DP, it should work.


Furthermore, people in this thread have found that long cables - even when labeled as 18Gbps - tend not to work well if you want 4k 60hz at 4:4:4. 6ft (2m) and 10ft (3m) cables tend to work, but once you get to 15ft (4.5m) cables it is hit or miss, and I have yet to see anyone get a cable over 15ft to work.
 
Zarathustra[H];1041803325 said:
It would certainly be an upgrade, but Westmere cores have a pretty huge IPC deficit to Skylake.

It hasn't been much from generation to generation, but the single digit percentages add up over multiple gens.

I would agree, as a "hold me over" strategy, $100 isn't a bad investment, but a Westmere if you can get it to 4.4ghz will still be significantly slower than a Skylake at 4.7ghz.

I'd say the Skylake would be as much as 75% faster in some benchmarks.
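To make the "single-digit gains add up" point concrete, here is a rough compounding sketch; the per-generation IPC numbers and clocks are ballpark assumptions, not measurements:

```python
# Ballpark compounding of per-generation IPC gains from Westmere to Skylake,
# combined with the clock speeds mentioned above. All figures are assumptions.
ipc_gains = {
    "Westmere -> Sandy Bridge": 0.15,
    "Sandy Bridge -> Ivy Bridge": 0.05,
    "Ivy Bridge -> Haswell": 0.10,
    "Haswell -> Broadwell": 0.05,
    "Broadwell -> Skylake": 0.10,
}

ipc_factor = 1.0
for gain in ipc_gains.values():
    ipc_factor *= 1.0 + gain

clock_factor = 4.7 / 4.4   # Skylake @ 4.7GHz vs. overclocked Westmere @ 4.4GHz
total = ipc_factor * clock_factor

print(f"IPC alone: ~{ipc_factor:.2f}x; with clocks: ~{total:.2f}x "
      f"(~{(total - 1) * 100:.0f}% faster)")
# Workloads that also pick up newer instructions (AVX/AVX2) can stretch the gap
# further, which is where figures like 75% show up in some benchmarks.
```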

Where in the heck do you see 75%?

http://www.hardocp.com/article/2015/08/05/intel_skylake_core_i76700k_ipc_overclocking_review/5

It doesn't even get that close in synthetic benches. Less in multimedia and even less of a difference in gaming. If I were to jump to Skylake, I bet I may see like a few fps increase in gaming. But no way is it going to be a 75% increase in realized performance.
 
Where in the heck do you see 75%?

http://www.hardocp.com/article/2015/08/05/intel_skylake_core_i76700k_ipc_overclocking_review/5

It doesn't even get that close in synthetic benches. Less in multimedia and even less of a difference in gaming. If I were to jump to Skylake, I bet I may see like a few fps increase in gaming. But no way is it going to be a 75% increase in realized performance.

In gaming, no.

You are probably going to be limited by the GPU anyway.

I'm talking about the theoretical max ability of the CPU, based on single-threaded render/compute benchmarks (like Anandtech's published single-threaded 3D particle computations and single-threaded Cinebench scores).

Individual game benchmarks are better when you are trying to figure out if you'll be able to play an individual game well, but they are much less useful for predicting overall performance, or how well a CPU might do on future, unspecified titles.

I like to use these more render-based, single-threaded benchmarks as an overall prediction tool to see what each CPU is capable of.

The truth - however - is that in most titles, pretty much any non-low power CPU released in the last 5 years should handle the game just fine, at least until you start adding SLI or Crossfire, as they up the CPU load significantly.

There are always going to be some titles that load the CPU more. I hear Starcraft 2 did this (but I never played it). One of my personal favorites Red Orchestra 2 does as well, but they are outliers, not the norm.

In RO2 with SLI, I actually saw a pretty large improvement going from stock clocks on my 3930k to 4.8GHz. Prior to the overclock it was a stuttery mess with frame time spikes all over the place; after, it is a much smoother experience.

I had temporarily down-clocked the CPU as my old fans were failing and I didn't think I needed the overclock anyway. Once I added the 980 Ti's in SLI, it turned out I was wrong.
 
Okay, Nilsen, and what are the advantages of this Audioquest Cinnamon cable vs. some regular HDMI 1.4 cables?
Where do you see improvements?

Cheap cables are unreliable, known to break, and you never know what you get.

Brand cables, like mine, deliver on their promise. They just work. Last I checked, a brand cable was only twice the price of the cheapest one I could find. Mine was a little more than three times as expensive. Why bother?

I have none of the cable issues reported earlier in this thread. Everything is rock solid and stable. No crappy cable artifacts. I can trust that piece of cable. I find this new tech surprisingly mature and hiccup-free.

As for comparing IQ, I have not looked into that yet. But I will. Hopefully, I get to test the Audioquest Coffee. Usually, sharing cable improvements just results in a lot of nonsense arguments about people's brains and their perception. But if people show any interest, sure.
 
Usually branded cables have absolutely no impact at all. That being said, HDMI 2.0 is relatively new, so I'm not sure all the no-names are to be trusted yet :p

As far as quality goes, HDMI is a digital standard; as such, there is absolutely zero difference in quality between cables, as long as the cable is good enough that it works.

If it works it works, and the output will be bit for bit identical regardless of the cable.

The argument for "quality" cables was always pretty weak, but it made a little bit more sense for analogue cables than it does for digital signals.

With digital, if the cable is bad, you know it. If it doesn't look bad, improving the cable "quality" has no impact at all. The output will be bit for bit identical.

So called "enthusiast" cables or "audiophile" cables have always just been a way to separate fools from their money. :p

Cable quality DOES matter in one way though, and that is that you want a cable that is going to be sturdy and not come apart.

Anyway, do yourself a favor, get a cheapish (sub-$10) cable from a semi-trusted discount brand (Mediabridge, Monoprice, etc.) and if you have trouble just get a different one.

It never makes sense to spend more than $10 on a 6-10 ft AV or Ethernet cable, and you can usually get away with much less.
 
I tend to buy whatever's cheapest when it comes to cables. Back in the day, I was fed all the BS on high-performance cables :rolleyes: and would spend triple the cost for "Monster" labeled cables...yeah, I was a misinformed dumbass....I did end up buying a new $10 HDMI 2.0 spec cable, even though my $2 eBay cable worked when I plugged it in. I figure I paid good money for the TV and for building my computer, so why go cheap on the item that sends information between the two? My $.02
 
firmware 1224 for which set?...my JS9000 says 1219 is the latest for it

I think they stagger the updates to go easy on their servers.

I had complained to Samsung about my one connect box being unusually loud, so they sent me an RMA unit (which is much quieter)

The new box looks identical to and has the same part number as the old one.

(This is where I should note that the screen itself is just a panel; all the logic resides in the One Connect box)

Before swapping out the box, I was on firmware 1220.

Upon installing the new box, I upgraded the firmware to 1219, and it claims there are no more updates for me.

I'm guessing the updates are staggered so they don't kill the servers.


Side note and unrelated.

We know that if you have a one connect box, all the logic resides there, and the panel is just a panel.

We also know that Display Port is able to natively communicate with panels, without any logic in between.

I wonder if it would be possible to create a display port to Samsung one connect adapter and use the panel natively, possibly at 120hz?

That would be fantastic. One would have to reverse engineer the backlight controls though.
 
Just got upgraded to the latest firmware for my 9000 when I woke my system up this morning.

The mouse cursor disappears on white now.

Had to go into Control Panel > Hardware > Mouse and change the cursor scheme to inverted and assign a new Hand icon to the hand pointer.

It's working, but I expect that there will be another update soon if this is related to Samsung.

I have not installed or changed anything on my machine since yesterday other than the Samsung update that fired when I started up.

Running Windows 7 Pro
 
Hey buddy!
Suoermi here from OCN!!! REPRESENTING HAHAHA

Just Google "stuck pixel tester," open it in your browser in F11 full screen, cycle through the colors, and comb the screen for hot/stuck pixels.

If you are pressed for time try white screen for dark dead pixels and black screen for hot pixels.

If you wanna check backlight uniformity etc., then use the black screen, and even a grey screen at your desired brightness/backlight level.

Then it is up to your eyes whether it meets your quality expectations! :D
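If you would rather not rely on a random website, a bare-bones full-screen color cycler is easy to throw together yourself; here is a sketch using Python's built-in tkinter (any key or mouse click advances the color, Esc quits):

```python
# Minimal full-screen stuck/dead pixel tester using the standard-library tkinter.
# Cycle through solid colors and comb the panel; press Esc to exit.
import tkinter as tk

COLORS = ["white", "black", "red", "green", "blue", "gray50"]

def main():
    root = tk.Tk()
    root.attributes("-fullscreen", True)      # true full screen, no window chrome
    root.config(cursor="none", bg=COLORS[0])
    state = {"i": 0}

    def next_color(_event=None):
        state["i"] = (state["i"] + 1) % len(COLORS)
        root.config(bg=COLORS[state["i"]])

    root.bind("<Key>", next_color)            # any key advances the color
    root.bind("<Button-1>", next_color)       # so does a left click
    root.bind("<Escape>", lambda e: root.destroy())  # Esc quits (more specific binding wins)
    root.mainloop()

if __name__ == "__main__":
    main()
```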

Hey, so I have a 55" JS9000 which I am comparing to a 65" JS9000. I had the JS9500 but the bezel width was HUGE. I couldn't stand it. Did anyone possibly make a dead pixel check screen / calibration color screen for those of us getting new monitors?

Would be great if we had a sticky with this, since I assume this has been asked by other people than me repeatedly.

Thanks!
 
Just got upgraded to the latest firmware for my 9000 when I woke my system up this morning.

The mouse cursor disappears on white now.

Had to go into Control Panel > Hardware > Mouse and change the cursor scheme to inverted and assign a new Hand icon to the hand pointer.

It's working, but I expect that there will be another update soon if this is related to Samsung.

I have not installed or changed anything on my machine since yesterday other than the Samsung update that fired when I started up.

Running Windows 7 Pro

Which firmware revision is this? It looks like we are all getting them at different times.
 
Also, the GTX 950 launched today, and it seems to be a good new choice for those who want to output 4k at 60hz with 4:4:4 chroma to these screens, but aren't too concerned with games.
 
Anybody hear back from Samsung on the free Galaxy S 6 offer? Last email I got was three weeks ago saying that I'd get my final status in two weeks...
 
Same here a month ago.

We still need to check all your submission’s data to ensure compliance with the Offer Terms and Conditions before your submission is approved. Please allow approximately 2 weeks to receive an update email letting you know your final status.
 
Zarathustra[H];1041804076 said:
...
So called "enthusiast" cables or "audiophile" cables have always just been a way to separate fools from their money. :p
...

Wow, that did not take long.

You are aware that you just called me a fool. Without any basis whatsoever.

Zarathustra[H];1041804076 said:
... As far as quality goes, HDMI is a digital standard; as such, there is absolutely zero difference in quality between cables, as long as the cable is good enough that it works.

If it works it works, and the output will be bit for bit identical regardless of the cable. ...

I cannot find a single HDMI cable test that really documents any claim at all. Not a single site gets anywhere near solid methodology. It is all basically "believe me, this is how it is, because I say so" stuff. Claims are all over the place. There is no repeatability, thus no real proof of anything really.

Given that there either is a difference between cables or not, and that the difference either is detected or not, there are four possible results, not just one. Discussing only the result where there is no difference and it cannot be detected is simply unscientific: in the world of science, it is a huge logical blunder.

Anyway. The more I think of it, and the more I look into it, trying that high-end cable seems like the reasonable route to take. After all, the 65/7005 image is a tad soft, and it sharpens as if it is a tad noisy. There is room for improvement.

After reading a ton of cable tests in which people, for instance, put a film on pause in 1080p to test IQ, I wonder what they expected to find with such methodology. How are you going to separate compression artifacts from noise when you are using a heavily compressed, lossy, uncontrolled image? To test sharpness at a pixel level?

None of these tests mentions which smart features are disabled or enabled either, as some of these alter the image uncontrollably.

So everything is up in the air to me, all four of the possible results.

And I would like a tad sharper image.
 
Wow, that did not take long.

You are aware that you just called me a fool. Without any basis whatsoever.



I cannot find a single HDMI cable test that really documents any claim at all. Not a single site gets anywhere near solid methodology. It is all basically "believe me, this is how it is, because I say so" stuff. Claims are all over the place. There is no repeatability, thus no real proof of anything really.

Given that there either is a difference between cables or not, and that the difference either is detected or not, there are four possible results, not just one. Discussing only the result where there is no difference and it cannot be detected is simply unscientific: in the world of science, it is a huge logical blunder.

Anyway. The more I think of it, and the more I look into it, trying that high-end cable seems like the reasonable route to take. After all, the 65/7005 image is a tad soft, and it sharpens as if it is a tad noisy. There is room for improvement.

After reading a ton of cable tests in which people, for instance, put a film on pause in 1080p to test IQ, I wonder what they expected to find with such methodology. How are you going to separate compression artifacts from noise when you are using a heavily compressed, lossy, uncontrolled image? To test sharpness at a pixel level?

None of these tests mentions which smart features are disabled or enabled either, as some of these alter the image uncontrollably.

So everything is up in the air to me, all four of the possible results.

And I would like a tad sharper image.

It has to do with how digital technology works.

With analogue signals you get noise. A bad cable could theoretically introduce more noise (though in practice this rarely happens, something else is usually the weak point in the system)

With digital signals the output is bit perfect (unless you flip a bit) with a strong signal. As the signal degrades (due to, among other things, cable quality) there will be no difference as long as the receiving side gets the same 1's and 0's as are transmitted. As soon as the signal quality gets poor enough that the signal received is different than what is sent, you get horrific artifacts, or malfunctions.

So, with an analogue signal, as the signal weakens, your quality progressively gets worse.

With a digital signal, you have perfect bit-identical quality as the signal degrades, up until a point when you hit a threshold, and all of a sudden the quality is awful, with artifacting, freezes, disconnects, pixellation, etc. Really horrible stuff that you can't ignore.

For a digital signal, if you already have quality that is non-awful, replacing the cable will do absolutely nothing but waste your money. This is not opinion, this is fact.

Think of it like this. If you have a cassette tape (analog), and it degrades, the sound quality worsens, gets more noisy and hisses.

If you have a CD (digital) and it degrades and gets scratched, at first with small scratches, there is no difference at all. Then all of a sudden you reach the point where it can no longer be properly read, and you have horrible skipping or freezing.

Or with a hard drive, (digital). As it degrades, everything is smooth sailing, until you have your first bad sector, then all hell breaks loose.

The same thing goes for cables.

If your signal is digital, and it works, the most expensive cable in the world can never make a difference.

With an analogue signal, there COULD be a difference (at least it is within the realms of scientific possibility) but even then, it is more often placebo than not, as even most basic cables are sufficiently shielded to not let audible interference in and make enough electrical contact that increased resistance isn't a problem.
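If anyone wants to see the "digital cliff" rather than take my word for it, here is a toy simulation: bits are sent as +/-1V, noise is added, and the receiver simply thresholds at 0V. The numbers are made up and this is not a model of real HDMI/TMDS signalling, but the shape of the result is the point: zero errors until the noise crosses a threshold, then errors everywhere.

```python
# Toy "digital cliff" demo: error rate stays at zero until noise exceeds the
# signal margin, then shoots up abruptly. Illustrative only.
import random

def bit_error_rate(noise_amplitude, n_bits=100_000):
    errors = 0
    for _ in range(n_bits):
        bit = random.choice([0, 1])
        tx = 1.0 if bit else -1.0                          # transmit +/-1V
        rx = tx + random.uniform(-noise_amplitude, noise_amplitude)
        if (rx > 0.0) != (bit == 1):                       # receiver thresholds at 0V
            errors += 1
    return errors / n_bits

for noise in (0.2, 0.6, 0.9, 1.1, 1.5, 2.0):
    print(f"noise {noise:.1f}V -> bit error rate {bit_error_rate(noise):.3%}")
```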
 
I currently have an older CPU: an Intel i7 920 @ 2.67GHz, 8GB DDR3, GTX 980 Ti.
Do you think the CPU bottlenecks the GPU? I do not like to OC; I want to upgrade the CPU + mobo + PSU soon.

If Fire Strike is any indication, yes, your i7 920 is a bottleneck. I have an i7 930 at 2.8GHz OC'd to 3.8GHz with 2 EVGA 980 Ti's in SLI, and I get a score of almost 16k. The minimum score for a 5930K @ 4.2GHz and 2 980 Ti's in SLI is 21k, so that is about a 30% increase. Right now with my i7 930 system I can run Star Citizen at 4K Ultra settings, 60Hz @ 50fps, in the hangar and Arena Commander.

Since the 6700 @ 4.7Ghz seems to be equivalent to a 5820 @ 4.3Ghz in gaming, I am thinking about a Z170 system so I can test out m.2 raid. I was also thinking about getting the EVGA Z170 Classified that has the PLX chip so I could add a third 980 Ti Down the road. Three might be worth it in 4K if NVidia improved their drivers.

So this is what I was thinking with the 5 PCIe slots on the EVGA Z170 Classified there is a maximum of 40 lanes CPU and PLX. 3 way SLI with no other PCIe slots used would be 8/16/16 (slots 1,2 and 4). If I were to use Slots 3 and 6 for add in card M.2. Slots 1,2 and 4 would be 8/8/8. The 2 M.2's are PCIe 3.0 4x ea, for a total of 32 lanes used out of 40.
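Just to sanity-check that lane math (this is purely the arithmetic from the paragraph above, not something validated against the board manual):

```python
# Lane-budget arithmetic for the proposed layout: 3-way SLI at 8/8/8 plus two
# x4 M.2 adapter cards, all hanging off the 40 CPU/PLX lanes described above.
available = 40
allocation = {
    "GPU, slot 1": 8,
    "GPU, slot 2": 8,
    "GPU, slot 4": 8,
    "M.2 adapter, slot 3": 4,
    "M.2 adapter, slot 6": 4,
}
used = sum(allocation.values())
print(f"{used} of {available} lanes used, {available - used} to spare")   # 32 of 40
```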

In addition there are 20 PCIe 3.0 lanes to the PCH. Slot 5 on the motherboard is PCIe 3.0 4x to the PCH so is the on board M.2 slot. So theoretically you should be able to do 3 way SLI and do 4 M.2's in Raid 0.

http://www.evga.com/support/manuals/files/151-SS-E179.pdf
 
Zarathustra[H];1041806804 said:
... For a digital signal, if you already have quality that is non-awful, replacing the cable will do absolutely nothing but waste your money. This is not opinion, this is fact. ...

"believe me, this is how it is, because I say so"

This has even been mathematically proven to be wrong, using a DAC. This hypothesis, for all digital interconnects, has been proven wrong. In the 1990s. You only need to prove a hypothesis wrong once. That is all it takes.

I do not argue that you cannot save data safely to a hard drive or a USB stick.

There simply is no proof of anything when it comes to these HDMI cables, but a lot of claims all over the place. I like that, so I will try a cable or two, and see for myself.

If I detect any meaningful difference, and am able to document it, I am going to enjoy every second of it.
 
@Nilsen, did you update to the latest firmware, 1214?
Does it work for you?
I did update, but nothing changed. Still no BIOS / F12 / etc ...
Should I try again with a DVI-HDMI cable? Should I try a different HDMI port on the TV?
LE:
I chatted with a Samsung support person, and he said the problem with no BIOS has nothing to do with the TV; I should look into it on the PC side, etc etc etc
 
"believe me, this is how it is, because I say so"

This has even been mathematically proven to be wrong, using a DAC. This hypothesis, for all digital interconnects, has been proven wrong. In the 1990s. You only need to prove a hypothesis wrong once. That is all it takes.

I do not argue that you cannot save data safely to a hard drive or a USB stick.

There simply is no proof of anything when it comes to these HDMI cables, but a lot of claims all over the place. I like that, so I will try a cable or two, and see for myself.

If I detect any meaningful difference, and am able to document it, I am going to enjoy every second of it.

Fine, here is a test summary that uses expensive equipment (an X-Rite i1Pro spectrophotometer and an X-Rite Hubble colorimeter) to compare a ~$12 generic cable, a ~$20 brand name cable, and a ~$200 ridiculous enthusiast cable.

Granted it is a few years old now, so they were testing 1080p, and testing it as displayed on a Samsung PS50C7000 plasma TV, which was as close as you could get to a reference TV at the time, but it shows exactly what one would expect.

No significant difference in measurements between the cables. Any differences were due to the margin of error of the test, not attributable to cable quality difference.

Detailed test here.
 
Back on topic to our glorious TV :)

I have to say that playing games in Game mode set to Dynamic with AMP turned off looks the best to my eyes. Colors pop, and the picture quality just looks awesome. PC mode, even set to the Entertain profile, looks washed out to me. For surfing though, PC mode in Standard looks the best, as my eyeballs don't get fried by the contrast/glare.
 
Honestly, anything other than 4k looks like crap. This set needs PC/444 or Game/422 with UHD/4k to really appreciate it and get this TV looking its best. I tried scaling down to 2560x1600 to match my older Dell U3011, and it looks blown up and washed out. I would recommend getting another 980 Ti if you can to enjoy this TV.

Why though? Why does 1080p look worse? The TV's are built to run 1080p and 4k as native res so 1080p should not look any worse than a 1080p TV. If I bought a 48" J8500 and run it at 1080p, it should look identical to a 48" 1080p screen? Obviously 4k will look a lot better but I am comparing 1080p on a samsung 4k TV vs 1080p on a 1080p TV.. It should be the same?
 
If Fire Strike is any indication, yes, your i7 920 is a bottleneck. I have an i7 930 at 2.8GHz OC'd to 3.8GHz with 2 EVGA 980 Ti's in SLI, and I get a score of almost 16k. The minimum score for a 5930K @ 4.2GHz and 2 980 Ti's in SLI is 21k, so that is about a 30% increase. Right now with my i7 930 system I can run Star Citizen at 4K Ultra settings, 60Hz @ 50fps, in the hangar and Arena Commander.

3DMark is never an indication of anything. Its numbers are entirely arbitrary. The factors that improve 3DMark scores don't have the same effect in actual games. That's been the case pretty much since the beginning, or at least since they stopped using an actual game engine in the benchmark.

Since the 6700 @ 4.7Ghz seems to be equivalent to a 5820 @ 4.3Ghz in gaming, I am thinking about a Z170 system so I can test out m.2 raid. I was also thinking about getting the EVGA Z170 Classified that has the PLX chip so I could add a third 980 Ti Down the road. Three might be worth it in 4K if NVidia improved their drivers.

Unfortunately, at least for the Titan X, a third card does very little in most games, if anything at all. Scaling past two GPUs blows as usual. If you really want to go the triple SLI route, I'd go with X99. The PLX chip still shoehorns data through the CPU's 16 PCIe lanes. The PLX also adds latency, so keep that in mind. It isn't huge, but neither are the "gains" from 3x cards in SLI.

So this is what I was thinking with the 5 PCIe slots on the EVGA Z170 Classified there is a maximum of 40 lanes CPU and PLX. 3 way SLI with no other PCIe slots used would be 8/16/16 (slots 1,2 and 4). If I were to use Slots 3 and 6 for add in card M.2. Slots 1,2 and 4 would be 8/8/8. The 2 M.2's are PCIe 3.0 4x ea, for a total of 32 lanes used out of 40.

Again, the M.2 drives will have to connect to the PCH. You will have to contend with DMI 3.0's limited bandwidth, which basically tops out at around 32Gbps (roughly 3.9GB/s usable). So you won't get much benefit out of having that second drive, at least not for reads. Write speeds are much slower and as a result still scale normally.
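To put rough numbers on that squeeze (the per-drive read speed is an assumption, roughly what a fast PCIe 3.0 x4 M.2 drive of that era manages; treat it as illustrative):

```python
# Illustrative DMI 3.0 bottleneck check for two fast M.2 drives in RAID 0
# behind the PCH. Drive speed is an assumption, not a benchmark result.
dmi_limit_gbs = 3.9        # DMI 3.0 is roughly PCIe 3.0 x4, ~3.9 GB/s usable
drive_read_gbs = 2.2       # assumed sequential read per drive
drives = 2

ideal = drive_read_gbs * drives
effective = min(ideal, dmi_limit_gbs)
print(f"Ideal RAID 0 read: {ideal:.1f} GB/s, capped by DMI to ~{effective:.1f} GB/s")
```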

In addition there are 20 PCIe 3.0 lanes to the PCH. Slot 5 on the motherboard is PCIe 3.0 4x to the PCH so is the on board M.2 slot. So theoretically you should be able to do 3 way SLI and do 4 M.2's in Raid 0.

http://www.evga.com/support/manuals/files/151-SS-E179.pdf

Any M.2 drives connected via adapter cards (or PCIe based drives) that go through the PLX are going directly to the CPU. You can do that but software RAID will be your only option. The onboard RAID OROM will not be able to detect those. So you will not be able to create a bootable array that way. It's simply how Intel's IRST implementation works.
 
Why though? Why does 1080p look worse? The TV's are built to run 1080p and 4k as native res so 1080p should not look any worse than a 1080p TV. If I bought a 48" J8500 and run it at 1080p, it should look identical to a 48" 1080p screen? Obviously 4k will look a lot better but I am comparing 1080p on a samsung 4k TV vs 1080p on a 1080p TV.. It should be the same?

It doesn't look terrible, it just doesn't look as good as native 4k. I think once you see your computer display at 4k native resolution, you just get spoiled and anything less doesn't compare. Just like when you go with a huge TV. I have an 80" TV in my living room, and now anything less than that looks small, and I had a 63" Samsung Plasma before that and I used to think the 63" was big enough. So, that's my analogy anyway.
 
It doesn't look terrible, it just doesn't look as good as native 4k. I think once you see your computer display at 4k native resolution, you just get spoiled and anything less doesn't compare.

This is true.
 
It doesn't look terrible, it just doesn't look as good as native 4k. I think once you see your computer display at 4k native resolution, you just get spoiled and anything less doesn't compare. Just like when you go with a huge TV. I have an 80" TV in my living room, and now anything less than that looks small, and I had a 63" Samsung Plasma before that and I used to think the 63" was big enough. So, that's my analogy anyway.

Yeh I mean obviously 1080p is not going to look as good as 4k.... But 1080p on one of these samsung 4k TV's should look exactly the same as 1080p on a 1080p TV?
 
Why though? Why does 1080p look worse? The TV's are built to run 1080p and 4k as native res so 1080p should not look any worse than a 1080p TV. If I bought a 48" J8500 and run it at 1080p, it should look identical to a 48" 1080p screen? Obviously 4k will look a lot better but I am comparing 1080p on a samsung 4k TV vs 1080p on a 1080p TV.. It should be the same?

Yeh I mean obviously 1080p is not going to look as good as 4k.... But 1080p on one of these samsung 4k TV's should look exactly the same as 1080p on a 1080p TV?


Here's the million dollar question though: why in the hell would you even care? If you bought this TV, why on earth would you want to display 1080p content on it? I mean, yeah, YouTube videos will look fine at 1080p, but from the Windows desktop environment to gaming, just leave it at native 4k and to hell with 1080p. Looking at your sig, you have a 980 Ti, so all the more reason to keep it at 4k and move along....
 