whether 0 or 16 is used to mean black and whether 235 or 255 is used to mean white
PC levels: black = 0, white = 255
Video levels: black = 16, white = 235
Normally PCs use 0-255, and almost all monitors expect that.
Video equipment varies; all HDTVs can take video-level input, and many can also take PC...
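The two ranges above map onto each other with a simple linear squeeze. A toy sketch of that mapping (my own function names, not any driver's actual code):

```python
# Sketch of the PC-level <-> video-level mapping discussed above.
# Function names are illustrative, not from any real driver API.

def pc_to_video(v: int) -> int:
    """Compress full-range 0-255 into limited-range 16-235."""
    return 16 + round(v * 219 / 255)

def video_to_pc(v: int) -> int:
    """Expand limited-range 16-235 back to full-range 0-255."""
    return round((v - 16) * 255 / 219)

# The squeeze loses levels: 256 input codes land on only 220
# output codes, so some distinct PC values collapse together.
collisions = len({pc_to_video(v) for v in range(256)})
print(collisions)  # 220
```

That lost resolution is exactly why having the desktop forced through 16-235 is annoying: you never get all 256 steps per channel back.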
What does MPC do for the desktop? or games? or photo viewing?
Why should we have to live with 0-255 levels getting compressed to 16-235 and not be able to just use the PC with all 8 bits per channel?
They already have the code in the drivers to send either 0-255 or 16-235, so how hard is it to add a...
They still have no user toggle for 0-255 vs 16-235 levels (as ATI has had for years now).
Some guy on another forum just asked them about it and they told him that they have known about the issue of not having the option for ages.
Then he asks them when they will finally fix it and, get this...
Don't use 26x drivers; they have totally screwed up HDTV/projector support. The last time Nvidia had such support even partially working was back in early Fall last year. I'd try 258 drivers, or 259 at most. (Custom res works with those, although there is still stuff broken with them and you may need to...
That is weird. The 8800 GTX 640MB version (although maybe you have the 512?) didn't even have full h.264 decoding and would often choke up a bit on Blu-ray or Canon DSLR video files.
Not sure about the 4350; maybe it is too low end to have full decode support? That is a lower card than the 8800...
Labeling the input PC can be useful for some things on Samsungs, since it disables chroma sub-sampling so you don't get blurry red and blue text, but it also kills many of the color engine options, so it's not really very good to use the PC label on the input for an HTPC.
HTPC problems:
1. It tends to set the scan rates slightly off desired on many HDTVs for the auto-detected modes
2. While it at least often manages to detect 1920x1080 for the HD auto-detect modes on many models, it fails to detect 1920x1080 for the PC auto-detect modes and gives 1680x1050...
Unbelievable, but they have been broken for full, proper HTPC usage since October (and the 500 series came out later than that...)
Hard to believe, but AMD drivers are actually vastly better than Nvidia's, at least for HTPC, at this point. There have also been lots of weird issues with Fermi + AA in quite...
It would be horrible coding if the CPU were used! Yikes, that would be like an old Mac, IBM PC clone, or Apple II junk; even the Atari 800/C64/Amiga/Atari ST didn't use the CPU much for 2D graphics.
I have read that non-gaming 2D graphics performance definitely varies radically by card, OS, and driver. Sometimes...
Don't forget it has an insanely better color engine than the 2408 and a proper sRGB emulation mode (one that is better than basically any sRGB-only monitor).
Hmm, but my old Mitsubishi DiamondPro 900u had much deeper blacks than any LCD I've ever had; really dark scenes in a dark room looked better, but in a bright room or in brighter scenes the LCDs have a lot more pop.
I had an early revision and returned it for a more expensive Samsung 244T (about $1100 at the time, close to release; I don't think the 2407 ever really sold for $1300, it was a few hundred less than the Samsung) since it had a really nasty problem with the overdrive effect, and I waited a good...
It's actually there more than you think, and hardly only at extreme angles. With the latest HDTV sets, anyone who is anything but utterly dead center AND sitting far back will see noticeable shift, and even at 24" it's there, even if it doesn't jump out at you; but switch to an IPS and all of a sudden there...
EIZO pretty much only pushes the $2000+ 24" models in the US, plus a few of their junkier, not-worth-it low-tier models, and none of their top lower-tier ones. I've never seen B&H get the 2333 in stock. Their 24" wide gamut only appeared very briefly. I wrote them a good while back and they said they'd...
It really is a shame that they got rid of the A-TW polarizer. That said, despite all the white glow, I still think my NEC PA241W has a much better overall image than the Samsung 244T I just sold a couple months ago. Not that the Samsung was bad (although the CR is nothing compared to an SPVA HDTV...
weird.
Maybe the LaCie software has started applying generic wide gamut compensation for their i1D2?
I've recently discovered that the new Spyder 3 software (as of nearly a year ago already) actually appears to apply a generic wide gamut compensation matrix (at least partially so, the...
Indeed, even though DPs are VERY careful when they do pans, you still see a lot of stutter in Hollywood films because 24fps is really badly synced to our eyes/brains, and it ends up not just seeming less than totally smooth but like it is hitching unevenly.
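On an HTPC feeding a 60 Hz display there's a concrete mechanical reason for the uneven hitching on top of this: 3:2 pulldown, which holds alternate 24fps frames for three and two refreshes. A toy sketch of the resulting non-uniform frame hold times (assumes a fixed 60 Hz refresh; the function is mine, not from any player):

```python
# Rough sketch of why 24fps pans hitch on a 60 Hz display:
# 3:2 pulldown holds frames for alternating 3 and 2 refreshes,
# so the on-screen time per film frame is not uniform.

REFRESH = 1 / 60  # seconds per refresh at 60 Hz

def pulldown_hold_times(n_frames: int):
    """On-screen hold time of each 24fps frame under 3:2 pulldown."""
    return [(3 if i % 2 == 0 else 2) * REFRESH for i in range(n_frames)]

holds = pulldown_hold_times(4)
print([round(h * 1000, 1) for h in holds])  # [50.0, 33.3, 50.0, 33.3] ms
```

The 50 ms / 33 ms alternation is exactly the "hitching unevenly" feel; displays that can do a true 24 Hz (or 120 Hz, i.e. 5:5) mode avoid it.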
The exact same thing happened to me the other year (only the MB, not the VC). I'd hear these huge snap-pops and wonder WTH that was, and then one day the computer wouldn't even get to the BIOS screen; I popped it open and there were popped caps all over the EVGA motherboard. It was built during the...
damn dude 181
now that is something
And with the price of copper these days, you may literally have yourself a 580 SLI or 6970 CF there, hah.
melt it!
Yes, that is another one!
I swear that I have gently tossed down cables, straight out, and yet the next morning when I go to use them they have knots in them that even old-school fishermen have never seen before!
I was once at a physics conference and someone joked that this was the greatest...
haha, that is why I sometimes end up with 4 (end up buying another)
(And as for how I know I have more than just the new one? The rest turn up as soon as I throw out the return receipt!)
Some HDTVs, many Samsungs for sure but I believe some models from a few other brands as well, do chroma sub-sampling unless set to PC mode. So red text on black, in particular, will look really fuzzy on them in regular mode. (On Samsung HDTVs they lock lots of calibrations out when you go to PC...
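A toy illustration of why it's specifically red-on-black text that suffers (this is a simplified sketch, not any TV's actual pipeline): pure red carries very little luma, so nearly all of the edge detail sits in the chroma channels, which are exactly what sub-sampling averages away.

```python
# Toy sketch: chroma sub-sampling on a red/black scanline.
# Uses approximate BT.601-style YCbCr coefficients.

line = [(255, 0, 0), (0, 0, 0)] * 4  # alternating red/black pixels

def to_ycbcr(rgb):
    r, g, b = rgb
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.169 * r - 0.331 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.419 * g - 0.081 * b
    return y, cb, cr

ycc = [to_ycbcr(p) for p in line]

# 4:2:2-style horizontal sub-sampling: average chroma per pixel pair.
subsampled = []
for i in range(0, len(ycc), 2):
    (y0, cb0, cr0), (y1, cb1, cr1) = ycc[i], ycc[i + 1]
    cb, cr = (cb0 + cb1) / 2, (cr0 + cr1) / 2
    subsampled += [(y0, cb, cr), (y1, cb, cr)]

# After sub-sampling, neighboring red and black pixels share identical
# chroma, so the crisp edge survives only as a weak luma difference.
print(subsampled[0][2] == subsampled[1][2])  # True
```

White-on-black text survives much better because the edge lives almost entirely in luma, which stays at full resolution.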
Hands down no question Eizo 2333!!!
The only problem is that if you live in the US it is VERYYYYYYYYYYYY hard to find it for sale here. It utterly destroys the Dell for what you want to do (the Dell has wide gamut, which you won't even use for anything you do, and you will just be running it in sRGB...
Wow, that's too bad. I actually just bought NWN for the first time, hah. Yuck, maybe I need to wait a little before swapping in a 570? :(
(and it sounds like AMD is perhaps in an even worse mess with this :( )
Nvidia must not have emulated the fixed pipeline very well; it's hard to see why it...
I have yet to own a video card that could handle every game I have at above 30fps with max effects, and all I have ever owned are upper mid-range to high-end cards, for the most part low high-end; and when you get to demos it gets far worse (and, for some, there are other things too). Seriously...
I wasn't talking nvidia vs. AMD here. The topic was low end vs higher end cards and why would anyone ever buy more than low to low-mid for only 1920x1200 and I was arguing that there can be a reason to go for higher end vs. lower end cards even at 'just' 1920x1200.
Well maybe that 15% puts you...
Yeah, I know they still use the CPU to decode h.264 (which is a beast for the CPU to manage; luckily their new code is super efficient and makes full use of all cores, so it is workable now, but it does seem a bit odd they don't use CUDA for the decoding/encoding), and CUDA only helps certain SFX...
I think what you are describing is simply the difference between calibrating and profiling your monitor with some probe and software, and then using a color-managed app that makes use of your created profile, vs. not doing any of that.
It's gonna be hard to eye-ball in the correct colors.
For color-managed stuff like photo work you want to get a good probe and good calibration/profiling software. That also works to some extent for non-managed stuff that doesn't dump the LUT, and it can figure out how to adjust the LUT to...
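The LUT part of that is just a per-channel remapping curve the calibration software loads into the video card. A minimal sketch of the idea (my own function, not any vendor's API; real tools build the curve from probe measurements rather than a single gamma number):

```python
# Minimal sketch of a 1D calibration LUT: a 256-entry curve per
# channel that remaps output levels toward a target gamma.
# Assumes the display's behavior is summarized by one measured
# gamma value, which is a big simplification of what probes do.

def build_gamma_lut(measured_gamma: float, target_gamma: float = 2.2):
    """256-entry LUT correcting a display's measured gamma to a target."""
    exponent = target_gamma / measured_gamma
    return [round(255 * (v / 255) ** exponent) for v in range(256)]

# Example: a display measuring gamma 2.4 gets lifted toward 2.2.
lut = build_gamma_lut(2.4)
print(lut[0], lut[128], lut[255])  # 0 136 255
```

Non-managed apps benefit from this because the LUT sits in the video card and corrects everything on screen; the ICC profile on top of it only helps apps that actually read it.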
I'm not really sure what you mean by this. There is no way a video card could magically know what a monitor is actually displaying and no way for it to automatically correct for it.
If you mean the color adjusters in the control panels, those are manual since automatic would be impossible and...
1. It doesn't matter if there are few if you are playing one of them; then it's a 100% chance.
2. In past I've hit upon games with a 570-priced card that didn't do all I wanted at 1920x1200 and there are some now and I'm sure there will be more next year
3. I HATE, DESPISE, CAN'T STAND...
It depends; there are some games and programs where the 570 creams the 6970, never mind the 6950, and some where it manages to break about even, along with all the ones where it loses by 5-20%.
Once again, my point is that even with a number of current games, some even a bit old, you can't maintain max settings, a very high level of AA and filtering, and 1920x1200 at always above 30fps; heck, not even with the 570/6970 for a few games. And even more so with tech demos. And with GPGPU apps...