That's why this is special. It does have 120Hz in the EDID. AFAIK it's the first TV to do it (other than that one Seiki?).
240Hz, OTOH, is terrible marketing-speak. I know Sharp has some 1080p 4ms panels, but still...not a common average pixel refresh/persistence (especially with any kind of...
According to the Vizio manual, HDCP 2.2 is on ports 1, 2, and 5.
So...could someone try 3 and 4?
Unless Vizio is completely punking us, that should still be capable of 1080p120 (in theory), right?
(Yeah, I know it's not HVM, yes, I know it will be 4k30 [or potentially 4k60 4:2:0] but...worth a...
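For what it's worth, the 1080p120 question above can be sanity-checked with some quick pixel-clock math. This is a rough sketch, not the set's actual EDID timings: the blanking figures are CVT reduced-blanking-style estimates (my assumption), and 340 MHz is HDMI 1.4's single-link TMDS ceiling.

```python
# Rough check: does 1080p120 fit in HDMI 1.4's ~340 MHz TMDS budget?
# Blanking values are CVT reduced-blanking-style estimates (assumption),
# not timings pulled from the TV's EDID.
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=45):
    """Pixel clock = total pixels per frame x refresh rate."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

clk = pixel_clock_mhz(1920, 1080, 120)
print(f"1080p120 needs ~{clk:.1f} MHz")   # ~280.8 MHz
print("fits under 340 MHz:", clk <= 340)  # True
```

So even without 4:2:0 tricks, 1080p120 with reduced blanking squeezes under the HDMI 1.4 ceiling, which is why the EDID entry is at least plausible on any of the ports.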
I went ahead and ordered a 65'' P. I needed about a fingertip's touch to push me over the edge, and reading (the Verge? I forget) an article they mentioned at the launch party they had the 65'' P going against the 65'' 8500 with identical settings (so I assume they could verify it wasn't...
I would love a 65 ax900...if it ever comes out...and if it ever comes anywhere even remotely approaching these prices. 65'' (hopefully higher-quality, ie closer to 192hz like better LG models than 165hz like their lower-end models, ie 5-5.5ms rather than 6-6.5ms) ips with full-array (local...
Thanks for all the updates, guys.
Any luck with modding it through cru/nvidia's drivers?
Has anyone e-mailed nvidia/vizio about the issue (especially 4:4:4 for nvidia on top of 60hz)?
For the guy that asked why 4:4:4 is important: Video is encoded in 4:2:0, so (if they get that...
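To put some rough numbers on the 4:2:0 vs 4:4:4 point, here's a quick sketch of raw sample counts per 8-bit frame under each subsampling scheme (illustrative arithmetic, nothing vendor-specific):

```python
# Raw bytes per 8-bit frame under different chroma subsampling schemes.
# Value = chroma samples per 2x2 pixel block (luma is always 4 per block).
CHROMA_SAMPLES = {"4:4:4": 8, "4:2:2": 4, "4:2:0": 2}

def frame_bytes(width, height, subsampling):
    samples_per_4px = 4 + CHROMA_SAMPLES[subsampling]
    return width * height * samples_per_4px // 4

full = frame_bytes(1920, 1080, "4:4:4")  # 6220800
sub = frame_bytes(1920, 1080, "4:2:0")   # 3110400
print(sub / full)                         # 0.5 -- half the data
```

Half the raw data is fine for film content, but it smears single-pixel colored text and UI edges on a desktop, which is the usual reason people chase 4:4:4 on a TV used as a monitor.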
Dammit. You're telling me 120fps-encoded dvds still won't work!? All none of them? But I've been waiting like 15 years...:(
(But seriously, that's probably for 120fps mpeg video which you prolly know often isn't actually 720x480 in many cases. I don't know if anything like that commercially...
Huh.
I looked long and hard at those Red cables (for this approximate setup) and decided it was one more thing that could potentially go wrong, or that I would wonder about. They seem very intentionally specced per use, vs. a thicker 'dumb' cable where that shouldn't be a problem. Obviously my...
Very excited for you, brother. I plan on stalking this thread tonight.
I (probably like you) have been waiting for this setup (P + 2.0 video card) since the January announcement. Pretty stoked to see how this all turns out, as while I ordered a 970 (as an initial card to get everything...
I have heard some *interesting* stories as well (probably from a select vocal few) on the net, but I figure those possibilities (like damage/screen problems they could blame on me when not my fault) will fall in the time I can return it to store. I have also heard their direct support has...
I can understand completely. A one year warranty does suck, but coming from a guy that has always used LGs, I have given it a lot of thought, especially since their lower-end series this year is winning the praises of everybody and their cat.
Consumer Reports says their returns are as low as...
Luckily for you, it seems you're just finding out; the rest of us have been waiting since the announcement in January at CES...You only have to wait approx a month (slightly more or less, launch party is supposedly end of month, and 55/70 can be pre-ordered at amazon).
*edit: apparently you...
This to the nth degree. It will be sorely, sorely missed. I appreciate what guys like Ryan at PcPer and Scott at Tech Report do, but Anand really knows how to ask the pertinent questions, and get key information out of people without going into bias. He also is very in tune with the zeitgeist...
+1. Coming from a guy that went from crt to crt to tn, to old va, to currently LG ips, my next panel will be an AHVA (auo) unless they are substantially worse for RT/lag than their other (non ips-like) tech, which seems doubtful...auo has gotten pretty good about making the right compromises...
It's slightly farther down the foodchain...but is still the type of card I would recommend. It should be pretty solid for 60fps at 1680x1050 on high settings in BF4. It will also keep it above 30fps min at 1920x1080 even on very high settings...pretty much what you want.
The 270x is a decent...
Pretty much...but consider possibilities. For instance:
The reality is 760 was purposely clocked to compete with a 1536sp part at 1ghz*. You can expect at least 1ghz/5400...likely slightly higher in one regard or another to match/slightly beat where most 760 cards perform...up to...
Bandwidth? It isn't. You could think of the 7950 as essentially having similar performance to a GPU with perfect efficiency (ie about 29-30 CUs) at default clock because of the extra bandwidth it has.
7970 could run at 1250mhz with 6ghz/384-bit and still be efficient...this is why a 256-bit...
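The bandwidth numbers behind that comparison are just bus width times effective data rate; here's the arithmetic, plugging in the configurations from the post (illustrative only):

```python
# Peak memory bandwidth = (bus width / 8 bits per byte) * effective Gbps.
def bandwidth_gbs(bus_bits, effective_gbps):
    return bus_bits / 8 * effective_gbps

print(bandwidth_gbs(384, 6.0))  # 288.0 GB/s -- 7970 at 6GHz on a 384-bit bus
print(bandwidth_gbs(256, 6.0))  # 192.0 GB/s -- same memory clock, 256-bit bus
```

That 50% gap is why a 256-bit card at the same memory clock has to be far more bandwidth-efficient (or much lower-clocked on the core) to stay balanced.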
A quote that I think people need to remember from ryan at anandtech:
AMD will always opt for a chip design with just enough pad space for a larger bus if feasible. rv670, pitcairn, redwood...even r600 (but that is a different can of worms). The justifications are many...but look at...
I just have this sinking feeling, for Brent, that AMD (or certain anal readership) will expect a suite retest with the tweaked drivers while expounding they have mitigated the expected '~2-3%' performance loss to ~1.99-2.99%.
Situations like this have got to suck for people like them.
Back when I was younger, we used to call that Munning (because while others have pandered to youth before, that woman really pioneered what you're talking about.)
Perhaps not correct to the Urban Dictionary definition, but pandering to the geek/gamer demo is what it is. I give Conan credit...
'Pull the plug' (or shutting off psu) has worked on more than one occasion for me when concerning bios issues and certainly some others I'm forgetting. Granted that is a 1% thing and probably not a problem 'dad' would have though. Yeah, it surely doesn't fix hardware failure, (most) software...
I know it's probably not the popular opinion, but I appreciate ads like this for trying to stir up some passion. I miss things like the (xfx?) site swatting Dawn, or my favorite, the G4saurus. Sure, part of it's intention is to create pure irrational fanboy devotees toward which billion dollar...
While what you both say is true, the tests show the coolers performing where one would think they should based on other reviews.
Glad to see more love for Noctua. I really dig the C-14 in my kinda-tiny HTPC case. Wish they were cheaper though!
Hey guys, I have a few questions about a gaming HTPC I'm building to serve double-duty.
Case: Looking for something with USB 3.0. Would like full-height slots with room to fit a 10'' card (I figure that's where the 'sweet-spot' gpus will stay). PSU is 140mm (standard). Only need one optical...
Yeah, it certainly does. I think my personal favorite was when Garcia was psyched that a CCTV security office had a single computer with a top-of-the-line nVIDIA consumer graphics card in it. Picky, I know, and there are more extreme examples (like random remote computer control), but some...
Did jump on CL, and unfortunately there wasn't ANYTHING remotely anywhere close to me. (Damn you North Dakota and your sparse population!) Believe me, I'm with ya going that route if I could find one of those classic Dells. I've come to the conclusion buying something new in this case...
I understand this is often a [H]ardcore forum and this isn't a hardcore question, but if I could get a few suggestions from the crowd I would sincerely appreciate it. I'm sure some of you have been in a similar situation or have radar toward bang-for-buckage in different scenarios, and I...
Kinda what I was thinking. If I was a company on the path to taking over the world, large mock-ups of pastries (associated with product launches or not) would be on my list of things to acquire post-haste.
In all seriousness though, I feel sorry for the poor son of a bitch/gallery of cube...
While I agree with you, right now they're competing with the 5850 and 5830. Wouldn't be surprised if both the 768MB model competes with the 5830 at stock and with overclocking considered (on both) the 1GB model achieves close to parity with the 5850. The latter is conceivably not a bad deal at...
This is an interesting conversation, and one with a small splash of irony.
We all agree the 465, at its current price, power consumption, etc is terrible. Fermi must die, and it will, and the godsend will be GF104...What Fermi should've been all along, but we have to wait for because of...
http://en.wikipedia.org/wiki/Torrenza
http://it.anandtech.com/show/2018/2
(HTX slots...Yes, this article is OLD. I had to dig for it.)
I knew they had planned some kind of "APU" socket before APU meant CPU+GPU on one processor die. I thought it was a socket, perhaps it was just the...
Yep! You posted while I was writing my big-ass thesis. Didn't think anyone else would add that, and it's very important to note. Being as close to 18nm as possible is important for longevity, and whichever fab creates the first well-yielding, working sub-18nm process is going to have a...
That's MOSTLY right from what we've heard.
I would just add that TSMC is supposed to be 30% greater clocking at the same power/die size on 28nm general process vs. 40nmG, not 15%. If you parse GF and TSMC's low-power 28nm claims, you get (an unmade claim of) 24% greater clocking for GF's...
This sounds like the start of DVT. That would mean initial samples have already been out there, and now development (validation testing) has started.
I know nothing about time frames for AMD's internal development, but Xbit has mentioned on occasion that their products go from internal...
I'm wondering why the G104 (mentioned in the VERY SAME ARTICLE) isn't getting more attention. According to the article, it's launching in Q210, and will be replacing the former high-end generation. Cutting Fermi in half, giving it a 256-bit bus and 4/5ghz GDDR5, this looks like a feasible...
nVIDIA:
Most importantly:
As for AMD, you can very well ascertain they will not block the technology. Open standards, remember? There is NO precedent for it, and plenty to the contrary. AMD is fine with SLI on their own chipsets, and it even worked until nVIDIA locked it down.
As for...
Congrats on your current celebrity. :D
Any possibility of popping the top and taking a shot of the die for us? A centimeter reading of the length/width of the die (not glue) would be great too. Hopefully that's not asking for way too much. :p
It'd be appreciated. Thanks in advance.
Yeah, Kyle and Charlie are kind of renowned for their knowledge of the inner workings of nvidia/aibs. ;)
Well, at least pounding the pavement to try to figure out what the hell is going on.
While we didn't know jack about G80 until a couple weeks before launch, we knew WHEN it was launching...
Fixed, and agreed.
Not saying they'd ever go that route, or even if it's feasible. It is certainly an interesting concept though. If we do see it in the near future, I imagine it will be part of their new architecture beginning with what's coming a year from now (R1000/28nm?); the product...