ManCannon, I'd highly recommend you upgrade to 4GB of RAM (Vista 64-bit).
No stuttering, no hitching, regardless of the CPU present. Crysis is particularly greedy with RAM; it takes 4+GB to get a totally smooth experience.
Also consider the 4850X2; it probably lands at or above a GTX280 on most sites (besides [H], but you seriously aren't going to base your purchase off a single site, right?)
With two chips you can also run 24x edge-detect AA without much pain.
The 4850X2 sells at $400 (real MSRP), which is lower than what...
Lose complete control? No. It's still partially owned, and will serve AMD/ATI's interests first, others second. Same as IBM East Fishkill. Power processors first, nVidia's chips second.
Are we going to judge cards/cars by performance, or pricing and agendas? ;)
If [H] had not reviewed the 9600GT- which was a card that reeked of redundancy all round compared to the 8800GT- I wouldn't have minded.
Unknown-One, same for the Radeon 2k+s. They had problems with fog initially in HL2: EP1.
Looking at nVidia's 8xQ/16xQ modes- whoa, not good. They're supposed to be the better modes (and are much slower), yet they show artefacting in high-brightness areas/edges in Oblivion.
Sorry, the vendor you mentioned wasn't responsible for 70% of Vista's instability problems. They weren't responsible for the stuttering in Unreal Engine 2 that took over a year to solve, either.
Performance aggregators show that the 9600GT lags the 8800GT by 20% on average at 1280*1024/1600*1200, with or without 4xAA.
If you can say that it's "with" the 8800GT, then the 3850- which lags the 9600GT by ~20% (a little more with AA, less without) on the same site- is "with" the 9600GT. And it...
The 4850 is in some cases 2x faster than the 9600GT. The GTX280 is generally ~3x faster in cases that aren't CPU-limited.
Just to correct that. The 9600GT is way overrated.
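To show why "only ~20% behind" doesn't transfer down the chain, here's the arithmetic. The ~20% figures come from the aggregate numbers above; the composition is my own back-of-the-envelope, not a benchmark:

```python
# Rough relative-performance arithmetic, baseline 8800GT = 1.0.
# The ~20% gaps are the aggregate figures quoted above; treating them
# as multiplicative ratios is my simplification.

gt8800 = 1.00
gt9600 = gt8800 * 0.80  # 9600GT lags the 8800GT by ~20%
hd3850 = gt9600 * 0.80  # 3850 lags the 9600GT by ~20%

print(f"9600GT ~= {gt9600:.2f}x an 8800GT")
print(f"3850   ~= {hd3850:.2f}x an 8800GT")
# If a ~20% gap counts as "with" the 8800GT, the 3850 ends up at only
# ~0.64x of it -- the gaps compound, they don't stay "close".
```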
The reason it really got a bad rep was that it was 2x+ slower than the second-tier high-end card. That gap had been widening steadily ever since the 6600GT's superb price/performance, and at the 8600GTS level people finally noticed.
Price/performance is supposed to go up with...
I forgot Motorstorm. I wanted that too (badly, once the proper version came). But now that Pure is coming, it's sidelined. Also, Ratchet was quite nice when I played it at a friend's.
The others- I'd say they just aren't bringing anything new in. You can find substitutes for those easily...
I dunno, why don't we try FUN games instead of just SHINY games?
Simply put- Uno, Fable and Too Human are more fun to play than anything Sony or Nintendo (ugh) has to offer. I'd give Pixeljunk Monsters and Eden some kudos, but really, nothing has even moderately interested me besides MGS4, due to the nice...
LOL @ btdvox.
No, a die shrink will not help games much.
---
So where are the Nehalem optimists now?
No i7 for me until DDR3 is cheap here. And only if I can get motherboards without that wretched NF200 chip that incurs additional latency and heat, and of course the $50+ markup...
You said you're gonna upgrade this winter...
4850 FTW. When winter comes, 90% chance that you'd be grabbing a CF-supporting motherboard, then add another 4850.
They ARE competitive right now- price/perf is quite good- and the nVidia-slanted crowd is still happily buying GTX 200s; that's an indicator.
As for the margins, they have cash. My prediction is that GT200b gets them a 270 > 4870 and a 290 > 4850X2, albeit at higher load consumption (their coolers...
Yeah, but at what cost?
Two 280s under load already take 340-380W, depending on how you measure. As the 9800GTX+ clearly showed, the 55nm process offers nVidia little improvement (I'd say it shaves 10% off max power).
Even then, you'd have a card that's 310-350W at current...
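Back-of-the-envelope for that estimate. The 340-380W load figures and the ~10% shrink savings are from the post above; the multiplication is mine:

```python
# Hypothetical 55nm GT200b load power, assuming the shrink only shaves
# ~10% off max power (as the 9800GTX+ suggested for nVidia's 55nm node).

load_low, load_high = 340, 380  # two GTX 280s under load, watts
savings = 0.10                  # assumed 55nm improvement

print(f"{load_low * (1 - savings):.0f}-{load_high * (1 - savings):.0f} W")
# 340 * 0.9 = 306 W and 380 * 0.9 = 342 W -- same ballpark as the
# 310-350W figure above.
```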
Unlike a certain other graphics chip vendor, current Catalyst versions on Vista still officially support the Radeon 9800s. That speaks volumes about which company just wants your money and which one actually gives competent support.
If you're worrying about driver profiles, I don't think they will neglect it...
Well, from what I've gathered from other websites, nVidia originally banked on the G92b to get them through winter 2008.
After that, I strongly suspect they'll just make a die-shrunk/improved version taking on the GT300/GTX350 (--) monikers.
http://www.computerbase.de/artikel/hardware/grafikkarten/2008/test_ati_radeon_hd_4870_x2/14/#abschnitt_stalker
STALKER's AA isn't really AA- it's just edge blurring. Once you use it you'll know; it's downright unimpressive.
And I'd daresay that every video card from the Radeon 3870/8800GTX...
It does not automatically make the chip less hot.
Heat/power use is the main problem here. Moreover, using GDDR5 means no more power-saving tricks (the GT200 shuts off/drastically underclocks its GDDR3 RAM, which you can't do smoothly with GDDR5 because of retraining), so idle power would get back to something...
It won't help.
The GT200 needs a revamp on how the whole architecture deals with logic units ("stream processors") in order to get to the next generation.
Plus, the memory controller on the GT200 has not been modified to support longer burst rates yet, which means it won't support even...
They aren't adding anything. It's a linear shrink, not even a partial redesign like RV670.
This goes back to G70->G71 and G92->G92b (G80/G92 involved quite some improvements, but obviously even with faster R&D that was no rush job).
Somebody bought into shilling a little too early. :rolleyes:
p/s: Far Cry 2 doesn't even use the same frigging engine. Shows how much you (don't) know.
Without the L3 and on 45nm, the quad-core "budget" CPU would be quite viable- if not in performance, then in clocks.
It would be smaller than the sum of a Core 2 Quad which requires further packaging (MCM) costs.
IP35 PRO. No talk, really. Get different graphics cards if you have to.
Unless you're rich enough to be buying two GTX280s, there's no real reason to invest in the 680i, which by all means is a POS compared to the P35.
The current 360s draw less power than the PS3, I think. Either way, the 60GB will prolly be the end of RROD et al. And if the rumoured price cuts are true, you can't go wrong with $300...
Back on topic then. Nintendo has succeeded in making a console that I spit on.
This is a triumph
PCPartner's website states that they were one of the OEMs making reference boards for the oh-so-beloved 8800GTX/S (G80).
Since the G92, nVidia went back to Flextronics, I guess. PCPartner produced the 2900XT, the 3800s and the 4800s (along with Asus).
Zotac is just a reseller. It belongs to...