Some work. Some don't. Since that particular model of laptop is available with both Core Duos and Core 2 Duos, I think it's pretty safe to say that it supports both.
Next-gen cards will (probably) be available in November. If you're going to upgrade now, at least make it a more modest one (like to a non-overpriced GTX). Having your brand new, overpriced, high-end card eclipsed like that in 3 months would really blow.
Well, yes and no. I upgraded to 4GB for the hell of it (how can you not? the prices are sooo low) but my PC can actually see and use 3327MB, which surprised the hell out of me. This is with a single 640MB 8800GTS, which I thought would put me at around 2.9GB.
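The "missing" RAM comes down to 32-bit address-space arithmetic: everything that has to be memory-mapped below 4GB (the GPU's memory aperture, other device MMIO) gets carved out of the same 4096MB of addresses your RAM wants to live in. A rough sketch in Python, where the non-GPU reserved size is just an illustrative guess chosen to match the number I saw, not a real chipset value:

```python
# Why a 32-bit address space "eats" installed RAM.
ADDRESS_SPACE_MB = 4096          # 2^32 bytes = 4096 MB addressable
INSTALLED_RAM_MB = 4096          # 4 GB of physical RAM

# Regions that must be mapped below 4 GB (sizes are illustrative guesses):
reserved = {
    "gpu_aperture": 640,         # e.g. a 640MB 8800GTS mapped 1:1
    "other_mmio": 129,           # chipset, PCI, BIOS, etc. (made-up figure)
}

visible_mb = ADDRESS_SPACE_MB - sum(reserved.values())
print(visible_mb)  # 3327 MB with these guesses
```

The naive estimate of 4096 - 640 ≈ 3456MB (~2.9-3.4GB) ignores the other reserved regions, which is why the real figure lands lower.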
Ok sweet. So then an evga 680i board would be suitable for the OP.
I still think waiting to see what X38 is all about would be smart though.
Thanks for the info.
There's currently no guarantee that the 680i boards will support Penryn/Wolfdale. They do support the 1333 FSB, but the boards don't necessarily meet the 45nm chips' voltage needs.
I hear nVidia's working away on a replacement for 680i due out this fall, and that X38 may support SLI, so I'd...
Yep. Croatia's economy is growing much faster than that of the US. That doesn't mean it's bigger, just that it's growing faster. The smaller you are, the easier it is to grow. It's a measurement that, more or less, means absolutely nothing.
Way to ignore the OP's entire point about not wanting to upgrade his mobo.
Back to the topic, I second the x1950xt. It may not be DX10, but there's so much doubt as to whether current DX10 cards will even run DX10 games well that support may be irrelevant anyways.
EDIT: I recall reading...
My OCZ GameXStream fits just fine, though I think pretty much any PSU is a tight fit in the P180.
Mine is around 3 weeks old. Bought on special from NCIX in Canadia. I believe they were ordered directly from EVGA at the time, so they'd have been the newest product off the line.
I think any major online retailer will have A3's by now. It's the brick and mortar stores I'd be wary of...
I'm not sure what's safe, but mine reaches the mid 90's when overclocked to its max. I also cannot adjust my fan speeds due to Vista :(, but I know I need my case fans turned to max in order to get it that far. It's also damned cool here, by comparison. I wonder if I can push it a tiny bit...
My A3 GTS runs at up to 700/1080, but with any more it starts artifacting pretty bad. Paired with 2GB of 4-4-4-12 PC2-6400 and an E6600 at 400x8 I got 11321 in 3dMark06.
I'm pleased.
Vista is designed to not fuck up in the event of a hardware change, unlike XP. In theory, you could even replace the motherboard and have it still work. I have yet to see if this is true, however.
1) Not really. There are several solutions that people commonly recommend; the problem is that none of them work. Refer to this for more info: http://www.dansdata.com/askdan00015.htm
2) Yes, no problem there.
Yep. Thanks, that's exactly how I feel. Just because whatshisface up there hates the interface, doesn't mean everybody will. I've noticed that some people just like to hate things.
If Vista isn't for you that's fine, but that hardly means that it's for nobody.
Vista works wonderfully for me. Given a choice, I would NOT switch back to XP. For a lot of people, once they get used to Vista there's no going back. There are too many little improvements that make a difference, despite those who would tell you that it's XP with a facelift (which is bullshit...
If you won't be overclocking, there's no reason not to go AMD. I prefer AMD's motherboards and platform anyways. I just bought an Intel rig and found that every chipset and board left something to be desired feature-wise, while with AMD boards I find that a nice high-end board has everything...
That is not the case at all. The XTX was intended to be a version of the same card (at 80nm) with 1GB of GDDR4 memory and slightly higher core clocks (~800MHz). It was cancelled/delayed indefinitely because little to no gain was seen from the extra memory capacity/bandwidth and it was too...
lol
If I was that CSR you'd be the kind of customer I'd tell all my friends about, in the sense of "Man, wait until I tell you guys about this *bleep* I had on the phone today".
Another Jerker here! I didn't realize there were so many out there. My GF has one too, as does my former flatmate.
The chair does look awesome though. I have a really crappy chair that's probably destroying my lower back. I need a new one.
I probably will, though it doesn't mean I'm devoted. I simply want an Intel Mobo and the ability to run dual graphics cards, which kind of rules out nVidia (unless I feel like using hacked drivers, which I don't).
So ATI should abandon years of R&D and rely entirely on their last-generation cards for 6 months while they bring forward their next next-gen design simply because (hypothetically speaking of course, we only have interweb rumors to base this on) it doesn't outperform their competitor? A...
rofl no
Just get the non-ultra. It's NOT worth the extra cost. The ultra is a sham, designed merely to make money off people who have way too much of it. Even one ultra is too many. It's not worth spending ~60% more for maybe 10-15% more performance.
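To put that in performance-per-dollar terms (the base price here is a hypothetical placeholder; the percentages are the rough ones quoted above):

```python
# Back-of-envelope value check; prices and gains are illustrative.
base_price = 550.0               # hypothetical non-ultra price
ultra_price = base_price * 1.60  # ~60% more expensive
perf_gain = 1.125                # ~10-15% faster, taking the midpoint

value_base = 1.0 / base_price
value_ultra = perf_gain / ultra_price
print(value_ultra / value_base)  # ~0.70, i.e. ~30% less performance per dollar
```

However you shuffle the exact prices, a 60% premium for a ~12% gain means the ultra delivers markedly worse value.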
I wouldn't say soft just yet. They're...
We will actually probably see both. High-end boards will be DDR2 only or DDR3 only, but there will certainly be some low-end boards that support both memory types (but only one at a time, of course).
True, however the R600 will run with 2x6-pin. The 8-pin is optional, as has already been said.
Therefore, the 2x75W from the connectors plus 75W from the PCIe slot still totals 225W max. As has been said, if the 8-pin is connected, which is optional, it will provide additional power which...
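The power budget above, sketched out (75W per 6-pin and per slot are the figures already cited in this thread; 150W for the 8-pin is the PCIe spec value):

```python
# PCI Express power budget arithmetic.
SLOT_W = 75        # PCIe x16 slot delivers up to 75 W
SIX_PIN_W = 75     # each 6-pin PEG connector: 75 W
EIGHT_PIN_W = 150  # 8-pin PEG connector: 150 W (PCIe spec)

two_six_pin = SLOT_W + 2 * SIX_PIN_W
print(two_six_pin)      # 225 W max running on 2x 6-pin

with_eight_pin = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(with_eight_pin)   # 300 W max if the optional 8-pin is connected
```

So the card stays within 225W on two 6-pin connectors, and the optional 8-pin raises the ceiling rather than being required for normal operation.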