NVIDIA GeForce GTX 295 Preview @ [H]

I was expecting it to do much better honestly. nVidia seems to be really lacking in the memory bandwidth department.
 
That's what case fans are for? This card was not meant for some itty-bitty SFF boxputer. The cooling from the pics looks much more serious and effective overall. Who knows if it will be on the final version, though?

I vote this pic for the best badassery involving an OEM video card heatsink.
http://enthusiast.hardocp.com/image.html?image=MTIyOTU5MDAzNXNjRFphWklyZjJfMV8xMl9sLmpwZw==

The thing is, inside a case the GPU fan might suck the hot air back into the cooler, so the cooling won't be as effective as when running the card outside a case. This could affect the temperature readings in the review.
 
I'm looking forward to power/heat.

I want to know if I can upgrade some of my folding hardware to this. :D

 
I'm just hoping the GTX 260/216 will be less pricey after the 295 release and/or the Christmas hype.
 
I completely agree, and I wish success to both ATI and Nvidia. Their never-ending war will only result in better deals for us. I hope they keep on fighting...we don't want another Monopolysoft, do we?

It's a bad time to be a fanboy of either graphic card brand IMO. Best to always look at the major offerings from both when shopping, and then decide.

Or to put it another way: why be a fanboy of just one brand? Do you think they care whether you scream like a schoolgirl or just shrug when you spend your money, as long as it's spent on their product?
 
Too little, too late.
The only good news is that the GTX 295 may put some pressure on the 4870X2 price.

I will wait for the GTX 380 / 5870 before upgrading.
 
I expected better performance. I will keep my HD 4870 and wait for the next generation from AMD and NV.

 
Yawn! Kinda makes you wonder why people buy nVidia when ATI rules the roost. GPUs are king in the gaming dept. Why buy nV crap?
 
I'm looking forward to power/heat.

From early reports it sounds like it's quiet, and its power draw shouldn't be any higher than the 4870X2's, based on comparisons between the 65nm 260 and the 1GB 4870. Temps are still a mystery, though.

Not sure if you guys have looked at any other previews, but the card does very well against the 4870X2 unless it's choking on a mouthful of 2560x1600 8xAA. It's actually significantly faster in a lot of scenarios, so unless you're running a 30-incher it's worth a look.
 
Yawn! Kinda makes you wonder why people buy nVidia when ATI rules the roost. GPUs are king in the gaming dept. Why buy nV crap?

Please refrain from making uninsightful, pretentious comments whenever possible.
 
Thanks for the review Kyle and crew.

Serious question: at a price tag of $499, is Nvidia going to be selling these cards at a loss just so consumers don't go out and buy the alternative, which is ATI's 4870X2?
 
Yet ANOTHER super sucky stock cooling solution.

And dual PCBs.... ughhhh.

The stock cooler seems to be pretty damn good from what I read, so I don't understand where your negative opinion is coming from. I definitely agree with you about the dual boards though.
 
So...about how much can we expect GTX 260 to be after this launches?

I wouldn't expect any change in price for the GTX260. This card will probably end up costing slightly more than two 260s combined anyway, so there's no need for nVidia to change any prices.
 
Expect a price drop on the HD 4870 X2 when the GTX 295 is out; it is currently overpriced compared to two HD 4870 1GB cards.
 
Why are there no power consumption tests?

I think it would be worthwhile to compare the power used to an SLI GTX 260 setup, and of course, the 4870 X2. A system load test (and maybe system idle) is all I'm asking for.

There's more to this picture than pure performance - that's why I bought a 7900GT when there were faster cards (x1900) available that ate tons of power. When you have performance results that are this close, it would be a good idea to test the new product on power.
 
Nvidia didn't want people previewing power consumption. Presumably because the voltages/BIOS might not be finalized yet.
 
I thought this card wasn't supposed to be made...it couldn't and wouldn't be done, the [H]ardforum experts said. :confused:
 
Are you limited to driving two displays with the card? Or can you disable SLI to power three with both DVI ports and the HDMI port?

-Thanks
 
I thought this card wasn't supposed to be made...it couldn't and wouldn't be done, the [H]ardforum experts said. :confused:

If I recall correctly, all such comments were strictly related to the fact that the original die size of these GeForce products was too large to make this possible...

Well, that is true; this one has a smaller die, which makes it possible, and there will be a refresh of the rest of the series based on this die shrink...

So yeah, this is exactly what the [H]ardforum experts were alluding to.

I hope I have cleared up your confusion. :D
 
I thought this card wasn't supposed to be made...it couldn't and wouldn't be done, the [H]ardforum experts said. :confused:
Couldn't be done with the 65nm GT200. This card uses the newer version of the GPU manufactured on a 55nm process, so power consumption and heat output are lower.
Are you limited to driving two displays with the card? Or can you disable SLI to power three with both DVI ports and the HDMI port?
Can't remember if I read it in this article or in another, but you can indeed disable SLI to use all three ports.
 
Can't remember if I read it in this article or in another, but you can indeed disable SLI to use all three ports.

Well, how about two of these in quad SLI? Can that drive 4 monitors then, or still just two (and from which ports)?

Quad SLI with 4 monitors?
SLI enabled on one card, disabled on the other for 5 monitors?
 
Well, how about two of these in quad SLI? Can that drive 4 monitors then, or still just two (and from which ports)?

Quad SLI with 4 monitors?
SLI enabled on one card, disabled on the other for 5 monitors?

I'm curious about this also; I thought some newer drivers fixed "SLI"-type setups to run multiple monitors. :D
 
Premature drivers, most likely. It should improve greatly once Nvidia optimizes the card's performance. We won't have to worry about AMD too much longer anyway. Their stock just keeps on falling ;). They have enough cash for probably two more revisions of their CPU/GPU lines, tops, before they are tapped out.

And that would make you happy? You should pray that it doesn't happen if you still wanna buy gfx cards at reasonable prices LOL
 
Well, how about two of these in quad SLI? Can that drive 4 monitors then, or still just two (and from which ports)?

Quad SLI with 4 monitors?
SLI enabled on one card, disabled on the other for 5 monitors?

As far as I know, enabling any sort of SLI will limit you to two monitors. I'm not that up to date with nVidia's current drivers though, so I don't know if it's possible to run one card in SLI mode and the other in single-card mode (although to be honest, I wouldn't expect that to be the case).
 
I generally disapprove of slapping two GPUs into one card, whether it's ATi or nVidia. I especially disapprove of starving it for RAM, because you're supposed to be using these cards for the highest resolutions, where it needs the RAM the most! Giving the 9800GX2 only 512MB was a tragedy, and only 896MB on the GTX295 just says to me that nVidia didn't learn from that mistake. ATi learned that lesson, why didn't nVidia?
 
Giving the 9800GX2 only 512MB was a tragedy, and only 896MB on the GTX295 just says to me that nVidia didn't learn from that mistake. ATi learned that lesson, why didn't nVidia?

I agree with you completely, but I think the reason for this may be to restrict the card from reaching its true potential so customers will consider purchasing two single-GPU 55nm-refresh GeForce 260 / 280 cards and SLI them.

Nvidia may also want its next few refreshes, or the next-gen card release, to show much greater numbers than the current generation of Nvidia cards. By restricting the design of the 295 and making it better than a single 280 and close enough to the 4870X2, they basically meet the market need the 4870X2 is currently filling without risking the card putting up numbers and performance close to the next generation of cards.

ATI, on the other hand, hadn't held the performance crown in a while and was in desperate need of getting back in the eye of the enthusiast market. They didn't pull any punches, and hopefully their next generation of single-GPU cards can easily outdo the 4870X2, but at this time ATI / AMD is looking to make as much money as it can and regain market share, since the company is obviously hurting...

If ATI's drivers hadn't been so horrible when I tested one of their current-generation cards in my PC with my 1080P TV, I would have gone for the 4870X2 over my 280 GTX SSC... That said, it seems only some 1080P TVs are affected, and I don't see it as a reflection of the rest of their drivers' performance. I hope ATI continues to give Nvidia a run for its money, since Nvidia obviously knows how to sit on a product and milk its nipples dry... AKA the 8 series and its many still-ongoing refreshes.
 
As far as I know, enabling any sort of SLI will limit you to two monitors. I'm not that up to date with nVidia's current drivers though, so I don't know if it's possible to run one card in SLI mode and the other in single-card mode (although to be honest, I wouldn't expect that to be the case).

Currently nVidia will allow SLI and more than two monitors (and spanning across them, I think), but you need a third nVidia graphics card with a GPU from a different generation - and the game itself needs to support this, which only a few games do.

source: http://www.slizone.com/page/slizone_faq.html#s7

However (and I asked about this above), it didn't seem clear to me from the article whether you have to disable the "internal" SLI on the 295 to use all three graphics ports, or if that only happens when you enable quad SLI. Probably the former, but one can hope.
 
Seeing as I have $275 of "step up" credit from eVGA for my GTX 260 192, it would be a waste NOT to get the GTX 295 over a GTX 280. If I don't do something by Feb 5th, I waste it. The only downside is I've got to upgrade the PSU.
 
However (and I asked about this above), it didn't seem clear to me from the article whether you have to disable the "internal" SLI on the 295 to use all three graphics ports, or if that only happens when you enable quad SLI. Probably the former, but one can hope.

I'm pretty sure you do have to disable the internal SLI to use all of the ports. Otherwise, you're limited to two of them.
 
I've been reading people saying that the fact the card vents back into the case via the top vents is crap. I say look at your case, and tell me if it's vented there. On my Lian Li A05, which is the entry model of Lian Li, there are vent holes above the PCI slots. Thus, my 9600GT, and this GTX295, can both vent heat OUTSIDE my case, and not into it. Just something to point out.
 
I'm pretty sure you do have to disable the internal SLI to use all of the ports. Otherwise, you're limited to two of them.

I figured that was more likely. NV's triple-monitor support seems like it's mainly a way to push extra products on consumers (and PhysX on developers). I can't see them giving that up just because you buy a $500 product.
 
I don't understand why everyone is getting their panties all balled up over this issue of the card venting some of the heat back into the case. The cooling unit of the 8800GTS/GTX/Ultra (G80) had three vent slots at the opposite end from the fan. Does no one remember?
 
With DX10.1 and 7.1 audio over HDMI, I think the 4870X2 wins.
The GTX295 is a good card though, and if you can't afford the 4870X2 I would jump at it.
I expect price drops.
 
I've always been a big fan of dual-GPU cards, since I had a good experience with the Asus 7950 GX2, which I still use in my old rigs. Depending on how the final card performs, I think it's a good upgrade for me from 8800 Ultra SLI.
 
Seems like Nvidia has more or less matched the 4870X2, though it seems a little late to the party... Ugh, these cards are tempting; with only 512MB of memory on my 4870 CrossFire setup, I'm feeling the pinch in games like GTA4 with only medium textures.
 
I've been reading people saying that the fact the card vents back into the case via the top vents is crap. I say look at your case, and tell me if it's vented there. On my Lian Li A05, which is the entry model of Lian Li, there are vent holes above the PCI slots. Thus, my 9600GT, and this GTX295, can both vent heat OUTSIDE my case, and not into it. Just something to point out.

These holes are not enough to prevent the hot air from being sucked back in by the GPU fan. Unless you have a fan pushing the hot air out through those holes, I'm sure you will see a temperature difference between running the card inside a case and running it on an open bench. This card consumes more power than the 8800GTX.
 
Thanks for the review Kyle and crew.

Serious question: at a price tag of $499, is Nvidia going to be selling these cards at a loss just so consumers don't go out and buy the alternative, which is ATI's 4870X2?

I seriously doubt they'll be sold at a loss. They are just going to lose quite a bit of their margin. Usually high-end cards have about $100-$200 of margin on them, if not more. In this case I'd be surprised if they made much, if anything, over $100 on each one sold at full price.
 