Geforce GTX 200 launch on June 18th

3xGTX 280 lol

you just need a 2000W PSU for it, and the heat will make you cry
3x 250W of heat is 2,557.5 BTU/h, which will heat a small house
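The math above is just a constant conversion factor; a quick sketch in Python, using the same 3.41 BTU/h-per-watt figure as the post:

```python
# Rough heat-output check for three rumored-250W cards.
# Sustained electrical draw ends up as heat; 1 W is about 3.41 BTU/h
# (the same factor used in the post above).
WATTS_PER_CARD = 250  # rumored GTX 280 TDP
CARDS = 3

def watts_to_btu_per_hour(watts, factor=3.41):
    """Convert a sustained power draw in watts to BTU/h of heat."""
    return watts * factor

total_watts = WATTS_PER_CARD * CARDS
print(watts_to_btu_per_hour(total_watts))  # 750 W -> 2557.5 BTU/h
```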
 
Well, I'm starting to get serious about this idea of building a heatpipe from the computer to my sauna, which is actually just behind the wall from this computer :D ..3xGTX280 + 2xQX9770 (all five overclocked) on a Skulltrail platform + 3DMark06 looping -> you'll get those sweet ~200 F temperatures like you get in a Finnish sauna :p

That 250W will be the official TDP, the same number as the 9800 GX2 has [Nvidia's unofficial figure for that card is 197W]

edit:
or perhaps not.. a typical sauna has a 5-10kW heating unit :p
 
These new cards will put out more heat than the GX2, right? Quad SLI with my GX2s was hotter than my small space heater. I'll be interested to see whether it makes more sense to sell my 9800GTXs or step them up, considering I got them for 260 each.
 
Well, the 9800 GX2's official TDP value is 250W, and the GTX 280's will be 250W. This is basically a theoretical peak consumption number..so..it's hard to tell.
 
250 watts is official? Where are you getting the "official" part?
Official for the 9800 GX2..don't know about the GTX 280, but every rumour points to the 250W number.

Nvidia, on the other hand, has said unofficially that the 9800 GX2's more realistic TDP is 197W
 
Well, not that 10 watts makes a difference, but the rumor has been 240 watts for the GT200 since the beginning.
 
Saw some mention a few links back that the GTX 280 won't support Tri-SLI. Undoubtedly that would be due to the TDP, and it also suggests that the rumors of a 55nm variant being released shortly after hold quite a bit of water.
 
Then why does GTX 280 have TWO SLI connectors if it wouldn't support TRI-SLI?
 

The 8800GTX has always had two SLI connectors, and yet only relatively recently has Tri-SLI been an actual possibility. nVidia could be initially limiting Tri-SLI in an attempt to better optimize drivers for it, to try to ensure that GTX 280s are available to more people at launch, or whatever, I don't know. As the 8800GTX proved, just because the hardware configuration supports Tri-SLI does not mean nVidia has to enable it immediately (perhaps they want to wait for the 55nm variant so they can say to those who complain about the heat/power draw of GTX 280 Tri-SLI, "See? We told you! If you wanted Tri-SLI you should have gotten the 55nm ones!").
 
Maybe because there's not a PSU out that will handle three GTX 280s till they go to the 55nm variant.
 
So nvidia's new high end will require 2x 8-pin connectors? I've been out of the loop a bit.
 
No, but you can see it in that GTX 280 picture:
[image: 200805201808234174cz4.png]
 
http://www.dailytech.com/article.aspx?newsid=11842


NVIDIA's upcoming Summer 2008 lineup gets some additional details

Later this week NVIDIA will enact an embargo on its upcoming next-generation graphics core, codenamed D10U. The launch schedule of this processor, verified by DailyTech, claims the GPU will make its debut as two separate graphics cards, currently named GeForce GTX 280 (D10U-30) and GeForce GTX 260 (D10U-20).

The GTX 280 enables all features of the D10U processor; the GTX 260 version will consist of a significantly cut-down version of the same GPU. The D10U-30 will enable all 240 unified stream processors designed into the processor. NVIDIA documentation claims these second-generation unified shaders perform 50 percent better than the shaders found on the D9 cards released earlier this year.

The main difference between the two new GeForce GTX variants revolves around the number of shaders and memory bus width. Most importantly, NVIDIA disables 48 stream processors on the GTX 260. GTX 280 ships with a 512-bit memory bus capable of supporting 1GB GDDR3 memory; the GTX 260 alternative has a 448-bit bus with support for 896MB.

GTX 280 and 260 add virtually all of the same features as GeForce 9800GTX: PCIe 2.0, OpenGL 2.1, SLI and PureVideoHD. The company also claims both cards will support two SLI-risers for 3-way SLI support.

Unlike the upcoming AMD Radeon 4000 series, currently scheduled to launch in early June, the D10U chipset does not support DirectX extensions above 10.0. Next-generation Radeon will also ship with GDDR5 while the June GeForce refresh is confined to just GDDR3.

The GTX series is NVIDIA's first attempt at incorporating the PhysX stream engine into the D10U shader engine. The press decks currently do not shed a lot of information on this support, and the company will likely not elaborate on this before the June 18 launch date.

After NVIDIA purchased PhysX developer AGEIA in February 2008, the company announced all CUDA-enabled processors would support PhysX. NVIDIA has not delivered on this promise yet, though D10U will support CUDA, and therefore PhysX, right out of the gate.

NVIDIA's documentation does not list an estimated street price for the new cards.
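As a sanity check on the bus-width numbers in the article: peak memory bandwidth is just the bus width in bytes times the effective data rate. A rough Python sketch; the 2214 MT/s GDDR3 data rate is an assumed, illustrative figure, not a confirmed spec for either card:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# Bus widths come from the article; the 2214 MT/s GDDR3 data rate is an
# assumption for illustration, not a confirmed GTX 280/260 spec.
def peak_bandwidth_gbs(bus_width_bits, data_rate_mts):
    """Theoretical peak bandwidth in GB/s for a given bus and data rate."""
    return bus_width_bits / 8 * data_rate_mts * 1e6 / 1e9

print(peak_bandwidth_gbs(512, 2214))  # GTX 280: ~141.7 GB/s
print(peak_bandwidth_gbs(448, 2214))  # GTX 260: ~124.0 GB/s
```

Whatever the final memory clock turns out to be, the 448-bit card gets exactly 7/8 of the 512-bit card's bandwidth, matching the 896MB vs 1GB split.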
 
Thanks, I've seen this picture, and it looks to be one 6-pin and one 8-pin plug. I was trying to find something definitive, if that's possible at this time. I need to decide how many modular cables I need for my Galaxy 1000.
 
Someone with technical skills:

The GTX 280 is said to be a 240SP (240 FP, 240 MADD) card. Can someone tell me..erm..something about those last two? For example, how does this compare to G80 and G92, and what effects would those differences have?
 
Imo, given that three 9800GTXs were getting close, and that the GTX 280, from a purely theoretical standpoint:
-Has twice the bus width of the 9800GTX (that card's biggest limiting factor)
-Has almost twice as many stream processors (on a single card, so not subject to the efficiency drop SLI generally entails), each of which is supposedly 50% more effective

you may not need SLI to get Crysis going at 30fps and max settings at 1600x1200 or 1920x1200.

Btw, before someone makes the inevitable stupid comment, "Why so much focus on just one game!?"

It's a PC game; it's far from "just one game"
 
Does anyone know the length of the new 280? My 8800 Ultra just squeezed into my case.. if it's any longer, it will be Dremel time.
 
ATI not being too strong lately means a 9800 GTX that performs worse with AA and anisotropic filtering than the 8800 GTX, lol. Really, I think ATI will be the preferred choice now that nVidia has gotten lazy with development. Also, I'm so disappointed that nVidia seems to have started cheating with their drivers again with the 175.xx series: blurry textures compared to the 169.xx. ATI doesn't tend to do this, not as heavily anyway.
 

Quack3.exe anyone? :p

You'll see soon that nVidia was anything but lazy ;).
 
That's one huge chip...:eek:

I can't wait to see benchmarks of these cards. Those rumors about 50% faster stream processors surely gave some grounds for the "GTX 280 twice as fast as the 9800 GX2" rumors. If true, this will be the biggest jump in performance ever...
 

I should run some benches on my 6800GT so I can compare the jump in performance when I get this behemoth :D
 

You can't see the actual chip in the pictures; it's hidden under the..what is the English word..heatsink?
 
Is it just my eyes and lack of sleep, or does that GTX 280 board look like it would be slightly shorter than the 8800 GTX board?
 