Picked up G80's

Status
Not open for further replies.
I need this card RIGHT NOW

Anyone know where I can order it online that ships to Europe?
 
Could you measure how long the card is? I'd like to know, as I'm certain I have exactly enough room but need confirmation!
 
Arcygenical said:
Aluminum and expensive...

I'll just take my MCW, get a new hold down kit, and fabricate a cooler for the extra RAMDAC :)
I called Swiftech and they're making a new mounting hardware kit for it :) Shouldn't cost more than a few $.
 
decapitator said:
I called Fry's and asked if they would exchange the cards on release day because the cards were recalled. They said they knew nothing of a recall, but that if I wasn't happy with them they would take them back, and I had 30 days to do so. Their receipt says 14 days, so it's one lie after another.

They said they would not exchange them without printed proof of a recall, and they couldn't care less whether I return them or not. This was a supervisor. :D

Thanks again man. I'm probably going to go the Newegg route just to be safe. Buy a card from Newegg tomorrow morning, and when I get it on Thursday, return the one that I bought from Fry's. That way I know I'll have a good card, since Newegg is probably really tight with EVGA. Plus the card I get from Newegg should have the real EVGA black design, not the silly reference sticker. :p
 
jcll2002 said:
I called Swiftech and they're making a new mounting hardware kit for it :) Shouldn't cost more than a few $.

Yeah man, I emailed them a while ago :) I can't wait!
 
So AF does not control the angle adjustments, just the distance adjustment, keeping textures from being blurry at farther distances?

It deals with both. Quality in terms of distance is referred to as the LOD bias, as you can see in the slider in the AF tester. The lower this bias, the further the "good" texture extends before being switched to a lower-resolution one. The angles you see some video cards produce are mostly due to the fact that they've decided "it's okay" to skimp on some angles.

This is why most of us are giddy over G80's AF: it seemingly renders AF properly at all angles, making for a much, much more natural scene.
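The distance half of that can be sketched numerically. This is a toy model, not any driver's actual code (the function and its numbers are made up for illustration), but the core relation — mip level roughly equals log2 of the texel footprint plus the LOD bias — matches how mipmap selection is commonly described:

```python
import math

# Toy sketch of mipmap level-of-detail selection: the hardware estimates
# how many texels fall under one pixel (the "footprint") and picks the
# mip level from its log2, shifted by the user-adjustable LOD bias.
# A negative bias keeps sharper (lower) mip levels in use farther out.
def mip_level(texel_footprint: float, lod_bias: float = 0.0, max_level: int = 10) -> int:
    lod = math.log2(max(texel_footprint, 1.0)) + lod_bias
    return int(min(max(lod, 0.0), max_level))

# A pixel covering ~8 texels normally samples mip 3; a -1 bias keeps it at mip 2.
print(mip_level(8.0))        # 3
print(mip_level(8.0, -1.0))  # 2
```

So lowering the bias in the AF tester effectively pushes each mip transition further into the distance, which is why the "good" texture extends further.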
 
phez said:
It deals with both. Quality in terms of distance is referred to as the LOD bias, as you can see in the slider in the AF tester. The lower this bias, the further the "good" texture extends before being switched to a lower-resolution one. The angles you see some video cards produce are mostly due to the fact that they've decided "it's okay" to skimp on some angles.

This is why most of us are giddy over G80's AF: it seemingly renders AF properly at all angles, making for a much, much more natural scene.

It's very true. The cobblestone patterns and such in Oblivion are simply amazing looking at HQ 16xAF. No blurriness at all.
 
Is the launch time going to correspond to the announcement at the Glan? Or is it a midnight thing... :confused:
 
Nuvian said:
There's a 30-page Chinese review up; the Babel Fish translation isn't that bad either.

http://www.pconline.com.cn/diy/graphics/reviews/0611/898744.html

G80 has more than enough horsepower to play games smoothly at XHD resolutions and high image-quality settings, and 16xAA costs only about as much system overhead as 4xAA. SLI brings close to a doubling of performance. The combination of 128-bit-precision HDR and 16x anti-aliasing delivers top-tier image quality. With its built-in PureVideo HD, the G80 ensures smooth, flawless HD and SD playback at low CPU utilization. Efficient power consumption and management give the GeForce 8800 better performance per watt and performance per square millimeter.

Clearly this guy is shellshocked by the 8800GTX.
 
I might pass on G80, as I currently own a 7950 GX2. I would go G80 SLI, but nVidia won't allow SLI to run on Intel motherboard chipsets. I don't like nForce motherboards; they just aren't very solid boards (and I've owned quite a few of them). We all know it's just the driver disabling SLI on other chipsets, and nVidia should just allow SLI on non-nForce chipsets. I would get two 8800 GTXs if they supported SLI on the 975X, but since they don't, I will probably end up going R600 Crossfire.

nVidia should rethink their stance on SLI. Although it might help them sell more motherboards, it may also make them sell fewer video cards. And in my case, it's helping them sell neither.
 
Arcygenical said:
Aluminum and expensive...I'll just take my MCW, get a new hold down kit, and fabricate a cooler for the extra RAMDAC :)
No kidding. Those German designers have yet to grasp the concept of galvanic corrosion and how radically it can destroy their well-engineered components, not to mention how poorly aluminum compares to copper in terms of thermal conductivity. If they want to use a silver-colored metal so badly, why not use silver?

Christ, I want an 8800 now. You're all horrible people.

Does anyone have any idea how reputable this Chinese site is? I call these results into great question:
xl600thumbup6.jpg
 
Sabrewulf165 said:
Wow the GTS looks weak. Really weak. Hard to believe that's all $500 buys you :mad:


Right, even though it's $450, and it kicked the shit out of the X1950 in those tests, except for a couple of games like Serious Sam and NFS. If the performance isn't what you want, then get a GTX :p
 
The benchmarks from that Chinese site seem more than a little off to me. I don't doubt that the performance of the G80s beats out the competition, but the numbers on all the GPUs they tested seem pretty inflated.
 
saan44 said:
The benchmarks from that Chinese site seem more than a little off to me. I don't doubt that the performance of the G80s beats out the competition, but the numbers on all the GPUs they tested seem pretty inflated.

They look like maximum fps numbers to me.
 
ITSTHINKING said:
Right, even though it's $450, and it kicked the shit out of the X1950 in those tests, except for a couple of games like Serious Sam and NFS. If the performance isn't what you want, then get a GTX :p

I would hardly call it a shit-kicking. Look at the AA/AF numbers. The GTS is not that much faster. And don't worry, I AM going to get the GTX, I was just commenting ;)
 
mike said:
They look like maximum fps numbers to me.
That's a fair assumption, though it may be more likely that they tested indoors, as the rest of the numbers they have seem about right.

In any case, the GTX is 2.2 times faster than the previous generation in their odd Oblivion benchmark, which is massively impressive.
 
Sabrewulf165 said:
I would hardly call it a shit-kicking. Look at the AA/AF numbers. The GTS is not that much faster. And don't worry, I AM going to get the GTX, I was just commenting ;)
good good. I was worried there for a second!
 
People need to remember that this is only one review, with one rig. Even if that's how other sites end up benching the GTS, we must remember that the drivers are quite immature; things can only go up over the next 2-3 months. I still believe the GTS is a good investment, although the GTX is a better one, so I'll be going for the GTX.
 
sumofatguy said:
good good. I was worried there for a second!

Yeah, I could tell. I even thought about not upgrading at all, but I couldn't live with myself knowing I made you worry *tear*
 
Oooh hot obliv #'s! That's why I'm getting this card...

So does anyone know when NDA's up? Midnight? 9AM PST on Wednesday the 8th?
 
Jodiuh said:
Oooh hot obliv #'s! That's why I'm getting this card...

So does anyone know when NDA's up? Midnight? 9AM PST on Wednesday the 8th?

I hope it'll be on Newegg at midnight. I've got my credit card out, ready to order before they sell out :D :p
 
LOCO LAPTOP said:
I hope it'll be on Newegg at midnight. I've got my credit card out, ready to order before they sell out :D :p

Ditto that, but I don't think it's midnight, I think it's sometime tomorrow morning, which means getting up early or staying up reeeeally late :(
 
phide said:
That's a fair assumption, though them testing indoors may be a better assumption, as the rest of the numbers they have seem about right.

In any case, the GTX is 2.2 times faster than the previous generation in their odd Oblivion benchmark, which is massively impressive.

I am cautiously optimistic that the G80 will be a massive upgrade in Oblivion. The numbers in their test do seem high, but considering what a shader-heavy game Oblivion is and the fact that G80 has 3x more shaders than the X1950, it seems reasonable to expect it could provide double or better performance.

What I really want to know is whether or not nvidia will update their drivers to allow HDR+AA in Oblivion. That, in my opinion, is the critical factor.
 
Sabrewulf165 said:
I am cautiously optimistic that the G80 will be a massive upgrade in Oblivion. The numbers in their test do seem high, but considering what a shader-heavy game Oblivion is and the fact that G80 has 3x more shaders than the X1950, it seems reasonable to expect it could provide double or better performance.

What I really want to know is whether or not nvidia will update their drivers to allow HDR+AA in Oblivion. That, in my opinion, is the critical factor.

Thankfully, tomorrow's (presumably) the day that all of this gets figured out. I can't wait.
 
Sabrewulf165 said:
the fact that G80 has 3x more shaders than the X1950, it seems reasonable to expect it could provide double or better performance.

Well it's not that simple since the shaders are definitely not comparable and the clockspeeds are different.

G80 ~ 518 Gflops
R580 ~ 426 Gflops
G70 ~ 305 Gflops

So it's not close to 3x based on theoretical flops, more like 1.2x. But there should be efficiency gains as well.
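For what it's worth, the G80 figure above can be reproduced from its commonly reported specs. A quick sanity check — treating each of the 128 scalar stream processors at 1.35 GHz as issuing a MAD plus a MUL (3 flops per clock), which is an assumption based on widely cited specs, not official documentation:

```python
# Peak theoretical throughput sanity check.
g80_gflops = 128 * 1.35 * 3   # 128 SPs x 1.35 GHz x 3 flops/clock ~ 518.4
r580_gflops = 426             # figure quoted above

print(f"G80 peak: {g80_gflops:.1f} Gflops")
print(f"G80 vs R580: {g80_gflops / r580_gflops:.2f}x")
```

That ratio comes out to about 1.2x, which is where the "more like 1.2x" above comes from.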
 
trinibwoy said:
Well it's not that simple since the shaders are definitely not comparable and the clockspeeds are different.

G80 ~ 518 Gflops
R580 ~ 426 Gflops
G70 ~ 305 Gflops

So it's not close to 3x based on theoretical flops, more like 1.2x. But there should be efficiency gains as well.

I guess we'll find out tomorrow. I'm most likely ordering one regardless. If the benches are awful I can always cancel the order.
 
Sabrewulf165 said:
What I really want to know is whether or not nvidia will update their drivers to allow HDR+AA in Oblivion. That, in my opinion, is the critical factor.
I honestly hope that they don't. I don't want developers to maintain this "eh, let the driver teams fix it" mentality. For future titles, I don't want to bypass a "HDR and anti-aliasing cannot be enabled at the same time" pop-up box; I just want it to work.
 
Sabrewulf165 said:
I am cautiously optimistic that the G80 will be a massive upgrade in Oblivion. The numbers in their test do seem high, but considering what a shader-heavy game Oblivion is and the fact that G80 has 3x more shaders than the X1950, it seems reasonable to expect it could provide double or better performance.

What I really want to know is whether or not nvidia will update their drivers to allow HDR+AA in Oblivion. That, in my opinion, is the critical factor.

Haven't you seen my tests?

One 8800GTX, Oblivion in a foliage area with all sliders and options maxed and HDR, at 1600x1200, is about 30fps minimum, 43 average.
 
i've been wanting to play through oblivion again as a mage, but i think i'm going to have to wait until i get my G80

THIS IS THE FINAL COUNTDOWN!!!! dbooom dbooom shaboomboom


this is the final.... COUNTDOWN!!
 
Jasonx82 said:
AHHH 9 more hrs! :eek: :D

I'm pretty sure it's happening in the morning (6am-9am), not at midnight. I contacted Newegg and asked them, "If you were to launch a new product tomorrow, what time would it show up?" The guy told me, "Anywhere from 6am-5pm, depending on stock and inventory."
 
mike686 said:
Haven't you seen my tests?

One 8800GTX, Oblivion in a foliage area with all sliders and options maxed and HDR, at 1600x1200, is about 30fps minimum, 43 average.

Yeah, I saw that, but I'm not playing at 16x12. Also, I don't mean "double performance" in all situations necessarily, just the ones that are choking me up right now: the difference between 15 and 30 fps goes from unplayable to quite playable. Also, it sounded like you had AA enabled in the driver when you ran those tests... I know it's not applied in the game, but I'm not sure whether that might still slow things down.

If you don't mind, maybe you could run this Oblivion test for me (you don't have to of course but it would help give me an idea :) ):

DISABLE HDR in the Oblivion launcher (Choose Bloom instead)
SET a resolution of 1280x1024 fullscreen in the launcher
ENABLE 4xAA in the driver as well as Transparency Antialiasing (supersampling)
ENABLE 16xAF High Quality in the driver
MAX all in-game sliders and turn all settings on (except HDR, obviously)

Go to the Cheydinhall West Gate, go out of it, and walk directly from there towards Lake Arrius. This path should take you through a lot of dense forest and vegetation, which with 4xAA and TSSAA enabled will give you the "worst case scenario" from a graphics standpoint. Bench this walk with FRAPS and post your min/avg. You don't need to walk all the way to the lake of course, a short jaunt should be good enough, just make sure the "area" is done loading.

If you don't have time to do this or don't feel like it I totally understand, but it would give me a good idea of what the limits of the card in Oblivion are. You can try these same settings and walkthrough at other resolutions too if you want but I'm mainly interested in 12x10 since that roughly equates to the native res on my 940BW (1440x900).
 
phide said:
I honestly hope that they don't. I don't want developers to maintain this "eh, let the driver teams fix it" mentality. For future titles, I don't want to bypass a "HDR and anti-aliasing cannot be enabled at the same time" pop-up box; I just want it to work.

HDR+AA is the way it is in Oblivion because the developers chose not to support that capability, not because something needed to be fixed.
 
Sabrewulf165 said:
Yeah, I saw that, but I'm not playing at 16x12. Also, I don't mean "double performance" in all situations necessarily, just the ones that are choking me up right now: the difference between 15 and 30 fps goes from unplayable to quite playable. Also, it sounded like you had AA enabled in the driver when you ran those tests... I know it's not applied in the game, but I'm not sure whether that might still slow things down.

If you don't mind, maybe you could run this Oblivion test for me (you don't have to of course but it would help give me an idea :) ):

DISABLE HDR in the Oblivion launcher (Choose Bloom instead)
SET a resolution of 1280x1024 fullscreen in the launcher
ENABLE 4xAA in the driver as well as Transparency Antialiasing (supersampling)
ENABLE 16xAF High Quality in the driver
MAX all in-game sliders and turn all settings on (except HDR, obviously)

Go to the Cheydinhall West Gate, go out of it, and walk directly from there towards Lake Arrius. This path should take you through a lot of dense forest and vegetation, which with 4xAA and TSSAA enabled will give you the "worst case scenario" from a graphics standpoint. Bench this walk with FRAPS and post your min/avg. You don't need to walk all the way to the lake of course, a short jaunt should be good enough, just make sure the "area" is done loading.

If you don't have time to do this or don't feel like it I totally understand, but it would give me a good idea of what the limits of the card in Oblivion are. You can try these same settings and walkthrough at other resolutions too if you want but I'm mainly interested in 12x10 since that roughly equates to the native res on my 940BW (1440x900).

Sure, give me some time, and I'll be back with the results.
 