G71 - NVidia's Secret Weapon vs. R520

FanATIc said:
It's going to completely obsolete the rest of Nvidia's 7 series at those speeds and make them all look like bargain cards.

The only things that will make a video card obsolete are the apps. And since current games certainly aren't limited by the 7800GTX now, I don't think releasing a more powerful card will change things.
 
DarkBahamut said:
Not really. I paid 'top dollar' for my GTX, but honestly, I couldn't care less if the R520 or G71 is faster. I mean, what happens if I had waited and got a '7800 Ultra' or R520? No doubt 1 month after they launch there will be G80 or R600 stuff coming out or whatever, and we'll have the whole thing again.

Simple fact is, if you keep waiting for the next fastest thing in the video card market, then you will never upgrade, because there is always something faster coming, no matter what card you buy.
Well said. If the R520 or G71 is THAT much faster than my 7800GTX, I'll get one of them. It's not hard to sell your hardware to pay for new hardware. Yeah, you lose some money, but if you're not willing to lose money then your "hobby" shouldn't be computers.
 
Historically speaking, a G71 would be a step down from a G70, if nVidia follows their past GPU numbering scheme.
 
I just hope these new cards will bring the 6800 Ultra to around $299. There need to be some price changes before QuakeCon. Brotha man needs an upgrade.
 
pettybone said:
I just hope these new cards will bring the 6800 Ultra to around $299. There need to be some price changes before QuakeCon. Brotha man needs an upgrade.
Nah, Nvidia will probably just release a lower-end 7xxx series card to replace the 6800U. In the past, ATI and Nvidia have always seemed to do this (instead of reselling the GeForce2 MX, Nvidia just released the GeForce4 MX; and ATI did the same thing when they released the 9600XT instead of cutting the price of the 9500 Pro).

I don't know why they do this, however. Wouldn't it be cheaper to just slash the prices on an older card rather than put money into the development of a brand new one that does exactly the same thing?
 
GVX said:
I don't know why they do this, however. Wouldn't it be cheaper to just slash the prices on an older card rather than put money into the development of a brand new one that does exactly the same thing?

No, because a G70 with 6800U performance would run cooler, quieter and be cheaper to produce.
 
DarkBahamut said:
Not really. I paid 'top dollar' for my GTX, but honestly, I couldn't care less if the R520 or G71 is faster. I mean, what happens if I had waited and got a '7800 Ultra' or R520? No doubt 1 month after they launch there will be G80 or R600 stuff coming out or whatever, and we'll have the whole thing again.

Simple fact is, if you keep waiting for the next fastest thing in the video card market, then you will never upgrade, because there is always something faster coming, no matter what card you buy.

so true.
 
eno-on said:
Historically speaking, a G71 would be a step down from a G70, if nVidia follows their past GPU numbering scheme.


That's the thing, they didn't think anyone would catch on
:D
 
Electric Boogaloo said:
The only things that will make a video card obsolete are the apps. And since current games certainly aren't limited by the 7800GTX now, I don't think releasing a more powerful card will change things.


Considering the GTX still struggles quite a bit with soft shadows and HDR, I still haven't seen a card that can fully utilize every aspect that today's games can enable.

See what happens when you run HDR at 1280x1024 with full AF, or soft shadows at the same resolution with full AA and AF. You won't be able to play at an acceptable average. A 370MHz increase on the core might enable that, but I have to ask what the hell they're doing labeling a core nowhere near capable of that their best, or at least charging highest-end prices for it, if they can release an Ultra that much better. Even for a winter refresh, a jump that big would be a first. I have to honestly call BS on this because of the speed increase; I mean, the G70 is barely capable of that on phase-change cooling. Regardless of whether it should or shouldn't be done, I'd simply feel quite betrayed as a customer if this were real. It's not a dismal increase we're talking about here, it's about an 85% improvement over the current GTX reference speeds. That's nothing short of ridiculous. It's not a generational leap, where I wouldn't care; it's the same series, only one card would be way, way above the rest. Absolutely no sense whatsoever in this. The INQ has really gotten bad.

edit - Everything points to the G71 being a midrange core, not a high-end one.
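
For reference, here is the arithmetic behind that "about 85%" figure; the 370MHz bump is the rumor's number, and the 430MHz is the stock 7800 GTX reference core clock:

\[
430\,\text{MHz} + 370\,\text{MHz} = 800\,\text{MHz}, \qquad \frac{370}{430} \approx 0.86 \;\Rightarrow\; \text{an} \approx 86\%\ \text{core clock increase.}
\]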
 
FanATIc said:
Considering the GTX still struggles quite a bit with soft shadows and HDR, I still haven't seen a card that can fully utilize every aspect that today's games can enable.

See what happens when you run HDR at 1280x1024 with full AF, or soft shadows at the same resolution with full AA and AF. You won't be able to play at an acceptable average. A 370MHz increase on the core might enable that, but I have to ask what the hell they're doing labeling a core nowhere near capable of that their best, or at least charging highest-end prices for it, if they can release an Ultra that much better. Even for a winter refresh, a jump that big would be a first. I have to honestly call BS on this because of the speed increase; I mean, the G70 is barely capable of that on phase-change cooling. Regardless of whether it should or shouldn't be done, I'd simply feel quite betrayed as a customer if this were real. It's not a dismal increase we're talking about here, it's about an 85% improvement over the current GTX reference speeds. That's nothing short of ridiculous.

Blanket statement: Complaining about being "betrayed" because somebody makes something better than what you bought...yeah that's about the lamest thing I can think of.

I mean hell...why don't you go round up all the people that paid $400 for a 9800XT the week before the x800xt came out and drive to ATi's headquarters in Canada with some picket signs...

LAME
 
Well, whatever the case, I hope they offer it in 512MB versions. (I know it's not needed now, but it will be sometime, and if I buy a $700+ card it had better last me.)
 
FanATIc said:
Considering the GTX still struggles quite a bit with soft shadows and HDR, I still haven't seen a card that can fully utilize every aspect that today's games can enable.
<snip>
Wow that has to be one of the most brain dead posts I've ever rea... Oh it's by fanATIc.. nevermind.

People... It's from the INQ, of course it's BS! Geez, I can't believe this has to be said again. :rolleyes:
 
CrimandEvil said:
Wow that has to be one of the most brain dead posts I've ever rea... Oh it's by fanATIc.. nevermind.

People... It's from the INQ, of course it's BS! Geez, I can't believe this has to be said again. :rolleyes:
Normally I would agree with you on the Inq; however, looking back at my personal notes... they have been right about everything that has been said about the Athlon X2 and the 7800GTX. The past 4 or 5 months have been really good for them.

So I don't totally disregard what they say. But it's not always my first and/or only source of info.
 
FanATIc said:
Considering the GTX still struggles quite a bit with soft shadows and HDR, I still haven't seen a card that can fully utilize every aspect that today's games can enable.

See what happens when you run HDR at 1280x1024 with full AF, or soft shadows at the same resolution with full AA and AF. You won't be able to play at an acceptable average. A 370MHz increase on the core might enable that, but I have to ask what the hell they're doing labeling a core nowhere near capable of that their best, or at least charging highest-end prices for it, if they can release an Ultra that much better. Even for a winter refresh, a jump that big would be a first. I have to honestly call BS on this because of the speed increase; I mean, the G70 is barely capable of that on phase-change cooling. Regardless of whether it should or shouldn't be done, I'd simply feel quite betrayed as a customer if this were real. It's not a dismal increase we're talking about here, it's about an 85% improvement over the current GTX reference speeds. That's nothing short of ridiculous. It's not a generational leap, where I wouldn't care; it's the same series, only one card would be way, way above the rest. Absolutely no sense whatsoever in this. The INQ has really gotten bad.

If you're talking about Fear, with all settings on high at a lower res of 1024x768 with 4x AA and 8x AF, it runs around 45 fps with HDR and soft shadows. Also, there really is no use for AA in Fear.

And Chronicles of Riddick was fine up to 1024x768, also with 4x AA and 8x AF, with soft shadows and HDR.
 
You guys care WAY TOO MUCH!! Just go play yer games and when they aren't rendered fast enough for you, upgrade. Till then, why argue and trade insults with retards over the internet??!?!
 
razor1 said:
If you're talking about Fear, with all settings on high at a lower res of 1024x768 with 4x AA and 8x AF, it runs around 45 fps with HDR and soft shadows. Also, there really is no use for AA in Fear.

And Chronicles of Riddick was fine up to 1024x768, also with 4x AA and 8x AF, with soft shadows and HDR.


Those resolutions aren't acceptable to many people. I didn't know the beta had an HDR option, and I definitely see a need for AA when the resolution is that low, unfortunately. That's not really the point at all though. It's just too much of a jump to be even close to believable.
 
AA seems to cause input lag, or is it just my imagination? When I do 8xS I get input lag in UT2K4 but fine fps.
 
FanATIc said:
Those resolutions aren't acceptable to many people. I didn't know the beta had an HDR option, and I definitely see a need for AA when the resolution is that low, unfortunately. That's not really the point at all though. It's just too much of a jump to be even close to believable.


Why? Is it really the architecture's fault that it gets hurt by these types of shaders and shadows? No matter what you do, these games won't run at the highest resolutions with all the bells and whistles unless you quadruple the processing power.

HDR takes 2 passes, soft shadows take 2 passes, plus an expensive shader to create the penumbra. It doesn't matter what you do, you have to recalculate the data at least 4 times, and with more expensive shaders involved this gets costly. Take Doom 3 and render it in 4 windows on one monitor from one graphics card. That's what you have with Fear and Chronicles of Riddick.
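
As a rough sketch of that argument, taking the post's pass counts at face value (they are estimates, not measured numbers): if one plain pass over the scene costs C, then

\[
\text{cost per frame} \;\approx\; \underbrace{2C}_{\text{HDR}} + \underbrace{2C}_{\text{soft shadows}} + \text{penumbra shader} \;\gtrsim\; 4C,
\]

which is the "Doom 3 in four windows on one card" comparison: roughly four frames' worth of rendering work for every frame you actually see.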
 
VR-Zone said:
At first, we thought that G71 was for the mid-range segment to replace the 6600 series, but it turned out otherwise. The secret weapon for NVIDIA against ATi's R520 is actually G71, the successor to the current G70. VR-Zone has learned that G71 is based on a finer 90nm process technology at TSMC, like the R520. NVIDIA is likely to make the switch when the process technology is more mature, towards the end of the year. No doubt, the core frequency can be clocked higher on a finer process. However, we can't rule out the possibility of a 7800 Ultra being released on the 90nm G71 core when the R520 launches, since volume production is not expected for such a high-end GPU. Of course, it could be based on the 110nm process initially, as the INQ has suggested. 32 pipes or not, it is still unknown.

http://216.239.59.104/search?q=cache:06nQzDA-PGEJ:www.vr-zone.com.sg/?i=2445+G71+NVidia&hl=en

Didn't VR-Zone also say that the 7800 GT was due to be released last week with still unconfirmed specs?


VR-Zone said:
NVIDIA is preparing their GeForce 7800 GT for launch within this week, and retail stores worldwide will carry stock as soon as it is launched. The GeForce 7800 GT possesses 24 pipes and 8 vertex shaders just like the 7800GTX, except that the core and memory clock speeds are lowered. The reference 7800GT card is supposedly clocked at 335MHz core and has a 1.1GHz memory clock using 2.0ns GDDR3 memories. No doubt, some card manufacturers will be fitting 1.6ns memories on this card to make their 7800GT cards more attractive. The best thing is that this card will be priced at US$499, which is $100 lower than the current 7800GTX card.

http://www.vr-zone.com/?i=2403&s=1

The rumor mill is working overtime lately, and more respected sites are buying into the rumors and posting them as if they were the truth. :rolleyes:
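
For what it's worth, the memory figures in that quote can be sanity-checked from the chip ratings (the clocks themselves are VR-Zone's rumored numbers, not confirmed specs). GDDR3 rated at t ns has a maximum clock of 1000/t MHz, or 2000/t MHz effective after DDR:

\[
2.0\,\text{ns} \Rightarrow \tfrac{1000}{2.0} = 500\,\text{MHz} = 1.0\,\text{GHz effective}, \qquad 1.6\,\text{ns} \Rightarrow \tfrac{1000}{1.6} = 625\,\text{MHz} = 1.25\,\text{GHz effective}.
\]

So a 1.1GHz memory clock would already be slightly past the 2.0ns rating, which is why vendors fitting 1.6ns chips would have the headroom to make their cards "more attractive".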
 
Me personally, I am waiting for my b-day... (late October) to get an R520... I am settling with my X800XL atm... but I have an FX-55 that is screaming for more work...

The FX-55 can handle the X800XL like it were not even there... I swear... the bottleneck in my computer has probably got to be my memory...

which is the OCZ EL 2-3-2-6 1T PC3200 memory... basically it's value RAM with tighter timings and heatspreaders...

Link

I am planning on upgrading my computer to dual Acer AL1715s, and upgrading my memory with another set of the 2x512 RAM mentioned above...

It will be a pretty slick system :)
 
DarkBahamut said:
Not really. I paid 'top dollar' for my GTX, but honestly, I couldn't care less if the R520 or G71 is faster. I mean, what happens if I had waited and got a '7800 Ultra' or R520? No doubt 1 month after they launch there will be G80 or R600 stuff coming out or whatever, and we'll have the whole thing again.

Simple fact is, if you keep waiting for the next fastest thing in the video card market, then you will never upgrade, because there is always something faster coming, no matter what card you buy.

Exactly. I always try to keep up to date.

I'm also glad I only paid $487.50 for mine. That was less than my X800XT PE cost me last year!
 
Well, as long as it comes out within 3 months' time, I can just use eVGA's Step-Up program.
 
Mister E said:
You guys care WAY TOO MUCH!! Just go play yer games and when they aren't rendered fast enough for you, upgrade. Till then, why argue and trade insults with retards over the internet??!?!


This is a computer enthusiast forum, and the video card section of it; OF COURSE we are going to care a lot about current events on the topic. :)
 
Well I understand that. And I'm all for civilized discussions. Just seems like some people take graphics very personally. :p
 
razor1 said:
If you're talking about Fear, with all settings on high at a lower res of 1024x768 with 4x AA and 8x AF, it runs around 45 fps with HDR and soft shadows. Also, there really is no use for AA in Fear.

And Chronicles of Riddick was fine up to 1024x768, also with 4x AA and 8x AF, with soft shadows and HDR.

Riddick doesn't have HDR.
 
serbiaNem said:
Please, feel free to proliferate.

7800 GTXs in SLI can, quote, "fully utilize every aspect that today's games can enable".

In many games I have maxed out the in-game graphics settings, resolution, AA and AF.

Shoot, HL2 is playable at 1600x1200 with 8xS TRSSAA / 16x AF at maximum settings.
 
Brent_Justice said:
um no don't

It helps drive prices on previous cards down, pushes technology forward, and creates competition.

... and it gives you a job :p.
 
What I'm thinking is that the real upgrade between now and soon is gonna be an M2, DDR2, and a 7800 "Ultra." Think of that compared to 939, DDR, and a 7800GTX. Sounds a lot better, huh? I don't think that kind of setup will be out of date for a while, as long as you get it when it comes out. I'm just worried about OCability. And I'm still debating if I should wait a few months for all this stuff that is supposedly coming out "soon."
 
From a corporate point of view I think an AGP 7800 series makes sense. Slap on a bridge chip, rework the PCB, and sell it to all those people with perfectly good 3+GHz/3000++ OEM machines that came with a lesser card. But unless you're upgrading an OEM system like the "plebeians", I'm one of the few people who could actually sorta justify a high-end AGP card at this point. If I had a $120 Socket 939 mobo, I'd just toss it and get a PCI-e board. But I don't. I've got a $430 dual Opteron board. OTOH, I can't think of a good reason to upgrade from my 6800GT at the moment.

The 90nm G71 thing doesn't surprise me a bit. Hell, even Matrox does die shrinks. Even if it gains no performance, as long as yields don't suffer too much it should cut production costs. That would also explain why we got a "GTX" instead of an "Ultra".
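
A rough illustration of why the shrink alone cuts cost (ignoring yield changes, pad-limited layouts, and wafer pricing, so this is a sketch rather than a real cost model): die area scales with the square of the linear feature size, so

\[
\left(\frac{90\,\text{nm}}{110\,\text{nm}}\right)^2 \approx 0.67,
\]

i.e. roughly a third less silicon per chip, or about 1.5x as many candidate dies per wafer.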
 
OHHH NOOO....not more "The Inq" stuff :(

Terra - Has anyone considered that the story got pulled (by pressure from NVIDIA) because it was total BS? :)
 
Why would they pull something that isn't true and isn't insulting them? If I just make up a story and post it on a news site, what right do they have to pull it if I didn't sneak into their factory or something? I think the VR-Tech guys saw something or interviewed someone, but nVidia doesn't want that interview out there.

A 90nm Ultra would be very exciting. Even if it performs the same, as you say, a smaller process would mean lower cost, so it would be cheaper.
 
Russ said:
Why would they pull something that isn't true and isn't insulting them? If I just make up a story and post it on a news site, what right do they have to pull it if I didn't sneak into their factory or something? I think the VR-Tech guys saw something or interviewed someone, but nVidia doesn't want that interview out there.

Because a bogus story could hurt current (7800GTX) sales, and give someone a bad taste (bad for image) when they waited it out... for a "mainstream" part... and oh yeah... we all know "The Inq" is full of BS, VR-Zone has a better reputation, and besides, where does it say that NVIDIA was the "requesting" party?
Again I see more speculation than facts; in fact, the "source" for the claim that it was NVIDIA that requested the article be removed... is "The Inq"... perhaps the same source that gave them their power claims about the 7800GTX? :rolleyes:
Perhaps their "source" goofed up, told them he/she had given them bad info, and requested that they drop the "story"?
But then "The Inq" did what they are known for... reported speculation as "facts"...
Hell we can all do that ;)

I repeat:
"The Inq" is full of BS...fact...

Terra...
 
Terra said:
Terra - Has anyone considered that the story got pulled (by pressure from NVIDIA) because it was total BS?
That would be a world first - an IHV pulling a story about itself that was wrong. What about the other ten zillion BS stories out there - why haven't they been pulled too, aren't they wrong enough or something?
 