FIRST retail box picture of a G92/8800GT, *everything* true!

Plus, if it were CPU bound, the GTX and GTS would be at the same fps, not stacked in tiers.
Not really. Being CPU bound isn't typically a binary situation -- either CPU bound or not CPU bound. As different frames may produce different levels of stresses on various subsystems, you certainly can't say that something is or isn't "CPU bound" just because all cards at a certain level of performance aren't showing precisely the same average frame rates. That's a very unrealistic way to view the data.

We can reasonably assume that significant bottlenecking is occurring if the typical performance spread between platforms narrows significantly. If the spread is practically non-existent, then we can infer that there's a severe bottleneck.

I'm really surprised at your level of apprehension about coming to the same realization everyone else has about this. Most of us are looking at the GT's specs and calling its position among other cards a "no-brainer".
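To put rough numbers on that "spread" idea, here's a minimal Python sketch with invented fps figures (not from any real benchmark) showing how a collapsing gap between GPU tiers points to a CPU bottleneck:

# Toy illustration of the "spread" heuristic described above.
# The fps numbers are invented for the example, not real benchmark data.

def spread(results):
    """Relative gap between the fastest and slowest card in a run."""
    fastest, slowest = max(results.values()), min(results.values())
    return (fastest - slowest) / slowest

# Hypothetical average fps per card in a GPU-heavy scene vs. a CPU-heavy scene.
gpu_heavy = {"8800 GTS": 42.0, "8800 GTX": 55.0, "8800 Ultra": 60.0}
cpu_heavy = {"8800 GTS": 47.0, "8800 GTX": 49.0, "8800 Ultra": 50.0}

for name, run in (("GPU-heavy", gpu_heavy), ("CPU-heavy", cpu_heavy)):
    s = spread(run)
    verdict = "severe CPU bottleneck likely" if s < 0.10 else "GPU still the main limit"
    print(f"{name}: spread {s:.0%} -> {verdict}")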

Would this card run well on an AMD X2 4200+ OC'd to 2.8GHz, Socket 939?
Of course. Most tend to severely overemphasize the need for a blazing-fast proc to keep frame rates up. You're at a disadvantage, yeah, but you're still going to get good performance on that platform.
 
Not really. Being CPU bound isn't typically a binary situation -- either CPU bound or not CPU bound. As different frames may produce different levels of stresses on various subsystems, you certainly can't say that something is or isn't "CPU bound" just because all cards at a certain level of performance aren't showing precisely the same average frame rates. That's a very unrealistic way to view the data.

We can reasonably assume that significant bottlenecking is occurring if the typical performance spread between platforms narrows significantly. If the spread is practically non-existent, then we can infer that there's a severe bottleneck.

I'm really surprised at your level of apprehension about coming to the same realization everyone else has about this. Most of us are looking at the GT's specs and calling its position among other cards a "no-brainer".

Really... so OK, we have a GTX and an Ultra. The Ultra obviously has a significant gain over the GTX in all games, yet here they show the exact same numbers.

And what happens if we OC a GTS to near GTX performance? Well, because it's still called a GTS, according to you it will stay at the GTS level and not show off any of that extra horsepower. :rolleyes:

I didn't say that the GT can't get to GTX levels, but when an Ultra and a GTX have the same fps, and an OC'd GTS and a stock GTS have the same levels, it looks like someone just made up #s w/o thinking. So I'm calling BS on the benchmark.
 
Not really. Being CPU bound isn't typically a binary situation -- either CPU bound or not CPU bound. As different frames may produce different levels of stresses on various subsystems, you certainly can't say that something is or isn't "CPU bound" just because all cards at a certain level of performance aren't showing precisely the same average frame rates. That's a very unrealistic way to view the data.

We can reasonably assume that significant bottlenecking is occurring if the typical performance spread between platforms narrows significantly. If the spread is practically non-existent, then we can infer that there's a severe bottleneck.

I'm really surprised at your level of apprehension about coming to the same realization everyone else has about this. Most of us are looking at the GT's specs and calling its position among other cards a "no-brainer".


Of course. Most tend to severely overemphasize the need for a blazing-fast proc to keep frame rates up. You're at a disadvantage, yeah, but you're still going to get good performance on that platform.

thanks!!
 
And what happens if we OC a GTS to near GTX performance? Well, because it's still called a GTS, according to you it will stay at the GTS level and not show off any of that extra horsepower. :rolleyes:
I never said anything like that. Are you confusing me with someone else, perhaps?

So I'm calling BS on the benchmark.
That's fine, but I don't think the benchmarks are even a requirement to make judgments at this point. Unless the narrower bus width impacts performance more than we're assuming it will, the GT's a slam dunk. Do you agree, or were you not debating that?
 
Thought you guys might be interested, related to price of new Nvidia cards:

Came across Shacknews' interview with Cevat Yerli (Crytek honcho), here's a quote:

Shack: Do you have any insight to how well the upcoming range of cards will support Crysis, not just on the high end but lower down the ladder as well?

Cevat Yerli: Very, very well. Stay tuned for more on this. In mid November you will see the new NVidia cards. They are a blast for Crysis and really, really very good deals.
 
So I'm calling BS on the benchmark.

What are you calling BS on now?

You don't have a very good record on calling fakes so far.

If you are talking about that 19 page Chinese review, I call that legit, just like I did the early leaked pictures of the card. Performance is exactly what I would expect from a faster clocked card with 112 shaders.
 
Thought you guys might be interested, related to price of new Nvidia cards:

Came across Shacknews' interview with Cevat Yerli (Crytek honcho), here's a quote:

Shack: Do you have any insight to how well the upcoming range of cards will support Crysis, not just on the high end but lower down the ladder as well?

Cevat Yerli: Very, very well. Stay tuned for more on this. In mid November you will see the new NVidia cards. They are a blast for Crysis and really, really very good deals.

Mid Nov? I thought the 8800GT came out on the 29th.
 
What are you calling BS on now?

You don't have a very good record on calling fakes so far.

If you are talking about that 19 page Chinese review, I call that legit, just like I did the early leaked pictures of the card. Performance is exactly what I would expect from a faster clocked card with 112 shaders.

I wasn't; check back a few pages for the blog "benchmarks".

And btw, just because they post up pictures of real products that have been edited does not mean that the product itself doesn't exist; it just means that it's been photoshopped. So I was right in that they were modified and "faked", even if they were real products.

An easy example: if we take your photo and change how you look, it's still a faked photo even if you exist.

I never said anything like that. Are you confusing me with someone else, perhaps?

That's fine, but I don't think the benchmarks are even a requirement to make judgments at this point. Unless the narrower bus width impacts performance more than we're assuming it will, the GT's a slam dunk. Do you agree, or were you not debating that?

I wasn't debating whether the cards were real or not; I know they exist now, as they have shown up on major sites such as Newegg. I was just debating that one benchmark from a blog post.

Hell, if we are to believe all the benchmarks, the GT is faster than, slower than, and about the same as a GTX, all at the same time :p
 
And btw, just because they post up pictures of real products that have been edited does not mean that the product itself doesn't exist; it just means that it's been photoshopped. So I was right in that they were modified and "faked", even if they were real products.

Except that nothing was "faked" about those photos. Everything gets "photoshopped" and processed these days. It's a far cry from Photoshop processing to faking.

I hang out on digital camera forums and I see a lot of fakes and can spot them a mile off. Faked is when people add numbers/buttons/features to old models and then pass them off as new.

There was nothing added to these photos. They were poor quality, with cranked contrast, but there was nothing fake about them. They looked legit and are legit.
 
As far as I can remember, wasn't the 8800GT originally supposed to come out in November but the date was moved up? I think that's what Yerli is referring to.
 
So I'm calling BS on the benchmark.

As am I... Even at stock CPU clocks, my 8800GTS scores about 7fps higher at 670MHz than at 570MHz.

I'm definitely CPU limited (I can go from 2x to 4x AA with a 0fps hit), yet the higher clock still produces better results.

This is in Oblivion, settings maxed (minus AA).
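Just to put rough numbers on that (only the 570/670MHz clocks and the ~7fps delta come from the post; the baseline fps below is an assumption purely for illustration), the scaling works out roughly like this:

# Back-of-the-envelope check on the overclock result described above.
# Only the 570/670MHz clocks and the ~7fps delta come from the post;
# the baseline fps is assumed purely for illustration.

base_clock, oc_clock = 570, 670      # MHz core clock
base_fps = 45.0                      # assumed baseline fps (hypothetical)
oc_fps = base_fps + 7.0              # ~7fps gain reported at 670MHz

clock_gain = oc_clock / base_clock - 1   # about +17.5% core clock
fps_gain = oc_fps / base_fps - 1         # about +15.6% fps with the assumed baseline

print(f"clock +{clock_gain:.1%}, fps +{fps_gain:.1%}")
# Near-1:1 scaling with core clock means the GPU still matters a lot;
# an fps gain near zero would mean the CPU is the hard limit.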
 
As far as I can remember, wasn't the 8800GT originally supposed to come out in November but the date was moved up? I think that's what Yerli is referring to.

Yeah, weeks ago I read the proposed release dates and even marked my calendar for the G92/G98 release in mid-Nov. People love to speculate, lol. Just wait and see what happens. After waiting THREE LONG AGONIZING YEARS to buy a new card, I think I will be ready this round; I just can't hold out any longer :)
 
Thought you guys might be interested, related to price of new Nvidia cards:

Came across Shacknews' interview with Cevat Yerli (Crytek honcho), here's a quote:

Shack: Do you have any insight to how well the upcoming range of cards will support Crysis, not just on the high end but lower down the ladder as well?

Cevat Yerli: Very, very well. Stay tuned for more on this. In mid November you will see the new NVidia cards. They are a blast for Crysis and really, really very good deals.

Which would mean the big boy card is out this year, and we will see the 1 teraflop card before year end, just as Nvidia has said the entire time.....

Excellent!! Maybe the naysayers will start to reconsider their positions? Ya think? :)
 
I'm telling you: mid-Nov, 512-bit interface, 1GB memory, 8850GTX/Ultra Extreme... whatever they want to call it... for the same or less than the GTX/Ultra is today.
 
I have my credit card in hand, ready to buy!!

This is good news.... :) :) :)

The main event is about to arrive.
 
I can't even care. I'm lucky to get 30 mins of gaming a week, and I'll be stuck with a 17" CRT. The 7800GTX will simply have to cut the 1024x768 mustard for a long time yet.
 
Even though there already is a card with 1GB and a 512-bit bus, it performs so poorly that I'm still waiting for a REAL high-end card worthy of a 512-bit bus and 1GB of memory.
 
I can't even care. I'm lucky to get 30 mins of gaming a week, and I'll be stuck with a 17" CRT. The 7800GTX will simply have to cut the 1024x768 mustard for a long time yet.

Exactly why I sold my gaming rig and bought a 360....
 
People are now talking about a re-reporting of the possible release of the 1TF "monster" (separate from the 8800gt) first mentioned informally by an nvidia rep five or six months ago as a November release.

Many (including me) had assumed that this release was being pushed back. Especially as the 8800gt news/rumors started heating up.

Now, due to the statement by the Crysis guy above, some are once again hoping for a new top-tier card from nv in November. And, of course, some never stopped believing/hoping.

Hope that clears things up.
 
People are now talking about a re-reporting of the possible release of the 1TF "monster" (separate from the 8800gt) first mentioned informally by an nvidia rep five or six months ago as a November release.

Many (including me) had assumed that this release was being pushed back. Especially as the 8800gt news/rumors started heating up.

Now, due to the statement by the Crysis guy above, some are once again hoping for a new top-tier card from nv in November. And, of course, some never stopped believing/hoping.

Hope that clears things up.

Except that there is no proof that this interview was conducted recently; it could easily have been done before they pushed the release date up. I find it hard to believe they would call a new high end a "value" when it'll likely be in the $500+ price range. Likely, this interview was done around the same time as this article:

http://www.maximumpc.com/article/nvidia_launching_new_hardware_to_coincide_with_crysis_release

This was back in early October, before the 8800GT release date was pushed up. My sources inside the industry had said for a long time that the 8800GT was due in mid November, and that's likely what Cevat Yerli was referencing.

Anyway, the 8800GT was originally named the 8700GTS, but when RV670 news turned very optimistic about its yields and power, the moniker was changed to 8800GT, clocks were ramped up, etc. That's also why those rumors about thermal issues with the cards Nvidia sent out might be true: increased clocks mean increased heat, and if the reported 86 degrees C is accurate, it would make sense that Nvidia was worried about the heat dissipation capabilities of a single-slot card.
 
Back in early October, before the 8800GT release date was pushed up. My sources inside the industry had said for a long time that the 8800GT was due in mid November, and that's likely what Cevat Yerli was referencing.
Good point. You're probably right.
 
Except that nothing was "faked" about those photos. Everything gets "photoshopped" and processed these days. It's a far cry from Photoshop processing to faking.

I hang out on digital camera forums and I see a lot of fakes and can spot them a mile off. Faked is when people add numbers/buttons/features to old models and then pass them off as new.

There was nothing added to these photos. They were poor quality, with cranked contrast, but there was nothing fake about them. They looked legit and are legit.

Look, I honestly don't care. The photos looked faked because they were modified a lot. All I asked for was some real pictures with other things in the background, e.g. pictures of the cards on a table, in a computer, or in someone's hand. Now that we have some, it's easy to say they were real.

But come on, you want to say nothing was faked? That's just ignorant.

[Image: 8800gttz1.jpg]


Someone took a GTX and removed the "X". It's obvious because they were dumb enough to leave the 768MB of RAM.

[Image: 77446113ye8.jpg]


The above is one I just put together. Which looks real: Tom's, with its blurry text and crystal-clear, very bright "G92", or the newer Expert one? Funny enough, if you overlay them in Photoshop they line up almost perfectly, even though the chip itself is at an angle in Tom's shot...

Also this one

[Image: gf4us5.jpg]


Looks very fake because of the high contrast in the black and white section; there is none of the extra light that should show on the reflective surface of the heatsink.

Notice how dull it appears compared to the white CD sleeve here?

[Image: gigabyte2qw8of6.jpg]


Anyway, enough is enough. There was plenty of reason to doubt the leaks; like I've been saying, take everything you hear with a grain of salt.
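For anyone who wants to try that overlay comparison outside of Photoshop, here's a rough Pillow sketch; the file names are just placeholders for the two photos being compared, not the actual leaked images:

# Rough overlay/difference check, along the lines of the Photoshop comparison above.
# File names are placeholders, not the actual leaked images.
from PIL import Image, ImageChops

a = Image.open("toms_g92.jpg").convert("RGB")
b = Image.open("expert_g92.jpg").convert("RGB")

# Bring both photos to the same size before comparing.
b = b.resize(a.size)

# Pixel-wise absolute difference: a near-black result (tiny or empty bounding
# box) means the two "independent" photos line up almost perfectly.
diff = ImageChops.difference(a, b)
print("Bounding box of differing pixels:", diff.getbbox())
diff.save("difference.png")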
 
Good point. You're probably right.

FWIW, the GTS 640 revision (128 SPs) is supposedly still coming out in mid-Nov, so there is some truth to there being a new card in mid-Nov. I think we're just forgetting that.

Basically,

8800GT takes the 8800 GTS 320 price point but is at 8800 GTS 640 to 8800 GTX levels in performance.

New 8800 GTS takes the 8800 GTS 640 price point but is at the 8800 GTX to Ultra + levels.

Next card? God and Nvidia know what's in store...
 
FWIW, the GTS 640 revision (128 SPs) is supposedly still coming out in mid-Nov, so there is some truth to there being a new card in mid-Nov. I think we're just forgetting that.
While you may turn out to be correct in that Nvidia may indeed release a card in November... the real debate (here and elsewhere) has been whether the new top-tier card will be released in November. So the release of a mid-range card isn't particularly relevant to that "debate" (and I use that word loosely).
 
Look, I honestly don't care. The photos looked faked because they were modified a lot. All I asked for was some real pictures with other things in the background, e.g. pictures of the cards on a table, in a computer, or in someone's hand. Now that we have some, it's easy to say they were real.

But come on, you want to say nothing was faked? That's just ignorant.



Someone took a GTX and removed the "X". It's obvious because they were dumb enough to leave the 768MB of RAM.


The above is one I just put together. Which looks real: Tom's, with its blurry text and crystal-clear, very bright "G92", or the newer Expert one? Funny enough, if you overlay them in Photoshop they line up almost perfectly, even though the chip itself is at an angle in Tom's shot...

Also this one


Looks very fake because of the high contrast in the black and white section; there is none of the extra light that should show on the reflective surface of the heatsink.

Notice how dull it appears compared to the white CD sleeve here?


Anyway, enough is enough. There was plenty of reason to doubt the leaks; like I've been saying, take everything you hear with a grain of salt.


OK, the first picture clearly looks very fake due to the RAM issue, BUT I have seen new hardware shipped with errors on the packaging over the years, and I have then seen second shipments come in with a correction sticker over the mistake, lol. I accept this picture is very suspect, however.

Your argument over the second picture is very unconvincing; if you are taking a photo of a GPU, you will zoom out so it "just" all fits. Overlaying the images, I would expect them to nearly match.

The third picture looks fake, or is this just artwork to show how the finished product will look? That is the question.

Your argument over the last picture is just unfounded. I see no problem at all after zooming in on the image; you are looking for things that are not there. The picture is legit.
 