the "new" GTX280M is actually a....

AmongTheChosenX

G92b core.

http://techreport.com/discussions.x/16513

Poll: Your take on Nvidia's mobile GPU branding

If you didn't know any better, you might think Nvidia's new GeForce GTX 280M and 260M graphics processors were mobile versions of the similarly named desktop parts. As we explained early this morning, though, that's just not the case: both mobile offerings are based on the same 55nm G92b GPU that powers the GeForce 9800 GTX+ and the new GeForce GTS 250 1GB. By contrast, the desktop GeForce GTX 280 and 260 use a significantly more powerful (but bigger and more power-hungry) GT200 GPU.

That leads us to our latest poll question: what do you think of Nvidia's branding choice here? Is it fair to associate the new mobile GPUs with considerably faster desktop offerings? Admittedly, Nvidia doesn't make any higher-end mobile GPUs, and the GT200 seems like a poor fit for mobile use. However, one could also say Nvidia is purposefully misleading customers into buying older technology with a new brand name. After all, the original, 65nm G92 launched almost a year and a half ago. Cast your vote either below or on our front page.

In our previous poll, we asked what role your netbook was filling. 3% of respondents said they use their netbook as a primary system, while 8% use it as a laptop replacement, 11% see their netbook as a second laptop, and 3% use it for another purpose entirely. The remaining 75% said they don't own a netbook.


i'm sorry, but nvidia, this is just effed up at this point.

they're literally putting out this "new" GTX280M mobile GPU when it's totally based on a G92b 55nm core.

more info here: http://techreport.com/discussions.x/16507
 
This is common practice for both Nvidia and ATI.

Examples:

ATI released the 9700 and 9800 Mobile parts, but they were really just the 9600 desktop core.

http://www.firingsquad.com/hardware/mobility_9700/page4.asp

Even today, they mark up mobile GPUs, although it's not as bad. This "Mobility HD 4850" has performance closer to a desktop 4830 because of the reduced clocks.

http://www.notebookcheck.net/AMD-ATI-Mobility-Radeon-HD-4850.13975.0.html

Nvidia's mobile 9600M GT isn't really 9600 GT hardware; it's a re-branded 8600 GT.

http://www.product-reviews.net/2008/10/15/nvidia-geforce-9600m-gt-specs/

Par for the course, buddy. Let it ride.
 
Nvidia is making an ATI fanboy out of me. They change names, ATI lowers prices. Guess who's probably going to get my next high-end video card purchase?
 
yeah, I guess if the 280/285 is the high-end desktop line, it's par for the course to make your high-end laptop part have a close name. Though I'm less confused looking at the specs of cards now. :p

I laid off PC gaming for about 2 1/2 years until recently. It's so funny to see the term "laptop replacement." I didn't even know, or at least recall, what a netbook was until a recent deal. How long until "Dude, is that your phone or your netbook replacement?"
 
So at least we know what Bruce Banner will keep himself distanced from in the next Hulk movie.
 
All semantics.

The Geforce 9-series wasn't that much of an improvement over the 8-series cards, anyways. And that's for the desktop market.

In my head, I pretty much do an "n - 1" calculation whenever it comes to mobile components: mainly due to tight chassis, heat constraints, and power requirements, mobile GPUs will always be slower than their desktop counterparts. Don't be fooled by the marketing!
 
Actually, some noobs might fall for this.

Ah, when sales go down, drastic measures are called for.
 
Is it me, or is nVidia trapped in some kind of recycling, er... cycle? AMD is by and large bringing out cards and GPUs based on the RV7xx core, while nVidia has some high-end parts out based on the GT200 core (260, 280, 285, 295) and is otherwise completely recycling a two-year-old core (G92), albeit with a die shrink (55nm). Is the extra performance GT200 could bring to mainstream and budget cards just not worth the cost?

Quite puzzling...
 
I think Anandtech put it best:

In the beginning there was the GeForce 8800 GT, and we were happy.

Then we got a faster version: the 8800 GTS 512MB. It was more expensive, but we were still happy.

And then it got complicated.

And they were getting the shaft too:
Cards had already gone out to other reviewers but we weren't on any lists. Oh, pout.

Magically, a couple of days after Charlie's article we got invited to a NVIDIA briefing and we had a GTS 250 to test. Perhaps NVIDIA was simply uncharacteristically late in briefing us about a new GPU launch. Perhaps NVIDIA was afraid we'd point out that it was nothing more than a 9800 GTX+ that ran a little cooler. Or perhaps we haven't been positive enough about CUDA and PhysX and NVIDIA was trying to punish us.

Who knows what went on at NVIDIA prior to the launch, we're here to review the card, but for what it's worth - thank you Charlie :)
 
Wow, this thing is going to be slower than an 8800GTS 512. Disappointing.
 
ATI released the 9700 and 9800 Mobile parts, but they were really just the 9600 desktop core.

At least those were part of the same generation. This is like ATI re-branding an 8000-series chip and calling it a 9700M. Two separate generations.

Even today, they mark up mobile GPUs, although it's not as bad. This "Mobility HD 4850" has performance closer to a desktop 4830 because of the reduced clocks.

Again, same generation. The mobile 4850 is still a 4800-series card. It doesn't have the broken AA engine of the 3800 series, and it comes with the full 800 stream processors, not 320 like the previous generation.

Nvidia's mobile 9600M GT isn't really 9600 GT hardware; it's a re-branded 8600 GT.

Yeah, they've been at it for a while :p Then again, there's a lot of overlap between the 8800 and the 9800 series... they both essentially belong to the same generation, with just a die shrink and some new letter and number combinations on the box.

Nvidia is trying to fool people into thinking they're constantly innovating and releasing new products. It's obvious that they're having serious problems adapting the GT200 core to mobile and mid-range products. This is exactly what ATI foresaw when they decided on the change of strategy that brought us the 3800 and 4800 series: adapting ever larger and more complex high-end chips to the mainstream and mobile platforms will only get more difficult, forcing the GPU vendor to rely on old hardware for a long time while the new tech trickles down to the mainstream. Then again, the G92 is already like a cut-down GT200, since the latter doesn't introduce any new features, so I guess it worked reasonably well for them.

When DirectX 11 comes out, we'll find out if Nvidia is still able to innovate. I feel confident that ATI has an impressive Shader Model 5 card in the works at the right price point; with Nvidia's behavior at the moment, I'm not so sure. Also, if Nvidia's chip is once again 50% larger than ATI's, how long will it take for Nvidia to get mainstream SM5 hardware out compared to ATI?
 
This whole nVidia horseshitting scheme began with the success of the 8800GT, right? I mean, nVidia makes great GPUs, but their naming scheme is downright purposefully misleading.
 
Then again, the G92 is already like a cut-down GT200, since the latter doesn't introduce any new features, so I guess it worked reasonably well for them.

The main thing I can think of with the GT200 core is that it added double-precision floating-point support, which is nifty for things like CUDA. Other than that, they're the same cards when it comes to OpenGL (3.0) and DX (10) support.

GT300 should support DX11 and a whole host of new OGL extensions (pre-3.1).

People have suggested that nVidia recycling an older core like this is mostly due to issues with scaling down the GT200 core, which I find somewhat plausible (biggest GPU core ever).
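
For anyone curious what that double-precision support actually buys you, here's a minimal CUDA sketch (illustrative only, not from any of the linked articles; the kernel name and values are made up for this example). FP64 needs compute capability 1.3, i.e. GT200, so you compile with -arch=sm_13; target anything older and nvcc just demotes the doubles to float with a warning.

Code:
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// y = a*x + y in double precision. Pre-GT200 parts (compute capability
// below 1.3) have no FP64 hardware, so build with: nvcc -arch=sm_13 daxpy.cu
__global__ void daxpy(double a, const double *x, double *y, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    std::vector<double> hx(n, 1.0), hy(n, 2.0);

    double *x, *y;
    cudaMalloc((void **)&x, n * sizeof(double));
    cudaMalloc((void **)&y, n * sizeof(double));
    cudaMemcpy(x, hx.data(), n * sizeof(double), cudaMemcpyHostToDevice);
    cudaMemcpy(y, hy.data(), n * sizeof(double), cudaMemcpyHostToDevice);

    daxpy<<<(n + 255) / 256, 256>>>(3.0, x, y, n);  // 3.0*1.0 + 2.0 = 5.0

    cudaMemcpy(hy.data(), y, n * sizeof(double), cudaMemcpyDeviceToHost);
    printf("y[0] = %f (expect 5.0)\n", hy[0]);

    cudaFree(x);
    cudaFree(y);
    return 0;
}

Don't expect miracles though: GT200 only has one FP64 unit per SM, so doubles run at a fraction of single-precision speed. It's there for correctness, not performance.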
 
The main thing I can think of with the GT200 core is that it added double-precision floating-point support, which is nifty for things like CUDA. Other than that, they're the same cards when it comes to OpenGL (3.0) and DX (10) support.

GT300 should support DX11 and a whole host of new OGL extensions (pre-3.1).

People have suggested that nVidia recycling an older core like this is mostly due to issues with scaling down the GT200 core, which I find somewhat plausible (biggest GPU core ever).



the fact that we need to make excuses like this because the chip is so old is just... pathetic.
 
If they want to destroy their brand image and water down the image of their product line, more power to them.
 
Pretty bogus choice of name imo.

Something below the GTX280/260 I could see, but actually calling it a GTX280 and just adding the M to the end is really, really lame imo. It'll probably trick plenty of people into thinking it's the really good GTX280, but in mobile form. And it most certainly is not.
 
Ah...hilarious. NVIDIA's love affair with G92 borders on depravity.

I hinted before that we might see an entire religion formed around worshiping G92, and, today, it doesn't even seem that far-fetched. I'd imagine there's a special room in NVIDIA's HQ that contains pews, an altar, and a holy obelisk formed from hundreds of G92 chips. Each day at midday, NVIDIA employees journey to this room to pray to their Lord and God: the almighty G92.

G92 is faith. G92 is Lord.
 
This is pretty standard sleight of hand for mobile graphics chips, but coming on the heels of the G92 renaming scheme it looks even worse. As far as NV not being able to scale the GT200 core goes, I'm skeptical. I don't think they want to "scale down" a core meant for a wide bus and high resolution gaming for a 15" laptop--not when a G92 will do that job and with a lot less heat.

I haven't followed all the threads on the subject so I may not be the first one to say this, but it seemed fairly obvious to me that NV realized what we all did months ago: graphics intensity has plateaued since late 2007 and they don't need to develop anything new for the low-mid range--they can just recycle their tech and bring the names into line with their latest generation. The only real increase in graphical demand is the spike in average resolution of mid-high end gamers jumping to 1920x1200 (with an increasing number of 2560x1600 users) which produced a real market for denser chips on wider buses.

That doesn't excuse their obfuscation, or their juvenile behaviour in shutting out the [H] from reviews, but when a company takes as many hits as NV has in the last year (price fixing, bad chips, rights issues with Intel, ATI taking 30% of the market) and starts haemorrhaging money, they will try nearly anything to stop the bleeding. All things considered, renaming good (though old) tech that still performs decently isn't that bad.
 
Well yes, the G92b isn't a bad core by any stretch of the imagination. I fully support their decision to keep using it a bit longer. It just irks me and others that they seem so reluctant to even admit that they're doing so, instead pretending they're bringing out new cards. Just admit you've got some really good GPUs and that they're so great you can even afford to not release a new GPU because they're just so darn good.

Or something. It would beat the current 'please pretend you didn't see us slap on the new label' kind of tactics at least.
 
Well yes, the G92b isn't a bad core by any stretch of the imagination. I fully support their decision to keep using it a bit longer. It just irks me and others that they seem so reluctant to even admit that they're doing so, instead pretending they're bringing out new cards. Just admit you've got some really good GPUs and that they're so great you can even afford to not release a new GPU because they're just so darn good.

Or something. It would beat the current 'please pretend you didn't see us slap on the new label' kind of tactics at least.

yeah i agree. if they could do this i'd be totally down with it.
 
If they want to destroy their brand image and water down the image of their product line, more power to them.

Yeah, this silly business has got to hurt them in the end.

Thankfully DX11 hardware will hit the market soon, hopefully putting an end to all the madness. If you're already running any 8800-based card (except maybe the 320MB version) or the equivalent 9800/GTS2xx/GTX2xxM card, my advice is to just stick with it for now. When DX11 comes around, Nvidia will have to prove that they still have R&D people working at the company, not just some guy flashing old cards with new name strings...

Or something. It would beat the current 'please pretend you didn't see us slap on the new label' kind of tactics at least.

Yeah, Nvidia has every reason to be proud of the G80 and the subsequent die shrink. It's probably the best GPU ever designed. It took ATI three tries to catch up and eventually beat it (2900, 3800, and finally the 4800). But the damn thing was released in 2006, which means it was probably designed in the 2003-2006 time frame. It's time for something new, and it's time to admit that the tech is old.
 
This is pretty standard sleight of hand for mobile graphics chips, but coming on the heels of the G92 renaming scheme it looks even worse. As far as NV not being able to scale the GT200 core goes, I'm skeptical. I don't think they want to "scale down" a core meant for a wide bus and high resolution gaming for a 15" laptop--not when a G92 will do that job and with a lot less heat.

A scaled-down GT200 is a G92b.

And yeah, this is standard procedure for mobile GPUs. The 9800M GTX was a gimped 8800GT. Laptops always get named one generation ahead of what they really are. At least on the nVidia side, which has been the only side that mattered in mobile for years.

This naming thing is even less of an issue now that the AMD mobile parts will stomp this 'GTX 280M'.
 
The 3000 series were just die shrinks, nothing more.

It's not just a matter of pushing the Shrink button. It's a lot more involved than that. A die shrink is still a major change even if the architecture itself doesn't change. ATI was not capable of increasing the number of stream processors until the 55nm process had matured. So the 3800 series definitely counts as a "failed" attempt to dethrone G80/G92 IMO.
 
It's not just a matter of pushing the Shrink button. It's a lot more involved than that. A die shrink is still a major change even if the architecture itself doesn't change. ATI was not capable of increasing the number of stream processors until the 55nm process had matured. So the 3800 series definitely counts as a "failed" attempt to dethrone G80/G92 IMO.
lol, dethrone G80 with the same specs (even lower, actually: 512-bit down to 256-bit)? Are you crazy? The 3870X2 dethroned the 8800 Ultra, but the 3000 series weren't supposed to dethrone anything; they were value cards, except R700.
 
Nvidia is making an ATI fanboy out of me. They change names, ATI lowers prices. Guess who's probably going to get my next high-end video card purchase?
Same here. I've been buying Nvidia since I first got into gaming, but I'm tired of their naming and pseudo-new card bullshit. When I build my next PC later this year, ATI will definitely be getting my money.
 
nVidia's naming scheme is all screwed up lately; I don't even know what the letters mean any more...
 
Mobile parts are never as strong as their desktop counterparts, but to name it the GTX280M when it's a freaking 9800GTX? That's a bit fucking far.
 
All of this while ATI launches the 4870X2 mobile.

This. ATi has rocked nVidia so hard in the mobile sector this round it's not even funny. It's flat out pathetic. I wouldn't be surprised if ATi releases two more single-chip mobile solutions that both cream this "GTX280" mobile.
 
Can you say 4830M X2? Low power, low heat, small and powerful for that screen size.
 
Actually, while the naming is horribly bad, throwing a 9800GTX+ into a laptop is a great idea if they Hybrid SLI it with a 730i chipset (GeForce 9300).
 