4870 512MB vs 9800 GX2 WOW

I'll say it again: based on the confidence of NVIDIA's CEO at Analyst Day, I wouldn't be surprised to see GT200 and its derivatives become less and less dependent on a good CPU, as most current GPUs are. That means that, if these provisions are in place, a powerful (and expensive) CPU coupled with a GT200 (or derivative) based card could yield very similar results to a mid-to-low-end CPU coupled with the same card. NEVER have I said that GT200 and its derivatives will render CPUs obsolete. That's just something you made up and accused me of doing, for whatever delusional reason...

This is exceedingly frustrating because you can't seem to fully comprehend my arguments, nor NV's, nor see the bigger picture.

Yes, NV are stating that Intel CPUs are virtually "dead" in this day and age because, according to them, a low-to-mid-end CPU coupled with a high-end GPU will work just as well for most users, who thus don't have to upgrade their CPU because games are less and less dependent on it.

That is the argument they made, not one that you made, but they are confident about it nonetheless, and you seem to be conflating that confidence with the GT200 for some odd reason? :rolleyes::rolleyes::confused:

The GT200 is irrelevant at this point, whereas you feel that it represents the fuel for their fire.
That is not the case at all.
The GT200 is JUST ANOTHER GRAPHICS CARD.

NV are simply waging war against Intel because they feel that we don't need faster CPUs anymore, and they would argue that point whether or not the GT200 was even any faster than the 9800 GX2.


It's a silly argument though, as the CPU handles numerous instruction sets in games, and every single game in existence shows tangible and appreciable gains when coupled with faster CPUs.

At any rate, Taylor stated:
Basically the CPU is dead. Yes, that processor you see advertised everywhere from Intel. Its run out of steam. The fact is that it no longer makes anything run faster. You don’t need a fast one anymore. This is why AMD is in trouble and its why Intel are panicking. They are panicking so much that they have started attacking us. This is because you do still [need] one chip to get faster and faster – the GPU. That GeForce chip. Yes honestly. No I am not making this up. You are my friends and so I am not selling you.

Fact of the matter is that they, NV, are panicking.

They know they have a smaller share than Intel in the graphics adapter market.
They know that Intel is coming out with a chip that could widen that gap even further; even if, for the sake of argument, it's slower than the GT200, it will still diminish NV's market share in the mid-end sector at the very least.
They also know that Intel are working on a design that will combine a GPU+CPU on a single chip, and in doing so, drive a stake into NV's heart.

So, what do NV do?
They make an outlandish, premeditated attack out of desperation and try to go for Intel's jugular, their cash cow: the CPU.

If they can convince OEMs and users alike that faster CPUs are superfluous, then maybe, just maybe, they can hurt Intel where it matters most: their wallet. That keeps Intel from making billions more which could eventually be used to encroach further upon NV's territory by pouring that CPU money into GPU R&D and running NV out of town.

It's all very much common-sense stuff, and you unfortunately can't see the forest for the trees because you are fixated on the GT200.

The GT200 is just another graphics card in the grand scheme of things, and for the aforementioned reasons, NV would be making the same argument they have recently been making even if it were another 8800 refresh hitting the market instead of the GT200.

It's a foolish argument, however: the CPU is not dead, contrary to what NV may claim, nor will the GT200 help NV defeat Intel in overall market share.
Even if, as I stated, the GT200 is faster than Larrabee, so what?
NV have released faster GPUs than Intel for the last 10 years, but where has that gotten them?

Oh yeah, that's right, 2nd place. :eek:

NV simply want to go for the jugular and convince OEMs and users alike to stop spending their money on CPUs so they can spend it on GPUs instead, be it the GT200 or whatever else they have at the high end.

In doing so, guess what happens?
Less money for Intel from their CPU division = less money for Intel to spend on their GPU division, thus limiting their threat as an all-powerful competitor.

Common sense stuff.

Read NV's recently released 10-K report; it substantiates everything I have said thus far.
Then come back and talk.
http://www.secinfo.com/dvT9t.t4.htm#14yv
 
In fact, let me post a few tidbits from NV's 10-K report.

A significant source of competition is from companies that provide or intend to provide GPU, MCP, and application processors that support PMPs, PDAs, cellular phones or other handheld devices. Some of our competitors may have greater marketing, financial, distribution and manufacturing resources than we do and may be more able to adapt to customer or technological changes. Currently, Intel, which has greater resources than we do, is working on a multi-core architecture code-named Larrabee, which may compete with our products in various markets. Intel may also release an enthusiast level discrete GPU based on the Larrabee architecture.
http://www.secinfo.com/dvT9t.t4.htm#36uz


Intel have greater resources than NV - check
Larrabee slated to compete with NV's products in various markets - check
Enthusiast level discrete GPU based on Larrabee architecture presumed to compete with NV's enthusiast level cards - check

NV agree with all of the points that I have made, why can't you? :eek:

So, what's left?
Intel already have the larger graphics market share.
The low-end is theirs.
The mid-end they will definitely be encroaching upon with Larrabee, thus widening the gap between themselves and NV.
The high-end will also be sought by Intel, and whether or not they attain market leadership in that sector is of secondary importance, since Larrabee will almost certainly garner more market share for them in a sector where they currently have no presence: the mid-end.

NV of course have to hit Intel where it hurts, their CPU division, by denouncing faster CPUs.

The GT200 is of little significance -- it's just another card, nothing more, nothing less.
That's the only role it serves in the denunciation of Intel and in NV's new-found confidence: the role of just another card in a day and age where NV feels faster CPUs are superfluous.
They just have to convince OEMs and end users of that now, GT200 or not.
 
I think Nvidia is nuts for trying to say the CPU is dead and that Intel is panicking, because Intel's not. Nvidia can't do a lot against them, and even if they can, it's going to take A LOT of time. Intel also has AMD in chains, so basically AMD is not a threat, just mere crappy competition.

ATi has nothing to do with this :) that's one thing I like about them. I'm sure if ATi got their butts into this argument, all the green fans would be all over ATi, saying "WHAT THE F IS WRONG WITH THEM?!"

I think that's messed up.

Quite honestly, until EVERY program can take advantage of a GPU, like those Folding@home clients that use ATI GPUs, I don't think the CPU will be dead for a while, probably never. I'd bet you 50 bucks that the GPU gets integrated onto the CPU, NOT vice versa (meaning the CPU is what sticks around; it doesn't "die").
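
To put that in rough code terms (a purely illustrative sketch, nothing to do with the actual Folding@home client; every name below is made up): a GPU client only hands the GPU the massively data-parallel math, while the CPU still runs all the serial control flow around it, which is exactly why the CPU sticks around.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical data-parallel step: each GPU thread scales one array element.
// Independent, identical math over a huge array is what maps well to a GPU.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;
    float *d_data;
    cudaMalloc(&d_data, n * sizeof(float));
    cudaMemset(d_data, 0, n * sizeof(float));

    // The CPU still decides what runs and when: it launches the kernel and
    // handles I/O, branching, work-unit scheduling, OS calls, and so on.
    scale<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);
    cudaDeviceSynchronize();

    cudaFree(d_data);
    printf("done\n");
    return 0;
}
```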
 
Blah blah blah... BTW, did anyone see the info on the GT200 cards? Whew, those things are gonna be fast. Looks like I won't have to make that switch to ATi after all. There is talk that we will see GT200-based cards as soon as early June. Awesome.
 
4870 - 800/3400MHz

3400 - is that GDDR3, 4, or 5?

The May release is just GDDR3, isn't it?
 
Newsflash: G92 is a die shrink of G80, which came out 18 months ago. A die shrink is not an architecture change; it simply reduces power usage, reduces production costs, and enables higher clocking. :rolleyes:

G71 -> G80 was an architecture change: pixel pipelines to a unified shader architecture. Know what you're talking about before you post.

WTF else can they compare it to...? You need to compare it to what is out on the market now....
 
Hamidxa said:
"This is exceedingly frustrating because you can't seem to fully comprehend my arguments, nor NV's, nor see the bigger picture."

Hamidxa, what is your point? Nvidia's doomed, Intel's going to crush them with Larrabee. Is that it? Your statements are redundant and/or useless (read: serve no purpose to any argument I can draw from them) and/or convoluted.


If I'm correct and that is indeed your argument, then you have little comprehension of the resilience of businesses (thanks, in part, to the stupidity of some consumers *ahem*). Nvidia has penetrated all of Intel's markets except for the CPU one: mobile chipsets, motherboard chipsets, integrated graphics, and even motherboards to some technical extent. Nvidia (and I hate to turn this into some kind of PR) continues to develop successful mobile chipsets. Nvidia's motherboard chipset market penetration since nForce [1] hasn't gone anywhere but up. That said, neither has Intel's, really. I know with certainty that if a customer was on a tight budget and had some small graphical need, I would not hesitate to grab a motherboard using the integrated GeForce 7150 (and to further this end, I would under no conditions buy one using G35; I would literally hand them off to someone else before using a board with integrated G35). There's an outstanding chance that if I were to buy a board with GeForce 7150 graphics, it would be from EVGA, making it, effectively, an Nvidia-made motherboard.

I will correct you on one thing:
It's a silly argument though, as the CPU handles numerous instruction sets in games, and every single game in existence shows tangible and appreciable gains when coupled with faster CPUs.

This is something that needs citation. RTS games such as Supreme Commander, with intricate bot actions and comprehension, are games that will show appreciable gains when jumping from lower-speed CPUs to higher-speed CPUs. Presumably, games using a physics API developed for CPUs might, in the future, show appreciable performance gains. That's all I've got, and as far as I know that's all that's out there, certainly a far cry from "every single game". The vast majority of games, including FPSes, are not games which show "tangible (you've touched an FPS? What's it like?) and appreciable gains when coupled with faster CPUs."

FiringSquad's Phenom X3 review
HardOCP's 750i SLI FTW review
HardOCP's Spider vs. QX9770

I could go on and on, but there's a common theme: by the time the machines with higher-end CPUs start to show a significant raw delta, the frame rate is already above our refresh rates (say, 190 fps versus 230 fps on a 60 Hz panel), rendering it moot.
 
Hyperbole aside, Nvidia's basically right. The CPU is not totally irrelevant, but far less important than the GPU.
 
Newsflash: G92 is a die shrink of G80, which came out 18 months ago. A die shrink is not an architecture change; it simply reduces power usage, reduces production costs, and enables higher clocking. :rolleyes:

G71 -> G80 was an architecture change: pixel pipelines to a unified shader architecture. Know what you're talking about before you post.
News flash: Nvidia is behind the ball then, so compare away.
 
Hamidxa, what is your point? Nvidia's doomed, Intel's going to crush them with Larrabee. Is that it? Your statements are redundant and/or useless (read: serve no purpose to any argument I can draw from them) and/or convoluted.

You must have only skimmed over my arguments at best, which would explain why you would characterize them as convoluted.

If you had read through them carefully (I articulated them quite clearly), you would have understood everything I said.

As it stands, I did not say that Larrabee will "crush" whatever NV has up its sleeve, or anything even remotely to that effect.

On the other hand, I said that Larrabee will, if nothing else, help to further increase Intel's lead over NV in terms of overall market share, because as it stands, Intel has no mid-end product to compete with NV. Even if, for the sake of argument, Larrabee trails NV's high-end products and can't compete in that sector, it will still manage to carve away at NV's mid-end simply by virtue of being another product for consumers to choose from, rather than the duopoly that currently exists in that market segment.

I also added that the GT200 is just another graphics card, nothing more, nothing less. It is not a CPU killer.
This is rather self-explanatory, and needs no further elaboration.
NV could, for instance, have launched another G80 refresh part and still made that same comment that the CPU is dead. The GT200 won't handle any CPU computing tasks, so it's all very obvious. The GT200 is just another card, and NV made that statement out of desperation, trying to deter OEMs and end users alike from spending more on CPUs so they can instead spend more on GPUs. In doing so, they hope to curb Intel's CPU sales enough that Intel's GPU R&D budget is hindered as a consequence.

This is all very rudimentary PR, marketing, and economics.


Edit:
For the record, I am fully aware of all aspects of NV's business.
I wrote a 40-page paper on them not so long ago, examining their phenomenal growth and rise to success since 1993.
However, you would be extremely naive to think that their business strategy and model is not perpetually threatened by both external and internal forces.

As I pointed out earlier, 81.6% of NV's annual revenue stems from GPUs.
That only leaves a rather small percentage (relatively speaking) of revenue streaming in from their MCP business, which primarily consists of motherboard chipsets.

If Intel does manage to come out with a chip down the line (and I'm not saying that Larrabee is that chip) which somehow renders GPUs redundant, then guess what happens to NV?

NV sinks like the Titanic.

They have way too much overhead and fixed costs tied to their business as a whole, and their business model has no contingencies in place for recovering from such an eventuality.
In fact, their 10-K report is peppered with constant warnings to their stakeholders about things such as:

As Intel and AMD continue to pursue platform solutions, we may not be able to successfully compete and our business would be negatively impacted.
We sell our products to a small number of customers and our business could suffer if we lose any of these customers.
Our failure to identify new market or product opportunities or to develop new products could harm our business.
We may have to invest more resources in research and development than anticipated, which could increase our operating expenses and negatively impact our operating results.
Our operating expenses are relatively fixed and we may not be able to reduce operating expenses quickly in response to any revenue shortfalls.
Our operating results are unpredictable and may fluctuate, and if our operating results are below the expectations of securities analysts or investors, the trading price of our stock could decline.
If our products do not continue to be adopted by the desktop PC, notebook PC, workstation, high-performance computing, PMP, PDA, cellular handheld devices, and video game console markets or if the demand for new and innovative products in these markets decreases, our business and operating results would suffer.
We are dependent on the PC market and its rate of growth in the future may have a negative impact on our business.
Our business is cyclical in nature and an industry downturn could harm our financial results.
A significant source of competition is from companies that provide or intend to provide GPU, MCP, and application processors that support PMPs, PDAs, cellular phones or other handheld devices. Some of our competitors may have greater marketing, financial, distribution and manufacturing resources than we do and may be more able to adapt to customer or technological changes. Currently, Intel, which has greater resources than we do, is working on a multi-core architecture code-named Larrabee, which may compete with our products in various markets. Intel may also release an enthusiast level discrete GPU based on the Larrabee architecture.

If Intel does manage to release a GPU+CPU on a single chip somewhere down the line that performs on par with or close enough to what NV has to offer, and if the price is right, NV = out of business.

81+% of their revenue comes from GPUs, and guess what: they don't have IP rights to x86 technologies.

NV would be royally screwed.
Resilient, yes, but only to an extent.

They may be in 4 different markets, but when > 80% of your annual revenue comes from only one (the GPU market in all of its forms), and when you already trail Intel in market share in the GPU business anyway, then any further encroachment by Intel into that market (i.e., Larrabee in the immediate future and potentially some 100+ core processor in the more distant future) could spell trouble for NV.
 
News flash: Nvidia is behind the ball then, so compare away.

What? They're coming out with a new architecture in a month or two... How is that behind the ball when this 4870 isn't out yet either?

IMO wait until BOTH cards come out and compare them.
 
Newsflash: G92 is a die shrink of G80, which came out 18 months ago. A die shrink is not an architecture change; it simply reduces power usage, reduces production costs, and enables higher clocking. :rolleyes:

G71 -> G80 was an architecture change: pixel pipelines to a unified shader architecture. Know what you're talking about before you post.

G80 to G92 was NOT just a die shrink. You're obviously still a newbie. Hang in there though; one day you will really know it all... like me. But congratulations are due: you win the award for most ignorant poster on 5/3/2008 on all of Earth.
 
NV needs CUDA to take off ASAP as far as I see it (buying PhysX was just the first step). They need to expand into scientific computing more, as there's no future for them if they go after general-computing CPUs; Intel isn't going to license x86 to them, period.
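
For what it's worth, here's a rough sketch of the kind of "scientific computing" work CUDA is aimed at (purely illustrative, not taken from NV's SDK or any real client; the kernel and names are made up): a block-wise parallel sum over a big array, with the CPU only combining the partial results.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical reduction kernel: each block sums a chunk of the input into
// one partial result using shared memory; the host adds the partials up.
__global__ void partial_sum(const float *in, float *out, int n)
{
    extern __shared__ float buf[];
    int tid = threadIdx.x;
    int i = blockIdx.x * blockDim.x + tid;

    buf[tid] = (i < n) ? in[i] : 0.0f;
    __syncthreads();

    // Tree reduction within the block.
    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride)
            buf[tid] += buf[tid + stride];
        __syncthreads();
    }

    if (tid == 0)
        out[blockIdx.x] = buf[0];
}

int main()
{
    const int n = 1 << 20, threads = 256;
    const int blocks = (n + threads - 1) / threads;

    float *h_in = new float[n];
    float *h_out = new float[blocks];
    for (int i = 0; i < n; ++i) h_in[i] = 1.0f;

    float *d_in, *d_out;
    cudaMalloc(&d_in, n * sizeof(float));
    cudaMalloc(&d_out, blocks * sizeof(float));
    cudaMemcpy(d_in, h_in, n * sizeof(float), cudaMemcpyHostToDevice);

    partial_sum<<<blocks, threads, threads * sizeof(float)>>>(d_in, d_out, n);
    cudaMemcpy(h_out, d_out, blocks * sizeof(float), cudaMemcpyDeviceToHost);

    double total = 0.0;
    for (int b = 0; b < blocks; ++b) total += h_out[b];
    printf("sum = %.0f (expected %d)\n", total, n);

    cudaFree(d_in); cudaFree(d_out);
    delete[] h_in; delete[] h_out;
    return 0;
}
```

The point being that this throughput-bound number crunching is where the GPU wins; the serial, latency-bound work stays on the CPU either way.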
 
Hamidxa said:
If Intel does manage to release a GPU+CPU on a single chip somewhere down the line that performs on par with or close enough to what NV has to offer, and if the price is right, NV = out of business.

That settles it: the future is Intel vs. Nvidia. Intel has everything they need to crush NV due to their size alone and their now-ironclad hold on the CPU market.

It would be horrible news for us if Intel came out with something amazing in the GPU sector, or came out with a CPU that can do all the graphics work without the need for an external video card.

Can you imagine what would happen to NV? Intel would be the only one making new graphics chips, prices would soar, and everyone would be screwed due to the lack of competition.

Maybe at some point down the line we'll see AMD merge with NV. NV + ATI + AMD vs. Intel, with AMD beefing up the CPU side while NV and ATI work on the GPU side.
 
G80 to G92 was NOT just a die shrink. You're obviously still a newbie. Hang in there though; one day you will really know it all... like me. But congratulations are due: you win the award for most ignorant poster on 5/3/2008 on all of Earth.
What else was it then? Looked like just a simple die shrink to me, and I am hardly a newbie. I've been into cutting-edge technology like this for about 20 years.

Edit: Never mind. I thought about it for a second and remembered that nVidia ALSO took the nerf bat to the memory bus. Aside from the die shrink, they designed cards that clock higher but have shite for memory bandwidth. Thanks for jogging my memory.
 