4870 512MB vs 9800 GX2 WOW

I'm waiting for that Level505 website to post something like how they posted how 2900XTX would crush 8800GTX.
 
I'm waiting for that Level505 website to post something like how they posted how 2900XTX would crush 8800GTX.

I remember some guy named Gibbo, who was labeled as "hardly ever wrong", also saying that he tested the HD 2900 XT and claimed that it was not only faster than the 8800 GTX, but also consumed less power, produced less heat, and had a cooler that was silent compared to the GTX's.

Here it is:
http://forums.overclockers.co.uk/showthread.php?t=17729129

I especially like how his numbers are all amazing for the HD 2900 XT, except in FEAR, which he promptly says must be a driver issue...

If this guy had credibility at the time, he sure lost it completely with this.

Wonder if he'll do the same with the HD 4800...
 
I'm waiting for that Level505 website to post something like how they posted how 2900XTX would crush 8800GTX.

I remember some guy named Gibbo, who was labeled as "hardly ever wrong", also saying that he tested the HD 2900 XT and claimed that it was not only faster than the 8800 GTX, but also consumed less power, produced less heat, and had a cooler that was silent compared to the GTX's.

Which is exactly why I said what I said on an earlier page. Everything is speculation at this point, even most canned benches :eek:

That is not how the world works, fan-girl.
It is not that nvidia does not bother; it's that nvidia has nothing to offer. That is why its stock price dropped more than 50% in 11 months. That is more than 10 BILLION dollars in market value. You fan-girls can cry and dream of a new product forever, but the investors view the world differently.

If they had nothing to offer, the GT200 wouldn't be in development. There was just a decrease in demand due to stagnation in the market. It's much more important for them to claim another win against ATI by dropping a new arch when ATI does the same thing... Maybe fiscally, maybe not, but this plan is most definitely egotistically rewarding.

Also, don't break the rules here by flaming and insulting others.
 
Which is exactly why I said what I said on an earlier page. Everything is speculation at this point, even most canned benches :eek:

Which everyone NOT a fanboy knows. And I wouldn't say "most canned benches". I would say "all benches". Until the NDA is lifted and reviews are posted, everything is just a rumor.
 
Which everyone NOT a fanboy knows. And I wouldn't say "most canned benches". I would say "all benches". Until the NDA is lifted and reviews are posted, everything is just a rumor.

Well, I'm willing to give the benefit of the doubt SOMETIMES ;)... but I agree with you for the most part.
 
Which is exactly why I said what I said on an earlier page. Everything is speculation at this point, even most canned benches :eek:



If they had nothing to offer, the GT200 wouldn't be in development. There was just a decrease in demand due to stagnation in the market. It's much more important for them to claim another win against ATI by dropping a new arch when ATI does the same thing... Maybe fiscally, maybe not, but this plan is most definitely egotistically rewarding.

Also, don't break the rules here by flaming and insulting others.

Granted, yes, nvidia can drop a bomb that can kill everything...

but what if they screw up? Has anyone thought about that? What if Nvidia pulls an R600 on us? What will we say then?

My god, people, just because nvidia is good now doesn't mean they'll be good forever. Again, it doesn't mean they won't be either, but I'm just saying.
 
Granted, yes, nvidia can drop a bomb that can kill everything...

but what if they screw up? Has anyone thought about that? What if Nvidia pulls an R600 on us? What will we say then?

My god, people, just because nvidia is good now doesn't mean they'll be good forever. Again, it doesn't mean they won't be either, but I'm just saying.

I point you to the Analyst Day and Jen-Hsun Huang's remarks. He wasn't just attacking Intel for the sake of defending their "GPU territory". He was extremely confident, and it seems to me that by then they had already tested their GT200 prototypes and it must be extremely powerful, or they wouldn't be saying what they did. That said, I won't be surprised to see a GT200-powered card be more than twice as fast as an 8800 GTX.
 
I point you to the Analyst Day and Jen-Hsun Huang's remarks. He wasn't just attacking Intel for the sake of defending their "GPU territory". He was extremely confident, and it seems to me that by then they had already tested their GT200 prototypes and it must be extremely powerful, or they wouldn't be saying what they did. That said, I won't be surprised to see a GT200-powered card be more than twice as fast as an 8800 GTX.



We're talking about the same people that forced the makers of Assassin's Creed to take DirectX 10.1 out of it because nvidia cards can't render it... thus ATi cards do better, and the @$$ holes over there can't handle that.
 
Granted, yes, nvidia can drop a bomb that can kill everything...

but what if they screw up? Has anyone thought about that? What if Nvidia pulls an R600 on us? What will we say then?

My god, people, just because nvidia is good now doesn't mean they'll be good forever. Again, it doesn't mean they won't be either, but I'm just saying.

Well, of course nVidia has the potential to stuff this new card up... and I'll be the first to speak out against them if they do. But ATI looks like it really has something all of a sudden... nVidia can't afford to fuck up.
 
My Radeon 9800 Pro kicked ass for years. I'm not partial to any manufacturer; whoever makes the best bang-for-the-buck product is who I'm going with.
 
Nvidia can fuck up and ATi can fuck up.
Personally, I think the FX generation was a bigger flop than the R600 generation.
At least the R600 can render the same images.
Moving on, I do think the HD4K will be a beast, but I expect that because it's a new generation card and it should be a beast. The GT200 is something I can't even speculate on, since I have even less semi-factual/fictional data (ala RV770). It should be a beast card too, since it's new.
But here's the thing: these benches are BS because of the lack of evidence. Sure, that COULD happen, but we don't know if it will, and like everyone has said, a graph is easy to create. I'm staying out of this until the cards are out.
 
We're talking about the same people that forced the makers of Assassin's Creed to take DirectX 10.1 out of it because nvidia cards can't render it... thus ATi cards do better, and the @$$ holes over there can't handle that.

LOL, right...:rolleyes: Conspiracy theorists unite!!!
 
LOL, right...:rolleyes: Conspiracy theorists unite!!!

Wow... I totally missed that little gem.

yeah, DX10.1 really brought a slew of new features to the table... So sad that the boys in black made it disappear :(
 
Something tells me ATi is going to strike back again.

I'd really like to see some more competition in the market.

Now I wonder if this can affect gas prices. JK
 
I point you to the Analyst Day and Jen-Hsun Huang's remarks. He wasn't just attacking Intel for the sake of defending their "GPU territory". He was extremely confident, and it seems to me that by then they had already tested their GT200 prototypes and it must be extremely powerful, or they wouldn't be saying what they did. That said, I won't be surprised to see a GT200-powered card be more than twice as fast as an 8800 GTX.

That doesn't make any sense.

Larrabee is still 1-2 years away from being launched as a viable contender.
GT200, on the other hand, is meant to compete with whatever AMD is launching right around the corner.

GT200 is not some sort of wild card that NV can use as leverage against Intel.
GT200 is meant as an immediate response to AMD.

The subsequent chip (which is undoubtedly already in R&D) is what will be used to compete against Intel and whatever AMD will have up their sleeves post-R700.


Oh, and in case some may have already forgotten Nvidia's past inadequacies... the FX 5000 series cards, anyone? ;)
 
That doesn't make any sense.

Larrabee is still 1-2 years away from being launched as a viable contender.
GT200, on the other hand, is meant to compete with whatever AMD is launching right around the corner.

GT200 is not some sort of wild card that NV can use as leverage against Intel.
GT200 is meant as an immediate response to AMD.

The subsequent chip (which is undoubtedly already in R&D) is what will be used to compete against Intel and whatever AMD will have up their sleeves post-R700.


Oh, and in case some may have already forgotten Nvidia's past inadequacies... the FX 5000 series cards, anyone? ;)

You need to read the context better, instead of replying immediately...

Obviously the first GT200 incarnations are meant to compete with whatever AMD/ATI releases, but the point was that NVIDIA, represented by its CEO, was very confident in both the present and the future (which may include Larrabee), and I don't think that's just because they feel threatened by it, but also because I'm guessing they had already tested GT200 prototypes and are ready for anything from any competitor. I don't think they would be as confident as they were during Analyst Day without being almost sure that what they have now will dominate the competition, wherever it may come from.
 
You need to read the context better, instead of replying immediately...

Obviously the first GT200 incarnations are meant to compete with whatever AMD/ATI releases, but the point was that NVIDIA, represented by its CEO, was very confident in both the present and the future (which may include Larrabee), and I don't think that's just because they feel threatened by it, but also because I'm guessing they had already tested GT200 prototypes and are ready for anything from any competitor. I don't think they would be as confident as they were during Analyst Day without being almost sure that what they have now will dominate the competition, wherever it may come from.

Because everybody can read minds, AND predict the future!
 
That doesn't make any sense.

Larrabee is still 1-2 years away from being launched as a viable contender.
GT200, on the other hand, is meant to compete with whatever AMD is launching right around the corner.

GT200 is not some sort of wild card that NV can use as leverage against Intel.
GT200 is meant as an immediate response to AMD.

The subsequent chip (which is undoubtedly already in R&D) is what will be used to compete against Intel and whatever AMD will have up their sleeves post-R700.


Oh, and in case some may have already forgotten Nvidia's past inadequacies... the FX 5000 series cards, anyone? ;)


Yes, I was going to ask: what did that post have to do with anything?! Intel and ATI are 2 totally different companies.

Now, back to the 5000 series...

Yeah, I had an FX5200. Slowest card ever. It could barely run The Sims 2. I remember when my Nvidia PCX 5750 (basically an Nvidia 5700 made for PCI Express, one of the first cards out) blew out, started artifacting, and then Windows wouldn't load again. For the lord's sake, the card was so terrible. The only game that it could run without stuttering was Freelancer, and that was the game that it actually died on... mad weird.
I think at the time Tiger Direct was asking like $150 for an FX5200 Ultra or maybe an FX5500 or something, and I almost bought one, then decided to go for an X850 Pro, which actually still works to this day, despite being 4 years old-ish.

Thank GOD that I did, because that woulda been horrible.
 
Yes, I was going to ask: what did that post have to do with anything?! Intel and ATI are 2 totally different companies.

Now, back to the 5000 series...

Yeah, I had an FX5200. Slowest card ever. It could barely run The Sims 2. I remember when my Nvidia PCX 5750 (basically an Nvidia 5700 made for PCI Express, one of the first cards out) blew out, started artifacting, and then Windows wouldn't load again. For the lord's sake, the card was so terrible. The only game that it could run without stuttering was Freelancer, and that was the game that it actually died on... mad weird.
I think at the time Tiger Direct was asking like $150 for an FX5200 Ultra or maybe an FX5500 or something, and I almost bought one, then decided to go for an X850 Pro, which actually still works to this day, despite being 4 years old-ish.

Thank GOD that I did, because that woulda been horrible.

I picked up a 5900xt for $120 when the 9800pro was still $200. I thought that was an awesome deal, especially considering the huge price gap. I know everyone likes to disparage the 5xxx series, and I agree they were not the best years competitively for NVIDIA, but I can say I had a good experience :)
 
Yeah, you really can't fault the website for comparing it to the 9800GX2; as mentioned, it is the fastest card out there, and more importantly, it is the fastest card from the competing company.

Under your logic, what should they compare it against? The 9600GT? That makes less sense. You compare to the current model that is setting the bar for performance.
 
You need to read the context better, instead of replying immediately...

Obviously the first GT200 incarnations are meant to compete with whatever AMD/ATI releases, but the point was that NVIDIA, represented by its CEO, was very confident in both the present and the future (which may include Larrabee), and I don't think that's just because they feel threatened by it, but also because I'm guessing they had already tested GT200 prototypes and are ready for anything from any competitor. I don't think they would be as confident as they were during Analyst Day without being almost sure that what they have now will dominate the competition, wherever it may come from.

I still firmly believe that you are in error.

These companies do NOT work on singular technologies at any given time.

You can bet that Nvidia currently have anywhere from 2-3 generations of hardware that they are at the very least researching if not already trying to develop prototypes of.
*Note: I did not say revisions or iterations, but rather entirely new generations of hardware.

In fact, virtually all of the major players out there, be it AMD, IBM, Intel, Nvidia, Samsung, TI, Motorola, etc., have multiple development teams that are researching technologies that are not meant for immediate release, but rather for years down the line.

You can rest assured that Nvidia have such project teams as well.
 
I still firmly believe that you are in error.

These companies do NOT work on singular technologies at any given time.

You can bet that Nvidia currently have anywhere from 2-3 generations of hardware that they are at the very least researching if not already trying to develop prototypes of.
*Note: I did not say revisions or iterations, but rather entirely new generations of hardware.

In fact, virtually all of the major players out there, be it AMD, IBM, Intel, Nvidia, Samsung, TI, Motorola, etc., have multiple development teams that are researching technologies that are not meant for immediate release, but rather for years down the line.

You can rest assured that Nvidia have such project teams as well.
Exactly. The GT200 is simply what nVidia is readying for release now. Common sense should tell you that nVidia as well as AMD/ATi are already preparing future technologies that are significantly faster than what we have now and what is just around the corner. They have to be 1-2 generations ahead to stay competitive. It's not like they release a new GPU and say "Okay now we can start working on the next GPU," they are constantly researching new technologies and probably have new stuff ready to tape out at any given time.
 
I hate how fanboys on both sides say the opponent has nothing up their sleeves. Nvidia learned from their mistakes in the FX days; you can sure as hell bet on them having something competitive, or perhaps even something on an "amazing" level, to fight back should the 48xx series prove to be revolutionary. The best thing to do is not to take sides, but to hope both sides come out with great cards; this way we can keep prices down yet take home more performance for the money.
 
I still firmly believe that you are in error.

These companies do NOT work on singular technologies at any given time.

You can bet that Nvidia currently have anywhere from 2-3 generations of hardware that they are at the very least researching if not already trying to develop prototypes of.
*Note: I did not say revisions or iterations, but rather entirely new generations of hardware.

In fact, virtually all of the major players out there, be it AMD, IBM, Intel, Nvidia, Samsung, TI, Motorola, etc., have multiple development teams that are researching technologies that are not meant for immediate release, but rather for years down the line.

You can rest assured that Nvidia have such project teams as well.

And what does that have to do with what I said?!...

I said that NVIDIA seemed way too confident at Analyst Day, which tells me that they had already tested GT200 prototypes and were quite pleased with the results. That led them to say quite bold things in that conference regarding current and future competitors, which I guess they wouldn't have, if GT200 wasn't so good.

What is so hard for you to understand in this simple guess of mine, and why is it that you think it's wrong?

It seems to me you still didn't understand what I said...
 
And what does that have to do with what I said?!...

I said that NVIDIA seemed way too confident at Analyst Day, which tells me that they had already tested GT200 prototypes and were quite pleased with the results. That led them to say quite bold things in that conference regarding current and future competitors, which I guess they wouldn't have, if GT200 wasn't so good.

What is so hard for you to understand in this simple guess of mine, and why is it that you think it's wrong?

It seems to me you still didn't understand what I said...

You're the one who apparently fails to understand what I, and others who have echoed my sentiments, are trying to convey to you.

The GT200 is almost assuredly not the source of NV's certitude with respect to Intel. Think about it. Rumors have it that a 9900GTX will likely be a bit slower than a 9800GX2. Chips based on this technology will only be slightly faster, with minimal performance boosts probably also provided via drivers, but nothing significantly faster than this core product.

Intel's Larrabee, on the other hand, is still so far out (relatively speaking, with the presumed release of GT200 being around the corner) that by then, NV will almost certainly have an entirely new generation of chips ready for development which are meant as Larrabee's competition.

NV, as I noted, like most tech companies, have multiple project teams that are focusing on multiple products. I would bet the farm that the GT200 project team is not the source of NV's confident denunciation of Intel, but rather some other design that they are currently researching/working on.

Common sense would dictate that unless the GT200 is a design so unparalleled and advanced (that it would help to propel them well into the next decade and tackle Larrabee when it too eventually comes out), then there must be something else up NV's sleeve, and there always is.

I mean, seriously, don't you think it's a bit presumptuous, if not outright outlandish and impetuous, to believe that a chip designed for 2008 will be able to exchange blows with one that is being designed to take over the market in 2010?

Why is that so difficult for you to digest? :confused:
 
You're the one who apparently fails to understand what I, and others who have echoed my sentiments, are trying to convey to you.

The GT200 is almost assuredly not the source of NV's certitude with respect to Intel.

Intel's Larrabee is still so far out (relatively speaking, with the presumed release of GT200 being around the corner) that by then, NV will almost certainly have an entirely new generation of chips ready for development which are meant as Larrabee's competition.

NV, as I noted, like most tech companies, have multiple project teams that are focusing on multiple products. I would bet the farm that the GT200 project team is not the source of their confidence, but rather some other design that they are working on.

Common sense would dictate otherwise, unless the GT200 is a design so unparalleled and advanced that it will help to propel them well into the next decade and tackle Larrabee when it too eventually comes out.

I think that supposition, however, is a bit outlandish, don't you?

No, I don't, otherwise I wouldn't have said it.

NVIDIA's confidence at Analyst Day was most definitely related to what they have now, and as I've said many other times in this thread, even if you didn't understand it, NVIDIA's attacks and bold statements were not just directed at future competitors (Intel), but also current ones (AMD/ATI). And that has everything to do with GT200. I wouldn't be surprised if GT200 chips are equipped with some provisions that basically make them not depend on a good CPU so much. Curiously, this is also something that Scott Wasson from The Tech Report said in one of their podcasts, and I agree. It's just a guess, but it's something that makes sense, given where the market is headed.

I never argued with a company's schedule, which obviously contains lots of effort in terms of R&D. Efforts that will only be seen in a couple of years or more. This is something we, Video Card forum lurkers, have known for a very long time. You are just stating the obvious. But GT200 will power NVIDIA's graphics cards for about 1-2 years, and if the provisions I'm guessing exist in GT200 are true, then that technology will be used further down the line. Much like G80's tech was used for GT200 development, and that is certainly a weapon for NVIDIA, for the present and future.

You seem to be fixed on the idea that NVIDIA's stance at the Analyst Day was only targeted at Intel, but it wasn't. Most of their bold statements were in fact directed at Intel, but not exclusively. Maybe you should read or listen to that conference, since you surely didn't thus far.
 
No, I don't, otherwise I wouldn't have said it.

NVIDIA's confidence at Analyst Day was most definitely related to what they have now, and as I've said many other times in this thread, even if you didn't understand it, NVIDIA's attacks and bold statements were not just directed at future competitors (Intel), but also current ones (AMD/ATI). And that has everything to do with GT200. I wouldn't be surprised if GT200 chips are equipped with some provisions that basically make them not depend on a good CPU so much. Curiously, this is also something that Scott Wasson from The Tech Report said in one of their podcasts, and I agree. It's just a guess, but it's something that makes sense, given where the market is headed.

I never argued with a company's schedule, which obviously contains lots of effort in terms of R&D. Efforts that will only be seen in a couple of years or more. But GT200 will power NVIDIA's graphics cards for about 1-2 years, and if the provisions I'm guessing exist in GT200 are true, then that technology will be used further down the line. Much like G80's tech was used for GT200 development, and that is certainly a weapon for NVIDIA, for the present and future.

You seem to be fixed on the idea that NVIDIA's stance at the Analyst Day was only targeted at Intel, but it wasn't. Most of their bold statements were in fact directed at Intel, but not exclusively. Maybe you should read or listen to that conference, since you surely didn't thus far.

Now you are corroborating exactly what I have been saying all along, but apparently you are too parochial to see it for yourself, so I have highlighted it for you.


A) Either you or NV would have to be off your respective rockers to assume that the GT200 is a technology which can be used to combat a mostly esoteric one that is at the very least ~1 year away from being unleashed.

It would be foolish and impetuous to denounce a product which has the potential to easily eclipse anything NV currently have, especially in light of the fact that abundant rumors point to a 9900GTX that is not even faster than a 9800 GX2. :rolleyes:

We know that some VIP customers have seen and tried GT200. We also know that the card is fast and it is definitely faster than Geforce 9800GTX but we are not sure if it can beat Geforce 9800 GX2.
http://www.fudzilla.com/index.php?option=com_content&task=view&id=7273&Itemid=1

If that turns out to be the case, then where do you even get off thinking that NV was referring to the GT200 when it can presumably, at best, only go toe-to-toe against one of their own last-gen products? :confused:

You surely can't be that parochial, can you? :confused:

Besides, is this the card that is meant to render CPUs obsolete?
Hell, it can't even render NV's own last-gen product obsolete, if speculations hold true. :p
Maybe they should focus on truly eclipsing their own last gen first, before they direct their antipathy and hubris towards CPUs, let alone GPUs that are > 1 year from being released.
And maybe you should hold off on assuming as much as well. ;)


B)
Your argument about NV's denunciations not being targeted solely at Intel is a bit of a non sequitur.

For one, how many CPU manufacturers are viable threats to NV?
I can only think of 2: Intel and AMD.

Even if NV directed the statement that CPUs will be rendered obsolete at both Intel and AMD, who would it hurt more?
Simple: Intel.

AMD have their GPU division to fall back on.

Intel, on the other hand, is currently known almost exclusively for their CPUs (and if you really want to be astute about it, then sure, chipsets as well, but the same goes for AMD).

That jab was directed at Intel, however, because Intel had recently been making statements along the lines of CPUs being able to replace GPUs down the road.

Guess who would be run out of business?
AMD? Nope. They also manufacture CPUs and have licensing rights to much of the same x86-based tech that Intel do.

NV, on the other hand, would be screwed.
Not only do they trail Intel anyway in the integrated graphics sector and the graphics adapter market overall, but 81.5% of their annual revenue comes directly from their GPU business (which also consists of their PSB and CPB businesses). Their MCP business, on the other hand, is responsible for only 18.5% of their annual revenue.

http://www.xbitlabs.com/news/video/..._Back_Market_Share_from_Intel_Nvidia_JPR.html

http://www.secinfo.com/dvT9t.t4.htm#14yv

In other words, if NV's GPUs are rendered obsolete, then the amount of fixed costs and overhead will bury the company almost overnight, and NV will not be able to liquidate their assets fast enough.

AMD would survive. NV would not.
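
To put rough numbers on that exposure, here is a minimal back-of-the-envelope sketch (Python; the 81.5%/18.5% split is taken from the figures above, while the total revenue value is my own ballpark assumption, used purely for illustration):

    # Rough sketch of NV's revenue exposure, using the split cited above.
    total_revenue = 4.0e9   # ASSUMED annual revenue in dollars (ballpark, not from the 10-K excerpt)
    gpu_share = 0.815       # GPU business share of revenue (figure from the post above)
    mcp_share = 0.185       # MCP business share of revenue (figure from the post above)
    print(f"GPU-dependent revenue: ${total_revenue * gpu_share / 1e9:.2f}B")  # ~$3.26B at risk
    print(f"MCP revenue remaining: ${total_revenue * mcp_share / 1e9:.2f}B")  # ~$0.74B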

Why do I point that out? When Intel announced that, sometime down the not-so-distant line, their multi-chip solutions (CPU+GPU on a chip) would render discrete graphics adapters obsolete, who do you think panicked and decided to make a PR statement of their own?

AMD? Nope.
They have access to many of the same licensing rights and technologies that Intel have at their disposal when it comes to manufacturing CPUs and GPUs.

NV, however, do NOT.


This is really common sense stuff that no one should be disputing, and that I should not have to be spelling out.
 
Also, to further drive home the point that I was previously making:

Sustain Technology and Product Leadership in 3D Graphics and HD Video, and Media Communications and Ultra-Low Power.

We are focused on using our advanced engineering capabilities to accelerate the quality and performance of 3D graphics, HD video, media communications and ultra-low power processing in PCs and handheld devices. A fundamental aspect of our strategy is to actively recruit the best 3D graphics and HD video, networking and communications engineers in the industry, and we believe that we have assembled an exceptionally experienced and talented engineering team. Our research and development strategy is to focus on concurrently developing multiple generations of GPUs, including GPUs for high-performance computing, MCPs and mobile and consumer products that support PMPs, PDAs, cellular phones and other handheld devices using independent design teams. As we have in the past, we intend to use this strategy to achieve new levels of graphics, networking and communications features and performance and ultra-low power designs, enabling our customers to achieve superior performance in their products.
Taken from NV's 10K report.
http://www.secinfo.com/dvT9t.t4.htm#14yv


OK, that's pretty obvious, but if you now direct your attention to page 47 of their annual 10-K report, you'll see something rather interesting.

R&D (which is a separate expense that is not a part of COGS) accounts for approximately 17% of their annual revenue.

In other words, they spend roughly $700 Million per year on the aforementioned as they "concurrently develop multiple generations of GPUs".
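
For what it's worth, those two figures together imply total annual revenue in the ~$4 billion range. A minimal sanity-check sketch (Python; both input values are simply the figures quoted above, assumed accurate as quoted):

    # Sanity check: implied annual revenue from the 10-K figures quoted above.
    rd_spend = 700e6   # ~$700M annual R&D spend (figure quoted above)
    rd_share = 0.17    # R&D as ~17% of annual revenue (figure quoted above)
    implied_revenue = rd_spend / rd_share
    print(f"Implied annual revenue: ${implied_revenue / 1e9:.1f}B")  # prints ~$4.1B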

Assuming that the GT200 is the product which can render CPUs obsolete and knock out Intel's Larrabee before it even comes out is downright foolish and impetuous.

Read this article, which too echoes many of my sentiments on this topic:

Nvidia does battle with Intel, Moore's Law

The question is, who is going to be the largest provider of that differentiation and what form will it take? The pressure on Nvidia--expressed by Huang on Thursday at an analyst meeting--is understandable, as the company seeks to fend off both Intel and AMD, who are increasingly focused on graphics, said McGregor. "Nvidia faces serious challenges. One of their big customers (AMD) went out and acquired a competitor (ATI) and then (you have) Intel saying we're going into your territory." That has put Nvidia on edge. Intel, not surprisingly, is the biggest threat.
 
If you think Intel's Larrabee is any threat to NVIDIA in the high-end, you need to think again. I even have doubts it will compete with NVIDIA's or AMD's mid-range by the time it's out.

Also, are you seriously basing your comments on the 9900 GTX on a fudzilla report?

Rumored specs are out (some of them provided by fudzilla itself) and they clearly put the GT200-based card, i.e. the 9900 GTX, at a level beyond the 9800 GX2. Obviously these are just rumored specs, and as we've learned, specs don't always mean much, but that's the only thing we have to speculate about. That, and what NVIDIA's CEO said at the Analyst Day; they wouldn't be so confident if they didn't have something powerful in their hands.

Also, NVIDIA's CEO recently confirmed that they have no plans to buy VIA. I think that's further indication that, for the next 1-2 years, NVIDIA has something else up its sleeve, in the form of GT200 and its derivatives.
 
If you think Intel's Larrabee is any threat to NVIDIA in the high-end, you need to think again. I even have doubts it will compete with NVIDIA's or AMD's mid-range by the time it's out.

Also, are you seriously basing your comments on the 9900 GTX on a fudzilla report?

Rumored specs are out (some of them provided by fudzilla itself) and they clearly put the GT200-based card, i.e. the 9900 GTX, at a level beyond the 9800 GX2. Obviously these are just rumored specs, and as we've learned, specs don't always mean much, but that's the only thing we have to speculate about. That, and what NVIDIA's CEO said at the Analyst Day; they wouldn't be so confident if they didn't have something powerful in their hands.

Also, NVIDIA's CEO recently confirmed that they have no plans to buy VIA. I think that's further indication that, for the next 1-2 years, NVIDIA has something else up its sleeve, in the form of GT200 and its derivatives.



What I am calling foolish and impetuous is the belief that the GT200 is somehow the product that will render CPUs obsolete.

Larrabee is an altogether different beast that NV has to tackle as well, and even then, it won't be the "GT200" doing it, but almost assuredly a different chip.
It's also a bit of a tautology to argue that the subsequent chip will be a derivative of sorts of the GT200.
Well, no kidding; the Q6600 is a distant derivative of the Intel 486 DX2/66.

However, do you have any idea how many changes had to be made to that chip?
Do you have any idea how many billions of dollars of R&D were invested into making those changes?

NV is doing the same each and every year (well, OK, less in the last 18 months than before, but nevertheless).

NV were essentially arguing a few weeks back that CPUs would be rendered obsolete (targeting Intel for all intents and purposes), and you were arguing that the source of their new-found confidence is the GT200.

That is downright absurd.


How do you propose that the GT200 is going to render CPUs obsolete?
 
Assuming that the GT200 is the product which can render CPUs obsolete and knock out Intel's Larrabee before it even comes out is downright foolish and impetuous.

I don't know if you're doing it on purpose, but you sure seem like it. NEVER have I said that GT200 will render CPUs obsolete, but I did make a guess, based on NVIDIA's comments during Analyst Day, that it wouldn't surprise me to see GT200 having some provisions that make it depend less on a good CPU.

And again, if you think Larrabee is going to compete with AMD, and especially NVIDIA, in high-end GPUs, think again. Intel is targeting the mid-low range, where the big bucks are, and Intel knows it can't fight with the years of experience of both ATI and NVIDIA in the GPU arena. Thinking otherwise is indeed foolish and impetuous.
 
I don't know if you're doing it on purpose, but you sure seem like it. NEVER have I said that GT200 will render CPUs obsolete, but I did make a guess, based on NVIDIA's comments during Analyst Day, that it wouldn't surprise me to see GT200 having some provisions that make it depend less on a good CPU.

And again, if you think Larrabee is going to compete with AMD, and especially NVIDIA, in high-end GPUs, think again. Intel is targeting the mid-low range, where the big bucks are, and Intel knows it can't fight with the years of experience of both ATI and NVIDIA in the GPU arena. Thinking otherwise is indeed foolish and impetuous.

Intel already destroys AMD, and even beats NV, in the low-end graphics adapter sector.
It has been that way for ages now.

In case you forgot, NV trails Intel in market share and revenue when it comes to GPUs as a whole, and not just in the low-end sector; even if you took the sum of all of NV's GPU businesses together, it still would not match Intel's market share or revenues.

Intel, however, are marketing Larrabee as more of a mid-high end part; it's their discrete offering, after all. Why create a card that would only compete against their existing products in the low-end sector, and thus drive up their costs overall if they are to abide by a higher standard and a more expensive low-end product design? :confused:

They would be shooting themselves in the foot if they did that.
It simply doesn't make any sort of marketing or economic sense.



Secondly, you are still entirely confused about this entire exchange between Intel and NV.

Since you're out of the loop, let me clue you in.

It all started last month on the 10th, when NV CEO Huang SPECIFICALLY attacked Intel, stating: "We're going to open a can of whoop-ass" on Intel.
NVIDIA boss says "We're going to open a can of whoop-ass" on Intel


This was followed one day later by an email circulating around the NV offices from one NV VP, Roy Taylor, who went on to declare the Intel CPU "dead." Roy says Intel is "panicking" because CPUs have "run out of steam," and that they "no longer make anything run faster."
NVIDIA VP joins the smack-talk fun, says the Intel CPU is "dead"


These were 2 specific attacks against Intel once again.
So where do you get off asserting otherwise, that they were not targeting Intel, as you stated in your previous post? :rolleyes::rolleyes:

You then go on to assume that NV's new-found confidence stems from a GT200 which thus far is speculated to run even slower than a 9800 GX2 -- which is nothing to be ashamed of, but is far from something to be so proud of that you can declare CPUs dead and Larrabee unworthy.

Give me a break.
I'm through with this silly argument. :eek:
 
These were 2 specific attacks against Intel once again.
So where do you get off asserting otherwise, that they were not targeting Intel, as you stated in your previous post? :rolleyes::rolleyes:

Do you suffer from poor reading comprehension skills?

Here's a quote from one of my previous posts:

Silus said:
You seem to be fixed on the idea that NVIDIA's stance at the Analyst Day was only targeted at Intel, but it wasn't. Most of their bold statements were in fact directed at Intel, but not exclusively. Maybe you should read or listen to that conference, since you surely didn't thus far.

It's very clear what I said. Most of NVIDIA's bold statements were in fact directed at Intel, but not exclusively. If you have problems understanding that, I really have nothing else to say, except... go improve your reading skills?

Hamidxa said:
You then go on to assume that NV's new-found confidence stems from a GT200 which thus far is speculated to run even slower than a 9800 GX2 -- which is nothing to be ashamed of, but is far from something to be so proud of that you can declare CPUs dead and Larrabee unworthy.

And you keep insisting on something I never said, which is just shy of troll behavior at this point...

I'll say it again: based on the confidence of NVIDIA's CEO at Analyst Day, I wouldn't be surprised to see GT200 and its derivatives be less and less dependent on a good CPU, as most current GPUs are. That means that, if these provisions are in place, having a powerful (and expensive) CPU coupled with a GT200 (or derivative) based card could yield very similar results to a mid-low end CPU coupled with a GT200 (or derivative) based card. NEVER have I said that GT200 and its derivatives will render CPUs obsolete. That's just something you made up and accused me of saying, for whatever delusional reason...

As for a GT200-based card being slower than a GX2, that's a fudzilla report, which is substantiated by nothing more than their guesses. We'll have to wait and see on that one.

Hamidxa said:
Give me a break.
I'm through with this silly argument. :eek:

I'm in agreement with that. Trying to maintain a coherent discussion with you is quite impossible, since you keep accusing people of saying things they didn't. Good luck with that.
 