Nvidia kills GTX285, GTX275, GTX260, abandons the mid and high end market

It's all in the die sizes. Just look at the similarities between this battle and the Intel/AMD battle. Who is winning that one? Intel. Why? Smaller die sizes = more CPUs per wafer = more CPUs for the consumer. Who is seemingly winning the ATI/Nvidia battle? ATI at the moment. Why? Again, because of its ability to shrink its GPU and produce more GPUs per wafer.
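For anyone curious, the die-size argument above is just geometry plus yield. Here's a rough back-of-the-envelope sketch; the die areas and defect density are made-up illustrative numbers, not actual TSMC or Nvidia figures:

```python
import math

# Rough dies-per-wafer estimate: how many square dice of a given area
# fit on a 300 mm wafer (minus edge loss), then scaled by a simple
# Poisson yield model. Smaller dice win on both counts.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation: wafer area / die area, minus edge losses."""
    die_side = math.sqrt(die_area_mm2)
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / (2 * die_side))

def poisson_yield(die_area_mm2, defects_per_cm2=0.4):
    """Fraction of dice with zero random defects (Poisson model)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for area in (256, 334, 480):   # e.g. small, mid, large GPU dice
    n = dies_per_wafer(area)
    good = int(n * poisson_yield(area))
    print(f"{area} mm^2: {n} candidates, ~{good} good dice per wafer")
```

The big die doesn't just give you fewer candidates per wafer; each candidate is also more likely to catch a defect, so the good-die count falls off twice as hard.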

Nvidia needs to step up its engineering a little bit and get with the program. Before we know it (granted, it will be at least 5 years), CPUs and GPUs will be using carbon nanotubes as transistors, and the resulting die sizes will be so small that the number of processors per wafer will be through the roof.

Charlie has the basics right, and while he is embellishing a lot (for god's sake, he worked for the UK Inquirer, it comes with the territory), there is quite a bit of truth to his article. Unless Nvidia starts shrinking their GPUs and coming out with logic that will enable this, I don't see them lasting all that long, especially with the upcoming introduction of Larrabee-based GPUs from Intel, who are already quite a few steps ahead of Nvidia with their advanced logic and small die sizes.
 
Good lord, is this thread still going? :rolleyes: Whatever happened to responsible journalism? All this furor over a third-rate hack who couldn't buy space on any respectable site. The worst part is no one seems to remember what happens to the consumer's wallet when a company has no competition for their product. If Nvidia folded tomorrow, ATI would very quickly have a change of heart about their prices. And Charlie would be out of a job. Which might be worth it to see!
 
really guys? nvidia is not going to fall apart completely if it misses a product cycle. or two. hell, or maybe even three. ati was trash at one point, nvidia was trash at one point, intel was trash at one point, and so was amd. i can still remember when AMD was beating intel up across the board in high-end procs; sure doesn't seem that way now.

point is, these are all big companies, with big people invested in their success. luckily, the real world doesn't move at internet speed.
 
Good lord, is this thread still going? :rolleyes: Whatever happened to responsible journalism? All this furor over a third-rate hack who couldn't buy space on any respectable site. The worst part is no one seems to remember what happens to the consumer's wallet when a company has no competition for their product. If Nvidia folded tomorrow, ATI would very quickly have a change of heart about their prices. And Charlie would be out of a job. Which might be worth it to see!

You're missing the moral of the story: this is all about greed, widespread mismanagement, and complete & utter disdain for your loyal customer base (YOU!). And don't you worry your little green heart, Intel will give AMD all they can handle on the GPU front in the next 12-18 months.
 
Like almost any lie, there is a certain amount of truth to what he says. We all know the GeForce GTX 200 series will be discontinued; it's just a question of time. So he's right about that much. As for the reasons why, timing, etc., he's full of it. When he said that all G92 chips were bad and NVIDIA had production problems relating to them, he was again partially right, in that they did have a high defect rate, especially mobile versions of the chip. However, this problem didn't translate to every G92 GPU ever made. When he said that Fermi suffered from poor yields, he was telling the truth. It wasn't quite as bad as he made it out to be, but it was pretty bad nonetheless.

This is just more quarter-to-half truths, which is par for the course with him.

Exactly. Which is why his site's name is fitting ;)
 
You could & they're not (AMD innocent), but Nvidia made FUD & deception a corporate directive. Most if not all of AMD's shenanigans were reactionary to NV's filth tactics, & I'm sure they figured tagging along was the only way to level the playing field until people got clued in to the crap they were being fed. This thread is living proof that many didn't & continue to exhibit lemming-like denial & pack-rat mentality when their lone symbol of self-worth & belonging (a freaking piece of silicon) is threatened. Kind of sad & pathetic. Ooooh Charlie is the DEVIL DUDE, he just smacked me in my E-peepee!
Looks like he's battin' about 1000 on Nvidia prognostication lately.
Welcome to [H] forums! :) Green lemmings on display. We must edumacate the occupants. I actually go to vid forums to make time pass by and laugh.
 
You're missing the moral of the story: this is all about greed, widespread mismanagement, and complete & utter disdain for your loyal customer base (YOU!). And don't you worry your little green heart, Intel will give AMD all they can handle on the GPU front in the next 12-18 months.

Why do people automatically assume that if one disdains the tactics of a hack like Charlie, they're pro-Nvidia? I base my decisions on whatever card best suits my needs at the time. As for Intel, I'll believe it when I see it. I certainly don't think their tactics would be a big improvement over Nvidia's should they gain control over the graphics market.
 
Why do people automatically assume that if one disdains the tactics of a hack like Charlie, they're pro-Nvidia?

Because they are the same as Charlie and support the same thing he does: i.e. just plain AMD fanboys or simply anti-NVIDIA.

Just check my sig for the latest greatest of one of them :)

It's quite irrelevant to these people whether what Charlie writes is true or false, as long as it's supporting their color and bashing the color they don't like. I would like to see an NVIDIA fan with the same mouth-foaming and insane nature write in a blog or website about everything AMD does, twisted to look bad, so that we could see the same "double standards" I tried to show not too long ago in another thread. All of a sudden, that "blogger" would be a target for the same people that somewhat protect Charlie now :)

Hilarious isn't it ?
 
15 pages of denial and hilarity. I concur, my friend. Hope you make all your rounds of Nvidia defending today and over the weekend.
 
Correct me if I am wrong, but what he is saying is that nVIDIA is not gonna make any high-end cards like the GeForce GTX 295? I only buy high-end stuff, nothing less, ever.
 
All I know is I don't want Nvidia to drop out. As much as I don't like them, and won't buy their hardware, a monopoly shall not happen.
 
...Wow, what an extremely biased website... Did anyone else look at all of the other articles? All about Nvidia being bad.

I highly doubt Nvidia is going under.

Where does this place get its information?

...word reached us that they are dead...

Word from sources deep in the bowels of 2701 San Tomas Expressway

WTF? Yeah, OK. Great way to BS...
 
I have bookmarked this thread for a later visit, a.k.a. for fun when NVIDIA's next gen launches... some posts are just too good to forget ;)
 
I have bookmarked this thread for a later visit, a.k.a. for fun when NVIDIA's next gen launches... some posts are just too good to forget ;)

I agree; in 4 months this thread will be buried pretty deep, so bookmarking it is a great idea.
 
nVidia's abandoning the high-end market? I don't think so.

Coming up next: AMD/ATI give away their latest CPUs and GPUs, designed by engineers with a holy writ, as they no longer need money and are sustained by the love of their fanbois.

This explains why I originally stopped reading the INQ. This also explains why I'll never bother to read his opinions without laughing. This guy just writes whatever crap he can come up with to grab one fanboi audience and push up numbers at his site. That's not the news, and it's not even semi-accurate.
 
Jesus Christ, some of you people need to work on your reading comprehension (or actually READ the article instead of just commenting on the headline). It's not about Nvidia completely leaving the high-end. It's about them stopping production of their CURRENT high-end products.

Obviously Charlie is heavily anti-Nvidia biased, but this one actually has a ring of truth to it. NV's high end parts are expensive to make, and I have no trouble believing that they can't be sold at a profit if they cut prices to compete with the Radeon 5000 series.

To everybody saying "OMG LOL Nvidia won't do that!", let me ask you a question. Let's assume that their high-end parts cannot be sold at a profit if they cut prices to the point where they're competitive in performance/$ with ATI. If that's the case, what do YOU think Nvidia should do? Should they continue to build more GTX 285/275/260s and sell them at a loss just to maintain market share?
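To put the "can't be sold at a profit" argument in numbers, here's an illustrative cost-per-good-die sketch. Every figure here (wafer cost, candidate counts, yields) is a made-up round number for illustration, not actual TSMC or Nvidia data:

```python
# Cost per working chip = wafer cost spread over the dice that actually work.
# A big die gets hit twice: fewer candidates per wafer AND lower yield.

WAFER_COST = 5000.0  # hypothetical cost of one processed wafer, USD

def cost_per_good_die(candidates, yield_fraction):
    """Wafer cost divided by the number of good dice it produces."""
    return WAFER_COST / (candidates * yield_fraction)

big_die = cost_per_good_die(104, 0.40)    # large GPU: few candidates, low yield
small_die = cost_per_good_die(256, 0.70)  # small GPU: many candidates, high yield

print(f"big die:   ${big_die:.2f} per good chip")
print(f"small die: ${small_die:.2f} per good chip")
```

With these assumed numbers the big chip costs roughly 4x more per working unit, before the wider memory bus, extra RAM, and bigger board/cooler are even counted, which is why matching a smaller rival's price can mean selling at a loss.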
 
Let's assume that their high-end parts cannot be sold at a profit if they cut prices to the point where they're competitive in performance/$ with ATI. If that's the case, what do YOU think Nvidia should do? Should they continue to build more GTX 285/275/260s and sell them at a loss just to maintain market share?

Exactly. This is what they should (and could) do. They have more than 60% of the desktop gaming GPU market share.
 
Again, what Charlie does is sensationalize and exaggerate. That's all he really does. That aside, I don't think anyone is in denial here. NVIDIA is in trouble. I've seen it plenty of times in this industry where one company can dominate or even destroy another in just a generation or two. NVIDIA has some tough choices to make, but frankly none of us know what they are really going to do. You can either believe Charlie, or take what he says with a grain of salt. Personally I think that Charlie has done what he's always done: keyed in on some truth and then spun it to fit his anti-NVIDIA agenda. So the truth is probably somewhere between Charlie's doom-and-gloom scenario and nothing being wrong, though unfortunately it's closer to the doom-and-gloom side of things than "nothing to see here."

Only time will tell, but I believe NVIDIA is concentrating all their resources on Fermi, and they are taking a huge gamble with it. It's either going to be a bold and brilliant success or a dismal failure. I don't think it will cause NVIDIA to go under, but if they fail it won't be good for them in the long term, or the short term.
 
Hope the driver support doesn't suffer. I just ordered a 9600GT for my HTPC last night.
 
Jesus Christ, some of you people need to work on your reading comprehension (or actually READ the article instead of just commenting on the headline).

That is the problem with "stories" from Charlie. People who have already experienced Charlie's "style" of reporting know that any article bearing his name predominantly comes off as nothing more than a diatribe against Nvidia. His articles convey the sense that he has a personal vendetta against the Green Team and typically devolve into rants. Because of that, any nuggets of truth he may have stumbled upon that were worthy of reporting are quickly overshadowed by the vitriol he spews.

Therefore, many people have simply chosen to stop reading Charlie's articles. Which, in a sense, is a shame, because like Kyle said, there is usually a basis of truth in what he reports. But due to his regrettable choice on what tone to use when presenting that information, many people simply tune him out now.
 
Some more grapes for you guys to digest http://techreport.com/discussions.x/17717

While we'd take all of the speculation with a grain of salt, we wouldn't be so quick to discount Charlie entirely—despite his history of badmouthing Nvidia. SemiAccurate got a fairly good estimate of Fermi's release schedule way back in July, and the die-size estimate wasn't too far off from our own. We'll wait and see.

Poor guy, he never gets credit for the stuff he gets correct. Cue the ear plugging "lalalalala" Charlie is the devil reincarnate.
 
Poor guy, he never gets credit for the stuff he gets correct. Cue the ear plugging "lalalalala" Charlie is the devil reincarnate.

What are you talking about? Tech Report just gave him credit for being right about that information...
 
What do you think Charlie's sources actually told him? I suspect exactly the same as what was picked up by xbitlabs' sources; here is their news article:

http://www.xbitlabs.com/news/video/...a_of_Creating_Shortage_on_the_GPU_Market.html

That says Nvidia, and ATI too, are cutting back on high-end cards to cut costs and stop being screwed at the end of the quarter by their partners, who are complaining about it. It's not very exciting really.

I don't see xbitlabs being discussed anywhere, or stuck on hardocp's front page?

Obviously fanboys of both sides don't care about the truth, they'll fill the forums with biased rubbish till the cows come home - most video/cpu forums fight a constant battle to keep this under control or the rest of us will give up and go elsewhere (hence H having separate nvidia/ati areas).

What's kind of unfortunate is that the supposedly unbiased quality news sites (not just H, there are quite a few) are willing to effectively elevate one fanboy (Charlie) to their level by posting his obvious fanboy trolling on their front pages. This just exacerbates the situation. It's like leaving pools of stagnant water all over the place in mosquito season: pretty soon the place is full of them and everyone is getting stung. (Or pouring water all over your mogwai, if you like your film references; turns everyone into gremlins :) )

I'm sure this site gets lots more hits and a lot more posts as the forum fills with thousands of posts arguing over Charlie's latest reason why Nvidia is the cause of world hunger, but I would question whether this is fostering a better community, or just polarising it, making it a much less pleasant place to be.
 
Coming up next: AMD/ATI give away their latest CPU's and GPU's, designed by engineers with a holy writ, as they no longer need money and are sustained by the love of their fanbois.

This explains why I originally stopped reading the INQ. This also explains why I'll never bother to read his opinions without laughing. This guy just writes whatever crap he can come up with to grab one fanboi audience and push up numbers at his site. That's not the news, and it's not even semi-accurate.

Yeah, I'd say most of the industry hardware sites exhibit lots of spin & bias, whether they're paid shills or not. It's what distinguishes [H]ard|OCP from the fanboi sites. Drop by Fudzilla if you like to keep abreast of the latest FUD & spin from the NV PR camp. Fuad there is the guy that regurgitated verbatim NV's ever-changing damage control stories as the Fermi mock-up cock-up evolved. Charlie's got it in for NV undoubtedly, likely related to some lies they fed him when he was with the Inq and reported on, which ended up making him look like a fool. The point is, he's been deadly accurate in sifting through the mountains of BS & coming up with the real story as it pertains to Nvidia's bungling, long before anyone else has the stones to call them on it. Sure, there's lots of speculation (Semi-Accurate DOH!), but it's rather childish to dis him when he's right the majority of the time just 'cause the truth deflates the fanboi ego.
 
What do you think Charlie's sources actually told him? I suspect exactly the same as what was picked up by xbitlabs' sources; here is their news article:

http://www.xbitlabs.com/news/video/...a_of_Creating_Shortage_on_the_GPU_Market.html

I'll just quote the one reply to that article, because it says it all. Including your poor reading comprehension...

"ATI Radeon HD 4870 X2 is going out because of the imminent release of 5870 X2, and you also got the 5870 in its spot now, btw nvidia shortage is alot diffrent, as they DO not have anything new, and the GT200 die is kinda big and they will lose money if they drop the price thats why NVIDIA *makes* a shortage"

All you had to do was read the only comment on the article. And I'm sure the massive increase in put buying on Nvidia stock is just some wacky cosmic coincidence too, right? Sorta like a clock being right when you're blind and stupid?
 
The point is, he's been deadly accurate in sifting through the mountains of BS & coming up with the real story as it pertains to Nvidia's bungling, long before anyone else has the stones to call them on it. Sure, there's lot's of speculation (Semi-Accurate DOH!) but it's rather childish to dis him when he's right the majority of the time just cause the truth deflates the Fanboi ego.

Many people in this thread have already agreed that there is truth to many of his stories. Beyond that, everything he says has his own anti-NVIDIA spin. People also take what he has to say about NVIDIA with a grain of salt because AMD is the only advertiser on his site. Add to that the fact that almost all his articles are NVIDIA-bashing, and it's hard to take the guy seriously. As for sifting through the BS, you're right. He does sift through it before adding his own. If he reported on this stuff without the hate and with a more balanced viewpoint, I think his reputation would improve by quite a bit. He won't do that, because his tabloid-style sensationalism probably gets his site more hits.
 
Poor guy, he never gets credit for the stuff he gets correct. Cue the ear plugging "lalalalala" Charlie is the devil reincarnate.

That's actually quite funny, because not even that is correct.

Die size isn't known yet, but speculation based on known 40nm silicon (GT215) puts GF100 @ 480mm2, not 530mm2 as Charlie "says".

Second, the release schedule isn't correct either, since 1) there is no release yet, so one can't confirm or deny Charlie's "speculation", and 2) NVIDIA is saying they will release GF100-based cards in 2009.

So unless you choose to believe Charlie on NVIDIA's product info instead of NVIDIA itself (which I'm sure you do, for the obvious reasons...), what was correct about those two? That's right, nothing, since neither can be confirmed yet.

If that's the best you can do to help Charlie's credibility, I don't think he wants you on his side :)
 
Yeah, I'd say most of the industry hardware sites exhibit lots of spin & bias, whether they're paid shills or not. It's what distinguishes [H]ard|OCP from the fanboi sites. Drop by Fudzilla if you like to keep abreast of the latest FUD & spin from the NV PR camp. Fuad there is the guy that regurgitated verbatim NV's ever-changing damage control stories as the Fermi mock-up cock-up evolved. Charlie's got it in for NV undoubtedly, likely related to some lies they fed him when he was with the Inq and reported on, which ended up making him look like a fool. The point is, he's been deadly accurate in sifting through the mountains of BS & coming up with the real story as it pertains to Nvidia's bungling, long before anyone else has the stones to call them on it. Sure, there's lots of speculation (Semi-Accurate DOH!), but it's rather childish to dis him when he's right the majority of the time just 'cause the truth deflates the fanboi ego.

I've often asked this of the Charlie-lover crowd, but could never really get a proper answer with links to prove it. So since you joined the forum specifically to protect Charlie's credibility, I'll ask you to show everyone in what way Charlie was "right the majority of the time".

To prove this, you actually need other sources to corroborate Charlie's story. Everyone will be waiting.

Thanks

PS: You trying to make Fuad worse than Charlie was hilarious btw :)
 
Even a broken clock is right twice a day.

Charlie churns out so many anti-Nvidia articles that some of them will probably come true. A lot of them don't.
 
I've often asked this of the Charlie-lover crowd, but could never really get a proper answer with links to prove it. So since you joined the forum specifically to protect Charlie's credibility, I'll ask you to show everyone in what way Charlie was "right the majority of the time".

To prove this, you actually need other sources to corroborate Charlie's story. Everyone will be waiting.

Thanks

PS: You trying to make Fuad worse than Charlie was hilarious btw :)

Well now, with 10 seconds' thought, here are a few of the current & relevant ones:
1) Got it right on the fake card which NV initially outright denied.
2) Got it right when Nvidia PR spun it as a prototype / engineering sample then was forced to come clean.
3) Got it right with all the BS rebranding shenanigans of stale chips.
4) Got it so right with Bumpgate & was the only one to wade through the mountains of Nvidia spin & FUD trying to bury it. Heroic effort!
5) Got it right long before anyone else on NV folding the MB chipset division. Bye Bye SLI

Being a Macbrick Pro owner, I'm particularly fond of his refusal to let go of the Bumpgate debacle when NV was spinning crap in every direction trying to renege on their warranty obligations. I didn't know who Charlie was until I started researching sites to see if other MacBook owners were experiencing similar problems. He seemed to be the only one willing to wade through the FUD to get to the truth & the root of the problem, which Nvidia was obviously well aware of. Unbelievably, they continued to ship out chips with the faulty packaging, knowing full well a high percentage of them would fail. What's disgraceful is not the fact that they cut a few corners & manufactured bad parts. It's that they took almost 3 years to fess up, correct the problem, and offer some sort of financial restitution so that warranties on bricked computers could be handled fairly. Read the link(s), it's pretty entertaining stuff!

http://www.semiaccurate.com/2009/08/21/nvidia-finally-understands-bumpgate/
 
That's actually quite funny, because not even that is correct.

Die size isn't known yet, but speculation based on known 40nm silicon (GT215) puts GF100 @ 480mm2, not 530mm2 as Charlie "says".

Second, the release schedule isn't correct either, since 1) there is no release yet, so one can't confirm or deny Charlie's "speculation", and 2) NVIDIA is saying they will release GF100-based cards in 2009.

So unless you choose to believe Charlie on NVIDIA's product info instead of NVIDIA itself (which I'm sure you do, for the obvious reasons...), what was correct about those two? That's right, nothing, since neither can be confirmed yet.

If that's the best you can do to help Charlie's credibility, I don't think he wants you on his side :)
Thanks, but I'll listen to Tech Report over you, no offense buddy.

I forgot all about Bumpgate and the above listed... Bumpgate and all the rebranding; he sure is wrong a lot, huh guys? Selective memory ftw.
 
Well now, with 10 seconds' thought, here are a few of the current & relevant ones:
1) Got it right on the fake card which NV initially outright denied.
2) Got it right when Nvidia PR spun it as a prototype / engineering sample then was forced to come clean.
3) Got it right with all the BS rebranding shenanigans of stale chips.
4) Got it so right with Bumpgate & was the only one to wade through the mountains of Nvidia spin & FUD trying to bury it. Heroic effort!
5) Got it right long before anyone else on NV folding the MB chipset division. Bye Bye SLI

Being a Macbrick Pro owner, I'm particularly fond of his refusal to let go of the Bumpgate debacle when NV was spinning crap in every direction trying to renege on their warranty obligations. I didn't know who Charlie was until I started researching sites to see if other MacBook owners were experiencing similar problems. He seemed to be the only one willing to wade through the FUD to get to the truth & the root of the problem, which Nvidia was obviously well aware of. Unbelievably, they continued to ship out chips with the faulty packaging, knowing full well a high percentage of them would fail. What's disgraceful is not the fact that they cut a few corners & manufactured bad parts. It's that they took almost 3 years to fess up, correct the problem, and offer some sort of financial restitution so that warranties on bricked computers could be handled fairly. Read the link(s), it's pretty entertaining stuff!


http://www.semiaccurate.com/2009/08/21/nvidia-finally-understands-bumpgate/

QFT Go Charlie go! :p
 