Nvidia kills GTX285, GTX275, GTX260, abandons the mid and high end market

There has to be some truth to what he wrote. In today's litigious environment, not many people would stick their neck out without knowing what they're saying is true. There would be tort lawyers crawling all over him if he was just making this stuff up.
 
There has to be some truth to what he wrote. In today's litigious environment, not many people would stick their neck out without knowing what they're saying is true. There would be tort lawyers crawling all over him if he was just making this stuff up.

This is the internet. You can make shit up all day long if you want. It's the wild west.

This retard Charlie just took some comments way out of context. Neither Nvidia nor ATI gives a shit about all of this gossip and bitching; they care about the bottom line and will do whatever is necessary to improve their profits and their market share. Wouldn't it make sense to stop production of the 200 series and replace it with the 300 series?
 
Most large corporations don't go around actively dismissing every rumour any random person utters. In fact, the majority of companies play it like Apple: they avoid any and all opportunity to comment on news stories when questioned. There are certainly exceptions, but for the most part companies keep quiet, and that's actually good policy.

Yeah, except Nvidia DID come out and specifically address both this article and the previous one about the fake Fermi card. My point is that their responses weren't very reassuring or convincing.
 
All the absolute fanboyism is really ghey.

I agree. But aren't you like a sponsored fanboi yourself? ;)

Truth be told, I like Charlie because he gets everyone riled up on [H]. Makes for a good thread.:p But the green team has been diabolical with their strategy... ATi is the underdog just making excellent price-performance stuff. Video cards for the proletariat!:cool:
 
Seriously guys, nothing needs to be said about this in this thread. What more could be said? We just need to all sit down tonight, think of Charlie, and pray....

PunchpeopleoverTCPIP.jpg
 
He asked for SOURCES proving what Charlie said was correct. I don't think you understood that. Charlie's words are not a source that can be used to back up what Charlie says.

Let's take these things one at a time, shall we?



True.



This is really part of the first one, but honestly it isn't a huge deal. Most of the time, cards shown at these events that get handled by everyone aren't working samples anyway. Yes, NVIDIA misrepresented it as working silicon, but that fact by itself is barely newsworthy. Unless you're Charlie and you need more fuel for the NVIDIA hate machine.



Oh no, this is where you go painfully wrong. First off, that's just Charlie's anti-NVIDIA attitude and opinion on the subject you're defending. NVIDIA doesn't deny, and has never denied, that they rebrand GPUs from generation to generation. What exactly is wrong with rebranding a higher-end chip as the next generation's mid-range or lower-end part if it still competes? (The correct answer: nothing.) Rebranding GPUs for other markets and price points is a time-honored tradition, and it's something ATI/AMD have done themselves. Everyone does it, including AMD and Intel.



As far as Charlie wading through the "mountains" of NVIDIA spin & FUD, again I say: he did that, and then added his own spin to further his anti-NVIDIA agenda. Again, so what? NVIDIA has had, and is continuing to have, manufacturing problems. While newsworthy, Charlie just uses it to fuel his own NVIDIA hate machine. Furthermore, he overstated the problem on numerous occasions, claiming that "every G92 GPU" was defective, along with other asinine and untrue statements like he usually makes. Meanwhile, I had no less than 8 G92 GPUs and variants of them that were working flawlessly, as did several others. We had a thread on that here as a matter of fact, and everyone posting their "success" stories with G92 by itself is not a huge deal, but it was proof that Charlie was overstating the problem. Yes, many G92s may have been defective, but that's a far cry from "all G92s are defective."

Now, I will agree that NVIDIA did a poor job handling the situation, and they didn't take the proper steps to resolve the issues they had with their mobile GPU parts until the problem was nearly out of control. So I'll chalk this up as a win for Charlie, but it could have been reported in such a way that we didn't have to sift through his BS. (Basically, him crying that every single one of them was defective, when that was not the case.)



Again, you are off base. First off, what Charlie predicted wasn't magical either. I said the same thing as soon as the news hit that Intel wouldn't allow NVIDIA to manufacture chipsets for Nehalem. Given what they were doing (or rather, not doing) on the AMD side of the chipset business, I said they'd go under too. I'm sorry, but I'm not going to congratulate anyone for "predicting" the obvious. I did the same thing, and without any "sources." Nice try though. Oh, and SLI didn't go anywhere. It's still here, and it's been allowed to work with Intel chipsets for about a year now. Care to try again?

OK DanDee, I've got a minute now, so I'll try to make this as simple & concise as possible, since there seem to be some comprehension challenges here. This is the first "tech" forum I've ever registered for or posted on. I'm a business owner & part-time musician. I own 3 pieces of computer hardware: 2 with Nvidia graphics chips & an old P4 desktop with an ATI X800 AIW card. I have ZERO fanboi allegiance to any manufacturer of inanimate objects. I don't play computer games, nor do I have time to, and I have no e-penis insecurities. I stumbled upon Charlie's musings on the Inquirer when trying to determine the cause of video problems with my Macbrick Pro w/ Nvidia G96 chipset. I have no concern for Charlie's journalistic integrity, but I do respect that he digs a little deeper to decipher the pap that's spewed out of the NVIDIA PR machine. Prior to my MacBrick experience, I was about as unbiased about video chip manufacturers as one could be.

1) & 2) Fermi Fake: It's pretty pathetic when the CEO holds up an obvious fake at a GPU conference they had planned to present at 6 mo. prior and proclaims "This puppy here is Fermi" and "You are currently looking at a few day old silicon". When it was revealed to be a piss-poor mock-up, they could have simply admitted it, end of story. Instead, NV FUD central kicked into high gear & it became an "engineering sample" or "prototype". Third spin: yeah, it's a brick Jen-Sung-Il was waving around, but the demo was running on Fermi silicon that no one can look at. Bullshit with a capital B. They obviously had no operating Fermi silicon & the sole intent was to dupe shareholders, the industry press & prospective purchasers into believing they actually had a product ready to go into production. Yeah, that's "barely newsworthy". You can revisit this on their promised launch date sometime in Q4 2009. Here are some links to the affair at our fav NV fanboi site, where poor Fuad had to eat dirt 'cause he wasn't in on the spin. What a fraudulent joke that turned out to be.
http://www.fudzilla.com/content/view/15813/65/
http://www.fudzilla.com/content/view/15798/65/

3) Rebranding: No one has done it more dishonestly & deceptively than those upstanding folks at Nvidia, even having the audacity to up the prices on some of the old crap they couldn't sell. I loved Charlie's line on this: "The new parts are 55nm, just like the old. The clocks aren't different, nothing from a user perspective is different, other than the card losing 9550, an X and a +. Oh yeah, they are jacking up the price for the stupid as well". Priceless! Even NV fanboi site Fudzilla had trouble making sense of it. On top of that, they pressured (bribed or threatened?) review sites to provide favorable press, & I think our intrepid Kyle of this esteemed site was caught up in the mess.

Here are a couple of links in case memory fails you again:
http://www.fudzilla.com/content/view/11203/34/
http://www.theinquirer.net/inquirer/news/1051123/nvidia-cuts-reviewers-gts250

4) Bumpgate - You figure ol' Charlie was overstating the problem, huh? Having a vested interest ($1,700 paperweight), I followed his articles as they waded through the swamp of "Official" Nvidia lies, deception & denial to the final Bumpgate conclusion, where NV was forced to come clean. You say it wasn't a big-deal, widespread problem? Well, certainly all of the chips with the bad packaging haven't failed (or haven't yet), but NV has dished out $320,000,000 (yeah, that's US$ millions) in restitution to date, & likely more to come, with (ex) partners like Apple & Dell having to extend warranties. I'm not an expert on manufacturing costs, but I'm willing to guess that's more than a handful of chips. Correct me if I'm wrong again, but to me that's a filthy rotten way to treat your customers & end users. Here are some links if you need a little memory refresher:
http://www.9to5mac.com/nvidia-display-cards
http://www.theinquirer.net/inquirer...le-macbook-pros-have-nvidia-bad-bump-material

5) MB Chipset El Foldo - Here's the link to Charlie's August 2008 article. http://www.theinquirer.net/inquirer/news/1021993/nvidia-chipsets-history
If you were circulating the same news at the time, Dan_Dee, then kudos to you! I can't be bothered at this point linking the innumerable Jen-Hsun-Il BS & fanboi denials, but they're out there en masse. Read through the article; it looks pretty freakin' accurate today, though he may have missed the date when they'd actually pull the plug by a month or so. SLI not going anywhere? Hmmmm.... who's supporting SLI in their new chipsets, Intel or AMD? Maybe it'll be IBM.

The truth hurts sometimes, but you've got to learn to suck it up, Buttercups. The directors & PR spinners that have run this company into the ground are some of the most reprehensible slime to have crawled the face of the planet. I'd have an easier time buying into Bernie Madoff's latest investment scheme.

Over & Out Boyos!
 
Yeah, except Nvidia DID come out and specifically address both this article and the previous one about the fake Fermi card. My point is that their responses weren't very reassuring or convincing.
Their responses don't need to be: they aren't designed to reassure us. I think you're reading a bit too much into things, to be honest.
 
Did anyone mention 3dfx?

Fermi is sounding more like the Voodoo 5 6000 every day...

I would hate for Nvidia to leave the market and have mega-giant Intel step in, but they brought this on themselves if it's really all true... which it very well could be, the way they're acting.
 
Their responses don't need to be: they aren't designed to reassure us.

uhh, no...

WE are their customers; they better fucking make sure we are fully reassured, or else there goes all their business!

:rolleyes:
 
Just like the words of an ATI fanboy. Don't get me wrong, I'm not a fanboy of nVidia either. I just like good solid products on my rig.
 
Their responses don't need to be: they aren't designed to reassure us. I think you're reading a bit too much into things, to be honest.

Then what exactly were those responses designed to do? Is there ANY other possible reason for them to come out and address a negative rumor that's going around?
 
WE are their customers; they better fucking make sure we are fully reassured, or else there goes all their business!
You think NVIDIA or AMD really care about us (their customers) enough to bother? Like smart said, if they had wanted to reassure us, they could have done more than just say "this is not true" (I'm not sure what exactly, but something).

On that point, it doesn't really matter anyway. The story is about NVIDIA cutting a product line, and that hasn't happened yet. NVIDIA knows that very few enthusiasts are really "on the fence" about buying a GTX 280, 275, or 285 as it is, given AMD's new parts. There isn't much "damage" to try to "control", at least not yet.

Then what exactly were those responses designed to do?
Like I said, they aren't intended to reassure us. NVIDIA's primary concern is its shareholders, not its customers (you would agree, I'm sure). I am somewhat surprised that an NVIDIA rep responded to Kyle's inquiry, but then again, Kyle does hold a position in the community which NVIDIA apparently finds significant enough not to dismiss.
 
You think NVIDIA or AMD really care about us (their customers) enough to bother? Like smart said, if they had wanted to reassure us, they could have done more than just say "this is not true" (I'm not sure what exactly, but something).

On that point, it doesn't really matter anyway. The story is about NVIDIA cutting a product line, and that hasn't happened yet. NVIDIA knows that very few enthusiasts are really "on the fence" about buying a GTX 280, 275, or 285 as it is, given AMD's new parts. There isn't much "damage" to try to "control", at least not yet.


Like I said, they aren't intended to reassure us. NVIDIA's primary concern is its shareholders, not its customers (you would agree, I'm sure). I am somewhat surprised that an NVIDIA rep responded to Kyle's inquiry, but then again, Kyle does hold a position in the community which NVIDIA apparently finds significant enough not to dismiss.

Reassuring us is part of it. How do you think shareholders would react if consumers had no confidence in the company?
 
<Excellent post and well said!>
Over & Out Boyos!

That now brings us to the volunteer Nvidia PR fanbois in this thread and their daily routine of running to each thread to defend the company like their firstborn. It's a mixed bag of creepy + funny + entertainment, since they do it for free. The emotional attachment to a piece of PCB, wires, and silicon needs to be studied and added to the DSM-IV some time in the near future. This thread really is extraordinary.
 
I agree. But aren't you like a sponsored fanboi yourself? ;)

Truth be told, I like Charlie because he gets everyone riled up on [H]. Makes for a good thread.:p But the green team has been diabolical with their strategy... ATi is the underdog just making excellent price-performance stuff. Video cards for the proletariat!:cool:

Sponsored by Nvidia/ATI/AMD/Intel? . . . uh, yes. Fanboy of what? All of them ;)
I am the guy who answers questions with a definite maybe.
 
Charlie stated this. I don't think I ever want to buy from Nvidia again.

The board has wood screws crudely driven through it. The vents on the end plate are blocked. The DVI connector is not soldered to anything. The SLI connectors are somewhat covered by a heat shield. The 8-pin power connector is connected to nothing. The 6-pin connector is connected to the PCB with glue, not pins and solder. The board is crudely chopped off with power tools. The 8-pin connector that should be there is not. The 6-pin connector that should be there is cut. The mounting holes are too close to the edge. There are also likely many more flaws, but this should be enough to prove a point.
 
Charlie stated this. I don't think I ever want to buy from Nvidia again.

A picture is worth a thousand words. Feel-good funny article of the year for me.

Fermi_end_plate_cropped.jpg


In other news, Nvidia was spotted at Home Depot negotiating bulk prices for wood screws. JHH: "This puppy here is Fermi!" *crowd applause* Priceless.
 
Charlie stated this. I don't think I ever want to buy from Nvidia again.
Wow, you mean a mock-up is built like a mock-up? Somebody stop the presses!

Myself, and many others, aren't going to buy a 5800 card until the GT300 numbers are final. It's not a matter of the 5870 being too slow or too expensive; it's just not worth getting screwed as soon as GT300 comes out and is faster. Believe it or not, 512 shaders (GTX 285 × 2.133) on an improved arch is going to be faster than 1600 (aka the 4870X2) on the same arch. Especially when the 285 was faster than the 4870!
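
(Side note on that ×2.133 figure: it's just the raw shader-count ratio, 512 over the GTX 285's 240. A minimal back-of-the-envelope in Python, assuming the rumored 512-shader GT300 and perfect scaling, which real games never deliver:)

# Raw shader-count ratio behind the "GTX 285 x 2.133" figure.
# Assumes the rumored 512-shader GT300; perfect linear scaling
# is an idealization, not a prediction.
gtx285_shaders = 240  # GT200b shader count
gt300_shaders = 512   # rumored Fermi/GT300 shader count
ratio = gt300_shaders / gtx285_shaders
print(f"GT300 / GTX 285 shader ratio: {ratio:.3f}")  # ~2.133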
 
Wow, you mean a mock-up is built like a mock-up? Somebody stop the presses!

Myself, and many others, aren't going to buy a 5800 card until the GT300 numbers are final. It's not a matter of the 5870 being too slow or too expensive; it's just not worth getting screwed as soon as GT300 comes out and is faster. Believe it or not, 512 shaders (GTX 285 × 2.133) on an improved arch is going to be faster than 1600 (aka the 4870X2) on the same arch. Especially when the 285 was faster than the 4870!

I bet you'll love paying $650 for that card when it's less than 10% faster than a 5870, right?
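
(A rough performance-per-dollar check of that point; the $650 price and the +10% speed are the post's hypotheticals, and $379 is the 5870's launch MSRP:)

# Rough perf-per-dollar comparison; the GT300 price and speed are
# hypothetical guesses from the post, not measured numbers.
hd5870_price, hd5870_perf = 379.0, 1.00  # launch MSRP, baseline perf
gt300_price, gt300_perf = 650.0, 1.10    # rumored price, +10% perf
print(f"HD 5870: {hd5870_perf / hd5870_price:.5f} perf per dollar")
print(f"GT300:   {gt300_perf / gt300_price:.5f} perf per dollar")
# ~0.00264 vs ~0.00169 -- roughly 56% better value for the 5870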
 
OK DanDee, I've got a minute now, so I'll try to make this as simple & concise as possible, since there seem to be some comprehension challenges here. This is the first "tech" forum I've ever registered for or posted on. I'm a business owner & part-time musician. I own 3 pieces of computer hardware: 2 with Nvidia graphics chips & an old P4 desktop with an ATI X800 AIW card. I have ZERO fanboi allegiance to any manufacturer of inanimate objects. I don't play computer games, nor do I have time to, and I have no e-penis insecurities. I stumbled upon Charlie's musings on the Inquirer when trying to determine the cause of video problems with my Macbrick Pro w/ Nvidia G96 chipset. I have no concern for Charlie's journalistic integrity, but I do respect that he digs a little deeper to decipher the pap that's spewed out of the NVIDIA PR machine. Prior to my MacBrick experience, I was about as unbiased about video chip manufacturers as one could be.

1) & 2) Fermi Fake: It's pretty pathetic when the CEO holds up an obvious fake at a GPU conference they had planned to present at 6 mo. prior and proclaims "This puppy here is Fermi" and "You are currently looking at a few day old silicon". When it was revealed to be a piss-poor mock-up, they could have simply admitted it, end of story. Instead, NV FUD central kicked into high gear & it became an "engineering sample" or "prototype". Third spin: yeah, it's a brick Jen-Sung-Il was waving around, but the demo was running on Fermi silicon that no one can look at. Bullshit with a capital B. They obviously had no operating Fermi silicon & the sole intent was to dupe shareholders, the industry press & prospective purchasers into believing they actually had a product ready to go into production. Yeah, that's "barely newsworthy". You can revisit this on their promised launch date sometime in Q4 2009. Here are some links to the affair at our fav NV fanboi site, where poor Fuad had to eat dirt 'cause he wasn't in on the spin. What a fraudulent joke that turned out to be.
http://www.fudzilla.com/content/view/15813/65/
http://www.fudzilla.com/content/view/15798/65/

3) Rebranding: No one has done it more dishonestly & deceptively than those upstanding folks at Nvidia, even having the audacity to up the prices on some of the old crap they couldn't sell. I loved Charlie's line on this: "The new parts are 55nm, just like the old. The clocks aren't different, nothing from a user perspective is different, other than the card losing 9550, an X and a +. Oh yeah, they are jacking up the price for the stupid as well". Priceless! Even NV fanboi site Fudzilla had trouble making sense of it. On top of that, they pressured (bribed or threatened?) review sites to provide favorable press, & I think our intrepid Kyle of this esteemed site was caught up in the mess.

Here are a couple of links in case memory fails you again:
http://www.fudzilla.com/content/view/11203/34/
http://www.theinquirer.net/inquirer/news/1051123/nvidia-cuts-reviewers-gts250

4) Bumpgate - You figure ol' Charlie was overstating the problem, huh? Having a vested interest ($1,700 paperweight), I followed his articles as they waded through the swamp of "Official" Nvidia lies, deception & denial to the final Bumpgate conclusion, where NV was forced to come clean. You say it wasn't a big-deal, widespread problem? Well, certainly all of the chips with the bad packaging haven't failed (or haven't yet), but NV has dished out $320,000,000 (yeah, that's US$ millions) in restitution to date, & likely more to come, with (ex) partners like Apple & Dell having to extend warranties. I'm not an expert on manufacturing costs, but I'm willing to guess that's more than a handful of chips. Correct me if I'm wrong again, but to me that's a filthy rotten way to treat your customers & end users. Here are some links if you need a little memory refresher:
http://www.9to5mac.com/nvidia-display-cards
http://www.theinquirer.net/inquirer...le-macbook-pros-have-nvidia-bad-bump-material

5) MB Chipset El Foldo - Here's the link to Charlie's August 2008 article. http://www.theinquirer.net/inquirer/news/1021993/nvidia-chipsets-history
If you were circulating the same news at the time, Dan_Dee, then kudos to you! I can't be bothered at this point linking the innumerable Jen-Hsun-Il BS & fanboi denials, but they're out there en masse. Read through the article; it looks pretty freakin' accurate today, though he may have missed the date when they'd actually pull the plug by a month or so. SLI not going anywhere? Hmmmm.... who's supporting SLI in their new chipsets, Intel or AMD? Maybe it'll be IBM.

The truth hurts sometimes, but you've got to learn to suck it up, Buttercups. The directors & PR spinners that have run this company into the ground are some of the most reprehensible slime to have crawled the face of the planet. I'd have an easier time buying into Bernie Madoff's latest investment scheme.

Over & Out Boyos!

I can think of nothing better than to just sit here and remind you that when someone wants the best of the best, top-performing, all-around kick-ass video card, they go Nvidia, PERIOD. The GTX 295, Nvidia's old hardware, can go toe to toe with NEW ATI hardware.

When Nvidia takes 6 extra months to put something out, it's because it's going to run like a raped ape. When I want a Wal-Mart-performing card, I'll go buy an ATI.

How does a snap back to reality sit with you, Sswifty? Tough turd to swallow? Now STFU.


*disclaimer to everyone else: I like the 5870, but this guy needs to just shut up... good god.
 
Reassuring us is part of it. How do you think shareholders would react if consumers had no confidence in the company?
A fair point, but I think the benefit is ancillary to a degree. As someone who's been witness to the antics of publicly traded companies for some time now, I've come to the conclusion that customer satisfaction with a company and the company's market standing are rarely directly linked (unfortunately), and in fact oftentimes the opposite is true (fuck customers by cutting major corners, and stock prices have the potential to soar). It seems vague, but things really never are as they seem in the corporate world.

We are certainly not what matters to these guys.
 
Charlie is one of the ultimate trolls. And I don't mean that in a good way. NOTHING he says can be taken seriously, whether it's true or not, because his whole focus in "journalism" is bad-mouthing nVidia. He is going to scream foul whether it's true or not, so why give him any kind of credence? :confused: The guy is just on a sad crusade against one color of silicon.
 
I give it a 6.5 on the foaming-at-the-mouth scale-o-meter. If he had used some facts to put together a coherent point to dig into Sswifty, it would be a 9.5. But a 10 on the dramatic scale for sure.
 
OK DanDee, I've got a minute now, so I'll try to make this as simple & concise as possible, since there seem to be some comprehension challenges here. This is the first "tech" forum I've ever registered for or posted on. I'm a business owner & part-time musician. I own 3 pieces of computer hardware: 2 with Nvidia graphics chips & an old P4 desktop with an ATI X800 AIW card. I have ZERO fanboi allegiance to any manufacturer of inanimate objects. I don't play computer games, nor do I have time to, and I have no e-penis insecurities. I stumbled upon Charlie's musings on the Inquirer when trying to determine the cause of video problems with my Macbrick Pro w/ Nvidia G96 chipset. I have no concern for Charlie's journalistic integrity, but I do respect that he digs a little deeper to decipher the pap that's spewed out of the NVIDIA PR machine. Prior to my MacBrick experience, I was about as unbiased about video chip manufacturers as one could be.

Uh, ok.

1) & 2) Fermi Fake: It's pretty pathetic when the CEO holds up an obvious fake at a GPU conference they had planned to present at 6 mo. prior and proclaims "This puppy here is Fermi" and "You are currently looking at a few day old silicon". When it was revealed to be a piss-poor mock-up, they could have simply admitted it, end of story. Instead, NV FUD central kicked into high gear & it became an "engineering sample" or "prototype". Third spin: yeah, it's a brick Jen-Sung-Il was waving around, but the demo was running on Fermi silicon that no one can look at. Bullshit with a capital B. They obviously had no operating Fermi silicon & the sole intent was to dupe shareholders, the industry press & prospective purchasers into believing they actually had a product ready to go into production. Yeah, that's "barely newsworthy". You can revisit this on their promised launch date sometime in Q4 2009. Here are some links to the affair at our fav NV fanboi site, where poor Fuad had to eat dirt 'cause he wasn't in on the spin. What a fraudulent joke that turned out to be.
http://www.fudzilla.com/content/view/15813/65/
http://www.fudzilla.com/content/view/15798/65/

Why are we talking about this? I agreed with you! (Though I don't think it is quite as big a deal as you and Charlie seem to think it is.)

3) Rebranding: No one has done it more dishonestly & deceptively than those upstanding folks at Nvidia, even having the audacity to up the prices on some of the old crap they couldn't sell. I loved Charlie's line on this: "The new parts are 55nm, just like the old. The clocks aren't different, nothing from a user perspective is different, other than the card losing 9550, an X and a +. Oh yeah, they are jacking up the price for the stupid as well". Priceless! Even NV fanboi site Fudzilla had trouble making sense of it. On top of that, they pressured (bribed or threatened?) review sites to provide favorable press, & I think our intrepid Kyle of this esteemed site was caught up in the mess.

Here are a couple of links in case memory fails you again:
http://www.fudzilla.com/content/view/11203/34/
http://www.theinquirer.net/inquirer/news/1051123/nvidia-cuts-reviewers-gts250

My memory is fine. Again, you are simply agreeing with Charlie's opinion on the subject. NVIDIA never stated that they weren't rebranding earlier GPUs. For example, the first [H] 9600GT evaluation flat out stated that it was a rebranded GeForce 8800GT from the last generation. The same thing was said in the first 9800GTX evaluation as well. As for the pricing, that's up to them. If you don't like their offerings because they are "rebadged" GPUs from another generation, then vote with your wallet and buy something else. This information is well known whenever these "re-GPUs" launch. I don't see how NVIDIA is being dishonest about what these GPUs and cards are, based on this. So it seems you are the one in need of a refresher.

4) Bumpgate - You figure ol' Charlie was overstating the problem, huh? Having a vested interest ($1,700 paperweight), I followed his articles as they waded through the swamp of "Official" Nvidia lies, deception & denial to the final Bumpgate conclusion, where NV was forced to come clean. You say it wasn't a big-deal, widespread problem? Well, certainly all of the chips with the bad packaging haven't failed (or haven't yet), but NV has dished out $320,000,000 (yeah, that's US$ millions) in restitution to date, & likely more to come, with (ex) partners like Apple & Dell having to extend warranties. I'm not an expert on manufacturing costs, but I'm willing to guess that's more than a handful of chips. Correct me if I'm wrong again, but to me that's a filthy rotten way to treat your customers & end users. Here are some links if you need a little memory refresher:
http://www.9to5mac.com/nvidia-display-cards
http://www.theinquirer.net/inquirer...le-macbook-pros-have-nvidia-bad-bump-material

Reading comprehension fails you, it seems. At least in this case. I never said anything about it being "no big deal" or that it wasn't widespread. I said it wasn't as big of a deal as Charlie made it out to be, and that's the truth. He claimed the problem was worse than it was. He said that all their GPUs (or at least all G92-derived parts) were defective, and that isn't the truth. A large percentage, or even a widespread problem, does NOT equal ALL of them being defective. You see, the word "all" would indicate that all of the GPUs were defective. Since several of them are still working today, this statement is obviously false. This should be obvious. Again, what Charlie says has truth to it. I'm not denying, and never have denied, that. You just have to cut through the BS and his personal agenda to find the real truth, which usually isn't evident until weeks or months later.

5) MB Chipset El Foldo - Here's the link to Charlie's August 2008 article. http://www.theinquirer.net/inquirer/news/1021993/nvidia-chipsets-history
If you were circulating the same news at the time, Dan_Dee, then kudos to you! I can't be bothered at this point linking the innumerable Jen-Hsun-Il BS & fanboi denials, but they're out there en masse. Read through the article; it looks pretty freakin' accurate today, though he may have missed the date when they'd actually pull the plug by a month or so. SLI not going anywhere? Hmmmm.... who's supporting SLI in their new chipsets, Intel or AMD? Maybe it'll be IBM.

You have no idea what you are talking about. Intel's chipsets currently support SLI. Ever hear of X58 or P55? NVIDIA is basically done in the chipset market (well, excluding their Atom-based offerings), so it would seem that Charlie is right about that. Again, that wasn't hard to predict once the news leaked out that Intel wouldn't be allowing NVIDIA to make chipsets for their Nehalem CPUs.

As for your comment about IBM, that's more absurd than anything you've said to date. IBM doesn't want to be in the consumer hardware business. IBM is all about being an outsourcing solutions provider. They derive 85% of their annual revenue from that business unit alone. They left the consumer computer hardware business some time ago. I doubt they'll get back into it anytime soon.
 
When Nvidia takes 6 extra months to put something out, it's because it's going to run like a raped ape.
Please share with me tomorrow's lotto numbers since you obviously have a crystal ball.
Seems you have one of your own; you obviously know exactly how the ATI and Nvidia cards will perform 6 months from now.
Now STFU.
You probably should take your own advice.
 
Heh, I think it's time for this thread to get shut down; people are starting to just be stupidly irrational. What happens in the future happens, and what has happened has happened; all that is left is to deal with the results.
 
Amusing thread/article title, since AMD/ATI is dominating the low-end market.

You just cannot beat an HD4770/4850 at the 100-dollar price point. The 4770 doubly shines due to extremely awesome performance per watt, making it a prime candidate for a cheap but uberfast CrossFire setup.

Whenever I feel my single 4770 cannot handle the games I play anymore (no issues so far playing WoW & others at 1920x1200), I'll simply drop a second one in. The reviews show it's one of the best CrossFire cards you can ask for: basically a straight-up doubling of performance, plus amazing overclock headroom.

I'd go with a 4850 if you don't plan on adding a second one later (lots of heat, and it won't be around much longer), though.

Nvidia's parts that compete with the 4770/4850 are still selling at over $120, and they often still lose in the benches to cheaper ATI solutions. Bad move, NV :)
 
I can think of nothing better than to just sit here and remind you that when someone wants the best of the best, top-performing, all-around kick-ass video card, they go Nvidia, PERIOD. The GTX 295, Nvidia's old hardware, can go toe to toe with NEW ATI hardware.

When Nvidia takes 6 extra months to put something out, it's because it's going to run like a raped ape. When I want a Wal-Mart-performing card, I'll go buy an ATI.

How does a snap back to reality sit with you, Sswifty? Tough turd to swallow? Now STFU.


*disclaimer to everyone else: I like the 5870, but this guy needs to just shut up... good god.
Um, yeah, so does ATI's own old hardware (aka the 4870X2), and so did Nvidia's old hardware (the 9800GX2 vs. the GTX 280). lol, what kind of logic is that?
 
Um, yeah, so does ATI's own old hardware (aka the 4870X2), and so did Nvidia's old hardware (the 9800GX2 vs. the GTX 280). lol, what kind of logic is that?

A combination of Valtrex and green Kool-Aid has nasty side effects.
 
Wait... someone explain something to me.

I've been following this as closely as I've been able, but I'm not understanding something: what does nVidia's ceasing chipset manufacturing have to do with their GPU manufacturing?

nVidia never stated they were ceasing to manufacture GPUs; they stated they are done with mobo chipsets. What does one have to do with the other?

It does seem they're going in some weird direction, and it's even stranger that we've heard nothing from them regarding their next-gen GPUs, but how did the assumption that they were "leaving the GPU market" arise out of all this?

I'm just trying to get a clear handle on this situation, because none of that makes any sense. Regardless of any "problems" or "falling-out" they might be having with Intel, and regardless of their ceasing to make mobo chipsets, how does this equate to all this "nVidia is leaving the GPU market to ATi!" nonsense?
 
Wait... someone explain something to me.

I've been following this as closely as I've been able, but I'm not understanding something: what does nVidia's ceasing chipset manufacturing have to do with their GPU manufacturing?

nVidia never stated they were ceasing to manufacture GPUs; they stated they are done with mobo chipsets. What does one have to do with the other?

It does seem they're going in some weird direction, and it's even stranger that we've heard nothing from them regarding their next-gen GPUs, but how did the assumption that they were "leaving the GPU market" arise out of all this?

I'm just trying to get a clear handle on this situation, because none of that makes any sense. Regardless of any "problems" or "falling-out" they might be having with Intel, and regardless of their ceasing to make mobo chipsets, how does this equate to all this "nVidia is leaving the GPU market to ATi!" nonsense?
Nvidia has stated they are not done with chipsets either. They are awaiting the resolution of legal issues with Intel before proceeding to make more chipsets for Intel boards.
 
Wait... someone explain something to me.

I've been following this as closely as I've been able, but I'm not understanding something: what does nVidia's ceasing chipset manufacturing have to do with their GPU manufacturing?

nVidia never stated they were ceasing to manufacture GPUs; they stated they are done with mobo chipsets. What does one have to do with the other?

It does seem they're going in some weird direction, and it's even stranger that we've heard nothing from them regarding their next-gen GPUs, but how did the assumption that they were "leaving the GPU market" arise out of all this?

I'm just trying to get a clear handle on this situation, because none of that makes any sense. Regardless of any "problems" or "falling-out" they might be having with Intel, and regardless of their ceasing to make mobo chipsets, how does this equate to all this "nVidia is leaving the GPU market to ATi!" nonsense?

Nobody is saying Nvidia is closing the door on GPUs or anything. The reality is, they don't have a next-gen product to compete with AMD/ATI yet, and their current-generation products cost a lot to make, so they'd have to sell them at a loss to compete with AMD's mid-range, low end, etc.
Either Nvidia sells them at a loss or doesn't sell anything until their Fermi-based products hit the street. Charlie is trying to convince people that Nvidia is not going to sell anything for another couple of months...
The chipset business is an entirely different thing; it's not related to GPUs.
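
(To put some entirely made-up numbers on the "sell at a loss" point, a toy sketch:)

# Toy margin calculation (all numbers hypothetical): when a big,
# expensive die has to match a cheaper competitor's street price,
# every card sold can end up underwater.
build_cost = 250.0    # hypothetical cost to build a GT200b-based card
market_price = 220.0  # hypothetical price needed to stay competitive
margin = market_price - build_cost
print(f"Margin per card: ${margin:+.2f}")  # negative => selling at a loss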
 