Nvidia kills GTX285, GTX275, GTX260, abandons the mid and high end market

Now, anyone who knows me knows that I've always preferred ATI to Nvidia. But I currently run a GTX 260 because it was the best buy, price/performance, at the time. Charlie there has always had a bug up his bum when it comes to Nvidia (ex-girlfriend?). Whatever the reason, he's always been way out of line when it comes to Nvidia. Are they heavy-handed in the market? Perhaps from a marketing standpoint, but talk that they are on the way out is just ridiculous! Despite my personal pleasure at watching ATI make significant improvements over the last two years, and despite the fact that Nvidia has hit a few bumps, it all just levels the playing field once again. I expect great things from both in the next year, and the competition thus raised will benefit us consumers. But Charlie trying to drive a stake into the heart of a non-existent Nvidia ghoul is no less than Don Quixote tilting at windmills. I never cared for Nvidia, Charlie. But in truth, they aren't a bad outfit from the end-user's perspective.
 
Well now, with 10 seconds' thought, here are a few of the current & relevant ones:
1) Got it right on the fake card which NV initially outright denied.
2) Got it right when Nvidia PR spun it as a prototype / engineering sample then was forced to come clean.
3) Got it right with all the BS rebranding shenanigans of stale chips.
4) Got it so right with Bumpgate & was the only one to wade through the mountains of Nvidia spin & FUD trying to bury it. Heroic effort!
5) Got it right long before anyone else on NV folding the MB chipset division. Bye Bye SLI

So 1) and 2) are the same, and even though JHH never said it was a working prototype, I'll give you that one.
3) This was reported by several others prior to Charlie, who just picked up on someone else's .inf file from a driver, but OK, that's two.
4) He did? Where's the proof? He said every chip (55nm and 65nm) suffered from the problem (G92, G94, G96, GT200, etc, etc). Where are all of these failures? Such a massive defect on all of these chips would certainly have loads of sources calling out NVIDIA over it...but there aren't...and despite me asking for external links, you provided none. That's not surprising though! So that's one down.
5) Folding of the chipset division? Have you heard of Ion? Guess not :) And that's two down.

So since 1) and 2) are basically the same "news", that's one right; 3) was someone else's work that Charlie called his own, but OK, that's two.
4) and 5) are false. So 2 out of 4 are more or less accurate and 2 are wrong.

You understand that "majority", when 2 parties are involved, means > 50%, right?
 
When Larrabee comes out, will Charlie be attacking Intel as well? I think Intel's move into this segment is such a wild card that it makes guessing the success of either Nvidia's or AMD's next-gen offerings very difficult. Hopefully this damn recession will finally end so there are customers ready to actually buy some new products.
 
Thanks but I'll listen to Techreport over you, no offense buddy.

I forgot all about bumpgate and the above listed... bumpgate and all the rebranding. He sure is wrong a lot, huh guys? Selective memory FTW.

That's no surprise really :)
Despite the fact that nothing Charlie said about GT300 (he didn't even know the chip's real codename...) has been confirmed or denied (yet), you are already choosing to believe his version! That's great!

Kind of reminds me of Minority Report, where people were being convicted of crimes they hadn't committed yet! Oh, but Charlie has a crystal ball, right?
 
When Larrabee comes out, will Charlie be attacking Intel as well? I think Intel's move into this segment is such a wild card that it makes guessing the success of either Nvidia's or AMD's next-gen offerings very difficult. Hopefully this damn recession will finally end so there are customers ready to actually buy some new products.

The recession is something all camps need to take into consideration. ATI was forced by necessity to take the lower-priced road due to their comparatively lesser-performing cards. Now that they are neck and neck again, it remains to be seen whether they will take the recession into consideration. If Intel can introduce a competitive GPU, I for one am all for it.
 
He has a thing against nVidia. The Inq. has a thing against nVidia. I've owned several nVidia cards, but I've been pulling for ATi since the 8-pipe softmod 9500 Pro days. I love to see nVidia get bashed. :p
 
Well now, with 10 seconds' thought, here are a few of the current & relevant ones:
1) Got it right on the fake card which NV initially outright denied.
2) Got it right when Nvidia PR spun it as a prototype / engineering sample then was forced to come clean.
3) Got it right with all the BS rebranding shenanigans of stale chips.
4) Got it so right with Bumpgate & was the only one to wade through the mountains of Nvidia spin & FUD trying to bury it. Heroic effort!
5) Got it right long before anyone else on NV folding the MB chipset division. Bye Bye SLI

Being a Macbrick Pro owner, I'm particularly fond of his refusal to let go of the Bumpgate debacle when NV was spinning crap in every direction trying to renege on their warranty obligations. I didn't know who Charlie was until I started researching sites to see if other Macbook owners were experiencing similar problems. He seemed to be the only one willing to wade through the FUD to get to the truth & the root of the problem, which Nvidia was obviously well aware of. Unbelievably, they continued to ship out chips with the faulty packaging, knowing full well a high percentage of them would fail. What's disgraceful is not the fact that they cut a few corners & manufactured bad parts. It's that they took almost 3 years to fess up, correct the problem and offer some sort of financial restitution so that warranties on bricked computers could be handled fairly. Read the link(s), it's pretty entertaining stuff!

http://www.semiaccurate.com/2009/08/21/nvidia-finally-understands-bumpgate/

He asked for SOURCES proving what Charlie said was correct. I don't think you understood that. Charlie's words are not a source that can be used to back up what Charlie says.

Let's take these things one at a time shall we?

1) Got it right on the fake card which NV initially outright denied.

True.

2) Got it right when Nvidia PR spun it as a prototype / engineering sample then was forced to come clean.

This is really part of the first one, but really this isn't a huge deal. Most of the time, cards shown at these events that get handled by everyone aren't working samples anyway. Yes, NVIDIA misrepresented it as working silicon, but this fact by itself is barely newsworthy. Unless your Charlie and you need more fuel for the NVIDIA hate machine.

3) Got it right with all the BS rebranding shenanigans of stale chips.

Oh no, this is where you go painfully wrong. First off, that's just Charlie's anti-NVIDIA attitude and opinion on the subject your defending. NVIDIA hasn't denied that they rebrand GPUs from generation to generation. What exactly is wrong with rebranding a higher-end chip as the next generation's mid-range or lower-end part if it still competes? (The correct answer: nothing.) Rebranding GPUs for other markets and price points is a time-honored tradition, and it's something ATI/AMD have done themselves. Everyone does it, including AMD and Intel.

4) Got it so right with Bumpgate & was the only one to wade through the mountains of Nvidia spin & FUD trying to bury it. Heroic effort!

As far as Charlie wading through the "mountains" of NVIDIA spin & FUD, again I say: he did that, and then added his own spin to further his anti-NVIDIA agenda. Again, so what? NVIDIA has had, and is continuing to have, manufacturing problems. While newsworthy, Charlie just uses it to fuel his own NVIDIA hate machine, and furthermore, he overstated the problem on numerous occasions, stating that "every G92 GPU" was defective and other asinine and untrue statements like he usually does. Meanwhile, I had no fewer than 8 G92 GPUs and variants of them that were working flawlessly, as did several others. We had a thread on that here, as a matter of fact. Everyone posting their "success" stories with G92 by itself is not a huge deal, but it was proof that Charlie was overstating the problem. Yes, many G92s may have been defective, but that's a far cry from "all G92s are defective."

Now, I will agree that NVIDIA did a poor job handling the situation, and they didn't take the proper steps to resolve the issues they had with their mobile GPU parts until the problem was nearly out of control. So I'll chalk this up as a win for Charlie, but it could have been reported in such a way that we didn't have to sift through his BS. (Basically him crying that every single one of them was defective, when that was not the case.)

5) Got it right long before anyone else on NV folding the MB chipset division. Bye Bye SLI

Again, you are off base. First off, what Charlie predicted wasn't magical either. I said the same thing as soon as it hit that Intel wouldn't allow NVIDIA to manufacture chipsets for Nehalem. Given what they were doing (or rather not doing) on the AMD side of the chipset business, I said they'd go under too. I'm sorry, but I'm not going to congratulate anyone for "predicting" the obvious. I did the same thing, and without any "sources." Nice try though. Oh, and SLI didn't go anywhere. It's still here, and it's been allowed to work with Intel chipsets for about a year now. Care to try again?
 
He asked for SOURCES proving what Charlie said was correct. <snip> Unless your Charlie and you need more fuel for the NVIDIA hate machine. <snip> that's just Charlie's anti-NVIDIA attitude and opinion on the subject your defending. <snip> Care to try again?

Off topic

I expect a motherboard editor to know the difference between your and you're.
 
He asked for SOURCES proving what Charlie said was correct. <snip>

True.

This is really part of the first one... <snip>
Short version = True

Oh no, this is where you go painfully wrong... <snip>
He wasn't questioning the practice of it, but Charlie did call it out for the 2-3 rounds of rebranding; the third time was when everyone was saying "no f'n way they'll do it again." So verdict = True

As far as Charlie wading through the "mountains" of NVIDIA spin & FUD... <snip>
Short version = True

Again you are off base... <snip> Care to try again?

Add to the list: I now recall Charlie talking about how Nvidia was taking a loss on the GTX 200 series (back when it was still new), and only recently Nvidia admitted it. So the point really is what Charlie predicts correctly, not a question of 'what Nvidia does that is ethically wrong'; that was the point that guy was making.
 
I'm sure Dan does. Going out on a limb and guessing it was a typo.
Charlie gets a lot of his guesses from B3D. The forum users there have caught him quoting their guesses on new tech and get a kick out of it.
 
So 1) and 2) are the same and even though JHH never said it was a working prototype, I'll give you that one. <snip> So 2 out of 4 are more or less accurate and 2 are wrong.

You understand that "majority", when 2 parties are involved, means > 50%, right?

I don't understand what your motivation is for refuting already established fact. Are you a Viral Mouseketeer? Unfortunately, I have a business to run at the moment & don't have time to illustrate the absurdity of your post. Stay tuned, I'll be back tonight to lay it out for you in simplistic terms so you can maybe understand what's really happening here.

Cheers!
 
I personally will buy nVidia or ATI if it makes sense price/performance-wise; my previous video card was nVidia. nVidia generally has better manufacturers (EVGA, BFG...). However, nVidia does certain things that cause me to like ATI more. Here is my list:

1.) Price gouging. Remember $600 prices on the 8800 GTX cards, for almost a year?
2.) PhysX support disabled if another card (ATI) is detected in the system. WTF, I bought the nVidia card, why are you disabling features I paid for? This pisses me off to no end.
3.) The latest Batman game, developed in close cooperation with nVidia folks, disables AA on ATI hardware?
4.) CUDA is dead, long live OpenCL.
5.) Model naming rehashes are confusing and borderline false advertising (8800, 9800 and 250 cards).

I can't stress point number 2 enough. It's both anti-competitive and anti-consumer, because they are ripping off their own customers with that action.
 
Off topic

I expect a motherboard editor to know the difference between your and you're.

I'm sure Dan does. Going out on a limb and guessing it was a typo.
Charlie gets a lot of his guesses from B3D. The forum users there have caught him quoting their guesses on new tech and get a kick out of it.

Yes, it was a typo on my part.
 
To recap:

1) Charlie specifically is an anti-nvidia fanboi, to a degree. This colors his writing.

2) He's a sensationalist. This colors his writing.

3) All his news comes from the grapevine and can only be confirmed through his anonymous sources.

4) Many things he puts out there don't turn out to be close to reality.

5) He nails many stories that do turn out to be true long before they're anywhere near reportable even in the big tech blogs.

It's a style of journalism that is useful to many of us, despite the fact that it bears more resemblance to tabloids than to newspapers or television news. The style tends to make a website wildly popular, until it has grown to the point that they fire the ones with low standards of proof for flubbing too many stories, and the journalist goes on to write elsewhere, starting the cycle over. See: The Register, The Inquirer. Both incorrect information and original investigation are found much more often in this tradition; the ratio of the former to the latter is probably about the same as some MSM rags at this point, which mostly repeat the words of others that they consider trusted sources. It is what it is.
 
It's okay Dan, I'm sure Stoly corrected Kyle when he kept saying 5770 when he meant 5870. :D
 
Well, Charlie likes to stir the pot and that he does.
I do my part by not supporting his site. Someone will always quote it. :p
 
He asked for SOURCES proving what Charlie said was correct.
[...]

Basically, reading your post you are saying that Charlie was never actually wrong or lying, just that he blows things way out of proportion and is often not the first to draw some correct conclusions.
That's not bad for a "tech tabloid" called "semi-accurate", is it? :)
 
Basically, reading your post you are saying that Charlie was never actually wrong or lying, just that he blows things way out of proportion and is often not the first to draw some correct conclusions.
That's not bad for a "tech tabloid" called "semi-accurate", is it? :)

I've said it a couple times already. There is truth to what he says, but he sensationalizes everything and he blows some stuff way out of proportion. Charlie's math skills seem to suck. For example:

Relatively high percentage of defective chips = OMFG they are all defective! NVIDIA SUXORZ!!!!

Again, you can read his site or don't. Whether or not his type of "journalism" has some value is up to the reader.
 
Basically, reading your post you are saying that Charlie was never actually wrong or lying, just that he blows things way out of proportion and is often not the first to draw some correct conclusions.
That's not bad for a "tech tabloid" called "semi-accurate", is it? :)

He's like the Glenn Beck of the tech world; he takes snippets of truth and writes them up as the worst-case scenario to paint things in the worst light. Unfortunately for Nvidia, a lot of it has come true. ;)
 
Don't understand what your motivation is for refuting already established fact. Are you a Viral Mousekateer?
You're the one who signed up at this forum solely to defend Charlie, and you're calling him a shill? That's laughable.

Unfortunately, I have a business to run at the moment & don't have time to illustrate the absurdity of your post.
This is doublespeak for "I need to think of a counter-argument, so I'll claim that I have really important business to attend to so as to delay my official response".

I expect a motherboard editor to know the difference between your and you're.
So terribly petty.

Now, if you gentlemen don't mind, I have a state to govern, a Bugatti to maintain and a diamond-encrusted, gold-plated runway model/actress to nail!
 
pot-kettle.jpg
 
Facts:

1. Nvidia's PR flack only says: "There is no truth to this." The brevity of the reply is a clever ploy renowned amongst spinners. Define "this". Moreover, only one element of Charlie's article (or Kyle's quote) has to be wrong for Nvidia to be able to say "no truth". Such as: they are not "killing" the parts, they are merely "managing their/AIB inventory", or EOL'ing the parts... which is, strictly speaking, not exactly the same as "killing". Or only 3 and not 4 parts are being killed (the fourth tomorrow). Or maybe they are not objecting to the EOL'ing part, just the withdrawal, because, as their board has been told, it's merely a "product replenishment timing issue". No matter that there aren't any parts to act as replenishers!

I'd be more assured, given Nvidia's track record of lying or fudging the truth themselves, if Kyle demanded Nvidia say yes/no explicitly as to the EOL of each part, and whether they have replacement parts now (or widely available in the channel within 30 days) for the withdrawn parts. BTW, the 2xx series are NOT replacements for the withdrawn parts. They are, at best, renamed (once again) 9600/9800x parts.

Absent a clear, definitive and detailed official statement from Nvidia, I would conclude that the PR department is being sleazy again, and that Charlie is right. After all, Charlie can be sued by Nvidia if he is wrong, and that hasn't happened yet ... despite the scads of articles that he has written about Nvidia. In fact, I'm surprised that Kyle has fallen for this Nvidia PR ploy.

2. Charlie ALWAYS gives the details, and leaves it up to you to judge the degree of accuracy of his articles. Everyone refuting this article here, and others of Charlie's, offers not one shred of data to back up their "opinions" (which is exactly what they are), character-assassination-oriented or not. There's no law against writing articles in an entertaining manner, either, which is what Charlie does. Makes a welcome change from Nvidia fanbois' famously content-empty, vacuous, and Nvidia-PR-regurgitating, spleen-venting claptrap.

In contrast, most every other reviewer/commenter either repeats Nvidia's (or other manufacturers') press releases and pablum whole, or reworks them slightly to avoid being labelled a plagiarist... which they essentially are. If anyone has different FACTS on Charlie's article, including the financial precursor he wrote and linked into his article, then come up and say so. Nothing at Semi-Accurate yet (smart and foundry-knowledgeable folks post there constantly), nor here.

3. In short, if you have better info, post it. Otherwise, don't ask non-brain-dead folks to believe a classic Nvidia response when they are withdrawing from a market and hoping no one will notice. If you want to see confidence in Nvidia demonstrated, look up the last month's stock performance of NVDA. Those that can are probably getting out while the going is good.

And, lest you think that I'm an AMD fanboi, the two PCs my wife and I use are both Nvidia-chipset and Nvidia-GPU models.
 
Facts: <snip> In fact, I'm surprised that Kyle has fallen for this Nvidia PR ploy.

This sounds remarkably similar to what Charlie said to me, Mr. New Poster. :)

I would suggest that I have not fallen for anyone's ploy. I simply saw a HardForum topic that was getting incredible traffic, which tells me HardOCP readers are interested, read the article squarely aimed at NVIDIA, and gave NVIDIA a moment to comment on the article. If that makes me "fallen for this NVIDIA PR ploy," so be it.

And just for the record, you guys complaining are the only ones left giving this topic attention. :eek:
 
Couldn't nvidia have put this to rest by simply releasing high resolution images of the board in question?

Not saying I agree with anyone here... I couldn't care less. I just buy whatever's the best performing when it's time to buy.

Although I have to say that there are a number of things that make me think Charlie is full of crap.
For instance, my DVI headers look exactly like the one he says is fake. You can barely see the pins sticking through the solder, to the point that in a low-res photo they would look like unpopulated vias just filled with solder.

Let's also not forget that those 'wood screws' have been appearing on nVidia GPUs for as long as they've had this dual-slot config. They attach the heatsink to the slot panel.

The only things I saw that look worthy of FUD are those power connectors... But those could have been glued down and connected with wire, and still have been operational.
 
Nvidia's PR people remind me of the spokesperson of some dictator... e.g. that "Baghdad Bob" guy.

"There is no truth to this. They are just malicious rumors spread by the enemies of Nvidia!"

Meanwhile, they give us absolutely nothing concrete to back up their statements. In the face of all the rumors about Fermi being late and problematic, all they did was show us a fake and try to pass it off as the real thing. Then, when they got busted for misleading the public, they say "oh yeah, uhh... well, there was a real one and it was working, but we didn't feel like showing anybody!" All they would have to do to shut up the naysayers is show us the goods, but they didn't.

Now there's a rumor that they won't be able to sell their high-end cards at a profit if they try to remain cost-competitive with ATI. Instead of addressing the rumor head-on and reassuring their customers, all we get is a half-assed dismissal with no specifics whatsoever.

Nvidia's silence is deafening. If there were no truth to the rumors going around, Nvidia should be falling over themselves trying to prove us wrong and reassure their customers and shareholders.
 
He has a thing against nvidia. The Inq. has a thing against nVidia. I've owned several nVidia cards. But i've been pulling for ATi from the 8 pipe softmod 9500 Pro days. I love to see nVidia get bashed.:p

That is sad. When I was with Nvidia at QuakeCon, an AMD guy (whom I met at CES earlier this year; GREAT guy and very fun to hang out with) stopped by and chatted with me at the Nvidia area. Was he a dickhead or rude? Hell no. The guy acted just like he always does, even though I was with the competition at this particular event. He brought me up to date on what they were up to and even invited me over. If he, as an actual AMD employee, can show decorum, then why can't we (including Charlie) try to at least be a little even-handed (granted, I am going off what you guys are saying, as I have no dealings with this Charlie dude yet)? Heck, I once worked on mod projects for AMD/ATI and Nvidia at the same time, with them knowing about each other, and debuted the projects at the same event. Nvidia provided me with an AMD CPU and AMD-based mobo for their project. No infighting, although they were competitive. I have seen firsthand AMD and Intel folks getting along as well at events. Intel just sent a 975 proc to me as a favor asked by Nvidia for a charity project we are working on. If I remember correctly, Intel is about to be a direct competitor to Nvidia.

All the absolute fanboyism is really ghey. Use what is best at the time. That has changed and will change from one vendor to another. My first Nvidia card was the original GeForce, which replaced my 3dfx solution. You will find a mix of Nvidia and ATI-based systems in my house now, and I'll probably have an Intel GPU at some point as well.

And, when it is all said and done, it is just fooking computer hardware. Get a life. Nothing to get worked up over.
 
Nvidia's silence is deafening. If there were no truth to the rumors going around, Nvidia should be falling over themselves trying to prove us wrong and reassure their customers and shareholders.
Most large corporations don't go around actively dismissing every rumour any random person utters. In fact, the majority of companies play it like Apple: they avoid any and all opportunity to comment on news stories when questioned. There are certainly exceptions, but for the most part, companies keep quiet, and that's actually good policy.

All the absolute fanboyism is really ghey. Use what is best at the time.
I'd say use whatever you want, regardless of whether it's the best or not, just don't make a big stink about it. I rarely tell people what video card(s) I'm running since it doesn't make any difference to anyone else but me. I also rarely make recommendations as to what card people should buy given their specific circumstances.

People do tend to take this stuff far too seriously -- you're dead on about that.
 
Most large corporations don't go around actively dismissing every rumour any random person utters. <snip>

People do tend to take this stuff far too seriously -- you're dead on about that.
All you need is what will run the programs and games you prefer at settings you can live with. Hell, I have my highly modified phase-change unit on the shelf in my garage, since overclocking at high levels is great for bragging rights but useless for everyday use.
 
Just a note: just because Charlie can be sued about something doesn't mean they will. Nvidia doesn't tend to sue website journalists. Right or wrong, he's not that big of a PITA to them.
 