NVIDIA Website Dedicated To Mocking Intel?

I found the cartoon funny; nVidia can say whatever they want in my book.
 
I think it's hilarious how people expect companies to act all ethical and such. It's human nature to cheat and gain wealth and power at the cost of others. Intel is no saint, nor is AMD or nVidia. All entities are deceptive, more than willing to lie through their teeth if it gains them a shred more power. To expect anything else is foolish. Boycotting one company for something it did while buying products from another company that has done exactly the same things (even if one isn't aware of it) is hypocritical.
 
Lol @ people on cry binges. I think it's funny. "Intel's Insides". It's like nVidia is trying to get people to buy AMD CPUs. But hell with it. It's a cheesy website. Intel is probably too classy to make a site like that of their own.
Posted via [H] Mobile Device
 

Yeah, they prefer to bully other companies into bankruptcy like in the good ol' days :)
 
Wow, I was beer'd up last night. I forgot some words. But it's like, what the hell, let nVidia make senseless jabs. What does the average consumer care? They probably won't even hear about it. Let a company have some fun. I think this shows that they feel backed into a corner. My question is, why are they not attacking ATI?
Posted via [H] Mobile Device
 

Because "Larrabee" is more like "Fermi" than "R800"...the fight is about the GPGPU market, and NVIDIA sees Intel as a threat there...but apparently not AMD.
 

Learn to think.
The reason nVidia picks on Intel is that Intel, like nVidia, has only a paper product and can do nothing about it but talk with its big mouth.
ATI has something new; attacking the one company that is a generation ahead of them would make nVidia look even more pathetic.

INTEL and NVIDIA, two losers mocking each other.
 

Please share the substance you are under influence of.
Intel sits on 2/3 of the CPU market and NVIDIA sits on 2/3 of the GPU market...and making a profit too...

Yeah they are really losing :rolleyes:
 
I rolled my eyes when I saw this. Pointless, unfunny, and a waste of time. It does nothing to make me look forward to whatever Nvidia is putting out next and does nothing to discourage me from buying Intel products. Nvidia needs to shut up and take a good long look at how they are presenting themselves to the consumers they are obviously targeting with this kind of stuff. All that is going to happen with stuff like this is retaliatory comments from Intel, and it will lead to a giant circle of feces being flung by a bunch of PR monkeys on all sides.

To all of you telling Nvidia to hurry and release something new: They can't. TSMC can't produce the chips to keep up with ATI's demand, what makes you think they'd be able to produce enough to even allow NV to launch a new card this year? Even if Fermi was ready to launch there is no possible way for it to come out this year.
 
ATi has done its share of relabeling cards, exactly like nVidia has done now. No one is innocent here. Intel has knowingly and willingly limited the growth of the PC market and negatively affected competition as well as forcing customers to pay more for their PCs.

Intel is the big criminal here right now, the rest are small fry at best.

Your opinion does not count.

We all know you're an nVIDIA fan-girl/fan-boy. To defend this sort of behavior is just mind-boggling.

Now, as for this ATi re-labeling you speak of... what cards exactly?
 
You're still spouting this nonsense?? Oh god -_-

Did anybody here with a 3000-series card see it show up as a 4000-series card after updating their catalyst drivers? No. Furthermore, all of that driver fud is incorrect, as it clearly labels the 4750 as an RV630 chip, which would have limited it to a maximum of 320 stream processors.

The last time ATi rebadged WAS the X1000 series. We would see things like X1300-series cards get cores from the first waves of X1600-series and whatnot. However as far as the X1900GT/X1950 Pro thing goes... die shrink. Same clocks and specs, k, fine. You could argue that, but nobody argues that a GF 9800GTX+ is a rebadged 9800GTX...

The comic is in bad taste, but it really isn't any worse than any of the marketing tactics Intel, AMD, ATI, Nvidia, Apple, or Microsoft have used before. They're all guilty. That's marketing.

Funny how the site is aimed at Intel but SAYS it is aimed at the industry...

Does she spout anything other than nonsense?

Between her and Atech, I swear they could start their own website... maybe call it nVIDIAZone, where all the delusional nVIDIA fans can go to pat themselves on the back (the way AMDZone users do).

Or maybe she could start her own blog in the style of Sharikou.

Seriously... she and Atech irk me.
 
Your opinion does not count.

We all know you're an nVIDIA fan-girl/fan-boy. To defend this sort of behavior is just mind boggling.
Yup, such a big fangirl that I have been recommending AMD cards to people planning to build a new system because I acknowledge that AMD has got the best GPUs for gaming purposes at this point.

You're a dolt.

Now, as for this ATi re-labeling you speak of... what cards exactly?
All IGPs on the market are relabeled. HD3xxx GPU cores are relabeled as though they are HD4xxx cores, while they most definitely are not. That's the most famous example I can think of without doing more research.
 

The IGP were actually all based off of the HD2400. All of them. Even the 4200 igp is.

The 3xxx renaming is a driver thing, and no AIB manufacturer actually manufactures the affected cards anymore. The renames are nowhere near the level of the G92 core renames, but they are close.
 
Please share the substance you are under influence of.
Intel sits on 2/3 of the CPU market and NVIDIA sits on 2/3 of the GPU market...and making a profit too...

Yeah they are really losing :rolleyes:

I agree, and it's sad that integrated graphics play such a large role in the GPU market. But that might change with better software on all fronts being written to utilize GPUs better, not just the CPU. Adobe has finally made progress with its products using the GPU more. It still puts the people who buy its products through "licensing hell", though.
 

Lol. I was about to jump in and argue that Intel controls the lion's share of the GPU market, and the facts don't lie.

Sure, they are not the DX10 beasts with 512 "CUDA Cores" :rolleyes: or whatever, but they are there... and with a crushing lead on the real market, you know, the one with the most money.
 

Not with profit per unit, but with the number of units sold. That would mean Intel adds up to being number one in profit from GPUs too. Some argue that it comes as a package deal to muscle its integrated graphics chipset in. I have no idea if that's how Intel negotiates, though. Most comments about business practices are raw speculation anyway.
 
Some argue that it comes as a package deal to muscle its integrated graphics chipset in.


I meant to say this instead (lol, I was so vague):

*Some argue that its processors come as a package deal with the motherboards that feature the integrated graphics chipset, forcing its market penetration further.
 
Not with profit per unit, but with the number of units sold. That would mean Intel adds up to being number one in profit from GPUs too. Some argue that it comes as a package deal to muscle its integrated graphics chipset in. I have no idea if that's how Intel negotiates, though. Most comments about business practices are raw speculation anyway.

+1 True
I wanted to edit "money" to "units/money", but news threads don't allow that.
thanks!
 
I meant to say this instead (lol, I was so vague):

*Some argue that its processors come as a package deal with the motherboards that feature the integrated graphics chipset, forcing its market penetration further.

And of course, now that both Intel and AMD are both looking for on-die, and later, on-CPU graphics solutions...
 
The IGP were actually all based off of the HD2400. All of them. Even the 4200 igp is.

The 3xxx renaming is a driver thing, and no AIB manufacturer actually manufactures the affected cards anymore. The renames are nowhere near the level of the G92 core renames, but they are close.

Wikipedia has some more detailed info, including the corresponding Radeon GPU name: http://en.wikipedia.org/wiki/AMD_700_chipset_series#Integrated_graphics

Seems to vary from HD2100 (740) to 4200 (985).
 

Yeah... they are all still just HD2400 cores. Some of the review sites stated that, but not all.


They are very decent, for IGP, though. They at least forced both nVidia's and Intel's hand to do something better. (well... not Intel:p).

I looked at that chart a while back, when I was still considering a IGP solution for what I wanted.
 
Wow. Just wow.

Maybe Intel should make a comic, the first strip can be something about relabeling video cards with a new numbering scheme each time to try and fool potential customers...

Oh wait, that's actually true. Not to mention the plethora of other things Nvidia has done.

Yeah, for sure, they should follow along with what Intel's doing: a nice, straightforward naming scheme. Core 2, i3, i5, and i7, where the chips in i7 are actually identical to the chips in the upper half of the i5 series, while the lower half of the i5 series is actually faster for any single-threaded application and runs at higher clocks, all on two different sockets. And hey, where does Core i3 fit in? But you know, Ken, I mean I mean I mean... well, that's just football!
 
*Didn't actually want to hit the post button. Damn, I wish this section had an edit button. Alas:

I chuckled, those cartoons are pretty funny.

Is there really such a thing as bad publicity in the war against Larrabee by Nvidia, or the war against Fermi/CUDA by Intel? These products aren't yet released, we've got no benchmarks that seem legit from either of them, and both of them seem like they've got a lot to offer. So either company slinging something at the other simply brings the other into focus. By talking about how Larrabee is going to be an abysmal failure, I, as an enthusiast, am forced to bring what I know about Larrabee to the forefront of my mind, and what I know about Larrabee is that it stands to bring some pretty serious change to x86, even if it might be an abysmal failure. Same goes for CUDA. In a sense, Intel and Nvidia are doing each other a favour.
 