G4saurus Defectus

ALL YOUR BASES ARE BELONG TO US!!
ROFLOL!
Maybe I'm the only one on here that remembers seeing that originally.
 
LOL, I liked the Render for Food part and the part about renaming/rehashing the same old technology under different names.
 
I'm building bridges!!

|¯¯¯¯¯¯¯¯¯¯¯¯¯¯| |¯¯¯¯¯¯¯¯¯¯¯¯¯¯| |¯¯¯¯¯¯¯¯¯¯¯¯¯¯| |¯¯¯¯¯¯¯¯¯¯¯¯¯¯|

lol

|¯¯¯¯¯¯¯____¯¯¯¯¯¯¯| shit this one broke :(
 
|¯¯¯¯¯¯¯¯¯¯¯|¯¯¯¯¯¯¯¯¯¯¯¯¯|¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯|¯¯¯¯¯¯¯¯¯¯¯¯¯| ULTRA BRIDGE!!
 
Yeah, cos ATI totally didn't do that with the 2xxx and 3xxx series :p

ATi improved their line of GPUs with more than speed bumps; the HD 38XX has much lower power consumption, slightly better performance, DX10.1, and UVD for full acceleration of Blu-ray and HD DVD.

This is nVidia's rehashing scheme:

8800GTX/8800GTS 640 (G80) = Great card
8800GT/GTS 512 (G92) = Smaller manufacturing process, same technology, almost as fast as the previous high-end GPU; that's OK.
9800GT/9800GTX (G92) = The identical, same old GPU, same outdated technology, and only as fast as the previous generation
9600GSO, 8800GS, 96XX, etc. (G92) = Still the same old GPU, same outdated technology
9800GTX+ (G-something) = Smaller GPU, same old technology as the almost three-year-old 8800GTX

GTX 260/GTX 280 (GT200) = Faster GPU, same obsolete technology as the tired 8800GTX

Pretty much all of those cards were close to each other, with no improvements between them, except the GTX 260/GTX 280.

Now let's look at ATi since the beginning of DX10:

HD 2900XT (R600) = Big, hot GPU which barely tied the 8800GTS 640; better feature set, terrible power consumption
HD 3800 (RV670) = Much smaller and cooler, DX10.1, UVD, cheaper, slightly faster, better feature set
HD 4800 (RV770) = Same architecture and feature set, but further improved for performance, UVD 2, better FSAA, all for only 32% more transistors

Only the first two generations were close, except when DX10.1 anti-aliasing was used, which greatly benefits the HD 38XX architecture.

So, which had a more consistent performance improvement and feature set? You? LOL
 
Evolucion8, I have had both ATi and Nvidia cards and I like both.
Now, with that said, you're being a total fanboy. How is the GT200 outdated technology? Whoop-de-do, it doesn't have DX10.1, which NOBODY uses and which was only faster in Assassin's Creed because, IIRC, using it accidentally skipped a rendering pass, so it did less work. While I agree that the naming scheme is retarded (I think that's universal), the 8800GTS 320/640 to 8800GTS 512 added the same features (sans DX10.1) as the 2900XT to 3870. I'll concede that AMD's current AA setup KILLS NV's, but they are both competitive, which is all that matters.
 
That was a hoot, but after watching that video again it kind of makes me sad that the G4Saurus could become extinct within our lifetime. I think it's time for all of us to do our part, whether you be an ATI fan or an Nvidia fan. So please give the G4Saurus a chance and buy one; otherwise your children may never see one roaming free as nature intended. If you don't want to do it for Nvidia, then do it for your children, for God's sake...

 
Evolucion8, I have had both ATi and Nvidia cards and I like both.
Now, with that said, you're being a total fanboy. How is the GT200 outdated technology? Whoop-de-do, it doesn't have DX10.1, which NOBODY uses and which was only faster in Assassin's Creed because, IIRC, using it accidentally skipped a rendering pass, so it did less work. While I agree that the naming scheme is retarded (I think that's universal), the 8800GTS 320/640 to 8800GTS 512 added the same features (sans DX10.1) as the 2900XT to 3870. I'll concede that AMD's current AA setup KILLS NV's, but they are both competitive, which is all that matters.

A fanboy? LOL. Tell me, what does the GTX 280 offer besides performance that the 8800/9800 series can't? The GTX 260 is a nice card. Competition is always good; it would be terrible for us if either ATi or nVidia were left as the only option. So please answer my questions and stop being a noob.
 
A fanboy? LOL. Tell me, what does the GTX 280 offer besides performance that the 8800/9800 series can't? The GTX 260 is a nice card. Competition is always good; it would be terrible for us if either ATi or nVidia were left as the only option. So please answer my questions and stop being a noob.

There's a reason it offers nothing aside from higher performance over the 8800GTX. Simple. The 8800 GTX does everything that needs to be done and then some, so the only way to go for the time being is faster. It's not "tired" technology if it continues to work. My nearly three-year-old 8800GTXs keep up with everything at 1920x1200 with 16x AF and 2-4x AA, full settings. I'm not seeing your point.
 
AMD always cuts down their competition in marketing. They do it to Intel, and now to Nvidia. I think it's a crappy way to advertise, and it makes me want their products less. ...here's looking at you, Apple.

I view modern advertising more as a form of entertainment than as a way to sell a product. It is somewhat underhanded to cut down your competition in an advertisement, but you can't deny that it breeds hilarious results.
 
There's a reason it offers nothing aside from higher performance over the 8800GTX. Simple. The 8800 GTX does everything that needs to be done and then some, so the only way to go for the time being is faster. It's not "tired" technology if it continues to work. My nearly three-year-old 8800GTXs keep up with everything at 1920x1200 with 16x AF and 2-4x AA, full settings. I'm not seeing your point.

Keep dreaming that about your 8800 GTX.
 
I understand every joke in this video, except the title "G4saurus".

G4 = GeForce
Saurus = Dinosaur (as people often call something when it's old; in this case, old tech)

I'm not saying I agree with it; I'm just explaining, that's all.

For the record, I'm still usin' and lovin' my GTX 260 Core 216, but that doesn't mean my funny bone isn't working.
 
Performance is the only thing that's needed over an 8800GTX, and remember, Nvidia is still WAY ahead with CUDA and their GPGPU stuff. Hopefully ATi will catch up with that soon so we can start to see PhysX or something like it used for gameplay physics.
Except for DX10.1 (which is pretty much for shits and giggles), there isn't that much of a feature difference between the GTX 2xx series and the 48xx series video cards.

I prefer Nvidia's drivers to ATi's drivers, but they both have their driver fuck-ups.

Well, answer me this: why is it obsolete? It continues to do what people use it for, so the only way to go is faster. Really, what else do you want it to do? Cook your breakfast? Inject awesome into every game that runs on it? Give you a hand job? :confused:
 
Performance is the only thing that's needed over an 8800GTX, and remember, Nvidia is still WAY ahead with CUDA and their GPGPU stuff. Hopefully ATi will catch up with that soon so we can start to see PhysX or something like it used for gameplay physics.
Except for DX10.1 (which is pretty much for shits and giggles), there isn't that much of a feature difference between the GTX 2xx series and the 48xx series video cards.

I prefer Nvidia's drivers to ATi's drivers, but they both have their driver fuck-ups.

Well, answer me this: why is it obsolete? It continues to do what people use it for, so the only way to go is faster. Really, what else do you want it to do? Cook your breakfast? Inject awesome into every game that runs on it? Give you a hand job? :confused:

PhysX will simply not take off; I'm still waiting for titles with it enabled. CUDA is quite useful, but still not a real reason to buy a GeForce card, and neither is Stream computing a reason to buy an ATi card. The real reason to buy something is the best bang for the buck, along with performance and feature set. Microsoft is the one who drives the market with DX, and DX10.1 is a nice checkmark on the list. When I said old technology, I meant that nVidia hasn't put enough effort into their lineup because they rested on their laurels, doing nothing, since they didn't have competition at the time to put pressure on them to innovate. It won't cook or give me hand jobs, unfortunately, but it will give me features that are more future-proof and better than the competition's (for now). We shall see what comes next from nVidia and ATi.
 
nVidia hasn't put enough effort into their lineup because they rested on their laurels, doing nothing, since they didn't have competition at the time to put pressure on them to innovate
Really? REALLY? Regardless of why, they're selling something that's nearly twice as fast as an 8800 Ultra for half the price, two years later. Regardless of what ATI did, I'd call that innovation.
 
Performance is the only thing that's needed over an 8800GTX, and remember, Nvidia is still WAY ahead with CUDA and their GPGPU stuff.

Real Customer Survey saaaaaaaaaaays:
The new ATI stuff is far superior to CUDA. They don't have to totally rewrite an application from scratch (a requirement to use CUDA) in order to use it. They can literally hook in and go.
The new FireMV stuff once again blows Quadro out of the water. DTP folks are outright refusing nV because the IQ, frankly, is crap. CAD folks are sick of playing guess-the-driver. ATI's drivers turned the corner about a year ago, and people have definitely taken notice.

We won't cover the legions of customers who've had to cripple their laptops and are justifiably livid.
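
To put the "rewrite from scratch" part in concrete terms, here's a rough, hypothetical sketch of what porting even a trivial loop to CUDA tends to involve (made-up code, not taken from any of the apps mentioned): the hot loop becomes a kernel, and the data has to be copied to and from the card explicitly.

[code]
// Hypothetical example: the CPU loop "for (i = 0; i < n; i++) data[i] *= factor;"
// rewritten as a CUDA kernel. Illustrative only.
#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// The old loop body becomes a kernel; each thread handles one element.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main(void)
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *h = (float *)malloc(bytes);                 // host copy of the data
    for (int i = 0; i < n; i++) h[i] = 1.0f;

    float *d = NULL;
    cudaMalloc((void **)&d, bytes);                    // separate buffer on the card
    cudaMemcpy(d, h, bytes, cudaMemcpyHostToDevice);   // explicit upload

    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);       // kernel launch replaces the loop

    cudaMemcpy(h, d, bytes, cudaMemcpyDeviceToHost);   // explicit download (also synchronizes)
    printf("h[0] = %f\n", h[0]);                       // prints 2.000000

    cudaFree(d);
    free(h);
    return 0;
}
[/code]

Multiply that by every hot loop in a real application and you can see where the porting cost being complained about comes from.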
 
Funny video. I think we'll see Nvidia come out with their 55nm GPUs and possibly take back the performance crown. If they don't take the performance crown back, then ATI will probably just sit pretty for a while. If they do take it back, then ATI will probably launch their 45nm GPUs, with the 5870X2 taking back the performance crown. And then the cycle will repeat itself.
 
Come on, fellas. This is all about LOLs and not about the toughness of your e-penis. Some folks can't upgrade all the time. Show some respect and you will be given the same.

P.S. I'm not into cars, but I imagine a [H]ardcore car forum is just like this. All kindsa fanb0y, etc, etc.. :)
 
Really? REALLY? Regardless of why, they're selling something that's nearly twice as fast as an 8800 Ultra for half the price, two years later. Regardless of what ATI did, I'd call that innovation.

Short memory, eh? I call that desperation; selling an expensive piece of hardware at such a low price and cutting profit margins is far from innovation. Or did you forget that the introductory price of the GTX 280 was 650 bucks? Both companies are the same: now that the HD 4870X2 has the performance crown, its price hasn't dropped, and it has actually increased a little.
 
Real Customer Survey saaaaaaaaaaays:
The new ATI stuff is far superior to CUDA. They don't have to totally rewrite an application from scratch (a requirement to use CUDA) in order to use it. They can literally hook in and go.
The new FireMV stuff once again blows Quadro out of the water. DTP folks are outright refusing nV because the IQ, frankly, is crap. CAD folks are sick of playing guess-the-driver. ATI's drivers turned the corner about a year ago, and people have definitely taken notice.

We won't cover the legions of customers who've had to cripple their laptops and are justifiably livid.

Go ahead and prove that. I would love to see how superior ATI's GPGPU efforts are, given how mature NVIDIA's SDK is. Oh, and having used CAD tools for a couple of years, I would also like to know how NVIDIA's IQ is "crap", since it never was and it isn't, especially after the 8-series-based Quadros.
 
To add to the fire, everyone does realise that if Nvidia were able to release a DX10.1 part, or hell, even a proper DX10 part, they would. There's a reason they haven't, and it ain't by choice.
 
He said desktop publishing people think the IQ is crap, and the CAD people have driver issues. I don't know anything about it myself, but reading his post would help.
 
PhysX will simply not take off; I'm still waiting for titles with it enabled. CUDA is quite useful, but still not a real reason to buy a GeForce card, and neither is Stream computing a reason to buy an ATi card. The real reason to buy something is the best bang for the buck, along with performance and feature set. Microsoft is the one who drives the market with DX, and DX10.1 is a nice checkmark on the list. When I said old technology, I meant that nVidia hasn't put enough effort into their lineup because they rested on their laurels, doing nothing, since they didn't have competition at the time to put pressure on them to innovate. It won't cook or give me hand jobs, unfortunately, but it will give me features that are more future-proof and better than the competition's (for now). We shall see what comes next from nVidia and ATi.

Oh, I see where you're coming from now. I need to sleep more at night. :eek:
Though I would say they're resting on their laurels in features, not in performance; they're essentially tied in the mainstream market (i.e. up to the 4870), and then you have a choice between more expensive with more performance, or cheaper with less performance.
 
PhysX will simply not take off; I'm still waiting for titles with it enabled. CUDA is quite useful, but still not a real reason to buy a GeForce card, and neither is Stream computing a reason to buy an ATi card. The real reason to buy something is the best bang for the buck, along with performance and feature set. Microsoft is the one who drives the market with DX, and DX10.1 is a nice checkmark on the list. When I said old technology, I meant that nVidia hasn't put enough effort into their lineup because they rested on their laurels, doing nothing, since they didn't have competition at the time to put pressure on them to innovate. It won't cook or give me hand jobs, unfortunately, but it will give me features that are more future-proof and better than the competition's (for now). We shall see what comes next from nVidia and ATi.

It already took off. There are already some games that support it, including the well-known Mirror's Edge, and the number of titles that support it will increase. The plus side is that those effects won't be available just to folks with NVIDIA cards, but NVIDIA cards will be able to speed things up, while others will need to run it on their CPUs.

FYI, having a GPU that supports CUDA is part of the "feature set" you mentioned, just like having DX10.1 support is for ATI cards. The question is how many games support or allow the use of that feature. Thus far, there are more PhysX games than there are DX10.1 games. One just chooses based on what he/she will use the card for.

It's also becoming quite tiresome to see the "old technology" bit being mentioned, when neither company introduced a new architecture. NVIDIA increased its processor units by less than 2x and tweaked its architecture for GPGPU, and ATI increased its processor units by 2.5x while also having to fix the broken AA engine. They did nothing but play catch-up in performance, and they were successful because they introduced those performance levels at an appealing price. That's all.
 
He said desktop publishing people think the IQ is crap, and the CAD people have driver issues. I don't know anything about it myself, but reading his post would help.

And I asked for proof of those people "thinking" that the IQ is crap, or is it just hearsay?
And maybe you should read my post too, because I used some CAD tools for a couple of years and no major problems ever occurred. Problems exist in every driver set, and that definitely doesn't exclude ATI's drivers. Also, NVIDIA's support for these kinds of cards is excellent, to say the least. I don't know about ATI in that regard.
 
I'd like to mention that, due to game development timelines, the reason PhysX may appear has nothing to do with CUDA and everything to do with Ageia. Also, most companies are interested in the software PhysX engine with hardware support tacked on, so arguing about how useful CUDA is, is from one angle pointless, as it would have appeared in recently released games anyway. Unless someone has links showing how companies suddenly jumped on board and jeopardised release dates and game stability to cram this "new" tech in.
 