What is the next SUPER card?

RicKuRuKuS

[H]ard|Gawd
Joined
Mar 21, 2003
Messages
1,409
When will the next best of the best card come out? I mean one that will be top of the line for like 2 years or so like the Radeon 9800 Pro... When is the next big step up from the Radeon 9800 series?

-warsaw
 
I mean one that will be top of the line for like 2 years or so like the Radeon 9800 Pro

I've had a 9700 Pro A.I.W since the day it came out and it's still top of the line in my opinion. It can play any game I throw at it, and I would expect it to hold its own when HL2 and D3 come out.
 
The X800 series is the top dog right now from ATi. About twice as fast as the 9800XT.
 
EQTakeOffense said:
I've had a 9700 Pro A.I.W since the day it came out and it's still top of the line in my opinion. It can play any game I throw at it, and I would expect it to hold its own when HL2 and D3 come out.

Man, we have almost the same specs and I can't say the same. Yes, for the most part, but I have noticed that with some of the newer games like Painkiller and Far Cry, I can still play with all the effects but have to settle for 1024x768, and even then my PC cried bloody murder on the Snow Bridge level.

At this point I'd say it is becoming a game of diminishing returns. That said, I am waiting a few months to see how the whole PS 3.0 thing plays out and whether the upcoming patches actually make a difference in image quality/frames per second or whether Nvidia and several game companies are just blowing PR smoke.

That said, even if I do end up getting the 6800, I honestly would like to never see the Nvidia logo at the beginning of a game again. Smacks of rigging the odds to me.
 
I like the GeForce 6800U, because it scales better. But, hey, that's just me... :rolleyes:

On second thought, there's no way I could afford one. I'd rather buy a X800Pro and turn it into an XT for free.
 
M4d-K10wN said:
I like the GeForce 6800U, because it scales better. But, hey, that's just me... :rolleyes:

On second thought, there's no way I could afford one. I'd rather buy a X800Pro and turn it into an XT for free.

It's true that the X800 and 6800U both have high points of their own.....and they typically do excel in different areas. The thing about the hardmod is that there is FAR from a 100% success rate right now.

It basically depends mostly on what kind of games you play. Like previous cards, the 6800U holds a commanding lead in OpenGL performance.....whereas the X800Pro/XT excel more at the DX9 games of today.....Far Cry for one.

Take each card for what it is.....and don't expect anything more. Overall I'd still say that the X800XT is/will be the "top dog" of this generation.....but the Pro holds that title ATM, seeing as none of the other cards are actually released. ;)
 
What good is Nvidia's PS3.0 when the next fully utilized PS3.0 game (so far Unreal 3) will run like a slideshow on the 6800s? gg sit :)

And I doubt these PS3.0 patches released in the next few games over the next 6 months will really help much, if at all.
 
Unoid said:
What good is Nvidia's PS3.0 when the next fully utilized PS3.0 game (so far Unreal 3) will run like a slideshow on the 6800s? gg sit :)

And I doubt these PS3.0 patches released in the next few games over the next 6 months will really help much, if at all.
What? Stock to stock, the performance is really similar between the 6800U and X800XT PE.
 
By the time PS 3.0 is mainstream, ATi's new cards will be out, so it's really a moot point to even compare PS 3.0.
 
You can't compare stock to stock. ATi is using a low-k 0.13µm process and their core is smaller by 60 million transistors (160 million compared to 220 million, which is close to a 40% difference). Nvidia is suffering low yields and low clock speeds as a result: it's difficult to produce 220-million-transistor chips at high yields, since you get fewer chips per wafer, and you also need all 220 million transistors to be working, another difficult feat.
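Just to sanity-check those numbers (a quick sketch; the Poisson yield model is a standard back-of-the-envelope tool, but the defect figures here are made up for illustration, not real fab data):

```python
import math

# Transistor counts quoted above: ATi ~160M, Nvidia ~220M.
ati, nv = 160e6, 220e6

# Relative difference: (220 - 160) / 160 = 37.5%, i.e. "close to 40%".
diff_pct = (nv - ati) / ati * 100
print(f"{diff_pct:.1f}% more transistors")

# Rough Poisson yield model: yield = exp(-D * A), defects/cm^2 times die area.
# Assume die area scales with transistor count, and pick D*A so the smaller
# die yields 60% -- both purely illustrative assumptions.
da_small = -math.log(0.60)
da_big = da_small * (nv / ati)
print(f"bigger-die yield: {math.exp(-da_big):.0%}")
```

Even with made-up constants, the shape of the argument holds: a die that is ~37% bigger takes a disproportionate yield hit, because every extra square millimeter is another chance to catch a defect.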
 
Yeah... but it gets a greater performance boost for a lower clock gain. Kind of like AMD chips :)

And it's not like Shader Model 3.0 is its only feature. How about being able to branch, much like a CPU? Infinitely long shader instructions? Being able to group several low-poly models and render them at once? I'm pretty sure there are more; I just don't have the memory for these things.
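The branching point is the interesting one. As a toy sketch (hypothetical cost units, nothing to do with any real shader compiler): without dynamic branching every pixel effectively pays for the full shader path, while SM3.0-style dynamic branching lets pixels take an early out.

```python
# Toy cost model: dynamic branching (SM3.0-style) vs. flat execution.
# Cost numbers are made-up units for illustration only.
FULL_PATH, EARLY_OUT = 10, 1

def cost_flat(pixels):
    # No dynamic branching: every pixel runs the full lighting path.
    return FULL_PATH * len(pixels)

def cost_branching(pixels):
    # Dynamic branching: shadowed pixels skip the expensive lighting.
    return sum(EARLY_OUT if in_shadow else FULL_PATH for in_shadow in pixels)

frame = [i % 2 == 0 for i in range(1000)]  # half the pixels in shadow
print(cost_flat(frame), cost_branching(frame))  # 10000 5500
```

In practice the win depends on how coherently pixels branch (GPUs execute pixels in groups), which is part of why the "PS3.0 patch" debate above is so contentious.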
 
I'm more curious to know what PowerVR and S3 are doing. The "fringe" companies. And we haven't heard from Matrox in a while either. Also SiS.

Of course Nvidia and ati will release faster and faster.

I think software needs to catch up to hardware. To me it looks like software hasn't taken the same leaps that hardware has. DX9 games are barely penetrating the market, even though DX9 hardware has been out since even before the DX9 spec was released!

I think this really started with Quake 3. About that time, graphics cards started coming in constant updates and refreshes, progressing much more rapidly than the games. We're at the point now where a lot of whiz-bang features that even older cards support haven't gotten widespread adoption.
 
I have a feeling the only difference with any patch in any current game utilizing SM 3.0 is going to show itself in potential performance increases.

I don't think we are going to see ANY image quality differences

even with the new SM 3.0 Far Cry mod
 
THE JEW (RaVeN) said:
..........and Ati's completely rewriting their OpenGL code. Could be anybody's game come Doom time.

I could shoot fairy god parents out of my ass and wish to go back to the future and enjoy Doom 4.

I think we all know [or at least have a nagging suspicion] that Doom 3 will be dominated by Nvidia cards. The 6800U was built to tear D3 a new one. 32x0, US2, etc.
 
Unoid said:
What good is Nvidia's PS3.0 when the next fully utilized PS3.0 game (so far Unreal 3) will run like a slideshow on the 6800s? gg sit :)

And I doubt these PS3.0 patches released in the next few games over the next 6 months will really help much, if at all.


I/we will be ready for, and need, a new card more powerful than current offerings by the time PS3.0 is relevant.
 
Who cares? It's not like you lose any money or performance over SM3.0. Might as well take it.
 
PCX: Performance-Class PCI Express GPUs
==================================
" Dominate your games with the power of the NVIDIA® GeForce™ FX and GeForce PCX performance-class GPUs. Powered by advanced NVIDIA technologies including the CineFX™ 2.0 engine, UltraShadow™ technology, and the ForceWare™ unified software environment, the GeForce FX and GeForce PCX performance-class GPUs deliver high-performance cinematic gaming, unmatched features, rock-solid stability, and compatibility with next-generation games. "



GeForce FX
5950 Ultra:
Graphics Core 256-bit
Memory Interface* 128-bit
Memory Bandwidth* 30.4GB/sec.
Fill Rate 3.8 billion texels/sec.
Vertices per Second 356 million
Memory Data Rate 950 MHz
Pixels per Clock (peak) 8
Textures per Pixel** 16*
Dual RAMDACs 400 MHz

=========================

CineFX 2.0 Engine
The second-generation CineFX 2.0 engine powers advanced pixel and vertex shader processing and true 128-bit color precision. Delivering double the floating-point pixel shader power of the previous CineFX engine, CineFX 2.0 produces a visible performance boost through its more efficient execution of pixel shader programs.
UltraShadow Technology
Powers the next-generation of complex, realistic shadow effects by accelerating shadow generation. Accurate shadows that effectively mimic reality without bogging down frame rates are one of the keys to more believable game environments.
Intellisample HCT
Second-generation Intellisample technology delivers up to a 50-percent increase in compression efficiency for compressing color, texture, and z-data, and powers unprecedented visual quality for resolutions up to 1600 x 1280.
===============================
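For what it's worth, the spec-sheet numbers above hang together arithmetically. A quick check (the 475 MHz core clock is an assumption, since it isn't listed in the quote; and note the quoted 30.4 GB/sec only works out with a 256-bit bus, so the "128-bit" line above looks like a footnote mix-up in the copy/paste):

```python
# Memory bandwidth = effective data rate * bus width / 8 bits-per-byte
data_rate_hz = 950e6   # "Memory Data Rate 950 MHz" (effective DDR rate)
bus_bits = 256         # assumption; a 128-bit bus would give only 15.2 GB/sec
bandwidth_gb = data_rate_hz * bus_bits / 8 / 1e9
print(f"{bandwidth_gb:.1f} GB/sec")          # matches the quoted 30.4 GB/sec

# Fill rate = core clock * pixels per clock
core_hz = 475e6        # assumed 5950 Ultra core clock (not in the quote)
fill_rate = core_hz * 8 / 1e9                # "Pixels per Clock (peak) 8"
print(f"{fill_rate:.1f} billion texels/sec") # matches the quoted 3.8
```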
 
THE JEW (RaVeN) said:
..........and Ati's completely rewriting their OpenGL code. Could be anybody's game come Doom time.

I doubt we will see an ATi OpenGL update like that for a while ;)
maybe around a year, I'd say, give or take a few months
 
Bah choices choices, BFG is offering a 6800U with waterblock and EVGA is offering a 6800UEE. Damn, so now I've got 4 cards to decide from, Sapphire Toxic X800XTPE, Gainward Cool FX PowerPack Ultra/ 2600 "Golden Sample", BFG 6800U with waterblock, and the eVGA 6800UEE. Bah.

Damnit I want to see benches of things other than craptastic reference cards.
 
M4d-K10wN said:
Who cares; It's not like you lose any money or performance over SM3.0. Might as well take it.
Hmm, as of preorder prices today, it seems that you do lose money for SM3.0 ;)
 
From my point of view, they are all great cards, all have excellent features and have great benchmarks, the competition this time is a draw. Personally I'm buying a 6800 Ultra!
 
In some applications, yes. Once you start turning on AA/AF and upping the resolution, the differences are HUGELY noticeable.

True.....it isn't simply "twice as fast" in any and all games.....but the performance differences DO range to that extent in a lot of games/apps at insanely high IQ settings/resolutions. It was just worded improperly the first time when it was put as being "twice as fast".

From my point of view, they are all great cards, all have excellent features and have great benchmarks, the competition this time is a draw. Personally I'm buying a 6800 Ultra!

This is what I've been saying for some time now. This is simply NOT another 9700Pro launch all over again.....there is no clear winner ATM. Seeing as both cards have their ups and downs in different areas, the decision of what card to buy will play mostly on a person's preferences/history with either company.....and what specific games they play.

I can see where a lot of people are coming from when they say that "The (insert card name here) is the fastest video card there is" because they both do kick some serious ass. The truth is that neither card decimates the other in every test out there (like the 9700Pro ;) ) and it is essentially a draw.

Buy the card based on your personal preference and what YOU think, not what the marketing fools are ramming down your throats about PS3.0 and whatnot.....sheesh. :rolleyes:

How is that even relevant?

heh, I would hope he's not throwing in a vote for the 5950 for the fastest card......what a fool. :p
 
Yeah, but if I run a game at 1600x1200 on my 19-inch, I can't see the effects of AA, or of AF beyond 2x.
 
M4d-K10wN said:
Yeah, but if I run a game at 1600x1200 on my 19-inch, I can't see the effects of AA, or of AF beyond 2x.

Well I don't quite agree to THAT extent.....I will give you credit where it's due.....1600x1200 is sure nice. Being able to game/bench at that res is still one of the main reasons that I want to pick up a 19" screen, but alas. :(

Even at 1600 res I can still notice differences between 4x and 2x AA.....but whatever. And yes, that is while gaming.....I'm very picky. :p

The thing is that I'd rather play at 1280x1024 with 4xAA/8xAF than 1600x1200 with 2xAA/8xAF.....or something comparable. That is of course due to personal preference and many other things.....but it's MY preference, and I respect yours.

To me, differences between 4xAA and 6xAA are VERY small (while gaming or not), but there are still quite noticeable differences between 2x and 4x.....whether you're gaming or not. Gaming at higher resolutions cuts back the need for overly high levels of AA.....but I'd still rather have 1280 with 4x than 1600 with 2x.

EDIT: One thing I didn't touch on is the AF thing. No matter what size screen or resolution, I can ALWAYS tell the difference between 2x and 8x AF.....whether I'm at 1600 res or not.....I'm not quite sure what you're getting at. I know that some people have different eyes and notice different things.....but I've only ever seen higher resolutions as reducing the need for AA, not AF. Again, personal preference people.....not flame bait. ;)
 
I too can notice the difference between 2x-and-up AA/AF on my 19" monitor

I'm only using a 9600XT ATM, but it was a HUGE jump from my 7000 PCI, can you say "slideshow"?

Once I buy the X800 Pro (like next week) I'll see how it will chew up and spit out this 1600x1200 res that my 9600XT can only run a few games at


Also, I'm very tolerant of my FPS so I tend to go higher than [H] maxes their benches at, but that's just me
 
retardedchicken said:
Also, I'm very tolerant of my FPS so I tend to go higher than [H] maxes their benches at, but that's just me

Heh, whereas I'm about 10 million times more picky. I'll run Unreal2k4 at 1024x768 high details without AA/AF just because it's THAT much smoother. ;)
 