Thatonen00b
Limp Gawd
Joined: Apr 11, 2004
Messages: 358
burningrave101 said: And from the looks of Call of Duty, which is OpenGL, I can pretty much say it for certain lol.
Yeah, the moment I saw those I closed the window.
burningrave101 said: That may be true, but if you're not one of those people running less than a name-brand 350W, then there is no point in making it an issue in whether or not to buy one. Everyone keeps bringing up the power issue as one of the negatives of the 6800U, and I bet there is hardly a single one of you that has less than a 350W PSU installed. So to continuously bring it up is a bit ironic, given that very few of you will have an issue that needs addressing.
Most likely ATI will increase their transistor count in their next release this fall or whenever, and when they do, it will likely draw as much power as the 6800U. You're better off just keeping your power supply upgraded above all else, because a poor-quality PSU that can't supply sufficient power can wreck your whole system. The more transistors they pack into the GPU core, the more power it's going to draw. nVidia has 220 million in the 6800U compared to the X800XT PE's 160 million.
trungracingdev said: Maximum PC did a test on the PSU issue with the 6800 Ultra. They tested a PC Power & Cooling 410W and 510W, a Fortron 400W, and 250W and 400W generic PSUs. Every PSU passed the test except the generic ones. Understand this: not everyone buys a name-brand PSU. Most people buy their computers pre-built from manufacturers, which use generic PSUs to cut corners. How you manage to ignore the facts is beyond me.
Fine, you have a nice system. How about those who don't?
The GeForce 6800 GT at default clock speeds runs a 350 MHz core and 2x500 MHz memory (1000 MHz effective).
Overclocked, it was capable of running a 401 MHz core and a 530 MHz (1060 MHz effective) memory frequency. Now that is bang for your buck, as that is roughly Ultra speed. Memory was a tad disappointing, but hey, you can't blame NVIDIA there; otherwise it would be competing with its own product.
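Putting rough numbers on that headroom (DDR memory moves data twice per clock, so "effective" is double the real memory clock). A quick illustrative sketch in Python; the figures are the ones quoted above, the helper is just for the arithmetic:

[code]
# Back-of-the-envelope overclock math for the 6800 GT figures above.
# DDR memory transfers data twice per clock, so "effective" = 2x real clock.

def pct_gain(stock, oc):
    """Percent increase from stock to overclocked frequency."""
    return (oc - stock) / stock * 100

core_stock, core_oc = 350, 401   # MHz
mem_stock, mem_oc = 500, 530     # MHz real clock (DDR)

print(f"Effective memory: {2 * mem_stock} -> {2 * mem_oc} MHz")
print(f"Core gain:   {pct_gain(core_stock, core_oc):.1f}%")  # ~14.6%
print(f"Memory gain: {pct_gain(mem_stock, mem_oc):.1f}%")    # ~6.0%
[/code]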
I agree with your comments. However, I see a bigger problem with the 6800GT. There are a lot of comments from people saying "just buy the 6800GT and overclock it to Ultra levels". However, minus one molex connector, you are going to run into some power problems. Even if you overclock the 6800GT core to 400 MHz, you may only see a frame or two of extra performance. Nvidia has even confirmed that the 6800GT will "clock down" if there is not enough power available, and without the 6800U's second power connector, I doubt there will be.
burningrave101 said:http://www.guru3d.com/article/Videocards/135/7/
6800GT overclocked to 425 MHz core and 1.2 GHz RAM. 6800 Ultra Extreme overclocked to 475 MHz core and 1.255 GHz RAM.
jarman said:Okay...PCFormat is reporting the exact opposite (PCFormat, July 2004, 163, page 55). I'm interested to see what Kyle and the boys have to say about this with retail boards.
burningrave101 said:And your point is?
nVidia had held the lead for a long time, and ATI only just started to take it with the 9500 Pro and 9700 Pro. They had just begun to be in the running and actually competitive against nVidia's top tier.
Do I need to repeat it again?
SM 3.0 will be implemented in quite a few games long before six months from now, I'm sure, and OpenGL games like Doom 3 will likely perform a whole lot better on an NV40.
This isn't FUTURE tech. This is "the next 2-3 months" tech.
NV40 also has UltraShadow II technology, which Doom 3 uses.
burningrave101 said: Those clocks go with the FiringSquad review. Guru3D got 401/1060 on their 6800GT. How high you can OC is going to vary from board to board.
And what is PCFormat? I don't guess I've ever looked at that one.
agar said: I just don't see how you can go wrong with $434.99 for an XT/PE. You can argue 2-3 fps here or there, but at that price nothing else competes.
randsom said: BFG 6800 GT OC for $299.99, one of the best bang-for-the-buck deals ever. I'll never notice the 5 fps difference.
trungracingdev said: Are you talking about the one-time Best Buy sale that's long over? If not, please link me to the price.
Blackwind said:Catching up? LOL
They lapped them on the playing field the last two releases running....
9700 Round 1 ....Winner!
9800 Round 2 ....Winner!
If anyone has some redemption to do or "catching up"....it's nVidia. I mean really...that little doohickey to force trilinear that nVidia now has in their control panel.....you think that got there by accident? It was a direct response to them getting banged over the head by users, hardware review sites, vendors, and grandmothers who know a thing or two about vid cards.
Ok...maybe not the grandmothers.
nVidia appears to be back on track, but I think you are wearing blinders if you can't see the accomplishments of ATI. There's no shame in simply saying: nVidia got their butts handed to them, and now they are putting in more GREAT effort to make amends. More power to 'em, and I as a consumer have more options......
LOL. So now everyone should wait till Doom 3 to decide, eh? The life buoy is right in front of you....it has ATI written on it.
Stop flogging the excuses already.
Dyslexic said: Well said in all comments. This guy is just a MEGA NVIDIA FANBOY; he can't give ATI any props. Hey burningrave101, have you ever owned an ATI card? Maybe the 9700 Pro, a card everyone knows was the best you could get for like a year? Or did you blindly follow your fanboy instincts?
ToastyBoy said: Thanks for the link, but it doesn't look like either card was the clear winner. At high res and max settings it was 6 for Nvidia, 5 for ATI, and 1 tie. It almost seemed like they intentionally tried to skew the results in Nvidia's favor by using so many QIII-based games (QIII, Wolfenstein, + Jedi Academy). I'm still gonna buy the X800XT since it won all of the games that really matter to me....the ones I play (like Halo, UT2004, + Far Cry).
burningrave101 said: Oh yea, I'm a MEGA NVIDIA FANBOY
And even if they do use 3Dc, there will have to be a fallback to DXT5 for all those that don't have cards that support it, and there are very minimal differences between the two. And all it is is texture compression.
BTW, the X800 does seem to add a few features beyond the 9800's. I'm not sure if the 9800 does sub-surface scattering (or a different way of doing it) or MRTs. So there were some core enhancements other than just a "doubling up" on the old tech.
burningrave101 said: Yea right lol. Yea, you're not an ATI fanboy. You've just been going on about ATI non-stop this whole thread, reaching for the sky for reasons to get the X800XT PE and not a 6800U, even though the 6800U is the better card in performance and technology. If your only reason is a $70 price difference, it's not a very good one.
Merlin45 said: Most things done in PS 3.0 can be multipassed in PS 2.0 at a performance decrease. VS 3.0, on the other hand, cannot be emulated in VS 2.0 (you cannot do texture lookups in VS 2.0, so no displacement mapping; see the sketch below).
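To illustrate the displacement mapping point: each vertex has to read a texture (the heightmap) before it can be positioned, and that per-vertex texture fetch is exactly what VS 3.0 adds and VS 2.0 lacks. A CPU-side analogy in Python, not actual shader code; the toy heightmap and helper names are made up:

[code]
# CPU-side sketch of what displacement mapping asks of a vertex shader:
# every vertex must sample a texture before it can be positioned.

heightmap = [[0.0, 0.2],
             [0.5, 1.0]]  # toy 2x2 heightmap (illustrative only)

def sample(u, v):
    """Nearest-neighbour lookup, standing in for a shader texture fetch."""
    x = min(int(u * 2), 1)
    y = min(int(v * 2), 1)
    return heightmap[y][x]

def displace(pos, normal, uv, scale=1.0):
    """Move a vertex along its normal by the sampled height."""
    h = sample(*uv)  # <- the per-vertex texture read VS 2.0 cannot do
    return tuple(p + n * h * scale for p, n in zip(pos, normal))

print(displace((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.9, 0.9)))
# -> (0.0, 1.0, 0.0): the vertex moved up by the sampled height 1.0
[/code]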
agar said:Humus Demo, using 3Dc.
With shadows:
No compression: 125fps
3Dc: 146fps (+17%)
DXT: 136fps (+9%)
3Dc & DXT: 158fps (+26%)
Without shadows:
No compression: 164fps
3Dc: 210fps (+28%)
DXT: 195fps (+19%)
3Dc & DXT: 239fps (+46%)
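(Those percentages check out against the raw fps. A quick sanity-check sketch; the fps figures are from the demo runs above, the helper is just illustrative:)

[code]
# Sanity-check of the speedup percentages quoted above.
def speedup(base_fps, fps):
    """Percent gain over the uncompressed baseline."""
    return round((fps - base_fps) / base_fps * 100)

with_shadows = {"3Dc": 146, "DXT": 136, "3Dc & DXT": 158}     # base 125
without_shadows = {"3Dc": 210, "DXT": 195, "3Dc & DXT": 239}  # base 164

for name, fps in with_shadows.items():
    print(f"with shadows, {name}: +{speedup(125, fps)}%")   # +17, +9, +26
for name, fps in without_shadows.items():
    print(f"no shadows, {name}: +{speedup(164, fps)}%")     # +28, +19, +46
[/code]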
Available here: http://esprit.campus.luth.se/~humus/ and this is a thread discussing its advantages over DXT: http://www.beyond3d.com/forum/viewtopic.php?t=13579. Looks like there will be performance and visual differences between the two. The funny thing is, burning, you just spew crap from reviews without really researching what it is you are spewing. It's like you d/l Nvidia corporate docs and cut and paste them here.
"The 6800U supports pixel shader 3.0, vertex shader 3.0, 32-bit floating-point, and UltraShadow II Technology" ALL BOW DOWN TO THE HIGH-TECH ADVANCES THAT HAVEN'T BEEN USED IN ANY GAMES YET.. It's like the people flamebaiting others about TruForm and how it was going to change graphics. Give me a break. This is your classic quote of the week: "i can see the truth when the rest of you seem to ignore it and the rest of you are totally unbiased". Wow!
I think you should re-think your comments and previous posts before you make statements like that.
Blackwind said:Are you really that clueless?
I would take the X800XT, is there any doubt? Really, it should only be a consideration if the other card were a 6800 Ultra, but it's not, it's a 6800GT.
http://www.spodesabode.com/content/article/nv40r420p2/5
Thanks for the link. ATi seems unconcerned with 3DMark scores as usual, and Far Cry (which should be optimized for Nvidia) is right on the same level as a 6800 Ultra. Otherwise, it's quite a bit faster than a 6800 Ultra. Nothing new to see, but it's a nice small one-chart page which pretty well sums it all up.
burningrave101 said: [snip because it's so damn long!]
Those pics are cool, but they don't compare them to Shader 2.0. They seem to show HL2 screenies as if it was running SM 3.0 when it doesn't even have SM 3.0 implemented yet. I'm not taking away from SM 3.0 tech at all, I just don't believe that you will see a big visual quality difference. That's just my opinion. I could be totally wrong, but I haven't seen any evidence either way.
gsboriqua said: Those pics are cool, but they don't compare them to Shader 2.0. They seem to show HL2 screenies as if it was running SM 3.0 when it doesn't even have SM 3.0 implemented yet. I'm not taking away from SM 3.0 tech at all, I just don't believe that you will see a big visual quality difference. That's just my opinion. I could be totally wrong, but I haven't seen any evidence either way.
Getting performance out of their drivers? Something ATI has never been able to do.
burningrave101 said: Oh yea, I'm a MEGA NVIDIA FANBOY just because I can see the truth when the rest of you seem to ignore it, and the rest of you are totally unbiased altogether because you like nVidia and ATI both the same.
I never denied the poor performance of the NV30, because ATI definitely had the better cards with the 9700 Pro/9800 Pro. nVidia turned it around a lot with the 59XX cores though, and a few solid driver updates helped in DX9 games. ATI deserves recognition for their accomplishments over the last two years, but that doesn't just dismiss all their years of suckage while nVidia held the top crown.
Some of you guys really crack me up, I tell ya. Just because none of you can come up with anything better, you resort to calling people a fanboy lol. I love it.
Dyslexic = xSyzygy666x lol.
Burningrave, you have got to be the most annoying Nvidia fanboy to pop his head up in some time.
For starters, your above sentence is the biggest load of crap ever spouted. Ever hear of, or use, a card called the Radeon 8500? The performance gained out of those cards with later driver releases was nothing short of excellent, and certainly as good if not better than anything nVidia has done to date!
They were still getting performance out of that card nearly two years later with driver releases.
As for all this constant badgering you keep spouting in every goddamned thread that mentions the X800XT: STFU unless you own said card and know for sure how it overclocks and plays games. And if you want to buy the 6800GT, do it, and stop busting everyone else's chops about it. Freedom of choice: some prefer ATI, some prefer NVIDIA!
As for the original thread topic, my opinion is go with the X800XT: it's the top of ATI's line of cards for the same price as Nvidia's middle-of-the-road card!
IT'S NOT THE BEST, IT'S NOT THE WORST, IT'S JUST AS GOOD AS ANYTHING NVIDIA CURRENTLY HAS, FULL STOP!
Dyslexic said: No, the reason I call you a fanboy is because you will not admit that both the 6800 and the X800 are great cards. Each has good points and bad points. All of the reviews, new and old, that I have seen say they perform very well and are fast as hell. NVidia has had ups and downs like any GPU company; they used to get spanked by 3dfx back in the day, so they were not always on top before the 9700 Pro came along. They got good when the first GeForce cards came out and held the lead till the GeForce 4s got slammed by the 9700s; it wasn't for all that long. It's a tough thing to stay on top; no one does it forever. So stop and let your nose pull free from the collective asses of NVidia and just take a break.
X800: 66°C
6800 GT: 49°C
Both cards were tested at a room temperature of 22°C. The temperature measured is of course not the core temperature. Both cards were measured at the same point, after three runs of Aquamark 3 in its heaviest mode. So unless you have some seriously good cooling, take note that the card will create higher ambient temperatures inside your case, which can have an adverse effect on other components, especially an overclocked processor (CPU).
http://www.anandtech.com/cpu/showdoc.html?i=2091&p=1
By the way, this review that you posted proves that the X800 XT is faster with AA and AF enabled in all but old Q3-engine games. I don't know anyone with a decent rig that doesn't use AA and AF on every game they play.
Dyslexic said: That's funny, because the article you posted says the following:
"With this article, we were also trying to put an end to the ATI vs. NVIDIA PCI Express debate. Our conclusion? The debate was much ado about nothing - both solutions basically perform the same. ATI's native PCI Express offering does nothing to benefit performance and NVIDIA's bridged solution does nothing to hamper performance. The poor showing of NVIDIA under Far Cry and Warcraft III is some cause for concern, which we will be looking into going forward. We will keep you all updated on any and all findings with regards to that issue as we come across them."
Doesn't sound like they think the 6800 Ultra is the better card.