The mysterious 1.2 patch.

heyheyhey

Ok, I'm tired of everyone saying all this stuff like nVidia can't do full 3.0, nVidia can only do a mix of 16 and 32, and a whole bunch of stuff about 2.0B and ATi. But what are we basing all of this on? The FarCry 1.2 patch, quite possibly one of the glitchiest patches I've ever seen. If I were you, I would wait until more games come out with full and reliable 2.0B, 3.0, and 3DC support before making claims like that.
 
You do know the patch was pulled because some ATI cards crashed to desktop; it wasn't pulled due to anything wrong with Nvidia cards.
 
I know, but there are so many rumors going around and I'm sure there are people who believe them. I'm just pointing out that all of this is based on one very glitchy patch.
 
Well, I have posted a few times about glitches I am experiencing with the 1.2 patch and my 6800 Ultra, so you're wrong about the patch not affecting Nvidia cards.
Lord of Shadows said:
You do know the patch was pulled because some ATI cards crashed to desktop; it wasn't pulled due to anything wrong with Nvidia cards.
 
Just a rumor, but I heard there were notable precision issues with the mountain textures (textures that are near-vertical); I don't know if it was Nvidia or ATi. That doesn't seem like a big enough reason to pull the patch though, so I tend to think it was a crashing issue.
 
I didn't notice any texture problems; in fact, everything looked beautiful. The problem I had was that when I reached the cafeteria in the Control level, my FPS dropped to a slideshow. This happened out of nowhere. Everything was running smooth as silk until that point.
ZenOps said:
Just a rumor, but I heard there were notable precision issues with the mountain textures (textures that are near-vertical); I don't know if it was Nvidia or ATi. That doesn't seem like a big enough reason to pull the patch though, so I tend to think it was a crashing issue.
 
BTW, just to mention in passing: Tom did a review of the HIS X800 Pro and XT (the one with the Nvidia-sized heatsink) using FarCry 1.1.

As we already knew, the X800 XT easily beat out the 6800 Ultra, and the 6800 GT beat out the X800 Pro. They didn't bother to overclock either card, which is kind of unfortunate. Interesting how close the 9800 XT is to the 6800 non-Ultra.
 
SM3.0 is supposed to be a mix of FP16/FP32. That is how the spec works, because doing full FP32 is extremely resource-heavy and a waste of performance when FP16 offers no IQ loss in many scenarios. R520/NV50 will both be an FP16/FP32 mix. Crytek themselves said that FP16 was used only where FP32 was not useful.

Driverheaven wrote an article about this, attacking something they apparently know nothing about (as they have done twice before with NV40, and they were proven wrong both times). People complain about Tom's Hardware, but Driverheaven is really the king of bias and inaccuracy on the web IMO, and makes Tom's Hardware look totally unbiased in comparison.
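
Just to put some rough numbers on the FP16 vs FP32 argument, here is a little C sketch of my own (purely an illustration, nothing from Crytek or the patch) that rounds a few values through IEEE 754 half precision and prints the error. FP16 keeps roughly three decimal digits of precision, which is why partial precision is usually fine for color math but can show artifacts when values get large or errors pile up across a long shader.

/* Illustration only: round a 32-bit float through IEEE 754 half precision
 * (FP16) and back, to show how much accuracy partial precision gives up.
 * Simplified: normal numbers only, truncation instead of round-to-nearest. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

static float through_fp16(float f)
{
    uint32_t bits;
    memcpy(&bits, &f, sizeof bits);

    uint32_t sign = bits >> 31;
    int32_t  exp  = (int32_t)((bits >> 23) & 0xFF) - 127; /* unbiased exponent */
    uint32_t mant = bits & 0x7FFFFFu;                     /* 23-bit mantissa   */

    /* FP16 has a 5-bit exponent (range -14..15) and a 10-bit mantissa. */
    if (exp < -14 || exp > 15)
        return sign ? -0.0f : 0.0f;   /* out of half range: flush (simplified) */
    mant &= ~((1u << 13) - 1);        /* drop the 13 low mantissa bits */

    uint32_t out = (sign << 31) | ((uint32_t)(exp + 127) << 23) | mant;
    memcpy(&f, &out, sizeof f);
    return f;
}

int main(void)
{
    const float samples[] = { 1.0f, 0.123456789f, 100.25f, 1000.125f };
    for (int i = 0; i < 4; i++) {
        float fp16 = through_fp16(samples[i]);
        printf("fp32 %-12.9g -> fp16 %-12.9g (error %g)\n",
               samples[i], fp16, samples[i] - fp16);
    }
    return 0;
}

Compile it with any C compiler; the point is just that the error is zero for simple values like 1.0 and grows once a value needs more than about 11 significant bits, which is the tradeoff the partial-precision hints are making.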
 
ny555soul said:
I didn't notice any texture problems; in fact, everything looked beautiful. The problem I had was that when I reached the cafeteria in the Control level, my FPS dropped to a slideshow. This happened out of nowhere. Everything was running smooth as silk until that point.

Well, Tom's Hardware did an in-depth IQ comparison between the two patches and found no differences between the SM2.0 and SM3.0 paths, except that the lighting in the 3.0 path was dimmer (looks more realistic) and that textures were flickering in some areas in the SM2.0 path on ATI cards.
 
I thought part of true SM3.0 is full 32-bit...

""Nvidia encouraged Crytek to use partial precision in the shaders wherever possible. This means that many of the shaders will run in 16-bit precision, not the 32-bit precision (a requirement of SM3.0) that they are touting to the press."
 
tranCendenZ said:
SM3.0 is supposed to be a mix of FP16/FP32. That is how the spec works, because doing full FP32 is extremely resource-heavy and a waste of performance when FP16 offers no IQ loss in many scenarios. R520/NV50 will both be an FP16/FP32 mix. Crytek themselves said that FP16 was used only where FP32 was not useful.

Driverheaven wrote an article about this, attacking something they apparently know nothing about (as they have done twice before with NV40, and they were proven wrong both times). People complain about Tom's Hardware, but Driverheaven is really the king of bias and inaccuracy on the web IMO, and makes Tom's Hardware look totally unbiased in comparison.

I hate to say this, but Tom's is the last place you should quote for actual information. Just MHO. ;)

This is the same site that says AMD's A64s are not fast and never has anything good to say unless it's Intel or Nvidia.
 
tranCendenZ said:
Well, Tom's Hardware did an in-depth IQ comparison between the two patches and found no differences between the SM2.0 and SM3.0 paths, except that the lighting in the 3.0 path was dimmer (looks more realistic) and that textures were flickering in some areas in the SM2.0 path on ATI cards.

Well, technically speaking, I don't believe the patch was created with the full intention of improving IQ. As far as using the SM3.0 capability goes, I believe the 1.2 patch was created for performance.
 
I didn't read that review, but I went from the 61.21 drivers, 1.1 patch, and DX 9.0b to 61.76, the 1.2 patch, and DX 9.0c, and in my opinion IQ seemed a lot better, and FPS was smooth even at 1600x1200 with maxed settings. But as soon as I reached the cafeteria in the Control level, FPS just dropped to a slideshow. I tried moving to an area where the effects weren't heavy, but the slideshow didn't go away. I can't think of any reason other than the 1.2 patch being the cause.
tranCendenZ said:
Well, Tom's Hardware did an in-depth IQ comparison between the two patches and found no differences between the SM2.0 and SM3.0 paths, except that the lighting in the 3.0 path was dimmer (looks more realistic) and that textures were flickering in some areas in the SM2.0 path on ATI cards.
 
I had no problems with FarCry 1.2, but there were some odd things happening, like flickering textures when I am close to them (like tree sides when I am crouched behind one). I dropped my video card OC to stock speed, and everything went back to normal. I have a 6800 GT + 61.76 + DX9b (I didn't even bother enabling SM3.0; I'll wait for the final patch & DX9c).
 
ny555soul said:
Well, I have posted a few times about glitches I am experiencing with the 1.2 patch and my 6800 Ultra, so you're wrong about the patch not affecting Nvidia cards.


I never said it only affected ATi cards.
 
R1ckCa1n said:
I thought part of true SM3.0 is full 32-bit...

""Nvidia encouraged Crytek to use partial precision in the shaders wherever possible. This means that many of the shaders will run in 16-bit precision, not the 32-bit precision (a requirement of SM3.0) that they are touting to the press."

Right, this is where Driverheaven is misleading/wrong; glad you pointed it out. For SM3.0, the card must be capable of FP32 at minimum, but SM3.0 also allows partial precision at a lower precision, such as FP16. Just Driverheaven BSing again.

FP24 will be dead after this gen; Nvidia has the more future-oriented (and smarter, more flexible) shader architecture, with support for both 64-bit and 128-bit shaders. Odds are ATI will adopt the same architecture for R520 in order to stay competitive with SM3.0.
 
I never said you said it only affected ATI cards, lol. I was referring to Lord of Shadows' post. Next time read the entire post before letting your fingers do the talking :D
heyheyhey said:
I never said it only affected ATi cards.
 
Well, you posted above the quote, so I thought you were using it to back yourself up. Sorry :)
 
R1ckCa1n said:
I hate to say this, but Tom's is the last place you should quote for actual information. Just MHO. ;)

This is the same site that says AMD's A64s are not fast and never has anything good to say unless it's Intel or Nvidia.


If Tom's Hardware were a paper rag, it would be put to good use in my bathroom for wiping...


A64s are owning pretty hard in the latest games; clock for clock they are kicking the shit out of regular XPs...
 
R1ckCa1n said:
I hate to say this, but Tom's is the last place you should quote for actual information. Just MHO. ;)

This is the same site that says AMD's A64s are not fast and never has anything good to say unless it's Intel or Nvidia.

You're full of it. Tom's Hardware has been known as an ATI- and AMD-biased site just as often as it has been known as an Intel- and nVidia-biased site. You just haven't been around very long, I guess. Who they're biased towards changes over time with whoever they show to be faster. If the benchmarks are conducted in a fair manner, then you can't say they're biased just because the performance results don't match up with how YOU think they should be. It's the same way with HardOCP and their reviews. I may not agree with their findings and how they went about getting them, but I don't think they are totally biased towards ATI and would skew the results in ATI's favor.

None of their reviews have said the AMD A64s are not fast. They've shown the same kind of performance results most other sites are getting. I KNOW, because a few months back I did extensive research on both CPUs and had nearly every review that had been published bookmarked.

Tom's Hardware has made mistakes before about info they have quoted, yes. So has every other site on the net, including HardOCP. Tom's Hardware has always been one of the sites to say the kind of stuff that would get most other sites in trouble. They aren't afraid to push their articles to the limit to put out the truth. They exposed faults in the Pentium 4s when they were first released, and Intel stopped sending them CPUs for a while.

I don't know how all this Tom's Hardware BS has managed to get started over the years. I guess it's because Tom's Hardware is the most popular and highest-trafficked hardware site online.

I know a lot of the moderators and admins at Tom's Hardware, along with a few of the reviewers, and a large portion of them prefer AMD processors. And the nVidia vs. ATI debate is null and void because they've been showing the same results other sites are getting.

TheRapture said:
A64s are owning pretty hard in the latest games; clock for clock they are kicking the shit out of regular XPs...

That's true, but the majority of PC games are GPU-intensive, and usually the speed of the processor has very little to do with performance in those games if you max out the eye candy like you're supposed to. If you want to run 800x600 benchmarks, then yeah, the A64 beats the XP and Pentium 4 in gaming, lol. But if you look at HardOCP's latest CPU scaling review, the Pentium 4 3.0C performed as well as or better than the A64 3500+ because the GPU was doing all the work at 1600x1200 with AA + AF.
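
As a toy illustration of that bottleneck argument (my own made-up numbers below, not HardOCP's results): the frame rate you actually see is capped by whichever of the CPU or GPU is slower, so once the GPU becomes the limit at high resolution, a faster CPU stops showing up in the benchmark.

/* Toy bottleneck model with made-up numbers (not real benchmark data):
 * effective fps = min(fps the CPU could feed, fps the GPU could render). */
#include <stdio.h>

static double min2(double a, double b) { return a < b ? a : b; }

int main(void)
{
    /* Hypothetical standalone rates for two CPUs and one GPU. */
    const char  *cpu_name[] = { "CPU A", "CPU B (faster)" };
    const double cpu_fps[]  = { 140.0, 180.0 };
    const double gpu_low    = 250.0;  /* GPU limit at 800x600              */
    const double gpu_high   = 60.0;   /* GPU limit at 1600x1200 with AA/AF */

    for (int i = 0; i < 2; i++) {
        printf("%-15s  800x600: %3.0f fps   1600x1200+AA/AF: %3.0f fps\n",
               cpu_name[i],
               min2(cpu_fps[i], gpu_low),
               min2(cpu_fps[i], gpu_high));
    }
    return 0;
}

With these hypothetical numbers the two CPUs separate at 800x600 (140 vs 180 fps), but at 1600x1200 with AA/AF both sit at the GPU's 60 fps, which is roughly the pattern being described above.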
 
Here's one reason why the 1.2 patch was recalled.

[Screenshot: FarCry 2004-07-21 15-23-40-10.jpg]


This was happening to a few (but not many) people before 1.2; after 1.2 it got much, much worse. I originally thought it was the 61.76 drivers, but it wasn't: it happens with just about every driver on the Nvidia side, which leads me to believe it is a game bug.

Oh, and for those who were wondering, it would happen with or without SM3.0, and with or without DX 9.0c. I've tried every config I can think of, except uninstalling and reinstalling Far Cry.

More pics if anybody is interested:

http://snipershide.free-universe.com/remote/far_cry/6176_crap/
 
TheRapture said:
A64s are owning pretty hard in the latest games; clock for clock they are kicking the shit out of regular XPs...

That's true, but the majority of PC games are GPU-intensive, and usually the speed of the processor has very little to do with performance in those games if you max out the eye candy like you're supposed to. If you want to run 800x600 benchmarks, then yeah, the A64 beats the XP and Pentium 4 in gaming, lol. But if you look at HardOCP's latest CPU scaling review, the Pentium 4 3.0C performed as well as or better than the A64 3500+ because the GPU was doing all the work at 1600x1200 with AA + AF.

But I can't run 1600x1200 because of my LCD, so my CPU IS a bottleneck. Still, I do have the in-game settings cranked: 1280x1024, VERY HIGH, water on ULTRA HIGH, 2x FSAA and 4x AF. Very nice!
 
ZenOps said:
Oh yeah, Tom's is an ATi fansite for sure:

http://graphics.tomshardware.com/graphic/20030127/geforce_fx-29.html

"NVIDIA takes the crown! No question about it - the GeForceFX 5800 Ultra is faster than the competition from ATI's Radeon 9700 PRO in the majority of the benchmarks."

Uh huh, what was Tom smoking when he posted that one.

They push the product they feel is better. Being a fansite for either one is not part of the equation.

http://graphics.tomshardware.com/graphic/20040723/his-08.html

As you can see, these cards offer blazingly fast performance. Both the HIS Excalibur X800 Pro IceQ II and the faster X800 XT (PE) left us with a very good impression. Nonetheless, our unrestricted recommendation for ATi fans is the cheaper Excalibur X800 Pro IceQ II. What's not to like? It offers very good 3D performance thanks to its 12 pixel-pipeline design. Also, it's very quiet because of the IceQ II cooler. Lastly, the good cooling efficiency should also make overclockers happy. If those last two factors are not as important to you, we recommend considering the Excalibur X800 Pro with the standard cooling solution, which should save you somewhere around $40 compared to the IceQ II version. The direct competitor to the X800 Pro is NVIDIA's GeForce 6800 GT, which cuts a better figure in our benchmarks thanks to its 16-pipe design.

Its bigger sister, the Excalibur X800 XT (PE) IceQ II, offers more 3D performance, as it also benefits from its 16 pixel-pipe design. It is positioned directly against NVIDIA's GeForce 6800 Ultra, and for the moment, with the drivers currently available, the two seem more or less evenly matched. Of course, like everything good, this additional performance comes at a premium price. Nonetheless, for absolute enthusiasts, the Excalibur X800 XT PE IceQ II is just what the doctor ordered.

I guess you're going to have to point out for me where they are being "fanboyish" in that.
 
Name one other site that did not completely bash the 5800 as possibly the worst chipset Nvidia has ever released (and I'm talking all the way back to the TNT). And yet Tom had the audacity to "give it the crown." If I could find it, I'd even show you the one where he gave the 5800 the "video card of the year" award, but I think that one has been erased forever.
 
Tom has some good info. The problem is he "gives the crown" to whoever lines his pockets best, whether that be with products or cash. It used to be AMD, then Intel moved in, and since they could "support" him better, they get the crown. Same with ATI and Nvidia.

Instructional stuff and benches are the only things I go to THG for, and you have to really read into the benchmarks. Quite often he'll test his supporter's product on a superior setup and crown it king when the competition comes very close on a far inferior system. I've seen 3DMark benchmarks where one system was running a slower CPU, or half the RAM, etc. His reviews are most definitely biased. He has good information; you just have to look deeper than face value.
 
ZenOps said:
Oh yeah, Tom's is an ATi fansite for sure:

http://graphics.tomshardware.com/graphic/20030127/geforce_fx-29.html

"NVIDIA takes the crown! No question about it - the GeForceFX 5800 Ultra is faster than the competition from ATI's Radeon 9700 PRO in the majority of the benchmarks."

Uh huh, what was Tom smoking when he posted that one.

Before the more intensive DX9 games came out he was right: the FX 5800 Ultra is fast as hell in DX8, faster than the 9700 PRO, and you can see that from his benchmarks. At the time he wrote that review, his assessment went unchallenged by any game.
 