Old 8800GTS 640MB better than HD5770

Homey D. Clown

Why does everything look so much better with my 8800GTS than it does with my 5770? I've tried Call of Duty: Black Ops, DiRT 2, Crysis, and Crysis Warhead, and the 8800 makes everything look so much better. Weird.
 
This is a pretty vague post. In what way? Are you using the same exact settings? There really shouldn't be any [noticeable] difference at the same settings.
 
I'd say the 5770 is better in every way. IIRC the 640MB 8800GTS was a little slower than an 8800GT, and I know for a fact the 5770 is quite a bit faster than an 8800GT. Throw in DX11 and much lower power consumption than the GTS and you've got a winner.

As far as why your games look worse, I don't have an answer for that. They certainly shouldn't.
 
Does it actually look worse, or just different and you're not used to it? I know when I swap from my AMD to my NVIDIA machines the graphics do look a little different, though I wouldn't say one is worse than the other.
 
If it looks worse, you may want to make sure the NVIDIA drivers did a complete uninstall...

Swapping from one to the other, without a fresh Windows install, can cause issues.
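
If you want a quick sanity check after the uninstall, here's a minimal sketch in Python, assuming a standard Windows install; the paths are just common NVIDIA leftovers to look for, not an exhaustive or official list:

[code]
# Minimal sketch: look for a few typical NVIDIA leftovers after an uninstall.
# The paths below are common examples on a standard Windows install,
# not an exhaustive or official list.
import os

suspects = [
    r"C:\NVIDIA",                          # folder the driver installer extracts to
    r"C:\Windows\System32\nvapi.dll",      # NVIDIA helper DLL
    r"C:\Windows\System32\nvcpl.dll",      # NVIDIA control panel DLL
    r"C:\Program Files\NVIDIA Corporation",
]

for path in suspects:
    if os.path.exists(path):
        print("Leftover found:", path)
    else:
        print("Clean:", path)
[/code]

If any of those are still hanging around, a driver cleanup pass before installing Catalyst is probably worth it.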
 
Need more details on how exactly it looks worse. Blurry? Less colorful? Less detailed?

Anyway, you could check to make sure the mipmap detail level is set to high quality in Catalyst. Were you running any digital vibrance with the NVIDIA card?

Another thought, if you're using an LCD monitor, would be that you somehow have some interpolation/scaling going on. I'd double-check the scaling options and overscan/underscan in Catalyst.
 
I haven't changed any settings, just swapped cards and drivers.
You didn't have any game profiles set up with your NVIDIA card with different AF/AA levels or anything like that?

Try going into the AMD control panel and setting the filtering quality to High Quality, for starters.
 
With the 8800, the little details are more visible, even though with the ATI card I had the settings set for best quality. It's strange, because I know I should be getting better results with the ATI than with the 8800. Maybe I just got a bad card. I got both cards when they were the newest models, so they are pretty old.
 
Yet your sig says...5850?

So.......you confused? On medication?:)
 
The flashed 5770 Mac card is still in the Mac; the 5850 died, and I need to change my sig. The 5770 is the card I was using till I upgraded to the 5850.
 
With the 8800, the little details are more visible, even though with the ATI card I had the settings set for best quality. It's strange, because I know I should be getting better results with the ATI than with the 8800. Maybe I just got a bad card. I got both cards when they were the newest models, so they are pretty old.

You forgot to mention one thing: you are using a flashed HD5770 in a Mac.

Dude, seriously, I can't imagine why the 8800GTS looks better.
Maybe because it was designed for the Mac whereas the other was hacked to work... :rolleyes:

Since you own an Apple proprietary system, suck it up and go buy an Apple-certified graphics card.

/thread
 
Haha, there is a difference; usually AMD/ATI has better visuals, but in all honesty it may just be down to the monitor/settings being used.

I do know one thing, though: there is no way in hell that an 8800GTS 640 has better visuals than an HD5770.
 
There is a difference in the way two different hardware setups run games, and only a handful of people are able to see it.
 
The problem is that the OP is running a hacked HD5770 on a Mac. I can only imagine how many issues that will cause, and I'm not about to try to troubleshoot something like that on such a proprietary system.

If he wants a new GPU on his Mac, he needs to order one from Apple, period.

Also, he's using a first-gen Mac Pro from 2006. Time to upgrade to a modern system that will actually play games, unless of course he likes playing at 20fps.
 
This is not possible.

It's just like the TV ads: they can't show you how much better their TV is than the one you have.
That's only partially correct. If there are anomalies in the output (for example, poor anisotropic filtering or AA that isn't working correctly), you can most definitely capture that via screenshots. That's how both ATI and NVIDIA have been caught cheating before. But if it's just "I prefer the color output of vendor X" for some obscure, most likely placebo reason, that's not going to be possible to capture.
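
For anyone who wants to actually test that, here's a rough sketch using Pillow to diff two screenshots captured at the same settings and resolution on each card; the filenames are just placeholders. Real filtering/AA anomalies show up as non-zero pixels, while "it just feels different" won't.

[code]
# Rough sketch: diff two screenshots taken at identical in-game settings and
# resolution, one per card. Genuine rendering differences (AF, AA, LOD) show
# up as non-zero pixels; identical output produces an empty diff.
# Filenames are placeholders.
from PIL import Image, ImageChops

shot_a = Image.open("8800gts_shot.png").convert("RGB")
shot_b = Image.open("hd5770_shot.png").convert("RGB")

diff = ImageChops.difference(shot_a, shot_b)

if diff.getbbox() is None:
    print("Frames are pixel-identical; nothing to capture.")
else:
    print("Frames differ, per-channel (min, max):", diff.getextrema())
    diff.save("difference_map.png")  # bright areas mark where the cards disagree
[/code]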
 
Blurry? Change your refresh rate, problem solved :p

Edit: nevermind, I misread a post lol
 
That's only partially correct. If there are anomalies in the output (for example, poor anisotropic filtering or AA that isn't working correctly), you can most definitely capture that via screenshots. That's how both ATI and NVIDIA have been caught cheating before. But if it's just "I prefer the color output of vendor X" for some obscure, most likely placebo reason, that's not going to be possible to capture.

I switched from AMD to NVIDIA on my NEC 2490WUXi and the difference in games was very noticeable.

Trust me, I am not able to show you this by posting pictures side by side. You need to be on this side of the screen to see it.

Most people still claim that they don't see any difference in speed between AMD CPUs and Intel CPUs; I call these people the blind sheep.
 
I switched from AMD to NVIDIA on my NEC 2490WUXi and the difference in games was very noticeable.

Trust me, I am not able to show you this by posting pictures side by side. You need to be on this side of the screen to see it.

Most people still claim that they don't see any difference in speed between AMD CPUs and Intel CPUs; I call these people the blind sheep.

I run most of my games over 60 FPS, so I am happy :).
 
My buddy was saying this the other day: NVIDIA looks way better than ATI. I don't see it. I think it's just the digital vibrance.
 
People, he's running the hacked HD5770 on a 5-year-old Mac. There's a reason the screen is blurry.
 
My buddy was saying this the other day: NVIDIA looks way better than ATI. I don't see it. I think it's just the digital vibrance.

No, it's not just the digital vibrance; AMD/ATI has had better-quality visuals since the HD 3000 series was released. There are other threads on here discussing it in detail.

It's not a huge difference, but there is one.
 
If running HDMI, there are pixel format settings in CCC; run full RGB mode and this should solve it, at least it does on Windows! :D
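
For context on why that setting matters: limited-range output squeezes 0-255 into 16-235, so blacks come out grey and whites come out dull if the card sends limited but the monitor expects full range. A quick back-of-the-envelope sketch of the mapping:

[code]
# Back-of-the-envelope sketch of full-range vs. limited-range RGB levels.
# Limited ("video") range maps full-range 0-255 onto 16-235, which is why a
# mismatch between GPU output and monitor expectation looks washed out.
def full_to_limited(value):
    """Map a full-range (0-255) level to limited range (16-235)."""
    return round(16 + value * 219 / 255)

for level in (0, 64, 128, 192, 255):
    print(f"full {level:3d} -> limited {full_to_limited(level):3d}")
# Pure black is sent as 16 and pure white as 235; on a monitor expecting
# full range, that shows up as raised blacks and reduced contrast.
[/code]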
 
As someone who has spent plenty of time switching between ATI/AMD and NVIDIA cards/computers on a daily basis, I think you guys are nuts for thinking there is any significant difference.
Also, is this thread going anywhere, since this guy has basically entered hackintosh territory and expects golden results?
 
If running HDMI, there are pixel format settings in CCC; run full RGB mode and this should solve it, at least it does on Windows! :D

Not on a Mac, and not on a hacked one at that. But yeah, Windows FTW. :cool:
 
As someone who has spent plenty of time switching between ATI/AMD and NVIDIA cards/computers on a daily basis, I think you guys are nuts for thinking there is any significant difference.
Also, is this thread going anywhere, since this guy has basically entered hackintosh territory and expects golden results?

There is a difference, especially if you look back to the GeForce 6 series and the Radeon X100 series, when NVIDIA was clearly better. Now, however, AMD is in the lead in visual quality.

On Linux and OS X especially, running OpenGL, AMD graphics are FAR better than NVIDIA.

With Windows, it's kind of a crapshoot depending on what DX updates you have and which drivers you are running. AMD is usually better, but you are right, there is little difference.

We need to get back on topic though. PM me if you want to further discuss it. ;)

OP, we need some input on what you are going to do, otherwise /thread.
 
This has nothing to do with my Mac. I have a 5770 in my PC also; that is the one I am talking about.
 
I can't take this thread seriously.

a) you haven't given us any empirical data to back up your statement

b) you haven't even properly detailed in what game, program, or on what system this magical card of visual wonder is supposedly performing its miracles of dazzling visual effects

c)
homey_the_clown.jpg
 
I switched from AMD to NVIDIA on my NEC 2490WUXi and the difference in games was very noticeable.

Trust me, I am not able to show you this by posting pictures side by side. You need to be on this side of the screen to see it.

Most people still claim that they don't see any difference in speed between AMD CPUs and Intel CPUs; I call these people the blind sheep.
I am aware of this, and nothing I said contradicts what you just said :p I am just saying that if, for example, he had differing anisotropic filtering settings between one card and the other, he could definitely demonstrate that via screenshots. If his problem is simply how he perceives the output of one card versus the other, that can be hard to tell, and would be impossible to capture via a screenshot. You'd have to see them side by side.

I think you would be hard-pressed to tell the difference on most of the TN panel 1080p LCDs that vendors push these days though.
 
I can't take this thread seriously.

a) you haven't given us any empirical data to back up your statement

b) you haven't even properly detailed in what game, program, or on what system this magical card of visual wonder is supposedly performing its miracles of dazzling visual effects

c)
homey_the_clown.jpg

Considering he never brought up the Mac, I don't understand what the fuck your post is about. I know you're trying to be a fucking funny assclown, but you're not.

As for the OP, there shouldn't be a difference. The only thing I can come up with is either some kind of driver conflict, or your in-game settings got reset to defaults, which you say isn't the case. It could be some driver optimization ATI implemented, but I doubt it.
 
Considering he never brought up the Mac, I don't understand what the fuck your post is about. I know you're trying to be a fucking funny assclown, but you're not.

As for the OP, there shouldn't be a difference. The only thing I can come up with is either some kind of driver conflict, or your in-game settings got reset to defaults, which you say isn't the case. It could be some driver optimization ATI implemented, but I doubt it.

This is the point: he's running a hacked GPU on a 5-year-old Mac. Honestly, I'm amazed he got it to work at all. Now he wants us to troubleshoot this and make it work?

Yes, that picture was perfect for this thread. :D
 
I think the issue is that the initial post from the OP had jack for information in it.

I noticed the 5850 in his sig and did a post search that led to the link to a 5770 in his Mac.

At that point he still didn't clarify it was yet ANOTHER 5770 he is using in a different rig.

If folks can't be bothered to provide all the info required for folks to come up with a solution, then don't bother. Otherwise it starts to look like trolling.
 
I can't tell the difference between the 8400GS-M and a 5470 when playing media or using Windows, and most if not all games look similar.
 
It took one dipshit poster to just ASSUME he was running 'a hacked 5770 on a hacked Mac' just because he read a completely different thread by the OP. What? Someone can have a PC and a Mac? Someone can have TWO 5770s? No way! <gasp!>

But in all fairness, it would help if the OP didn't type like a one-handed 12-year-old using a cellphone. Hell, I'm still not exactly sure what's going on.

In the end this thread is full of fail - there is too little information to help the OP and I honestly doubt he has the language skills to give us any real info. But I'm an eternal optimist.... so....

OP: Close this thread. Try again after you've had some time to think about how you can give us the information we need to assist you. Telling us your PC's specs would be nice, what driver revisions you've tried, whether you've properly cleaned the ForceWare drivers off your system before installing the ATI/AMD card, HOW exactly the graphics look better/worse, whether you ate paint chips as a kid, etc... that would be at least a promising start.
 