Where is all this "untapped raw power" PS3?

It's a debate, not a fight, so it should be OK :)

I think this untapped power isn't even there. It's just a bunch of hype, like Sony always does, like with the PS2 and the Emotion Engine and that demo with the ducks in the water. They bloat it up saying it's super high tech when it's really just a good CPU for its time. Sony always exaggerates; I can't believe some people would believe such stuff. To me the PS3 is about as powerful as the 360. One has a better CPU and the other has a better GPU, so it levels out and they're practically the same in graphics if done right.

Anyone can optimize a game for the PS3 to push its graphics, and they can do the same for the 360. Any system can be pushed even further when a game is optimized for it. The only game I've seen push the 360 to its graphical height is Gears of War. We haven't really seen a game push the PS3 like that because it hasn't been done yet, so we can't say it's inferior just yet; some people need to put their guns down on that, lol. If we get halfway through the gen and the PS3 still hasn't really looked that good, then it would be fair to say the 360 had better graphics overall in its lifetime.

Right now, though, judging from all the multiplatform ports (it could just be lazy porting), the PS3 versions always seem to get the crappier textures and worse anti-aliasing. Most of all, I think the CPU in the PS3 is pretty good, since it runs Linux and such. They're two different CPUs and GPUs, so we can't really compare the games until we have two games that were each built and optimized for their own system. That's the only time we'd really be able to see the difference. Right now it could be anything from bad porting to lazy developers. We'll just have to see :)
 
Conker, a good test may be The Darkness. From what I have heard, they had two teams developing this game, one for the 360 and one for the PS3. I don't own a 360, so there is no way for me to compare, but there are many people here with both systems. Maybe they could rent both from GameFly or something and give a side-by-side comparison.
 
I hate to ask a stupid question, but what is the big deal about 30 fps vs 60 fps? Is one a requirement of HD content? I thought the human eye/brain could not tell the difference much past 30 fps anyway? Sorry if this has already been addressed in this thread.
 
No, it's not a bad question; I had heard the same thing myself. They have done further testing, and that figure was for the average TV viewer. Gamers have been found to be able to see a great deal higher frame rates than that, though. It seems that we have trained our eyes to pick up more detail and process information at a higher rate than average.
 
I think that's true, but it also depends on the person. Some people can see the rainbow effect on plasmas, others can't; some people get a sick feeling during games, and others don't. I can see and feel a difference in CS when I'm dropping below 60 fps; others may not...

For me, games tend to look and feel better (as in, the controls don't feel laggy or unresponsive) when running at 60 fps or higher. I think most will agree that fluctuation in fps is more noticeable than a frame rate locked at 30 or 60.
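To put some rough numbers on the 30 vs 60 fps debate above, here's a quick back-of-the-envelope sketch (the function name and the fps values are just mine, picking the common targets mentioned in this thread). It shows why a single dropped frame at 60 fps is so noticeable: the frame time suddenly doubles from about 17 ms to about 33 ms, which is exactly the fluctuation people say they can feel.

```python
# Frame time = how long each frame stays on screen.
# A hitch at 60 fps (one missed frame) doubles the frame time,
# which is why fluctuation stands out more than a steady 30 fps.
def frame_time_ms(fps):
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

So the difference between 30 and 60 fps isn't just "twice as smooth" visually; it's also roughly 17 ms less lag between pressing a button and seeing the result.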
 
There is so much information about this on the net... Google it.

The short version is that 24 fps is okay in movies because each individual frame is blurred a bit. Check out pause and frame-advance on a DVD player or VCR: each frame is a bit blurry. That blurriness makes everything blend together smoothly in your brain when played at 24 fps.

A computer at 30 fps looks jerky to most people because the individual frames are perfectly sharp, distinct, and clear.

People can typically actually see 45-80 Hz, depending on the person, their age, eyesight, etc.

Fluorescent lights run off 50-60 Hz mains power... some people see them flicker, others don't...

For a quick test of what refresh rates your eyes can see, hold a pencil between your thumb and fingers and shake it quickly in front of a CRT computer monitor at various refresh rates. At 60 Hz, most people will see the pencil jumping like crazy; at 100 Hz, almost nobody should see it jumping, and it should look like one smooth movement, just like if you were jiggling/shaking the pencil without the CRT behind it.

As you play with the refresh rates on the CRT, you can find out exactly what your eyes can see.
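The arithmetic behind the pencil test above can be sketched roughly like this (the function name and the 100 cm/s hand speed are just illustrative assumptions of mine): the CRT only lights up the scene once per refresh, so a moving pencil tip appears to jump by (speed / refresh rate) between flashes. Higher refresh rates make the jumps smaller until your eye fuses them into smooth motion.

```python
# Apparent jump of a moving object between CRT refreshes.
# Assumes the pencil tip moves at a constant speed and is
# effectively "strobed" once per refresh cycle.
def jump_cm(speed_cm_s, refresh_hz):
    """Distance (cm) the object moves between consecutive refreshes."""
    return speed_cm_s / refresh_hz

for hz in (60, 85, 100):
    print(f"{hz} Hz -> {jump_cm(100, hz):.2f} cm between flashes")
```

That's why the pencil looks like it's "jumping like crazy" at 60 Hz but smooth at 100 Hz: the individual jumps shrink below what your eye can separate.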
 
Some people are tech/benchmark junkies; they just like to be able to say their console has higher frame rates. Some people actually can tell the difference between 30 and 60 fps. I can only tell the difference when they're set side by side, so I guess I'm lucky.
 
I can see when things are running at 60 Hz... it really sucks when your entire office building has fluorescent lights that never turn off. Headaches or migraines almost every single day.

But hey, I can impress people when I walk by and say, "Why is your refresh rate set to 60 Hz?!" :) :rolleyes:
 