Console to PC Ports Can Be Great

Consoles or not, developers STILL have to cater to the lowest common denominator among PCs, because most people don't have enthusiast high-end hardware or anything close to it. It has always been that way. That doesn't just go for graphics, but for game environments and AI as well. It is just exacerbated now by the proliferation of consoles.
And they can do this by making the game scale.
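To make "scaling" concrete, here's a minimal Python sketch of the idea: detect the hardware, pick a preset. The preset names, fields, and VRAM thresholds are all invented for illustration, not taken from any real engine.

    # Hypothetical sketch: choosing a graphics preset from detected VRAM.
    # Names, fields, and thresholds are made up for illustration.
    PRESETS = {
        "low":    {"texture_size": 512,  "draw_distance": 500,  "shadows": False},
        "medium": {"texture_size": 1024, "draw_distance": 1000, "shadows": True},
        "high":   {"texture_size": 2048, "draw_distance": 2000, "shadows": True},
    }

    def pick_preset(vram_mb):
        """Map detected video memory to a default quality preset."""
        if vram_mb < 512:
            return PRESETS["low"]       # console-class or older hardware
        elif vram_mb < 1024:
            return PRESETS["medium"]
        return PRESETS["high"]          # enthusiast hardware

    print(pick_preset(256))             # a 360-class GPU lands on the low preset

The same game ships to everyone; only the preset changes, so the low end is served without capping the high end.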

Besides that, your average gaming PC is at least on par with current consoles.
 
I'm ready to see all the tin foil hat wearers out in this thread.

Good game is good game
Bad game is bad game

Whether it was ported or not. Quit scapegoating.

Agreed. I never understood all the whining about "Press Start to Continue" and minuscule things like that.

I like a game for its gameplay, not the politics of selling out to the consoles. If you were in the business of making video games for a profit, you'd probably overlook the PC world too. I have a $300 Xbox 360 that satisfies my gaming needs just as much as my $2,000 gaming rig. Are the graphics super bleeding edge on the Xbox? Nope, but I still have plenty of fun.
 
Fable 3 on the Xbox 360 is unimpressive. They have to improve it if they hope to sell a PC version.
 
I think many of the points made are valid.
It would be nice if people wouldn't be so hyperbolic in their reaction to others' statements.

I think we can all agree that great gameplay with crappy graphics and crappy gameplay with great graphics are both to be avoided.

Staying away from the extremes of both examples, however, let's look at two scenarios:

Great gameplay and decent-but-dated graphics
Great graphics, but non-intuitive controls

Again, within the context of avoiding extremes on both sides, the 1st scenario would probably be preferable to most people.


There is also the additional factor of the type of game being played.

In a game like WoW, where the game is less linear/scripted and you can roam to your heart's content, the eye candy has a fair amount of impact.

In an FPS like Battlefield: Bad Company 2, the amazing amount of detail that is possible has less impact on the actual enjoyment of the game, since the game moves at a faster pace and the action itself is more immersive.
 
I genuinely do. More advanced engines allow developers to do more complicated things and, in turn, fulfill their vision as best they can. But a DX11 render path on a completed game won't make it any better. It'll be the same thing, with a couple more shaders and faster execution. Whoopee, the game is the same.

I respectfully disagree. I think graphics can make an actual impact on games, and, to a lesser extent, so can frame rates. The three most obvious examples that come to mind are scary games, realism-based shooters, and racing games.

The first is obvious. The spooky atmosphere, created partly by graphics and partly by sound and music in games like the Resident Evil, Silent Hill, and Phantasmagoria series, helps draw the gamer in. If you suddenly saw something extremely fake-looking in an otherwise detailed game, it could break the immersion pretty quickly. The same goes for cheesy sounds or music. In those games, though, the better they start to look, the scarier some of them become.

The second is pretty obvious too. If you are trying to accurately portray a real-world battle, compare cartoony cel shading that makes it look like you're watching an episode of Looney Tunes with realistic, fairly well-detailed, well-lit models: one is obviously going to make you feel like you were actually there, and the other not so much.

Half-Life 2, when it was BRAND NEW many years ago, is an example of a game whose graphics really created a "world" where you felt like you were in a real city and were really fighting to save the... world? Your friends? Barney and Alyx? Half-Life 2 also had great graphics for the time and was a pretty decent step up from, say, CoD 2.

Racing games have always been about realism and trying to make you feel like you're the race car driver. I don't see many people still playing that 1980s F1 Grand Prix game, or whatever it was called, with the cartoony 8-bit graphics.

People also get spoiled by good graphics, to boot. I can't tell you the number of times I've seen friends go "Oh man, Deus Ex 1!? That game was the bomb. I'm gonna go replay it before Deus Ex: Human Revolution comes out," and the next day, when I ask them how Deus Ex was, they played it for 15 minutes and had some fun, but the graphics just looked too dated. They remembered the graphics through rose-tinted glasses, i.e., how great they were for the time.
 
Problem is, even if this is the greatest port ever, it's still a very boring game that isn't worth buying to begin with.
 
Most people probably don't have a 360-level GPU. That's a 3850-4850-class GPU in raw terms, and most people do not have that. Why would they rewrite the entire game to support larger map sizes, shorter loading times, etc. for the less than 10% of the PC population that could run it? (There are software limitations beyond the hardware, too, like the 2 GB memory cap on 32-bit apps, and even less than that can actually be utilized.) Probably less than 5%, really.
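To put rough numbers on that 32-bit cap, here's a back-of-the-envelope sketch in Python. The overhead figure is an assumption (it varies with the executable, DLLs, and fragmentation), and the 2 GB user/kernel split is the default Windows configuration.

    # Rough arithmetic behind the 32-bit memory cap (assumptions noted inline).
    address_space = 2**32               # 4 GiB total addressable by a 32-bit pointer
    user_space    = address_space // 2  # default Windows split: 2 GiB for the app
    overhead      = 512 * 2**20         # assumed: exe, DLLs, fragmentation (varies)

    usable = user_space - overhead
    print(f"{usable / 2**30:.2f} GiB usable")   # ~1.50 GiB in this rough sketch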
 
Most people probably don't have a 360-level GPU. That's a 3850-4850-class GPU in raw terms.

...No. Just no. No, no, no. An Xbox 360 GPU is an ATI Radeon X1900-level GPU.

So if you take that 3850-4850 you estimated and think about the generation before that, you're at the 2850-3850. Go another generation back, then another, then another, and you're at the X1900.

You've overestimated the Xbox 360's GPU by five generations. Most Xbox 360 games run at 640p (below 720p) and are then upscaled to "output" 1080p via the same process a DVD player uses.

A PS3 only has a 7800 GT-class GPU in it. Think of a modern video card, the GTX 580, and start counting backwards, GTX 5xx -> 4xx -> 2xx -> 1xx -> 9xxx -> 8xxx -> 7xxx, keeping in mind that each step is about a year. That's how far back you need to go to reach when the PS3's GPU was "new" and fairly "cutting edge". If a PC gamer hasn't updated his video card in over six years, then yes, a PS3 would be more powerful; most PC gamers are not gaming on six-year-old machines, though.

Very few gamers on this board, or any other board, proudly tout the stats of their Pentium 4 with a 7800 GT and 512 MB of RAM. In fact, that percentage is probably under 5% of gamers.

If you look at the Steam hardware survey, from a hardware-alone perspective (ignoring OSes for the moment), 78% of all gamers have DX10-capable hardware, meaning an NVIDIA 8xxx-series card or a DX10-capable ATI Radeon. So in short, 78% of all gamers have a graphics card more powerful than either the PS3's or the Xbox 360's, with some single cards as much as 7-8x more powerful. Let's not even get started on what's possible with SLI/CrossFire.
 
...No. Just no. No, no, no. An Xbox 360 GPU is an ATI Radeon X1900-level GPU.

So if you take that 3850-4850 you estimated and think about the generation before that, you're at the 2850-3850. Go another generation back, then another, then another, and you're at the X1900. You've overestimated the Xbox 360's GPU by five generations.
That's 2 generations ;)

x1XXX
HD 2XXX
HD 3XXX

Go play a ported 360 game on an X1900 and see how it runs. Consoles perform much better at their task than equivalent PC hardware. Mass Effect 1 ran OK to good on my computer when I had a 3850.

The card model plays a huge role. I got my parents a 4000-series card for Christmas so my mom can have dual monitors like she does at work, but it's a low-end card and only scores a bit higher in benchmarks than my old 9700 Pro did in 2004. There is also a sample-selection issue with the Steam hardware survey: most people on Steam aren't Joe Blow. I know quite a few people who try to play games on store-bought OEM computers with integrated graphics.
 
I'm really happy that Lionhead is at least trying to cater to those of us gamers in the PC crowd. Unfortunately, Fable 3 is completely mediocre in every way.

My prediction: Fable 3 sells like crap; the industry says, "We tried our best to cater to what the PC gamers wanted and they still didn't buy our game. Therefore there's no reason to put forth the effort to port a game correctly".
 
Exactly. These games were all pretty clearly console ports, but Batman: Arkham Asylum was simply fantastic on the PC. Whether you used a mouse and keyboard or a gamepad, the game just worked and was executed perfectly on both platforms.

That's almost completely true, with the exception of those pull-back fight stages. I really had to fight the keyboard in a few of those, and it was painfully obvious that they were developed with a controller in mind. It was annoying enough that I pulled out my controller for each of them, then switched back to KB/M for the rest.
 
Go play a ported 360 game on an X1900 and see how it runs. Consoles perform much better at their task than equivalent PC hardware. Mass Effect 1 ran OK to good on my computer when I had a 3850.

No, it would run about the same. The difference is you'd need to lower your resolution all the way down to less than 1024x768 to really get the "console" experience. It doesn't take a powerful GPU at all to run a console port at 800x600 on low detail settings (the equivalent of a console).
 
Who has a monitor that runs at that resolution? Even 15" monitors have been able to do 1024x768 for more than 10 years.

Run it in windowed mode at 800x600; that's about the resolution consoles render at, 640p. Of course, being in windowed mode is more taxing, so you could lower the resolution a bit more. You'd find that even an X1900 XT can keep up at ridiculously low resolutions like 800x600.

That's why every "modern" game in the last 10 years that's a console port, and/or is also on console, has its minimum GPU set at roughly a 6800 GT/X800: if you turn the settings down really low, you can play it on that old hardware. Xbox 360s do the same thing: really low-res, blurry textures rendered at 640p and then upscaled, the same way a DVD player works.
 
Who has a monitor that runs at that resolution? Even 15" monitors have been able to do 1024x768 for more than 10 years.

That's exactly why you can't compare a console with a PC directly. The console is running at a MUCH lower resolution.
 
Then they talk about the graphics and all the nicer resolution

...

...and the video shows a lack of clean textures, assloads of aliasing, and bad animation. Yep.
 
No, it would run about the same. The difference is you'd need to lower your resolution all the way down to less than 1024x768 to really get the "console" experience. It doesn't take a powerful GPU at all to run a console port at 800x600 on low detail settings (the equivalent of a console).
Now we're downplaying consoles? I guess that's another plus for consoles, since they can run the same games we can, with the same eye candy, for a lot less than your computer costs. Sounds pretty good.

Mass Effect 2 looks pretty much the same on the 360 as it does on a PC. But even though my 4870 is many times more powerful than the 360, I still notice occasional frame-rate dips into the low 30s and high 20s at 1080p.

I'm not comparing directly; I'm saying that in order to have an aesthetically equivalent experience, you should have at least a 3850-class GPU. Otherwise you'll be running on your PC at 800x600 at the lowest settings, which really won't look the same as the 360. But Mass Effect would still be fun even with laughable texturing. It's a story-driven, good game, and that will all still be there.

I think the [H], in general, is more concerned with words than with what is actually there. "Press any key to continue" has been on PCs since DOS. "Do not turn off the game while saving" applies to consoles AND PCs; did you think it's OK to pull the plug on your PC while it's in the middle of writing your save? It will probably be borked either way. Getting bent out of shape over this frivolous garbage is absurd. These things are completely irrelevant to the game. If the game is bad, that's the developer's fault for making a terrible game, not the PS3's or the 360's.

Console games can be just as good as PC games. I think most of the people on this forum are hysterical about this stuff these days.

For the record, I played on terrible computers for years and chafed under terrible frame rates at the lowest settings; 10-25 fps was the norm, and getting into the 20s meant a game was doing pretty well. But that never stopped me from having fun playing the games. The only thing that WOULD prevent the fun was frame rates that were downright unplayable (single digits). In 2003 my computer (with a P4) had trouble playing Quake 3 (1999) at decent settings; the only games that ran well were pre-2000.

Consoles introduced me to good games with Metal Gear Solid.

Good game is good.
Bad game is bad.

You're still scapegoating.
 
1. Some people just enjoy consoles more, and that's OK. Having fun is still having fun.

2. You're on crack.

It's fun to present "facts" that are likely incorrect and factually irrelevant, and that make your end of the argument look uninformed and like it doesn't matter, eh?
 
Good game is good.
Bad game is bad.

You're still scapegoating.

The problem with "consolization" is that it can make otherwise good games annoyingly frustrating to play on a PC, when they would provide a significantly better experience if the developer had bothered to spend even a minuscule budget on adapting them appropriately. I'm the first to admit that I don't know my bum from a hole in the ground when it comes to what's involved in coding a big title, but sometimes it appears to me to be a minor thing to alter.

I'm a business owner several times over, and I can tell you that even if it eats into my margin a bit, I absolutely will not release something that I think cheapens my brand and hurts the relationship I have with my customers. It's simply a bad decision in the long term. Many game devs seem content to release very poorly done ports, and they've lost my dollars as a result.
 
People are forgetting that the code for a console game is optimized for consoles. When you take a console port and put it on a PC, it's not going to run as well.

When you take a PC game and port it to a console, it tends to look pretty horrible or dated for the same reason. Just look at the mess that is Risen on the 360: it's a gorgeous game on the PC, so why does it look so completely terrible on a console? The answer is that it's a PC game.

Furthermore, you'll notice that very few console games have the wide-open fields of play that PC gaming has enjoyed for a long time.

Case in point: Crysis 1 (PC only) versus Crysis 2 (a console-to-PC port and a CoD-style corridor shooter).

Console gaming will always have its niche (especially because of exclusives), but you must be kidding yourself if you think consoles can keep pace with what a PC can pull off, especially given the upgrade cycle inherent in the platforms.

Also, what's this talk about Mass Effect being a decent-looking game on both platforms? It's a corridor RPG; you can't compare it to what a PC could have done with it. The reason console ports are terrible is that they're made for consoles, not a PC.
 
People are forgetting that the code for a console game is optimized for consoles.

Exactly. When you're coding for a console you're coding for one specific hardware set that never changes. You can optimise the shit out of it. How many different PC configurations are there?

As much as I hate to bring Apple and Microsoft into it, it's the exact same thing. Apple codes for an extremely small hardware subset, whereas Microsoft codes for everything.
 
Some can. Batman, BioShock, and Dead Space are all pretty great. They would be even better with DX11, though.

Hold on there... Dead Space had one of the most god-awful PC control schemes I've ever encountered. I heard wonderful things about it, but actually attempting to play it on PC was so frustrating that I didn't even complete the first level before uninstalling and vowing never to touch it again. It may indeed be a gem of a game, but like a diamond at the bottom of a pile of manure, it's unlikely anyone is going to dig through that much shit to find out.
 
Exactly. When you're coding for a console you're coding for one specific hardware set that never changes. You can optimise the shit out of it. How many different PC configurations are there?

Diversity sucks (if it didn't, the government wouldn't have to keep telling you otherwise). But it's not that bad: Windows provides hardware abstraction, which mostly eliminates the issue of hardware diversity at a modest performance penalty. Most of the so-called diversity is a matter of scale anyway: 2 GB vs. 8 GB of RAM, 100 vs. 200 shaders.

The trouble with console ports is that these games are designed around very anemic hardware, so they neither use the greater resources of the PC efficiently nor have features that require better hardware.
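A toy Python sketch of the abstraction point being argued here; the class names are invented for the example, and in reality the abstraction lives in the driver/Direct3D layer rather than in game code:

    # Toy sketch of hardware abstraction (invented names, not a real API).
    class GPU:
        def draw(self, mesh):
            raise NotImplementedError

    class FixedConsoleGPU(GPU):
        def draw(self, mesh):
            # One known chip: a hand-tuned path, no capability checks needed.
            print(f"drawing {mesh} on the one and only console GPU")

    class AbstractedPCGPU(GPU):
        def __init__(self, shader_units):
            self.shader_units = shader_units   # "diversity of scale": 100 vs 200 units

        def draw(self, mesh):
            # Generic path behind the abstraction, a modest overhead per call.
            print(f"drawing {mesh} with {self.shader_units} shader units")

    for gpu in (FixedConsoleGPU(), AbstractedPCGPU(200)):
        gpu.draw("crate.mesh")

The game only ever calls draw(); whether the hardware underneath is one fixed chip or a hundred different cards is hidden behind the interface, at the cost of losing the hand-tuned path.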
 
Batman AA = Yes
Bioshock = It was OK
Dead Space = WTF are you smoking?

Dead Space never got a single patch.

Eventually people got used to the controls (or they gave up), and it was OK, save for the save-game difficulty bug.
As for Dead Space 2, EA said it has no plans to fix the dithering...

Capcom makes good multiplatform games.
 
Hold on there... Dead Space had one of the most god-awful PC control schemes I've ever encountered. I heard wonderful things about it, but actually attempting to play it on PC was so frustrating that I didn't even complete the first level before uninstalling and vowing never to touch it again.

I felt that way until I turned V-Sync off. Beforehand I was like, "Is this game supposed to control this badly?? Maybe it's like this so it'd be more difficult because.. you know.. it's a horror game and all, and it's supposed to be.. hard. Yeah." After turning V-Sync off mid-way through the game, however: "Holy crap, I can aim now! BOOM! HEADSHOT! I see your glowy bits and I RAISE YOU A PLASMA ROUND, BITCH! BOOM! BWAHAHAHA--"
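For what it's worth, there's plausible arithmetic behind the V-Sync fix: with V-Sync on, input-to-screen latency grows with the number of frames queued ahead of display. A rough Python sketch, where the frame-queue depths are assumptions for illustration:

    # Rough input-latency arithmetic for V-Sync (queue depths are assumed).
    def latency_ms(fps, queued_frames):
        """Approximate input-to-screen delay from frames waiting in the queue."""
        return queued_frames * 1000.0 / fps

    print(latency_ms(60, 1))   # ~16.7 ms: V-Sync off, roughly one frame in flight
    print(latency_ms(30, 3))   # ~100 ms: V-Sync on at 30 fps with a 3-frame queue

A tenth of a second between moving the mouse and seeing the reticle move would explain exactly the "is it supposed to control this badly?" feeling.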
 
Man, a lot of PC games coming out have no AA. WTF.

Any PC game that comes out without AA can have fullscreen AA forced at the driver level via the Catalyst Control Center, NVIDIA's control panel, or nHancer. It's not as efficient as the game having AA built in, but it works. Granted, you know a company is getting really, really cheap when they don't even include AA.
 
Now we're downplaying consoles? I guess that's another plus for consoles, since they can run the same games we can, with the same eye candy, for a lot less than your computer costs. Sounds pretty good.
You're unfortunately missing the point. We are not "downplaying" consoles; we are accurately portraying them. Consoles do not have "your eye candy", as you put it. They do not have high-detail textures, or textures anywhere near a "high" or "medium" setting. They have very low texture detail.

When we tell you to play your game at 800x600 in windowed mode (not 1024x768, since windowed mode is generally more taxing because of whatever might be rendering in the background) with all details on low, we do it because that's what a console renders at.

Mass Effect 2 looks pretty much the same on the 360 as it does on a PC. But even though my 4870 is many times more powerful than the 360, I still notice occasional frame-rate dips into the low 30s and high 20s at 1080p.
Yes, but you were playing Mass Effect 2 on your PC at a resolution of 1920x1080, or in other words just over 2 million pixels per frame. Compare this to an Xbox 360, which renders about 0.6 million pixels per frame. So literally, your PC is rendering roughly three times the pixels.
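The math is easy to check; here it is in Python, using an illustrative sub-HD framebuffer size for the 360 (many 360 titles rendered below 720p, though the exact resolution varied by game):

    # Pixels per frame at each resolution (the 360 framebuffer size is illustrative).
    pc_pixels   = 1920 * 1080   # 2,073,600 pixels, just over 2 million
    x360_pixels = 1024 * 600    # a common sub-HD framebuffer, about 0.6 million

    print(round(pc_pixels / x360_pixels, 1))   # ~3.4: roughly three times the pixels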

The same settings on an Xbox 360 would bring it to its knees, with those "30s to high 20s" becoming 10s to 8s.

Keep in mind, performance would probably be hindered further, since you were likely playing on medium or high detail settings rather than with every option at its lowest across the board, as you would be on a 360.

Finally, your gameplay experience was likely better due to loading times being significantly faster on the PC.

I'm not comparing directly; I'm saying that in order to have an aesthetically equivalent experience, you should have at least a 3850-class GPU. Otherwise you'll be running on your PC at 800x600 at the lowest settings, which really won't look the same as the 360. But Mass Effect would still be fun even with laughable texturing. It's a story-driven, good game, and that will all still be there.
The equivalent aesthetic experience would be 800x600 in windowed mode with all the detail sliders set all the way down; that's the closest aesthetic equivalent you can create. What you're doing when you render the game at 1080p with medium to high detail settings is experiencing the significantly better aesthetics PC gamers have become accustomed to.

Most console gamers don't know they're aesthetically experiencing "crap", because they've never experienced better. It's the same as going back and playing an early shooter like Deus Ex: you'd think the graphics are kind of ugly and awful, but only because you've seen games like Mass Effect 2. Back in the day, Deus Ex was a beautiful game. Now it's ugly by comparison.

Consoles are ugly and are really holding back gaming by keeping games designed to fit console hardware. This even begins to affect gameplay, with small map sizes and "small box"/closed-world game design. The Xbox 360, with a 256 MB GPU and 256 MB of system RAM, really can't render very large open worlds all at once, so everything needs to be streamlined to load in small pieces, as needed.
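A minimal Python sketch of the "small pieces, as needed" streaming this implies; the chunk size and load radius are invented numbers for illustration:

    # Toy world streaming: keep only the chunks near the player in memory.
    CHUNK = 64      # world units per chunk (invented)
    RADIUS = 2      # chunks kept loaded around the player (invented)

    def needed_chunks(px, py):
        """Return the set of chunk coordinates within RADIUS of the player."""
        cx, cy = int(px // CHUNK), int(py // CHUNK)
        return {(x, y)
                for x in range(cx - RADIUS, cx + RADIUS + 1)
                for y in range(cy - RADIUS, cy + RADIUS + 1)}

    loaded = set()

    def update(px, py):
        """Load chunks entering the radius; evict the ones leaving it."""
        global loaded
        want = needed_chunks(px, py)
        for chunk in want - loaded:
            pass  # stream this chunk in from disc here
        loaded = want

    update(100, 100)
    print(len(loaded))   # 25 chunks resident; everything else stays on disc

Everything outside the radius simply doesn't exist in memory, which is exactly why worlds built for this budget end up full of walls and corridors that hide the loading.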

That's why in modern gaming you'll often run into impassable walls for seemingly silly reasons, and games sometimes feel like they're almost on rails, with only one or two paths to pick (see: Final Fantasy XIII). If you started basing PC games' minimum system requirements on 512 MB for the GPU and 1-2 GB of system RAM, you'd still be within 85% of all PC gamers who've bought a gaming PC in the last 5 years. Even the $67 ATI Radeon 66xx-series card fits these requirements.

People who haven't bought a graphics card in the last 5 years, or who are unwilling to spend $67.00 on a new graphics card that will run on virtually any PSU, honestly probably aren't buying a lot of games, because they clearly don't spend a lot of money on their hobby. A reasonable publisher or developer therefore shouldn't take those people into consideration when setting minimum system requirements.

Also, please consider the forum you are talking on. This is a computer-enthusiast forum. If we attached a poll to page 1 of this thread asking whether using an Intel integrated graphics solution for gaming makes you a noob, with the options "Yes, a bit of a noob" and "Not necessarily a noob", you'd probably see an overwhelming majority say that gaming on integrated graphics makes you a bit of a noob.

Why do I bring this up? You mentioned your friends who try to game on Intel integrated graphics. That's fantastic. Tell your friends to spend $67.00 on one of the low-tier Radeon cards, the 66xx series. Voila: they'll be able to play most modern games at 1080p at 33-45 fps. That's what those cards are designed for. Even Metro 2033 runs at 33 fps on that $67.00 card.

If someone is not willing to spend a few bucks on a video card that costs about the price of one game, I don't really see that consumer going out and spending a whole lot on software. You know, software their PC can't run.

You're still scapegoating.
Pot, meet kettle.
 