Gears of War PC Performance and IQ @ [H]

This continuing lack of AA support in this supposedly 'next gen' 3D engine is laughable. :eek: :rolleyes: Are Sweeney and company not capable of coding it like Crytek's programmers?
Seems like someone needs to re-read both the article and the thread, as well as about a half-dozen threads at Beyond3D before forming an opinion.
 
Ah ok so I wasn't crazy :D

Did you guys (or anyone) have any luck forcing AA through NV control panel in XP or Vista?

Sad to see DX10 not living up to the hype at all so far...

Negative, forcing AA from the control panel did not work in any mode.
 
Oh no I do agree with you, I just thought I read somewhere in the article that they pointed out that DX10 offered AA which was a nice improvement, not that DX10 itself was. :)

Yes but there is no reason why they couldn't write the engine to include AA in DX9, especially as DX9 is offering better performance across the board.

I smell another rat with the lack of AA in DX9; maybe there is an "ini" file somewhere we can change, lol.

Time for us to send a message to these companies to stop nerfing DX9 to make DX10 look like something worthwhile.
 
Yes but there is no reason why they couldn't write the engine to include AA in DX9, especially as DX9 is offering better performance across the board.
I suggest you read Brent's article again (and the thread, too).
Also, this engine uses deferred shading to speed up dynamic lighting and shadows, and as such does not support antialiasing in DirectX 9.
We know this is because of the inherent nature of deferred shading in DX9.
Couldn't be any more clear if you ask me. Yes, they could have built an entirely separate renderer that would support multisampling in DX9, but said renderer would not leverage deferred shading for lighting and shadowing, and would, as a result, perform much more slowly. If that were the case, we'd see people complaining about how slow the DX9 path is as opposed to the DX10 path. How many renderers do you think Epic should have written?
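For the curious, here is a minimal C++/Direct3D 9 sketch (my own illustration, not Epic's code; the function name is hypothetical) of the limitation being described. A deferred renderer has to sample its G-buffer in the lighting pass, but D3D9 has no multisampled textures at all, and the only way to read a multisampled surface is to resolve it, which averages away exactly the per-sample edge data that MSAA exists to provide:

#include <d3d9.h>

// Assumes a valid device; error handling omitted for brevity.
void ShowD3D9DeferredMsaaGap(IDirect3DDevice9* dev, UINT w, UINT h)
{
    // A render target you can later bind as a texture. CreateTexture has
    // no multisample parameter, so it is always 1 sample per pixel.
    IDirect3DTexture9* gbufNormals = NULL;
    dev->CreateTexture(w, h, 1, D3DUSAGE_RENDERTARGET,
                       D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT,
                       &gbufNormals, NULL);

    // A 4x multisampled surface CAN be created...
    IDirect3DSurface9* msaaRT = NULL;
    dev->CreateRenderTarget(w, h, D3DFMT_A16B16G16R16F,
                            D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                            &msaaRT, NULL);

    // ...but it can never be bound as a texture. The only way to read it
    // back is StretchRect, which resolves (averages) the samples; averaged
    // normals and depths at polygon edges are garbage to the lighting
    // pass, so MSAA's per-sample data is destroyed before it can be used.
    IDirect3DSurface9* level0 = NULL;
    gbufNormals->GetSurfaceLevel(0, &level0);
    dev->StretchRect(msaaRT, NULL, level0, NULL, D3DTEXF_NONE);

    level0->Release();
    msaaRT->Release();
    gbufNormals->Release();
}

D3D10 lifts this restriction by letting shaders read individual samples from a multisampled texture, which is why the DX10 path can offer MSAA while the DX9 path can't.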
 
I suggest you read Brent's article again (and the thread, too).
How many renderers do you think Epic should have written?

2..

Basically the excuses sound like fluff and BS to me, considering we have thousands of DX9 games with AA.

Mind you, some people believe fluff, like you for example :p
 
How many renderers do you think Epic should have written?

Judging from everything I have read in the last few months: ONE

A good DX9 engine is all you need. I would like to see a company with the stones to go DX9 and say the obvious: there is no return in going DX10 at this time.

But hey, gotta get that checkbox on the product box. If they are going to invest the effort in doing DX10 just for the checkbox, they can at least do DX9 right, and that means AA.
 
Judging from everything I have read in the last few months: ONE

A good DX9 engine is all you need. I would like to see a company with the stones to go DX9 and say the obvious: there is no return in going DX10 at this time.

But hey, gotta get that checkbox on the product box. If they are going to invest the effort in doing DX10 just for the checkbox, they can at least do DX9 right, and that means AA.
LOL, I like your idea.

Unfortunately, I think many more people will be buying games like these because they "support" dx10, compared to the ones who won't be buying because it doesn't support AA.

And probably some people will buy them because of the DX10 marketing checkbox, thinking they will upgrade and get the full benefits later.

Look at Crysis: tons of people will be buying it because it's the first "true" DX10 game... despite the fact that you can edit config files and enable nearly every DX10 feature under DX9. It makes you wonder why the devs disabled the very high detail setting in DX9.

It's kinda like SATA 3Gb/s (aka SATA 2). You know how many people were insisting the only hard drives they would consider would be SATA 2, instead of SATA 1, simply because of that marketing checkbox? It's a more valid point now, what with HDD transfer rates continually increasing, but when SATA 2 was announced, HDD transfers hadn't even hit a sustained 90 MB/s, and the drive that came closest was the 150GB Raptor. There were tons of people who didn't buy that drive because it didn't have the SATA 2 marketing checkbox, despite the fact that it didn't matter at all then in a hard drive. At the time, it was the fastest non-SCSI drive available. (And yes, I know about burst transfers.)
Now, a motherboard is another matter: insist that it have SATA 2 for the available bandwidth and to future-proof it.
Nowadays it's much more valid to insist on a SATA 2 drive, but at the time it was silly to pass up the fastest non-SCSI drive on the planet just to have that marketing checkbox. And yet tons of people did.
 
LOL, I like your idea.



Look at Crysis: tons of people will be buying it because it's the first "true" DX10 game... despite the fact that you can edit config files and enable nearly every DX10 feature under DX9. It makes you wonder why the devs disabled the very high detail setting in DX9.


Not sure why people think Crysis is the first true DX10 game. Crysis was mostly developed under DX9, long before DX10 was even released. DX10 is nothing more than an afterthought strapped on at the end of development (pretty much proven by the ini tweak).

We will see the true benefits of DX10 when we see games developed only against the DX10 API, and as most games take about 3+ years to develop, it seems likely that we won't have a true DX10 game for at least another year and a half.

I just wish we could cut the DX9 restriction crap, mainly talking about Crysis and the joke that is Halo 2 PC.
 
Basically the excuses sound like fluff and BS to me, considering we have thousands of DX9 games with AA.
Seems like that'd be true, but it's quite simply not. If you have a real reason to suggest otherwise, then I'm all ears (the fact that other games support AA is not an actual reason).

Mind you, some people believe fluff, like you for example
Hey, don't take my word for it. Do some research, and you'll inevitably come to the same conclusion I have (or ask Brent, if you want). Or, better yet, show me a renderer that renders the same way UE3.0 does, but supports MSAA in DX9.

A good DX9 engine is all you need. I would like to see a company with the stones to go DX9 and say the obvious: there is no return in going DX10 at this time.
I'd like to see that too, but you can't do the impossible. Based on everything I know, you can't do it in DX9 with the rendering techniques Epic's employing in UE3, not without some sort of driver hack.
 
I'd like to see that too, but you can't do the impossible. Based on everything I know, you can't do it in DX9 with the rendering techniques Epic's employing in UE3, not without some sort of driver hack.

So don't render in a way that makes AA impossible. As already noted, we have thousands of DX9 games with AA. Do the best you can with AA.

What makes Epic's rendering so great that it is worth giving up AA for?

I won't buy a game that doesn't support AA; I don't care what rendering tricks it uses. Nothing is more jarring than jaggies all over the place.
 
Kinda ironic. The games I thought were MUST BUYS (Crysis, GoW, UT3) actually turned out to be big letdowns, both technically and creatively. Games I didn't even know/care were coming out, TF2 and CoD4, turned out to be fantastic and must-buys.
 
I'm getting ~10fps on a 7950GX2 on default settings, and ~20fps on low.
Bioshock runs at around 50fps...

Anyone else having problems with a GX2?
 
I was surprised to see that GOW runs really well on my 8800GTS 320meg. I didn't see any hitching/glitching in 1680x1050 with textures on "highest". I did set shadows to "medium" though. Everything else on high. According to Rivatuner, it was hovering around 300meg of texture memory.

Edit: That's on XP with 169.04. Although I'm not disputing the article, unless a person is determined to run a higher resolution, 320meg seems to be enough (barely).
 
limited x64 drivers - check
limited dx10 optimizations - check
aero desktop being bloated rather than the originally promised less-than-5% CPU - check

go Microsoft!
 
Brent, in coming reviews like this could you add testing for proper widescreen as well?
You touched on the lack of AA, but at this point cropped widescreen is just as bad.
 
Brent, in coming reviews like this could you add testing for proper widescreen as well?
You touched on the lack of AA, but at this point cropped widescreen is just as bad.

On the same note, can you please add testing for "proper" 4:3 as well :p. Cropping the widescreen view down for the 4:3 view is just as much cause for 4:3 gamers to complain as the inverse, but we don't think we have a "right" to a bigger view area as the WSers seem to.
 
On the same note, can you please add testing for "proper" 4:3 as well :p. Cropping the widescreen view down for the 4:3 view is just as much cause for 4:3 gamers to complain as the inverse, but we don't think we have a "right" to a bigger view area as the WSers seem to.

Agreed, only ~15% game on WS, but they sure as hell make some noise...
 
Agreed, only ~15% game on WS, but they sure as hell make some noise...

I used to think they could offer two FOV choices to keep everyone happy: say, a default of 75 degrees and a 90-degree option.

But then I realized that half the widescreen squawkers wouldn't be happy unless you locked out 4:3 users from the alternate choice. They figured they bought a WS monitor so they deserve something special. :rolleyes: It isn't really about having a proper view for their monitors; it is just more e-peen crap.
 
Negative, forcing AA from the control panel did not work in any mode.

But...... it does work under DX9 in Vista (and in XP too ;)):

No AA

http://i4.photobucket.com/albums/y117/jonelo/capturas/wargame-g4wlive2007-11-0810-23-17-0.jpg

4x AA with nHancer and the antialiasing compatibility profile for Oblivion

http://i4.photobucket.com/albums/y117/jonelo/capturas/wargame-g4wlive2007-11-0810-22-41-0.jpg

32xS combined (2x2 supersampling with 8x multisampling), with nHancer and the antialiasing compatibility profile for Oblivion

http://i4.photobucket.com/albums/y117/jonelo/capturas/wargame-g4wlive2007-11-0810-22-04-8.jpg

The methods are the same as for other Unreal Engine 3 games, such as Medal of Honor: Airborne or Unreal Tournament 3: use the antialiasing compatibility profile for Oblivion, or rename the exe to Oblivion.exe. AA works in many Unreal Engine 3 games (all of them except Stranglehold, I believe), so the problem is not that the deferred rendering makes it impossible.
 
After taking 30 minutes to install this 12GB beast, I tested for myself to see how a lower res with AA looked versus a high res with no AA. As I figured, it runs and looks better at a lower res with AA using DX10. :D I run at 60fps (Vsync on) all the time. Oh, and the 2232BW scales different resolutions better than any monitor I have seen. ;) Looks just as good as my CRT did (1024x768) at 1280x768 in 16:10 ratio. Though I'd prefer 1280x800, these 169.01s do not offer that resolution. :(
 
But...... it does work under DX9 in Vista (and in XP too ;)):

[snipped: screenshots and method from the post above]


Yeah, works for me too. Although to be honest, this game really doesn't need AA; at least at 1680x1050 I don't think it does.
 
But...... it does work under DX9 in Vista (and in XP too ;)):

[snipped: screenshots and method from the post above]

You sure this works in Vista? I set the appropriate in-game option to OFF (DX9), then I renamed startup.exe to Oblivion.exe, then I loaded up nHancer and adjusted the Oblivion.exe settings, launched the game... but no AA :( ... I must be missing a setting in nHancer???
 
OK, I finally got 4xMSAA forced in XP... the FPS in Vista (4xAA) and XP (4xMSAA) were pretty much exactly the same for me. I'll just stick with Vista with AA on; the lowest it ever dips to is 30fps, very playable.
 
EXACTLY! Same here. So many of the hyped features for Vista were either removed, or turned out to be not that big of a deal, or are just a pure annoyance (like UAC).

And when I do install Vista, it will be as a dual boot, with XP as the primary.

LOL, where are all the rabid Vista defenders? So far in my dual XP/Vista 64 boot I have installed exactly one game in Vista: Bioshock. Then I installed it on XP and saw no differences. Since it was the only game on Vista, I said the hell with it, deleted it, and installed it for good on XP. Same with The Witcher, which is a DX9 game anyway.

So far there has been no compelling reason at all for me to switch to Vista as a gamer, and I boot into Vista so infrequently it's not even funny. Every time these comparisons come out I wait for "The Game" that will make me move over, but every time I read one of these it's another disappointment for DX10. My wait continues, as I happily play with my XP. :)
 
I used to think they could offer two FOV choices to keep everyone happy: say, a default of 75 degrees and a 90-degree option.

But then I realized that half the widescreen squawkers wouldn't be happy unless you locked out 4:3 users from the alternate choice. They figured they bought a WS monitor so they deserve something special. :rolleyes: It isn't really about having a proper view for their monitors; it is just more e-peen crap.

IT'S CALLED WIDE SCREEN, NOT SHORT SCREEN
VALVE DOES IT
ID DOES IT RIGHT
MASSIVE DOES IT RIGHT
WHY CAN'T EPIC?

and only 15%?
HAVE YOU TRIED TO BUY A PC LATELY WITH A 4:3 MONITOR?
EVEN THE CHEAPEST DELL COMES WITH A 17" WS MONITOR
http://www.dell.com/content/products/features.aspx/inspndt_bundles?c=us&cs=19&l=en&s=dhs
WE HAVE A RIGHT TO BITCH

MAYBE WE SHOULD NOT HAVE MULTITHREADED GAMES, SINCE ONLY 24% HAVE 2 CPUS?

800 x 600 2.12 %
1024 x 768 36.66 %
WIDE SCREEN 1152 x 864 6.02 % <-- 16:10
WIDE SCREEN 1280 x 800 40.89 % <--16:10
WIDE SCREEN 1440 x 900 5.07 % <--16:10
1600 x 1200 1.75 %
WIDE SCREEN 1680 x 1050 4.91 %
WIDE SCREEN 1920 x 1200 1.37 %
Other 1.21 %

LET'S DO THE MATH HERE

OH SHIT, WHO'S IN THE MINORITY NOW, BITCH
58% USE A WIDESCREEN RES
4:3 IS OLD AND DEAD
MOVIES SHOW MORE IN WIDE SCREEN
TV SHOWS AND SPORTS CASTS SHOW MORE

YOUR LOGIC IS FLAWED
 
You sure this works in Vista? I set the appropriate in-game option to OFF (DX9), then I renamed startup.exe to Oblivion.exe, then I loaded up nHancer and adjusted the Oblivion.exe settings, launched the game... but no AA :( ... I must be missing a setting in nHancer???

That is not possible in-game; that shot is 32x AA. You have two methods: rename the exe, or the nHancer method.

[attached screenshots: wargame-g4wlive2007-11-0810-22-04-8.jpg, HJJHJ.jpg]


My ini

ALLOWD3D10=FALSE
MAXMULTISAMPLES=0

A question: is there some problem with the image size and the forum rules?
 
IT'S CALLED WIDE SCREEN, NOT SHORT SCREEN
VALVE DOES IT
ID DOES IT RIGHT
MASSIVE DOES IT RIGHT
WHY CAN'T EPIC?

[snipped rant]

See, that's exactly what I was talking about. You come in here and actually say you have "a right to bitch". But you don't. To use your analogy: if I buy a quad-core processor, which has a slower speed per core than an equivalently priced single or dual core (which it will, of course), I would then have a right to demand that all games run faster on it, which obviously they won't, as some are more optimised towards running via a single faster thread. I have a dual-core 2.6GHz, and know for a fact that a single 3.2GHz would be better for some games. I don't go calling those bad games or bad companies because of it.

Multi-core support is nice to have, as is WS, but ultimately you made a purchasing decision to move away from 4:3 and there is absolutely no reason the industry should optimise for what is a personal choice.

A bit of maths about the 17'' you mention:

17'' 4:3 has a height of 10.2'' and a width of 13.6'', giving an area of 138.7 sq. in.
17'' 16:10 has a height of 9.0'' and a width of 14.4'', giving an area of 129.9 sq. in.
17'' 16:9 has a height of 8.3'' and a width of 14.8'', giving an area of 123.5 sq. in.

For comparison:

16'' 4:3 has a height of 9.6'' and a width of 12.8'', giving an area of 122.9 sq. in.

Which clearly means that, assuming you want equivalent image quality, i.e. what you can see looks the same, you need to crop the widescreen picture, simply because there is less space to show it on. It is, in fact (as you were at pains to say it wasn't), shortscreen. Sure, if you get a bigger WS you can show more, but I can then get a bigger 4:3 and it'll show more at the top and bottom. To put it another way, if you try to squeeze more into that WS screen than the equivalent 4:3, the image will need to shrink accordingly, and using that argument I can do the same on a 4:3.
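To sanity-check the area figures above, here is a small standalone C++ sketch (mine, not from the post) that computes width, height, and area from a diagonal and an aspect ratio:

#include <cmath>
#include <cstdio>

// Width and height of a screen from its diagonal and aspect ratio (w:h).
static void screenDims(double diagIn, double aw, double ah)
{
    double r = aw / ah;                          // width:height ratio
    double h = diagIn / std::sqrt(1.0 + r * r);  // from d^2 = w^2 + h^2
    double w = h * r;
    std::printf("%g\" %g:%g -> %.1f x %.1f in, %.1f sq in\n",
                diagIn, aw, ah, w, h, w * h);
}

int main()
{
    screenDims(17, 4, 3);    // 13.6 x 10.2 in, 138.7 sq in
    screenDims(17, 16, 10);  // 14.4 x 9.0 in, 129.9 sq in
    screenDims(17, 16, 9);   // 14.8 x 8.3 in, 123.5 sq in
    screenDims(16, 4, 3);    // 12.8 x 9.6 in, 122.9 sq in
    return 0;
}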

While some of the larger screens have their uses (I have a 50'' WS TV), many people are buying these things just because they think it's the "right" choice, but mathematically, it really isn't, and I'm sure some people wouldn't want a WS on their desk anyway since they need to extend quite far sideways before you get a decent height.

My 20'' 4:3 is 12'' high by 16'' wide; to get the same height, a 16:10 would be 19'' wide and a 16:9 22''. In turn it would take up more space (which surely is why a lot of us switched from CRTs in the first place) and you would need to sit farther away from the screen to see it all at once. We're also in the realm of 24-25'' screens here which, while they're in my budget, are hardly standard issue or cheap, and certainly not the sort of thing you find on Dell's lower-end PCs. In fact, their $5000 XPS still only comes with a 24'' WS 2405FPW, which is a 16:10 screen and is therefore 13'' high by 20'' wide. The screen also costs $2000, a lot of money considering it's only 1'' taller than my screen, which was half that price. No 16:9 screens in sight at Dell at all, and if there were they'd be either massive or too short.

Sports footage and movies show more in WS because the source material is recorded in WS and cropped for 4:3, so WS is the "original image". If a game is designed in 4:3 (as for instance Bioshock was) that is then the "original image" and the WS view should be cropped accordingly so it's correctly framed. If a sports show or movie filled the extra space with stuff that wasn't supposed to be there I'm sure you'd still complain, and that's the case here.

One final point, to link into the subject of the thread. Have you tried to buy a PC with XP? That cheap Dell you linked to, and most others, now come with Vista. For this reason the number of Vista users is growing, just as the number of WSers is. Doesn't make Vista better than XP though, does it? And in fact, many people are getting pretty pissed off with companies making Vista-exclusive/enhanced games/content when there is no particular reason to do so. Cf. widescreen.
 
See, that's exactly what I was talking about. You come in here and actually say you have "a right to bitch". But you don't.

[snipped: full quote of the post above]

You do not seem to understand PIXEL SPACE, do you? The physical size of the monitor itself is not what dictates what you can see.

Let's take 1920x1200 here, because it's easy: it is the widescreen of 1600x1200, and they have the SAME vertical space. Does this mean that at 1024x768 you should see LESS than someone at 1600x1200? Same image, more pixels. You are getting size, aspect, and pixel space mixed up.

The biggest issue here is that with a game that's cropped for widescreen, the wider you go, the more gets cropped. You're right that there are NO 16:9 PC monitors, because PC monitors are 16:10, which is the aspect of two pages of paper side by side.
NOTE:
SXGA and WSXGA+: 1680x1050 is the widescreen of 1280x1024.
UXGA and WUXGA: 1920x1200 is the widescreen of 1600x1200.
THIS IS HOW MONITORS WORK, not based on physical size. If that were the case, 13" laptops would have to have the image cropped even more, regardless of what res they are.
Also note that widescreen resolutions always have the same vertical res or MORE than their 4:3 counterpart.

[image: Vector_Video_Standards2.png, a chart of common display resolutions and aspect ratios]
 
This game crashes constantly, in-game and at the start menu. If you leave it idle long enough, about 1-3 minutes, it'll crash. But if you alt-tab out of the game when it crashes and enter again, it runs fine. I'm not sure what the game is trying to access that is causing it to crash. It's an annoyance, and hopefully there'll be a patch soon.
 
I understand pixel space just fine, but I don't think it's relevant to gaming. I would expect to see exactly the same view of the game (in terms of elements displayed) in 640x480 as I would in 1600x1200; it'd just look more blocky in one case. However, on a smaller screen an image which shows the same view will necessarily show all the elements at a smaller scale, meaning that the smaller the screen, the harder it is to see things (common sense).

The point here is that if you make the monitor shorter and wider, e.g. a 17'' 4:3 to a 17'' 16:10, there is less height for the picture to fit into, so if you keep the vertical FOV the same, the picture will be harder to see. The resolution doesn't come into it, as when gaming the resolution only dictates how clear the image is, not how much you can see. On the desktop or in 2D apps it's a different matter, but you have no larger a viewing area at high res than you do at low res when playing GoW, just more detail.

If a figure in game is 10'' tall on a 17'' 4:3, it will take up all the vertical space, and if the resolution changes, the figure will remain the same size, obviously. If you want to display that same figure on a 17'' 16:10, you only have 9'' of space to work with, so to fit the image onscreen you must either downsize it (what you are advocating) or crop the top/bottom 1''. If you downsize the image, you are putting the same amount of content in a smaller space and therefore see less detail. Again, this is irrespective of resolution; it's simply that smaller things are harder to see.

To keep the same level of detail visible you have to crop the image, which is what Unreal Engine 3 does, as seen in Bioshock. The WS "fix" for that game forces the game to display the same amount of vertical content as in 4:3, plus additions to both sides, but that means that you are gaining content at the expense of detail, unless you increase the physical size of the screen. You can jack up the resolution all you want but details too small (physically) will still be impossible for your eyes to perceive.

If I go and get a 24'' screen that can do 1920x1200 and stand it next to my 20.1'' 1600x1200, then yes, it SEEMS (but isn't; you'll see why in a minute) logical to expect it to display more of the image, i.e. the 4:3 picture at exactly the same scale plus extra content on each side. But wait a minute: what if I then go and fetch a larger 1920x1440 4:3 screen? By that logic I should now see even more on that one than on the WS. Clearly, if we follow this pattern, by the time we get up to a 50'' or so screen you'd have a fisheye or nearly 360-degree wraparound view, which is just silly. Therefore the argument that either more pixels or more size = a bigger view is flawed.

This is interesting, since it's an easier way of getting to the point I was trying to make, which is that neither 4:3 nor widescreen is "better" in terms of viewing area. The same arguments that can be used for one also work for the other (as you can keep adding more vertical/horizontal resolution and/or physical size ad infinitum), and it is therefore completely arbitrary which one developers choose to show more on. It's purely a matter of personal choice, and it's a complete fallacy that you see more on a widescreen. Which brings us back to us 4:3 gamers having just as much of a right to complain when WS gets the better view. But we don't.
 
You can't get the screen's physical size; that's a load of shit. By that logic laptops would show even less if they're under 17". Also, find me a laptop these days that's not widescreen.

BUY A BIGGER SCREEN
Should we lock all games to 1024x768 just because most people can't afford a top-end PC? Let's all go back to DX7, since most people still own a GF4 MX card.

And AA: let's lock out AA in ALL games, because you know little John's 6200 can't run games with AA, so it's not fair.

Valve got it right.
id got it right.
Massive FIXED their game within 2 weeks of their support being messed up, and the next patch even added support for TripleHead2Go users; that's 3:1.

The thing is, if you crop for widescreen, the wider your screen, the LESS you see vertically. Windows doesn't crop my desktop, so why should games?

FOV is based on aspect and pixel space, not physical size: wider screen, wider FOV. What's to say someone with a 24" widescreen needs to play at 1440x900? They're getting shafted by your logic, since their screen is physically as tall or taller than a 20" 4:3.
Also, most LCDs are 1280x1024 at 19" and 1600x1200 at 20". My 22" is 1680x1050, with the SAME physical height as a 19" but a finer dot pitch and more vertical pixel space; I should have the same (more, really) view in a game as someone with a 19" 5:4 monitor, but wider.

The projection matrix in games is based on the res, not the screen type, unless the FOV is locked. There is no way for a game to know what size your monitor is. If the pixel pitch is too high for you, get a bigger screen; it's not my fault you're poor and can't afford a better monitor.

And the same goes for video cards: maybe ATI and NVIDIA should only sell one card, the low end, so everyone can afford it. Oh wait.

Sounds like you need to get out of PC gaming and get a console.
 
IT'S CALLED WIDE SCREEN, NOT SHORT SCREEN
VALVE DOES IT

Snippage: All caps E-peen rantings deleted.
Your childish rant punctuated my point perfectly, almost satirically so.

All I was advocating was having a switchable FOV. There is no reason you shouldn't let someone with a 20" 16x12 monitor choose the same horizontal FOV as someone with a 20" 16x10 monitor, unless you are a frothing-at-the-mouth, "my aspect ratio is superior" nutbar.
 

Snippage: All caps E-peen rantings deleted.
Your childish rant punctuated my point perfectly, almost satirically so.

All I was advocating was having a switchable FOV. There is no reason you shouldn't let someone with a 20" 16x12 monitor choose the same horizontal FOV as someone with a 20" 16x10 monitor, unless you are a frothing-at-the-mouth, "my aspect ratio is superior" nutbar.

I'm not; I'm talking about the inverse of that. And BTW, the widescreen of 1600x1200 is 1920x1200, and 1400x1050 is the 4:3 of 1680x1050.

1600x1200 has the same vertical res as 1920x1200. I have no issue with anyone using any FOV they want; if you want to use 180 at 4:3, be my guest, but I don't like fisheye myself.
 
FOV is purely a function of the physical size of the screen and the distance your head is from it - resolution doesn't come into it. This should be obvious if you think about what FOV represents. I'll add more detail later if needed.
 
No, that's not how it works in games.

There is no way for the game to know (a) what the physical size of the screen is, or (b) how far you are from it. Therefore FOV in games is based on resolution, and nothing else. The only thing the game knows is what res you're using. Locking the FOV to, say, 90 at 4:3 means that at 16:10 you pull in on that, cropping the view.

All this talk about viewing distance and screen physical size is BULLSHIT unless everyone is using ONE SIZE AND ASPECT OF MONITOR, WHICH WE'RE NOT.

Now admit you lost, and that Epic fucked up just like with the AA. If you can bitch about the lack of AA, I can bitch about widescreen.

Because we both know the mid-range cards FAR outnumber the high end, and there's no way a mid-range card is going to run AA at good frame rates. It's not fair, so I say outlaw AA IN ALL GAMES. Who needs it, right?
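To make the projection-matrix point concrete, here is a hedged C++ sketch (function names are mine, not from any engine) of the two common ways a game derives FOV from the render resolution's aspect ratio: Hor+, where the vertical FOV is fixed and a wider aspect sees more to the sides (the Valve/id behaviour praised above), and Vert-, where the horizontal FOV is locked at the 4:3 value and a wider aspect gets cropped top and bottom (the behaviour being complained about):

#include <cmath>
#include <cstdio>

static const double PI = 3.14159265358979;

// Hor+: vertical FOV fixed, horizontal FOV grows with a wider aspect.
static double horFovFromVert(double vertFovDeg, double aspect)
{
    double v = vertFovDeg * PI / 180.0;
    return 2.0 * std::atan(std::tan(v / 2.0) * aspect) * 180.0 / PI;
}

// Vert-: horizontal FOV locked, vertical FOV shrinks with a wider aspect.
static double vertFovFromHor(double horFovDeg, double aspect)
{
    double h = horFovDeg * PI / 180.0;
    return 2.0 * std::atan(std::tan(h / 2.0) / aspect) * 180.0 / PI;
}

int main()
{
    // Same 75-degree vertical FOV at 4:3 and 16:10: widescreen sees more.
    std::printf("Hor+ : 4:3 -> %.1f deg, 16:10 -> %.1f deg horizontal\n",
                horFovFromVert(75.0, 4.0 / 3.0), horFovFromVert(75.0, 1.6));
    // Locked 90-degree horizontal FOV: widescreen loses vertical view.
    std::printf("Vert-: 4:3 -> %.1f deg, 16:10 -> %.1f deg vertical\n",
                vertFovFromHor(90.0, 4.0 / 3.0), vertFovFromHor(90.0, 1.6));
    return 0;
}

Either behaviour is trivial to compute from the resolution alone; which one a game picks is a design choice, not a technical necessity.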
 
Obviously the game doesn't know the size of the screen or where your head is; you're expected to place your head appropriately! You wouldn't game from 20 feet away or with your nose on the screen, would you?

The FOV is NOT related to resolution in the slightest. Are you implying that 1600x1200 has a wider FOV than 640x480?

An FOV of 90° is just as valid for 16:9 as it is for 4:3; you just put your head in a different place. Your FOV (of your eyes, not any game) is the angle between imaginary lines drawn from the two edges of your view to the centre point of your eyes. If the edges of said view are the edges of the monitor, as they should be when gaming to preserve realism, then the FOV is necessarily related to how big the screen is.

To use my screen as an example, for a 90° FOV, the screen is 16'' wide so the calculation is as follows:

Let x inches be the distance from my eyes to the screen

Then using tan=opposite/adjacent (basic trigonometry):

tan(90°/2)=8/x
tan(45°)=8/x
1=8/x
x=8

So I should position my face 8'' from the screen.

For your 22'' WS with 90° FOV, first get the width:

22²=(width)²+(height)²
484 = (width)² + ((10/16)*width)²
484 = (width)² x (356/256)
123904 = 356 x (width)²
width²=348
width=18.66in
(and height=11.66in)

So:

tan(90°/2)=9.33/x
x=9.33

So basically you just move your face back 1.33'', and then a 90° FOV is fine. Strikes me the WSers complain too much; the point here is that 16:10 and 4:3 are both as valid as each other, and there is no argument to say either should have a bigger view. The AA analogy doesn't work, as AA is clearly better than no AA: you're gaining something tangible, whereas with WS you're just changing the shape of what you have. If you want to see more to the sides, just use 4:3 and move your face in. You will notice that as you move closer the view expands over a greater angle, i.e. a closer screen means a larger FOV, and you then have the added bonus of having content above and below you if you want to look around. WS is overrated.
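The same trigonometry as the post above, as a small C++ sketch (mine, just to check the arithmetic): given a screen width and a desired horizontal FOV, it computes how far back to sit.

#include <cmath>
#include <cstdio>

// tan(fov/2) = (width/2) / distance  =>  distance = (width/2) / tan(fov/2)
static double viewDistanceIn(double screenWidthIn, double fovDeg)
{
    double halfRad = fovDeg * 3.14159265358979 / 360.0;
    return (screenWidthIn / 2.0) / std::tan(halfRad);
}

int main()
{
    // The two screens from the worked example above.
    std::printf("16.00\" wide 4:3 @ 90 deg: sit %.2f\" away\n",
                viewDistanceIn(16.00, 90.0));   // 8.00
    std::printf("18.66\" wide 16:10 @ 90 deg: sit %.2f\" away\n",
                viewDistanceIn(18.66, 90.0));   // 9.33
    return 0;
}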
 
This game crashes constantly, in-game and at the start menu. If you leave it idle long enough, about 1-3 minutes, it'll crash. But if you alt-tab out of the game when it crashes and enter again, it runs fine. I'm not sure what the game is trying to access that is causing it to crash. It's an annoyance, and hopefully there'll be a patch soon.

If it was a widespread problem, I could see them making a patch, but in my own experience it works fine with no problems, at least so far.
 