Games that don't have Fisheye with Eyefinity?

FireBean

Gawd
Joined
Apr 12, 2010
Messages
994
Thread title says it all. I got my new Eyefinity setup all working, and I love it. There is just one problem that I keep encountering that totally ruins my experience: the damned fish-eye effect. I tried multiple games already, adjusting the FOV for each of them, but I can't get rid of the fish-eye effect.

I know that the games are meant to be viewed on a single 2D plane or something like that. I just don't understand people when they start to defend it. IMHO, it's not fracking realistic at all! I know a lot of people will say that it is, and that's how your peripheral vision works, or to just imagine that it's supposed to be like that. I'm sorry, but when an object is at the center of your view, 100 meters out, and looks a certain size, it doesn't get zoomed in when it's at the edge of your peripheral vision.

So, are there any games where this effect doesn't happen, or did I just waste my money?
 
Try playing 3rd person games and imagining that the camera is swinging behind the character until you get used to not looking at the side monitors. If you simply can't live with it, you can reduce fisheye by running portrait mode. I would also recommend playing driving and flying games. Arma 2 is my favorite. If you want to completely remove fisheye from games like BC2, try FOVs around 30.
 
What monitors are you using?

If you're lucky you can rotate them, or buy a stand that rotates, and put them in portrait mode; that's how I set mine up. I love the extra vertical space, but the FoV isn't too much larger horizontally. Overall, though, everything is way less squashed than it seems in landscape Eyefinity, there are zero fisheye problems that I've seen so far, and it makes for a killer desktop workspace.
 
You can see the supposed "fisheye" effect on a single monitor if you're looking for it. The farther an object moves from the center of the screen, the wider it becomes; that's just how game cameras work.

Game cameras all use fixed one-point perspective, with the point locked to the dead center of the display. As long as you're looking at the point they've chosen, the camera will look correct. Everything beyond that falls within the camera's peripheral vision, and therefore should also stay within your peripheral vision. The elongation of objects is the correct way to represent them for your peripheral vision (assuming the display is a flat plane, not curved).

A game that wanted to allow you to look away from the center would need an expensive (high-resolution, high-speed) eye-tracking rig so that it can move the game camera's point of perspective in accordance with where you're looking.
 
That's why I play in portrait mode; in too many games I was seeing the fisheye look. I am very happy with my portrait setup. You actually feel you're in the game.
 
You can see the supposed "fisheye" effect on a single monitor if you're looking for it. The farther an object moves from the center of the screen, the wider it becomes; that's just how game cameras work.

Game cameras all use fixed one-point perspective, with the point locked to the dead center of the display. As long as you're looking at the point they've chosen, the camera will look correct. Everything beyond that falls within the camera's peripheral vision, and therefore should also stay within your peripheral vision. The elongation of objects is the correct way to represent them for your peripheral vision (assuming the display is a flat plane, not curved).

A game that wanted to allow you to look away from the center would need an expensive (high-resolution, high-speed) eye-tracking rig so that it can move the game camera's point of perspective in accordance with where you're looking.

This isn't true for Supreme Commander 2, which means that it's not true for (at least) any isometric game, were developers to code as such. There is zero, and I do mean zero, fisheye effect in that game across a 3xL configuration.

Anyway, this is exactly why I've been playing on my "mismatched" Eyefinity setup. Much more logical than the recommended config.
 
This isn't true for Supreme Commander 2, which means that it's not true for (at least) any isometric game, were developers to code as such. There is zero, and I do mean zero, fisheye effect in that game across a 3xL configuration.

Anyway, this is exactly why I've been playing on my "mismatched" Eyefinity setup. Much more logical than the recommended config.

SupCom (and most isometric, side-scroller, and top-down games) uses an orthographic projection to achieve its render. This eliminates scaling with depth, so objects of the same size take up the same screen space no matter their depth in the scene. "3D" games (shooters, racers, etc.; basically all other games) use a perspective projection, which simulates as best it can the way we view the world. Objects of the same size will take up different screen space based on their depth in the scene.

The fisheye effect is an artifact of this, and cannot be avoided. At the same time, games in a 3D world can't really use an orthographic projection, as it would eliminate any sense of depth and make navigating them near impossible.
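To make the difference concrete, here's a minimal Python sketch (my own toy numbers, not code from any engine) of how the two projection types treat depth. With a perspective projection, the same 1-unit-wide object lands at a different screen position depending on how far away it is; with an orthographic projection, depth is ignored entirely:

```python
import math

def perspective_x(x, z, fov_deg):
    # Perspective projection: screen position depends on depth (z).
    # Objects farther away land closer to the center and appear smaller.
    half = math.tan(math.radians(fov_deg) / 2)
    return (x / z) / half  # normalized screen coordinate, -1..1

def orthographic_x(x, half_width):
    # Orthographic projection: depth is ignored entirely.
    # An object's screen position and size are the same at any distance.
    return x / half_width

# The same point, 1 unit off-axis, at two different depths:
for z in (10.0, 100.0):
    print(f"z={z:>5}: perspective={perspective_x(1.0, z, 90):.4f}, "
          f"ortho={orthographic_x(1.0, 50):.4f}")
```

That depth-dependence is exactly what gives 3D games their sense of distance, and it's also what makes the fisheye artifact unavoidable with a single wide perspective camera.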
 
This isn't true for Supreme Commander 2, which means that it's not true for (at least) any isometric game, were developers to code as such. There is zero, and I do mean zero, fisheye effect in that game across a 3xL configuration.
The poster above me did a very good job explaining this.

An orthographic camera doesn't display this behavior, but it's really only top-down or isometric games that can even think about using orthographic projection. It's unusable in 1st / 3rd person games due to the complete lack of perspective and depth scaling... believe me, I've tried. You can set a 3D camera to Orthographic in UnrealEd, and it makes it impossible to figure out where you're going.

Anyway, this is exactly why I've been playing on my "mismatched" Eyefinity setup. Much more logical than the recommended config.
What's more logical about playing with mismatched monitors?

Having smaller flanking monitors makes it impossible to correctly align your peripheral and central vision with the given FOV. For a given size of display, and a given FOV, your head is supposed to be a certain distance away from the display (this is why console games tend to have a lower FOV; they're intended to be viewed from farther away, which means a lower FOV is correct). Since you have two different sizes of screen, you'll always be out of whack.
 
What's more logical about playing with mismatched monitors?

Having smaller flanking monitors makes it impossible to correctly align your peripheral and central vision with the given FOV. You'll always be out of whack.

What's more logical is that it looks better and plays better - for the games I play. You don't have to "align" anything because you're focusing on the center monitor - which is now actually feasible, since it's bigger than the side monitors. I never notice that the content on the side monitors is misaligned with what's on the center monitor unless I'm playing a game where there's a fixed HUD that goes across the screens.

So it might be an issue in some racing sims but it's ideal for FPS, which is where I spend most of my Eyefinity time. I used to run things the "normal" way and I like this much better.
 
The stuff about orthographic camera makes sense, though - thanks for the explanation.
 
What's more logical is that it looks better and plays better - for the games I play. You don't have to "align" anything because you're focusing on the center monitor - which is now actually feasible, since it's bigger than the side monitors. I never notice that the content on the side monitors is misaligned with what's on the center monitor unless I'm playing a game where there's a fixed HUD that goes across the screens.

So it might be an issue in some racing sims but it's ideal for FPS, which is where I spend most of my Eyefinity time. I used to run things the "normal" way and I like this much better.
This all sounds horribly wrong.

Your peripheral displays will be too small, which means their FOV will actually be much higher than it should be relative to the larger monitor (since your seating distance for both will be the same). You've just made the problem worse, not better.

Objects also won't smoothly transition across displays; their size will suddenly jump down. That's pretty jarring, even in my peripheral vision...

The game camera is set up to project to a rectangular flat plane. It will align with your vision correctly on such a surface (tilting your side monitors inwards is actually bad, and causes fish-eyeing to be noticeable even in your peripheral vision). What you've done breaks everything.
 
You can theorycraft and wring your hands all day long, but I'm speaking from experience. It's a better experience for every game I play, and it's perfectly logical to me.

The point that everyone emphasizes is this: "stare at the center monitor." Well, OK, how the hell do you do that when the center monitor seems smaller than the side monitors (they're the same size, but angling makes the center seem smaller)? It's blatantly counter-intuitive. At best, you're looking at a contiguous piece of real estate where the "center" is no bigger than the "peripheral". Ergo, there is nothing peripheral about the peripheral and nothing central about the center.

The mismatched setup solves that problem, period. The center is bigger, so you are naturally drawn to it. Conversely, the sides are smaller, so you are not obsessing over what you are "seeing" on them; it's enough to know that it is "there."

Zorachus, another longtime user, discovered the same exact thing 1 or 2 years ago in another thread he started.

It just works, with the exception of games where a fixed and ever-present HUD is stretched across all three screens (the buggy in HL2).
 
You can theorycraft and wring your hands all day long, but I'm speaking from experience. It's a better experience for every game I play, and it's perfectly logical to me.
No theory or speculation involved. The methods by which FOV is calculated and the laws of perspective are quite clear-cut. What you're doing doesn't make any technical or logical sense.

The point that everyone emphasizes is this: "stare at the center monitor." Well, OK, how the hell do you do that when the center monitor seems smaller than the side monitors (they're the same size, but angling makes the center seem smaller)? It's blatantly counter-intuitive.
The game camera expects that you DON'T angle the monitors. It needs to be projected onto a flat surface in order for the resulting image to be optically correct.

At best, you're looking at a contiguous piece of real estate where the "center" is no bigger than the "peripheral". Ergo, there is nothing peripheral about the peripheral and nothing central about the center.
Now you're not making any sense. Three same-size displays create a wide rectangle. The center of the rectangle is central, and the edges of the rectangle are peripheral. Making the edges larger or smaller will not change the fact that they are the edges and are peripheral (as long as you remain looking forward, of course).

The side monitors are located farther from the center of your visual field, ergo they are in your peripheral vision, and objects on them are elongated so that, when observed through the periphery of your vision, they appear the correct size and dimensions. Assuming you are at the proper sitting distance for the given FOV, this will generate an optically correct image where it almost feels as if the scene wraps around the sides of your head.

If you've ever messed with setting correct FOV values before, you WILL know when you get it right. All of a sudden everything "clicks," and as long as you don't look away from center, it feels uncannily like looking into a room rather than looking at an image of a room (which is what you usually experience when the FOV is too high).

The mismatched setup solves that problem, period. Zorachus, another longtime user, discovered the same exact thing 1 or 2 years ago in another thread he started.

It just works, with the exception of games where a fixed and ever-present HUD is stretched across all three screens (the buggy in HL2).
Doesn't solve any problems, and in fact it creates new ones, as it breaks all rules of FOV and perspective. I've emulated such a setup before by window-boxing the image on my side monitors to see if I could make any sense of it, and all it did was confirm my suspicion that it is entirely the wrong thing to do and can't look anything but incorrect.
 
This issue comes up so often, I decided to whip up a handy visual aid so everyone can better understand what's happening, and WHY it's happening.

unled1rc.jpg


Let's say we have an object at Position A. It is displayed at the width shown at Position A, and its apparent size when viewed by an observer looking straight ahead is shown at Position 1.
Now, let's say we move that object to the side until it is at Position B. It has to be drawn much wider so that, by the time the image reaches the observer at Position 2, it is the same size as it was at Position 1.

The higher you make your FOV, the more pronounced this elongation becomes, because it expects the observer to be positioned closer to the display. Ergo, if fisheye is bothering you, you either need to SIT CLOSER or TURN DOWN the FOV.

This requires all monitors to be the same size and positioned in a flat plane to work correctly. Angling the monitors destroys the lines of perspective that the camera assumes will be used. Using smaller monitors gives the side monitors a higher relative FOV and again destroys the lines of perspective.

There are such things as cylindrically and spherically projected cameras, but those require a cylindrical or spherical display to look correct. No game really uses those methods because such displays are extremely rare.
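For anyone who wants to put numbers on the "sit closer or turn down the FOV" advice, the optically correct FOV for a flat display is simple trigonometry. A quick sketch; the monitor sizes and distances below are just example values:

```python
import math

def correct_fov_deg(display_width, viewing_distance):
    # Horizontal FOV that is optically correct for a flat display of the
    # given physical width, viewed head-on from the given distance
    # (any consistent units).
    return math.degrees(2 * math.atan(display_width / (2 * viewing_distance)))

def correct_distance(display_width, fov_deg):
    # Viewing distance at which a given FOV becomes optically correct.
    return display_width / (2 * math.tan(math.radians(fov_deg) / 2))

# Three 52 cm wide monitors in a flat row (156 cm total), eyes 60 cm away:
print(correct_fov_deg(156, 60))   # ~105 degrees is the correct setting
print(correct_distance(156, 90))  # for a 90-degree FOV, sit ~78 cm away
```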
 
I can see where he's coming from about the side monitors APPEARING to be larger as an optical illusion, and yes, the fisheye effect is just something we have to live with until devs figure out a way to properly render a game for multiple displays.

Let's not forget that multi-monitor gaming for the masses is still cutting-edge tech that devs are still working on coding for. With games taking years to develop, the games coming out now may not have had any idea that multi-monitor for the common man was going to take off like it has.
 
the fisheye effect is just something we have to live with until devs figure out a way to properly render a game for multiple displays.
This IS how you properly render for multiple displays on a flat plane... The only thing missing is eye-tracking, which would allow you to look anywhere rather than just dead-center.

Or did you mean that you want options to correct for tilted peripheral displays, or options to use cylindrical projection for curved displays?

Edit: As an example, this is what happens when you attempt to use cylindrical projection when using a flat display (composited from Portal 2 screenshots). The optically correct elongation of peripheral vision you normally see has been replaced by the entire image being horribly warped. Straight lines have been lost, and the edges of the image appear to curve away from you, as a curved display (with the edges curved towards you) is expected:

unledpanorama1.jpg
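The difference between the two mappings is easy to show numerically. A rectilinear (normal) projection places things on screen in proportion to tan(angle), while a cylindrical projection places them in proportion to the angle itself; that's why cylindrical kills the edge elongation but bends straight lines on a flat display. A rough sketch with made-up values:

```python
import math

def planar_x(theta_deg, fov_deg, width_px):
    # Rectilinear projection: screen x is proportional to tan(azimuth).
    # Equal angular steps take up MORE pixels near the edges (elongation).
    half = math.tan(math.radians(fov_deg) / 2)
    return (math.tan(math.radians(theta_deg)) / (2 * half) + 0.5) * width_px

def cylindrical_x(theta_deg, fov_deg, width_px):
    # Cylindrical projection: screen x is proportional to the azimuth itself.
    # Equal angular steps take equal pixels, so no elongation at the edges,
    # but world-straight lines no longer stay straight on a flat screen.
    return (theta_deg / fov_deg + 0.5) * width_px

# A 120-degree FOV across a 3000 px wide Eyefinity surface:
for theta in (0, 30, 60):
    print(f"{theta:>2} deg: planar={planar_x(theta, 120, 3000):7.1f} px, "
          f"cylindrical={cylindrical_x(theta, 120, 3000):7.1f} px")
```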
 
This IS how you properly render for multiple displays on a flat plane... The only thing missing is eye-tracking, which would allow you to look anywhere rather than just dead-center.

Or did you mean that you want options to correct for tilted peripheral displays, or options to use cylindrical projection for curved displays?

Edit: As an example, this is what happens when you attempt to use cylindrical projection when using a flat display (composited from Portal 2 screenshots). The optically correct elongation of peripheral vision you normally see has been replaced by the entire image being horribly warped. Straight lines have been lost, and the edges of the image appear to curve away from you, as a curved display (with the edges curved towards you) is expected:

unledpanorama1.jpg

...I meant them coming up with a way to keep everything in perspective....without additional peripherals such as eye tracking...
 
...I meant them coming up with a way to keep everything in perspective....without additional peripherals such as eye tracking...
The only way to keep everything in perspective regardless of where you look is to use eye-tracking. This will allow the game to shift its single point of perspective to stay aligned with your field of vision at all times.

Either that, or pop your head in the middle of a spherical display and use an in-game camera based on spherical projection. Honestly, eye-tracking is easier to implement since it doesn't require custom display hardware, just a good camera and some background software.
 
This issue comes up so often, I decided to whip up a handy visual aid so everyone can better understand what's happening, and WHY it's happening.

Let's say we have an object at Position A. It is displayed at the width shown at Position A, and its apparent size when viewed by an observer looking straight ahead is shown at Position 1.
Now, let's say we move that object to the side until it is at Position B. It has to be drawn much wider so that, by the time the image reaches the observer at Position 2, it is the same size as it was at Position 1.

The higher you make your FOV, the more pronounced this elongation becomes, because it expects the observer to be positioned closer to the display. Ergo, if fisheye is bothering you, you either need to SIT CLOSER or TURN DOWN the FOV.

This requires all monitors to be the same size and positioned in a flat plane to work correctly. Angling the monitors destroys the lines of perspective that the camera assumes will be used. Using smaller monitors gives the side monitors a higher relative FOV and again destroys the lines of perspective.

There are such things as cylindrically and spherically projected cameras, but those require a cylindrical or spherical display to look correct. No game really uses those methods because such displays are extremely rare.

The fisheye effect is basically what's stopped me from getting an Eyefinity setup. I find it far too distracting.

I know WHY it's there, but I still think it's wrong for an Eyefinity setup. If you have your setup as a long flat array of screens it's fine, but I would rather have a wrap-around arrangement of monitors, which would be better suited to a cylindrical projection, which unfortunately almost no games use (I believe rFactor does use a cylindrical projection, not sure).

So I want this...

3screenfov.png



Not this....

1screenfov.png


Which would be better approximated with a cylindrical projection.
 
The only way to keep everything in perspective regardless of where you look is to use eye-tracking. This will allow the game to shift its single point of perspective to stay aligned with your field of vision at all times.

Either that, or pop your head in the middle of a spherical display and use an in-game camera based on spherical projection. Honestly, eye-tracking is easier to implement since it doesn't require custom display hardware, just a good camera and some background software.

I was talking about maybe coding in multiple perspectives or something. Also, just because we can't do it now doesn't mean it can't be done. I'm not trying to argue here, I'm just being optimistic.
 
A lot of people find the fish-eye effect becomes less noticeable and seems to 'go away' if you adjust your seating position too. Moving closer to, or in some cases further from, the screen will cause the peripheral monitors to better match up with your actual peripheral vision. The fish-eye effect remains, but it's properly in your peripheral vision, so you don't really notice it as much.
 
A lot of people find the fish-eye effect becomes less noticeable and seems to 'go away' if you adjust your seating position too. Moving closer to, or in some cases further from, the screen will cause the peripheral monitors to better match up with your actual peripheral vision. The fish-eye effect remains, but it's properly in your peripheral vision, so you don't really notice it as much.

To me that's like saying "it's wrong, but just try harder to ignore it and you won't notice it's wrong".

I know not many games do it currently, but I'd like to see some games starting to display wide FOVs properly.

To me, multimonitor SHOULD be such that the centre monitor is where your body is pointing, but the side monitors allow you to move your head to view more, just like real life. I know that's not what it is; it's just what I wish it was.
 
If you've ever messed with setting correct FOV values before, you WILL know when you get it right. All of a sudden everything "clicks," and as long as you don't look away from center, it feels uncannily like looking into a room rather than looking at an image of a room (which is what you usually experience when the FOV is too high).

This is the best description I've ever read of how Eyefinity looks to me. It really feels like I'm looking into a 3D image when it's all set up just right. There appears to be far more depth into the monitor than I've ever been able to see on a single-monitor setup.
 
The point that everyone emphasizes is this: "stare at the center monitor." Well, OK, how the hell do you do that when the center monitor seems smaller than the side monitors (they're the same size, but angling makes the center seem smaller)? It's blatantly counter-intuitive. At best, you're looking at a contiguous piece of real estate where the "center" is no bigger than the "peripheral". Ergo, there is nothing peripheral about the peripheral and nothing central about the center.
Neither Eyefinity nor any games are designed to support three monitors angled to each other.

Also, I'm not sure why this keeps coming up: it shouldn't matter whether or not you're staring at the center screen. If you imagine your screens as windows into another game world (as Unknown-One pointed out) and you have it set up correctly, you can look at your side screens all you want.

If you look at your side screens and see blatant stretching, it's not set up correctly.

The only true way to correct for fish-eye is to flatten the monitors (sucks, I know) into a single plane, and then either adjust the FOV or get closer to or farther from the monitors until it looks correct. (You have to imagine the side monitors aren't actually there and that you are looking through them, because obviously you can still tell that the images are stretched compared to the screens, but if you imagine the screens are windows into another world, the game world will eventually look correct.)

There is one major issue with this: game developers don't always give you easy controls to adjust FOV, and sometimes they don't give you the controls at all. Also, sometimes their default FOV forces you to either sit too close or too far away. Ultimately, I wish game developers gave you more advanced resolution, ratio, and FOV options in their video settings.
 
Neither Eyefinity nor any games are designed to support three monitors angled to each other.

Also, I'm not sure why this keeps coming up: it shouldn't matter whether or not you're staring at the center screen. If you imagine your screens as windows into another game world (as Unknown-One pointed out) and you have it set up correctly, you can look at your side screens all you want.

If you look at your side screens and see blatant stretching, it's not set up correctly.

The only true way to correct for fish-eye is to flatten the monitors (sucks, I know) into a single plane, and then either adjust the FOV or get closer to or farther from the monitors until it looks correct. (You have to imagine the side monitors aren't actually there and that you are looking through them, because obviously you can still tell that the images are stretched compared to the screens, but if you imagine the screens are windows into another world, the game world will eventually look correct.)

There is one major issue with this: game developers don't always give you easy controls to adjust FOV, and sometimes they don't give you the controls at all. Also, sometimes their default FOV forces you to either sit too close or too far away. Ultimately, I wish game developers gave you more advanced resolution, ratio, and FOV options in their video settings.

If "correctly" = a flat plane then that's kinda gay. :p If you want to set up your screens like in the first picture I posted a few posts back then there's no way to avoid stretching.

I wonder how hard it is for game developers to include an option for cylindrical projection for wrap-around Eyefinity set ups instead of flat plane set ups.
 
If "correctly" = a flat plane then that's kinda gay. :p If you want to set up your screens like in the first picture I posted a few posts back then there's no way to avoid stretching.

I wonder how hard it is for game developers to include an option for cylindrical projection for wrap-around Eyefinity set ups instead of flat plane set ups.
Yes, that is exactly what I meant, unfortunately.
For the game engine to support three monitors angled differently, the engine would basically have to render three different in-game cameras, one for each screen.

I'm not sure how difficult this would be, but I imagine it might even be possible for ATI to implement it in a driver (this is basically what NVIDIA has to do for stereoscopic vision; they render two different cameras that are a few inches apart, even though the game might not have anything like this programmed in).
 
I would rather have a wrap-around arrangement of monitors, which would be better suited to a cylindrical projection, which unfortunately almost no games use (I believe rFactor does use a cylindrical projection, not sure).

So I want this...

3screenfov.png

Sure, you could spawn two more cameras and rotate them horizontally slightly to compensate for your angled displays, but then it ends up looking like this

anglecorrection.jpg


You end up with blind spots where the cameras meet, and straight lines don't mate up properly across displays. It does make the perspective check out for angled side monitors, though. Since the cameras capturing for the peripheral screens are rotated, elongation of objects is not as pronounced as it has to be for a flat surface.
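A quick back-of-the-envelope sketch of why those blind spots appear, assuming each of the three cameras gets the same horizontal FOV and the side cameras are simply yawed outward (the angles below are made up for illustration):

```python
def coverage(camera_fov, side_yaw):
    # Angular ranges (degrees) covered by the center camera and a right-hand
    # camera yawed outward by side_yaw; any gap between them is a blind spot.
    center = (-camera_fov / 2, camera_fov / 2)
    right = (side_yaw - camera_fov / 2, side_yaw + camera_fov / 2)
    gap = max(right[0] - center[1], 0.0)
    return center, right, gap

# Yaw the side camera by more than the per-camera FOV and the frustums
# fail to meet; yaw it by exactly the FOV and they touch edge-to-edge,
# but the perspective lines still break at the seam because each camera
# keeps its own point of perspective.
print(coverage(40, 50))  # 10-degree blind spot between the screens
print(coverage(40, 40))  # frustums touch, no gap
```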

A lot of people find the fish-eye effect becomes less noticeable and seems to 'go away' if you adjust your seating position too. Moving closer to, or in some cases further from, the screen will cause the peripheral monitors to better match up with your actual peripheral vision. The fish-eye effect remains, but it's properly in your peripheral vision, so you don't really notice it as much.
1. To me that's like saying "it's wrong, but just try harder to ignore it and you won't notice it's wrong".

2. I know not many games do it currently, but I'd like to see some games starting to display wide FOVs properly.

3. To me, multimonitor SHOULD be such that the centre monitor is where your body is pointing, but the side monitors allow you to move your head to view more, just like real life. I know that's not what it is; it's just what I wish it was.

1. It's not wrong. The image is optically correct; you're just sitting too far away.

2. Many games do it currently because it's correct. They are, in fact, displaying wide FOVs properly.

3. Only possible with eye-tracking (a poor man's version is possible with head tracking) so that the in-game camera knows where you're looking. It can then adjust FOV based on how far your head is from the screens, and move the camera's single point of perspective to the exact point on your display(s) where your eyes are focusing.

Here are three examples of head-tracking being used to adjust FOV and perspective on the fly based on where your whole head is located and pointing.
Playstation3 Head Tracking
Head Tracking using faceAPI & webcam only
Head Tracking for Desktop VR Displays using the WiiRemote

This technique uses the exact same type of camera you've been complaining about, the difference is that this makes sure the outer edges of the "fisheye effect" stay in your peripheral vision AT ALL TIMES since the in-game camera is pointed where you're looking. It also lowers the FOV the farther from the screen you get, again, keeping it optically correct at all times and preventing the fisheye from entering your central vision.

As you can see in the second video, this only works for a single observer (the one being tracked), but it solves all your problems. There is no other way to accomplish this effect aside from free-standing holograms.

You could also just learn to play eyes-forward and swivel the mouse to rotate your in-game head rather than your real head. lol.
 
Well, AMD demos the system and advertises it on its website with the side monitors angled in, so once again, it's a question of real-world use versus theorycrafting. I tried a more obtuse angle, and it didn't make things any more immersive than an overly acute angle. I don't think I've seen a single gaming setup anywhere with Surround/Eyefinity NOT at an angle.
 
I don't see how games are rendering it properly when everything is stretched like hell. Putting the monitors flat won't fix that (if I had the desk space, though, I'd curve it; the transition between monitors isn't smooth when curved).
 
Sure, you could spawn two more cameras and rotate them horizontally slightly to compensate for your angled displays, but then it ends up looking like this

anglecorrection.jpg


You end up with blind spots where the cameras meet, and straight lines don't mate up properly across displays. It does make the perspective check out for angled side monitors, though. Since the cameras capturing for the peripheral screens are rotated, elongation of objects is not as pronounced as it has to be for a flat surface.

That picture is wrong; that's not how it should look. I don't know where you got that picture, but it's flat-out wrong that you'd lose some of the image. It should look like this...

fovexample.jpg


Now that's a stupidly high FOV, as it's the FOV I use with a single screen, set to just over 90 degrees horizontal. So for an Eyefinity setup it'd imply the screens are wrapping around behind you quite a lot. I simply took a screenshot, turned my character, and took another screenshot. You don't lose any image (well, I lost a bit at the transition between the middle and right screens because I didn't quite line up the images properly).

Of course that's "correct", but it'd also be hard to set up and wouldn't look 100% right, because when you turn your head, your eyes move. That is, your head doesn't rotate about your eyes; it rotates about your neck.

The compromise that I suggest would be best would be to have a cylindrical projection... I'd show you what that looks like, but I don't own any games (that I know of) which use a cylindrical projection. I believe rFactor can, because I've seen people with TripleHead setups that don't have stretched side screens. But I don't own rFactor.


1. It's not wrong. The image is optically correct; you're just sitting too far away.

It's only optically correct if you're viewing a flat panel. It's wrong if you want to set up what I've pictured previously.

Do a little exercise. Look at the centre of your monitor, then place something in your peripheral vision, say a bottle or a speaker, so that it's about 45 degrees away from the centre of your monitor. Focus on the centre of your monitor, but pay attention to the bottle in your peripheral vision (without focusing on it). Is it stretched? It shouldn't be, just like the side monitors in a wrap-around Eyefinity display shouldn't be either (unless you have your monitors set up as a flat panel array and are sitting close to them).

2. Many games do it currently because it's correct. They are, in fact, displaying wide FOVs properly.

They're displaying the FOV as if you're viewing a big flat panel, which is fine if you are viewing a big flat panel and sitting really close to it. To actually get a FOV of over 100 degrees from a flat panel array you have to sit pretty close to it.

3. Only possible with eye-tracking (a poor man's version is possible with head tracking) so that the in-game camera knows where you're looking. It can then adjust FOV based on how far your head is from the screens, and move the camera's single point of perspective to the exact point on your display(s) where your eyes are focusing.

No, a cylindrical display would allow you to do that fine. Head tracking = moving your head away from the screen but keeping your eyes on the screen to cause things to shift... eye tracking = I don't know? How does that even work? You move your eyes away from the screen and you can no longer see it? Or do you mean eye tracking as in a multimonitor setup which tracks your eyes and adjusts the focus accordingly? If that's what you mean, then I think that'd be extremely wrong and would probably give me a headache.

A cylindrical display (or a wrap-around arrangement of multiple displays, i.e., Eyefinity, the poor man's cylindrical display) with a proper image projection (not the flat-panel projection in most games) would allow you to turn your head, as in your real-life head, to look sideways whilst not moving the camera at all.

Here are three examples of head-tracking being used to adjust FOV and perspective on the fly based on where your whole head is located and pointing.
Playstation3 Head Tracking
Head Tracking using faceAPI & webcam only
Head Tracking for Desktop VR Displays using the WiiRemote

Ok, now we're talking about 2 different things.

What those videos are showing is head tracking for shifting PERSPECTIVE. What we're talking about here, the fisheye effect on wide screens, is a result of the type of PROJECTION being used.

Perspective head tracking (which is kinda cool IMO) can still exist regardless of the type of projection being used.

Perspective is about how you move your head left or right to see around objects, and forward and backward to change the relative size of objects. You can have a cylindrical projection like I'm talking about and still have perspective head tracking like shown in those videos... in fact, that would be fucking awesome.
 
Since the picture I posted above was for a really high FOV, I lowered it a bit, here...

fovex2.jpg


If you stretch that across 3 screens, arrange the screens like this...

3screenfov.png


And it should look pretty much correct (I might need to lower the FOV a bit more; I just estimated the FOV based on how many times I had to change perspective to get 360 degrees :p).
 
I don't see how games are rendering it properly when everything is stretched like hell. Putting the monitors flat won't fix that (if I had the desk space, though, I'd curve it; the transition between monitors isn't smooth when curved).
Objects are horizontally elongated on-screen so that when the image reaches the observer (looking straight ahead at the center display) it looks correct as it meets your peripheral vision. If it somehow wasn't elongated (but still displayed on a flat display group), objects in your peripheral vision would appear far too narrow.

Again, see the diagram. This is the optically correct method for an in-game camera to project onto a flat plane. If you're seeing fisheye or stretching, then you're viewing the image incorrectly (either sitting too far away, looking too far off-center, or observing with tilted monitors).

unled1rc.jpg
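To put a number on the elongation in the diagram: a flat screen at distance d maps a direction theta degrees off-axis to the screen position x = d*tan(theta), so an object of fixed angular size has to be drawn about 1/cos^2(theta) times wider as it moves off-center. A small sketch:

```python
import math

def elongation_factor(angle_deg):
    # How many times wider an object of fixed angular size must be drawn
    # on a flat screen when it sits angle_deg away from the view axis.
    # Follows from x = d * tan(theta): dx/dtheta = d / cos^2(theta).
    return 1.0 / math.cos(math.radians(angle_deg)) ** 2

for a in (0, 30, 45, 60):
    print(f"{a:>2} degrees off-center -> drawn {elongation_factor(a):.2f}x wider")
# 0 -> 1.00x, 30 -> 1.33x, 45 -> 2.00x, 60 -> 4.00x
```

This is also why high FOVs look so much worse: a 120-degree FOV puts the screen edges at 60 degrees off-axis, where everything is drawn four times wider than at the center.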


That picture is wrong; that's not how it should look. I don't know where you got that picture, but it's flat-out wrong that you'd lose some of the image. It should look like this...

fovexample.jpg


Now that's a stupidly high FOV, as it's the FOV I use with a single screen, set to just over 90 degrees horizontal. So for an Eyefinity setup it'd imply the screens are wrapping around behind you quite a lot. I simply took a screenshot, turned my character, and took another screenshot. You don't lose any image (well, I lost a bit at the transition between the middle and right screens because I didn't quite line up the images properly).
*sigh* I was afraid someone would think this. That is NOT how it should look. What you've set up there will not work correctly, and it becomes immediately obvious why as soon as you see it in motion in a game like rFactor.

What you have there are three cameras with three individual points of perspective locked to the center of each monitor. What you're forgetting is that peripheral elongation is NOT exclusive to your side monitors. The elongation of objects as they get closer to the edges of a display is visible even on a single 4:3 display.

When you put three cameras edge-to-edge like that, you end up with that effect three times over. Let's say we place a virtual object so that it appears in the center of your center display. Now we'll start moving it to the left. The farther left you move it, the wider the object will become until it crosses the boundary onto your left monitor. If we continue to move the object to the left, it will become narrower again as it gets closer to the center of the left monitor's camera.

Seeing this effect in motion is terrible. The second you take a step forward or turn your character, you can VERY obviously see the image warp thinner and thicker as the scene around you moves across displays. You DO NOT want your cameras set up this way.


Here, this is how you properly use three cameras while maintaining proper perspective. Note that this requires you look straight ahead at the center monitor, just as a flat setup would. The only thing that has changed is that the perspective on the side-screens now accounts for the displays being tilted.
unled1oo.jpg


Three cameras, all created with a 48:10 aspect ratio, are used. Only the appropriate segment (outlined in red) of each camera is rendered and presented on its corresponding display. With the camera(s) configured in this manner, you may tilt your outer monitors inwards slightly, sit at the appropriate distance from your displays, then look straight ahead at your center monitor; there will not be any observable fisheye in your peripheral vision as there usually would be with tilted monitors and a single camera.

Do a little exercise. Look at the centre of your monitor, then place something in your peripheral vision, say a bottle or a speaker, so that it's about 45 degrees away from the centre of your monitor. Focus on the centre of your monitor, but pay attention to the bottle in your peripheral vision (without focusing on it). Is it stretched? It shouldn't be, just like the side monitors in a wrap-around Eyefinity display shouldn't be either (unless you have your monitors set up as a flat panel array and are sitting close to them).
*bigger sigh* This is not a valid comparison, for the simple fact that the bottle is free-standing. Remember how I said free-standing holograms wouldn't have this issue? The same applies to free-standing physical objects, as they have not been projected onto a flat surface.

This elongation is REQUIRED for images projected onto a flat 2D plane to appear correctly proportioned (when observed from just the right position). You can see the very same effect in use in those cool 3D sidewalk chalk art pieces like the one below. The image had to be greatly elongated so that, when viewed from the correct angle, it appeared correctly proportioned. This is exactly how the peripheral vision of in-game cameras works. It requires you to be positioned very specifically, but when you are, the effect is outstanding.

unled1lm.jpg


The in-game camera expects a flat surface. The plane of a flat display becomes physically more distant from the observer towards the edges. Again, going back to the diagram: elongating objects more as they approach the edges of the projection allows them to appear correctly when the image actually reaches the observer. As long as the observer is looking towards the center display, the elongated image from the peripheral display will reach the observer at the expected off-angle and will be perceived as being correctly proportioned.

unled1rc.jpg


If your screens are on a flat plane, you will ONLY see stretching if the peripheral vision of the in-game camera enters the central vision of the observer. The peripheral vision of the in-game camera should always be in the peripheral vision of the observer, so that the observer sees a correct, undistorted image. If you're seeing stretching, then you've either angled your monitors (which you should know by now most games won't support), OR you're looking at the in-game camera's peripheral vision with your central vision (either by looking too far to the sides or by sitting too far away).

2. Many games do it currently because it's correct. They are, in fact, displaying wide FOVs properly.
They're displaying the FOV as if you're viewing a big flat panel, which is fine if you are viewing a big flat panel and sitting really close to it. To actually get a FOV of over 100 degrees from a flat panel array you have to sit pretty close to it.
They're displaying the FOV as if you're viewing the resultant image on a large flat surface made up of as many panels as you like.

Yes, if you want to use a high FOV you need to sit closer for the image to remain correct. This has always been true, even on a single display.

3. To me, multimonitor SHOULD be such that the centre monitor is where your body is pointing, but the side monitors allow you to move your head to view more, just like real life. I know that's not what it is; it's just what I wish it was.
3. Only possible with eye-tracking (a poor man's version is possible with head tracking) so that the in-game camera knows where you're looking. It can then adjust FOV based on how far your head is from the screens, and move the camera's single point of perspective to the exact point on your display(s) where your eyes are focusing.
No, a cylindrical display would allow you to do that fine.
Correct. You would need a cylindrical (curved) display and the game would need to have its camera set to cylindrical projection mode.

There are very few monitors available which are capable of this, and very few games that support a cylindrical projection mode. Games that don't use cylindrical projection would also look VERY wrong on such a display, so you'd have a large library of titles that use a normal camera that you'd need a flat display setup for anyway...

I just don't see it as necessary, especially since the normal cameras that games are all using today can be fixed with the addition of head or eye tracking.

Head tracking = moving your head away from the screen but keeping your eyes on the screen to cause things to shift... eye tracking = I don't know? How does that even work? You move your eyes away from the screen and you can no longer see it? Or do you mean eye tracking as in a multimonitor setup which tracks your eyes and adjusts the focus accordingly? If that's what you mean, then I think that'd be extremely wrong and would probably give me a headache.
I'm getting the sense that you're not very familiar with the laws of perspective here... nor with how head- or eye-tracking technologies work.

Head tracking can detect when your head moves closer or farther, higher or lower, and/or pans to the left or the right. Head tracking can also pick up rotation, detecting when you look up, down, left, or right.

If the computer knows what direction your head is pointing (and assumes your eyes are pointed forward from your head) then it can use head tracking to adjust the in-game camera's central perspective point so that it always stays directly in front of your face as you turn your head left and right or look up and down. Since the camera's central perspective point follows your face wherever you look, you will always be seeing an optically correct image, with peripheral elongation kept in your peripheral vision at all times so that objects in your peripheral vision appear normal as they reach your eyes.
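For the curious, this is the standard "off-axis projection" trick used in head-tracked demos like the ones linked above. A minimal sketch (the function and its parameters are my own illustration, assuming the screen is centered at the origin in the z = 0 plane and the tracked eye position is known in the same physical units):

```python
import numpy as np

def off_axis_projection(eye, screen_w, screen_h, near=0.1, far=1000.0):
    """Asymmetric-frustum projection matrix for an eye at (ex, ey, ez)
    relative to the center of a flat screen lying in the z = 0 plane
    (ez > 0 is the eye-to-screen distance). As the tracked head moves,
    the frustum skews so the perspective point follows the viewer."""
    ex, ey, ez = eye
    # Frustum bounds on the near plane, scaled from the physical screen edges.
    l = (-screen_w / 2 - ex) * near / ez
    r = ( screen_w / 2 - ex) * near / ez
    b = (-screen_h / 2 - ey) * near / ez
    t = ( screen_h / 2 - ey) * near / ez
    return np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

# Head centered vs. shifted 20 cm right of a 150 x 30 cm display group:
print(off_axis_projection((0.0, 0.0, 60.0), 150.0, 30.0))
print(off_axis_projection((20.0, 0.0, 60.0), 150.0, 30.0))
```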

Eye tracking is a bit tougher to do, but then the computer would not need to assume that you're always looking straight ahead; it could adjust the in-game camera's single perspective point to stay in front of your gaze even when your entire head isn't facing that direction. This would make it physically impossible for the individual being tracked by the system to view the image incorrectly; only an outside observer would be able to see the fisheye effect.

Here are three examples of head-tracking being used to adjust FOV and perspective on the fly based on where your whole head is located and pointing.
Playstation3 Head Tracking
Head Tracking using faceAPI & webcam only
Head Tracking for Desktop VR Displays using the WiiRemote
Ok, now we're talking about 2 different things.

What those videos are showing is head tracking for shifting PERSPECTIVE. What we're talking about here, the fisheye effect on wide screens, is a result of the type of PROJECTION being used.

Perspective head tracking (which is kinda cool IMO) can still exist regardless of the type of projection being used.

Perspective is about how you move your head left or right to see around objects, and forward and backward to change the relative size of objects. You can have a cylindrical projection like I'm talking about and still have perspective head tracking like shown in those videos... in fact, that would be fucking awesome.
No, we're talking about exactly the same thing. As I've explained many times now, the method of projection IS NOT the problem as long as the camera knows where you're viewing the image from, so it can angle itself towards you to keep everything looking correct. The reason you're seeing "fish-eye" is that the camera isn't adjusting to your sitting position or head rotation to keep peripheral elongation in your own eyes' peripheral vision. You're breaking the illusion by turning or moving away from where the camera has to assume you will be observing from.

That's where head or eye tracking comes into play; it can feed the in-game camera the information it needs to adjust to the position of the observer. With that information, the in-game camera won't have to keep its single point of perspective locked to the center of the display; it can be moved freely to match your gaze and keep the image looking correct no matter where you turn your head or eyes.
 
Let's just agree to disagree, because I can't be fucked making a bunch of diagrams to prove my point. You don't need to keep posting the same picture as if I don't understand it... I posted an almost identical picture at the end of the last page.

The current cameras can't be fixed by head tracking. What they display is perfectly correct for a large flat panel viewed from a short distance... head tracking isn't needed; it's correct. However, it's incorrect for a circular array of screens, and head tracking won't make it correct; it'll still be incorrect.

I guess that's the difference... I want Eyefinity to be a circular array of monitors, but games aren't set up for it. They could be, but it'd require a bit of intelligence in how cameras are set up, and currently that doesn't exist.

Ok maybe I can't stop posting, but just to address this point...

When you put three cameras edge-to-edge like that, you end up with that effect three times over. Let's say we place a virtual object so that it appears in the center of your center display. Now we'll start moving it to the left. The farther left you move it, the wider the object will become until it crosses the boundary onto your left monitor. If we continue to move the object to the left, it will become narrower again as it gets closer to the center of the left monitor's camera.
That's what it's supposed to do... if you actually calculate the FOV properly, there's no issue with this; it's how it should happen. To you, the gamer/observer, the corners of the screens are farther away than the centres of each screen, thus the objects are larger and slightly skewed, as that section of the screen isn't being viewed perpendicularly like the centre of the screen is. If you set the FOV correctly, then it should look correct, and if you had head tracking to keep track of exactly how close your eyes are to each point on each screen, the skewedness would be able to constantly adjust to your moving head.
 
*sigh* I was afraid someone would think this. That is NOT how it should look. What you've set up there will not work correctly, and it becomes immediately obvious why as soon as you see it in motion in a game like rFactor.

What you have there are three cameras with three individual points of perspective locked to the center of each monitor. What you're forgetting is that peripheral elongation is NOT exclusive to your side monitors. The elongation of objects as they get closer to the edges of a display is visible even on a single 4:3 display.
Surely it depends on how much the monitors are angled. If his two side monitors are angled to be completely perpendicular to his view, then the images he attached are pretty much correct.

edit: Your views look correct as well... except your views are assuming a slight angle on the surround screens (the surround screens are not perpendicular to your view; they are still close to being flat, unlike tudz's example).

edit edit: Double edit. There is one fault in your views: you are for some reason cutting out material that shouldn't be cut out. Your two side screens are too far left and too far right. Although then, for some reason, the verticals don't line up in your screens. Hmm...
 
The current cameras can't be fixed by head tracking
I just got through explaining exactly how they could be used to keep current cameras looking perfect (for a single observer) at all times. Yes, head tracking can fix current cameras.

Now, curved displays or angled monitors... that's another matter entirely. For those you'll need additional cameras, in the configuration I described.

Edit: It might be possible to get away with cleverly distorting the rendered image on the side displays to account for angled screens.

When you put three cameras edge-to-edge like that, you end up with that effect three times over. Let's say we place a virtual object so that it appears in the center of your center display. Now we'll start moving it to the left. The farther left you move it, the wider the object will become until it crosses the boundary onto your left monitor. If we continue to move the object to the left, it will become narrower again as it gets closer to the center of the left monitor's camera.
That's what it's supposed to do... if you actually calculate the FOV properly, there's no issue with this; it's how it should happen. To you, the gamer/observer, the corners of the screens are farther away than the centres of each screen, thus the objects are larger and slightly skewed, as that section of the screen isn't being viewed perpendicularly like the centre of the screen is. If you set the FOV correctly, then it should look correct, and if you had head tracking to keep track of exactly how close your eyes are to each point on each screen, the skewedness would be able to constantly adjust to your moving head.
I'm sorry, but no. The configuration you posted with three cameras aligned edge-to-edge will never be correct. No FOV value will make it correct. If you saw it in person, in motion, you'd immediately understand why it's terribly wrong.

No matter what you do, you still have three separate cameras with three separate fields of vision. Each of these cameras has its own single point of perspective, and the farther objects get from that point, the more elongated they become. You will see objects go from normal, to elongated, and back to normal as they pass across displays. That is not correct behavior.

A single, normal camera that knows the direction your eyes are pointing will be 100% correct no matter where you look (at least on a flat display group; I'm working out how to make that work correctly with angled monitors).

Surely it depends on how much the monitors are angled. If his two side monitors are angled to be completely perpendicular to his view, then the images he attached are pretty much correct.

You're going to put the monitors at 90 degrees to one another? You'd have to have some pretty large screens (possibly projectors) to sit in the middle of a setup like that comfortably. You'd also still have issues with objects warping from narrow to wide as they pass from display to display.

edit edit: Double edit. There is one fault in your views: you are for some reason cutting out material that shouldn't be cut out. Your two side screens are too far left and too far right. Although then, for some reason, the verticals don't line up in your screens. Hmm...
That's simply the view you get when you add two rotated cameras. The perspective lines will not match up between monitors due to the rotation no matter what you do.

Edit: Imagine drawing lines radially from the center of the image out towards its edges. Now duplicate that radial pattern and shift it to the side slightly. Only one line (a perfectly horizontal one through the middle of the image) will align between displays, and all other lines of perspective will be broken.
 

*sigh* Just because you say it doesn't make it true. And here you are accusing me of not understanding optics. I think you're just too invested in all your diagrams and explanations to admit you either don't get it or are wrong.

A single, normal camera that knows the direction your eyes are pointing will be 100% correct no matter where you look (at least on a flat display group; I'm working out how to make that work correctly with angled monitors).

A little tip while you're working it out... it works exactly how I said it will work and exactly how my Crysis pictures look :p

I'm not arguing flat panels here. I know what Eyefinity does is correct for flat panels. My contention is that flat panels are limited and I want a circular array of monitors. When I say "flat panels" I mean several displays arranged in a single plane.
 
You're going to put the monitors at 90 degrees to one another? You'd have to have some pretty large screens (possibly projectors) to sit in the middle of a setup like that comfortably. You'd also still have issues with objects warping from narrow to wide as they pass from display to display.
His example doesn't need the monitors to be perpendicular to one another, just perpendicular to your view. If you sit close enough, then yes, the monitors will be at 90 degrees to each other. Heck, if you sit even closer, you can have three monitors completely surround you, a full 360.


That's simply the view you get when you add two rotated cameras. The perspective lines will not match up between monitors due to the rotation no matter what you do.

Edit: Imagine drawing lines radially from the center of the image out towards its edges. Now duplicate that radial pattern and shift it to the side slightly. Only one line (a perfectly horizontal one through the middle of the image) will align between displays, and all other lines of perspective will be broken.
I must say you are wrong. I can't currently think of the mechanics of why you are wrong, but I must say you are wrong if you simply go back to the window analogy you used earlier.

Simply imagine three borderless windows looking out into the world. Our Eyefinity setup had those three windows parallel to each other on the same plane. But imagine instead that we take the two side windows and angle them. The image will definitely line up looking through the windows, and there won't be any missing image either.

What is so different about our three-screen setup that we can't imitate those three window panes perfectly?
 
Sorry for the double post, but how did you create your Portal images? Did you simply "turn" the view horizontally for each view, or did you actually move the character?
 
*sigh* Just because you say it doesn't make it true.
Hence the detailed explanations and diagrams...

A little tip while you're working it out... it works exactly how I said it will work and exactly how my Crysis pictures look :p.
I've already explained why your mockup is incorrect...

Here, it's not as pretty as multiple cameras (you lose some screen real-estate) but this corrects for angled displays and gives you an "apparent" flat display:

unled1h.jpg


The image could be blown up slightly (and the FOV increased accordingly) to cover the gaps on the peripheral displays. That would allow a single normal camera to work correctly across angled displays, with proper perspective preserved, AND with this method (since we aren't using multiple cameras), head tracking can still be used for perspective correction just as it can on three screens configured in a flat plane. That'll keep the image aligned within your field of vision at all times (you won't see fisheye).

Best part? This type of effect can be accomplished (aside from the head tracking) by a post-processing shader and an FOV tweak (SoftTH has toyed with features like this). The game doesn't even need to support it itself. If you're good with just looking straight ahead, then that's all you'd need; the image will look as correct as a flat configuration.
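Here's a rough sketch of the remapping such a shader would be doing, assuming the eye sits centered at distance d in front of the center monitor and the right-hand monitor is hinged at the center monitor's edge and tilted inward. The geometry here is my own illustration, not anything SoftTH actually ships:

```python
import math

def flat_plane_sample_x(s, center_half_w, d, tilt_deg):
    # For a pixel a distance s along a tilted right-hand monitor (measured
    # from the hinge at the center monitor's edge), find where on the
    # VIRTUAL flat plane (the surface the game camera projects to) it
    # should sample. All arguments are physical units; tilt_deg = 0 means
    # the side monitor is coplanar with the center one.
    phi = math.radians(tilt_deg)
    # Physical position of the pixel on the tilted monitor (eye at origin).
    px = center_half_w + s * math.cos(phi)
    pz = d - s * math.sin(phi)
    # Cast a ray from the eye through the pixel; intersect it with the
    # virtual flat plane z = d that the normal camera rendered to.
    return px * d / pz

# With 30 degrees of inward tilt, pixels near the far edge of a 52 cm side
# monitor have to sample much farther out on the flat render, which is why
# some of the rendered image goes unused unless you blow it up:
for s in (0.0, 26.0, 52.0):
    x = flat_plane_sample_x(s, 26.0, 60.0, 30.0)
    print(f"s={s:>4.0f} cm -> sample flat plane at x={x:.1f} cm")
```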

Sorry for the double post, but how did you create your Portal images? Did you simply "turn" the view horizontally for each view, or did you actually move the character?
Rotated the character in place using the arrow keys to make sure I didn't accidentally look up or down. The camera was not strafed off its starting position.
 