Field of view / number of displays Eyefinity question

iansilv

Limp Gawd
Joined
Jun 10, 2004
Messages
335
For surround gaming - as in, screens literally surrounding you - how many displays would you need to properly simulate the field of view without stretching and visual artifacts? And how would you configure each display's field-of-view angle to properly display things in a 360-degree setup? I'm thinking you would really need eight displays: front, front left, front right, left, right, back left, back right, and back.

Anyone have any thoughts on this?
 
I don't think Eyefinity can do a full 360 view, nor can any game. Also, there's no way to change a specific monitor's field of view.
 
The game would need to be specifically coded to be able to have a side/rear view.
 
So eight projection screens, each rear-projected and surrounding the player, would not work? What about Source games and Quake-engine games?
 
No, because the game is not set up to show a "behind" view, only a forward-facing one. Even if you could get it to display on all eight screens, the content would be severely stretched to the point of being unrecognizable.
 
Eyefinity is limited to six displays, isn't it? (But what about that 5970 with 12 mini-DP ports?) Also, 1680x1050 times 8 would be 13440x1050; isn't there some problem with horizontal resolutions greater than 8192?

This is a good question, though. What limits are there on FOV? Do we know that the rear view couldn't be made to look right?

I experimented with the new FOV Calc on WSGF by plugging in 360 for hFOV, 13440x1050, and 8 across 1 tall, and the calculator breaks a little but shows interesting screenshots of what it would look like.

One problem that my experiment shows is that 8 screens isn't right. Eyefinity would split the crosshair between two monitors, just like it would with two or four monitors. How do you solve this besides going to an odd number of monitors?

Further thought on this: 75 degrees of FOV per monitor times 8 = 600 degrees of FOV; 90 degrees of FOV per monitor times 8 = 720 degrees of FOV. 360 degrees of FOV on 8 monitors would be 45 degrees of FOV per monitor. This doesn't seem right.
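Quick sanity check on that math (my own back-of-the-envelope numbers, assuming the screens sit edge-to-edge in a closed ring around you, which may not be how anyone would actually build it):

// per_monitor_fov.cpp -- back-of-the-envelope check for an N-monitor 360-degree ring.
// Assumes the screens are arranged edge-to-edge in a regular polygon around the player.
#include <cstdio>

int main() {
    for (int n : {4, 6, 8}) {
        double perMonitor = 360.0 / n;   // hFOV each screen must cover in a closed ring
        std::printf("%d monitors -> %.1f degrees of horizontal FOV per monitor\n",
                    n, perMonitor);
    }
    // A typical FPS renders roughly 90 degrees across the whole screen, so stretching
    // one forward view over 8 screens covers far too much (8 * 90 = 720), while
    // 8 * 45 = 360 is the geometric requirement for a closed ring.
    return 0;
}

If I've got the geometry right, 45 degrees per screen is actually what a closed ring of eight screens requires; the usual 75-90 degrees per screen only makes sense when the screens form a flat or gently curved wall in front of you.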
 
In theory, it's perfectly doable, with some major hacks. The main problem is that cameras (both in GL/DirectX and in real life) are single-point cameras facing a single direction. To simulate the 360-degree view, you would need two or more viewpoints and then stitch the pictures into a final full view. This could be done like a 360-degree camera, which is usually two 180-degree fisheye-lens cameras, but the end result wouldn't look particularly good. To get a better result, it would likely take a render per screen, which reduces the inherent skewing of objects farther from the center of each render. That greatly increases the rendering load, though, and would be rather tough for this hardware generation to handle at reasonable framerates.
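A rough sketch of what that render-per-screen idea could look like (toy C++, and the eight-screen count and names are just my assumptions, not anything an engine actually exposes):

// ring_cameras.cpp -- sketch of the "one render per screen" idea, not real engine code.
// Each of N screens gets its own camera, rotated by 360/N degrees from its neighbour,
// and each camera uses a horizontal FOV of exactly 360/N so the views tile seamlessly.
#include <cstdio>

struct Camera {
    double yawDeg;   // direction this screen's camera faces, relative to "forward"
    double hFovDeg;  // horizontal field of view rendered for this screen
};

int main() {
    const int screens = 8;
    const double slice = 360.0 / screens;   // 45 degrees per screen

    Camera ring[screens];
    for (int i = 0; i < screens; ++i) {
        ring[i].yawDeg  = i * slice;         // 0 = front, 180 = directly behind
        ring[i].hFovDeg = slice;
        std::printf("screen %d: yaw %6.1f deg, hFOV %.1f deg\n",
                    i, ring[i].yawDeg, ring[i].hFovDeg);
    }
    // In a real engine each entry would become its own view matrix and render pass,
    // which is why this multiplies the rendering cost roughly by the screen count.
    return 0;
}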
 
I believe there are a couple of simulator-type games that will actually render each monitor as its own camera, which is the proper way to do it. If you just increase the FOV to 360, you are going to get a crazy panoramic fisheye effect, since games aren't designed for this type of rendering.

Anyway, I saw a post over at WSGF from a guy laying out plans to do this with projectors; I don't know if he ever bought the hardware.
 
Well, traditional FPS games rendered about 90 degrees of hFOV, so mathematically you'd need 360 / 90 = 4 monitors for a full 360-degree view.

However, as already mentioned, a single viewpoint in a first-person shooter isn't going to achieve this. Essentially what you're talking about is a point in space with four lines drawn to the corners of a rectangle (the projection plane); whatever is visible inside that rectangle from the point is what gets rendered. This only works up to 180 degrees, at which point the viewpoint would be inside the rectangle itself (the point and the rectangle move closer together as the FOV increases, which is what gives you the wider angle).

As you approach 180 degrees the image becomes fisheyed, with the more extreme angles getting stretched at the periphery. The minimum number of in-game viewports you'd need is two, but you'd have some dodgy stretching going on whenever you weren't facing directly forwards or backwards.
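To put some numbers on that stretching (my own toy calculation, assuming a flat projection plane): a ray at angle theta off-centre lands at x = tan(theta), so the stretch at the screen edge grows like 1/cos^2(theta) and blows up as the hFOV heads toward 180.

// fisheye_stretch.cpp -- why a single planar projection falls apart near 180 degrees.
// For a flat projection plane, a ray at angle theta from centre lands at x = tan(theta),
// so the local stretching is d(tan)/d(theta) = 1/cos^2(theta), which diverges at 90 deg.
#include <cmath>
#include <cstdio>

int main() {
    const double pi = 3.14159265358979323846;
    for (double hfov : {90.0, 120.0, 150.0, 170.0, 179.0}) {
        double theta = hfov / 2.0 * pi / 180.0;                       // angle of the screen edge
        double edgePos = std::tan(theta);                             // where the edge lands on the plane
        double stretch = 1.0 / (std::cos(theta) * std::cos(theta));   // stretching relative to centre
        std::printf("hFOV %5.1f: edge lands at %8.2f, edge stretch %9.2fx\n",
                    hfov, edgePos, stretch);
    }
    return 0;
}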

To get a really good surround you'd want to render one point of view for each monitor, at the correct angular spacing in the engine, which means some highly customized code; I know it has been done in Quake 3 (see the post above). It would also severely degrade performance.

It's also questionable whether this would even be useful. With a 360-degree surround view you're encouraged to rotate your body, when it's actually easier to rotate your viewpoint in the game itself. It would be a novelty at best, especially since you can only aim in one direction and would need to constantly re-orient your gun with your actual viewing direction.

Something like an Eyefinity setup covering less than 180 degrees, combined with eye-tracking or head-tracking software, would definitely work better. I think eye tracking would be cool, but that's not commercially available yet as far as I'm aware; head tracking is, but it's supported by only a couple of games.
 