Talk:Cg Programming/Unity/Projection for Virtual Reality

Change of Eye Point
Hi, so I am trying to use this script. Am I correct in assuming that if you want to change the perspective according to the eye point, you simply change the Vector3 coordinates of pe? —Preceding unsigned comment added by 2620:103:a000:401:b081:9a9e:fd80:a116 (discuss • contribs) 18:11, 3 April 2013


 * If you use the script in Unity, you should attach it to your Camera object. Then "pe" will automatically be set to the eye point. Otherwise, yes: pe should be set to the eye point in world coordinates. (Are you using Unity?) --Martin Kraus (discuss • contribs) 12:12, 9 April 2013 (UTC)

Yes, I am using Unity, and I am a little confused. I have a projection screen set up as a plane with its center at (0,0,0); the corners should be at (-135,-54,0) (lower left), (135,-54,0) (lower right) and (-135,54,0) (upper left) in world coordinates. But when I change the default values of (-5,0,5), etc., in the script, the perspective is incorrect, whereas with the script's default values it appears correct, and I am confused as to why. Also, I am simply trying to test the perspective accuracy of the script, so I figured that if I change pe to some new position, the perspective should shift accordingly, but it appears to remain stationary. What exactly is the difference between the eye point and the camera? Isn't the camera representative of your eye point? (I am new to Unity.) Thanks for any advice you can give. —Preceding unsigned comment added by 2620:103:a000:401:b081:9a9e:fd80:a116 (discuss • contribs) 20:34, 9 April 2013


 * The default values (-5,0,5) etc. are Unity-specific local coordinates of the corners of the built-in Plane object. You would only change these values if you use your own mesh for the projection plane instead of the built-in Plane object. I don't understand why you want to change "pe": if you want to move the camera position (i.e. eye point) then you should move the Camera object. But I would assume that changing pe (for whatever reason) should set a new camera position. However, the new perspective will only show up in game mode (when you run the game). What exactly are you changing? (Please show the code.) --Martin Kraus (discuss • contribs) 20:43, 9 April 2013 (UTC)
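For readers following along: how pe and the corner points pa, pb, pc interact can be sketched outside Unity. The following is a minimal Python illustration of the frustum computation behind the generalized perspective projection discussed in the article; it is not the wikibook's Unity script, and only the names pa, pb, pc, pe are taken from it — everything else is made up for the example.

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])
def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def frustum_extents(pa, pb, pc, pe, near):
    """Off-axis frustum (left, right, bottom, top) at the near plane, for a
    screen with lower-left corner pa, lower-right pb, upper-left pc, and the
    eye at pe (all in world coordinates)."""
    vr = normalize(sub(pb, pa))    # screen's right axis
    vu = normalize(sub(pc, pa))    # screen's up axis
    vn = normalize(cross(vr, vu))  # screen normal, pointing towards the eye
    va, vb, vc = sub(pa, pe), sub(pb, pe), sub(pc, pe)
    d = -dot(vn, va)               # distance from eye to the screen plane
    left   = dot(vr, va) * near / d
    right  = dot(vr, vb) * near / d
    bottom = dot(vu, va) * near / d
    top    = dot(vu, vc) * near / d
    return left, right, bottom, top

# An eye centred in front of a 10x10 screen gives a symmetric frustum:
print(frustum_extents((-5.0, -5.0, 0.0), (5.0, -5.0, 0.0),
                      (-5.0, 5.0, 0.0), (0.0, 0.0, 5.0), 0.1))
# -> (-0.1, 0.1, -0.1, 0.1)
```

Moving pe (e.g. 2 world units to the left, as in the head-tracking question below) makes the frustum asymmetric while the screen corners stay fixed — that is the entire effect of changing the eye point.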

Okay, well then I guess I don't truly understand the whole concept. I eventually want to use head tracking, but I need to verify that the method works first. When I first start up the script, the eye point is at (0,0,0), correct? Then let's say I move my head 2 world units to the left: how would I adjust the script to show me the new, correct perspective according to my new head position? Do you have any interest in joining the #unity3d IRC at http://webchat.freenode.net/ to discuss this at a faster pace? And since I am new to Unity, I do not fully understand the built-in Plane object; I just created a plane object and scaled it so that it fit my fov exactly. Again, thanks for discussing this with me. —Preceding unsigned comment added by 2620:103:a000:401:b081:9a9e:fd80:a116 (discuss • contribs) 20:56, 9 April 2013‎


 * OK, I'm in the IRC. --Martin Kraus (discuss • contribs) 21:05, 9 April 2013 (UTC)

Changing projection matrix in Unity - Problem light and shadows
Hey, first of all thank you for your publication on generalized perspective projection. It helped me a lot for my project.

I have a question on the "limitations of this implementation in Unity". Do you also experience trouble with lights and shadows? In the scene view my shadows/lights look perfectly normal, but in the game view they are missing. I am positive that this is due to the changes I made to the projection matrix, since with a regular projection matrix the lights/shadows look normal. (So I rule out a Unity configuration mistake, e.g. forward vs. deferred lighting rendering path.)

Here is a screenshot of my running project to clarify what I mean:

Scene- vs. Game-View

My guess is that the transformations applied to the geometry do not apply to the lights/shadows, although they are supposed to. Do you know how to solve this problem? Or can you point me in the direction of a possible solution?

cheers, Markus L. —Preceding unsigned comment added by 139.13.81.160 (discuss • contribs) 15:43, 8 May 2013‎


 * I haven't tried lights and shadows with this and I don't know how to fix it. My point of view is that almost everything should work. If it doesn't, it's a bug in Unity. ;) --Martin Kraus (discuss • contribs) 19:10, 22 May 2013 (UTC)


 * Apparently, Unity's shadow maps don't work if you put all matrix transformations into the projection matrix and use an identity matrix for the modelview matrix. I've split the matrices in the traditional way, and this appears to work with shadow maps in Unity Pro. --Martin Kraus (discuss • contribs) 14:15, 28 September 2013 (UTC)
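To make the fix above concrete: because matrix multiplication is associative, the final clip-space position of a vertex is the same whether everything is packed into the projection matrix (with an identity modelview) or split into a projection matrix P and a modelview matrix M. Unity's shadow-map passes, however, apparently need a meaningful modelview matrix, which is why only the split version works. A small Python check of the numerical equivalence; the matrices here are arbitrary stand-ins, not the script's actual values:

```python
def matmul(A, B):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform(M, v):
    """Apply a 4x4 matrix to a homogeneous column vector."""
    return [sum(M[i][k] * v[k] for k in range(4)) for i in range(4)]

# A toy projection-like matrix P and a rigid modelview M (rotation + translation).
P = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, -1.2, -0.2], [0, 0, -1, 0]]
M = [[0, -1, 0, 2], [1, 0, 0, 0], [0, 0, 1, -5], [0, 0, 0, 1]]
v = [1.0, 2.0, 3.0, 1.0]

combined = transform(matmul(P, M), v)      # everything in the projection matrix
split = transform(P, transform(M, v))      # projection and modelview kept separate
assert all(abs(a - b) < 1e-12 for a, b in zip(combined, split))
```

So the split changes nothing about the rendered geometry; it only restores the modelview information that Unity's shadow pipeline relies on.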

Using multiple projection planes
Hi, I am using your script for a CAVE-like application running Unity with four projection planes: a front plane of 4.7 meters x 2.64 meters, a left plane placed at a 90° angle to the front plane (2.35 meters x 2.64 meters), a right plane placed at a 45° angle (2.35 meters x 2.64 meters), and a bottom plane placed at the bottom between the front and left planes (2.35 meters x 2.64 meters). This installation is fixed and cannot be changed. As long as I use only the front plane, everything appears to be fine; even a head-tracking system works as intended. But when I try to use more than one plane, I am not able to get a consistent image across multiple planes, i.e. the rendered images of the individual planes don't fit together.

I have imitated this construction in Unity and placed the Unity planes according to the real ones. Do I have to scale the Unity planes in order to fit the real dimensions? Do I have to position the Unity planes exactly the way the real ones are positioned, or must there be a difference? I have also changed the values pa, pb and pc to fit my application, but I am not sure whether they should be kept at the default values of ±5.

Any help would be greatly appreciated! If you need any further information, please let me know. —Preceding unsigned comment added by Andre Koza (discuss • contribs) 07:48, 11 July 2013


 * Hi Andre, sorry for the late reply. Yes, you have to scale the Unity planes in order to fit the real dimensions (note that the standard Unity plane is of size 10 meters x 10 meters for scale factor 1). Yes, you have to position the Unity planes exactly in the way the real ones are positioned. No, you must not change pa, pb and pc. (Vector3(-5.0, 0.0, -5.0) is the lower left corner in object coordinates for ALL standard Unity planes, even after scaling and positioning.) Hope this helps. --Martin Kraus (discuss • contribs) 15:02, 22 July 2013 (UTC)
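The scaling described in the answer above can be written out: since the built-in Plane is 10 m x 10 m at scale factor 1, the Inspector scale factors are simply the real dimensions divided by 10. A tiny Python sketch (the helper name is made up for illustration):

```python
# The default Unity Plane is 10 m x 10 m at scale factor 1,
# with object-space corners at (+-5, 0, +-5).
DEFAULT_PLANE_SIZE = 10.0

def plane_scale(width_m, height_m):
    """Inspector scale factors so a default Unity Plane matches a real wall.
    The plane lies in its local XZ plane, so width maps to X and height to Z."""
    return width_m / DEFAULT_PLANE_SIZE, height_m / DEFAULT_PLANE_SIZE

# Andre's front wall, 4.7 m x 2.64 m:
print(plane_scale(4.7, 2.64))  # -> approximately (0.47, 0.264)
```

The object-space corners stay at (±5, 0, ±5) no matter how the plane is scaled or positioned, which is why pa, pb and pc must not be changed.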

Hi Martin, I'm also sorry for the late reply but thanks for your help. It works now for the front, left and right plane. Andre --131.234.86.137 (discuss) 15:05, 14 August 2013 (UTC)

Camera positioning and Projection on tilted planes
Hi, I somehow reached this page looking for an existing solution to off-axis projection, and it looks like this suits the purpose quite well. I am trying to build a CAVE-like virtual environment as well, with multiple projection planes (some of the planes may be straight and others at an angle, e.g. one plane placed at a 90 degree angle to another).

I wasn't able to get the script working out of the box, so I made some modifications. First, since the article mentions that Unity's built-in Plane game object has local coordinates of (±5, 0, ±5), I had to scale the X and Z by a factor of 10 to get the 5 on each side. I also had to rotate the plane by -90 degrees around X to make it face the camera straight on. If I didn't do the above, I wasn't able to get any kind of output in the game view, even when I had placed some cubes in front of the camera to check the output given from a different perspective by the off-axis projection.

I used the script's method of getting the left, right, bottom and top coordinates and used the PerspectiveOffCenter method to get the off-axis projection matrix. This output does look fine to me in the Game preview. Here's a pastebin of my code with the few modifications that I made: http://pastebin.com/E23Nh8Xv

However, I have to place the camera at the user's eye/head position as well, and if I do that using the translation matrix that is in this script, the off-axis projection doesn't seem to be correct. If I rotate the plane that I have here and then try multiplying with the rotation matrix that's in this script, I get no output at all. It would be really great to know what's going wrong here and how to get it working correctly for my scenario (multiple projection planes, placed at an angle to each other).


 * Hi, please sign your comments with --~~~~. What do you mean by "I had to scale the X and Z by a factor of 10 to get the 5 on each side"? Do you scale in the Inspector or somewhere else? What is the size of your physical projection wall in meters?
 * The script cannot work if you don't set the translation and rotation matrix. Are you sure that the front face (i.e. the visible face) of the projection plane is facing the camera, that you have some light sources in the scene, and that your cubes are not occluded by the projection plane? --Martin Kraus (discuss • contribs) 08:56, 26 July 2013 (UTC)

Hey Martin. The real-world setup that I am trying to create has three projection screens of 1.55 m (width) x 2.43 m (height). The left screen is at an angle of 45 degrees to the middle screen, and the right screen is at an angle of 45 degrees to the middle as well. To begin with, I was trying to get the correct projection on a single projection screen which is rotated in the world and is not perpendicular to the user (i.e. the viewing plane is not perpendicular to the user in the real-world setup). Yes, I scaled the Unity Plane game object in the Inspector. With the local coordinate system of the plane starting at its center, I tried placing spheres at the corners (e.g. the bottom-left sphere at projectionScreen.transform.TransformPoint(Vector3(-5, 0, -5))), and for the default Plane object's width and height, these spheres were placed outside the plane. The default corner vertices of the plane, with the local coordinate system at its center, seemed to be in the range of (±1, 0, ±1), so to match the vertices specified in the script, I increased the dimensions by a scale of 10 so that the corner vertices become (±5, 0, ±5).

When I just added the Plane to the scene, with all rotation transformations set to zero, the plane was sitting in the XZ plane and the front side of the plane wasn't facing the camera. So I gave the plane an X rotation of -90 to bring it into the world XY plane and make it face the camera. I do have light sources, and the cubes appear just fine in the game output when I don't use the rotation matrices in the script. The Plane game object's Mesh Renderer is also set to false, so it is not occluding the scene either. I have attached the small test project itself, to cross-check whether there's something I am doing wrong here and why the rotation isn't working properly. Here's the link: https://www.dropbox.com/s/e603eq96pb8rlvu/CamTest4.zip Would be great to know of the problems here. --Harpreetsareen (discuss • contribs) 06:25, 27 July 2013 (UTC)


 * Change line 100 of UserPerspectiveCam.js to:  (the rotation matrix is important). And add a line after line 100:   (the standard modelview matrix should not be applied and is different from rm*tm in any case). Then it works for me. Also, I found a bug in the original code in the wikibook for adjusting the view frustum (which you are not using in UserPerspectiveCam.js). I've fixed that in the wikibook. Let me know whether these changes work for you. --Martin Kraus (discuss • contribs) 14:28, 28 July 2013 (UTC)
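For reference, the asymmetric view frustum that PerspectiveOffCenter builds from the left/right/bottom/top values can be sketched as follows. This is a plain Python transcription of the standard off-center (OpenGL-style) projection matrix, not code from UserPerspectiveCam.js:

```python
def perspective_off_center(left, right, bottom, top, near, far):
    """Asymmetric (off-axis) perspective projection matrix as a nested list,
    in the same form as the PerspectiveOffCenter example from the Unity
    documentation (column-major math written row by row here)."""
    x = 2.0 * near / (right - left)
    y = 2.0 * near / (top - bottom)
    a = (right + left) / (right - left)   # horizontal off-center shift
    b = (top + bottom) / (top - bottom)   # vertical off-center shift
    c = -(far + near) / (far - near)
    d = -(2.0 * far * near) / (far - near)
    return [[x, 0.0,   a, 0.0],
            [0.0, y,   b, 0.0],
            [0.0, 0.0, c,   d],
            [0.0, 0.0, -1.0, 0.0]]
```

For a symmetric frustum (left = -right, bottom = -top) the shift terms a and b vanish and this reduces to the usual centered perspective matrix; head tracking makes the frustum asymmetric, so a and b become non-zero.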

Hey Martin. Many apologies for not being able to come back on this sooner. I had almost given up on this but gave it another shot after your response. The objects behind the projection screens appear correctly now (even in the case of a rotated projection plane).

However, there's a small problem I am facing after a small modification. As stated earlier, I have three projection screens and hence three cameras, one for each screen. For each of the cameras I created a Render Texture, and each projection-screen camera renders its output to its respective Render Texture. For each of these Render Textures there's also a corresponding Material to which the respective Render Texture is applied.

These materials are applied to the left, middle and right projection planes, respectively. The projection planes are on a separate layer, and the perspective cameras do not render them. I have a main camera in the scene which sees all the projection planes; only for this main camera are the projection planes visible (being on a separate layer). When I move the cameras (i.e. emulate the user's movement with the w/s/a/d keys to move left/right), the materials applied to the planes seem to be inverted, because of which the scene behind the projection screens does not stitch together. A sample for this is attached here: https://www.dropbox.com/s/q36lrgjytl46ci4/ThreeProjectionScreens.zip It would be great to know of a solution, because this seems to be a very common problem in such a scenario with multiple cameras. --Harpreetsareen (discuss • contribs) 13:13, 20 August 2013 (UTC)


 * Can you undo the inversion by adjusting the x or y tiling parameter for the texture in the material to -1? --Martin Kraus (discuss • contribs) 13:26, 20 August 2013 (UTC)

Setting the x and y tiling parameters both to -1 (since both directions of the material seem to be inverted) just doesn't show anything on the plane afterwards. Evidently, with the default texture behavior on the material, simply moving a game object behind the screen shows the object on the screen moving in the opposite direction. So the inversion definitely seems to be there, but I am not precisely sure how to deal with correcting it. --Harpreetsareen (discuss • contribs) 03:58, 21 August 2013 (UTC)


 * Did you set the wrap mode of the texture to repeat? (http://docs.unity3d.com/Documentation/ScriptReference/Texture-wrapMode.html ) --Martin Kraus (discuss • contribs) 09:02, 21 August 2013 (UTC)
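Why the combination of tiling -1 and repeat wrapping works: tiling -1 maps a texture coordinate u to -u, and repeat wrapping then takes that modulo 1, so u effectively becomes 1 - u, i.e. a mirror image. With clamp wrapping, the negated coordinates all pin to the edge texel instead, which would explain why nothing sensible showed on the plane. A hypothetical sketch of the addressing math (not Unity's API):

```python
def sample_coord(u, tiling, wrap_repeat):
    """Where a UV coordinate ends up after applying a tiling factor.
    With repeat wrapping the coordinate is taken modulo 1; with clamp
    wrapping it is pinned to the [0, 1] range."""
    u = u * tiling
    if wrap_repeat:
        return u % 1.0
    return min(max(u, 0.0), 1.0)

# Tiling -1 with repeat wrap mirrors the texture: u -> 1 - u
assert sample_coord(0.25, -1.0, True) == 0.75
# With clamp wrapping, the same tiling collapses everything onto the edge texel
assert sample_coord(0.25, -1.0, False) == 0.0
```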

Works brilliantly. Just to get a little more detail on modelling the virtual world the same as the real-world setup: the Plane game object in Unity has a default size of 10 (unlike other game objects, which have size 1). So for a wall of 5.0 ft x 8 ft (~1.5 m x 2.3 m), the scale of the Plane primitive in Unity becomes roughly x ≈ 0.15 and z ≈ 0.23 (width and height divided by 10, since the default size of the Plane is 10; z is used because the Plane has been rotated by -90 degrees to make it front-facing). Some of the Unity modules assume 1 Unity unit to be 1 m, so I was wondering whether scaling objects down to this level (the Plane scaled down to 0.15, and objects behind it scaled down further) has some effect on the quality obtained on the screen. Right now, I am getting kinda low-quality output on the projection screen's material (rendered with the camera's Render Texture) from the game objects behind the projection screen. Is that the case? How did you model your virtual world to match the real-world setup exactly? Is there a setting for the output quality of such a material that has to be set, or something similar? --Harpreetsareen (discuss • contribs) 13:10, 22 August 2013 (UTC)


 * Your scaling appears to be correct and it shouldn't have an effect on the render quality. I haven't worked with RenderTextures, but I assume the quality depends a) on the width and height of the texture (in pixels), and b) the pixelWidth and pixelHeight of the camera. For the best quality, these dimensions should probably be large and they should be the same for the camera and the texture. But that's just an assumption. --Martin Kraus (discuss • contribs) 13:36, 22 August 2013 (UTC)

And lastly, to make this an exhaustive, all-in-one discussion on the talk page, I'll just go ahead and ask how one manages multiple monitors/projectors/displays for such VR projections built from a scene. As far as I know, Unity doesn't support multi-window setups in the same application, so is the only way around this to make exceptionally long windows extending across screens? Networking components could probably be built to send the data to different clients, but that seems to be overkill. Can different cameras render to different windows in the latest Unity version, or is there another way to achieve this in Unity for an actual CAVE environment? --Harpreetsareen (discuss • contribs) 17:18, 23 August 2013 (UTC)


 * Excellent question. I don't know. We got rid of our CAVE about a year ago (and I never worked with it). Nowadays, we only render stereo for various devices. And I think that is done by rendering a long window (or a long screen spanning multiple displays). But I leave these details to my students. ;) I'll ask a colleague, maybe he knows something about it. --Martin Kraus (discuss • contribs) 19:23, 23 August 2013 (UTC)

Shadows disappearing on using the custom projection matrix script
Hey Martin, there's another bug that I discovered while working with a very well-lit 3D scene into which I was trying to put the perspective camera. The camera does work fine with the lighting, but with the script it does not render shadows at all, and the output comes out dismal compared to the scene that one can see in the Scene view. I tried using the PerspectiveOffCenter function independently in a project to check whether Unity behaves weirdly with custom projection matrices, but the shadows do render precisely if I use only that function. However, if I use the script that we have here, with the custom rotation and translation matrices as well, there seems to be an issue with the shadows. I made a small project reproducing this issue for you, where you can toggle the User Perspective Cam script off/on and see the shadows being rendered/not rendered respectively. It would be great to know of a solution, since this is really hurting the rendering quality: the Scene view of my game looks absolutely stunning, but this loss of shadows/quality in the final game view just spoils the output. I am not sure, but does the problem arise because we are assigning Matrix4x4.identity to camera.worldToCameraMatrix? --Harpreetsareen (discuss • contribs) 12:00, 23 September 2013 (UTC)


 * Apparently, it is a bad idea to put all matrices into the projection matrix; I changed the code to split the matrices into a projection matrix and a modelview matrix in the way you suggested, and this appears to work with Unity's shadow maps. --Martin Kraus (discuss • contribs) 14:18, 28 September 2013 (UTC)