Unity / Oculus Rift VR Development
Challenge yourself with a new platform: graphics development in Unity and Oculus Rift VR
Andy Kong | October 16, 2015
When I first took on the challenge of developing multiple Unity scenes for the Oculus Rift VR goggles, I was excited and overwhelmed at the same time. My only prior experience with graphics programming was a computer graphics course back in college. From what I could recall, the course was interesting, but it also involved a significant amount of math (e.g. matrix algebra), physics knowledge, and programming in OpenGL’s shading language (GLSL). For a sense of what I thought I was getting into, see the “quick”, 8-page GLSL reference card at http://www.khronos.org/files/opengl-quick-reference-card.pdf. Long story short, much to my surprise, developing for the Oculus Rift on the Unity development platform was a pleasure.
Unity development community
It is apparent that Unity Technologies stands behind their game engine and its developer community. The video tutorials are super easy to digest, the documentation is very complete, the asset store is well-maintained, and they are working hard to make the development experience better with each release (e.g. they recently re-designed the UI creator).
Unity – Oculus Rift VR development
Making a game Oculus Rift-compatible simply involved dropping a pre-fab Oculus VR camera rig into the scene and removing the default camera. That is essentially the only setup step for an Oculus Rift-compatible game within Unity.
Some considerations when making an Oculus Rift Unity-powered game:
- The Oculus Rift camera can be thought of as your head, but one that can rotate freely in all directions. It is important to keep this in mind when developing, say, heads-up displays, which should be attached to the “player” rather than the “head”.
- What you see in the Unity development simulator may or may not be what you get while wearing the VR goggles. Test out the game scene with the actual Oculus Rift as often as possible.
- Keep in mind that the Oculus Rift is an immersive experience; take care not to subject users to scenes that might have an adverse effect on their mental well-being.
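To illustrate the first consideration above, here is a minimal sketch (the class name, field names, and offset values are my own, not from the original post) of anchoring a HUD to the player body so it follows the player’s position and yaw but ignores the head’s pitch and roll:

```csharp
using UnityEngine;

public class HudAnchor : MonoBehaviour
{
    // The "player" root transform, not the freely rotating VR camera ("head").
    public Transform playerBody;

    // Where the HUD floats relative to the player body.
    public Vector3 offset = new Vector3(0f, 1.2f, 0.8f);

    void LateUpdate()
    {
        // Follow the body's position and yaw only, so the HUD doesn't
        // swing around when the user looks up, down, or tilts their head.
        transform.position = playerBody.position + playerBody.rotation * offset;
        transform.rotation = Quaternion.Euler(0f, playerBody.eulerAngles.y, 0f);
    }
}
```

Using LateUpdate here (rather than Update) means the HUD repositions after the player and camera have moved for the frame, avoiding a one-frame lag.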
The following is a piece of code that controls the flashlight associated with a user’s first person view camera. In the Unity editor, this script would be associated with the game component that represents the flashlight (something with a light source attached to the camera).
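A minimal sketch of such a script, consistent with the description that follows and using standard Unity APIs (the exact field and class names are assumptions):

```csharp
using UnityEngine;

public class FlashLightController : MonoBehaviour
{
    // Assigned in the Unity Editor (or programmatically); expected to be a
    // GameObject with a Light component, attached to the camera.
    public GameObject flashLightObject;

    private Light flashLight;

    void Start()
    {
        // Runs once when the script becomes active, like a constructor:
        // cache the Light component for use in Update.
        flashLight = flashLightObject.GetComponent<Light>();
    }

    void Update()
    {
        // Called every frame by the engine: toggle the flashlight
        // on the frame the 'F' key is pressed.
        if (Input.GetKeyDown(KeyCode.F))
        {
            flashLight.enabled = !flashLight.enabled;
        }
    }
}
```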
- The flashLightObject is an object that is made publicly available for use by the Unity editor.
- The Start method initializes the object (like a constructor).
- The Update method is called once per frame by the Unity game engine as long as this script, “FlashLightController”, is attached to an active Game Object as a Component. You can think of Game Objects as things physically present in the game world and Components as features on those Game Objects.
- Basically, if the ‘F’ key is pressed during an Update, the flashlight referenced by flashLightObject is toggled on/off. That reference can be assigned via the Unity Editor or programmatically.
There were of course some challenges during my experience with Unity development.
- Git/Source Control Management – Many of the Unity files are binary files, and there is a .meta file for every asset file in a Unity project. This makes for an interesting development team strategy of having one person who touches the main scene file (*.unity) to do the high level wiring up while other developers work on bringing in assets (sounds/textures/pre-fab objects) and writing scripts.
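One common mitigation (my suggestion, not part of the original workflow) is to keep Unity’s generated folders out of the repository entirely; a typical .gitignore fragment looks like:

```
# Folders Unity regenerates on open/build; safe to leave out of source control
[Ll]ibrary/
[Tt]emp/
[Oo]bj/
[Bb]uild/
```

Switching Asset Serialization Mode to “Force Text” in the project’s Editor Settings also makes scene and .meta files text-based, so diffs and merges are at least possible.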
- Coordinate system and UI development – While the Unity visual scene designer is often good enough for placing elements, learning how the coordinate system works and how nested objects are spatially defined (child transforms are relative to their parents, not the world) is key to more advanced scene design, and there was somewhat of a learning curve.
- Editing things in Play Mode (aka the debug mode) and having the changes disappear upon exiting Play Mode – This was personally the most annoying of the three listed here. At first you wonder why they would ever have a feature like this, but then you remember that many debugging IDEs allow on-the-fly changes that are only temporary within that debugging session. To add insult to injury, the Unity editor’s only indications of Play Mode are the inverted color of the Play button at the top and a slightly darker gray background for the menus (see the screenshots of the new Unity 5 IDE below).
If you are using Unity and have ever experienced this, review this tip for making it obvious when you’re in Play Mode: http://answers.unity3d.com/questions/22941/change-editor-colours-when-in-play-mode.html.
Unity 5 screenshots
Unity 5 Game Mode Screenshot (Sample Asset – Car Scene)
Unity 5 Scene (Edit) Mode Screenshot (Sample Asset – Car Scene)
Overall, the Unity development experience was positive, and I would recommend learning Unity to anyone even remotely interested in game development. Also, according to Unity’s Wikipedia article, it is the default SDK for the Nintendo Wii U.
Looking for more engineering tips?
Our engineers have a whole lot to say about custom software. They’re in the trenches every day, building, breaking, re-building, and sharing their hard-won wisdom along the way. Find their latest and greatest discoveries on Slalom’s new software engineering blog.