Blog - Unity / Oculus Rift VR Development


Unity / Oculus Rift VR Development

Challenge yourself with a new platform: graphics development in Unity and Oculus Rift VR

Andy Kong | October 16, 2015

When I first took on the challenge of developing multiple Unity scenes for the Oculus Rift VR goggles, I was excited and overwhelmed at the same time. The only prior experience I had with graphics programming was a computer graphics course back in college. From what I could recall, the course was interesting, but it also involved a significant amount of math (e.g., matrix algebra), physics knowledge, and programming in OpenGL’s shader language (GLSL). The “quick”, 8-page GLSL reference card at http://www.khronos.org/files/opengl-quick-reference-card.pdf should give you a sense of what I thought I was getting into. Long story short, developing for the Oculus Rift on the Unity development platform turned out, much to my surprise, to be a pleasure.

Programming languages

After three weeks of developing within the Unity environment, I can say that the developer experience of Unity versus raw GLSL is night and day. Unity currently supports three programming languages: UnityScript (a variant of JavaScript), C#, and Boo. Support for Boo is slowly being phased out as of Unity 5’s release earlier this year; the documentation no longer offers Boo code snippets. The availability of a very popular scripting language and a strongly-typed language makes developing in Unity very accessible to developers of different backgrounds. As a seasoned C#/.NET developer, I felt right at home using familiar programming constructs and imported libraries to get the game scenes to perform the functionality I wanted in relatively few lines of code. The Unity game engine runs on Mono, so targeting multiple platforms is simply a matter of making sure Mono supports the target machine. Here is a list of the supported platforms: http://www.mono-project.com/docs/about-mono/supported-platforms/. For those interested, scripts written in UnityScript are compiled down into .NET’s Microsoft intermediate language (MSIL), so Unity builds are just .NET assemblies and executables.

Unity development community

It is apparent that Unity Technologies stands behind their game engine and its developer community. The video tutorials are super easy to digest, the documentation is very complete, the asset store is well-maintained, and they are working hard to make the development experience better with each release (e.g. they recently re-designed the UI creator).

Unity – Oculus Rift VR development

Making a game Oculus Rift-compatible simply involved dropping a pre-fab Oculus VR camera rig into the scene and removing the default camera. That is essentially the only setup step for an Oculus Rift-compatible game within Unity.

Some considerations when making an Oculus Rift Unity-powered game:

  1. The Oculus Rift camera can be thought of as your head, and it can rotate freely in all directions. It is important to keep this in mind when developing, say, heads-up displays, which should be attached to the “player” rather than the “head”.

  2. What you see in the Unity development simulator may or may not be what you get while wearing the VR goggles. Test out the game scene with the actual Oculus Rift as often as possible.

  3. It is important to keep in mind that the Oculus Rift is an immersive experience and as such, considerations should be made to ensure that users are not subjected to scenes that might have an adverse effect on their mental well-being.
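The first consideration above can be sketched in code. This is an illustrative fragment, not from the original post: the `playerBody` and `hudCanvas` names are assumptions, and the idea is simply that the HUD is parented to the body transform so it stays put while the head (camera) rotates.

```csharp
using UnityEngine;

// Illustrative sketch: anchor the HUD to the player's body rather than
// the VR camera, so it does not spin with every head rotation.
public class HudAnchor : MonoBehaviour
{
    public Transform playerBody;   // the player rig/body, not the camera
    public Transform hudCanvas;    // a world-space canvas holding the HUD

    void Start()
    {
        // Parent the HUD to the body; passing 'false' keeps the HUD's
        // current local position/rotation relative to its new parent.
        hudCanvas.SetParent(playerBody, false);
    }
}
```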

Code snippets

The following is a piece of code that controls the flashlight associated with a user’s first person view camera. In the Unity editor, this script would be associated with the game component that represents the flashlight (something with a light source attached to the camera).
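A minimal sketch of such a script, reconstructed from the explanation that follows (the `flashLightObject` name comes from that description; the exact key-handling and initialization logic are assumptions):

```csharp
using UnityEngine;

// Hypothetical reconstruction of the FlashLightController described below.
public class FlashLightController : MonoBehaviour
{
    // Public so the Unity editor exposes it in the Inspector, where the
    // flashlight GameObject (one with a Light component) is wired up.
    public GameObject flashLightObject;

    void Start()
    {
        // Constructor-like initialization: start with the flashlight off.
        flashLightObject.SetActive(false);
    }

    void Update()
    {
        // Called once per frame; toggle the flashlight when 'F' is pressed.
        if (Input.GetKeyDown(KeyCode.F))
        {
            flashLightObject.SetActive(!flashLightObject.activeSelf);
        }
    }
}
```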

Explanation:

  1. The flashLightObject is an object that is made publicly available for use by the Unity editor.

  2. The Start method initializes the object (like a constructor).

  3. The Update method is called once per frame by the Unity game engine’s main loop, as long as this script, “FlashLightController”, is attached to an active Game Object as a Component. You can think of Game Objects as the things physically present in the game world and Components as features on those Game Objects.

  4. Basically, if the ‘F’ key was pressed during that frame’s Update, the flashlight attached via the flashLightObject reference is toggled on/off. That reference can be wired up via the Unity editor or programmatically.
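Point 4 mentions that the reference can also be wired up programmatically instead of through the editor. One way to sketch that, assuming the flashlight object is named "FlashLight" in the scene (an illustrative name, not from the original post):

```csharp
using UnityEngine;

// Illustrative alternative to Inspector wiring: resolve the reference
// at startup by looking the object up by name in the scene.
public class FlashLightWiring : MonoBehaviour
{
    public GameObject flashLightObject;

    void Start()
    {
        // If nothing was assigned in the Inspector, find it by name.
        // "FlashLight" is an assumed scene-object name for illustration.
        if (flashLightObject == null)
        {
            flashLightObject = GameObject.Find("FlashLight");
        }
    }
}
```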

Challenges

There were of course some challenges during my experience with Unity development.

  1. Git/Source Control Management – Many of the Unity files are binary files, and there is a .meta file for every asset file in a Unity project. This makes for an interesting development team strategy of having one person who touches the main scene file (*.unity) to do the high level wiring up while other developers work on bringing in assets (sounds/textures/pre-fab objects) and writing scripts.
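One common mitigation for the source-control pain above (not from the original post) is a Unity-aware .gitignore, so that only source assets and their .meta files are versioned while Unity-generated folders stay out of the repository. A typical starting point looks something like this:

```gitignore
# Unity-generated folders that should never be committed
[Ll]ibrary/
[Tt]emp/
[Oo]bj/
[Bb]uild/

# IDE/project files that Unity regenerates on demand
*.csproj
*.sln
```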

  2. Coordinate system and UI development – While there are many scenarios where the Unity visual scene designer is good enough for placing elements, learning how the coordinate system works and how nested objects are spatially defined is key to more advanced scene design. There was a bit of a learning curve here.

  3. Editing things in Play Mode (i.e., the debug mode) and having the changes disappear upon exiting Play Mode – This was personally the most annoying of the three listed here. At first you wonder why they would ever have a feature like this, but then you remember that many debugging IDEs allow on-the-fly changes that are only temporary within that debugging session. To add insult to injury, the Unity editor’s only indications of Play Mode are the inverted color of the Play button at the top and a slightly darker gray background for all the menus (see the screenshots of the new Unity 5 IDE below).

    If you are using Unity and have ever experienced this, you will want to review this tip to make it obvious when you’re in game mode: http://answers.unity3d.com/questions/22941/change-editor-colours-when-in-play-mode.html.

Unity 5 screenshots

Unity 5 Game Mode Screenshot (Sample Asset – Car Scene)

Unity 5 Scene (Edit) Mode Screenshot (Sample Asset – Car Scene)

Final thoughts

Overall, the Unity development experience was positive, and I would recommend learning Unity to anyone even remotely interested in game development. Also, according to Unity’s Wikipedia article, it is the default SDK for the Nintendo Wii U.


Looking for more engineering tips?

Our engineers have a whole lot to say about custom software. They’re in the trenches every day, building, breaking, re-building, and sharing their hard-won wisdom along the way. Find their latest and greatest discoveries on Slalom’s new software engineering blog.

Read our engineering blog

Andy Kong is a software engineer in Slalom's Cross-Market team located in Chicago. Andy helps his clients deliver high-quality, modern responsive website solutions by leveraging modern JavaScript frameworks and coding practices. Find him on Twitter or LinkedIn.

            

Start a conversation

What’s on your mind? Let’s explore the possibilities.