Goal & Motivation

We wanted to create an immersive game about an astronaut lost in space. The inspiration for the project comes from the movie Gravity: we wanted the physics to be as realistic as possible, while at the same time keeping the game playable. To make the experience as immersive and multimodal as possible, the player uses Wii remotes for steering and the Oculus Rift for gazing around the scene, and sound has been added to further engage the user.

Hardware

The Oculus Rift provides an immersive experience, and thanks to the DK2 the user gets a fairly good resolution (960x1080 per eye). Thanks to the positional tracking, the player's movements in real life correspond to the avatar's in the game, and connecting the game with the device was straightforward using the Oculus plugin for Unity. The Wii remote and nunchuck are used to fire different thrusters on the player model, a sort of jetpack. To connect them with Unity we also use a plugin (UniWii), which is based on the DarwiinRemote framework. Unfortunately this connection wasn't as smooth as with the Oculus, and we encountered a couple of problems; more on that below.
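The thruster-driven movement described above boils down to Newton's laws in zero gravity: thrust is the only force, so the astronaut keeps drifting when the thrusters are off. The game itself does this in C# through Unity's built-in physics, but the idea can be sketched in a few lines of Python (all names here are our own, purely illustrative):

```python
def step(position, velocity, thrust, mass, dt):
    """One semi-implicit Euler step. In space thrust is the only force,
    so with zero thrust the astronaut simply keeps drifting."""
    ax, ay, az = (t / mass for t in thrust)  # Newton's second law: a = F/m
    velocity = (velocity[0] + ax * dt,
                velocity[1] + ay * dt,
                velocity[2] + az * dt)
    position = tuple(p + v * dt for p, v in zip(position, velocity))
    return position, velocity

# Fire the jetpack forward for 1 s (100 steps of 10 ms), then coast for 1 s.
pos, vel = (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)
for _ in range(100):
    pos, vel = step(pos, vel, thrust=(10.0, 0.0, 0.0), mass=90.0, dt=0.01)
for _ in range(100):
    pos, vel = step(pos, vel, thrust=(0.0, 0.0, 0.0), mass=90.0, dt=0.01)
```

After the coasting phase the velocity is unchanged, which is exactly what makes the controls feel so unlike an earthbound game: to stop, the player has to fire the opposite thruster.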

Software

Because of the limited scope of this course, we needed a fast and reliable development environment. We have been using the Unity 3D game engine to create the scene, since it provides most of what we need and makes development fast. During the first phase, simpler models were imported from the Asset Store, but by ComicCon we had developed a couple of models of our own in Blender, a free and open-source 3D animation suite. The code in the project is written in C# and developed in MonoDevelop, the IDE bundled with Unity.

Challenges & Obstacles

Creating an immersive and enjoyable game without much prior experience in the field of computer graphics has definitely been a challenge for the group. The atmospheric scattering that takes place in the Earth's atmosphere took its share of time to develop; none of us had programmed a shader before. The project initially used a single Wii remote and the nunchuck as controls for the avatar. On most attempts only one of the controllers would connect, and only about one attempt in ten would connect the second as well. Connecting two wiimotes simultaneously was very unreliable, but after some further development we managed to connect two at the same time. So in the latest version it is possible to play with two Wii remotes, as well as with a single wiimote and a nunchuck (if needed).
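The scattering shader mentioned above builds on standard phase functions: Rayleigh scattering for air molecules and, commonly, the Henyey-Greenstein approximation for Mie (aerosol) scattering. The shader itself is GPU code, so the following Python sketch is purely illustrative of the math:

```python
import math

def rayleigh_phase(cos_theta):
    """Rayleigh phase function: how much light scatters toward an angle
    theta from the incoming direction, normalized over the sphere."""
    return 3.0 / (16.0 * math.pi) * (1.0 + cos_theta * cos_theta)

def henyey_greenstein(cos_theta, g=0.76):
    """Henyey-Greenstein approximation, often used for Mie scattering;
    g in (0, 1) controls how strongly light is scattered forward."""
    return (1.0 - g * g) / (
        4.0 * math.pi * (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5)

# Rayleigh is symmetric (forward scattering is exactly twice the sideways
# value), while Henyey-Greenstein is strongly forward-peaked -- which is
# why the sky glows around the sun but is blue everywhere else.
print(rayleigh_phase(1.0), rayleigh_phase(0.0))
print(henyey_greenstein(1.0), henyey_greenstein(-1.0))
```

In the actual shader these functions weight the in-scattered light accumulated along each view ray through the atmosphere.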

Lessons Learned

  • Plugins don't always work
  • Divide the workload
  • Wii remotes are unreliable
  • Good topology is crucial when modelling
  • Bake normal maps from high-poly to low-poly
    meshes to save computing power
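The last point above deserves a note: baking transfers fine surface detail from a high-poly mesh into a texture used on a cheaper low-poly mesh. A full bake projects one mesh onto the other, but the core idea can be shown with a simplified height-field version, computing per-pixel normals with central differences (all code here is our own illustrative Python, not the Blender bake itself):

```python
import math

def height_to_normals(height, scale=1.0):
    """Convert a 2D height field to per-pixel unit normals using central
    differences -- a simplified stand-in for baking high-poly surface
    detail into a normal map applied to a low-poly mesh."""
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            # Central differences, clamped at the borders.
            dx = (height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]) * 0.5 * scale
            dy = (height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]) * 0.5 * scale
            # Normal of the surface z = height(x, y): (-dz/dx, -dz/dy, 1), normalized.
            length = math.sqrt(dx * dx + dy * dy + 1.0)
            row.append((-dx / length, -dy / length, 1.0 / length))
        normals.append(row)
    return normals

# A flat field yields straight-up normals; a ramp tilts them against the slope.
flat = [[0.0] * 4 for _ in range(4)]
n = height_to_normals(flat)[1][1]
```

The lighting then uses these per-pixel normals instead of the coarse mesh normals, so a few hundred triangles can look like thousands.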

Related Work

  1. Modeling and Rendering of the Atmosphere Using Mie-Scattering
  2. Atmospheric Scattering in Unity from GPU
  3. Designing Games with a Purpose
  4. Unity 3D
  5. Blender
  6. UniWii
  7. OculusPlugin

The Components



Oculus Rift

Wii Remotes

UniWii & OculusPlugin

ForskarFredag





"When do you release it? I will buy it instantly!"

"It is exactly like 'Gravity'"

"Wow, this is really cool!"

Some user testimonials from ForskarFredag



The Team



Stefan Etoh

Interactive Media Technology @KTH

Oscar Friberg

Interactive Media Technology @KTH

Johan Bäckman

Interactive Media Technology @KTH