A VR exploration to learn design and development principles for creating virtual reality content. Built with Unity 3D and compatible with Google Cardboard.
My role: concepted, designed, and built the game. 3D models by Ted Sachi.
The idea came while on a walk with my dog. I was thinking: what superpower would I want that I currently don't have? My first thought was "I would like to move Cheerios with my mind... like Matilda," but that concept wasn't layered enough. So what if I could zap things instead? Staying with the cereal theme, I landed on zapping the marshmallows out of Lucky Charms.
The concept: use your head movement and gaze to target the marshmallows and zap them away, leaving only the cereal behind.
The whole point of doing this game was to learn some design and dev principles for creating in VR. Things that I wanted to learn from this were:
1. The difficulty of using your head as a gaze input... and whether gaze is even an ideal trigger
2. How far away is too far when using vision to select things
3. Using/reading text in VR
Gaze as an input
I targeted development for the Google Cardboard because it was a lowest common denominator of sorts: cheap and easy to use, with gaze as the main selection input. I started development pretty crudely, using spheres and a few simple shapes and testing how easy they were to select at different sizes. It was easier to start by getting things technically working than to focus on a complete design up front.
What I learned about gaze: getting something to be easily selectable is tricky because everyone is different. People with better neck coordination can select things more precisely, whereas others have more trouble. For me, the ideal selectable objects were scaled to about 1.5x. Using gaze as a selection input is nice because it eliminates the need for a controller or button, but it also raises an accessibility problem for anyone who cannot move their head steadily.
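The basic mechanic can be sketched in a Unity script: cast a ray straight out of the camera each frame, and treat a sustained gaze on a target as the trigger. This is a minimal sketch under my own assumptions (the "Marshmallow" tag, the dwell-timer approach, and the component name are all hypothetical), not the game's actual code.

```csharp
using UnityEngine;

// Hypothetical sketch of gaze selection: a ray from the camera's center,
// with a dwell timer standing in for a button press.
public class GazeSelector : MonoBehaviour
{
    public float dwellSeconds = 1.0f;  // how long gaze must hold to "zap"
    public float maxDistance = 10f;    // selection range

    private GameObject currentTarget;
    private float dwellTimer;

    void Update()
    {
        // The user's gaze is the camera's forward direction.
        Ray gaze = new Ray(transform.position, transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(gaze, out hit, maxDistance)
            && hit.collider.CompareTag("Marshmallow"))
        {
            if (hit.collider.gameObject == currentTarget)
            {
                dwellTimer += Time.deltaTime;
                if (dwellTimer >= dwellSeconds)
                {
                    Destroy(currentTarget);  // "zap" the marshmallow
                    currentTarget = null;
                    dwellTimer = 0f;
                }
            }
            else
            {
                // Gaze moved to a new marshmallow; restart the timer.
                currentTarget = hit.collider.gameObject;
                dwellTimer = 0f;
            }
        }
        else
        {
            currentTarget = null;
            dwellTimer = 0f;
        }
    }
}
```

Attached to the camera, a script like this is also where the size finding shows up: larger colliders mean the ray lands more forgivingly for people with less steady head movement.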
Learning how to design and build for VR is changing the way I think about experiences to more of a multi-sensory journey.
The midway point of dev came when things really started falling into place. The 3D models were progressing (thanks to Ted) and the game's intricacy was advancing: buoyancy, proper marshmallow deletion after selection, and the beginnings of the full scene. I started learning about materials, meshes, prefabs, and efficiency.
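The post doesn't spell out how the buoyancy was done, but one common, cheap way to make cereal pieces feel like they're floating in milk is a sine-wave bob rather than a real physics simulation. A sketch, with all names and values assumed:

```csharp
using UnityEngine;

// Hypothetical sketch: fake buoyancy by bobbing each piece on a sine wave.
public class Bobber : MonoBehaviour
{
    public float amplitude = 0.05f;  // bob height in world units
    public float frequency = 1.0f;   // bobs per second

    private Vector3 restPosition;
    private float phase;

    void Start()
    {
        restPosition = transform.position;
        // Random phase so the pieces don't all bob in unison.
        phase = Random.Range(0f, Mathf.PI * 2f);
    }

    void Update()
    {
        float offset = amplitude * Mathf.Sin(Time.time * frequency * Mathf.PI * 2f + phase);
        transform.position = restPosition + Vector3.up * offset;
    }
}
```

Because each marshmallow is a prefab, a component like this only has to be written once and every spawned piece gets the behavior for free.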
Through user testing I discovered that sound was crucial to the experience; without it the game felt boring. Learning how to trigger specific sounds for actions was a new learning curve, because I hadn't had to deal with audio in previous projects.
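In Unity, triggering a sound on an action typically comes down to calling an AudioSource when the action happens. A minimal sketch of a zap sound, assuming a clip assigned in the Inspector (the component and method names are my own):

```csharp
using UnityEngine;

// Hypothetical sketch: play a one-shot zap sound when a marshmallow is removed.
// PlayOneShot lets rapid zaps overlap instead of cutting each other off.
[RequireComponent(typeof(AudioSource))]
public class ZapSound : MonoBehaviour
{
    public AudioClip zapClip;  // assigned in the Inspector

    private AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
    }

    // Call this right before destroying the marshmallow.
    public void PlayZap()
    {
        source.PlayOneShot(zapClip);
    }
}
```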
Getting the app published on Android was actually pretty easy. Unity has a built-in export feature that creates an Android app file, which is then submitted through the Play Store.
Cardboard model for onboarding instructions
While user testing the beta 1 app, a few common themes emerged. For those unfamiliar with the Cardboard, understanding how to move the cursor, or even how the game worked, was challenging, and the lack of an intro or onboarding didn't help. A common expectation was for something rewarding to happen upon completing the level; in the beta it was anti-climactic because the game simply restarted. That left a few areas to improve: some kind of intro or how-to screen explaining the goal of the game and how the mechanics work, and a more exciting reward for completing the level.
I’m currently working on getting the iOS app to work; it’s not quite as simple as Android, sadly. I’m rebuilding the app in an attempt to debug it, so stay tuned, it’ll be in the App Store soon!
I’m also looking into creating an onboarding screen, as well as building out one more level.