Designing for VR

Virtual reality is cool, there's no denying it. Yet it's still in its infancy, and there aren't many consolidated design guidelines around (Oculus does have a really good Best Practices doc, though). So I've decided to come up with my own set of general guidelines, based on my personal experience with VR, as well as some of the stuff I learnt at the recent Game Developers Conference (more on that here).

Note that this guide doesn’t have a specific headset in mind, and that I am by no means an expert in this subject.

On simulator sickness

As many as one in two people suffer from VR sickness. I myself have it really bad – think not being able to eat (and eating is my favorite thing in life!) after spending a day developing a VR experience, or wanting to puke after an experience that over a hundred other people have tried without any issues. Yet I still love VR, especially the experiences that don't leave me feeling sick afterwards. Yes, it's possible to eliminate sim sickness from an experience!

Don’t change the position of the camera

This is the number one thing that makes me break out in sweat. Whether the camera translation is caused by the game taking control away from the player and forcibly moving his character, or by the player voluntarily making his character move, it still feels bad. In the real world, we're the ones moving through it, not the world moving around us. Not being able to move around and explore the game world does seem to undermine the power of VR though, so if you really must have movement, here are three things to keep in mind.

1. Don’t accelerate the camera

Our brains detect acceleration, not speed. Don’t believe me? Think back to the last time you were on an airplane. At an average cruising speed of around 900km/h (~550mph), why does it feel almost like the plane is stationary? And yes, I’m aware that acceleration is unavoidable when starting/stopping the camera movement, but that happens in a split second and is generally too short for our brains to register.
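
If you absolutely need smooth locomotion, keep the velocity constant from frame to frame and skip the easing curves – easing in and out is precisely the acceleration our inner ear picks up on. Here's a minimal, engine-agnostic sketch of what that looks like per frame (the tuple-based positions and the MOVE_SPEED value are just assumptions for illustration):

```python
import math

MOVE_SPEED = 1.5  # metres per second; an assumed, comfortable walking pace

def step_towards(position, target, dt, speed=MOVE_SPEED):
    """Move `position` towards `target` at a constant speed, with no easing.

    Ease-in/ease-out curves are exactly the acceleration our vestibular
    system notices, so we deliberately avoid them here.
    """
    dx, dy, dz = (t - p for t, p in zip(target, position))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    step = speed * dt
    if distance <= step:  # close enough: snap to the target this frame
        return target
    scale = step / distance
    return (position[0] + dx * scale,
            position[1] + dy * scale,
            position[2] + dz * scale)

# Example: a 90 fps frame loop moving the camera rig 3 m forward
pos, goal = (0.0, 1.7, 0.0), (0.0, 1.7, 3.0)
for _ in range(300):
    pos = step_towards(pos, goal, dt=1 / 90)
```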

2. “Flash” to a new position

This is my preferred method, as moving the camera at a constant speed still tends to make me sick. The main idea here is that the player chooses the new location he wants to go to, and teleports there. However, since teleporting somewhere means a sudden change in environment, this can be disconcerting and immersion-breaking.

The solution is to move the camera from the old position to the new one in under 100ms. For lack of a better analogy, it's sort of like the superhero Flash running from one place to another (hence "flashing" lol). The movement duration is too short for our brains to get sick, yet long enough that we're aware we're actually moving through space and not just appearing somewhere. If you do this, though, make sure you texture large objects such as walls, so the player doesn't flash to a spot right in front of a completely featureless white surface and think he broke the game.
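
Here's a minimal sketch of the flash itself – a plain linear sweep from the old position to the new one over roughly 100ms, sampled every frame. The duration constant and function name are my own, not from any particular engine:

```python
FLASH_DURATION = 0.1  # seconds; keep this under ~100 ms

def flash_position(start, end, elapsed, duration=FLASH_DURATION):
    """Camera position `elapsed` seconds after the flash began.

    A plain linear sweep: long enough that the brain registers travel
    through space, short enough that it never has time to get sick.
    """
    t = min(elapsed / duration, 1.0)
    return tuple(s + (e - s) * t for s, e in zip(start, end))

# In the frame loop, accumulate elapsed time and sample the sweep:
# camera_position = flash_position(old_pos, chosen_pos, elapsed)
```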

3. Stepping through a portal

This one is inspired by Budget Cuts, an amazing and hilarious stealth VR game for the HTC Vive. In this game, the controllers also act as a "portal" gun of sorts. Aim and shoot it at a location, and a portal to that location opens on the handle of the gun, which you can peek through to check for danger before "stepping through" by pulling the trigger again. Well, technically the portal edges grow until they completely surround you and then disappear, but it feels like you're stepping through the portal.

A "portal" gun which is nothing like Aperture's!

Without giving too much away, my favorite thing about this is that it fits in with the story really well. It may sound weird just hearing about it, but when you actually try it, everything makes total sense!

Another interesting method I've seen is in Headmaster, a game which puts you in the Football Improvement Center because, as the story goes, you suck at that sport. I really like how the environment is darkened while the player is moved and the new scene is set up. This works because it makes story sense (it's night, and you're on a soccer field lit by spotlights which can be switched off), and the camera movement is minimal.

That said, it’s possible to design really good games without having to move the camera around – take a look at I Expect You to Die and EVE: Gunjack.

Don’t ever rotate the damn camera!

Seriously, don't. The headset already tracks the head rotation of the user, so please don't rotate the camera yourself – it feels even worse than translation. This is especially true if the rotation is around the z-axis (roll) and the horizon is no longer horizontal, because our brains use the horizon as a reference for our sense of balance.

Instead, design your experience so that the player is the one rotating his head. First-time VR users tend to keep their head fixed in one direction, so it's your job to give them a reason to look around. The best way to do this is to have something interesting move out of the user's field of vision, which will naturally cause him to turn his head to follow the movement. It doesn't have to be complicated – a character simply walking to the player's side and out of his FOV does the trick. And don't forget to get the player to look up and down too, in addition to left and right!

Keep the frame rate high and consistent

It’s been said many times that VR experiences should aim for at least 90fps, with 60fps being the absolute minimum. I completely agree with that, and I really like Sony’s move to reject all potential PSVR games which drop below 60fps. All VR publishers should follow that lead.

However, I'd argue that a lower but consistent frame rate is more important than a higher, inconsistent one, i.e. a fixed 60fps feels better than a range bouncing between 85 and 95fps. Our brains can and will adapt to a lower frame rate after a while, but if the frame rate keeps changing, there's no way for them to do so. Case in point: in a mobile 360 video player I was working on previously, trying to hit 60fps led to a lot of dropped frames and lag, and it felt terrible. Conversely, when I dropped the frame rate to 30fps, even though it still didn't feel great, the experience made me a lot less nauseous. Note that I'm not saying you should ever publish a VR experience at 30fps!
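
To make the "consistent beats high" idea concrete, here's a rough sketch of how you might pick a frame rate target from recent frame times – budget against the slowest recent frame instead of chasing the maximum. The candidate rates and the 10% margin are assumptions you'd tune per project:

```python
def pick_consistent_target(recent_frame_times, candidates=(90, 60, 45, 30)):
    """Pick the highest frame rate the app can hold *consistently*.

    `recent_frame_times` is a list of measured frame durations in seconds.
    We budget against the slowest recent frame plus a 10% margin, because
    a steady 60 fps feels better than bouncing around 85-95 fps. The 30 fps
    entry is a dev-time fallback only, not something to ship with.
    """
    worst = max(recent_frame_times)
    for fps in candidates:
        if worst * 1.1 <= 1.0 / fps:  # worst frame still fits this budget
            return fps
    return min(candidates)

# e.g. frames hovering between 11 and 13 ms -> hold a steady 60, not 90
print(pick_consistent_target([0.011, 0.012, 0.013, 0.0125]))  # -> 60
```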

Get a naive user to playtest it

As mentioned in the previous point, your brain is really good at adapting to stuff. So it should come as no surprise that the more time you spend in VR, the more your brain adapts to it. Unfortunately, your target audience is unlikely to have spent anywhere near as much time in VR as you have. The solution? Playtest for comfort with people who are not used to VR. Now that VR is recognized as the cool new up-and-coming thing, you shouldn't be short of playtesters who fit the bill.

On user interfaces

Traditional UIs don’t really work that well in VR. Take advantage of the sense of presence in VR and design for it, instead of trying to copy and paste what worked in another medium to VR.

Avoid HUDs

Traditionally, HUD elements are rendered orthographically, even in 3D games. In VR, this means that both eyes receive the same image, so stereoscopically the HUD reads as being infinitely far away. The problem arises when environmental objects occluded by the HUD are closer to the camera. This confuses the heck out of our brains, because the monocular depth cue of occlusion says the 3D object is further away, yet the binocular depth cue of stereopsis says the HUD is further away.

You might think this can be solved by rendering the HUD in world space, on a plane near the camera instead. Well, yes and no. For a player to be able to comfortably focus on and read it, the UI element needs to be at least 30cm (~1 foot) away. To make that work, you always need to ensure that no other object comes in front of that UI plane and occludes it. Can you imagine your player trying to open a menu while standing right in front of a wall, and wondering why the menu isn't appearing?

A better solution would be to have the UI appear on an object in the environment. Using a crosshair for selection? Render it on the object that is currently being selected. Want the player to be able to toggle that UI whenever he wants, and have it make story sense? Well, have you heard of the mobile phone? It's a device we carry around everywhere. Creating a fantasy/medieval game? Even better, you can bend the rules slightly and add magic. Don't want magic? Fine, then just use a scroll. Think outside the box!
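
As a concrete example of the crosshair case, here's a small sketch that places a world-space reticle on whatever the player is looking at, keeping it at least 30cm away and scaling it with distance so its apparent size stays constant. The raycast itself is assumed to come from your engine; only the placement math is shown, and the constants are my own guesses:

```python
def place_reticle(cam_pos, cam_forward, hit_distance,
                  base_size=0.02, offset=0.01):
    """Position and scale for a world-space reticle on the looked-at object.

    `cam_forward` is assumed to be a unit vector, and `hit_distance` comes
    from whatever raycast your engine provides. The reticle sits just in
    front of the hit point so it never z-fights with the surface, and it
    scales with distance so its apparent (angular) size stays constant.
    """
    d = max(hit_distance - offset, 0.3)  # never closer than ~30 cm
    position = tuple(p + f * d for p, f in zip(cam_pos, cam_forward))
    scale = base_size * d                # constant angular size
    return position, scale

# e.g. player looking straight ahead at a wall 2.5 m away
print(place_reticle((0.0, 1.7, 0.0), (0.0, 0.0, 1.0), 2.5))
```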

Use environmental interactions

The most powerful thing about VR is the sense of presence it gives. So why not make selections by interacting with objects in the environment? This takes my previous point on rendering UI elements on environmental objects a step further. In Fantastic Contraption, a puzzle game in which you build fantastic contraptions (duh :P), you save the game by picking up a miniature model of your contraption and putting it on the save table. In Cloudlands: VR Minigolf, you start the game by hitting your golf ball into the miniature castle.

Another really great example is the exit burrito in Job Simulator. You literally eat it (bring it up to your mouth) – the first bite selects the exit option, and the second bite confirms your choice. Gamasutra has a nice and short article about it here.

Exit burrito FTW!

Have shortcut actions

This is pretty much inspired by Fantastic Contraption. The dev showed off a really impressive demo at GDC, and oh man. For some reason, it made so much sense to pull a stick from behind your shoulder, sort of like drawing an arrow from a quiver on your back, rather than having to select something from an inventory. As with any shortcut, shortcut actions should be geared towards players who are already familiar with your game, so you still need a default way of selecting things. In Fantastic Contraption, that's done via an inventory cat who follows you around, because why not?

Unfortunately, this feature is only feasible if there is another input device aside from the HMD, like motion tracked hand controllers.

Nodding or shaking your head

Now this is something that works with just the HMD. You know how in real life we sometimes reply to a yes/no question by nodding or shaking our heads? What's stopping us from doing that in VR too? These two head gestures come naturally and don't require any teaching, and they're distinct and simple enough to detect even without a separate positional tracking camera, yet they're woefully underused in VR. The only game I've come across which uses this (maybe I haven't played enough VR games) is Headmaster – there's a point in the tutorial where the NPC voice asks you to nod your head if you accept the terms and conditions.
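
Detecting these gestures doesn't need anything fancy – rotation data from the HMD is enough. Here's a very rough sketch of a nod/shake classifier over a short window of head angles; the thresholds and window are guesses you'd want to tune:

```python
def classify_head_gesture(pitch_samples, yaw_samples, threshold_deg=10.0):
    """Very rough nod/shake classifier over a short window of HMD angles.

    `pitch_samples` and `yaw_samples` are recent head pitch and yaw values
    in degrees (rotation-only tracking is enough; no positional camera
    needed). If the head swings noticeably more up/down than left/right,
    call it a nod; the reverse, a shake. The 10-degree threshold and the
    2x dominance ratio are assumptions to tune against playtests.
    """
    pitch_range = max(pitch_samples) - min(pitch_samples)
    yaw_range = max(yaw_samples) - min(yaw_samples)
    if pitch_range >= threshold_deg and pitch_range > 2 * yaw_range:
        return "nod"    # yes
    if yaw_range >= threshold_deg and yaw_range > 2 * pitch_range:
        return "shake"  # no
    return None         # no clear answer yet

# e.g. the head bobbing ~10 degrees up and down over the last second:
print(classify_head_gesture([0, -8, 2, -7, 1], [0, 1, -1, 0, 1]))  # -> "nod"
```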

On hands

One of the most important forms of input we have in the real world is our hands, which is why giving players hands in VR can be very powerful. Of course, this is feasible only if you have some way of tracking the player's hands, like motion tracked hand controllers.

Ensure the hands have the right scale

Your brain uses your hands as a reference for the scale of the world. This means that virtual hands which are too large/small, or too far from/near to the camera (our "eyes"), can confuse the player. Most hand tracking devices which have a game engine plugin already take care of the distance from the virtual hands to the camera, which leaves you with just the hand size to worry about.

Unfortunately, there's no hard and fast way to ensure the right hand size. Using the average human hand size is a start, but it really depends on what sort of experience you're creating. The only way to figure out what works for your particular game is to playtest with as diverse an audience as you can get.
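
If you want a starting point before playtesting, here's a tiny sketch of computing a uniform scale for the hand model from an average (or measured) hand length, plus a multiplier you nudge based on feedback. All the constants are assumptions of mine, not values from any SDK:

```python
AVERAGE_HAND_LENGTH = 0.18  # metres, wrist to middle fingertip; rough average

def hand_model_scale(model_hand_length, measured_hand_length=None,
                     playtest_multiplier=1.0):
    """Uniform scale to apply to the virtual hand model.

    Start from the average human hand length (or a per-player measurement
    if your tracking hardware exposes one), then nudge the result with a
    multiplier you settle on through playtesting.
    """
    target = measured_hand_length or AVERAGE_HAND_LENGTH
    return (target / model_hand_length) * playtest_multiplier

# e.g. a hand model authored at 20 cm, scaled down to the average
print(hand_model_scale(0.20))  # -> 0.9
```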

Make the hands generic

Unless you're using a camera to capture the player's real hands, like how Leap Motion does with its image hands, you can forget about giving the virtual hands "character". What do I mean by character? Basically anything which differentiates those hands and makes them feel "yours", like skin color. The reason is that seeing somebody else's hands replace your own is a major immersion breaker. Can you imagine a female player trying your game, only to discover that her virtual hands are big, hairy, dark skinned, and clearly male? How disconcerting is that!

The solution to this is surprisingly simple. Just make the virtual hands wear gloves. Now they fit everybody! And we don't have to stop at gloves. Try cartoon hands, robot hands, animal paws, a sword, even a model of the very controller you're holding! Oh wait, that's already the default for motion tracked hand controllers. And for good reason too – the hardware makers have done lots of playtesting and are offering to take this problem (and the one about scale too) off your hands completely. Do you accept?

Fantastic Contraption takes the default model of the Vive controllers and spruces them up.

Make the hands disappear when you pick up an object

This idea is from Job Simulator. The whole game is about picking stuff up (and throwing it :P). Because we grip every single object differently, the devs would have to come up with a separate grab animation for each object you can pick up, which is totally not feasible given the number of interactable objects in their game.

Their solution? Make the problem disappear, literally. If you take a look at the Job Simulator GIF above, you'll notice how the hands disappear when you grab onto something, only to reappear when you let go of the object. Most people don't even notice the disappearing hands (I bet you didn't), because they're focused on the object they've picked up. Brilliant move by the devs!
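
For what it's worth, here's a sketch of how that trick could look in code – hide the hand mesh on grab, show it again on release. The visibility callback stands in for whatever your engine uses to show or hide a renderer; none of this is Job Simulator's actual code:

```python
class GrabbableHands:
    """Toggle hand visibility instead of authoring per-object grip poses.

    Hide the hand mesh the moment it grabs something, show it again on
    release. `set_hand_visible` is a stand-in for your engine's way of
    showing/hiding a renderer.
    """

    def __init__(self, set_hand_visible):
        self.set_hand_visible = set_hand_visible
        self.held_object = None

    def on_grab(self, obj):
        self.held_object = obj
        self.set_hand_visible(False)  # the player is looking at the object anyway

    def on_release(self):
        self.held_object = None
        self.set_hand_visible(True)

# Usage with a stand-in visibility callback:
hands = GrabbableHands(lambda visible: print("hand visible:", visible))
hands.on_grab("mug")
hands.on_release()
```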

On new users

Most of the people who play your game will be first time VR users. That’s a fact that we can’t avoid. What we can do, however, is design with them in mind. Their first experience is the most important one – it more or less determines if they’re gonna love VR, or hate it and not want to have anything to do with it again.

Ease them into it

I'm going to start off by saying this applies to non-VR games as well. Like all good games, an experience should start off simple and slowly become more difficult as the player becomes more skilled. This is known as flow, a concept which is not specific to video games.

I really like how HTC did their Vive demo when they came to ETC last year. It started off with theBlu, an underwater experience which brings you face to face with a huge blue whale. That experience doesn't require the hand controllers, or even for the player to walk around. All it does is encourage you to look around and be awed by the sense of scale and the majesty of the whale. One of the next few demos was Tilt Brush, which taught you how to use the hand controllers. Finally, the last demo was the Aperture Science one (yes, you do get to see GLaDOS and Atlas!), in which you have to walk around and use the controllers to play the game.

Slow down the pace

While this also applies to experienced VR users, it's crucial for first-timers. Give them time to look around. Remember, VR feels very realistic, and you don't want to bombard them all at once and overload their senses. Slow things down when possible, have natural pauses in the story, or let the player choose their own pace using event triggers based on the player's location. And most importantly, playtest it in VR – most well-paced VR experiences tend to feel a bit draggy if you're just watching them on a screen.
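
One simple way to let players set their own pace is a one-shot proximity trigger – nothing happens until they physically move (or teleport) close enough to where the next story beat should start. A sketch, with names and one-shot behaviour of my own choosing:

```python
import math

class ProximityTrigger:
    """Fire a story beat once, when the player walks into an area.

    Nothing happens until the player moves (or teleports) to within
    `radius` metres of `centre`, so they choose when the experience
    moves forward.
    """

    def __init__(self, centre, radius, on_enter):
        self.centre = centre
        self.radius = radius
        self.on_enter = on_enter
        self.fired = False

    def update(self, player_pos):
        if not self.fired and math.dist(player_pos, self.centre) <= self.radius:
            self.fired = True
            self.on_enter()

# e.g. start the next scene only when the player reaches the doorway
trigger = ProximityTrigger(centre=(4.0, 0.0, 2.0), radius=1.0,
                           on_enter=lambda: print("cue next scene"))
trigger.update((0.0, 0.0, 0.0))  # too far away, nothing happens
trigger.update((3.5, 0.0, 2.2))  # close enough -> "cue next scene"
```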

A great example is Headmaster (yes, this game again – it's really well designed), in which your only action is to head a soccer ball. For non-soccer players, having a ball fly at your face can potentially be very scary. Headmaster mitigates this by having the ball lobbed at you. The high arc of the slow-travelling ball also gives you time to think and aim, instead of just reacting.

A soccer ball flying towards your face can be really scary.

Make failures positive

Again, this is not specific to VR. When your player fails, you want to encourage them, or have something happen that makes the failure seem not so bad, or even worthwhile or interesting. Don't insult them! An example of this is how ILMxLAB did their Star Wars VR experience, Trials on Tatooine. At one point in the game, Han asks you to do something (not gonna spoil it!). Given his sarcastic nature, you'd expect him to make fun of you if you fail, but he doesn't, because an insult in VR feels as bad as if somebody came up to you in real life and called you stupid.

On the other hand, there's a difference between failing because the player isn't skilled enough, and failing because they willfully disobeyed instructions, e.g. you tell them to choose A but they choose B. Think Stanley Parable style. In the latter case, feel free to have some fun with them!


Whew, that was long. I actually still have a few more things, but these are the main points which I really want to capture. I'll probably update this in the future with other topics (social VR, sense of presence, etc.) to be a bit more comprehensive. For now, I don't really have the time with all my deadlines looming, so I'm just gonna end this piece here, very abruptly :D

P.S. my mentor took a look at this article and she wants me to submit it to Gamasutra!! Jesse Schell agrees and has offered to help, except that he thinks it's only 60% complete and I need 40% more content before it's ready :O

4 thoughts on “Designing for VR”

  1. Pingback: GDC ’16 in a nutshell | Sarrie's

  2. A good collection of tips for VR! I liked the tip “Make hands generic” — I worked on a non-VR Myo game where we used the hand models from Leap Motion — a lot of people were discomforted by the large hairy hands in the model — so I guess this isn’t specific to VR either, although it’s probably more of a problem in VR.

    I guess I'll share something I've learned. I worked on a social VR game, and with voice chat, latency (the time from when you start talking to when the other person hears it) matters quite a lot. Some latency is unavoidable because the information has to travel (the speed of light), on the order of ~10ms, but cutting down all other latency can matter a great deal. There seems to be a threshold, maybe in the ballpark of 100ms, below which the conversation feels much more real. It's a subtle detail, but beyond that threshold (whatever it is), you lose a lot of the nuances of speech: it becomes harder to interrupt, since the other person doesn't hear you for the duration of the latency and keeps talking, so the conversation turns into one person saying a whole long contiguous chunk of speech, and then the other person doing the same (kind of hard to describe). It is noticeable though, and with experience, I've learned to somewhat accurately guess the latency on video conferences, phone calls, etc. So, if you make a social VR game, aim for low latency!

  3. This is a really handy list of tips! Having designed a VR experience myself, I can definitely speak to keeping a consistent frame rate and guiding new users around the space. The section on new users is extremely important to keep in mind; on most consoles you can assume your audience has an intuitive understanding of the controller, but with a new platform like VR, players have so much freedom they may not even think about, like looking up or shaking their head. I also really like the examples on encasing UI elements in objects: the exit burrito was pretty fun to see.

  4. Wow. All of the guidelines you’ve outlined here are easy to understand and make sense, and I think this is totally something that would be great for Gamasutra. This should save me some of the pain of trial and error should I ever work on a VR project. I can’t really contribute much to this dialogue since you’ve covered pretty much everything I would have considered and more, so that ought to be a testament to how comprehensive and instructive this was. Thanks, I learned a lot.
