Spinning Up in VR — Part 7: Challenges in Virtual Reality

Manorama Jha
9 min read · Dec 2, 2018

Hi, welcome to Spinning Up in VR! I am Manorama and this is the seventh of a 9-part tutorial on Virtual Reality (VR) for beginners.

In the previous parts I talked about HCI in VR and Navigation in VR, and in this chapter I will introduce the challenges in Virtual Reality. If you are interested in reading the previous or the next chapters, the links are at the bottom of this article.

1. Realism | Graphics

Here, we’re going to discuss the challenges of rendering realistic-looking graphics in VR. In VR applications it is very important to create realistic-looking graphical representations of the real world.

  • For instance, in training and therapy, if the graphics do not reflect the real world, the skills acquired in VR won’t transfer to it. In these areas you would like to represent the real world as faithfully as a photograph, which is why we use the term photo-realistic to describe such virtual environments.
  • Other times we might want to create an imaginary fantasy world, but that does not mean we no longer care about realism at all. We still want users to somehow understand and connect the scene with the real world. For this to happen, virtual objects must interact with light in a similar way to how they do in real life. For instance, water reflects light in a particular way, and that is what makes us perceive it as water. In order to create an environment that looks photo-realistic, or at least reflects to some degree how the real world works, we need a basic idea of how lighting works in real life.
  • If you look at the real objects around you, you can see that light interacts with different materials in different ways. Something made of fabric is close to a perfectly diffuse surface, which reflects light evenly in all directions and produces a smooth, even appearance. On the other hand, something made of metal is a specular surface: light is reflected in a particular direction, which produces a highlight.
  • Diffuse reflection is computationally cheap to produce. Specular reflection is mathematically a bit more involved, so it is often more expensive to render. In everyday life, though, surfaces rarely sit at either extreme; most combine the two behaviours (a minimal shading sketch follows this list).
  • When planning how to illuminate your VR environment, there are typically three types of surface you will need to deal with: diffuse, specular, and glossy or mirror-like.
  1. Diffuse surfaces are the cheapest to render.
  2. Specular is a little more complex.
  3. Glossy or mirror surfaces are very expensive.
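
To make the distinction between diffuse and specular reflection concrete, here is a minimal, engine-agnostic shading sketch in Python. It uses Lambert’s cosine law for the diffuse term and a Blinn-Phong term for the specular highlight; the vector names, material parameters, and the two example calls are illustrative assumptions, not the API of any particular engine.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def shade(normal, light_dir, view_dir, base_color,
          k_diffuse=0.8, k_specular=0.3, shininess=32.0):
    """Single-light Lambert (diffuse) + Blinn-Phong (specular) shading.

    All direction vectors point *away* from the surface point.
    """
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)

    # Diffuse: cheap, depends only on the angle between normal and light.
    diffuse = k_diffuse * max(np.dot(n, l), 0.0) * base_color

    # Specular: Blinn-Phong half-vector trick; a higher shininess exponent
    # gives a tighter, more mirror-like highlight.
    h = normalize(l + v)
    specular = k_specular * max(np.dot(n, h), 0.0) ** shininess * np.ones(3)

    return np.clip(diffuse + specular, 0.0, 1.0)

# A fabric-like surface: mostly diffuse, almost no highlight.
print(shade(np.array([0, 1, 0]), np.array([1, 1, 0]), np.array([0, 1, 1]),
            base_color=np.array([0.6, 0.2, 0.2]), k_specular=0.02))

# A metal-like surface: weak diffuse term, strong tight highlight.
print(shade(np.array([0, 1, 0]), np.array([1, 1, 0]), np.array([0, 1, 1]),
            base_color=np.array([0.3, 0.3, 0.3]),
            k_diffuse=0.2, k_specular=0.9, shininess=128.0))
```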

So in VR, or 3D computer graphics in general, when we talk about illumination realism we mean how light is reflected off 3D objects. This is important for creating a realistic and believable environment. There are also practical applications. For example, architects may want to render an architectural design before building it for real, so that they can understand the lighting inside the building as well as the impact the building will have on its surroundings.

2. Realism | Animation

Here, we’re going to discuss the challenges of creating realistic and believable animations in VR. Even a nice-looking computer-generated environment would not feel real if everything stayed still. You would expect to see the leaves on the trees moving in the wind, the birds on the house flying off, and the sun setting or rising so that the scene gradually gets darker or brighter. None of this is conceptually difficult to animate, but it does require a lot of programming and computation.

  • The good news is that most game engines nowadays come with a built-in physics engine that takes care of physical simulations of this kind, so you can just play with the parameters until you are happy with the effect. Such a physics engine can, for example, animate clothing automatically in an optimized way. Another powerful animation tool that often comes as part of a game engine is a particle system, which is typically used to generate effects such as smoke, fire, and snow (a minimal particle-system sketch follows this list).
  • However, the really difficult thing to animate is Superman, or any other 3D virtual character, in VR. There are a lot of things you need to get right when implementing a virtual character. As human beings, we are all very critical observers of virtual characters, so if anything goes even slightly wrong, we will spot it instantly. Many of the challenges in creating believable character animation in VR overlap with the challenge of letting users interact with those characters.
  • Animation is not a problem with most 360-degree videos, since you are simply capturing objects along with their motion. The problem is that, once a video has been captured, all you can do is cut and re-sequence the clips. You cannot manipulate objects in the footage the way you can in model-based VR, so you could not, for example, program the human actors in the video to turn their heads to follow the user, at least not in a naturalistic way.
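
As a concrete illustration of the particle-system idea mentioned above, here is a minimal, engine-agnostic sketch in Python of a falling-snow emitter: every frame it spawns new particles, integrates their motion, and retires the ones that have expired or hit the ground. The class name, parameters, and numbers are illustrative assumptions rather than the API of any real engine.

```python
import random
from dataclasses import dataclass

@dataclass
class Particle:
    x: float
    y: float
    z: float
    vx: float
    vy: float
    vz: float
    life: float                          # seconds remaining

class SnowEmitter:
    """Minimal particle system: spawn, integrate, and retire particles."""

    def __init__(self, spawn_rate=200.0, area=10.0, fall_speed=1.5, lifetime=8.0):
        self.particles = []
        self.spawn_rate = spawn_rate     # particles per second
        self.area = area                 # half-width of the emitter (metres)
        self.fall_speed = fall_speed
        self.lifetime = lifetime
        self._spawn_debt = 0.0           # fractional spawns carried between frames

    def update(self, dt):
        # Spawn new particles near the top of the volume.
        self._spawn_debt += self.spawn_rate * dt
        while self._spawn_debt >= 1.0:
            self._spawn_debt -= 1.0
            self.particles.append(Particle(
                x=random.uniform(-self.area, self.area), y=5.0,
                z=random.uniform(-self.area, self.area),
                vx=random.uniform(-0.2, 0.2), vy=-self.fall_speed,
                vz=random.uniform(-0.2, 0.2), life=self.lifetime))

        # Integrate motion and age, then retire dead or grounded particles.
        for p in self.particles:
            p.x += p.vx * dt
            p.y += p.vy * dt
            p.z += p.vz * dt
            p.life -= dt
        self.particles = [p for p in self.particles if p.life > 0 and p.y > 0]

snow = SnowEmitter()
for _ in range(120):                     # simulate two seconds at 60 fps
    snow.update(1 / 60)
print(len(snow.particles), "snowflakes alive")
```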

3. Navigation

Let’s take a look at navigation in VR. With high-end VR devices that support positional tracking, you can shift your body naturally to see what’s behind an object in front of you and walk around on your own legs, just as you do in real life. In mobile VR, on the other hand, there is no positional tracking, so most of the time you will need to use the VR controllers both to shift your viewpoint and to move to a different place.

  • First of all, not all VR applications require the user to move around that much. In many VR applications in training, therapy, or experimental studies in social psychology, participants are simply asked to sit in a chair. They can still shift their body or lean back and forth, supported by positional tracking, but there is no need for them to walk around. If you do need participants to walk around the VR space, the best approach is to let them move exactly as they do in real life, i.e., using their legs to move forwards or backwards and turning their body to look left or right. This is often called physical navigation.
  • The problem with this is that you are limited by the physical space you’re in. Another problem, obviously, is that it won’t work with mobile VR devices, since they cannot track where the user is in 3D space. The other extreme is virtual navigation, where users control their movements entirely with the joystick or touchpad on their VR controllers. Just as in a 2D game, users push the joystick left and right to look around and forwards and back to move. This isn’t a problem for games on a 2D display, but as we will see, it is problematic in VR.
  • Another way to move from one place to another is teleportation. If you have ever used Google Street View or something similar, you have probably tried it: there are a few fixed positions users can choose to teleport themselves to. A related option is walk-in-place, where users step on the spot to move forward. Of all the methods introduced here, physical navigation is the most natural, but it is constrained by the real-world physical space the user is in. Walk-in-place is less natural and gives no sense of acceleration, but it does not normally cause nausea.

Teleporting is a good way to move around quickly in VR without making the user feel too dizzy, although users often get a little disoriented when they land in a new location. Virtual navigation, which relies entirely on a 2D user interface (the joystick or touchpad), is discouraged because it causes nausea. Finally, in 360-degree videos, users normally cannot navigate the environment freely, although some 360 videos allow them to teleport, as in the Google Street View example mentioned earlier. A minimal teleportation sketch is shown below.
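
To illustrate fixed-position teleportation, here is a small, engine-agnostic sketch in Python: the user points the controller, the anchor closest to the pointing direction (within a tolerance cone) is selected, and the camera rig is moved there instantly. The anchor coordinates, function names, and the 15-degree tolerance are illustrative assumptions.

```python
import math

# Fixed teleport anchors a designer might place in the scene (illustrative).
ANCHORS = [(0.0, 0.0), (4.0, 1.0), (2.5, 6.0), (-3.0, 4.5)]

def pick_anchor(ray_origin, ray_dir, anchors=ANCHORS, max_angle_deg=15.0):
    """Return the anchor closest to the controller's pointing direction.

    Coordinates are (x, z) on the floor plane; an anchor is selectable only
    if it lies within max_angle_deg of the pointing direction.
    """
    length = math.hypot(*ray_dir)
    dx, dz = ray_dir[0] / length, ray_dir[1] / length
    best, best_angle = None, math.radians(max_angle_deg)
    for ax, az in anchors:
        tx, tz = ax - ray_origin[0], az - ray_origin[1]
        dist = math.hypot(tx, tz)
        if dist < 1e-6:
            continue                      # already standing on this anchor
        cos_a = (tx * dx + tz * dz) / dist
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if angle < best_angle:
            best, best_angle = (ax, az), angle
    return best

def teleport(rig_position, target):
    """Move the camera rig instantly. A short fade-out/fade-in around the
    jump (not shown here) helps reduce the disorientation mentioned above."""
    return target if target is not None else rig_position

rig = (0.0, 0.0)
target = pick_anchor(rig, ray_dir=(0.55, 0.83))   # pointing roughly at (2.5, 6.0)
rig = teleport(rig, target)
print("Teleported to:", rig)
```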

4. Nausea

Nausea is one of the common health hazards of VR. Let us discuss how VR triggers nausea and ways to avoid or reduce this uncomfortable experience.

  • Nausea in VR is also called simulation sickness, which refers to the discomfort induced by simulated environments. The main cause of nausea is a conflict between the information the brain receives from the vestibular system and from the visual system. In the real world, you may feel nauseous when you are physically moving while looking at something relatively still. Your vestibular system tells the brain that you are moving, but your visual system says you are not, and that is what makes you feel sick.
  • With simulation sickness, it’s almost the opposite. Say you’re sitting in your living-room chair playing a VR car simulation game. You are not physically moving, but because you are immersed in VR, your visual system is convinced that you are. This causes the same conflict between your vestibular and visual systems, which makes you feel very sick.
  • The best way to avoid simulation sickness is to pay attention to how users navigate the virtual environment. Physical navigation, where users move their own body around in VR just as they do in real life, tends not to cause much nausea.
  • If you don’t want to limit user movement to their physical space, you could consider methods such as walk-in-place or teleporting. The worst method is to directly import the default first-person controller used in many games, where users move around with the keyboard, mouse, or a joystick.
  • Finally, just as not everyone experiences motion sickness in cars, some people are less prone to simulation sickness than others. If you are a VR developer and have spent a lot of time in VR, your brain is probably more used to it. So it is important to remind yourself that your game might feel completely harmless to you but could make other people very sick; the best way to find out is to test it on the typical users your app is designed for.

5. Haptic Feedback

Another big problem in VR interaction is haptic feedback.

  • VR displays are now very good at tricking us into thinking that virtual objects are really in front of us, but our ability to interact with them in a naturalistic way is still quite poor. Thanks to tracking technology and the vibration feedback that current VR controllers provide, we can use them to select and manipulate virtual objects (a simple grab-interaction sketch follows this list). However, to really do this in a naturalistic way, we would need VR haptic gloves.
  • Instead of holding a controller and pressing a button to grab objects, we should be able to just grab them with our hands. There are a few VR gloves available that are well integrated with current VR displays. But to really grab a virtual object and feel that we are holding something in our hands, the gloves should ideally provide force feedback.
  • Gloves that do this are normally called exoskeleton gloves, as they rely on an external skeleton outside the glove to support the force feedback. There are also a couple of bodysuits that provide both force feedback and tactile feedback. They are mainly designed for gameplay, so when you are punched by your opponent you actually feel the force of it.
  • In summary, current VR controllers are largely a concept borrowed from 2D user interfaces and do not really support naturalistic interaction in 3D space. I believe we will soon see VR gloves in the consumer market, some with built-in force feedback, so maybe soon enough you will be able to shake hands with someone virtually. For hardcore gamers, a full-body VR suit could be something to look forward to, making gaming even more immersive.
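
To show what selecting and manipulating a virtual object with a tracked controller might look like, here is a simple, engine-agnostic grab-interaction sketch in Python: while the trigger is held, the nearest object within reach is attached to the controller and follows it, and releasing the trigger drops it. All class names, the grab radius, and the example values are illustrative assumptions, not any real engine’s API.

```python
from dataclasses import dataclass

@dataclass
class Grabbable:
    name: str
    position: tuple              # (x, y, z) in metres
    held_by: object = None

@dataclass
class Controller:
    position: tuple
    trigger_pressed: bool = False
    held_object: Grabbable = None

GRAB_RADIUS = 0.12               # metres; roughly the reach of a hand

def distance(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

def update_grab(controller, objects):
    """Grab the nearest free object within reach while the trigger is held,
    and release it when the trigger is let go."""
    if controller.trigger_pressed and controller.held_object is None:
        in_reach = [o for o in objects
                    if o.held_by is None
                    and distance(o.position, controller.position) <= GRAB_RADIUS]
        if in_reach:
            target = min(in_reach,
                         key=lambda o: distance(o.position, controller.position))
            target.held_by = controller
            controller.held_object = target
    elif not controller.trigger_pressed and controller.held_object is not None:
        controller.held_object.held_by = None
        controller.held_object = None

    # While held, the object simply follows the controller; a real engine
    # would also carry over velocity on release so the object can be thrown.
    if controller.held_object is not None:
        controller.held_object.position = controller.position

cup = Grabbable("cup", (0.0, 1.0, 0.5))
hand = Controller(position=(0.05, 1.02, 0.48), trigger_pressed=True)
update_grab(hand, [cup])
print(cup.held_by is hand)       # True: the cup now follows the controller
```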

However, certain challenges in haptic feedback, such as fine-grained tactile feedback or the sensation of weight when holding an object, remain unsolved for now.

In the later chapters, we will look closely at the applications of VR and the open use-cases in virtual reality. Stay tuned!

I hope that anyone who reads the entire series in order will come away with a clear picture of VR technology. Please feel free to leave your comments below as feedback. You can find me on Twitter (@mnrmja007), Facebook (@mnrmja007) and Medium (Manorama Jha).


Manorama Jha

Software Development Engineer at Gridraster Inc. | Mixed Reality | Computer Vision | www.manoramajha.com