Ronald van Gimst, a senior R&D engineer at NLR, explains what the project aims to accomplish: “In a VR environment, the user sees only the images presented by the VR headset and nothing of the surrounding world. So the user doesn’t even see his own hands in the simulation and gets no haptic feedback when touching virtual objects. This is not a problem when playing computer games for fun, but it does pose a problem when you want to experience highly realistic training situations. The bottom line is that even when wearing a VR headset you want to be able to use physical instruments in a simulation to enhance its realism, because that’s extremely important in military training. Without having to think about it, you must be able to interact with your environment in a natural way.”
This last point presents a major challenge: how do you make actions feel exactly as they do in the real world? It can be achieved in various ways, but the essential requirement is that the physical object occupies exactly the same place as its counterpart in the VR visualisation. When the user touches a physical object in the training environment, the movement must correspond with the one that occurs in the virtual world. By way of example, Van Gimst says that if you want to pick up a coffee cup in the virtual world and the physical cup is 5 centimetres to the left, you will inevitably run into problems.
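The registration requirement Van Gimst describes can be sketched in a few lines: the tracked position of a physical prop must coincide, within some tolerance, with the position at which its virtual twin is rendered. The function names, coordinates, and the tolerance value below are illustrative assumptions, not NLR's actual implementation.

```python
import math

def alignment_error(physical_pos, virtual_pos):
    """Euclidean distance (in metres) between the tracked physical
    position of a prop and the position rendered in the headset."""
    return math.sqrt(sum((p - v) ** 2 for p, v in zip(physical_pos, virtual_pos)))

def is_aligned(physical_pos, virtual_pos, tolerance_m=0.01):
    """True if the prop sits close enough to its virtual counterpart
    for interaction to feel natural (tolerance is an assumed value)."""
    return alignment_error(physical_pos, virtual_pos) <= tolerance_m

# Van Gimst's example: the physical cup is 5 cm to the left of the virtual one.
cup_physical = (0.95, 1.10, 0.40)   # tracked position, metres
cup_virtual  = (1.00, 1.10, 0.40)   # position shown in the VR visualisation
print(is_aligned(cup_physical, cup_virtual))  # a 5 cm offset fails a ~1 cm tolerance
```

A real system would compare full poses (position and orientation) and continuously re-register the tracking and rendering coordinate frames, but the principle is the same: if the error exceeds what the user's hand can absorb unnoticed, the illusion breaks.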