Virtual reality is more immersive when you can pick up objects with your bare hands rather than with a controller or a pair of wand-style remotes. Leap Motion is one of the frontrunners in this area, having pivoted its candy-bar-sized motion-tracking sensor from a desktop accessory into a VR headset companion. To drum up interest in the product — which you still have to attach to an Oculus Rift or HTC Vive yourself — the company has developed a new piece of software called the “Interaction Engine.” Available as an add-on for Unity, it promises a more realistic experience when you interact with make-believe objects.
The big problem, Leap Motion argues, is that traditional game engines weren’t designed with human hands in mind. We move in sudden, unpredictable ways, gripping objects with varying degrees of force and dexterity. When you pick up a sponge, for instance, it should flex and compress where your fingers exert pressure. These nuances are difficult to track and simulate in VR. Push a rubber ball against the floor and most physics engines will be overwhelmed, sending the sphere flying off in a weird, unrealistic direction. The Interaction Engine addresses this by implementing “an alternate set of physics rules” that kick in whenever your hands are touching or “inside” a virtual object.
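To see why a hand can send an object flying, consider how a typical penalty-based physics solver resolves overlap: the deeper one body sits inside another, the larger the corrective impulse. A tracked hand can push an object deep into geometry in a single frame, producing an enormous impulse. The sketch below is a simplified, hypothetical illustration of that failure mode and of the general idea behind switching to gentler contact rules while a hand is touching an object; the function names and constants are made up for illustration and are not Leap Motion’s actual implementation (which ships as a Unity add-on).

```python
# Illustrative sketch only -- names and constants are hypothetical,
# not part of Leap Motion's Interaction Engine API.

def resolve_penetration_naive(depth, stiffness=5000.0, dt=1 / 90):
    """Penalty-force resolution: the corrective impulse grows with
    penetration depth. If a hand shoves an object 10 cm into the floor
    in one frame, the resulting velocity change is huge -- the object
    "flies off" in an unrealistic direction."""
    force = stiffness * depth      # spring-like penalty force (N)
    return force * dt              # velocity change this frame (m/s)

def resolve_penetration_soft(depth, max_speed=0.5, dt=1 / 90):
    """Alternate rule, active while a hand touches or is 'inside' the
    object: separate the bodies at a capped speed instead of applying a
    depth-proportional impulse, so contact stays calm and stable."""
    return min(depth / dt, max_speed)

# A hand presses a rubber ball 10 cm into the floor within one frame.
depth = 0.10
print(f"naive penalty velocity:  {resolve_penetration_naive(depth):.2f} m/s")
print(f"capped contact velocity: {resolve_penetration_soft(depth):.2f} m/s")
```

Under these assumed numbers, the naive rule ejects the ball at several metres per second while the capped rule eases it out at half a metre per second, which is the qualitative difference the “alternate set of physics rules” is meant to deliver.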