Augmented reality is awesome. Whether you’re trying out Google Glass for the first time or killing virtual zombies in The Rolling Dead, AR makes you feel like the coolest kid on the block. But let’s face it, not everyone knows exactly what the term means, or exactly what makes it so awesome. Here are the questions you’ve always pretended to know the answers to, answered by our very own vision engineer for Sphero’s augmented reality technology.
Augmented Reality – A technology that allows virtual objects and events to be added to digital representations of reality. Primarily, this applies to rendering virtual objects into real video feeds, like the one in your smartphone. One example is Sharky the Beaver, who is superimposed on top of the physical object, Sphero.
Mixed Reality – In terms of gaming, mixed reality refers to drawing real-world objects and virtual elements into gameplay simultaneously. Sphero provides a mixed reality gaming experience because while you control the ball via an app on your smartphone, the ball itself is a real-life object rolling through your physical space.
Fiducial / Marker – An object that is trackable in video using computer vision techniques. Such objects range from very simple shapes (e.g. Sphero) to complex printed images to full 3D objects.
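To make the idea concrete, here is a toy sketch of marker tracking, assuming the fiducial (say, a brightly glowing Sphero) shows up as the brightest pixels in a grayscale frame. Real trackers use far more robust computer vision than this, and the threshold value is purely illustrative.

```python
def find_marker(frame, threshold=200):
    """Return the (row, col) centroid of pixels brighter than threshold,
    or None if the marker is not visible in this frame."""
    hits = [(r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if value >= threshold]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

# A tiny 4x4 "frame" with a bright 2x2 blob in the lower-right corner.
frame = [
    [10, 12,  11,  9],
    [13, 10,  12, 11],
    [ 9, 11, 250, 255],
    [10, 12, 252, 251],
]
print(find_marker(frame))  # → (2.5, 2.5), the center of the bright blob
```

Once you can locate the fiducial in every frame, you know where on the screen to draw your virtual content.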
Superimpose – This is the same as ‘overlay’. Sharky the Beaver is superimposed on top of Sphero, the fiducial, during gameplay to create a dynamic augmented reality experience.
3D – Once we have coordinates for the position of the device’s camera and the objects we care about (Sphero), existing 3D technology lets us draw whatever we want (Sharky). This 3D rendering is then superimposed on the physical object.
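A minimal pinhole-camera sketch of that last step: given an object’s position in camera space, compute the 2D pixel where the virtual character should be drawn. The focal length and image center below are illustrative values, not Sphero’s actual camera calibration.

```python
def project(point, focal=500.0, cx=320.0, cy=240.0):
    """Project a camera-space 3D point (x, y, z) to a pixel (u, v)."""
    x, y, z = point
    if z <= 0:
        return None            # behind the camera: nothing to draw
    u = cx + focal * x / z     # the perspective divide maps 3D to 2D
    v = cy + focal * y / z
    return (u, v)

print(project((0.5, -0.25, 2.0)))  # → (445.0, 177.5)
```

This is why tracking the camera’s position matters so much: move the phone, and the same 3D point lands on a different pixel.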
Accelerometer – This is one of the solid-state sensors in Sphero. It measures Sphero’s linear acceleration, which captures both changes in speed and the sudden jolt of a collision. This allows augmented reality gameplay with Sphero to seem especially lifelike, because 3D characters like Sharky are directly impacted when Sphero runs into something like a wall.
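One simple way a collision can be spotted in accelerometer data is a sudden spike in acceleration magnitude. This is only a sketch of the idea, with a made-up threshold; real firmware tunes this far more carefully.

```python
import math

GRAVITY = 9.81           # m/s^2, roughly what the sensor reads at rest
IMPACT_THRESHOLD = 25.0  # m/s^2, an illustrative spike level

def is_collision(sample):
    """Flag a collision when the acceleration magnitude spikes well
    above what normal driving produces."""
    ax, ay, az = sample
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return magnitude > IMPACT_THRESHOLD

print(is_collision((0.0, 0.0, 9.8)))   # resting: False
print(is_collision((30.0, 5.0, 9.8)))  # wall hit: True
```

When the game detects a spike like this, it can make Sharky react at the exact moment the ball hits the wall.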
Gyroscope – This is the other sensor in Sphero that measures how the ball is rotating in space, like the spin of a baseball.
Sensor Fusion Algorithm – This is a bunch of math that combines parts of the accelerometer and gyroscope measurements into an orientation for Sphero – that is, in which direction he is pointing in space (the same way a flashlight has a 3D orientation and as a result, lights up a section of the room). This allows us to render Sharky facing the right direction and to know what he’s doing, even when he’s off camera.
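The simplest kind of sensor fusion is a one-axis complementary filter: trust the gyroscope for fast changes, and the accelerometer as a drift-free long-term reference. Sphero’s real fusion algorithm is more sophisticated than this sketch, and the 0.98 blend factor is just a common textbook choice.

```python
def fuse(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate with the accelerometer's angle
    estimate into one orientation estimate (degrees)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

angle = 0.0
# Simulated samples: the gyro reports rotation at 10 deg/s while the
# accelerometer keeps reading something close to the true tilt.
for gyro_rate, accel_angle in [(10.0, 0.6), (10.0, 1.1), (10.0, 1.6)]:
    angle = fuse(angle, gyro_rate, accel_angle, dt=0.05)
    print(round(angle, 3))
```

The gyro term keeps the estimate responsive frame to frame, while the small accelerometer term slowly pulls it back toward reality so errors never pile up.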
Sensor Data – This is the driving force behind all of Sphero’s AR. The data consists of the raw numbers that come straight out of Sphero’s sensors, and it has two components: signal and noise. We use software techniques to filter out the noise, leaving only the signal, which is then fed to the fusion algorithm.
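An exponential low-pass filter is one common way to separate signal from noise in a stream of raw readings. Sphero’s actual filtering is more involved; the smoothing factor 0.2 here is illustrative.

```python
def low_pass(samples, alpha=0.2):
    """Smooth a stream of raw readings: each output leans mostly on the
    previous estimate, so high-frequency noise averages out."""
    estimate = samples[0]
    out = [estimate]
    for raw in samples[1:]:
        estimate = alpha * raw + (1 - alpha) * estimate
        out.append(estimate)
    return out

noisy = [1.0, 1.4, 0.7, 1.2, 0.9, 1.1]  # true signal is roughly 1.0
smoothed = low_pass(noisy)
print([round(v, 3) for v in smoothed])
```

Notice how the smoothed values hover near 1.0 while the raw readings bounce around it; that steadier stream is what the fusion algorithm actually wants to see.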
Vision SDK – This software development kit (SDK) gives devs access to our vision algorithms, which compute useful information like the camera position/orientation and Sphero position/orientation from the video feed so that they can make their own apps.
Unity Vision plugin – This allows software developers to use our Vision SDK from the Unity game engine, making it easy to deploy AR apps for both Android and iOS devices without having to write them twice.
Now that you’re an expert in augmented reality terminology, you’re ready to dive deeper. Check out our articles on Understanding Augmented Reality and Sphero Augmented Reality to get an even better idea of the tech behind the Sphero experience. For more information about our SDKs, head over to the developer page.