The buzz about augmented reality continues to grow and isn't showing any sign of stopping. In fact, as tech giants introduce the fusion of the digital and the real world to mass markets, users are embracing the unfamiliar with open arms. Adoption seems to be coming more easily for augmented reality (AR) than for virtual reality, and there are a couple of reasons for this. The layered nature of AR (digital enhancements over a real backdrop) might feel more comfortable to new users. And it's available on tools we already own: smartphones and tablets. Simply taking the headset out of the equation removes a barrier to usage that VR still struggles with today.
Pokémon Go is the most widespread adoption of AR so far, breaking usage and revenue records within a week of launching. The game sent players out into the real world to catch Pokémon characters and gather spoils on a scavenger-hunt-like mission. Images were digitally overlaid onto parks, neighbourhoods, famous landmarks, and even the ocean, encouraging players to immerse themselves not only in the game, but in the real world. The blend of screen time and the outdoors was a smashing success; it brought people outside again! For some, it offered renewed hope that technology need not isolate people.
With that hope firmly in place, Apple's ARKit and Google's ARCore are about to cement augmented reality more securely into the zeitgeist. The mass-market launch puts free development tools in the hands of more than 100 million users. By the end of the year, anyone with an iPhone 6s or later, a Google Pixel, a Samsung Galaxy S8, or a device running Android N and up will be supported. Users will be able to explore and create marker-free immersive experiences, meaning that using QR codes to anchor images will become a thing of the past. This new marker-free era of augmented reality tracks the environment around you using the camera, accelerometer, and gyroscope to provide an accurate sense of space and even of objects.
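The camera-plus-sensors idea can be illustrated with a toy complementary filter: a gyroscope gives a fast but drifting angle estimate, an accelerometer a noisy but stable one, and blending them keeps the best of both. This is only a sketch of the concept; the numbers below are invented for illustration, and the real visual-inertial tracking in ARKit and ARCore is far more sophisticated.

```python
# Toy sensor fusion: blend the integrated gyroscope reading with the
# accelerometer's angle estimate. alpha controls how much we trust the gyro.
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Return the new angle estimate in degrees."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulate a device held at a steady 30-degree tilt, with a gyroscope
# that falsely reports a constant 0.5 deg/s drift (all values assumed).
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=30.0, dt=0.01)
```

After a few seconds of simulated time, the estimate settles near the true 30-degree tilt despite the gyro drift: the accelerometer term continually pulls the integrated angle back toward reality.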
Plane detection gauges the geometry of the ground beneath you, as well as tables, chairs, and other flat surfaces. This promotes incredible realism because it enables accurate shadow casting; the user is treated to a previously unseen sense of depth, elevating the experience and the comfort level. But geometry isn't the only high school throwback that makes plane detection and visualization cool. You've got physics, too. Dropping cubes on a table, knocking objects off the edge, bouncing balls: these have limitless potential for realism and physical interaction.
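The ball-bouncing idea boils down to simple physics integration against a detected surface. In this minimal sketch, the plane height stands in for what plane detection would report, and the gravity, time step, and restitution values are all assumptions chosen for illustration:

```python
GRAVITY = -9.8      # m/s^2
RESTITUTION = 0.6   # fraction of speed kept after each bounce (assumed)
plane_y = 0.0       # height of the detected plane (would come from plane detection)

def simulate_drop(y, vy=0.0, dt=0.01, steps=1000):
    """Drop a virtual ball from height y; bounce whenever it hits the plane."""
    for _ in range(steps):
        vy += GRAVITY * dt      # gravity accelerates the ball downward
        y += vy * dt
        if y < plane_y:         # the ball reached the detected surface
            y = plane_y
            vy = -vy * RESTITUTION  # reverse and damp the velocity
    return y, vy

# Drop from one metre above the table; after 10 simulated seconds
# the ball has bounced itself out and rests on the plane.
final_y, final_vy = simulate_drop(y=1.0)
```

Because the plane's height is known from detection, the collision check is a one-line comparison; the same idea extends to knocking objects off a table's edge by also tracking the plane's horizontal extent.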
Another important feature is light estimation. It evaluates the amount of real-world light in the environment, which creators can use to render the virtual content more realistically. Shifts in the light are reflected in virtual objects, so a figure in the sunset will darken appropriately. Subtle visual improvements like these stack up to an immensely upgraded experience.
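In spirit, light estimation amounts to scaling virtual rendering by the measured ambient brightness. Here is a hedged sketch: the reference value of 1000 mirrors ARKit's lumen-based `ambientIntensity` convention (where roughly 1000 means a well-lit scene), but the simple linear mapping is an assumption, not how the engines actually shade.

```python
def apply_light_estimate(base_color, ambient_intensity, reference=1000.0):
    """Darken an (r, g, b) colour in proportion to the estimated real-world light."""
    scale = min(ambient_intensity / reference, 1.0)  # cap at full brightness
    return tuple(round(c * scale) for c in base_color)

# At sunset the estimate drops, so the virtual figure darkens appropriately.
dim = apply_light_estimate((200, 180, 160), ambient_intensity=250.0)
# In a well-lit room the colour is left untouched.
lit = apply_light_estimate((200, 180, 160), ambient_intensity=1000.0)
```

Real renderers feed the estimate into their lighting model rather than tinting pixels directly, but the effect on the user is the same: virtual objects that respond to the room they appear in.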
The last major feature worth mentioning in both ARKit and ARCore is hit-testing, which casts a ray from a point on the screen into the scene and returns the distance to whatever it intersects. This helps with depth perception and with placing objects accurately on detected planes.
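The distance measurement at the heart of hit-testing is ray-plane intersection. This sketch tests a ray against a horizontal plane only; the camera pose and plane height are invented for the example, and real hit tests in ARKit and ARCore also account for plane extents and tracked feature points.

```python
def hit_test(origin, direction, plane_y=0.0):
    """Intersect a camera ray with the horizontal plane y = plane_y.

    Returns (hit_point, distance) or None when there is no hit."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy == 0:
        return None                       # ray is parallel to the plane
    t = (plane_y - oy) / dy               # parameter along the ray
    if t < 0:
        return None                       # the plane is behind the camera
    hit = (ox + t * dx, oy + t * dy, oz + t * dz)
    distance = t * (dx * dx + dy * dy + dz * dz) ** 0.5
    return hit, distance

# A camera 1.5 m above the floor, looking down and forward at 45 degrees.
result = hit_test((0.0, 1.5, 0.0), (0.0, -1.0, -1.0))
```

The ray lands 1.5 m in front of the camera's position on the floor, and the returned distance is what lets an app place a virtual object exactly where the user tapped.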
So what are the limitations of ARKit and ARCore? For one thing, neither platform can track arbitrary objects or hand gestures. All interaction must be screen-based, although other platforms, such as Microsoft HoloLens, do support hand gestures.
Also, virtual content cannot be hidden behind real-world objects. If you created a figure and someone stepped in front of it, the figure would still render in front of them. This ruins the illusion of realness and pulls the user out of the experience. Planes are the exception, however: because a plane is a detected entity with known geometry, a virtual object that falls off one can remain hidden behind it.
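The plane exception can be illustrated with a toy occlusion check: the plane's geometry is known, so the renderer can tell when it sits between the camera and a virtual object. Checking only the vertical axis, as below, is a deliberate simplification of the real per-pixel depth testing.

```python
def occluded_by_plane(camera_y, object_y, plane_y):
    """True when the detected plane lies between the camera and the object
    along the vertical axis, so the object should be hidden."""
    return (camera_y > plane_y > object_y) or (camera_y < plane_y < object_y)

# A ball that fell below the table (plane at y = 0) is hidden from a
# camera held above it; a ball resting on top stays visible.
hidden = occluded_by_plane(camera_y=1.5, object_y=-0.3, plane_y=0.0)
visible = not occluded_by_plane(camera_y=1.5, object_y=0.5, plane_y=0.0)
```

An arbitrary real object (a person, a chair) has no such detected geometry on these platforms, which is exactly why it cannot occlude virtual content.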
So how will augmented reality change the way we live?