
6 feet away: How to create your own distance visualizer on iOS

Well, just how good can Augmented Reality (AR) get? Way better, if Apple has anything to say about it.

ARKit on iOS is a multifaceted package that simplifies the task of building an AR experience. But with the latest updates, it has broken out of its mould to take its place at the top of AR development.

The chief differentiator is its World Tracking — which positions virtual 3D objects in the real world. It tracks and measures distances with greater precision than ever before — especially with its enhanced depth perception and people occlusion (virtual objects are realistically hidden behind people who pass in front of them in the frame).

But first, a look at RealityKit

A big part of quality AR is the integration of virtual objects into the real world. RealityKit is an Apple framework that makes this possible.

It’s built on top of ARKit and is preferred by our developers for 3 chief reasons:

  • Compatible with SwiftUI
  • Directly supports world tracking
  • Easy placement of virtual objects

But (this is the last but, we promise) to integrate RealityKit into an app, we need to understand how ARView and ARSession work.

ARView

This is the view in RealityKit that allows a user to interact with an AR experience. It constructs a Scene from the positions of virtual objects and overlays them on the real world in the view.
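Because ARView is a UIKit view, a common way to host it in SwiftUI is to wrap it in a UIViewRepresentable. Here’s a minimal sketch (the ARViewContainer name is illustrative, not part of RealityKit):

```swift
import SwiftUI
import RealityKit

// Wraps RealityKit’s ARView so it can sit inside a SwiftUI view hierarchy.
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        ARView(frame: .zero)
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

struct ContentView: View {
    var body: some View {
        ARViewContainer()
            .edgesIgnoringSafeArea(.all)
    }
}
```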

ARSession

ARSession is the brain of your AR experience. The session contains all the configuration settings one needs to position objects in a 3D environment.

It works with ARView to keep track of all the virtual objects in space along with the captured feature points in the real world — becoming the bridge between the real world and virtual space.

An ARSession comes embedded in every ARView by default. We can configure how it works by creating a tracking configuration object along with a few other options.

ARWorldTrackingConfiguration

This invites the question:

How does ARKit even know where to position a virtual object?

ARKit supports different types of tracking configurations based on what you want to track. Since we wanted to explore real-world measurements, we went with ARWorldTrackingConfiguration, a configuration that tracks the iOS device’s position and orientation in the real world. This enables it to augment the user’s environment with virtual objects.

Now, we can create an instance of ARWorldTrackingConfiguration to make sure it’s configured with the right options and then pass it on to the session. 

Here’s a code sample:
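(Treat this as a minimal sketch of the setup described below; the variable names, like arView and configuration, are illustrative.)

```swift
import ARKit
import RealityKit

// Create the AR view that will render the scene.
let arView = ARView(frame: .zero)

// Build a world-tracking configuration with the options we care about.
let configuration = ARWorldTrackingConfiguration()

// Plane Detection: look for real-world surfaces like floors, tables, and walls.
configuration.planeDetection = [.horizontal, .vertical]

// Frame Semantics: ask for people and depth information if the device supports it.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

// Scene Understanding: tell the ARView we want occlusion for better depth perception.
arView.environment.sceneUnderstanding.options.insert(.occlusion)

// Run the session on a fresh slate, discarding any previous tracking information.
arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
```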

To start off, we created an instance of ARView to position on the screen. Next, we made an instance of ARWorldTrackingConfiguration — which is super important and needs a few key parameters:

  • Plane Detection: We let the configuration know that we are interested in real world planes. This refers to surfaces like floors, tables, and walls.
  • Frame Semantics: This lets us gear the configuration toward identifying people and distances within the AR environment. Additionally, you need to ensure the device you’re using supports this depth information (an A12 Bionic chip or newer).
  • SceneUnderstanding: We explicitly tell the ARView that we’re interested in occlusion. This helps us render depth perception and obstacles better.
  • Finally, we use the new config object to call the ‘run’ method on the ARView’s session, and make sure that the session starts on a fresh slate without previous tracking information. 


With this newly initialized AR experience, we can add virtual objects into our experience and explore anchors and entities. 

Scene Composition

Every scene in an ARView is composed of anchors and entities:


Anchor:

RealityKit provides a protocol called HasAnchoring. This describes points in the real world that act as hooks, anchoring virtual objects onto real-world surfaces. An important class that conforms to it is AnchorEntity.

Entity: 

Think of this as the atom of an augmented space. It allows you to add characteristics like dimensions, surfaces, and colors that can interact in an AR scene. An Entity is rarely used by itself, so for all practical purposes, developers go with AnchorEntity or ModelEntity.


Model Entity:

Model Entities are virtual objects (with simulated physics) placed in the AR space. Just like their real-world counterparts, they come with attributes like: 

  • Dimensions (length, width, height)
  • Surfaces with texture (called Meshes)
  • Colors, and Lighting Properties
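For example, a simple Model Entity can be built from a mesh and a material like this (the size and color are illustrative):

```swift
import RealityKit
import UIKit

// A 10 cm cube (dimensions are in meters) with a plain blue, non-metallic material.
let boxMesh = MeshResource.generateBox(size: 0.1)
let boxMaterial = SimpleMaterial(color: .blue, isMetallic: false)
let boxEntity = ModelEntity(mesh: boxMesh, materials: [boxMaterial])
```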


We can also add interactivity to our model entities by adding gestures on ARView. These can be linked to the model entity that the gesture is referring to — something that RealityKit calculates in real time. To make all this easy, ARView makes available a few gestures: 

  1. translation: A single touch drag gesture, to move entities along their anchoring plane
  2. rotation:  A multi-touch rotate gesture, to perform yaw rotation
  3. scale: A multi-touch pinch gesture, to scale entities
  4. all: All gesture types


To enable them, we just have to call a function named “installGestures” on the ARView.
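For example (assuming a ModelEntity named discEntity that has already been added to the scene):

```swift
// Gestures hit-test against collision shapes, so generate them first.
discEntity.generateCollisionShapes(recursive: true)

// Let the user drag, rotate, and scale the entity in the AR scene.
arView.installGestures(.all, for: discEntity)
```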

Now that we’ve explored Entities, let’s jump into some code:
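(Again, a minimal sketch: the makeDisc helper, radii, and colors are illustrative, and arView is the view created earlier.)

```swift
import RealityKit
import UIKit

// Builds a flat, disc-shaped Model Entity with a real-world radius in meters.
func makeDisc(radius: Float, color: UIColor) -> ModelEntity {
    // A plane whose corner radius equals half its width renders as a circle.
    let mesh = MeshResource.generatePlane(width: radius * 2,
                                          depth: radius * 2,
                                          cornerRadius: radius)
    let material = SimpleMaterial(color: color, isMetallic: false)
    return ModelEntity(mesh: mesh, materials: [material])
}

// Four concentric discs; the outermost marks roughly 6 feet (about 1.83 m).
let radii: [Float] = [1.83, 1.37, 0.91, 0.46]
let colors: [UIColor] = [.red, .orange, .yellow, .green]

// Group the discs under a single parent entity.
let parent = ModelEntity()
for (radius, color) in zip(radii, colors) {
    parent.addChild(makeDisc(radius: radius, color: color))
}

// Anchor the group to a detected horizontal plane and add it to the scene.
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(parent)
arView.scene.addAnchor(anchor)
```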

These functions generate Model Entities: concentric circular discs with real-world dimensions in meters.

These get overlaid on top of each other. Then, we apply a simple colored material to each Model Entity, which is constructed as a circular plane. Four of these circular planar model entities then get added to a single parent model entity.

The simulated physics for each gets grouped in the parent, which we use to construct an Anchor Entity. This anchor hosts our parent Model Entity, placing the circular discs in the augmented reality scene.

And the final result is this:


Voila! You now have your own 6 feet visualizer at your beck and call!

(You’re welcome)

Reimagine the world you see.

Sai Teja

Sai Teja works at Mutual Mobile as the iOS Technical Lead.
