Getting Started with ARKit Plane Detection

Sai Balaji
6 min read · Aug 23, 2023


ARKit is an application programming interface (API) that allows developers to build augmented reality apps using the device's CPU, GPU, camera, and motion sensors. ARKit can use SceneKit and SpriteKit to render 3D and 2D content respectively.

Getting started with ARKit

  • Open Xcode and select the iOS Augmented Reality App template
  • Give the project a name and select SceneKit as the content technology

Now open ViewController.swift. It has an IBOutlet for ARSCNView, a view that blends virtual 3D content from SceneKit into your augmented reality experience.

Inside the viewDidLoad() method, remove the existing code and add the following lines.

override func viewDidLoad() {
    super.viewDidLoad()

    // Set the view's delegate
    sceneView.delegate = self

    // Show statistics such as fps and timing information
    sceneView.showsStatistics = true
    sceneView.debugOptions = [.showFeaturePoints, .showWorldOrigin]

    // Create a new scene
    let scene = SCNScene()

    // Set the scene to the view
    sceneView.scene = scene
}

Here we first assign the view's delegate, which conforms to ARSCNViewDelegate. This protocol provides useful methods that mediate the automatic synchronization of SceneKit content with an AR session; we will use those methods later for plane detection.

Then we set showsStatistics, a Boolean value that determines whether SceneKit displays rendering performance statistics in an accessory view.

Then we set some debug options: .showFeaturePoints draws a point cloud showing intermediate results of the scene analysis, and .showWorldOrigin displays a coordinate axis visualization indicating the position and orientation of the AR world coordinate system.

Finally we create an SCNScene and assign it to the ARSCNView.

Now when we run the app we can see that ARKit uses the device camera to detect features and shows them as a point cloud. The more points that appear, the more interesting features ARKit has detected.

In addition, we can see a coordinate axis visualization indicating the position and orientation of the AR world coordinate system.
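These feature points only appear while an AR session is actually running. The AR App template already generates session lifecycle code in ViewController.swift roughly like the sketch below; keep it (or restore something like it if you removed it along with the rest of the template code).

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a world-tracking session configuration
    let configuration = ARWorldTrackingConfiguration()

    // Run the view's session
    sceneView.session.run(configuration)
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)

    // Pause the view's session when the view goes away
    sceneView.session.pause()
}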

Detecting Planes in ARKit

ARKit can detect planes, which can be used to anchor our 3D virtual objects in the real world. ARKit supports both horizontal and vertical plane detection. Here we are going to use horizontal plane detection to anchor our 3D virtual objects.
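Plane detection is opt-in: ARKit only reports plane anchors if you enable it on the session configuration. A minimal sketch, building on the viewWillAppear shown earlier:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    let configuration = ARWorldTrackingConfiguration()

    // Ask ARKit to look for horizontal planes
    // (use [.horizontal, .vertical] to detect both)
    configuration.planeDetection = .horizontal

    sceneView.session.run(configuration)
}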

  • Create an extension of the ViewController class, conform to ARSCNViewDelegate, and implement the following methods (a skeleton is sketched after this list)
  • func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor)
  • func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor)
  • func renderer(_ renderer: SCNSceneRenderer, didRemove node: SCNNode, for anchor: ARAnchor)
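A minimal skeleton of that extension might look like the sketch below; the bodies are filled in over the rest of the article. Note that the Xcode template may already declare ARSCNViewDelegate conformance on the class itself; if so, drop the protocol from the extension header to avoid a redundant-conformance error.

// Skeleton only; the method bodies are implemented step by step below.
extension ViewController: ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        // Called when ARKit adds a new anchor to the scene
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        // Called when ARKit refines an existing anchor
    }

    func renderer(_ renderer: SCNSceneRenderer, didRemove node: SCNNode, for anchor: ARAnchor) {
        // Called when ARKit removes an anchor it no longer tracks
    }
}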

The renderer(_:didAdd:for:) method gets called when ARKit adds a new anchor to the scene. In ARKit an anchor is a connection between the real world and the virtual world. ARKit keeps track of the anchors and uses them to know where to place virtual objects in the real world.

The renderer(_:didUpdate:for:) method is called when ARKit updates existing anchor data, meaning ARKit has refined its understanding of the anchor's position and orientation with respect to the real world. This method is called whenever an anchor in the scene gets updated.

The renderer(_:didRemove:for:) method is called whenever ARKit removes an anchor from the scene, which happens when it is no longer tracking that anchor. We can use this method to clean up the virtual objects attached to those anchors.

All three delegate methods take the same parameters.

  • renderer is the object responsible for rendering the scene; in this case it is the ARSCNView
  • node is the SCNNode attached to the anchor; it usually represents the virtual content attached to the anchor
  • anchor is the ARAnchor that was added, updated, or removed from the scene; it holds information such as position and orientation

In ViewController.swift create a function named createPlane() and add the following code.

func createPlane(planeAnchor: ARPlaneAnchor, node: SCNNode) {
    // Flat plane geometry sized to the detected plane's extent
    let planeGeometry = SCNPlane(width: CGFloat(planeAnchor.extent.x), height: CGFloat(planeAnchor.extent.z))
    planeGeometry.materials.first?.diffuse.contents = UIImage(named: "wood")
    planeGeometry.materials.first?.isDoubleSided = true

    let planeNode = SCNNode(geometry: planeGeometry)
    planeNode.position = SCNVector3(x: planeAnchor.center.x, y: 0.0, z: planeAnchor.center.z)
    // SCNPlane is vertical by default, so rotate it to lie flat on the surface
    planeNode.eulerAngles = SCNVector3(x: Float.pi / 2, y: 0, z: 0)
    node.addChildNode(planeNode)
}

Then inside renderer(_:didAdd:for:) call the function.

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    print("Add")
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    self.createPlane(planeAnchor: planeAnchor, node: node)
}
  • Here we first downcast the ARAnchor to ARPlaneAnchor, since we are only interested in detected planes. Then we call createPlane(), passing the ARPlaneAnchor and the SCNNode as parameters. Inside the function we create a flat plane geometry to visualize the detected plane, setting its width and height to the extent of the plane detected by ARKit.
  • Then we assign a wood texture to the plane geometry.
  • Then we create an SCNNode using the SCNPlane geometry and set its position to the anchor's center. An SCNPlane is vertical by default, so we rotate it by 90° around the x-axis so that it lies flat on the detected surface.
  • Finally we attach our plane node as a child of the node associated with the detected anchor, so that our plane stays attached to the anchor.

Now if you run the app on a physical device, you can see that a flat plane with a wooden texture appears when ARKit detects a horizontal plane. If you move the device around, new planes are added whenever ARKit detects a new surface: every time a new horizontal plane is detected, ARKit calls the renderer(_:didAdd:for:) delegate method, in which we create a flat plane geometry and attach it to the detected horizontal plane.

Next we need to improve this so that whenever ARKit refines its understanding of a detected plane, we also update our flat plane geometry to match the updated plane information. We can do this inside the renderer(_:didUpdate:for:) method, which gets called whenever ARKit refines its understanding of the plane and updates the anchor data.

Inside the renderer(_:didUpdate:for:) method add the following code.

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    print(planeAnchor.center)
    print(planeAnchor.extent)
    self.updatePlane(planeAnchor: planeAnchor, node: node)
}

In ViewController.swift create a function named updatePlane() and add the following code.

func updatePlane(planeAnchor: ARPlaneAnchor, node: SCNNode) {
    if let planeNode = node.childNodes.first,
       let planeGeometry = planeNode.geometry as? SCNPlane {
        // Resize and re-center the plane to match the refined anchor
        planeGeometry.width = CGFloat(planeAnchor.extent.x)
        planeGeometry.height = CGFloat(planeAnchor.extent.z)
        planeNode.position = SCNVector3(x: planeAnchor.center.x, y: 0.0, z: planeAnchor.center.z)
    }
}

Here we first safely retrieve the child node of the node parameter, which is the flat plane we attached in renderer(_:didAdd:for:). We then take its geometry and safely downcast it to SCNPlane so that we can work with the plane we previously attached to the anchor. Finally we update the plane geometry's width and height, and the node's position, to match the refined anchor: extent.x is the width and extent.z is the length of the detected plane (the y component of the extent is effectively zero for a flat plane), while center gives its position within the anchor's coordinate space.

Finally in renderer(_:didRemove:for:) method add the following code.

func renderer(_ renderer: SCNSceneRenderer, didRemove node: SCNNode, for anchor: ARAnchor) {
    // Remove the plane we attached to this anchor
    node.enumerateChildNodes { childNode, _ in
        childNode.removeFromParentNode()
    }
}

The renderer(_:didRemove:for:) method is called when ARKit removes anchors that are no longer being tracked. We use it to clean up by removing the flat planes attached to those anchors.

Now run the app on a physical device.

As you can see, whenever ARKit detects a new anchor we add a wooden flat plane. As it refines its understanding of the scene it updates the anchor information, which you can observe when two separate wooden planes are eventually merged into a single plane.
