Image Tracking and Detection Using ARKit

Sai Balaji
4 min read · Sep 6, 2023

ARKit can detect and react to 2D images in the real world, which can be used to position AR content. ARKit provides a special type of anchor called ARImageAnchor, which is added to the scene when ARKit detects a known image. In this article we are going to build a simple AR app that detects a flat 2D image and places an ARImageAnchor to which we will attach a 3D model.

  • Create an ARKit project with SceneKit and Swift.
  • Inside the viewDidLoad() method, remove the existing sample code and add the code below.
override func viewDidLoad() {
    super.viewDidLoad()

    // Set the view's delegate
    sceneView.delegate = self

    // Show statistics such as fps and timing information
    sceneView.showsStatistics = true

    // Create a new scene
    let scene = SCNScene()
    // Set the scene to the view
    sceneView.scene = scene
}

Here we create an empty SCNScene and set it as the scene of the ARSCNView.

Adding images to AR Resources

  • In Assets.xcassets, add a new AR Resource Group.
  • Inside it, drag and drop the reference image that is to be detected.
  • Provide a name and set the dimensions of the image. Setting the proper dimensions is important: in order to accurately determine the position and orientation of an image in the AR environment, ARKit must know the image’s physical size. You provide this information in your Xcode project’s asset catalog, or when creating an ARReferenceImage programmatically.
  • Inside viewWillAppear() add the following code
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a session configuration
    let configuration = ARImageTrackingConfiguration()
    guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else { return }
    configuration.trackingImages = referenceImages

    // Run the view's session
    sceneView.session.run(configuration)
}

Here we create an ARImageTrackingConfiguration object instead of an ARWorldTrackingConfiguration, as we only need to perform image tracking. Then we load the reference images from the asset catalog, which returns a Set&lt;ARReferenceImage&gt;. Finally, we assign the reference images to the trackingImages property of the configuration and start the session.
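If the images are not known at build time, you can also create reference images programmatically from a CGImage instead of using the asset catalog. A minimal sketch (the image name "marker" and the 0.1 m width are assumptions for illustration):

```swift
import ARKit

// Hypothetical helper: builds a tracking configuration from an image
// bundled as "marker" (name and physical width are assumptions).
func makeImageTrackingConfiguration() -> ARImageTrackingConfiguration? {
    guard let cgImage = UIImage(named: "marker")?.cgImage else { return nil }
    // physicalWidth is the real-world width of the printed image in metres;
    // ARKit derives the height from the image's aspect ratio.
    let reference = ARReferenceImage(cgImage,
                                     orientation: .up,
                                     physicalWidth: 0.1)
    reference.name = "marker"
    let configuration = ARImageTrackingConfiguration()
    configuration.trackingImages = [reference]
    // Track more than one image simultaneously if needed (defaults to 1).
    configuration.maximumNumberOfTrackedImages = 1
    return configuration
}
```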

  • Create an extension of the ViewController class that conforms to ARSCNViewDelegate and add the following code.
extension ViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        DispatchQueue.main.async {
            print(imageAnchor.referenceImage.name ?? "unnamed image")
            self.addPlane(anchor: imageAnchor, node: node)
        }
    }
}

Here we implement the renderer(_:didAdd:for:) method, which is called whenever a new anchor is added to the scene. First we cast the anchor from ARAnchor to ARImageAnchor, and then we call a custom addPlane(anchor:node:) function which takes the ARImageAnchor and the corresponding SCNNode as parameters.
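ARKit also keeps updating the image anchor while the session runs. A small sketch of how you might hide the attached content when the image leaves the camera view, using the same delegate (the relevant property is ARImageAnchor's isTracked):

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    // isTracked becomes false when the reference image is no longer visible,
    // so hide the attached content instead of leaving it floating in space.
    DispatchQueue.main.async {
        node.isHidden = !imageAnchor.isTracked
    }
}
```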

  • Create an addPlane(anchor:node:) method with the following code.
func addPlane(anchor: ARImageAnchor, node: SCNNode) {
    let plane = SCNPlane(width: anchor.referenceImage.physicalSize.width,
                         height: anchor.referenceImage.physicalSize.height)
    plane.materials.first?.diffuse.contents = UIColor.red
    plane.materials.first?.isDoubleSided = true
    let planeNode = SCNNode(geometry: plane)
    // The node passed to the delegate is already placed at the anchor's
    // position, so the plane only needs to be rotated to lie on the image.
    planeNode.eulerAngles = SCNVector3(x: -Float.pi / 2, y: 0.0, z: 0.0)
    node.addChildNode(planeNode)
}

Here we create a simple SCNPlane to visualize the image anchor added to the scene. Because planeNode is a child of the node ARKit places at the anchor, its position is already relative to the anchor and does not need to be set. By default an SCNPlane is vertical, so we rotate it by 90 degrees around the x-axis so that it lies flat on the detected image. Now run the app on a physical device to view the SCNPlane, which is added when ARKit detects the image and adds an image anchor.

Loading a 3D-Model when an image is detected

When an image is detected, ARKit adds an ARImageAnchor to the session. Anchors in ARKit represent features detected in the real world and provide information about the position and orientation of the detected feature. But in order to actually attach a virtual object to the anchor, we use the SCNNode given by the renderer(_:didAdd:for:) method. Here we are going to use the Fender Stratocaster guitar 3D model provided by Apple.

  • Drag and drop the 3D model into the art.scnassets group and convert the .usdz file to a .scn file by going to Editor -> Convert to SceneKit file format (.scn).
  • Replace the body of the addPlane(anchor:node:) method with the code below.
func addPlane(anchor: ARImageAnchor, node: SCNNode) {
    guard let guitarScene = SCNScene(named: "art.scnassets/fender.scn"),
          let guitarNode = guitarScene.rootNode.childNode(withName: "fender_stratocaster", recursively: true) else { return }
    // Position is relative to the anchor's node; lift the model slightly above the image
    guitarNode.position = SCNVector3(x: 0.0, y: 0.02, z: 0.0)
    guitarNode.scale = SCNVector3(0.002, 0.002, 0.002)
    // Rotate the model continuously around its y-axis
    guitarNode.runAction(SCNAction.repeatForever(SCNAction.rotateBy(x: 0.0, y: 1.0, z: 0.0, duration: 1.0)))
    node.addChildNode(guitarNode)
}

Here we first load fender.scn from the art.scnassets directory in the app bundle. Then, from the loaded SCNScene, we extract the SCNNode named fender_stratocaster and set its scale. Finally, we create an SCNAction that rotates the node repeatedly around the y-axis, and attach the node as a child of the node provided for the ARImageAnchor so that our 3D model is placed at the location represented by the anchor.
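As an alternative to converting the file to .scn, SceneKit can also load a .usdz file directly through SCNReferenceNode. A sketch, assuming the model is bundled as fender.usdz (the file name is an assumption):

```swift
// Alternative loading path: SCNReferenceNode loads the .usdz on demand,
// without converting it to .scn first (file name is an assumption).
func loadGuitarNode() -> SCNNode? {
    guard let url = Bundle.main.url(forResource: "fender", withExtension: "usdz"),
          let referenceNode = SCNReferenceNode(url: url) else { return nil }
    referenceNode.load() // synchronously loads the referenced scene's contents
    return referenceNode
}
```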

Now run the app on a physical device.
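If you also need full world tracking (for example, plane detection) while reacting to images, the same reference images can instead be assigned to a world-tracking configuration's detectionImages; a sketch:

```swift
// Alternative: detect the same images while full world tracking is running.
let configuration = ARWorldTrackingConfiguration()
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) {
    configuration.detectionImages = referenceImages
    // Optionally keep tracking some of them continuously as well (iOS 12+).
    configuration.maximumNumberOfTrackedImages = 1
}
sceneView.session.run(configuration)
```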
