Integrating the RoomPlan API in an iOS app
At WWDC 2022, Apple introduced the RoomPlan API, which uses the LiDAR sensor on supported iPad Pro and iPhone Pro models to create a 3D floor plan of a room, including key characteristics such as dimensions and the types of furniture it contains.
Designing the UI
We are going to keep the app's UI simple. Open Main.storyboard and add the UI as shown below.
Here we have a view controller embedded inside a navigation controller. The view controller contains a table view that lists the scanned models. The ModelViewer VC is used to view the scanned models stored on disk. The RoomScanner VC performs the actual scanning: the UIView highlighted in blue is replaced by the RoomPlan camera view at run time, and the labels display statistical data about the scan. Finally, there is an export button that exports the captured model as a .usdz file and saves it to disk. The saved models are then loaded and displayed in the table view on the first view controller.
Implementing scanning using RoomPlan API
Inside viewDidLoad() in ViewController.swift, add the following code.
override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.
    navigationItem.rightBarButtonItem = UIBarButtonItem(title: nil, image: UIImage(systemName: "camera.aperture"), target: self, action: #selector(scanBtnPressed))
    title = "Room Scanner"
    navigationController?.navigationBar.prefersLargeTitles = true
}
Here we set up the rightBarButtonItem of the navigation bar which, when pressed, navigates to the RoomScanner VC.
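The scanBtnPressed selector referenced in the bar button item is not shown above; a minimal sketch could push the scanner screen from the storyboard (the storyboard identifier "RoomScannerVC" is an assumption and must match whatever you set in Interface Builder):

@objc func scanBtnPressed() {
    // "RoomScannerVC" is an assumed storyboard identifier; set it on the
    // scanner view controller in Main.storyboard.
    if let vc = UIStoryboard(name: "Main", bundle: nil).instantiateViewController(withIdentifier: "RoomScannerVC") as? RoomScannerVC {
        navigationController?.pushViewController(vc, animated: true)
    }
}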
In RoomScannerVC, first import RoomPlan, then create the IBOutlets and the following properties.
@IBOutlet weak var doorCountLbl: UILabel!
@IBOutlet weak var wallCountLbl: UILabel!
@IBOutlet weak var openingCountLbl: UILabel!
@IBOutlet weak var roomScannerView: UIView!
@IBOutlet weak var exportBtn: UIButton! // shown/hidden later when a scan finishes
var roomBuilder = RoomBuilder(options: [.beautifyObjects])
private var roomCaptureView: RoomCaptureView!
private var roomCaptureSessionConfiguration: RoomCaptureSession.Configuration = RoomCaptureSession.Configuration()
private var finalResult: CapturedRoom?
Here RoomBuilder is an object that generates a 3D asset from room-capture data. We pass it the .beautifyObjects option, which realigns chairs around a table, if applicable, in the output for the captured room. Then we create a RoomCaptureSession.Configuration, which holds the settings that manage the room-scanning process. Finally, we create a variable of type CapturedRoom, a struct that provides the key details of a scanned room.
Create a function and add the code below.
func setupRoomCapture() {
    roomCaptureView = RoomCaptureView(frame: roomScannerView.bounds)
    roomCaptureView.captureSession.delegate = self
    roomCaptureView.delegate = self
    roomScannerView.insertSubview(roomCaptureView, at: 0)
}
Here we instantiate the roomCaptureView object, sized to the bounds of the blue UIView from the storyboard. Then we set the capture session delegate, which observes important events in the room-scanning process, and the RoomCaptureView delegate, an object that determines whether to post-process the results of a scan.
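Note that if setupRoomCapture() is called before Auto Layout has settled (for example from viewDidLoad(), as we will do later), the frame taken from roomScannerView.bounds may not track later layout changes. One simple option, my own addition rather than part of the original code, is to give the capture view a flexible autoresizing mask:

func setupRoomCapture() {
    roomCaptureView = RoomCaptureView(frame: roomScannerView.bounds)
    // Keep the capture view sized to its container if the layout changes.
    roomCaptureView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    roomCaptureView.captureSession.delegate = self
    roomCaptureView.delegate = self
    roomScannerView.insertSubview(roomCaptureView, at: 0)
}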
Create an extension that conforms to the RoomCaptureSessionDelegate protocol and implement the following methods.
extension RoomScannerVC: RoomCaptureSessionDelegate {
    func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom) {
        DispatchQueue.main.async {
            self.doorCountLbl.text = "\(room.doors.count)"
            self.openingCountLbl.text = "\(room.openings.count)"
            self.wallCountLbl.text = "\(room.walls.count)"
        }
    }

    func captureSession(_ session: RoomCaptureSession, didEndWith data: CapturedRoomData, error: (Error)?) {
        // Called when the capture is stopped, either normally or with an error.
        if let error {
            print(error)
            return
        }
        Task {
            do {
                let finalRoom = try await roomBuilder.capturedRoom(from: data)
                print(finalRoom.objects)
            } catch {
                print(error)
            }
        }
    }
}
The captureSession(_:didUpdate:) method delivers a live snapshot of the room as the session updates it. We use it to read statistical data from the scanning process, such as the number of openings, doors, and walls detected so far, and update the UI on the main thread, since the scanning process runs on a background thread by default.
Create another extension for RoomScannerVC that conforms to RoomCaptureViewDelegate and implement the following methods.
extension RoomScannerVC: RoomCaptureViewDelegate {
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: (Error)?) -> Bool {
        return true
    }

    func captureView(didPresent processedResult: CapturedRoom, error: (Error)?) {
        // Handle the final post-processed result.
        print(processedResult)
        self.finalResult = processedResult
        exportBtn.isHidden = false
    }
}
The captureView(shouldPresent:error:) method returns a Bool that determines whether the app receives and displays the post-processed scan results when the scan session stops. We want to preview the scanned result, so we return true.
The captureView(didPresent:error:) method provides the delegate with the processed scan results as the view presents them. We assign the result to the finalResult variable of type CapturedRoom and unhide the export button.
Create an IBAction for the export button and add the following code.
@IBAction func exportBtnPressed(_ sender: Any) {
    if let finalResult {
        let fm = FileManager.default
        var path = fm.urls(for: .documentDirectory, in: .userDomainMask).first!
        let fileName = "\(UUID().uuidString).usdz"
        path.appendPathComponent(fileName)
        do {
            try finalResult.export(to: path.absoluteURL)
        } catch {
            print(error)
        }
    }
}
Here we obtain the URL of the documents directory in the app sandbox, create a unique file name for the .usdz model, and then export the scanned model stored in the finalResult variable using the export(to:exportOptions:) method, which saves a captured room to the specified location with the specified export option.
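The method also accepts an explicit export option if the default does not suit you. Inside the same if let finalResult block, a hedged example (the RoomPlan options include .parametric for idealized parametric geometry and .mesh for the scanned mesh):

do {
    // .mesh exports the raw scanned mesh;
    // .parametric exports idealized parametric geometry.
    try finalResult.export(to: path, exportOptions: .mesh)
} catch {
    print(error)
}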
Finally, add the following code to the RoomScannerVC view controller's life-cycle methods.
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    exportBtn.isHidden = true
}

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    startSession()
}

override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.
    navigationItem.largeTitleDisplayMode = .never
    setupRoomCapture()
    navigationItem.rightBarButtonItem = UIBarButtonItem(title: "Stop", image: nil, target: self, action: #selector(stopScanning))
}
Fetching saved models from disk
First, create a custom cell for the UITableView with the following UI components in a XIB.
Then, inside the Swift file of the UITableView cell, add the following code.
import UIKit
import SceneKit

class ModelCell: UITableViewCell {

    @IBOutlet weak var modelPreviewView: SCNView!
    @IBOutlet weak var creationDate: UILabel!
    @IBOutlet weak var modelName: UILabel!

    override func awakeFromNib() {
        super.awakeFromNib()
        // Initialization code
    }

    override func setSelected(_ selected: Bool, animated: Bool) {
        super.setSelected(selected, animated: animated)
        // Configure the view for the selected state
    }

    func updateCell(path: String, modelName: String, creationDate: String) {
        // Load the exported .usdz model from disk into a SceneKit scene.
        guard let url = URL(string: path), let model = try? SCNScene(url: url) else { return }
        model.background.contents = UIColor.gray
        modelPreviewView.allowsCameraControl = true

        // Add a directional light so the preview is not rendered flat.
        let lightNode = SCNNode()
        lightNode.light = SCNLight()
        lightNode.light?.type = .directional
        lightNode.position = SCNVector3(x: 0, y: 10, z: 20)
        lightNode.light?.color = UIColor(red: 0.9686, green: 0.7804, blue: 0.3451, alpha: 0.5374)
        model.rootNode.addChildNode(lightNode)

        modelPreviewView.scene = model
        self.modelName.text = modelName
        self.creationDate.text = creationDate
    }
}
Here we load the model from the file path and render it in the SceneKit view, adding a directional light and enabling camera controls so the preview can be rotated.
In ViewController.swift, create a custom struct with the following properties for the file path and the creation date of the model file.
struct ScannedModel {
    let filePath: String
    let creationDate: String
}
Then create an IBOutlet for the UITableView and an array of ScannedModel objects that will act as the data source for the table view, as shown below.
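For reference, those declarations could look like this (the names scannedResultTV and scannedModels match the code that follows; the outlet must be connected in the storyboard):

@IBOutlet weak var scannedResultTV: UITableView!

// Backing data source for the table view; filled by getFilePaths().
var scannedModels: [ScannedModel] = []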
Create a custom function and add the following code.
func getFilePaths() {
    let fm = FileManager.default
    let path = fm.urls(for: .documentDirectory, in: .userDomainMask).first!
    do {
        let content = try fm.contentsOfDirectory(atPath: path.path)
        for c in content {
            let fileURL = path.appendingPathComponent(c)
            let attributes = try fm.attributesOfItem(atPath: fileURL.path)
            let creationDate = (attributes[.creationDate] as? Date).map { "\($0)" } ?? ""
            self.scannedModels.append(ScannedModel(filePath: fileURL.absoluteString, creationDate: creationDate))
        }
        self.scannedResultTV.reloadData()
    } catch {
        print(error)
    }
}
Here we get the URL of the documents directory, build the file URL for each item it contains, and append a ScannedModel with the file path and creation date of that file to the scannedModels array. Then we reload the table view so it reflects the updated data source.
Create an extension for the ViewController and add the following code.
extension ViewController {
    func getFileCreationDate(path: URL) -> String? {
        do {
            if let date = try FileManager.default.attributesOfItem(atPath: path.path(percentEncoded: false))[.creationDate] as? Date {
                let formatter = DateFormatter()
                formatter.dateFormat = "MM/dd/yyyy"
                return formatter.string(from: date)
            }
        } catch {
            print(error)
        }
        return nil
    }
}
Here we have a custom helper function that formats the file's creation date as MM/dd/yyyy and returns it as a String.
Create another extension for ViewController and implement the methods of the UITableViewDelegate and UITableViewDataSource protocols.
extension ViewController: UITableViewDelegate, UITableViewDataSource {
    func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return scannedModels.count
    }

    func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        if let cell = tableView.dequeueReusableCell(withIdentifier: "CELL") as? ModelCell,
           let fileURL = URL(string: scannedModels[indexPath.row].filePath) {
            cell.updateCell(path: scannedModels[indexPath.row].filePath,
                            modelName: fileURL.lastPathComponent,
                            creationDate: getFileCreationDate(path: fileURL) ?? "")
            return cell
        }
        return UITableViewCell()
    }

    func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
        if let vc = UIStoryboard(name: "Main", bundle: nil).instantiateViewController(withIdentifier: "ModelViewerVC") as? ModelViewerVC {
            vc.modelFilePath = scannedModels[indexPath.row].filePath
            navigationController?.pushViewController(vc, animated: true)
        }
    }
}
In the tableView(_:didSelectRowAt:) method, when a cell is tapped we navigate to ModelViewerVC, passing along the file path of the model in the selected row.
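One piece of wiring the snippets above assume: the XIB-based ModelCell must be registered with the table view, and the table view's delegate and data source must be set, before dequeueReusableCell(withIdentifier: "CELL") can return a ModelCell. A sketch of the extra lines to add to ViewController's viewDidLoad() from earlier, assuming the XIB file is named ModelCell.xib:

// "ModelCell" is the assumed XIB file name; "CELL" must match the reuse
// identifier used in tableView(_:cellForRowAt:).
scannedResultTV.register(UINib(nibName: "ModelCell", bundle: nil), forCellReuseIdentifier: "CELL")
scannedResultTV.delegate = self
scannedResultTV.dataSource = self

// Load any previously exported models from the documents directory.
getFilePaths()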
In ModelViewerVC.swift, add the following code.
import UIKit
import SceneKit

class ModelViewerVC: UIViewController {

    @IBOutlet weak var optionsSegmentControl: UISegmentedControl!
    @IBOutlet weak var modelViewer: SCNView!

    var model: SCNScene!
    var type: SCNLight.LightType!
    var modelFilePath: String = ""

    override func viewDidLoad() {
        super.viewDidLoad()
        print(modelFilePath)
        self.type = .ambient
        configureUI()
        optionsSegmentControl.selectedSegmentIndex = 1
    }

    func configureUI() {
        modelViewer.scene?.rootNode.removeFromParentNode()
        model = try! SCNScene(url: URL(string: modelFilePath)!)
        modelViewer.showsStatistics = true
        modelViewer.allowsCameraControl = true

        // Dark background for wireframe mode, gray otherwise.
        if modelViewer.debugOptions.contains(SCNDebugOptions.renderAsWireframe) {
            model.background.contents = UIColor.black
        } else {
            model.background.contents = UIColor.gray
        }

        // Light the scene with the currently selected light type.
        let lightNode = SCNNode()
        lightNode.light = SCNLight()
        lightNode.position = SCNVector3(x: 150, y: 10, z: 100)
        lightNode.light?.type = self.type
        model.rootNode.geometry?.firstMaterial?.fillMode = .lines
        model.rootNode.addChildNode(lightNode)

        modelViewer.scene = model
    }

    @IBAction func didSegmentValueChange(_ sender: UISegmentedControl) {
        switch sender.selectedSegmentIndex {
        case 0:
            // Wireframe rendering with ambient light.
            modelViewer.debugOptions.insert(SCNDebugOptions.renderAsWireframe)
            self.type = .ambient
            configureUI()
        case 1:
            // Textured mesh with ambient light.
            modelViewer.debugOptions.remove(SCNDebugOptions.renderAsWireframe)
            self.type = .ambient
            configureUI()
        case 2:
            // Textured mesh with a directional light.
            self.type = .directional
            modelViewer.debugOptions.remove(SCNDebugOptions.renderAsWireframe)
            configureUI()
        default:
            break
        }
    }
}
Here we load the model using the file path passed from the previous view controller's tableView(_:didSelectRowAt:) method and render it in a SceneKit view, just as in the table view cell. In addition, a UISegmentedControl lets us change the rendering mode between textured mesh and wireframe, and switch the type of light source.
Now build the project and run it on a physical iOS or iPadOS device that supports LiDAR; the Simulator supports neither the camera nor LiDAR.
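If you want to guard against unsupported hardware at run time, RoomPlan exposes RoomCaptureSession.isSupported, which you could use to hide or disable the scan button; a minimal sketch:

import RoomPlan

// Returns true when the current device can run a RoomPlan capture session.
func deviceSupportsRoomPlan() -> Bool {
    return RoomCaptureSession.isSupported
}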
The entire project can be found here.