Prerequisites

Argo for iOS makes it easy to work with volumetric content, but there are some aspects of iOS development you should be familiar with first:

  • Xcode for developing apps. We currently require Xcode 9.1.
  • UIKit for managing the UI.
  • Carthage for dependency management.
  • A basic knowledge of SceneKit, which is useful for manipulating 3D assets.

If you get stuck, you can download the finished project from ArgoKit+ARKit-Example.zip. The HVRS file and audio file used in this tutorial are also included in that project.

Installation

Setup

Create a new Xcode project using the Single View App template.

We distribute ArgoKit using Carthage, a decentralized dependency manager that builds your dependencies and provides you with binary frameworks.

In the same folder as your .xcodeproj file, create a text file called Cartfile. Add the following lines to the file and save:

binary "https://d7cvc31wlmbhf.cloudfront.net/8i+Website/Dev+Portal/Downloads/Argo-iOS-CarthageBinary.json" ~> 2.0
binary "https://d7cvc31wlmbhf.cloudfront.net/8i+Website/Dev+Portal/Downloads/HVR-iOS-CarthageBinary.json" ~> 1.1

Usage

Configure your Xcode project

  1. Build your Carthage dependencies. Open Terminal and cd to the directory containing your Cartfile. Run this command:

    carthage update --platform iOS
    
  2. Select your app target in Xcode. Select the General tab. From Finder, drag the ArgoKit.framework from Carthage/Build/iOS/ into the Linked Frameworks and Libraries section. Do not select ‘Copy items if needed’.

  3. ArgoKit also depends on another library that we provide called HVRPlayerInterface.framework. Drag that into the Linked Frameworks and Libraries section as well.

  4. Select your app target, and go to Build Phases.

  5. Add a ‘New Run Script Phase’. Rename it ‘Carthage’.

  6. Below the Shell field, add /usr/local/bin/carthage copy-frameworks as the script body

  7. Add $(SRCROOT)/Carthage/Build/iOS/ArgoKit.framework to Input Files

  8. Add $(SRCROOT)/Carthage/Build/iOS/HVRPlayerInterface.framework to Input Files

  9. Add $(BUILT_PRODUCTS_DIR)/$(FRAMEWORKS_FOLDER_PATH)/ArgoKit.framework to Output Files

  10. Add $(BUILT_PRODUCTS_DIR)/$(FRAMEWORKS_FOLDER_PATH)/HVRPlayerInterface.framework to Output Files

  11. Finally, in Build Settings set Enable Bitcode to No.

Make sure everything builds and runs on a real device; building for the Simulator is not currently supported. To confirm the frameworks were embedded correctly, find the app in the Products folder in the Xcode project navigator, right-click it and select Show in Finder. Then right-click the app in Finder and select Show Package Contents. Open the Frameworks folder and ensure that the ArgoKit and HVRPlayerInterface frameworks are there.

Set up SceneKit

Now you’re ready to start adding ArgoKit to your code.

import UIKit
import ARKit // 1
import ArgoKit // 1
import Constraid

class ViewController: UIViewController {
        var sceneView: ARSCNView = { // 2
                let sv = ARSCNView(frame: UIScreen.main.bounds)
                sv.loops = true // this is necessary
                sv.showsStatistics = true // this is just for testing/debugging
                sv.debugOptions = [.showBoundingBoxes] // this is just for testing/debugging
                return sv
        }()

        // Note: This is using the VisualizePlanesARSceneRendererDelegate class that we provided so
        // that you can see the detected planes. If you don't want this behavior then simply uncomment
        // the line below, and comment out the VisualizePlanesARSceneRendererDelegate() line
        //let sceneRendererDelegate = ArgoSceneRendererDelegate()
        let sceneRendererDelegate = VisualizePlanesARSceneRendererDelegate() // 3

        override func viewDidLoad() {
                super.viewDidLoad()

                view.addSubview(sceneView)
                Constraid.flush(sceneView, withEdgesOf: view).activate()

                sceneView.delegate = sceneRendererDelegate as ARSCNViewDelegate // 3
        }

        override func viewWillAppear(_ animated: Bool) {
                super.viewWillAppear(animated)

                let configuration = ARWorldTrackingConfiguration() // 4
                configuration.planeDetection = .horizontal // 4

                sceneView.session.run(configuration) // 4
        }

        override func viewWillDisappear(_ animated: Bool) {
                super.viewWillDisappear(animated)

                sceneView.session.pause() // 5
        }
}
  1. Import ARKit and ArgoKit into the ViewController.
  2. In the view controller, create an ARSCNView property.
  3. ARSCNView uses a rendering delegate to perform custom rendering. ArgoKit for iOS provides the ArgoSceneRendererDelegate class for you to use. Note: you need to keep this as a property because ARSCNView’s delegate property is a weak reference.
  4. In the ViewController’s viewWillAppear override, create an ARWorldTrackingConfiguration with horizontal plane detection and run the sceneView.session with it.
  5. In the ViewController’s viewWillDisappear override, pause the sceneView.session.

Setup the scene

Add the assets you’ll be using for the HVRS and audio playback.

  1. Drag the example Wrestler.hvrs file from Finder into your project. Make sure to select ‘Copy items if needed’.
  2. Drag the example Wrestler.wav file from Finder into your project. Make sure to select ‘Copy items if needed’.
  3. In the Build Phases of your app’s target, make sure both files are present in the ‘Copy Bundle Resources’ section so that they are copied into your app during building. Sometimes the .hvrs file is not added to the target automatically.

Next, the HVRS file and the audio file will be used to create an Argonaut that can be wrapped in an SCNNode and added to the scene. Add the following properties to your ViewController to hold the Argonaut and the hologram node.

let argonaut: Argonaut = {
        let hvrFilePath = Bundle.main.path(forResource: "Wrestler", ofType: "hvrs")!
        let wavFilePath = Bundle.main.path(forResource: "Wrestler", ofType: "wav")!

        let argonaut = Argonaut(name: "myWrestlerAsset", hvrPath: hvrFilePath, audioPath: wavFilePath)

        return argonaut
}()

var hologramNode: SCNNode!
  1. Create a property that returns an ArgoKit.Argonaut, which is used to create a node and to manipulate the playback state of the actor.
  2. Declare hologramNode, an SCNNode that will be initialized later in the viewDidLoad method.
  3. Set sceneView.scene = SCNScene() in the viewDidLoad method to create the basic scene.
  4. Initialize the hologramNode with hologramNode = Argo.argoNode(argonaut: argonaut)! in the viewDidLoad method.
  5. Position the hologramNode with hologramNode.position = SCNVector3Make(0.00, -1.50, -3.00) in the viewDidLoad method.
  6. Add the hologramNode to the scene with sceneView.scene.rootNode.addChildNode(hologramNode) in the viewDidLoad method.
  7. Add the ViewController as an ArgoKit playback listener with Argo.playback.listeners.add(object: self) in the viewDidLoad method. (A consolidated viewDidLoad sketch follows the extension below.)
  8. To make the previous step compile, and to start playback, extend the ViewController to implement ArgoKit’s PlaybackDelegate as follows.
extension ViewController: PlaybackDelegate {
        func didInitialise(argonaut: Argonaut) {
                Argo.playback.play(argonaut: argonaut)
                print("didInitialise")
        }

        func didPlay(argonaut: Argonaut) {
                //print("didPlay")
        }

        func didStop(argonaut: Argonaut) {
                //print("didStop")
        }

        func didPause(argonaut: Argonaut) {
                //print("didPause")
        }

        func didUpdate(argonaut: Argonaut, progressProportion: Float, time: Float) {
                //print("didUpdate")
        }
}
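
Putting steps 3–7 together, your viewDidLoad should now look something like the sketch below. It merges the new lines with the view setup from the previous section; treat it as a reference rather than code to paste in verbatim.

override func viewDidLoad() {
        super.viewDidLoad()

        // Setup from the previous section
        view.addSubview(sceneView)
        Constraid.flush(sceneView, withEdgesOf: view).activate()
        sceneView.delegate = sceneRendererDelegate as ARSCNViewDelegate

        sceneView.scene = SCNScene() // 3. Create the basic scene

        hologramNode = Argo.argoNode(argonaut: argonaut)! // 4. Wrap the Argonaut in an SCNNode
        hologramNode.position = SCNVector3Make(0.00, -1.50, -3.00) // 5. Place it below and in front of the starting camera position
        sceneView.scene.rootNode.addChildNode(hologramNode) // 6. Add the hologram to the scene

        Argo.playback.listeners.add(object: self) // 7. Receive playback callbacks via the extension above
}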

You should now be able to build and run the app and see the hologram in the scene. However, it will be floating in mid-air, as we haven’t yet handled positioning the hologram onto a detected surface. That comes next.

Place the Hologram on a Surface

Long term, you will likely want to implement a more sophisticated placement solution than this, probably one with a focus square/circle or surface visualization like the one provided in the example project. However, to get you started, you can do simple placement onto surfaces by adding the following to the ViewController.

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let results = sceneView.hitTest(touch.location(in: sceneView), types: [.existingPlane, .estimatedHorizontalPlane])
        guard let hitFeature = results.last else { return }
        let hitTransform = SCNMatrix4(hitFeature.worldTransform)
        let hitPosition = SCNVector3Make(hitTransform.m41,
                                         hitTransform.m42,
                                         hitTransform.m43)

        hologramNode.runAction(SCNAction.move(to: hitPosition, duration: 0.25))
}

The above takes the 2D coordinate of the touch on the sceneView (touch.location(in: sceneView)) and passes it into hitTest to see whether the projection of that 2D point into the 3D environment intersects with an existingPlane or an estimatedHorizontalPlane. If it doesn’t, the method returns early. If it does hit, the hologramNode is moved to the hit location.
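
When you move to a more sophisticated placement approach, one simple refinement is to accept only hits against planes ARKit has actually detected, within their estimated extent. The following is just an illustrative sketch of that idea (it is not part of the example project) and would replace the touchesBegan implementation above:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }

        // .existingPlaneUsingExtent restricts hits to detected planes, within their estimated size
        let results = sceneView.hitTest(touch.location(in: sceneView), types: [.existingPlaneUsingExtent])
        guard let hitFeature = results.first else { return }

        let hitTransform = SCNMatrix4(hitFeature.worldTransform)
        let hitPosition = SCNVector3Make(hitTransform.m41, hitTransform.m42, hitTransform.m43)

        hologramNode.runAction(SCNAction.move(to: hitPosition, duration: 0.25))
}

The trade-off is that placement only works once a plane has been detected, rather than falling back to an estimated surface.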