ARKit 101: How to Place a Virtual Television & Play a Video on It in Augmented Reality


In a previous tutorial, we placed the Mona Lisa on vertical surfaces such as walls, books, and monitors using ARKit 1.5. By combining the power of SceneKit and SpriteKit (Apple's 3D and 2D graphics engines, respectively), we can play a video on a flat surface in ARKit.

In this tutorial, you'll learn how to build an augmented reality app for iPads and iPhones using ARKit. Specifically, we'll go over how to play a video on a 3D TV in ARKit.

What Will You Learn?

We'll be learning how to play a video on a 2D plane using SceneKit and SpriteKit inside of ARKit.

Minimum Requirements

  • Mac running macOS 10.13.2 or later.
  • Xcode 9.4 or above.
  • A device with iOS 11+ on an A9 or higher processor. Basically, the iPhone 6S and up, the iPad Pro (9.7-inch, 10.5-inch, or 12.9-inch; first-generation and second-generation), and the 2017 iPad or later.
  • Swift 4.0. Although Swift 3.2 will work on Xcode 9.4, I strongly recommend downloading the latest Xcode to stay up to date.
  • An Apple Developer account. However, it should be noted that you don't need a paid Apple Developer account. Apple allows you to deploy apps on a test device using an unpaid Apple Developer account. That said, you will need a paid Developer account in order to put your app in the App Store. (See Apple's site to see how the program works before registering for your free Apple Developer account.)

Step 1: Download the Assets You Will Need

To make it easier to follow along with this tutorial, I've created a folder with the required 2D assets and Swift file needed for the project. These files will make sure you won't get lost in this guide, so download the zipped folder containing the assets and unzip it.

Step 2: Set Up the AR Project in Xcode

If you're not sure how to do this, follow Step 2 in our post on piloting a 3D plane using hitTest to set up your AR project in Xcode. Be sure to give your project a different name, such as NextReality_Tutorial9. Be sure to do a quick test run before continuing on with the tutorial below.

Step 3: Import Assets into Your Project

In the project navigator, click on the "Assets.xcassets" folder; we'll add our 2D images there. Then, right-click in the asset list (the left pane of the editor area, to the right of the project navigator). Choose "Import" and add the "overlay_grid.png" from the unzipped Assets folder.

Then, right-click on the "art.scnassets" folder, which is where you will keep your 3D SceneKit format files. Afterwards, select the "Add Files to 'art.scnassets'" option. Then, add the "tv.dae" file from the unzipped "Assets" folder you downloaded in Step 1 above.

Next, once again in the project navigator, right-click on the yellow folder for "NextReality_Tutorial9" (or whatever you named your project). Then, choose the "Add Files to 'NextReality_Tutorial9'" option.

Next, navigate to the unzipped "Assets" folder, and choose the "Grid.swift" file. Be sure to check "Copy items if needed" option and leave everything else as is. Then, click on "Add."

This file will help render an image of a grid for every horizontal plane ARKit detects.

Step 4: Use hitTest to Place 3D TV on Detected Horizontal Plane

To quickly go over ARKit's plane detection capabilities, take a look at our tutorial on horizontal plane detection.

Open the "ViewController.swift" file by double-clicking it. If you want to follow along with the final Step 4 code, just open that link to see it on GitHub.

In the "ViewController.swift" file, modify the scene creation line in the viewDidLoad() method. Change it from:

let scene = SCNScene(named: "art.scnassets/ship.scn")!

To the following (which ensures we are not creating a scene with the default ship model):

let scene = SCNScene()

Next, find this line at the top of the file:

@IBOutlet var sceneView: ARSCNView!

Under that line, add this line to create an array of Grid objects for all horizontal planes detected:

var grids = [Grid]()

Copy and paste the following two methods to the end of the file, before the last curly bracket ( } ). These methods will let us add our Grid to the horizontal planes detected by ARKit as a visual indicator.

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    let grid = Grid(anchor: planeAnchor)
    self.grids.append(grid)
    node.addChildNode(grid)
}

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    let grid = self.grids.filter { grid in
        return grid.anchor.identifier == planeAnchor.identifier
        }.first

    guard let foundGrid = grid else {
        return
    }

    foundGrid.update(anchor: planeAnchor)
}

Let's quickly go over what's happening in these two methods:

  1. renderer(_:didAdd:for:) is called whenever ARKit adds a new node for a newly detected anchor. Here, we attach a Grid (rendered with the grid image we imported) to every plane anchor detected.
  2. renderer(_:didUpdate:for:) is called whenever ARKit refines an existing ARPlaneAnchor, such as when the plane expands. In that case, we want to update and expand our grid as well, which we do by calling update(anchor:) on the matching Grid.
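The downloaded Grid.swift handles the SceneKit rendering, but the bookkeeping these two callbacks rely on is easy to sketch in plain Swift. Below is a hedged, framework-free sketch: PlaneAnchor, extentX, and extentZ are stand-ins for ARPlaneAnchor and its extent vector, not real ARKit types.

```swift
import Foundation

// Stand-in for the two ARPlaneAnchor properties the grid logic needs.
struct PlaneAnchor {
    let identifier: UUID
    var extentX: Float  // mirrors anchor.extent.x (plane width in meters)
    var extentZ: Float  // mirrors anchor.extent.z (plane depth in meters)
}

// The bookkeeping half of Grid.swift: remember the anchor the grid was
// built from, and resize whenever ARKit refines that anchor's estimate.
final class Grid {
    private(set) var anchor: PlaneAnchor
    private(set) var width: Float
    private(set) var depth: Float

    init(anchor: PlaneAnchor) {
        self.anchor = anchor
        width = anchor.extentX
        depth = anchor.extentZ
    }

    func update(anchor: PlaneAnchor) {
        self.anchor = anchor
        width = anchor.extentX
        depth = anchor.extentZ
    }
}

// The same identifier-matching lookup that renderer(_:didUpdate:for:) performs.
var grids = [Grid]()
let anchorID = UUID()
grids.append(Grid(anchor: PlaneAnchor(identifier: anchorID, extentX: 1.0, extentZ: 1.0)))

// ARKit later expands its estimate of the same plane...
let expanded = PlaneAnchor(identifier: anchorID, extentX: 2.0, extentZ: 3.0)
grids.first { $0.anchor.identifier == expanded.identifier }?.update(anchor: expanded)
```

The identifier check is what keeps one Grid per physical plane, even as ARKit fires many update callbacks for it.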

Now, let's enable feature points. Under this line in viewDidLoad():

sceneView.showsStatistics = true

Add the following:

sceneView.debugOptions = ARSCNDebugOptions.showFeaturePoints

Next, let's turn on horizontal plane detection. Under this line in viewWillAppear():

let configuration = ARWorldTrackingConfiguration()

Add the following:

configuration.planeDetection = .horizontal

This is very important! It will ensure that ARKit is able to detect horizontal planes in the real world. The feature points will allow us to see all the 3D points ARKit is able to detect.
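Worth knowing: planeDetection is an OptionSet (ARWorldTrackingConfiguration.PlaneDetection), so since ARKit 1.5 you can request both orientations at once with configuration.planeDetection = [.horizontal, .vertical]. The union behavior is plain Swift OptionSet semantics, sketched here with a stand-in type since the real one only exists on iOS:

```swift
// Stand-in for ARWorldTrackingConfiguration.PlaneDetection, which is an
// OptionSet with .horizontal and .vertical members.
struct PlaneDetection: OptionSet {
    let rawValue: Int
    static let horizontal = PlaneDetection(rawValue: 1 << 0)
    static let vertical   = PlaneDetection(rawValue: 1 << 1)
}

// A single option, as used in this tutorial...
let horizontalOnly: PlaneDetection = .horizontal

// ...and both at once, using OptionSet's array-literal union.
let both: PlaneDetection = [.horizontal, .vertical]
```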

Now, run your app on your phone and walk around. Focus on a well-lit horizontal surface such as the ground or a table; you should be able to see blue grids appear whenever a horizontal plane is detected:

Next, let's add gesture recognizers, which will allow us to detect where to place the TV.

Add this to the end of viewDidLoad():

let gestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(tapped))
sceneView.addGestureRecognizer(gestureRecognizer)

Now, let's add the tapped() method, which converts the 2D coordinate of the tapped location on the screen into a 3D coordinate using hitTest.

Add this to the end of the file, but before the last bracket:

@objc func tapped(gesture: UITapGestureRecognizer) {
    // Get 2D position of touch event on screen
    let touchPosition = gesture.location(in: sceneView)

    // Translate those 2D points to 3D points using hitTest (existing plane)
    let hitTestResults = sceneView.hitTest(touchPosition, types: .existingPlaneUsingExtent)

    guard let hitTest = hitTestResults.first else {
        return
    }
    addTV(hitTest)
}
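The hit-test result we'll hand to addTV() carries a worldTransform: a column-major 4×4 matrix whose fourth column holds the translation, which is why the next step reads columns.3. Here's a framework-free sketch of that layout; the tuple-based Transform below is a stand-in for simd_float4x4, not a real ARKit type:

```swift
// A column-major 4x4 transform stored as four columns, standing in for
// the simd_float4x4 that ARHitTestResult.worldTransform returns.
struct Transform {
    // (column0, column1, column2, column3); each column is (x, y, z, w).
    var columns: (
        (x: Float, y: Float, z: Float, w: Float),
        (x: Float, y: Float, z: Float, w: Float),
        (x: Float, y: Float, z: Float, w: Float),
        (x: Float, y: Float, z: Float, w: Float)
    )

    // The world position of the hit lives in the fourth column -- the
    // same columns.3.x / .y / .z values addTV() feeds into SCNVector3.
    var translation: (x: Float, y: Float, z: Float) {
        (columns.3.x, columns.3.y, columns.3.z)
    }
}

// Identity rotation with the hit point at (0.5, 0.0, -1.2) in world space.
let hitTransform = Transform(columns: (
    (x: 1, y: 0, z: 0, w: 0),
    (x: 0, y: 1, z: 0, w: 0),
    (x: 0, y: 0, z: 1, w: 0),
    (x: 0.5, y: 0.0, z: -1.2, w: 1)
))
```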

Finally, let's add the addTV() method at the end of the file, but before the last bracket:

func addTV(_ hitTestResult: ARHitTestResult) {
    let scene = SCNScene(named: "art.scnassets/tv.scn")!
    let tvNode = scene.rootNode.childNode(withName: "tv_node", recursively: true)
    tvNode?.position = SCNVector3(hitTestResult.worldTransform.columns.3.x,
                                  hitTestResult.worldTransform.columns.3.y,
                                  hitTestResult.worldTransform.columns.3.z)
    self.sceneView.scene.rootNode.addChildNode(tvNode!)
}

This method ensures that we add our 3D TV based on the 3D coordinate calculated by the hitTest. Run the app and tap on a detected horizontal plane. You should now be able to see a TV every time you tap, something like this:

Checkpoint: Your entire project at the conclusion of this step should look like the final Step 4 code on my GitHub.

Step 5: Play a Video on Our 3D TV!

What's cooler than watching a video on your phone? Watching a video in augmented reality on your phone! If you remember, in our last tutorial we placed the Mona Lisa on a wall. Let's use the video from that same tutorial and play it on our 3D TV.

Let's import the video into our project. In the project navigator, right-click on the yellow folder for "NextReality_Tutorial9" (or whatever you named your project). Choose the "Add Files to 'NextReality_Tutorial9'" option and add the "video.mov" file (you should see something like this):

Next, let's go back to our addTV() method.

Right above this line:

self.sceneView.scene.rootNode.addChildNode(tvNode!)

Add this code:

let tvScreenPlaneNode = tvNode?.childNode(withName: "screen", recursively: true)
let tvScreenPlaneNodeGeometry = tvScreenPlaneNode?.geometry as! SCNPlane

let tvVideoNode = SKVideoNode(fileNamed: "video.mov")
let videoScene = SKScene(size: .init(width: tvScreenPlaneNodeGeometry.width*1000, height: tvScreenPlaneNodeGeometry.height*1000))
videoScene.addChild(tvVideoNode)

tvVideoNode.position = CGPoint(x: videoScene.size.width/2, y: videoScene.size.height/2)
tvVideoNode.size = videoScene.size

let tvScreenMaterial = tvScreenPlaneNodeGeometry.materials.first(where: { $0.name == "video" })
tvScreenMaterial?.diffuse.contents = videoScene

tvVideoNode.play()

Here, we load our video into an SKVideoNode and add it to a SpriteKit scene (SKScene). We then size that scene to match the TV screen's plane and set it as the diffuse contents of the screen's "video" material, which attaches the video scene to our TV. Finally, we play the video.
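The only non-obvious numbers in that snippet are the ×1000 scale and the /2 centering. SpriteKit scenes are measured in points while the SCNPlane is measured in meters, so the scene is scaled up to get enough video resolution; and SpriteKit's origin is the bottom-left corner, so the video node is centered at half the scene size. Here's that arithmetic pulled out as plain functions (the function names are mine, not from the tutorial's code):

```swift
// Scale the TV screen's metric plane size up to a point-based SpriteKit
// scene size; 1000 points per meter matches the snippet above.
func videoSceneSize(planeWidth: Double, planeHeight: Double,
                    pointsPerMeter: Double = 1000) -> (width: Double, height: Double) {
    (planeWidth * pointsPerMeter, planeHeight * pointsPerMeter)
}

// SpriteKit's origin is bottom-left, so centering the video node means
// placing it at half the scene's width and height.
func centeredPosition(width: Double, height: Double) -> (x: Double, y: Double) {
    (width / 2, height / 2)
}

// A 1.6 m x 0.9 m screen becomes a 1600 x 900 point scene, centered at (800, 450).
let size = videoSceneSize(planeWidth: 1.6, planeHeight: 0.9)
let center = centeredPosition(width: size.width, height: size.height)
```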

Run the app once again and, now, after placing the TV, the video should start playing and look something like this:

Checkpoint: Your entire project at the conclusion of this step should look like the final Step 5 code on my GitHub.

What We've Accomplished

Success! Using the steps above, we were able to place a 3D TV in augmented reality and play a video on it using ARKit. Imagine the future implications of this kind of AR dynamic. Eventually, when AR glasses are mainstream, we're all going to be able to watch TV on very large screens anywhere. This is already possible with devices like the HoloLens and the Magic Leap One, and now we've done it using ARKit right on our phones. Now try experimenting by taking things to the next level and including your own videos onto the 3D TV.

If you need the full code for this project, you can find it in my GitHub repo. I hope you enjoyed this tutorial on ARKit. If you have any comments or feedback, please feel free to leave them in the comments section. Happy coding!


Cover image & screenshots by Ambuj Punn/Next Reality
