The ARKit Reading List

2018/3/2

I’ve been prototyping an AR toy for a potential application, and along the way I’ve dug deep into Apple’s ARKit and SceneKit.

In case you are still confusing the two: ARKit handles environmental recognition, telling the app whether there’s a surface in the scene (and if so, how far away it is and how big it is); SceneKit handles the 3D rendering of your custom objects.

To create a realistic result, you want ARKit recognizing the environment and SceneKit rendering your virtual objects into it.
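The glue between the two is `ARSCNView`, which owns both an ARKit session and a SceneKit scene. A minimal sketch of that division of labor (the view controller and node names are my own, not from any of the linked posts):

```swift
import UIKit
import ARKit
import SceneKit

class ARViewController: UIViewController {
    // ARSCNView bridges the two frameworks: its `session` is ARKit's
    // tracking pipeline, and its `scene` is an ordinary SCNScene.
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // SceneKit side: place a custom object into the scene graph,
        // half a metre in front of the starting camera position.
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(box)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // ARKit side: start tracking the real-world environment.
        sceneView.session.run(ARWorldTrackingConfiguration())
    }
}
```

ARKit keeps the camera feed and world tracking in sync; SceneKit just sees a scene graph and a camera whose transform ARKit updates every frame.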


To start, Apple has a nifty page on how ARKit works in mostly plain language.

There’s also Apple’s ARKit section in the Human Interface Guidelines, which provides basic dos and don’ts for application and interaction design.


To get your hands dirty, Mark Dawson has a series of 4 posts on Medium (1, 2, 3 and 4) on how to:

  1. Set up your first AR app and understand the relationship between ARSession and SCNScene
  2. Detect surfaces with ARKit
  3. Apply physics to rendered 3D objects (apply gravity, allow objects to bump into one another, etc.)
  4. Apply realistic rendering (known in the industry as Physically Based Rendering, or PBR)
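To give a flavor of steps 1–3, here is a hedged sketch (in Swift rather than Mark’s Objective-C) of enabling plane detection and reacting when ARKit finds a surface; the delegate method is ARKit’s `ARSCNViewDelegate` callback, everything else is illustrative:

```swift
import ARKit
import SceneKit

// Step 2: ask ARKit to look for horizontal surfaces.
// (Assumes `sceneView` is an ARSCNView whose delegate is set to self.)
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
sceneView.session.run(configuration)

// ARSCNViewDelegate: called when ARKit adds a node for a new anchor.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    // The anchor's extent tells you how big the detected surface is.
    let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                         height: CGFloat(planeAnchor.extent.z))
    let planeNode = SCNNode(geometry: plane)
    planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
    planeNode.eulerAngles.x = -.pi / 2  // SCNPlane is vertical by default
    planeNode.physicsBody = SCNPhysicsBody(type: .static, shape: nil)  // step 3: objects can rest on it
    node.addChildNode(planeNode)
}
```

A dynamic physics body on your dropped objects (`SCNPhysicsBody(type: .dynamic, shape: nil)`) then lets gravity pull them onto the static plane.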

The only quibble I’d have is that Mark used Objective-C in his examples. Most developers have moved on to Swift, and Apple’s official documentation is more or less tuned for the Swift community.


Understanding and applying PBR:
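For orientation, PBR in SceneKit mostly comes down to switching the material’s lighting model and supplying the standard texture maps. A minimal sketch (the texture file names are placeholders, not real assets):

```swift
import UIKit
import SceneKit

// Physically based rendering: use the .physicallyBased lighting model
// and feed the standard PBR maps (albedo, roughness, metalness).
let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.diffuse.contents = UIImage(named: "wood-albedo")       // base color
material.roughness.contents = UIImage(named: "wood-roughness")  // micro-surface scatter
material.metalness.contents = UIImage(named: "wood-metalness")  // metal vs non-metal

// PBR looks flat without an environment to reflect; an HDR image on the
// scene's lighting environment supplies realistic ambient light.
sceneView.scene.lightingEnvironment.contents = UIImage(named: "environment-hdr")
```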


More in-depth articles on improving rendered results:


Some of my additional thoughts as a PM:
