Add AR experiences to apps in minutes using Apple’s new Reality Composer for Mac and iOS

“The new tool is a godsend for anyone looking to build AR apps and games”

Augmented Reality could be the new reality for everyone, at least if Apple has its way. Among the slew of announcements at WWDC 2019 were new tools and techniques that could make it extremely simple for developers to add AR experiences to their apps. And judging by what I saw, I wouldn’t be surprised if a whole new generation of AR-based apps and games makes its way onto the App Store.

Take, for example, ARKit 3, the latest set of developer tools for creating AR apps. With ARKit 3, the focus seems to be on people and, well, how real people interact with virtual objects in AR. The new People Occlusion feature lets AR content pass behind and in front of people in the real world, making AR look that much more real and immersive. The new Motion Capture feature tracks a person’s body movements in real time and passes them on as input to the AR experience, a relevant addition for things like educational apps and games. Then there’s RealityKit, a high-level framework that boasts camera effects, photo-realistic rendering, animations, physics and lots more.
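For the curious, here’s a minimal sketch of how a developer might opt in to People Occlusion with ARKit 3. It assumes an iOS 13+ app that already has a RealityKit `ARView` named `arView`; the names of the configuration APIs are real ARKit 3 symbols, but the snippet requires a compatible device to actually run.

```swift
import ARKit
import RealityKit

// A minimal sketch (assumes `arView` is an ARView already set up in the app).
let configuration = ARWorldTrackingConfiguration()

// People Occlusion is only supported on certain hardware, so check first.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // With depth information, virtual content can pass both behind
    // and in front of people in the camera feed.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

arView.session.run(configuration)
```

That one `frameSemantics` flag is essentially the whole opt-in: ARKit handles the person segmentation itself, so the rest of the app’s rendering code stays unchanged.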

But the real highlight, judging by the demo I saw on the sidelines of WWDC, has to be Reality Composer. This is a new app for iOS and Mac, and staying true to its moniker, it helps developers “compose reality” and create interactive AR experiences without any prior 3D experience. The latter is key, since it can potentially save a developer lots of time on an AR-based project. Moreover, Reality Composer provides a GUI that could help even a person without extensive coding skills come up with some fun AR stuff. Reality Composer comes with a massive library of assets built in, all placed at the developer’s disposal. These include 3D models and animations that can simply be dragged and dropped into a project. Furthermore, the developer can even import 3D files in USDZ format into the project.

The virtual objects can be customised in terms of size, style and various other aspects. They can be animated, moved, scaled, and assigned actions that happen when a user interacts with them. Spatial audio can be added as well, effectively bringing a new level of immersion to the scene.
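Once a scene like this is composed, getting it into an app takes very little code. Here’s a hedged sketch: it assumes a Reality Composer file named “Experience.rcproject” containing a scene called “Box” has been added to an Xcode 11 project, in which case Xcode generates a Swift loader for it (the `Experience` type and `loadBox()` method below come from that generated code, so the names depend on the project).

```swift
import RealityKit

// Assumes an Experience.rcproject with a scene named "Box" is in the
// Xcode project; Xcode generates the `Experience.loadBox()` API for it.
let arView = ARView(frame: .zero)

do {
    // Loads the scene with its models, animations, spatial audio and
    // behaviours exactly as they were set up in Reality Composer.
    let boxScene = try Experience.loadBox()
    arView.scene.anchors.append(boxScene)
} catch {
    print("Failed to load Reality Composer scene: \(error)")
}
```

Everything authored visually, including the actions triggered by user interaction, comes along with that single load call, which is what makes the workflow appealing to non-3D developers.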

And since Reality Composer is available on both Mac and iOS, a developer could choose to create a project on Mac and then use an iPad to test it out in a live environment. The whole process is supposed to be seamless, and designed to make things extremely simple for the developer. Heck, I’m not a coder and even I’m tempted to try out Reality Composer.

In case you’re tempted too, Reality Composer for macOS is included in Xcode 11 as one of the developer tools. The iOS app is in beta, and one needs to request access via Apple’s developer website. So go ahead and give it a whirl if you’re so inclined.

Disclosure: this writer attended WWDC 2019 on Apple India’s invitation