Object Capture, a photogrammetry API built on the Swift programming language, was announced by Apple at WWDC 2021. The feature will debut on macOS Monterey as part of RealityKit 2, the next version of Apple’s AR engine.
Object Capture combines a series of photographs to generate a 3D model of the subject. Users can capture a sequence of photos with an iPhone, iPad, or other camera, then import them into RealityKit 2 to reconstruct the 3D model. They can also preview the model in AR via AR Quick Look to confirm its accuracy.
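The workflow above maps onto RealityKit 2's `PhotogrammetrySession` API: point a session at a folder of photos, request a model file, and consume its asynchronous outputs. The sketch below illustrates this under stated assumptions: the folder and output paths are hypothetical, and the code requires macOS Monterey, since `PhotogrammetrySession` is only available there.

```swift
import Foundation
import RealityKit

// Hypothetical paths; replace with a real folder of captured photos
// and a desired output location. Requires macOS 12 (Monterey).
let imagesFolder = URL(fileURLWithPath: "/path/to/captured-photos")
let outputModel = URL(fileURLWithPath: "/path/to/model.usdz")

do {
    // Create a photogrammetry session over the folder of photos.
    let session = try PhotogrammetrySession(input: imagesFolder)

    // Request a medium-detail USDZ model file.
    let request = PhotogrammetrySession.Request.modelFile(url: outputModel,
                                                          detail: .medium)
    try session.process(requests: [request])

    // The session reports progress and results asynchronously.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(fraction)")
            case .requestComplete(_, .modelFile(let url)):
                print("Model written to \(url.path)")
            case .processingComplete:
                print("Processing complete")
            default:
                break
            }
        }
    }
} catch {
    print("Reconstruction failed: \(error)")
}
```

The resulting `.usdz` file is the same format AR Quick Look consumes, which is what makes the capture-then-preview loop described above possible.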
The content created using Object Capture can then be deployed into the AR experiences created via Reality Composer or Xcode and third-party platforms like Unity MARS and Maxon’s Cinema 4D.
The furniture retailer Wayfair and the arts-and-crafts marketplace Etsy are among the early adopters of the platform. Wayfair, specifically, is deploying Object Capture to expand the range of products customers can preview via ARKit in its mobile app.
Along with Object Capture, Apple is also introducing a new set of APIs in RealityKit 2 for “more realistic and complex AR experiences with greater visual, audio, and animation control, including custom render passes and dynamic shaders.”
Monterey will arrive as a free software update this fall, but for those who cannot wait until then, it is currently available as a developer beta, and the public beta is set to arrive next month.