
Unity reveals latest AR Companion app feature at Apple WWDC 21

[Image: a phone capturing a photo of a guitar]

At this year's WWDC, Apple announced Object Capture, an exciting new technology for turning photos of real-world objects into 3D models. Unity has been working closely with Apple to bring Object Capture into our Companion-to-Editor workflows.

Since the introduction of Apple's mobile technology, Unity has made it easy for creators to build experiences that harness the latest capabilities of Apple's cutting-edge devices. Coming later this year, the Unity AR Companion app will include Object Capture, allowing XR creators to scan real-world objects and generate 3D assets.

One pain point for many Unity developers is integrating the unpredictable real world into a digital experience: everything from scanning a space and locating objects within it, to embedding that real-world information in an app. The Unity AR Companion app extends authoring beyond the Editor, providing ways to create and capture in context. You can already capture planes, meshes, and other environment data to help you build your app, and we're excited to add real-world objects to that list.

An exciting next step in content creation & AR

The Object Capture project also speaks to a larger trend in creation tools: using companion apps on specific hardware in conjunction with the Unity Editor, so that each device is used for what it's best suited to. An AR-enabled phone is a great device for environment capture, AR session recording, and now object capture.

It's worth noting that this new technology also supports processing any existing photo set. This "traditional camera" use case is important to support, because it opens up the tech to an even wider audience and brings the tantalizing capability of doing photogrammetry with images you've captured in the past. Say you took photos years ago of a favorite, now-lost heirloom; with this tech, you may be able to see that object in full 3D glory again.
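For developers curious about what sits underneath this use case, Apple exposes it through the new PhotogrammetrySession API in RealityKit on macOS. Here's a minimal sketch of pointing it at a folder of existing photos; the paths are placeholders, and this shows the raw Apple API rather than Unity's Companion workflow:

```swift
import Foundation
import RealityKit

// Minimal sketch (requires macOS 12+). Paths are placeholders.
func buildModel() async throws {
    let photos = URL(fileURLWithPath: "/Users/me/Photos/heirloom", isDirectory: true)
    let output = URL(fileURLWithPath: "/Users/me/Models/heirloom.usdz")

    // Unordered sampling suits an arbitrary, pre-existing photo set,
    // as opposed to a guided, sequential capture sweep.
    var config = PhotogrammetrySession.Configuration()
    config.sampleOrdering = .unordered

    let session = try PhotogrammetrySession(input: photos, configuration: config)
    try session.process(requests: [.modelFile(url: output, detail: .reduced)])

    // Watch progress and results as the session works.
    for try await message in session.outputs {
        switch message {
        case .requestProgress(_, let fraction):
            print("progress: \(Int(fraction * 100))%")
        case .requestComplete(_, let result):
            print("complete: \(result)")
        case .requestError(_, let error):
            print("error: \(error)")
        default:
            break
        }
    }
}
```

The detail parameter is the interesting knob: .preview, .reduced, .medium, .full, and .raw trade processing time for fidelity.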

The capture experience

This new Object Capture functionality is built into the iOS version of the Unity AR Companion App. The app will release later this fall, and we'll publish full how-to documentation then, but for today we wanted to give an overall rundown of this workflow and discuss some of the thinking that went into it.

The experience starts in the AR Companion App, where, this fall, you'll find a new mode: Object Capture.  

Before you start capturing photos, you'll be presented with an interactive UI for placing a guide object over the object you want to capture. After lining up the guide, you're free to start taking photos.
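Placing a guide like this is a familiar AR pattern: raycast from a screen tap onto a detected surface and anchor content there. Here's a rough sketch in native ARKit/RealityKit terms; the view controller and translucent shell are illustrative assumptions, and the Companion App itself is built with Unity rather than this code:

```swift
import ARKit
import RealityKit
import UIKit

// Sketch: place a "guide" shell where the user taps, using an ARKit
// raycast against an estimated horizontal plane.
class CaptureViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        let tap = UITapGestureRecognizer(target: self, action: #selector(placeGuide(_:)))
        arView.addGestureRecognizer(tap)
    }

    @objc func placeGuide(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: arView)
        // Raycast from the screen point onto an estimated horizontal plane.
        guard let result = arView.raycast(from: point,
                                          allowing: .estimatedPlane,
                                          alignment: .horizontal).first else { return }

        // Anchor a translucent spherical "shell" at the hit location.
        let anchor = AnchorEntity(world: result.worldTransform)
        let shell = ModelEntity(
            mesh: .generateSphere(radius: 0.15),
            materials: [SimpleMaterial(color: .cyan.withAlphaComponent(0.3),
                                       isMetallic: false)])
        anchor.addChild(shell)
        arView.scene.addAnchor(anchor)
    }
}
```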

Like any photogrammetry process, you'll want to take quite a few photos of the object, from every angle you can. For each photo, we drop a "pin" on the shell to communicate that this angle has been captured. At any point, you can flip the object over or set it on its side to capture the bottom or underhanging edges.

We analyze each photo as you take it, seeking out low-quality images that can lead to a bad result. When we detect a blurry or otherwise unusable photo, we mark its pin red and let you investigate, and optionally delete and re-attempt the photo.
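Unity hasn't published how this quality check works, but a classic heuristic for flagging blurry frames is the variance of the image's Laplacian: sharp images have strong local intensity changes, so a low variance suggests blur. A minimal sketch over grayscale pixel data; the threshold value is an illustrative assumption, not a Unity-published number:

```swift
import Foundation

// Illustrative blur check: variance of the Laplacian over grayscale
// pixels. Few sharp edges means low variance, i.e. likely blur.
func laplacianVariance(pixels: [[Double]]) -> Double {
    guard pixels.count > 2, pixels[0].count > 2 else { return 0 }
    let h = pixels.count, w = pixels[0].count
    var responses: [Double] = []
    responses.reserveCapacity((h - 2) * (w - 2))

    // 4-neighbor Laplacian kernel: up + down + left + right - 4 * center.
    for y in 1..<(h - 1) {
        for x in 1..<(w - 1) {
            let lap = pixels[y - 1][x] + pixels[y + 1][x]
                    + pixels[y][x - 1] + pixels[y][x + 1]
                    - 4 * pixels[y][x]
            responses.append(lap)
        }
    }

    let mean = responses.reduce(0, +) / Double(responses.count)
    return responses.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Double(responses.count)
}

// Hypothetical usage: flag a capture whose variance falls below a tuned cutoff.
func isLikelyBlurry(pixels: [[Double]], threshold: Double = 100.0) -> Bool {
    laplacianVariance(pixels: pixels) < threshold
}
```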

Once you've taken all your photos, it's time to head to Unity on the Mac to process the photos, generate your model, and put it to use.  

We support a few different entry points. First, new for this project, we've added the ability to use local wireless file transfer. We also support our robust Companion Resources Sync workflow, where 'Captured Objects' has been added to the existing resource types, such as image markers and environment scans. Finally, we support simply using a directory of local images, regardless of whether they were captured with the Companion App on an iPhone or with a conventional DSLR camera.

Whichever source you choose, once you've selected your images, it's time to start processing.

Processing happens in two stages. First, a preview-quality model is generated. Using that preview, you can adjust the bounding box and make any necessary translation and rotation adjustments.

Then, with one more button click, the full-quality model is generated. This step reuses much of the data from the preview stage, so despite the higher quality, the second processing stage doesn't take significantly longer than the preview.
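Mapped onto Apple's API, the two stages correspond to two requests at different detail levels against the same PhotogrammetrySession; reusing the session is what makes the second pass cheap. A sketch with placeholder paths (Unity's Editor integration wraps this flow rather than exposing it directly):

```swift
import Foundation
import RealityKit

let images = URL(fileURLWithPath: "/path/to/captured/images", isDirectory: true)
let session = try PhotogrammetrySession(input: images)

// Stage 1: a fast preview-quality model, used to position the bounding
// box and check orientation before committing to a full run.
try session.process(requests: [
    .modelFile(url: URL(fileURLWithPath: "/tmp/preview.usdz"), detail: .preview)
])

// ... the user adjusts bounds, rotation, and translation against the preview ...

// Stage 2: the full-quality model. The session can reuse intermediate
// data from the preview pass, so this costs far less than a cold start.
try session.process(requests: [
    .modelFile(url: URL(fileURLWithPath: "/tmp/final.usdz"), detail: .full)
])

// (In a real tool you'd consume session.outputs, as in the earlier sketch,
// to know when each request finishes.)
```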

And that's it!  The final model is imported into your project as a prefab, ready to be used in your app.

Building in best practices

Testing early versions of the Object Capture mode in the AR Companion App, we realized that we had a great opportunity to use AR to guide the user towards best practices for photogrammetry capture. While the first version had simple written instructions, we were met with the time-honored tradition of users not reading them, simply dismissing the prompt and going in blind. So we introduced the guide object.

While the guide isn't strictly necessary, we found it invaluable for giving the user feedback on how best to take the images, and for maximizing coverage by clearly showing which areas haven't yet been covered. Along with the guide, we introduced the photo pins, which show exactly where you have already captured photos, as well as the low-quality photo detection and feedback.

The guide and pins, while enormously helpful, did introduce a new consideration: since we're not tracking the object being scanned, we can't automatically move the guide when the object moves. If the guide and pins stay where they were after the object has been turned over, the photos represented by the pins no longer correspond to the side of the object now facing the camera. We tried a few clever solutions but ultimately landed on the simplest: we encourage the user to re-place the guide, and reset the pins, after moving the object. We continue to track the total percentage of the object covered, which helps communicate that the previous photos haven't been lost.
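As a rough illustration of that bookkeeping (the types and fields here are hypothetical, not Unity's actual data model), the trick is simply to keep the cumulative photo tally separate from the per-placement pins that get reset:

```swift
// Hypothetical sketch of the coverage bookkeeping described above:
// per-placement pins are cleared when the guide is re-placed, while
// the cumulative count of accepted photos survives the reset.
struct CaptureCoverage {
    private(set) var currentPins: [SIMD3<Float>] = []  // pins on the current guide shell
    private(set) var totalAcceptedPhotos = 0           // survives guide re-placement
    let targetPhotoCount: Int                          // rough goal for "full" coverage

    mutating func recordPhoto(atGuideDirection direction: SIMD3<Float>) {
        currentPins.append(direction)
        totalAcceptedPhotos += 1
    }

    // Called when the user flips the object and re-places the guide:
    // the pins no longer match the visible side, so clear them, but
    // keep the running total so progress isn't "lost".
    mutating func resetForNewPlacement() {
        currentPins.removeAll()
    }

    var coveragePercent: Double {
        min(100, Double(totalAcceptedPhotos) / Double(targetPhotoCount) * 100)
    }
}

// Usage: var coverage = CaptureCoverage(targetPhotoCount: 40)
```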

Overall, we found these AR affordances to be a clear example of AR as a helpful teaching technology, especially for new interactions that involve physically moving around.

Looking forward

This project is exciting to us for a number of reasons, first and foremost because of how deeply it aligns with our mission to continue to democratize content creation.  While some game developers have used photogrammetry in their pipelines for years, it can be a specialized and frustrating process.  Apple's announcement of this functionality means that this capability is now much more accessible to a wide range of creators, and we're looking forward to seeing indie game developers, mid-sized studios, students, and more now start to use real-world object capture in their process.

But what we're maybe even more excited about is how this powerful toolset, now more accessible on everyday devices, can unlock content creation and curation for non-developers. When we started the project, we gathered use cases for the tech and were quickly struck by how much wider the impact can be than just games. For example, the owner of a music store, previously limited to posting images of the instruments coming through the shop, can now use Object Capture to create stunning, realistic, easily shareable captures of each one, ready to share online or (of course, where our hearts are) in AR.

By integrating this powerful creative technology directly into our tools, we open the door to non-traditional users. People like curators, architects, artists, and designers will be able to bring their ideas to life almost instantly. We can’t wait to bring you this feature later this year.

Learn more about building intelligent AR applications with Unity MARS.

