
Apple ARKit To Get People Occlusion, Body Tracking, High Level ‘RealityKit’ Framework


At its annual WWDC conference today, Apple announced major new updates to ARKit, including people occlusion and body tracking.

The company also announced RealityKit, a high-level AR framework, and Reality Composer, an easy-to-use AR creation tool.

People Occlusion, Body Tracking

With previous versions of ARKit, and with Google’s ARCore, virtual objects would always render on top of the camera image. If someone walked in front of an object, it would still be drawn as if the person were behind it. This looks wrong and instantly breaks the illusion that the virtual object is really in the environment.

ARKit 3 introduces real-time people occlusion: if a person walks in front of a virtual object, the object now renders behind them.
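
The article doesn’t name the API, but in ARKit 3 this feature corresponds to the person segmentation frame semantics on the session configuration. A minimal sketch, assuming an existing arView whose session the app controls:

```swift
import ARKit

// Minimal sketch: enabling people occlusion in ARKit 3 (iOS 13+).
// The personSegmentationWithDepth frame semantic asks ARKit to segment
// people in the frame and estimate their depth, so virtual content can
// render behind them. `arView` is assumed to already exist.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
arView.session.run(configuration)
```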

This understanding of human movement can also be used for body tracking, enabling use cases such as animating a virtual character in real time from human movement.
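
In ARKit 3 this is exposed through a dedicated body-tracking configuration that delivers per-frame skeleton data. A sketch of reading joint transforms from a body anchor, assuming a session already running ARBodyTrackingConfiguration (the delegate class name here is illustrative):

```swift
import ARKit

// Sketch: reading skeleton joints from ARKit 3 body tracking.
// Assumes the session is running ARBodyTrackingConfiguration.
class BodyTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // World-space transform of the tracked body's root (hip) joint.
            let rootTransform = bodyAnchor.transform
            // Model-space transform of a single joint, e.g. the head.
            if let headTransform = bodyAnchor.skeleton.modelTransform(for: .head) {
                // Multiply to get the head's world-space pose; a rigged
                // virtual character's head joint could be driven from this.
                let headWorldTransform = rootTransform * headTransform
                _ = headWorldTransform
            }
        }
    }
}
```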

RealityKit & Reality Composer

Until now, most ARKit experiences have been developed with engines like Unity. But Unity was designed for building full-scale games, not AR experiences. For regular app developers looking to add AR elements, it has a steep learning curve and a plethora of irrelevant UI and configuration to deal with.


RealityKit is a new high-level framework from Apple built specifically for AR. It handles all aspects of rendering, including materials, shadows, reflections, and even camera motion blur. It also handles networking for multiplayer AR apps, meaning developers won’t need to be network engineers to build shared AR experiences.

The framework’s rendering engine takes full advantage of hardware acceleration in Apple chips and performant Apple APIs.
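
To illustrate the level of abstraction, here is a minimal RealityKit sketch that places a box on a detected horizontal plane; the box size and material are arbitrary choices, and there is no render loop or shader code to manage:

```swift
import RealityKit
import UIKit

// Minimal RealityKit sketch: place a metallic box on a horizontal plane.
// Rendering (materials, shadows, reflections) is handled by the framework.
let arView = ARView(frame: .zero)

let anchor = AnchorEntity(plane: .horizontal)
let box = ModelEntity(
    mesh: .generateBox(size: 0.1), // 10 cm cube
    materials: [SimpleMaterial(color: .gray, isMetallic: true)]
)
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```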


Apple is also launching a new macOS tool called Reality Composer, which lets developers build AR scenes visually. Developers can add animations like movement, scaling, and spinning, and set them to trigger when a user taps on or comes close to an AR object.

Reality Composer scenes can be integrated directly into iOS apps using RealityKit. Alternatively, developers working at a lower level can use it as a prototyping tool.
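
The integration relies on accessor code that Xcode generates for each Reality Composer project. A hedged sketch, where Experience and loadBox() are hypothetical names generated from a project called “Experience” containing a scene called “Box”:

```swift
import RealityKit

// Hedged sketch: loading a Reality Composer scene with RealityKit.
// `Experience` and `loadBox()` are hypothetical generated names; Xcode
// derives them from the .rcproject file and its scene names.
let arView = ARView(frame: .zero)
do {
    // The loaded scene is an anchor entity, complete with any tap- or
    // proximity-triggered animations authored in Reality Composer.
    let scene = try Experience.loadBox()
    arView.scene.addAnchor(scene)
} catch {
    print("Failed to load Reality Composer scene: \(error)")
}
```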

Simultaneous Cameras, Multiple Faces

ARKit 3 also adds smaller features that enable new use cases. For example, the front and rear cameras can now be used simultaneously, so a user’s facial expressions could drive an AR experience seen through the rear camera.
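
In API terms, this maps to a rear-camera world-tracking session that opts into face tracking from the front camera. A minimal sketch, again assuming an existing arView:

```swift
import ARKit

// Sketch: ARKit 3 simultaneous front/back camera use. A rear-camera
// world-tracking session also delivers ARFaceAnchor updates from the
// front camera, so facial expressions can drive rear-camera content.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTrackingEnabled = true
}
arView.session.run(configuration)
```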

Additionally, the selfie camera can now track multiple faces at once, which could open up interactive facial augmentation experiences similar to multi-person Snapchat filters.
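
A minimal sketch of opting into multi-face tracking; ARKit 3 caps the count at a device-dependent maximum exposed by the configuration class:

```swift
import ARKit

// Sketch: tracking several faces at once with ARKit 3's front camera.
// Each tracked person yields a separate ARFaceAnchor per frame.
let configuration = ARFaceTrackingConfiguration()
configuration.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
arView.session.run(configuration)
```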


