Face Tracking with RealityKit
source link: https://www.raywenderlich.com/20591636-face-tracking-with-realitykit
This course is available as part of the raywenderlich.com Professional subscription.
Feb 23 2021 · Video Course (37 mins) · Advanced
Learn how to leverage RealityKit and Reality Composer to build engaging AR experiences that track facial movement and expression, add props and behaviors to face anchors in Reality Composer, and drive animations for 3D content in augmented reality with your facial expressions.
Version
- Swift 5.3, iOS 14, Xcode 12.4
Face Tracking with RealityKit
Set Up a RealityKit Project
2:54 · Free
Find out what RealityKit has to offer and how to set up a RealityKit project with the required permissions.
Learn how to get around in Reality Composer, set up your first face anchor, and add props from Xcode’s built-in library.
Start an ARSession
5:24
Start an ARSession and configure it for face tracking with help from ARKit. Access objects from Reality Composer with Swift.
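The session setup this episode covers can be sketched roughly as follows. This is a minimal sketch, not the course's starter code: the `startFaceTracking(on:)` helper name and the `arView` parameter are illustrative assumptions.

```swift
import ARKit
import RealityKit

// Sketch: configure an ARView's session for face tracking.
// Face tracking requires a device with a TrueDepth camera, so
// check for support before running the configuration.
func startFaceTracking(on arView: ARView) {
    guard ARFaceTrackingConfiguration.isSupported else {
        print("Face tracking is not supported on this device.")
        return
    }
    let configuration = ARFaceTrackingConfiguration()
    arView.session.run(configuration)
}
```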
Set up the app to handle switching between different props by adding and removing anchors from the ARView.
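Swapping props as described above amounts to removing one anchor from the scene and adding another. A hedged sketch, assuming the props are anchor entities loaded from the Reality Composer project (the function and parameter names here are illustrative):

```swift
import RealityKit

// Sketch: switch props by swapping anchors in the ARView's scene.
// `currentProp` and `newProp` would be anchor entities loaded from
// the Reality Composer project's scenes.
func swapProp(in arView: ARView, from currentProp: HasAnchoring?, to newProp: HasAnchoring) {
    if let old = currentProp {
        arView.scene.removeAnchor(old)
    }
    arView.scene.addAnchor(newProp)
}
```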
Set up an ARSessionDelegate to handle live updates to face anchors and drive animation based on where a user is looking.
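The delegate hookup described here follows the standard ARKit pattern; a sketch, with the class name chosen for illustration rather than taken from the course:

```swift
import ARKit

// Sketch: receive live face-anchor updates via ARSessionDelegate.
class FaceTrackingController: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the user is looking,
            // in the face anchor's local coordinate space.
            let lookAt = faceAnchor.lookAtPoint
            // Drive 3D content animation from lookAt here.
            _ = lookAt
        }
    }
}
```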
Access Blend Shapes
1:42
Learn how to access a ton of information about a user’s facial movement via the face anchor’s blend shapes.
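Blend shapes are a dictionary of named coefficients, each ranging from 0.0 (neutral) to 1.0 (fully expressed). A minimal sketch of reading a couple of them; the helper name is an assumption:

```swift
import ARKit

// Sketch: read blend-shape coefficients (0.0–1.0) from a face anchor.
func readExpressions(from faceAnchor: ARFaceAnchor) -> (jawOpen: Float, blinkLeft: Float) {
    let blendShapes = faceAnchor.blendShapes
    let jawOpen = blendShapes[.jawOpen]?.floatValue ?? 0
    let blinkLeft = blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
    return (jawOpen, blinkLeft)
}
```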
Track Jaw Movement
3:21
Use the movement of a user’s jaw to drive the jaw animation of a 3D robot head! Learn a bit about quaternions.
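The quaternion math behind this can be sketched in plain Swift. In RealityKit you would build the rotation with `simd_quatf(angle:axis:)`; the axis-angle formula it uses is shown below. The 30° maximum jaw angle is an illustrative choice, not the course's value.

```swift
import Foundation

// Sketch: an axis-angle quaternion, as simd_quatf(angle:axis:) builds it:
// w = cos(θ/2), (x, y, z) = axis · sin(θ/2).
struct Quaternion {
    var x, y, z, w: Double
    init(angle: Double, axis: (x: Double, y: Double, z: Double)) {
        let half = angle / 2
        let s = sin(half)
        x = axis.x * s
        y = axis.y * s
        z = axis.z * s
        w = cos(half)
    }
}

// Map the jawOpen coefficient (0–1) to a rotation about the x-axis,
// so the robot's jaw swings open in proportion to the user's jaw.
func jawRotation(jawOpen: Double) -> Quaternion {
    let maxJawAngle = 30.0 * .pi / 180  // illustrative maximum, in radians
    return Quaternion(angle: jawOpen * maxJawAngle, axis: (1, 0, 0))
}
```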
Use the movement of a user’s eyes and eyebrows to drive the eyelid animation of a 3D robot head! Apply multiple rotations to a single object.
Try out Reality Composer’s behavior system to add animation and sound effects to your robot experience.
Trigger Behaviors
3:14
Learn how to trigger behaviors from Reality Composer from your Swift code. Add some lights!
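Reality Composer behaviors that use a Notification trigger get generated Swift accessors you can post to. A hedged sketch: the `Experience.Robot` scene type and the `lightsOn` trigger name are placeholders for whatever your .rcproject actually defines.

```swift
import RealityKit

// Sketch: fire a Reality Composer behavior from Swift.
// `Experience.Robot` and `lightsOn` are placeholder names generated
// from a hypothetical .rcproject scene and its notification trigger.
func turnOnLights(in scene: Experience.Robot) {
    // Posting the notification runs the behavior's action sequence.
    scene.notifications.lightsOn.post()
}
```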
Who is this for?
This course is for experienced iOS developers who are comfortable with Swift and have some familiarity with AR technology.
Covered concepts
- RealityKit
- Reality Composer
- ARSession & ARSessionDelegate
- Face Anchors
- Blend Shapes
- Reality Composer Behaviors
Contributors
- Instructor: Catie makes things for, with, and about Apple tech in collaboration with her husband, Jessy! She is inspired by everyone at...
- Illustrator: Graphic Illustrator with a Bachelor’s Degree in Fine Arts. I am a perpetual perfection seeker with a big passion for History...
- © 2021 Razeware LLC