Extended Reality Venture Blog Post, part 2

October 26, 2021 | 15 minute read

Hi community,

My name is Michael Spiess, and I am the team lead for the Extended Reality venture at SAP. Building on the first blog post, I’d like to share some additional insights into one of the two key areas in which we’re currently investing: “Augmented and Mixed Reality for Service, Maintenance and Quality Inspections of Assets“. It will be followed later by another blog post about our second key area, “Business Process Modeling and Training in Virtual Reality”. In these challenging days of dealing with the current pandemic, Extended Reality has never been more important.

In this blog post, I’ll present the AR showcase app for SAP Field Service Management. The app was built by my venture team – a group of cloud, mobile, and 3D developers who are passionate about proving that AR model tracking by Visometry not only helps users – mainly service engineers and field workers – to get their job done faster and better, but also lets them have fun while doing it.

Yes, this app is really fun! The moment the digital CAD model is positioned exactly on its physical counterpart (“superimposed”), you can see what’s inside the machine. I often use the term “x-ray” view, but the view is actually much better than a static black-and-white picture. You can hover over the machine with your mobile device to get instant insights into health status, IoT data, and spare part information.

You’ll also find out more about the benefits and the potential of AR model tracking in general. This part was contributed by Jens Keil, co-founder of Visometry, who supported my team with his great expertise in model tracking.

Finally, I’ll provide insights about the architecture of such apps and how you can use the technology to your advantage.

What is the app about?


Figure 1: AR Showcase App for SAP Field Service Management

The AR showcase app combines the SAP Field Service Management checklist feature with a new SAP cloud service, XR Cloud, and AR model-tracking technology from VisionLib, making the next generation of remote service available to service engineers.

Instead of sharing a video stream with the remote engineer, the operator receives IoT data, spare parts, and work instructions as augmented information in the system for direct interaction. It’s no longer necessary to scroll or search for materials, as the machine being inspected is linked directly to ERP data sources. What’s more, with the camera always in place, AR helps create pictures taken from the right position for documentation purposes.

Classic checklists require manual data input by the user. Checklists using AR and model tracking reduce the data input effort and overall inspection time, while increasing the quality of the inspection. Instead of searching for spare parts or losing time writing the service report, you just go to the relevant component and touch it virtually. Not only do these apps make sense in terms of business, they’re also fun to use! Users feel empowered and totally in control when they hover the tablet over the system and get an x-ray view of the part or machine they’re working on.

Now let’s walk through the inspection steps in our checklist:

Step 1 – Select model


Figure 2: Select Smart Asset from XR Cloud

Before a model can be tracked against its physical counterpart, it must be identified. This can be done either with a QR code placed on or near the machine or, as shown in the image, by letting the user select the model. In both cases, the model is loaded on demand from a cloud service, either before the technician arrives (in case of limited internet connectivity at the site) or once the technician has arrived at the service workplace.

There are two reasons for loading it from a cloud service. First, such models can be very large, and loading all models for all workplaces would cause disk space issues on the device. Second, the model can be changed at any time to give service technicians up-to-date instructions.
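
To make this on-demand loading more concrete, here’s a minimal Swift sketch from the perspective of the native app. The endpoint path, the SmartAsset payload, and the cache layout are assumptions for illustration only, not the actual XR Cloud API.

```swift
import Foundation

// Hypothetical sketch of on-demand model loading. The endpoint path, the
// "SmartAsset" payload, and the cache layout are illustrative assumptions.
struct SmartAsset: Codable {
    let id: String          // asset selected in the app or resolved from a QR code
    let name: String
    let bundleURL: URL      // location of the asset bundle with model and tracking parameters
    let version: Int
}

final class SmartAssetLoader {
    private let baseURL: URL
    private let cacheDirectory: URL

    init(baseURL: URL) {
        self.baseURL = baseURL
        self.cacheDirectory = FileManager.default.temporaryDirectory
            .appendingPathComponent("smart-assets", isDirectory: true)
    }

    /// Resolves the asset metadata, then downloads the bundle only if the
    /// matching version is not already cached. This keeps disk usage low
    /// while still serving the latest instructions.
    func loadAsset(withID id: String) async throws -> URL {
        let metadataURL = baseURL
            .appendingPathComponent("assets")
            .appendingPathComponent(id)
        let (data, _) = try await URLSession.shared.data(from: metadataURL)
        let asset = try JSONDecoder().decode(SmartAsset.self, from: data)

        let localFile = cacheDirectory
            .appendingPathComponent("\(asset.id)-v\(asset.version).bundle")
        if FileManager.default.fileExists(atPath: localFile.path) {
            return localFile // already downloaded, e.g. before going on site
        }

        try FileManager.default.createDirectory(at: cacheDirectory,
                                                withIntermediateDirectories: true)
        let (tempFile, _) = try await URLSession.shared.download(from: asset.bundleURL)
        try FileManager.default.moveItem(at: tempFile, to: localFile)
        return localFile
    }
}
```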

Step 2 – Use model tracking

In the next step, the user aligns the 3D model from the cloud service with its physical counterpart by holding the tablet about 2 meters in front of the Evomixx machine. During our test phase, many testers who were new to the app performed this alignment step, and in 98% of all cases they succeeded on the first try.

Figure 3: Matching the Line Model with the Physical Object Until It Snaps into Place

Step 3 – Check pressure sensor

Now the user needs to check whether the reported IoT sensor values match the actual ones. A camera symbol indicates where the user should move the tablet to take a picture of the physical sensor together with the augmented IoT sensor value overlaid on it. This step is also a good example of how AR object tracking can improve the reliability of AI image recognition by guiding the operator to take the photo from the right position and angle.

Now imagine this step without AR. First, the user has to select the right sensor in the app, which is already a possible source of errors, as the user might choose the wrong one. Next, the user has to type the value into the checklist, possibly making typos. These steps are not only error-prone, they also take a lot of time.
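
The following Swift sketch illustrates the difference in data flow: with AR, the sensor identity, its value, and the guided photo come from the tracked model, so the checklist item is filled automatically. The types shown here are hypothetical and not part of SAP Field Service Management.

```swift
import Foundation

// Illustrative sketch only: these types are hypothetical.
struct TrackedSensorReading {
    let sensorID: String     // resolved from the component the user touched in AR
    let value: Double        // latest IoT value reported for that sensor
    let unit: String
    let capturedImage: Data? // photo taken at the guided position and angle
}

struct ChecklistItem {
    let id: String
    var recordedValue: String?
    var attachment: Data?
}

/// With AR, the sensor identity and its value come from the tracked model,
/// so the checklist item is completed without manual selection or typing.
func complete(_ item: inout ChecklistItem, with reading: TrackedSensorReading) {
    item.recordedValue = "\(reading.value) \(reading.unit)"
    item.attachment = reading.capturedImage
}
```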

Figure 4: Capture Image in Right Position and Angle

Step 4 – Check health status of seals

Now the user needs to check the health status of the valve seals to decide whether replacement is necessary. AR is also very useful here, as the service technician can immediately see the corresponding health status of the seal.

Step 5 – Start visual inspection

Now the user starts the visual inspection. Here the hinge and the clamp are tracked in addition to the model tracking of the Evomixx machine. Analysis shows whether the physical model matches the digital model. This makes it possible to detect incorrect installations. In the video, the app warns that the hinges are installed incorrectly. By playing an animation in AR, the user can find out how to fix it.

Figure 5: Visual Inspection

Steps 6 and 7 – Checklist summary and sign-off

The summary lists all operations done in the AR inspection process. This summary is signed by the operator in the last checklist step before it is uploaded to the SAP Field Service Management back end.

Figure 6: Signature of Inspection Summary

Why is it so tough to be a field engineer?

The main challenge nowadays is the huge number of variants in production and asset management. Operators need to handle an enormous number of product variants, and for technicians and field service staff it’s hard to keep track of specifications and special knowledge, such as the right pressure or torque, which vary even within the same group of items, not to mention gaining practical experience. Even with training, there are simply too many variations to remember them all.

As an example, let’s take the windshield wipers of a sports car manufacturer: the three clips that attach a wiper to the vehicle each have different torques depending on the configuration. With augmented reality, it’s much easier to filter the necessary information. What’s more, this information is virtually pinned to the exact position, making it absolutely clear which torque is needed for each of the three clips.

Without augmented reality, service organizations face high support costs, and far too often remote calls are needed before work can proceed. Even worse, by using the wrong torque or tools, technicians could unintentionally destroy parts that then have to be reordered or laboriously replaced. This causes delays in the actual repair, removal, or installation of spare parts.

What are the benefits of AR in such cases?

One of AR’s basic paradigms is the fusion of the digital and the physical. The real world is a key aspect here, as it defines the context simply by what’s in front of the camera. Using AR, you can actively point the camera at what interests you. Camera and object detection make it possible to filter for relevant elements in the real world so that you can focus quickly and automatically on relevant content. Augmented Reality gives digital content a precise spatial position, ensuring that there’s always a clear and unambiguous relationship between the physical object and the digital information related to it.

AR helps users make visual inspections faster. Because CAD data and digital twins can be superimposed on reality, it’s possible to compare built machinery against the relevant specification in a matter of seconds. Virtual x-ray views let users see right inside a machine and instantly gain a better understanding of how it is assembled.

Figure 7: X-Ray View with AR Model Tracking

Early AR prototypes were misleading about the ideal use of the technology: they overemphasized animations of which tools to use and how to handle them. Technicians know how to use a drill. What gets harder is knowing the detailed settings for each configuration when you’re in front of a machine and the clock is ticking.

What are the challenges of AR?

The benefits are there, and they’ve been proven repeatedly over the past decade. However, what companies are currently looking for is the ability to scale AR beyond mere demonstrations and showcases into productive applications that they can use in their daily routines. Unsurprisingly, the challenges that arise here are closely connected to how well companies have digitized their processes.

The ideal process for enterprise product development is not a one-way street from the engineering system to the mobile app. It’s even more important for the product experience to find its way back to the engineering system, closing the feedback loop and achieving continuous product improvement.

The key to this ideal process lies in digitized checklists and process knowledge, CAD data and other sources, all linked and accessible in one place and connected to the ERP system.

What are the technical nuts and bolts behind the demo app?

AR apps for model tracking need to combine the best of two worlds: the business world and the XR world. The business world needs secure access to business data, presented in a delightful and responsive user interface. The XR world needs a 3D real-time development and render engine that addresses the different XR technology stacks (ARKit on iOS, ARCore on Android, MRTK for Microsoft HoloLens) and platforms, so that XR applications can be developed quickly and easily.

For the XR aspect, I see Unity as the platform of choice. The Unity as a Library concept allows developers to add the XR part to the native code of the mobile business app. The XR part consists of the VisionLib SDK for Unity and the SAP software component Smart Asset Plugin for Unity. The Smart Asset Plugin loads Unity asset bundles containing the 3D model and the tracking parameters from the XR Cloud into the app’s Unity runtime. This on-demand approach is beneficial not only for users, but also for developers: it reduces the turnaround time required to make changes to a model visible in the app, because the developer no longer needs to recompile the application whenever a tracking parameter or model changes.
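
To illustrate the Unity as a Library idea, here’s a hypothetical Swift sketch of the native side of such an app. Unity’s embedding API works with string messages addressed to GameObjects; the bridge protocol, the “SmartAssetPlugin” GameObject, and the method names below are assumptions for illustration, not the actual Smart Asset Plugin interface.

```swift
import Foundation

// Hypothetical bridge types; in the real app this boundary is provided by
// Unity's "Unity as a Library" integration and the Smart Asset Plugin.
protocol XRRuntimeBridge {
    /// The native side addresses a GameObject and a method by name
    /// and passes a string payload to the embedded 3D runtime.
    func sendMessage(toGameObject gameObject: String, method: String, payload: String)
}

/// Native side of the app: business logic stays in Swift, while model
/// loading and tracking run inside the embedded 3D runtime.
final class XRChecklistCoordinator {
    private let bridge: XRRuntimeBridge

    init(bridge: XRRuntimeBridge) {
        self.bridge = bridge
    }

    /// Asks the embedded runtime to load the asset bundle (downloaded on
    /// demand, see the earlier sketch) and start model tracking for it.
    func startTracking(bundlePath: String, assetID: String) {
        bridge.sendMessage(toGameObject: "SmartAssetPlugin",
                           method: "LoadAssetBundle",
                           payload: bundlePath)
        bridge.sendMessage(toGameObject: "SmartAssetPlugin",
                           method: "StartModelTracking",
                           payload: assetID)
    }
}
```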

For the business aspect, we’ve used the SAP Business Technology Platform (SAP BTP) SDK for iOS from SAP Mobile Services. SAP Mobile Services offers a large variety of security functions, offline OData synchronization, and seamless integration with back-end systems. For instance, the XR part of the app doesn’t communicate directly with the XR Cloud. Instead, it addresses the SAP Mobile Services back end, which reroutes the request to the right XR Cloud tenant.
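
The routing idea can be sketched like this: the app calls the mobile back end, which forwards the request to the right XR Cloud tenant. The host name and destination path are placeholders, and a real app would use the networking stack of the SAP BTP SDK for iOS rather than plain URLSession.

```swift
import Foundation

// Sketch of the routing idea: the app never talks to the XR Cloud directly;
// it calls the mobile back end, which forwards the request to the right tenant.
// Host name and destination path are made-up placeholders.
enum XRCloudProxy {
    static let mobileServicesBase = URL(string: "https://example-mobile-services.example.com")!

    /// Fetches asset metadata via a destination exposed by the mobile back end.
    static func fetchAssetMetadata(assetID: String) async throws -> Data {
        let url = mobileServicesBase
            .appendingPathComponent("destinations")
            .appendingPathComponent("xr-cloud")   // back end resolves the tenant
            .appendingPathComponent("assets")
            .appendingPathComponent(assetID)
        var request = URLRequest(url: url)
        request.setValue("application/json", forHTTPHeaderField: "Accept")
        let (data, _) = try await URLSession.shared.data(for: request)
        return data
    }
}
```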

As a result of this tool selection, development of the app took only about 6 weeks from design to final presentation.

Why the XR Cloud is software as a service (SaaS)

The service technician needs to perform inspections at various sites, which requires various 3D models. For the currently relevant model to be loaded onto the device when the inspection takes place, a cloud solution is required. The cloud solution must be able to connect to any ERP back end, whether SAP ERP, SAP S/4HANA, or third-party ERP systems. The XR Cloud, built as a cloud-native solution on the SAP Business Technology Platform, fulfills these requirements.


Figure 8: SAP Fiori Launchpad for XR Cloud, Built and Run by SAP BTP

What is tracking and why is it so essential?

Modern AR apps use computer vision techniques to detect and track industrial items. That is, the apps detect, localize, and track objects in the video stream of a tablet’s camera. To use AR here and in other industrial processes, this tracking must be able to localize objects with a high level of precision, reliably and repeatedly over time. There are many tracking techniques, but so-called model tracking is currently the most suitable one. Its main benefit lies in the auto-registration and instant tracking of physical objects. In practice, this means that the item in question can be tracked and inspected with one click, without prior preparations. This is essential for saving time in the process and for benefiting from AR in general, compared to other techniques.

What is model tracking and what sets it apart?

Model tracking, a technique that uses CAD or 3D data as tracking references, has become the de facto standard for enterprise-grade object tracking. There are three main reasons for this: First, it overcomes many of the obstacles related to computer vision, for instance poor lighting conditions. It can handle the movement of objects, and unlike other techniques, the tracking information remains stable over time.

Second, it doesn’t rely on markers or any other preparation or pre-registration of the scenes or objects to be tracked. Instead, it makes it possible to auto-register and track objects in a couple of seconds. And finally, you can scale a tracking setup from one to many objects quickly and automatically.
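
Conceptually, a model-tracking setup therefore needs little more than a reference model and a few parameters, rather than markers or prepared scenes. The following Swift structure is purely illustrative and does not reflect the VisionLib configuration format.

```swift
import Foundation

// Hypothetical configuration shape to show what model tracking needs as
// input: a reference model instead of markers. Field names are illustrative.
struct ModelTrackingConfiguration: Codable {
    let modelURI: URL               // CAD/3D model used as the tracking reference
    let metricScale: Double         // unit conversion, e.g. millimeters to meters
    let initialPoseHint: [Double]?  // optional pose matrix to speed up the first alignment
    let reuseKeyframes: Bool        // keep learned views so tracking stays stable over time
}
```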

The VisionLib SDK specializes in industrial tracking methods. It provides first-class model tracking, and its high performance and precision make it one of the worldwide leaders in this technology area.

Besides the technical benefits, the main reason for our choice of the library is that it doesn’t force customers to integrate any cloud solutions or SaaS infrastructure with their IT backbone. VisionLib runs cross-platform and supports Unity and HoloLens, with Unreal support coming soon, which ensures that no vendor lock-in can occur.

And, of course, the team’s experience of over 10 years in object tracking and industrial computer vision is a definite plus for VisionLib.

What are the limitations of model tracking?

While model tracking is the technique of the hour for enterprise AR, it’s still not a one-size-fits-all solution. To work properly, it needs a closely matching CAD or 3D model as a tracking reference to get AR started, and such a model may not always be available.

As model tracking is geometry-based, it relies mainly on the object’s shape and requires distinctive characteristics to work well. Shapes with rounded or blunt edges, or shapes that are rotationally symmetrical, aren’t ideal, as the vision system can’t determine a clear position or orientation for them.

The camera resolution and field of view also limit the tracking cases. Standard optics can’t capture very small or very large objects, which means the tracking either doesn’t have enough pixels to work with or the object exceeds the image frame, so the system can’t grasp it properly.

And finally, state-of-the-art tracking reaches its limits in cases where it needs extra intelligence in terms of real-world understanding. Let’s take cables or other flexible elements as an example. These are often modeled in CAD data, but in reality, they often might run quite differently. The tracking system would need to be able to understand and match this deviation automatically.

How to overcome such limitations?

Tracking library SDKs like VisionLib make it possible to combine plain model tracking with texture-based information and other input. But texture properties are generally error-prone and might fall short due to the obstacles mentioned above.

Simultaneous Localization and Mapping (SLAM), the leading tracking technique behind ARKit and ARCore, also uses texture-based or, as they refer to it, feature-based information. The main advantage is the ability to start tracking spontaneously, in almost any situation. But as SLAM is not good at locating or registering objects precisely, augmentations remain vague or need manual alignment for precise localization. These features are also short-lived and therefore not very reliable. They change when objects move, or when surroundings and lighting conditions change too much.

Whether objects can be tracked well, and whether AR is a useful enrichment in a given case, is determined by combining multiple tracking methods on the one hand and by providing techniques that non-AR experts can use on the other.

What application areas do you foresee in future?

SAP Field Service Management is mainly used by service engineers. But the concept of using a cloud service to deliver the 3D model for object tracking in mobile apps can be applied to many other areas, for both business users and consumers. Imagine that you need to drill a hole in a wall of your house and are unsure whether you’ll hit a water pipe. Just download the BIM (Building Information Model) of your house, hold the tablet in front of the wall, and you’ll know in seconds whether you need to adjust your project!

But the area of manufacturing is even more promising for productivity improvements. Complex quality inspections, such as angle measurements or checks for component presence and positional accuracy, can be fully automated and recognized by the system, even if the physical component is moving slowly, for example on an assembly line.

Figure 9: Tracking Remains Stable When Throwing Object in the Air

Depth-sensing technologies, like LiDAR (Light Detection and Ranging) scanners, also have great potential for the future. Because they can sense depth, they capture the world in 3D instantly, making it much easier to compare reality as-is with, say, the (CAD) specification. When it comes to precision, however, they’re not yet enterprise-grade: the scan resolution is too low, making it hard to scan very small or very large items appropriately. This will likely change in the future.

How can I use this technology for my company?

Before you can use AR technology to its full benefit, you should first digitalize your processes. This means that your 3D data must be up to date and linked with ERP entities such as equipment and bills of materials. The key really lies in linking the different data sources to obtain the full benefits of AR and computer vision detection.
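
As a rough illustration of what such linking could look like, here’s a hypothetical Swift record that ties a 3D model to its ERP context. The field names are made up and do not correspond to an SAP schema.

```swift
import Foundation

// Illustrative data model for the "linking" step: every 3D asset is tied to
// the ERP entities it belongs to. Names are hypothetical.
struct LinkedAsset: Codable {
    let equipmentID: String          // ERP equipment the model represents
    let functionalLocation: String?
    let billOfMaterialIDs: [String]  // spare parts that can be touched and ordered in AR
    let modelURL: URL                // up-to-date CAD/3D model used for tracking
    let checklistTemplateID: String  // inspection steps attached to this asset
}
```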

But of course, you can also start with the AR experience first and do the ERP integration later. In both cases, you need modern mobile devices with large screens, sufficient processing power, and good connectivity. If you already have a service app in place, its functionality can be enhanced with AR by using the Unity as a Library approach, which brings 3D into your app quickly. You don’t need to start from scratch or switch platforms first.

When will SAP support model tracking capability in SAP products?

Today, SAP offers the SAP Visual Enterprise Viewer app for a classic on-screen 3D experience in the public app stores. At SAPPHIRE, Thomas Saueressig presented the launch of the AR viewers for iOS and Android and the MR viewer for Microsoft HoloLens for SAP Enterprise Product Development customers. Model tracking also works on the HoloLens, although the alignment reliability suffers slightly if the user suddenly turns their head away from the machine. It’s useful for smaller objects, but not for bigger systems like the Evomixx.

Figure 10: Model Tracking with Microsoft HoloLens

But before model tracking is generally available for SAP products, we need to shape and fine-tune this feature in co-innovation with interested customers from different industries to offer our customers a best-of-breed solution. If you would like to take part in such a co-innovation journey and be the first to benefit from this new technology, please contact the XR venture mailbox ([email protected]).

How can I contact SAP?

For SAP customers, we offer advisory services that help you quickly identify the fits and gaps so that you can move into productive use. Because this is a cloud SaaS offering, no lengthy implementation project is required. Contact your account representative and ask for the SAP Enterprise Product Development advisory service.

Non-SAP customers interested in implementing this technology can use the mailbox [email protected] to contact SAP or to contact Visometry.

You can also book a tour at the SAP Experience Center in Walldorf to try out the app yourself! Please contact [email protected] to do so.

