source link: https://blogs.sap.com/2022/07/20/integrating-sap-signavio-process-intelligence-and-sap-data-intelligence-cloud-a-concrete-example-with-the-new-ingestion-api/

Integrating SAP Signavio Process Intelligence and SAP Data Intelligence Cloud: a concrete example with the new Ingestion API

Welcome to the third episode of the series: Integrating SAP Signavio Process Intelligence and SAP Data Intelligence.

In case you missed the two previous blog posts:

After the enhancements released in June 2022, with the general availability of the Ingestion API (see the Appendix for a link to the documentation), we can now move beyond that preliminary approach to direct connectivity between the products.

In this third post, I’ll therefore describe a concrete example that shows how you can use an SAP Data Intelligence pipeline to feed data directly into SAP Signavio Process Intelligence, via Ingestion API, bypassing the need for a staging area.

picture-1-7.png

The Ingestion API allows you to create an external data pipeline and ingest data into SAP Signavio Process Intelligence via API with an authentication token. This facilitates integration with SAP Data Intelligence Cloud, where a data pipeline can be created with an operator at the end that pushes the data into SAP Signavio Process Intelligence.

The example implemented in this blog shows how to read customer survey data from a Qualtrics system and then feed it into SAP Signavio Process Intelligence.

What is interesting is that this approach can be applied to any of the numerous sources supported by SAP Data Intelligence Cloud, leveraging its data processing and transformation capabilities, and more generally to any kind of SAP or non-SAP data pipeline external to SAP Signavio Process Intelligence.

Configuring an SAP Data Intelligence Cloud pipeline to move data from Qualtrics to SAP Signavio Process Intelligence via Ingestion API

Let’s first introduce some context.

Some of you already know about journey to process analytics, an innovative process management practice that connects experience data and business operations data, aiming to understand, improve and transform your customer, supplier and employee experience. If you haven’t heard of it yet, I suggest reading this five-minute blog post by Aida Centelles Ahicart.

Let’s now imagine that we want to analyze an Incident-to-Resolution process. From a traditional process mining perspective, the approach is quite straightforward: we configure a connection to ServiceNow, extract the data, create an event log, and apply process mining techniques on it.

picture-2-3.png

But what if we could collect customer feedback, at the end of their journey, in a Qualtrics survey?

This experience data could be combined with the operational data to bring the customer experience perspective into play, enriching the data model and leading to a broad variety of new and valuable insights.

However, currently there is no native connector to Qualtrics in SAP Signavio Process Intelligence. Hence, we can create a data pipeline in SAP Data Intelligence to extract the survey data from Qualtrics and push it to SAP Signavio Process Intelligence.

Let’s see how we can quickly set this up.

Ingestion API

First of all, let’s create a new Ingestion API data source in SAP Signavio Process Intelligence.

picture-3-3.png

You can either create a data source or an integration. Either way, the counterpart is created automatically with the same title and the same type (Ingestion API), and the two are linked to each other.

picture-4-2.png

The data source provides two parameters:

  • The API endpoint is the target URL for requests that ingest external data into SAP Signavio Process Intelligence
  • The Token is the authentication token required for a request to push data into this specific pair of data source and integration
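
With these two parameters, a push can be sketched in a few lines of Python. This is a minimal, hypothetical sketch: the multipart field name and the Bearer-token header are assumptions based on the general pattern, so check the Ingestion API documentation linked in the Appendix for the authoritative request format.

```python
# Minimal sketch of an Ingestion API push. The multipart field name
# ("file") and the Authorization header format are assumptions; verify
# them against the Ingestion API documentation.

def build_ingestion_request(endpoint, token, table_name, csv_bytes):
    """Assemble URL, headers, and multipart files for the upload request."""
    headers = {"Authorization": f"Bearer {token}"}
    files = {"file": (f"{table_name}.csv", csv_bytes, "text/csv")}
    return endpoint, headers, files

url, headers, files = build_ingestion_request(
    "https://ingestion.example.com/files",  # hypothetical API endpoint value
    "my-ingestion-token",                   # token from the data source
    "qualtrics_survey",
    b"case_id,score\n1,9\n",
)
# requests.post(url, headers=headers, files=files) would then perform the upload.
```

Storing the token outside the pipeline configuration (for example in a credentials store) is advisable, since it grants write access to the data source.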

Qualtrics

Second, let’s connect to Qualtrics to retrieve the information we need.

Log in to your Qualtrics system and go to Account Settings, then to Qualtrics IDs.

picture-5-2.png

Here you can retrieve the following parameters:

  • Datacenter ID
  • API authentication parameters
    • Token
    • User ID
  • Survey ID

SAP Data Intelligence pipeline

Last, let’s create a data pipeline in SAP Data Intelligence Cloud.

As a prerequisite, in the Connection Management tab you must create an OPENAPI connection to Qualtrics, with the following parameters:

  • Host: <datacenter>.qualtrics.com where <datacenter> is the Datacenter ID retrieved from Qualtrics
  • Authentication type: ApiKey
  • API Key Name: you can enter the Qualtrics user ID (it will be overridden by the header)
  • API Key Value: <token> where <token> is the API authentication token retrieved from Qualtrics
picture-6-2.png

Import the two ready-made operators we’ve built to connect to Qualtrics (“Connect to Qualtrics Survey”) and to SAP Signavio Process Intelligence (“Push data to Process Intelligence via Ingestion API”). All related artifacts are shared in a GitHub repository linked in the Appendix of this post (it will be shared soon). In the future, we plan to ship improved versions of these operators in the product itself.

Now, let’s open the DI Cloud pipeline modeler, and create a new pipeline. We’ll use generation 1 operators in this example. In this new pipeline, drag and drop the “Connect to Qualtrics Survey” operator.

picture-7.png

Click on the configuration of the operator and fill in the following parameters:

  • Input Connection: choose the OPENAPI connection to Qualtrics
  • Survey ID: the survey ID retrieved from Qualtrics
  • StartDate: survey data is extracted from this date to now

This is a Python operator that opens a connection to Qualtrics and extracts the survey data via API. A few lines of code perform data transformation activities specific to the survey used for this blog: they will need to be adjusted for your use case.

The Python code was implemented using this blog as a reference.
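
As a rough outline, the extraction follows the standard Qualtrics v3 response-export flow: start an export job, poll it until it completes, then download the zipped CSV. The helper below only builds the URLs and headers involved, so it runs standalone; the operator performs the actual HTTP calls. Treat this as a sketch and verify the details against the Qualtrics API documentation for your datacenter.

```python
# Sketch of the Qualtrics v3 response-export endpoints the operator calls.
# Only URL and header construction is shown; the operator performs the
# actual POST/GET requests and unzips the downloaded CSV.

def qualtrics_export_urls(datacenter_id, survey_id):
    """The three steps of the response-export flow."""
    base = (f"https://{datacenter_id}.qualtrics.com"
            f"/API/v3/surveys/{survey_id}/export-responses")
    return {
        "start": base,                        # POST {"format": "csv", "startDate": ...}
        "poll": base + "/{progressId}",       # GET until "status" is "complete"
        "download": base + "/{fileId}/file",  # GET, returns a zipped CSV
    }

def qualtrics_headers(api_token):
    # Qualtrics authenticates API requests with the X-API-TOKEN header.
    return {"X-API-TOKEN": api_token, "Content-Type": "application/json"}
```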

After that, drag and drop the “Push data to Process Intelligence via Ingestion API” operator in your pipeline, connecting it to the output of the previous operator.

picture-8.png

Click on the configuration of the operator and fill in the following parameters:

  • The name of the table to upload to SAP Signavio Process Intelligence with the survey data extracted from Qualtrics
  • The authentication token retrieved from the Ingestion API data source

This is a Python operator that receives as input the Qualtrics survey data extracted by the previous operator and prepares it to be ingested into SAP Signavio Process Intelligence via the Ingestion API. A few lines of code perform data preparation activities specific to the survey used for this blog: they will need to be adjusted for your use case.
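
To give an idea of its shape, the sketch below isolates the data-preparation step (survey rows serialized to CSV) so it can run standalone. The runtime-specific parts are left as comments: in a real SAP Data Intelligence Python operator, the `api` object (`api.config`, `api.set_port_callback`, `api.send`) is injected by the engine and cannot run outside it. Names such as `rows_to_csv` and the configuration parameter names in the comments are illustrative, not taken from the shipped operator.

```python
# Standalone sketch of the push operator's data-preparation step.
# The `api` object referenced in the comments is injected by the SAP Data
# Intelligence runtime; everything else here is illustrative.
import csv
import io

def rows_to_csv(rows):
    """Serialize survey rows (a list of dicts) into the CSV payload to upload."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue().encode("utf-8")

# Inside the operator, the handler would look roughly like this:
#
# def on_input(msg):
#     payload = rows_to_csv(msg.body)
#     requests.post(api.config.apiEndpoint,
#                   headers={"Authorization": f"Bearer {api.config.token}"},
#                   files={"file": (api.config.tableName + ".csv", payload)})
#     api.send("output", "done")
#
# api.set_port_callback("input", on_input)
```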

Now, you can complete the pipeline by just adding a Graph Terminator.

picture-9.png

You can now save and run the pipeline.

Once its status is “completed” you can go to the SAP Signavio Process Intelligence Ingestion API integration and check the logs.

picture-10.png

A new execution log will appear with “Running” status. After a few seconds the status will change to “Done”, and you can check in the Tables tab that a new table has indeed been uploaded, with the name provided in the second Python operator of the DI pipeline.

picture-11.png

The data is now ready to be used for process mining!

Additional step: Add Experience Data to your Data Model!

As we stated above, the whole idea started with the business requirement to add Qualtrics data to the Incident-to-Resolution process, to enrich operational data with the customer experience perspective.

To achieve that, you can create a new data model selecting ServiceNow as source system.

picture-12.png

Select the available Incident-to-Resolution transformation template, as an accelerator to transform raw data into an event log with a series of preconfigured SQL scripts.

picture-13.png

Select New Integration and connect to an existing ServiceNow data source.

picture-14.png

With the new Process Data Management (ETL 2.0) interface which was released in June 2022 you can now see the whole data pipeline, spanning connection, integration, and transformation. You can add the target Process Intelligence process.

picture-15.png

Click on “+ Data source and integration” and select your Ingestion API data source. It will be added to the data pipeline.

picture-16.png

Now you can start changing the existing SQL scripts to combine operational data with experience data coming from Qualtrics. For testing purposes, you can edit the Transformation block by adding a new “Qualtrics” business object.

picture-17.png

At this point you can add a new event collector with a simple SQL script to preview the survey data that has been extracted from Qualtrics and pushed to the Ingestion API integration through the SAP Data Intelligence data pipeline.

picture-18.png

Qualtrics survey data is now ready to be combined with ServiceNow data to create new valuable insights!

Further steps

To summarize, in this blog post I have covered:

  • How to create and use an Ingestion API data source to push external data into SAP Signavio Process Intelligence
  • What information you need to gather from your Qualtrics system in order to extract data from a survey, and how to retrieve it
  • How to connect SAP Data Intelligence Cloud to Qualtrics and then create a data pipeline with two operators that, respectively:
    • Connect to Qualtrics and extract survey data
    • Push the data to SAP Signavio Process Intelligence via Ingestion API

Follow the SAP Signavio Community and the SAP Data Intelligence Community for similar content, peer-to-peer networking and knowledge sharing.

Appendix

If you’re interested in learning more about the Ingestion API, here are some additional resources you can check:

The GitHub repository, where you will find the data pipeline and the two Python operators to download and import into your SAP Data Intelligence tenant, is coming soon.

