Ariba Analytics using SAP Analytics Cloud, Data Intelligence Cloud and HANA DocStore – Part 2

source link: https://blogs.sap.com/2022/09/09/ariba-analytics-using-sap-analytics-cloud-data-intelligence-cloud-and-hana-docstore-part-2/

This is Part Two of a blog series on Ariba Analytics using SAP Analytics Cloud, Data Intelligence Cloud and HANA DocStore. If you would like to start with Part One, please click here

Recap

SAP Analytics Cloud makes it easy for businesses to understand their data through its stories, dashboards and analytical applications. Our worked example uses SAP Ariba data to create an SAC Story that shows how much spend has been approved within Ariba Requisitions created in the last thirty days.

A simple SAC Story tracking Approved Requisitions

In Part One of our blog series we discussed how we can retrieve data from SAP Ariba’s APIs using SAP Data Intelligence Cloud. We stored this data as JSON Documents in the SAP HANA Document Store

In this blog post, we’re going to build SQL and Calculation Views on top of our JSON Document Collection

In the third (and final) blog post we’ll consume that Calculation View in SAP Analytics Cloud as a Live Data Model, which will provide the data to our SAP Analytics Cloud Story

Viewing our HANA DocStore Collection data in an SAP Analytics Cloud Story

Design-Time Artifacts in Business Application Studio

As we discussed in our last blog post, objects within HANA usually have both a design-time and runtime artifact. Design-time artifacts are useful because they fully describe the object and can be deployed consistently across multiple HDI Containers or even HANA instances

When we deploy our design-time artifacts, they will be created as runtime artifacts inside our HDI Container

Our JSON Document Collection has already been created, and is already storing our Ariba JSON Documents. From here, it’s time to model our other artifacts

Creating our SQL View

JSON Documents are useful in a variety of situations where you don’t have strict, predefined schemas. When we retrieve our data from the Ariba APIs, we may retrieve data that doesn’t map cleanly to a table schema (for example, data that is nested). Putting this data in the HANA DocStore Collection allows us to store the complete document, ensuring nothing is lost

To use this data for analytics, we'll need to map it to a schema. We can create a logical schema on top of our Collection using a SQL View. This lets us access a predefined subset of our data for analytics while leaving the full data untouched in our Collection.

We’ll create the SQL View in Business Application Studio

Click on View, then Find Command or press Ctrl+Shift+P

Use Find Command to find Create SAP HANA Database Artifact, then click on it

Select SQL View as the artifact type, and enter the artifact name then click on Create

SQL Views use the following format:

VIEW "aribaRequisitionSQLView"
AS SELECT "UniqueName", "Name", [...]
FROM "aribaRequisition"

If you’re familiar with SQL, you may recognise this as the same syntax that you would use to create a standard SQL View, just missing the word “CREATE”
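For comparison, creating an equivalent view with standard runtime SQL would look like the following sketch (using two of the columns from the format example above – the HDI deployer generates the actual statement for us when the design-time artifact is deployed):

```sql
-- Standard SQL equivalent of the design-time artifact;
-- HDI effectively adds the CREATE on deployment
CREATE VIEW "aribaRequisitionSQLView"
AS SELECT "UniqueName", "Name"
FROM "aribaRequisition";
```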

The SQL View doesn’t duplicate any data – it just provides a schema that we can use to access the underlying data.

The data in our JSON Documents is stored as key-value pairs:

"Status":"Complete"

To retrieve the value “Complete”, we would SELECT “Status”

JSON Documents may also have nested data

"Content":{"ItemId":"3728754507"}

To retrieve the value “3728754507”, we would SELECT “Content”.”ItemId”, with the full stop marking nested keys
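Putting the two together, a single query over our Collection could select both a top-level key and a nested key (a sketch using only the example keys above – real Documents in the Collection contain many more keys):

```sql
-- "Status" is a top-level key; "Content"."ItemId" reaches into nested data
SELECT "Status", "Content"."ItemId"
FROM "aribaRequisition"
```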

Our example will use the following SQL View:

VIEW "aribaRequisitionSQLView"
AS SELECT "UniqueName", "Name", 
"TotalCost"."AmountInReportingCurrency" AS "AmountInReportingCurrency", 
"ReportingCurrency"."UniqueName" AS "ReportingCurrency", 
"ApprovedState", "Preparer"."UniqueName" AS "Preparer",
"Requester"."UniqueName" AS "Requester", "StatusString", 
"CreateDate", "SubmitDate", "ApprovedDate", "LastModified", 
"ProcurementUnit"."UniqueName" AS "ProcurementUnit"
FROM "aribaRequisition"

The fields we’re using are only a fraction of the fields available in the Documents within our Collection – if we want to customize the scenario later, there are plenty more to choose from

We want to make sure this SQL View is deployed and ready for use, so click on the Deploy rocket

We can deploy our SQL View under SAP HANA Projects on the left

Creating our Calculation View

While we’re in Business Application Studio, we’re going to create our Calculation View. This Calculation View is what we’ll be consuming in SAP Analytics Cloud

As before, we’re using View->Find Command then Create SAP HANA Database Artifact

Choose Calculation View, enter a Name then click on Create

Business Application Studio has an inbuilt editor for Calculation Views, which we’ll use to create ours

Click on Aggregation, then click the Plus symbol

Search for our SQL View, select it, then click Finish

Now that our SQL View is available as a Data Source, we want to make sure its columns end up in our Calculation View

Click on Aggregation, then click on Expand Details

Click on our SQL View on the left then drag and drop to Output Columns on the right

Our SQL view columns will now be available in our Calculation View

Because this is a Calculation View of type Cube (rather than Dimension), we’ll need to make sure it includes at least one Measure

The columns in our SQL View all have the default data type NVARCHAR(5000). If we try to mark our AmountInReportingCurrency column as a Measure directly, it will be treated as a string – giving us only the Aggregation options COUNT, MIN and MAX.

We want to treat this column as the number it is – as a workaround, we’ll need to create a Calculated Column

Creating our Calculated Column

A Calculated Column is an output column that we create within the Calculation View itself. Rather than being persisted, its values are calculated at runtime based on the result of an expression.

For our example, we’re using a very simple expression. First, we have to make our way to the Expression Editor

Click on Calculated Columns

Create a Calculated Column using the Plus symbol, then Calculated Column

Click on the Arrow

Next we’re going to give our Calculated Column a name and data type. Because the granularity of our example is the Requisition level and not the item level, the decimal places won’t meaningfully change the results. Given that, we’re going to use the Integer data type.

Give the Calculated Column a Name, and choose the Data Type Integer

Choose Measure as the Column Type

Click on Expression Editor

The Expression Editor is where we’ll define how the column is calculated. Select our AmountInReportingCurrency Column

Select our Column from the left

Our Column is in the Expression

Our Calculated Column will take the value of AmountInReportingCurrency and convert it to an Integer.
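Depending on whether the Expression Editor is set to SQL or to the column engine language, the resulting expression should look something like one of these sketches (only the column name is taken from our SQL View):

```sql
-- SQL expression syntax
TO_INTEGER("AmountInReportingCurrency")

-- Column engine expression syntax
int("AmountInReportingCurrency")
```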

Now we want to validate the syntax of our Expression

Click on Validate Syntax

Our Expression is valid

We have one last thing to do inside our Calculation View – we want to filter the data to only include Approved Requisitions. If we want to use the Value Help to set our Filter, we’ll need to Deploy the Calculation View

Deploy our Calculation View

Click on Filter Expression

Click on ApprovedState under Columns

Add an Equals Sign (=) then click on the Value Help

Select Approved then click OK
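After selecting the value, the Filter Expression should read something like the following (a sketch, assuming ApprovedState stores the plain string shown in the Value Help):

```sql
"ApprovedState" = 'Approved'
```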

Now we can check the syntax of our Filter

Click on Validate Syntax

Our Filter is valid

Before we Deploy our Calculation View, we want to make sure that we’re only exposing our integer Calculated Column and not the string version. To do this, we go to the Semantics node.

Click on Semantics, then Columns

Check Hidden for our AmountInReportingCurrency Column to exclude it from our Calculation View

All of the Columns we need, including our new Calculated Column, are available within the Calculation View. Now we’re ready to Deploy it one last time.

Once again, click on the Deploy Rocket under SAP HANA Projects

Checking our Runtime Artifacts

Now that we’ve finished deploying our Design-time artifacts, we’ll have the corresponding Runtime artifacts inside of our HDI Container. We can check these by going to SAP HANA Database Explorer from within Business Application Studio.

Click on Open HDI Container on the left under SAP HANA Projects

In the Database Explorer, we want to first check on our SQL View

Click Views on the left, then click on our SQL View

Our SQL View

We can see all of the Columns in our created SQL View. If we want to check out some of the data returned by our SQL View, we can click on Open Data

Click on Open Data

Data from our SQL View is displayed
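Alternatively, we could preview the same data with a quick query in the Database Explorer's SQL Console (a sketch using three of the columns we defined; LIMIT simply keeps the result small):

```sql
-- Spot-check a few rows from our SQL View
SELECT "UniqueName", "ApprovedState", "AmountInReportingCurrency"
FROM "aribaRequisitionSQLView"
LIMIT 5;
```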

Next it’s time to check on our Calculation View

Click Column Views on the left, then click on our Calculation View

Our Calculation View

Click on Open Data

Database Explorer will open our Calculation View for Analysis. We’re going to do our analysis in SAP Analytics Cloud, so for now we just want to verify the Raw Data

Click on Raw Data

Data from our Calculation View is displayed

Wrap-Up

During this blog post we’ve built a SQL View and Calculation View on top of our HANA DocStore Collection. We’ve also made sure that our Calculation View only contains Approved Requisitions

In the third and final blog post we’ll consume our Calculation View as a Live Data Model before visualizing it in an SAP Analytics Cloud Story. We’ll also schedule the Data Intelligence Pipeline we created in our first blog post so that the data in our HANA DocStore Collection is updated on a regular basis automatically

Viewing the data in SAP Analytics Cloud

Other Resources

SQL Views

Calculation Views

SAP HANA Database Modeling Guide for SAP Business Application Studio

Special Thanks

This blog series has had a lot of input from my colleagues – any errors are mine not theirs. In particular, thanks go to the Cross Product Management – SAP HANA Database & Analytics team, Antonio Maradiaga, Bengt Mertens, Andrei Tipoe, Melanie de Wit and Shabana Samsudheen

Note: While I am an employee of SAP, any views/thoughts are my own, and do not necessarily reflect those of my employer

