

Creating a Google BigQuery connection in SAP Datasphere (DWC)
source link: https://blogs.sap.com/2023/05/25/creating-a-google-big-query-connection-in-sap-datasphere-dwc/

Scenario: We want to show how to upload a CSV file to Google Cloud Storage, create a BigQuery table from it, and then import that table into SAP Datasphere via Import Remote Tables.
1) In Google Cloud Storage, we first need to create a bucket

Give it a name
Next, add a label (this is optional)

Choose where you want this bucket's data to be stored; I kept the default selection

Choose a storage class for your data (Standard, Nearline, Coldline, or Archive); I kept it as Standard

Choose how to control access to this bucket (for example, whether or not it should be public)

I unchecked the highlighted option above, since I wanted to test whether anyone could access this bucket.
Next is the security setting for this bucket

When you click on Buckets, you can see that the bucket has been created.

I uploaded a simple .csv file
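The console steps above can also be sketched with the gcloud CLI. The bucket name, location, and file name below are placeholders, not values from the article:

```shell
# Create a bucket (the name must be globally unique; pick the
# location and storage class you chose in the console)
gcloud storage buckets create gs://my-dsp-demo-bucket \
    --location=us-central1 \
    --default-storage-class=STANDARD

# Upload the CSV file into the bucket
gcloud storage cp sales.csv gs://my-dsp-demo-bucket/
```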

Now we will access this CSV file from Cloud Storage in BigQuery.
In the BigQuery Explorer, I will create a dataset (this is like a schema, where you store all your tables). Click on the project

Next, click Create data set and give it a meaningful business name



The dataset (schema) is created. Now we will create a table from the CSV file in Cloud Storage.
Select Create table from Google Cloud Storage, then select the bucket and the file

Give the table a name

The table is created and we can preview the data
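For reference, the dataset and table steps can also be done with the bq CLI. The project, dataset, table, and bucket names here are illustrative, not from the article:

```shell
# Create the dataset (the "schema" that will hold the tables)
bq mk --dataset my-first-project-123456:sales_data

# Load the CSV from Cloud Storage into a new table, letting
# BigQuery infer the schema and skip the header row
bq load --source_format=CSV --autodetect --skip_leading_rows=1 \
    my-first-project-123456:sales_data.sales_table \
    gs://my-dsp-demo-bucket/sales.csv

# Preview the first rows, as in the console preview
bq head -n 5 my-first-project-123456:sales_data.sales_table
```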


Now we are going to create a connection to BigQuery in SAP Datasphere.
In SAP Datasphere, under Connections, click Create and select Google BigQuery


We need to provide the project ID and a key (a service-account access key, which we generate in the Google Cloud console)
You will find the project ID when you click on My First Project


Next, click on Enabled APIs and services, scroll down, and click on BigQuery Connection API

Next, click on Credentials

Now you need to have a service account.

Let’s create a Service Account
Go to IAM and admin in the Google Cloud console from the navigation menu and click on Service accounts

At the top, click on Create service account

Give a relevant service account name

Click on Create, and a pop-up confirms that the service account has been created.
Now we will see this service account in the BigQuery Connection API
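The same service account could also be created from the CLI. The account name is a placeholder, and the role grants are an assumption (the connection needs at least read access to BigQuery data and permission to run jobs, which the article does not show explicitly):

```shell
# Create the service account
gcloud iam service-accounts create dsp-bq-reader \
    --display-name="Datasphere BigQuery reader"

# Assumed role grants so the connection can read data and run queries
gcloud projects add-iam-policy-binding my-first-project-123456 \
    --member="serviceAccount:dsp-bq-reader@my-first-project-123456.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"
gcloud projects add-iam-policy-binding my-first-project-123456 \
    --member="serviceAccount:dsp-bq-reader@my-first-project-123456.iam.gserviceaccount.com" \
    --role="roles/bigquery.jobUser"
```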

Click on it and go to Keys

Now click on Add key (JSON is the recommended format)

Click on Create; a key file will be downloaded. Store it securely, since we will use it in the connection
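Before uploading the key file to Datasphere, you can sanity-check its contents. This is a small sketch using a synthetic key; real key files contain additional fields such as private_key_id and token URIs:

```python
import json
from pathlib import Path
from tempfile import TemporaryDirectory

# Fields every service-account key file should contain
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_key_file(path):
    """Return (project_id, client_email) if the key file looks valid."""
    data = json.loads(Path(path).read_text())
    missing = REQUIRED_FIELDS - data.keys()
    if missing:
        raise ValueError(f"key file is missing fields: {sorted(missing)}")
    if data["type"] != "service_account":
        raise ValueError("not a service-account key")
    return data["project_id"], data["client_email"]

# Synthetic example key (values are placeholders, not from the article)
sample = {
    "type": "service_account",
    "project_id": "my-first-project-123456",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "dsp-bq-reader@my-first-project-123456.iam.gserviceaccount.com",
}

with TemporaryDirectory() as d:
    key_path = Path(d) / "key.json"
    key_path.write_text(json.dumps(sample))
    project_id, email = check_key_file(key_path)
    # project_id is the value you enter in the Datasphere connection
    print(project_id)
```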

Now, under Create connection in SAP Datasphere (DWC), enter the project ID and select the key file.

Click on Next step and give the connection a technical name

Validate the connection
Now go to the Data Builder in Datasphere and click on Import Remote Tables

Select the connection and then browse your project and select the table

Click Import and Deploy, and there it is: the table is imported.

Finally, the table has been imported successfully into Datasphere. From here you can build Graphical Views, Entity Relationship models, Analytic Models, and so on.
Hope this article was helpful.