
Google Cloud Functions Tutorial : Using gcloud Tool

source link: https://rominirani.com/google-cloud-functions-tutorial-using-gcloud-tool-ccf3127fdf1a


This is part of a Google Cloud Functions Tutorial Series. Check out the series for all the articles.

In the series so far, we have been using the Google Cloud Console to deploy, test and manage Cloud Functions. If you are a command-line person or would like to script the management of Cloud Functions, the gcloud command-line tool is your ideal tool.

The Google Cloud SDK is a set of tools that you can install locally to manage various services on Google Cloud Platform. Google Cloud Functions can be managed via the gcloud functions command group.

Configure your Project for gcloud

Before you run any of these commands, keep in mind that the gcloud tool should be configured to run against your project id. In an earlier post, we saw how to set up your local environment, including how to set a project against which to run your commands. To reiterate:

Do a login first via:

$ gcloud auth login

Next, set the project id via the following command. Use your own project id instead of the PROJECT_ID value below.

$ gcloud config set project PROJECT_ID

Finally, validate that you are logged in and that the project id is correctly set, via the following command:

$ gcloud config list

Check the account and project values and ensure that they were set correctly.

regions, event-types and logs command groups

Assuming that you have the Cloud SDK tools installed on your machine, let us take a look at the first group of commands, especially the regions and event-types groups:

gcloud functions GROUP | COMMAND [GCLOUD_WIDE_FLAG …]

Google Cloud Functions is currently supported in multiple GCP regions. If you run the following command, you get a response listing them, as shown below:

$ gcloud functions regions list
NAME
projects/mindstormclouddemo/locations/europe-west1
projects/mindstormclouddemo/locations/us-east1
projects/mindstormclouddemo/locations/us-central1
projects/mindstormclouddemo/locations/asia-northeast1

Similarly, we can look at the event providers and event types that are currently supported.

$ gcloud functions event-types list
EVENT_PROVIDER EVENT_TYPE EVENT_TYPE_DEFAULT RESOURCE_TYPE RESOURCE_OPTIONAL
cloud.pubsub google.pubsub.topic.publish Yes topic No
cloud.pubsub providers/cloud.pubsub/eventTypes/topic.publish No topic No
cloud.storage google.storage.object.archive No bucket No
cloud.storage google.storage.object.delete No bucket No
cloud.storage google.storage.object.finalize Yes bucket No
cloud.storage google.storage.object.metadataUpdate No bucket No
cloud.storage providers/cloud.storage/eventTypes/object.change No bucket No
google.firebase.analytics.event providers/google.firebase.analytics/eventTypes/event.log Yes firebase analytics No
google.firebase.database.ref providers/google.firebase.database/eventTypes/ref.create Yes firebase database No
google.firebase.database.ref providers/google.firebase.database/eventTypes/ref.delete No firebase database No
google.firebase.database.ref providers/google.firebase.database/eventTypes/ref.update No firebase database No
google.firebase.database.ref providers/google.firebase.database/eventTypes/ref.write No firebase database No
google.firestore.document providers/cloud.firestore/eventTypes/document.create Yes firestore document No
google.firestore.document providers/cloud.firestore/eventTypes/document.delete No firestore document No
google.firestore.document providers/cloud.firestore/eventTypes/document.update No firestore document No
google.firestore.document providers/cloud.firestore/eventTypes/document.write No firestore document No

Finally, you can view the recent Stackdriver logs for your Cloud Functions via the command below:

$ gcloud functions logs read
LEVEL NAME EXECUTION_ID TIME_UTC LOG
D function-1 xohadny5ps6m 2018-10-12 08:41:56.334 Function execution started
D function-1 xohadny5ps6m 2018-10-12 08:41:56.379 Function execution took 46 ms, finished with status code: 200
D function-1 xyhxh9g2qtw9 2018-10-12 08:52:56.598 Function execution started
D function-1 xyhxh9g2qtw9 2018-10-12 08:52:56.670 Function execution took 72 ms, finished with status code: 200...

There are more parameters that we can pass to the logs read command in order to view logs specific to a function, limit the number of log entries, and so on, as sketched below.
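
For example, here is a quick sketch that reads only the most recent entries for a single function (the function-1 name is taken from the log output above and the limit value is just illustrative):

$ gcloud functions logs read function-1 --limit=5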

Deploying and Invoking Cloud Functions using gcloud

Let us take a look at deploying and then invoking a Google Cloud Function using the gcloud utility. The commands that we shall use are deploy and call.


The deploy command for an HTTP Trigger function is shown below:

[Image: a sample gcloud functions deploy command for an HTTP Trigger function]

The above image shows a sample HTTP Trigger based Cloud Function deployment. Note that the name of the function is the name of the function that we export from our code, and the HTTP trigger is specified via the --trigger-http flag. This is not the only way: there are flags to specify where the index.js file lives, which exported function to deploy, and so on (see the sketch after the parameter list below). Refer to gcloud functions deploy --help for more details.

There are a couple of other parameters that we need to provide to this command.

  • --runtime : This will be one of the runtimes currently supported on Google Cloud Functions. At the time of writing, the supported values are nodejs6 (Node.js 6), nodejs8 (Node.js 8) and python37 (Python 3.7).
  • --region : This will be the Google Cloud Platform region that you want to deploy your function to. Multiple regions are now supported; we saw the list earlier via the gcloud functions regions list command. An example value for the region parameter is us-central1.
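
For illustration, here is a sketch of a deploy command that specifies the source directory and the entry point explicitly. The function name myFunction and the path ./my-function-dir are made up for this example; --source points at the directory containing index.js and --entry-point names the exported function to use:

$ gcloud functions deploy myFunction --trigger-http \
--source=./my-function-dir --entry-point=helloGET \
--runtime=nodejs6 --region=us-central1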

For now, let us go to the root folder into which you have cloned the GitHub repository for this tutorial series.

Go to the helloworld-http folder by giving the following command:

$ cd helloworld-http

We will now deploy our function as shown below:

$ gcloud functions deploy helloGET --trigger-http \
--region=us-central1 --runtime=nodejs6
Deploying function (may take a while - up to 2 minutes)...done.
availableMemoryMb: 256
entryPoint: helloGET
httpsTrigger:
  url: https://us-central1-mindstormclouddemo.cloudfunctions.net/helloGET
labels:
  deployment-tool: cli-gcloud
name: projects/mindstormclouddemo/locations/us-central1/functions/helloGET
runtime: nodejs6
serviceAccountEmail: [email protected]
sourceUploadUrl: https://storage.googleapis.com/...
status: ACTIVE
timeout: 60s
updateTime: '2018-10-12T09:47:31Z'
versionId: '1'

Now that the function is deployed, you can verify it via the following command:

$ gcloud functions list
NAME STATUS TRIGGER REGION
helloGET ACTIVE HTTP Trigger us-central1

We can get the details for any function by using the gcloud functions describe <function_name> command as shown below:

$ gcloud functions describe helloGET

This will get you all the information about the function, such as its name, trigger, URL, service account, version, etc.
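
If you only need a single field from that output, one option (shown here as a sketch; the httpsTrigger.url key is taken from the deploy output above) is to use the gcloud-wide --format flag:

$ gcloud functions describe helloGET --format='value(httpsTrigger.url)'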

To invoke the function, we will use the gcloud functions call command as shown below:

$ gcloud functions call helloGET
executionId: kapuuigbhgmn
result: Hello World!

Notice that we were provided a unique executionId and the result from our code, i.e. “Hello World!”.

We can now view the specific logs for our function from Stackdriver logging. Instead of using just logs read which would give us logs across several functions, we can specify the execution id to view the logs specific to that function execution.

The command is shown below (notice that we provide the same execution id value that we received as part of the call command execution):

$ gcloud functions logs read --execution-id=kapuuigbhgmn
LEVEL NAME EXECUTION_ID TIME_UTC LOG
D helloGET kapuuigbhgmn 2018-10-12 09:51:03.906 Function execution started
D helloGET kapuuigbhgmn 2018-10-12 09:51:04.084 Function execution took 179 ms, finished with status code: 200

Passing data when invoking the call command

In the Cloud Function that we tested above, the code was straightforward and did not care about any data passed in the request, since it simply printed out a “Hello World” message.

What if you had to pass data to the function in the request and wanted to use the gcloud functions call command? To do that, let us look at the sample function explained below.

For now, let us go to the root folder into which you have cloned the GitHub repository for this tutorial series.

Go to the hellogreeting-http folder by giving the following command:

$ cd hellogreeting-http

The index.js file has the code as shown below:

exports.helloGreeting = (req, res) => {
  console.log(req.body);

  // Example input: {"name": "GCF"}
  if (req.body.name === undefined) {
    // This is an error case, as "name" is required.
    res.status(400).send('No name defined!');
  } else {
    console.log(req.body.name);
    res.status(200).send('Hello ' + req.body.name);
  }
};

First up, let us deploy our function as shown below:

$ gcloud functions deploy helloGreeting --trigger-http --region=us-central1 --runtime=nodejs6

Once the function is deployed, we can invoke it with the data as given below:

$ gcloud functions call --data '{"name":"Romin"}' helloGreeting
executionId: 36hzafyyt8cj
result: Hello Romin

Similarly, assuming that we have the direct HTTPS URL for the function, we can use the curl utility too, as given below:

$ curl -H "Content-Type: application/json" \
  -X POST \
  -d '{"name":"GCF"}' \
  https://us-central1-<PROJECT_ID>.cloudfunctions.net/helloGreeting

Replace the PROJECT_ID above with your Google Cloud Platform project id value.

Deploying & invoking a Pub/Sub Trigger based Cloud Function

We can use the gcloud functions deploy command to deploy a Pub/Sub based Cloud Function, as sketched below:

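A minimal sketch of such a deployment, assuming the pubsubfunction1 function and the pubsubtopic1 topic mentioned below (both names come from earlier in the series, and the runtime/region values mirror the HTTP example above; adjust them for your own project):

$ gcloud functions deploy pubsubfunction1 --trigger-topic pubsubtopic1 \
--runtime=nodejs6 --region=us-central1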

Earlier in the series, we had deployed our first Cloud Pub/Sub based function, pubsubfunction1. Assuming that we have this function deployed on the topic pubsubtopic1, we can invoke it as given below:

$ gcloud functions call pubsubfunction1 --data '{"data":"R29vZ2xlIENsb3VkIEZ1bmN0aW9ucw=="}'
executionId: e3qjdpr3l18j
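
As an aside, the data field of a Pub/Sub message is base64 encoded, so you can generate the payload yourself. For example, on a Unix-like shell with the base64 utility available, the value used above is simply the encoding of the string "Google Cloud Functions":

$ echo -n "Google Cloud Functions" | base64
R29vZ2xlIENsb3VkIEZ1bmN0aW9ucw==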

$ gcloud functions logs read --execution-id=e3qjdpr3l18j
LEVEL NAME EXECUTION_ID TIME_UTC LOG
D pubsubfunction1 e3qjdpr3l18j 2018-04-01 07:52:47.676 Function execution started
I pubsubfunction1 e3qjdpr3l18j 2018-04-01 07:52:47.679 Google Cloud Functions
D pubsubfunction1 e3qjdpr3l18j 2018-04-01 07:52:47.684 Function execution took 8 ms, finished with status: 'ok'

But what if we could use the gcloud pubsub command itself to test out the function? Sure, we can!

First up, we can list the Pub/Sub topics as given below:

$ gcloud pubsub topics list
name: projects/gcf-live-training/topics/pubsubtopic1

The next thing is to publish a message to this topic as given below:

$ gcloud pubsub topics publish projects/gcf-live-training/topics/pubsubtopic1 --message Hello
messageIds:
- '65108465410522'

We can now get the logs from our function execution as shown below:

$ gcloud functions logs read  --limit=10
D pubsubfunction1 65108465410522 2018-04-01 08:04:30.288 Function execution started
I pubsubfunction1 65108465410522 2018-04-01 08:04:30.290 Hello
D pubsubfunction1 65108465410522 2018-04-01 08:04:30.298 Function execution took 10 ms, finished with status: 'ok'

Deploying & invoking a Google Cloud Storage Trigger based Cloud Function

We can use the gcloud functions deploy command to deploy a Google Cloud Storage based Cloud Function, as sketched below:

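A minimal sketch of such a deployment, assuming the gcs-function-1 function and the gcs-function-bucket1 bucket that appear in the steps below (the names and runtime are carried over from the series; you may also need --entry-point if the exported function name differs from the deployed name):

$ gcloud functions deploy gcs-function-1 --trigger-bucket gcs-function-bucket1 \
--runtime=nodejs6 --region=us-central1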

Assuming that we have the Google Cloud Storage based function deployed, we can use the gsutil utility to upload a file to the specific bucket that we are monitoring and then view the function logs to ensure that the Cloud Function got executed.

First up, let us list the buckets via the command shown below:

$ gsutil ls
gs://gcs-function-bucket1/

You should see at a minimum the bucket name as shown above. This was the bucket that we had created for our GCS based function.

Now, let us copy a sample file package.json to the above bucket via the command shown below:

$ gsutil cp package.json gs://gcs-function-bucket1/
Copying file://package.json [Content-Type=application/json]…
- [1 files][ 105.0 B/ 105.0 B]
Operation completed over 1 objects/105.0 B.

We can then inspect the function execution log as shown below:

$ gcloud functions logs read --limit=20
LEVEL NAME EXECUTION_ID TIME_UTC LOG
D gcs-function-1 65148571114808 2018-04-01 09:23:57.603 Function execution started
I gcs-function-1 65148571114808 2018-04-01 09:23:57.639 Event 65148571114808
I gcs-function-1 65148571114808 2018-04-01 09:23:57.640 Event Type: google.storage.object.finalize
I gcs-function-1 65148571114808 2018-04-01 09:23:57.640 Bucket: gcs-function-bucket1
I gcs-function-1 65148571114808 2018-04-01 09:23:57.640 File: package.json
I gcs-function-1 65148571114808 2018-04-01 09:23:57.640 Metageneration: 1
I gcs-function-1 65148571114808 2018-04-01 09:23:57.640 Created: 2018-04-01T09:23:56.797Z
I gcs-function-1 65148571114808 2018-04-01 09:23:57.640 Updated: 2018-04-01T09:23:56.797Z
D gcs-function-1 65148571114808 2018-04-01 09:23:57.733 Function execution took 131 ms, finished with status: 'ok'

This completes our use of the gcloud functions command group to deploy, manage and invoke our functions. There is also a delete command to remove a Cloud Function when it is no longer needed, as sketched below.
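
For example, to remove the helloGET function we deployed earlier (the region value assumes the region we deployed to):

$ gcloud functions delete helloGET --region=us-central1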

Proceed to the next part: Local Functions Emulator, or go back to the Google Cloud Functions Tutorial Home Page.

