In this chapter, we will finally talk about a fully serverless compute option, that is, Cloud Functions. This means no more servers and no more containers. The service still uses them in the backend, but they aren't visible to the end user. All we need to care about now is the code. Cloud Functions is a Function-as-a-Service (FaaS) offering. This means that you write a function in one of the languages supported by GCP, and it can be triggered by an event or via HTTP. GCP takes care of provisioning and scaling the resources that are needed to run your functions.
Important Note
How does Cloud Functions work in the backend? Again, you don't need to bother with GCP's backend infrastructure, which runs the functions for you. However, being an engineer, I bet you will still search for answers on your own. Cloud Functions uses containers to set up an isolated environment for each function. These are called Cloud Functions instances. If multiple functions are executed in parallel, multiple instances are created.
Exam Tip
Expect Cloud Functions questions to appear in the Cloud Architect exam. You will need to understand what Cloud Functions is and what the most common use cases are. Being able to tell the difference between two types of functions, namely HTTP and backend functions, is also important. Knowing when you would use Cloud Functions rather than other compute options, as well as remembering what programming languages are supported, is crucial. Finally, be sure that you can deploy a function both from the Google Cloud Console and the command line.
In this chapter, we will cover the following topics:
The following are the key Cloud Functions characteristics:
Now that we are aware of the main characteristics, let's look at some use cases.
Now that we have a basic understanding of Cloud Functions, let's have a look at several use cases. Remember that, in each of these use cases, you could still use other compute options. However, it is a matter of delivering the solution quickly, taking advantage of built-in autoscaling, and paying only for what we use.
Instead of using virtual machines for backend computing, you can simply use functions. Let's have a look at some example backends:
Next, let's have a look at some data processing examples.
When it comes to event-driven data processing, Cloud Functions can be triggered whenever a predefined event occurs. When this happens, it can preprocess the data that's passed for analysis with GCP big data services:
Next, let's look at examples of smart applications.
Smart applications allow users to perform various tasks in a smarter way by using a data-driven experience. Some of these are as follows:
Now, let's cover some runtime environment examples.
Cloud Functions are executed in a fully managed environment. The infrastructure and software needed to run a function are handled for you. Each function is single-threaded and runs in an isolated environment with its own context. You don't need to take care of any updates to that environment; it is updated automatically and scaled as needed.
Currently, several runtimes are supported by Cloud Functions, namely the following:
When you define a function, you can also define a requirements or dependencies file in which you state which modules or libraries your function depends on. However, remember that those libraries are loaded when your function instance starts, which adds a delay to execution. We will talk about this in more detail in the Cold start section of this chapter.
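As a hypothetical example, a Python function's dependencies are declared in a `requirements.txt` file placed next to the function source (the package versions below are illustrative, not prescribed by the text):

```
# requirements.txt -- example dependencies for a Python Cloud Function
requests==2.24.0
google-cloud-storage==1.31.0
```

Cloud Functions installs these packages when building the function instance, which is part of the startup delay discussed later.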
In the next section, we will look at types of Cloud Functions.
There are two types of Cloud Functions: HTTP functions and background functions. They differ in the way they are triggered. Let's have a look at each.
HTTP functions are invoked by HTTP(S) requests. The POST, PUT, GET, DELETE, and OPTIONS HTTP methods are accepted. Arguments can be provided to the function using the request body:
The invocation is synchronous, as the function can return a response that was constructed within it.
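As a minimal sketch, an HTTP function in Python receives a request object and returns the response body directly (in the real Cloud Functions runtime, `request` is a Flask `Request`; the function and parameter names here are hypothetical):

```python
def hello_http(request):
    # In the Cloud Functions runtime, 'request' is a Flask Request object.
    # Query-string arguments, if any, are available via request.args.
    name = "World"
    if request is not None and getattr(request, "args", None):
        name = request.args.get("name", "World")
    # The returned value becomes the HTTP response body (synchronous).
    return f"Hello, {name}!"
```

Calling the deployed endpoint with `?name=Alice` would return `Hello, Alice!`; with no arguments it falls back to the default.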
Interesting Fact
Don't expect a question on this on the exam. However, it might be interesting to know that Cloud Functions handles HTTP requests using common frameworks. For Node.js, this is Express 4.16.3; for Python, this is Flask 1.0.2; and for Go, this is the standard http.HandlerFunc interface.
Background functions are invoked by events such as changes in a Cloud Storage bucket, messages in a Cloud Pub/Sub topic, or one of the supported Firebase events:
In the preceding diagram, we can see various triggers for Cloud Functions; that is, Cloud Pub/Sub, Cloud Storage, and Firebase.
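As a minimal sketch, a background function in Python receives an event payload and a context object rather than an HTTP request. Here, a hypothetical Cloud Storage-triggered function reads the name of the changed object from the event dictionary:

```python
def hello_gcs(event, context):
    # event: a dict describing the Cloud Storage object that changed
    #        (keys such as 'name' and 'bucket' are set by the trigger)
    # context: metadata about the event itself (event ID, type, timestamp)
    file_name = event.get("name", "unknown")
    return f"Processing file: {file_name}"
```

Unlike an HTTP function, the return value is not sent back to a caller; it is only used by the runtime to signal completion.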
Next, let's take a look at events.
Events can be defined as things happening in or outside the GCP environment. When they occur, you might want certain actions to be triggered. An example of an event might be a file that's been added to Cloud Storage, a change that was made to your database table, or a new resource that has been provisioned to GCP, to name a few. These events can come from one of the following providers:
If you create a sink to forward the logs to Pub/Sub, then you can trigger Cloud Functions (for more details, check out Chapter 17, Monitoring Your Infrastructure).
Next, let's look at triggers.
For your function to react to an event, a trigger needs to be configured. The actual binding of the trigger happens at deployment time. We will have a look at how to deploy functions with different kinds of triggers in the Deploying Cloud Functions section.
When using Cloud Functions, you should be aware of a couple of features and considerations. Let's have a look at these now.
As we mentioned previously, Cloud Functions is stateless, so state needs to be saved in external storage or a database. This can be done with external storage such as Cloud Storage or a database such as Cloud SQL. In general, any external storage can be used. We introduced Cloud SQL in Chapter 3, Google Cloud Platform Core Services. To remind you, it is a managed MySQL, PostgreSQL, or MS SQL database. With Cloud Functions, you can connect to Cloud SQL using a local socket interface that's provided in the Cloud Functions execution environment. This eliminates the need to expose your database to a public network.
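As a sketch of how that local socket interface is addressed, Cloud Functions exposes each Cloud SQL instance as a Unix socket under `/cloudsql/`, keyed by the instance connection name (the connection name below is hypothetical):

```python
def cloudsql_socket_path(connection_name):
    """Build the Unix socket path that the Cloud Functions execution
    environment provides for a given Cloud SQL instance."""
    # connection_name has the form "project:region:instance",
    # e.g. "my-project:us-central1:my-db" (hypothetical)
    return "/cloudsql/" + connection_name

# A driver such as PyMySQL could then connect through this socket instead
# of a public IP address, for example (not executed here):
#   pymysql.connect(user=..., password=..., db=...,
#                   unix_socket=cloudsql_socket_path("my-project:us-central1:my-db"))
```

Because the socket lives on the local filesystem of the function instance, no public network exposure of the database is required.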
If your function needs to access services within a VPC, you can connect to it directly, bypassing the public network. To do this, you need to create a serverless VPC access connector from the network menu and refer to the connector when you deploy the function. Note that this does not work with Shared VPC or legacy networks.
Cloud Functions allows you to set environment variables that are available during the runtime of the function. These variables are stored in the function's backend and follow the same life cycle as the function itself. These variables are set using the --set-env-vars flag; for example:
gcloud functions deploy env_vars --runtime python37 --set-env-vars FOO=bar --trigger-http
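Inside the function, such variables are read from the process environment like any other environment variable. A minimal sketch in Python (the `FOO` variable matches the deploy command above; the fallback string is hypothetical):

```python
import os

def env_vars(request):
    # FOO is injected at deploy time via --set-env-vars FOO=bar and is
    # available for the whole lifetime of the function instance.
    return os.environ.get("FOO", "FOO is not set")
```

Invoking the function deployed with the command above would return `bar`.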
Important Note
The first time you use Cloud Functions, you will be asked to enable the API.
Next, let's take a look at cold starts.
As we mentioned previously, Cloud Functions runs your code on function instances. New instances are created in the following cases:
Cold starts can impact the performance of your application. Google comes with a set of tips and tricks to help us reduce the impact of cold starts. Check out the Further reading section for a link to a detailed guide.
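One widely recommended mitigation (a sketch, not the only technique in Google's guide) is to cache expensive objects, such as API clients or database connections, in global scope so they are created once per instance rather than once per invocation:

```python
# Module-level cache: survives across invocations on the same instance,
# so the expensive initialization cost is paid only at cold start.
_heavy_client = None

def get_client():
    global _heavy_client
    if _heavy_client is None:
        # Stand-in for an expensive setup step, e.g. creating an API client
        _heavy_client = object()
    return _heavy_client

def handler(request):
    client = get_client()  # reused on warm invocations
    return "ok"
```

On a warm instance, `get_client()` returns the already-initialized object instead of rebuilding it.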
Deploying functions to GCP takes time. If you want to speed up tests, you can use a local emulator. This only works with Node.js and allows you to deploy, run, and debug your functions.
In the next section, we will learn how to deploy a Cloud Function.
Cloud Functions can be deployed using a CLI, the Google Cloud Console, or with APIs. In this section, we will have a look at the first two methods since it's likely that you will be tested on them in the exam.
To deploy Cloud Functions from the Google Cloud Console, follow these steps:
Now, your function is ready to execute.
Now that we have seen how to deploy the function using the Google Cloud Console, it will be easier to explain the parameters and flags for the gcloud command.
To deploy Cloud Functions, we can use the following command:
gcloud functions deploy $FUNCTION_NAME \
--region=$REGION \
--entry-point=$ENTRY_POINT \
--memory=$MEMORY \
--runtime=$RUNTIME \
--service-account=$SERVICE_ACCOUNT \
--source=$SOURCE \
--stage-bucket=$STAGE_BUCKET \
--timeout=$TIMEOUT \
--retry
Here, we have the following options:
Next, let's define our triggers.
After defining the necessary parameters, you can define the following triggers, depending on how you want your function to be initiated.
To define an HTTP trigger, use the following command:
--trigger-http
An endpoint will be assigned to the function.
To trigger a function on changes to a Cloud Storage bucket, use the following command:
--trigger-bucket=$TRIGGER_BUCKET
Here, we have the following option:
To trigger a function on messages that are arriving in a Pub/Sub queue, use the following command:
--trigger-topic=$TRIGGER_TOPIC
Here, we have the following option:
For other sources, such as Firebase, use the following command:
--trigger-event=$EVENT_TYPE
--trigger-resource=$RESOURCE
Here, we have the following options:
Let's have a look at an example of configuring a trigger from Pub/Sub:
gcloud functions deploy hello_pubsub --runtime python37 --trigger-topic mytopic
This will deploy a function called hello_pubsub, which will be triggered whenever a message arrives in the mytopic Pub/Sub topic.
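As a sketch of what that function might look like, a Pub/Sub-triggered function receives the message payload base64-encoded in the event's `data` field (the function name matches the deploy command above; the message contents are hypothetical):

```python
import base64

def hello_pubsub(event, context):
    # Pub/Sub delivers the message payload base64-encoded in event['data']
    message = base64.b64decode(event["data"]).decode("utf-8")
    return f"Received: {message}"
```

Publishing the string `hi` to the mytopic topic would invoke the function with the encoded payload, which it decodes back to `hi`.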
Important Note
You may be interested in looking at some more advanced triggers, such as using Firebase authentication. Check out https://cloud.google.com/functions/docs/calling/ for examples for every possible trigger.
In the next section, we will review the IAM roles that are available for Cloud Functions.
Access to Google Cloud Functions is secured with IAM. Let's have a look at a list of predefined roles, along with a short description of each:
Note that for the Cloud Functions Developer role to work, you must also assign the user the IAM Service Account User role on the Cloud Functions runtime service account.
Google Cloud Functions comes with predefined quotas. These default quotas can be changed via the navigation menu, under IAM & Admin | Quotas. From this menu, we can review the current quotas and request an increase to these limits. We recommend that you become familiar with the limits for each service, as they can have an impact on your scalability. For Cloud Functions, we should be aware of the following three types of quotas:
The list of values is quite extensive. Check out the Further reading section if you wish to see a detailed list.
The price of Cloud Functions consists of multiple factors. These include the number of invocations (Invocations), compute time (Compute time), and outbound network traffic (Networking). These are shown in the following diagram:
Remember that there is a monthly free usage tier that you can play around with without generating any cost. At the time of writing this book, it consists of 2 million invocations, 1 million seconds of compute time, and 5 GB of egress network traffic. Enjoy it!
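As a rough sketch of how the invocation component interacts with the free tier (using only the 2 million free invocations mentioned above; real billing also covers compute time and networking), only usage beyond the free allowance is billed:

```python
def billable_invocations(total_invocations, free_tier=2_000_000):
    # Only invocations beyond the monthly free tier (2 million, per the
    # figures quoted in the text) generate cost.
    return max(0, total_invocations - free_tier)
```

For example, 2.5 million invocations in a month leave 500,000 billable ones, while 100,000 invocations stay entirely within the free tier.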
In this chapter, we talked about Cloud Functions and several use cases where it works perfectly. We talked about two types of functions, namely HTTP and background functions, and also understood that functions can be executed via a particular event or an HTTP request. Finally, we looked at how a function can be deployed both with the Google Cloud Console and with the gcloud command. For the exam, it is important to understand the use cases of Cloud Functions and when using them could be to our advantage.
With this chapter, we have concluded the Google Cloud compute options. In the next chapter, we will have a look at networking.
For more information regarding the topics that were covered in this chapter, take a look at the following resources: