
Creating a Serverless Docker Image



Serverless architectures have been all the rage these days. AWS, Azure, and the Google Cloud Platform have seen the benefits and embraced this new cloud development paradigm. With these tools available, building serverless architectures can be extremely advantageous to organizations that do not want to spend time and money managing their infrastructure in the cloud.

While your cloud provider manages just about everything you need in the cloud, it is still vital to take the time to turn your development process into a well-oiled machine. In my previous article, we took a look at how we can use Docker to supplement your local development experience. This strategy can be helpful while you are building your serverless functions. But serverless functions do not always exist in isolation. It is essential to consider the consumers of your functions. Are you building an API for a single-page application? Are other microservices going to be interacting with your serverless functions? Does your organization heavily rely on Docker? Docker does a great job of allowing different teams to run applications in isolated containers.

How do serverless functions fit into this model? Is it possible to package your serverless application as a Docker image?  

Packaging serverless

Some might argue that you should simply publish your functions directly to the cloud in these cases. One of the key benefits of serverless is that you have the option to pay based on consumption. For example, if you are working in an AWS environment, why not just publish your Lambda functions whenever you need someone else to use them? Well, the majority of the time, you can and should do exactly that. This article focuses on those few use cases (mentioned above) where you may want to package your serverless APIs in a Docker container.

Finding the Right Boundaries 

One of the cool things about Function as a Service (FaaS) offerings is the ability to create, deploy, and scale individual functions. That being said, it is still critical to follow architectural best practices by grouping your functions into logical services and identifying bounded contexts. I could write an entire post on the importance of following this strategy in serverless architectures, but we will not go into the details here. For now, assume functions will be logically grouped, and that grouping will make up our container.

Getting started

This article is a continuation of my previous article, so we will use the serverless framework and the serverless-offline plugin to locally run our environment. If you haven’t read the previous article yet, the following steps will get you up to date.

Create a Serverless Project

If you do not already have one, creating a serverless project is a breeze with the following command.

serverless
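
If you prefer to skip the interactive prompts, the create command will scaffold the same kind of project from a template. The path below is just an example name.

# scaffold a Node.js project; the --path value is only an example
serverless create --template aws-nodejs --path docker-serverless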

Install NPM Packages

Next, we need to set up our project to work with npm and install two serverless packages.

npm init
npm install serverless --save-dev
npm install serverless-offline --save-dev

Update Serverless.yml

If you are creating this project from scratch, there are a couple of updates we need to make to the serverless.yml file. First, we need to add the serverless-offline plugin. Second, we need to expose one of our Lambda functions via HTTP.

service: docker-serverless

provider:
  name: aws
  runtime: nodejs12.x
plugins:
  - serverless-offline

functions:
  hello:
    handler: handler.hello
    events:
      - http:
          path: /
          method: get
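
The hello handler referenced in serverless.yml lives in handler.js. If you scaffolded the project from the default aws-nodejs template, it will look roughly like the sketch below (the message text is just a placeholder).

'use strict';

// handler.hello from serverless.yml; returns a simple JSON response
module.exports.hello = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'Hello from serverless offline!' }),
  };
};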

What’s in the Box?!?

At this point, we should be able to run our serverless application locally with the npx serverless offline command. With our project set up, creating a Docker image isn't much more work. Effectively, we just need to add a Dockerfile to the root of our project. Below is an example of one that works nicely.

FROM node:12.20.0-alpine3.10
# refresh the Alpine package index
RUN apk update

WORKDIR /usr/src/app

# copy the package manifests first so the npm install layer is cached between builds
COPY package*.json ./
RUN npm install

# copy the rest of the project
COPY . .

# serverless-offline listens on port 3000 by default
EXPOSE 3000

# bind to 0.0.0.0 so the server is reachable from outside the container
CMD [ "npx", "serverless", "offline", "--host", "0.0.0.0" ]
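
Since the Dockerfile copies the entire project directory into the image, it is also worth adding a .dockerignore file next to it so node_modules and other local artifacts are not copied in. The entries below are only a suggestion, not part of the original project.

# suggested .dockerignore entries; adjust to your project
node_modules
npm-debug.log
.serverless
.git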

With the Dockerfile added, we can build an image with the following command.

docker build . -t dockerserverless:latest

With our Docker image built, we can start a container with the run command, publishing port 3000 so the API is reachable from the host.

docker run -p 3000:3000 dockerserverless

If everything worked, you should see the following output.

offline: Starting Offline: dev/us-east-1.
offline: Offline [http for lambda] listening on http://0.0.0.0:3002
offline: Function names exposed for local invocation by aws-sdk:
           * hello: docker-serverless-dev-hello

   ┌───────────────────────────────────────────────────────────────────────┐
   │                                                                       │
   │   GET | http://0.0.0.0:3000/dev                                       │
   │   POST | http://0.0.0.0:3000/2015-03-31/functions/hello/invocations   │
   │                                                                       │
   └───────────────────────────────────────────────────────────────────────┘

offline: [HTTP] server ready: http://0.0.0.0:3000 🚀
offline:
offline: Enter "rp" to replay the last request
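
From another terminal on the host, we can hit the GET endpoint shown in the output above to confirm everything is wired up, assuming the container was started with -p 3000:3000 as shown earlier.

# call the hello function through serverless-offline
curl http://localhost:3000/dev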

Closing Thoughts

As I mentioned previously, in most cases it is easier to publish your serverless application directly to AWS or Azure. Since cost is based on usage, there is little concern about being excessively charged during development cycles. That being said, there are some cases where it is nice to be able to run your serverless application locally. I certainly came across a few use cases recently and found Docker to be a great solution!
