Lambda Deployment Using AWS CodePipeline + GitHub

16/03/2019 erbileren

If you host your code on GitHub and are looking for a cloud platform to serve it, AWS is one of the best choices, since there is plenty of documentation and plenty of good samples on the internet. I have also been using AWS services for several years, and the main problem for me is that there is so much information out there that it is sometimes hard to find the best way to do something on AWS. After some unsuccessful attempts, I think I have found a reliable way to use a few AWS services for automatic code deployment.

 

First things First

Before creating a new pipeline, choose the most suitable platform for your project:

Use S3 if you are working on a client-side static web product. It is really easy to configure and serve static files on S3. Think of it as uploading HTML files to Google Drive, except they are accessible over the internet.
Use Lambda if you want to create an API service and don’t want to struggle with server configuration. I strongly recommend checking how serverless architecture works before using Lambda.
Use EC2 if your product requires more complex and custom configuration and the other AWS services don’t meet your requirements. Keep in mind that EC2 is a typical cloud server, similar to a Digital Ocean droplet.

In the rest of this post, I will give instructions for Lambda application deployment with Serverless and AWS CodePipeline.

 

Creating a Serverless Project

When you are developing an API service, using Lambda or another serverless platform is considered a best practice. AWS Lambda is a serverless platform that lets you deploy and run your code without any server configuration or management. It supports several programming languages and is function oriented.

A Sample Application

Here is a simple hello world application written in Node.js. It reads a query parameter (name) and returns a message:

//index.js

// Lambda handler: reads the "name" query parameter and returns a greeting.
module.exports.helloWorld = async (event) => {
  // queryStringParameters is null when the request has no query string at all
  const params = event.queryStringParameters || {};
  let response;

  if (params.name) {
    response = {
      statusCode: 200,
      headers: {
        "Content-Type": "application/json",
        "Access-Control-Allow-Origin": "*"
      },
      body: JSON.stringify({ message: `Hello ${params.name}` })
    };
  } else {
    response = {
      statusCode: 400,
      headers: {
        "Content-Type": "application/json",
        "Access-Control-Allow-Origin": "*"
      },
      body: JSON.stringify({ message: "Please specify a name!" })
    };
  }

  return response;
};

Don’t forget to initialize npm with the “npm init” command in your project to create a package.json file. Since Lambda is function based, we don’t need any server framework (like express).
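For reference, a minimal package.json for this project could look like the sketch below; the name and version are placeholders, and there are no runtime dependencies since the function only uses built-in features:

{
  "name": "hello-world",
  "version": "1.0.0",
  "main": "index.js"
}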

To create a new Lambda application, there are two options:

  1. Creating a function manually using the AWS Lambda console wizard: in this case, you can write, upload, and test your code in your browser. However, you need to configure the integration between your Lambda function and other AWS services, such as API Gateway and CloudWatch, yourself.
  2. Deploying your application using the Serverless framework: you can deploy a Lambda application with a single command.

If you want to use Serverless, you need a configuration file (serverless.yml) in which you define your service and function details:

# serverless.yml

service: hello-world

provider:
  name: aws
  runtime: nodejs8.10
  region: us-east-1
  stage: dev

functions:
  sayHello:
    handler: index.helloWorld
    events:
      - http: 'GET /hello'

Once you have these two files, you are ready to deploy your first Lambda application.

Simply install the serverless package and run the deployment command on your local machine; Serverless will deploy your function and automatically create an endpoint by binding it to API Gateway.

npm install serverless -g
serverless deploy
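When the deployment finishes, Serverless prints the endpoints of your service. You can test the function from the command line; the URL below is only a placeholder, use the endpoint from your own deploy output:

# Replace the URL with the endpoint printed by "serverless deploy"
curl "https://abc123.execute-api.us-east-1.amazonaws.com/dev/hello?name=Erbil"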

In this case, you must deploy your code manually after every change, and the deployment is completely separate from your GitHub repository.

 

Let’s do this automatically by creating a pipeline on AWS CodePipeline…

 

Creating a Pipeline

To create a pipeline, click on Services at the top of the AWS console and start typing “CodePipeline” in the search box.

Clicking the orange “Create pipeline” button redirects you to the pipeline setup window.

Step 1: Pipeline Settings

Specify your pipeline name, service role, and storage target.

If you don’t have a previously defined role, just leave the role selection as “New service role”; it will create a new IAM role for your pipeline automatically. If you need custom permissions for other AWS services, don’t forget to attach additional policies to your role under IAM > Roles.
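If you prefer the command line, you can attach a managed policy to the generated role with the AWS CLI; the role name and policy ARN below are only placeholders for your own values:

# Attach an extra managed policy to the pipeline role (role name and policy ARN are placeholders)
aws iam attach-role-policy \
  --role-name my-pipeline-service-role \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess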

The artifact store selection determines where your bundled code is stored. You can select the default location or point to one of your own S3 buckets. If you want to keep your buckets tidy, I recommend using a dedicated S3 bucket for storing artifacts; otherwise, it becomes difficult to manage S3 buckets as the number of projects increases.
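If you go with a dedicated bucket, you can create it up front with the AWS CLI; the bucket name below is a placeholder and must be globally unique:

# Create a dedicated artifact bucket (placeholder name; S3 bucket names are globally unique)
aws s3 mb s3://my-codepipeline-artifacts --region us-east-1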

Step 2: Source Settings

When you click on “Connect to GitHub”, the page asks you to log in to your GitHub account and grant permission to the application. AWS CodePipeline will add a new webhook to the repository to detect push events automatically. Also in the source window, you need to specify the repository and the branch. The pipeline source tracks changes on a single branch. I haven’t tried multiple branches, but there are some solutions for this around the web. In my opinion, working with a single branch per pipeline is the easiest way to manage things if you have multiple development environments.

Step 3: Build Settings

Even though the build stage is optional in general, you must create a build step for dependency installation and deployment to S3 or Lambda. In this window, you choose the build provider (AWS CodeBuild in our case), the region, and the CodeBuild project details.

I recommend creating the CodeBuild project in this step instead of creating it before the pipeline. You can do this easily by clicking on the “Create project” button.

Build Project Creation

You can create a CodeBuild project using the popup. It asks you to define the details of the project.

Project Details:

The first section identifies the project; fill in the name and description fields.

Environment Settings:

The next section is the environment definition. If you want to use Docker, just specify it and use your own Docker image. In my case, I will continue with a managed Node.js image and a new service role. You can also manage environment variables from this section: if your project needs them, you can add them through the additional configuration section. Keep in mind that you can also specify project-based environment variables in the build specification file, as shown below.
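For reference, project-based environment variables in a buildspec live under an env block; the STAGE variable and its value here are only illustrative:

# buildspec.yml (env section sketch; STAGE is an illustrative variable)
version: 0.2

env:
  variables:
    STAGE: "dev"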

Buildspec:

Buildspec is the build specification of your project. You can either add a specification file to your project and select the “Use a buildspec file” option, or define the commands for each section using “Insert build commands”. In the following, I will write the build commands manually using the buildspec editor.

The default configuration provided by AWS in the editor mode looks like this:

version: 0.2

#env:
  #variables:
     # key: "value"
     # key: "value"
  #parameter-store:
     # key: "value"
     # key: "value"

phases:
  #install:
    #commands:
      # - command
      # - command
  #pre_build:
    #commands:
      # - command
      # - command
  build:
    commands:
      # - command
      # - command
  #post_build:
    #commands:
      # - command
      # - command
#artifacts:
  #files:
    # - location
    # - location
  #name: $(date +%Y-%m-%d)
  #discard-paths: yes
  #base-directory: location
#cache:
  #paths:
    # - paths

Replace the default configuration with the following:

version: 0.2

phases:
  install:
    commands:
      - echo install started...
      - npm install
      - echo install finished.
  pre_build:
    commands:
      - echo pre_build started...
      - npm install -g serverless
      - echo pre_build finished.
  build:
    commands:
      - echo build started...
      - serverless deploy
      - echo build finished.
artifacts:
  files:
    - '**/*'

CodeBuild picks up these commands and runs them in each phase. Since our application is so simple, defining only three phases is enough for now.

  • In the install phase, CodeBuild installs the Node dependencies defined in the package.json file.
  • In the pre_build phase, we install serverless globally, just as we did on the local machine. We need to do this on every build since each build runs in a fresh environment (see the optional cache sketch after this list).
  • In the build phase, our code is deployed to Lambda by running the Serverless deployment command.
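If you enable caching on the CodeBuild project (local or S3 cache), you can also add an optional cache section to the buildspec so the node modules don’t have to be downloaded from scratch on every build; this is only a sketch and is not required for the pipeline to work:

# Optional cache section (requires caching to be enabled on the CodeBuild project)
cache:
  paths:
    - 'node_modules/**/*'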
Logs:

I strongly recommend activating CloudWatch Logs for your CodeBuild project to track build and deployment logs. CodeBuild will assign the group and stream names automatically, so you can leave these fields empty.
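If you prefer the terminal, you can also follow these logs with the AWS CLI (version 2); the log group below assumes the default naming convention, with the project name as a placeholder:

# Tail CodeBuild logs from the terminal (AWS CLI v2; project name is a placeholder)
aws logs tail /aws/codebuild/hello-world-build --follow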

Step 4: Review and Confirmation

When you click “Next”, a summary of your pipeline configuration is shown. If everything is correct, you can create your pipeline by clicking the “Create pipeline” button.

Final: Pipeline View

That’s all. Now you can commit and push your code to GitHub and watch CodePipeline run and deploy your code to Lambda. You can always check the CodeBuild logs by clicking “Details” on the Build stage. If there is an error during the build process, it will be written to the logs and the pipeline run will fail.
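For example, an ordinary push to the tracked branch is enough to trigger the whole pipeline (“master” below is a placeholder for whatever branch you selected in Step 2):

# Any push to the branch selected in Step 2 triggers the pipeline ("master" is a placeholder)
git add .
git commit -m "Update hello world function"
git push origin master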
