Prerequisites
We need to create a separate IAM user for each stage so that the environments remain isolated and the deployment can be staged per environment. To do this, go through the following steps for each user:
- Log in to an AWS account as the root user, and go to the IAM (Identity and Access Management) page.
- Click on Users on the left-hand-side bar, then click on the Add User button and add the username dev-serverless. Enable programmatic access by checking the checkbox. Then click on the Next:Permissions button.
- On the Permissions page, select Attach existing policies directly, and search for and select the AdministratorAccess checkbox. Then click on Next:Review.
- Now check that everything is correct and then click on Create User. This will create the user and show us the access key ID and secret access key. Copy these keys somewhere temporarily.
- Now that we have the keys, we export them as environment variables so that the framework can access them to perform the required operations.
- Repeat the preceding five steps for the sit-serverless and prod-serverless users.
- Install the CloudBees AWS Credentials Jenkins Plugin, which lets us store the AWS access keys as Jenkins credentials.
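Exporting the keys uses the standard AWS environment variable names. A minimal sketch — the key values below are AWS's documented example placeholders, not real credentials; substitute the keys you copied from the IAM console:

```shell
# Export the keys so that the framework (and the AWS CLI) pick them
# up from the environment. Replace the placeholder values with the
# access key ID and secret access key copied from the IAM console.
export AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
export AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
```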
Now, go through the following steps to create the pipeline:
- Git clone the following repository into a directory:
$ git clone https://github.com/shzshi/aws-lambda-dynamodb-mytasks.git
- Go into this directory and build the Docker image with the Dockerfile provided. Running docker images afterwards, we should be able to see the image named aws-lambda-dynamodb-mytasks, as shown in the following code:
$ cd aws-lambda-dynamodb-mytasks
$ docker build --rm -f Dockerfile -t aws-lambda-dynamodb-mytasks:latest .
$ docker images
- Next, we will run the container and open Jenkins on the browser. The initial installation password can be found in the container run logs as shown in the following code:
$ mkdir jenkins
$ docker run --rm -it -p 50000:50000 -p 8080:8080 -v <FULL_PATH_TO_JENKINS_FOLDER>/jenkins:/var/jenkins_home aws-lambda-dynamodb-mytasks:latest
- Go to the browser and open http://localhost:8080. Copy the password from the container run; it should be something like the following output. Once you are logged in, install the suggested plugin and create a Jenkins user for future logins.
6050bfe89a9b463c8e2784060e2225b6
This may also be found at /var/jenkins_home/secrets/initialAdminPassword.
Once Jenkins is up and running, we can go ahead and create a pipeline job. Click on New Item, enter my-serverless-pipeline as the item name, and select the pipeline project. In the job configuration, check the This project is parameterized checkbox, click Add Parameter, and select Credentials Parameter. In the Default Value section of the Credentials Parameter, click on Add and select Jenkins; this opens the Jenkins Credentials Provider page. On this page, in the Kind drop-down menu, select AWS Credentials and add credentials for the dev-serverless, sit-serverless, and prod-serverless users, as shown in the following screenshot. Then, click Add:
Once all the AWS credentials are added, pull them into the Credentials parameter for AWS, as shown in the following screenshot. Make sure that all three types of credential parameter are added, namely dev, sit, and prod:
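Inside the Jenkinsfile, a credentials parameter of this kind looks roughly like the following sketch. This is illustrative only: the parameter name AWS_CREDENTIALS, the stage name, and the serverless deploy command are assumptions, and the actual Jenkinsfile in the repository may differ:

```groovy
// Declarative pipeline sketch: a credentials parameter lets each build
// choose which of the three IAM users (dev/sit/prod) it deploys with.
pipeline {
    agent any
    parameters {
        // Credential type provided by the CloudBees AWS Credentials plugin.
        credentials(name: 'AWS_CREDENTIALS',
                    description: 'AWS credentials for the target environment',
                    defaultValue: 'dev-serverless',
                    credentialType: 'com.cloudbees.jenkins.plugins.awscredentials.AWSCredentialsImpl',
                    required: true)
    }
    stages {
        stage('Deploy to Dev') {
            steps {
                // Bind the selected credentials to the standard AWS env vars.
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding',
                                  credentialsId: 'dev-serverless',
                                  accessKeyVariable: 'AWS_ACCESS_KEY_ID',
                                  secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
                    sh 'serverless deploy --stage dev'
                }
            }
        }
    }
}
```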
You need to create your own Git repository and push the files from https://github.com/shzshi/aws-lambda-dynamodb-mytasks.git into it. Then, open the Jenkinsfile in your favorite editor and comment out the system test code for every environment, as shown in the following code. The reason we are doing this is that the API gateway endpoint the system tests need is only known after the first deployment; we will export it into the Jenkinsfile later:
stage ('System Test on Dev') {
    steps {
        withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', accessKeyVariable: 'AWS_ACCESS_KEY_ID', credentialsId: 'dev-serverless', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
            sh '''
            export TASKS_ENDPOINT=6pgn5wuqeh.execute-api.us-east-1.amazonaws.com/dev
            ./node_modules/mocha/bin/mocha ./test/*.js
            '''
        }
    }
}
Now click on the Pipeline tab and, under Definition, select Pipeline script from SCM. Set the SCM to Git and, in the Repository URL field, add the path of the Git repository you created. This repository contains a Jenkinsfile, a Lambda function, and a test folder. Leave the rest as default and click on Save. Our pipeline is now saved, and it is time to run it; the Jenkinsfile script will orchestrate the pipeline for us.
Click Build with Parameters. You will then see the environment-based build parameter that our pipeline requires to build, test, and deploy our code.
The first run should be without any testing in place, and should only deploy functions and the API gateway for the tasks on the AWS Cloud. The console output will provide us with the endpoints for the tasks, as shown in the following code:
endpoints:
  POST - https://6pgn5wuqeh.execute-api.us-east-1.amazonaws.com/dev/mytasks
  GET - https://6pgn5wuqeh.execute-api.us-east-1.amazonaws.com/dev/mytasks
  GET - https://6pgn5wuqeh.execute-api.us-east-1.amazonaws.com/dev/mytasks/{id}
  PUT - https://6pgn5wuqeh.execute-api.us-east-1.amazonaws.com/dev/mytasks/{id}
  DELETE - https://6pgn5wuqeh.execute-api.us-east-1.amazonaws.com/dev/mytasks/{id}
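Before wiring the endpoint into the Jenkinsfile, you can smoke-test it by hand with curl. This is a sketch: the base URL below is the example from the deployment output (substitute your own), and the JSON body field is an assumption about the tasks API, not taken from the repository:

```shell
# Base URL printed by the deployment (replace with your own endpoint).
BASE="https://6pgn5wuqeh.execute-api.us-east-1.amazonaws.com/dev"

# Create a task (POST), then list all tasks (GET). The fallback echo
# keeps a scripted run from aborting if the stack is not deployed yet.
curl -s -X POST "$BASE/mytasks" \
     -H 'Content-Type: application/json' \
     -d '{"task": "smoke test"}' || echo "POST failed (is the stack deployed?)"
curl -s "$BASE/mytasks" || echo "GET failed (is the stack deployed?)"
```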
Replace the task endpoint in the Jenkinsfile for each environment with the API gateway path listed on the console, as shown in the following code, and uncomment the system test stages. Then save the Jenkinsfile and push it to the Git repository you created. The reason we add the endpoints late in the build is that API gateway endpoints are created dynamically; alternatively, we can have a static endpoint URL by using the API gateway's custom domain names feature:
export TASKS_ENDPOINT=6pgn5wuqeh.execute-api.us-east-1.amazonaws.com/dev
Build the job by clicking on Build with Parameters. Here, we should be able to see that the system test is running along the deployment steps, and the pipeline should be green, as shown in the following screenshot:
In the preceding recipe, we learned how Lambda functions and the API gateway serve the tasks, and how they are deployed to AWS through the Serverless Framework and Jenkins across different environments, such as dev, sit, and prod. We also created and executed system tests. Each deployment creates the Lambda functions and the API gateway in the AWS cloud, and each system test invokes the Lambda functions to perform CRUD operations on DynamoDB. So if you go into DynamoDB, you should see three tables, one created for each environment. You should also be able to see separate functions and an API gateway for each environment.
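The per-environment resources can also be checked from the command line. A sketch, assuming the AWS CLI is installed; the list-tables call is standard, but it only works once that stage's keys are exported, so it is left commented here:

```shell
# One stack per stage means one DynamoDB table per stage.
stages="dev sit prod"

for stage in $stages; do
  echo "verify DynamoDB table for stage: $stage"
  # With that stage's credentials exported, uncomment to list its tables:
  # aws dynamodb list-tables --region us-east-1
done
```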