
Automating build and deployment of Angular 2 and Java application
September 19, 2017
By Blue Pi
Reading Time: 9 minutes

In this article, we are going to look in detail at how we automated the process of building and deploying our application, built on Angular 2 and Spring Boot, through CI/CD and AWS CodeDeploy. As a head start: we are using GitLab CI as the continuous integration tool, AWS S3 for hosting the Angular 2 code, and of course CodeDeploy as the deployment tool.

What is CI/CD?

Continuous Integration means that on every code change pushed to your central repository, you build your application and run the unit tests written through the practice of test-driven development. Continuous Deployment means that you automate the deployment of every build that succeeds and passes its tests.

The tool we are using for CI/CD is GitLab CI. GitLab CI/CD is part of GitLab, a web application with an API that stores its state in a database. It manages projects/builds and provides a nice user interface. To perform the actual build, you need GitLab Runner, which works with GitLab CI/CD through an API and can also be deployed separately. Runners run the jobs that you define in .gitlab-ci.yml.
If you add a .gitlab-ci.yml file to the root directory of your repository and configure your GitLab project to use a Runner, then each commit or push triggers your CI pipeline.
The .gitlab-ci.yml file tells the GitLab Runner what to do; by default, it runs a pipeline with three stages: build, test, and deploy. Stages with no jobs are simply ignored. A minimal sketch of such a file is shown below.
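
To make that default layout concrete, here is a minimal, hypothetical .gitlab-ci.yml sketch (the job names and echo commands are placeholders, not part of our project):

stages:
  - build
  - test
  - deploy

build_job:
  stage: build
  script:
    - echo "compile the application here"

test_job:
  stage: test
  script:
    - echo "run the unit tests here"

deploy_job:
  stage: deploy
  script:
    - echo "publish the build artifacts here"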

This guide assumes that you:

  • have a working GitLab account.
  • have a GitLab project that you would like to use CI for.

On any push to your repository, GitLab will look for the .gitlab-ci.yml file and start jobs
on Runners according to the contents of the file, for that commit.
You can view all pipelines by going to the Pipelines page in your project.

A pipeline is a group of jobs that get executed in stages. All jobs within a stage are executed in parallel if there are enough concurrent runners; if all jobs succeed, the pipeline moves on to the jobs of the next stage, and if any one of the jobs fails, the next stage is not executed.

Okay! So how did we do it and configure CI to do its job? Below is the .gitlab-ci.yml file. The example shows the script for the staging and production build-and-deploy CI process automated through GitLab CI.


variables:
  BETA_CLIENT_BUCKET_NAME: beta-client/
  BETA_BACKEND_BUCKET_NAME: ci-build/beta/
  PROD_CLIENT_BUCKET_NAME: prod-client/
  PROD_BACKEND_BUCKET_NAME: ci-build/prod/

stages:
  - build
  - deploy

before_script:
  - apt-get -q update -y

build_app_beta:
  stage: build
  image: openjdk:latest
  script:
    - curl -sL https://deb.nodesource.com/setup_6.x | bash -
    - apt-get -q install -y nodejs
    - npm --loglevel silent cache clean
    - npm --loglevel silent install -g @angular/cli@latest
    - apt-get -q install -y maven
    - cd client/
    - npm install
    - ng build --prod --aot
    - cd ..
    - cd backend/
    - mvn package
  artifacts:
    paths:
      - client/dist/
      - backend/target/example-project-0.0.1-SNAPSHOT.jar
  when: manual
  only:
    - staging

deploy_app_beta:
  stage: deploy
  image: python:latest
  script:
    - pip install awscli --upgrade
    - aws s3 cp client/dist/. s3://$BETA_CLIENT_BUCKET_NAME --recursive
    - tar -zcvf app_build.tar.gz backend/target/example-project-0.0.1-SNAPSHOT.jar appspec.yml scripts/
    - aws s3 cp ./ s3://$BETA_BACKEND_BUCKET_NAME --recursive --exclude "*" --include "*.tar.gz"
  environment:
    name: example/beta
  dependencies:
    - build_app_beta
  when: manual
  only:
    - staging

build_app_prod:
  stage: build
  image: openjdk:latest
  script:
    - curl -sL https://deb.nodesource.com/setup_6.x | bash -
    - apt-get -q install -y nodejs
    - npm --loglevel silent cache clean
    - npm --loglevel silent install -g @angular/cli@latest
    - apt-get -q install -y maven
    - cd client/
    - npm install
    - ng build --prod --aot
    - cd ..
    - cd backend/
    - mvn package
  artifacts:
    paths:
      - client/dist/
      - backend/target/example-project-0.0.1-SNAPSHOT.jar
  when: manual
  only:
    - master

deploy_app_prod:
  stage: deploy
  image: python:latest
  script:
    - pip install awscli --upgrade
    - aws s3 cp client/dist/. s3://$PROD_CLIENT_BUCKET_NAME --recursive
    - tar -zcvf app_build.tar.gz backend/target/example-project-0.0.1-SNAPSHOT.jar appspec.yml scripts/
    - aws s3 cp ./ s3://$PROD_BACKEND_BUCKET_NAME --recursive --exclude "*" --include "*.tar.gz"
  environment:
    name: example/production
  dependencies:
    - build_app_prod
  when: manual
  only:
    - master

Since our front-end application is in Angular 2, we are hosting it as a static website in AWS S3, as sketched below.
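
As a rough sketch (the bucket name is a placeholder and the bucket policy must additionally allow public reads), enabling static website hosting on a bucket from the AWS CLI looks like this:

# create the bucket that will serve the compiled Angular app (name is illustrative)
aws s3 mb s3://beta-client
# serve index.html as both index and error document, which suits Angular's client-side routing
aws s3 website s3://beta-client/ --index-document index.html --error-document index.html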
So let's break the script down and understand it part by part:

variables:
  BETA_CLIENT_BUCKET_NAME: beta-client/
  BETA_BACKEND_BUCKET_NAME: ci-build/beta/
  PROD_CLIENT_BUCKET_NAME: prod-client/
  PROD_BACKEND_BUCKET_NAME: ci-build/prod/

Here, we have defined the variables for the CI to use, and each of these variables holds a bucket name as its value. We have one bucket for staging and one for production to host the Angular 2 code, and one bucket per environment to hold the Java jar file.

stages:
  - build
  - deploy

We are using only two stages for this example, i.e. build and deploy. The build stage compiles our Angular 2 code for a production-ready deployment, compiles the Java code, and packages the application in a jar file. The deploy stage puts our code into the respective S3 buckets.

before_script:
  - apt-get -q update -y

The before_script keyword is used to define the commands that should be run before all jobs. It can be an array or a multi-line string, as shown in the sketch below.
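
As an illustration, both of the following hypothetical forms express the same commands:

# before_script as an array
before_script:
  - apt-get -q update -y
  - apt-get -q install -y curl

# before_script as a multi-line string
before_script: |
  apt-get -q update -y
  apt-get -q install -y curl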


build_app_beta:
  stage: build
  image: openjdk:latest
  script:
    - curl -sL https://deb.nodesource.com/setup_6.x | bash -
    - apt-get -q install -y nodejs
    - npm --loglevel silent cache clean
    - npm --loglevel silent install -g @angular/cli@latest
    - apt-get -q install -y maven
    - cd client/
    - npm install
    - ng build --prod --aot
    - cd ..
    - cd backend/
    - mvn package
  artifacts:
    paths:
      - client/dist/
      - backend/target/example-project-0.0.1-SNAPSHOT.jar
  when: manual
  only:
    - staging

This is the first of the two jobs we are going to work with. What it does is install all the software required to compile our Angular 2 and Java applications, and then compile them for deployment. The two commands doing this are:


ng build --prod --aot    # for the Angular 2 application
and
mvn package              # for the Java jar file

artifacts:
  paths:
    - client/dist/
    - backend/target/example-project-0.0.1-SNAPSHOT.jar

The artifacts keyword is used to bundle the list of files and directories that should be attached to the job on success. After compilation, the compiled code needs to be bundled up, i.e. the contents of the dist/ folder for the Angular 2 application and the jar file of our Java application, so that the next deploy job can actually deploy them.


when: manual

when: manual is used to trigger the job manually; when can also be set to on_success, on_failure and always. The only keyword restricts the job so that it runs only when changes are pushed to the particular branch, in our case staging. For the job to execute automatically as soon as code is pushed, simply omit the when keyword, as in the variant sketched below.
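
For example, a hypothetical variant of this job that runs automatically on every push to the staging branch would simply drop when: manual (the build.sh script here is a placeholder for the commands shown above):

build_app_beta:
  stage: build
  image: openjdk:latest
  script:
    - ./build.sh   # placeholder for the build commands listed above
  only:
    - staging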


deploy_app_beta:
  stage: deploy
  image: python:latest
  script:
    - pip install awscli --upgrade
    - aws s3 cp client/dist/. s3://$BETA_CLIENT_BUCKET_NAME --recursive
    - tar -zcvf app_build.tar.gz backend/target/example-project-0.0.1-SNAPSHOT.jar appspec.yml scripts/
    - aws s3 cp ./ s3://$BETA_BACKEND_BUCKET_NAME --recursive --exclude "*" --include "*.tar.gz"
  environment:
    name: example/beta
  dependencies:
    - build_app_beta
  when: manual
  only:
    - staging

Here, the code compiled successfully in the previous stage is pushed into the AWS S3 buckets with the help of the AWS CLI; the code snippet below does the actual work of deployment. dependencies is used in conjunction with artifacts to pass artifacts between different jobs: you pass a list of all previous jobs from which artifacts should be downloaded, and you can only list jobs that are executed before the current job.


script:
  - pip install awscli --upgrade
  - aws s3 cp client/dist/. s3://$BETA_CLIENT_BUCKET_NAME --recursive
  - tar -zcvf app_build.tar.gz backend/target/example-project-0.0.1-SNAPSHOT.jar appspec.yml scripts/
  - aws s3 cp ./ s3://$BETA_BACKEND_BUCKET_NAME --recursive --exclude "*" --include "*.tar.gz"

To use the AWS CLI we need an AWS user's Access key ID and Secret access key, which we can set as environment variables in GitLab. GitLab has a separate place for secret variables: go to Settings > Variables.

Whatever you put here will be turned into environment variables. Only an administrator of a project has access to this section.
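
For reference, the AWS CLI automatically reads the standard environment variables, so it is enough to define variables with these names under Settings > Variables (the values below are placeholders):

AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
AWS_DEFAULT_REGION=ap-south-1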
The next two jobs build and deploy in the production environment; the code is the same, the only changes are the S3 bucket names and the branch name.

Now let's move on to the deployment.

Since our Angular 2 code is hosted as a static website in an S3 bucket, it is already deployed. What we need to take care of now is the deployment of the Java application, i.e. the jar file, which can be achieved through AWS CodeDeploy.

What is AWS CodeDeploy?

AWS CodeDeploy is a deployment service that automates application deployments to Amazon EC2 instances or on-premises infrastructure. CodeDeploy can deploy application content stored in S3, a GitHub repository, or AWS CodeCommit. CodeDeploy also helps you avoid downtime during application deployment and provides a handy rollback feature in case of deployment failure.

A few terms which we will use:

    1. AppSpec file: An application specification file, which is unique to AWS CodeDeploy, is a YAML-formatted file used to:
  • Map the source files in your application to their destinations on the EC2 instance.
  • Specify scripts to be run on each instance at various stages of the deployment process.

The AppSpec file manages each deployment as a series of lifecycle events. Lifecycle event hooks, which are defined in the file, allow you to run scripts on an instance. AWS CodeDeploy runs only those scripts specified in the file.

    2. Deployment Application: the unique name that will be given to your deployment application.
    3. Deployment Group: a group of individual instances and/or auto-scaled instances.
    4. Deployment Configuration: lets you decide how you want your code to be deployed: one at a time / half at a time / all at once (see the sketch after this list).
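
These options map to the built-in configurations CodeDeployDefault.OneAtATime, CodeDeployDefault.HalfAtATime and CodeDeployDefault.AllAtOnce. A hypothetical deployment group created from the AWS CLI (the application, group, tag and role names and the account ID are placeholders) would reference one of them like this:

aws deploy create-deployment-group \
  --application-name example-app \
  --deployment-group-name example-group \
  --deployment-config-name CodeDeployDefault.OneAtATime \
  --service-role-arn arn:aws:iam::123456789012:role/CodeDeployServiceRole \
  --ec2-tag-filters Key=Name,Value=example-instance,Type=KEY_AND_VALUE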

For deployment we are going to use a single AWS EC2 instance (Launch Amazon EC2 instance); this instance has already been launched with the AWS CodeDeploy agent and the AWS CLI installed.

We need two IAM roles: one role is for the EC2 instance to have access to S3, and the other role is for CodeDeploy to have access to choose EC2 instances based on the tags defined. We already have the S3 bucket ci-build containing the content for deployment, with the jar file, AppSpec file and scripts packed in a tar.gz.
Policy Role for Code Deploy


{ "Version": "2012-10-17", "Statement": [ { "Action": [ "autoscaling:PutLifecycleHook", "autoscaling:DeleteLifecycleHook", "autoscaling:RecordLifecycleActionHeartbeat", "autoscaling:CompleteLifecycleAction", "autoscaling:DescribeAutoscalingGroups", "autoscaling:PutInstanceInStandby", "autoscaling:PutInstanceInService", "ec2:Describe*" ], "Effect": "Allow", "Resource": "*" } ]
}

Policy Trust for Code Deploy


{ "Version": "2012-10-17", "Statement": [ { "Sid": "", "Effect": "Allow", "Principal": { "Service": [ "codedeploy.ap-south-1.amazonaws.com" ] }, "Action": "sts:AssumeRole" } ]
}

Instance Role for EC2 Instance


{ "Version": "2012-10-17", "Statement": [ { "Action": [ "s3:Get*", "s3:List*" ], "Effect": "Allow", "Resource": "*" } ]
}

Installing the AWS CodeDeploy agent and the AWS CLI


sudo apt-get update -y
#On Ubuntu Server 16.04
sudo apt-get install ruby -y
sudo apt-get install wget -y
cd /home/ubuntu
#Mumbai region
wget https://aws-codedeploy-ap-south-1.s3.amazonaws.com/latest/install
chmod +x ./install
sudo ./install auto
sudo service codedeploy-agent start
echo `sudo service codedeploy-agent status`
#install aws cli
curl -O https://bootstrap.pypa.io/get-pip.py
python get-pip.py
pip install awscli --upgrade

AppSpec file


version: 0.0
os: linux
files:
  - source: appspec.yml
    destination: /home/ubuntu
  - source: scripts/
    destination: /home/ubuntu/scripts/
  - source: backend/target/example-project-0.0.1-SNAPSHOT.jar
    destination: /home/ubuntu
hooks:
  ApplicationStop:
    - location: scripts/application_stop.sh
      timeout: 180
      runas: root
  ApplicationStart:
    - location: scripts/application_start.sh
      timeout: 180
      runas: root
  ValidateService:
    - location: scripts/application_validation.sh
      timeout: 180
      runas: root
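
The three hook scripts referenced above are not listed in this article; purely as an illustration, minimal versions of them could look roughly like this (the jar name, paths and port are assumptions taken from the AppSpec file and our project layout):

#!/bin/bash
# scripts/application_stop.sh - stop any running copy of the jar (ignore errors if none is running)
pkill -f example-project-0.0.1-SNAPSHOT.jar || true

#!/bin/bash
# scripts/application_start.sh - start the jar in the background and capture its output
cd /home/ubuntu
nohup java -jar example-project-0.0.1-SNAPSHOT.jar > /home/ubuntu/app.log 2>&1 &

#!/bin/bash
# scripts/application_validation.sh - give the app time to boot, then fail if it does not respond (port assumed)
sleep 30
curl -sf http://localhost:8080/ > /dev/null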

For more on the AppSpec file, see the AWS documentation on the AppSpec File Structure, the AppSpec File Example, the AppSpec 'files' section and the AppSpec 'hooks' section.

Creating AWS CodeDeploy Application


Sign in to the AWS Console, go to Services and click on "CodeDeploy".

Now, click on the "Create New Application" button. It will open up the prompt to create a new application.

A new window will ask for the details needed to create an application. Enter the Application Name and Deployment Group Name.

Choose the instances to which you want to deploy the code using their tag Key and Value, and choose your Deployment Configuration: One at a time / Half at a time / All at once. This configuration lets you choose how you want to deploy your code.

Then click on the "Create Application" button and your application will be created. Next you have to create a new revision: click on the "Deploy New Revision" button.

Now enter the Application Name and Deployment Group Name. Choose the revision type "My application is stored in Amazon S3". Give the bucket location and the file name (you can also copy the full path of the file from AWS S3 and paste it). Select the "Overwrite the content" option from Content Options. After entering all the details, click on Deploy; your application and code are now being deployed. The same deployment can also be triggered from the command line, as sketched below.
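
For completeness, the equivalent deployment from the AWS CLI would look roughly like this (the application and group names are placeholders; the bucket and key match the staging example above):

aws deploy create-deployment \
  --application-name example-app \
  --deployment-group-name example-group \
  --s3-location bucket=ci-build,key=beta/app_build.tar.gz,bundleType=tgz \
  --file-exists-behavior OVERWRITE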

Once the status appears as Succeeded, you can hit the IP of your instance and reach the application you just deployed.

GitLab CI is a free service, and we use it in conjunction with CI/CD tools for many of our projects, ranging from Java to Node. Check out how to build a dockerized environment for a Node application, for which deployment has also been automated using GitLab CI.

Happy reading! Don’t forget to comment and share.

Kaushal Mewar, Software Engineer

Kaushal Mewar is a Software Engineer, with 2+ years of experience in Software development and IT Infrastructure, currently working with BluePi.
Key areas of interest include Java Technologies, Cloud Computing, Databases, Open Source Technologies and Server Automations.

