Create and deploy a complete CI/CD pipeline using Jenkins, starting with instance setup
Introduction
Welcome to the world of Cloud and Automation! If you're someone who's eager to get hands-on with real-world DevOps tools and practices, you're in for a treat. In this blog, we'll walk through how to deploy an Amazon Clone built with Node.js to an AWS EC2 instance, running inside a Docker container.
But that's not all: we'll be using Terraform to provision our infrastructure and Jenkins to automate the entire CI/CD pipeline. This is a beginner-friendly yet powerful project that combines cloud provisioning, containerization, and automation, three pillars of modern DevOps.
Whether you're just starting out or looking to solidify your skills, this guide will help you get a solid grasp on how things work in a real-world deployment workflow.
So without further ado, let's dive in and bring this project to life!
Pre-Requisites
Before we roll up our sleeves and start deploying, let's make sure you've got everything set up and ready to go. Here are the essentials you'll need for this project:
- An AWS Account: You should have an AWS account ready with an IAM user that has full EC2 access and the AWS CLI configured. This is where we'll be provisioning our infrastructure.
- Basic Knowledge of Docker: You don't need to be a Docker expert, but a basic understanding of images, containers, and how Dockerfiles work will help you a lot during this project.
If you're new to either of these tools, no worries: I'll walk you through each step. Now that you're geared up, let's move on to setting up our infrastructure with Terraform!
Step 1: Test the Application Locally
Before we jump into cloud deployment, it's always a good idea to test the application locally. This helps ensure everything works as expected before we automate it.
Our Amazon Clone app (built with Node.js) is hosted on GitHub. Clone the repository using the following commands:
```bash
git clone https://github.com/Pravesh-Sudha/amazon-clone.git
cd amazon-clone/
```
Now, ensure your Docker engine is running. We'll build the Docker image for the Amazon Clone:
```bash
docker build -t amazon-clone .
```
This command will build the image and tag it as `amazon-clone:latest`.
Once the image is ready, run the container with the following command:
```bash
docker run -p 3000:3000 --name amazon-clone amazon-clone:latest
```
Now, open your browser and navigate to:
```bash
http://localhost:3000
```
You should see the Amazon Clone application running locally!
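As a side note, the `docker build` step above works because the repository already ships with a Dockerfile. If you're curious what such a file typically contains, here is a minimal, generic sketch for a Node.js app; it is only illustrative and not necessarily identical to the one in the repo:

```dockerfile
# Generic sketch of a Node.js Dockerfile (illustrative; the repo's own Dockerfile is authoritative)
FROM node:16-alpine

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY package*.json ./
RUN npm install

# Copy the application source and expose the port the app listens on
COPY . .
EXPOSE 3000

CMD ["npm", "start"]
```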
Step 2: Install Terraform
With local testing complete, it's time to move towards the cloud. We'll provision infrastructure on AWS using Terraform.
If you're on Ubuntu (amd64), install Terraform using the following commands:
```bash
sudo apt install wget -y
wget -O- https://apt.releases.hashicorp.com/gpg | sudo gpg --dearmor -o /usr/share/keyrings/hashicorp-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/hashicorp-archive-keyring.gpg] https://apt.releases.hashicorp.com $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/hashicorp.list
sudo apt update && sudo apt install terraform
```
To verify the installation:
```bash
terraform -version
```
If you've already configured the AWS CLI with your IAM user credentials, you're all set. Terraform will use that configuration to create resources.
Now initialize the Terraform setup:
```bash
cd amazon-clone/Config
terraform init
```
Step 3: Configure Terraform Before Apply
Before running Terraform, we need to tweak the default EC2 configuration. Open the file `Config/main.tf`.
Here's the block that needs your attention:
```hcl
resource "aws_instance" "web" {
  ami                    = "ami-020cba7c55df1f615" # Replace with your preferred Ubuntu AMI ID
  instance_type          = "t2.medium"
  key_name               = "default-ec2" # Replace with your actual key pair name from AWS
  vpc_security_group_ids = [aws_security_group.Jenkins-sg.id]
  user_data              = templatefile("./install_tools.sh", {})

  tags = {
    Name = "amazon clone"
  }

  root_block_device {
    volume_size = 30
  }
}
```
- AMI ID: Use an Ubuntu AMI (you can find it in your AWS EC2 console).
- Key Name: Use a valid key pair name that exists in your AWS account. If you don't have one, go to the EC2 Dashboard → Key Pairs → Create Key Pair.
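The block above also references `aws_security_group.Jenkins-sg`, which is defined elsewhere in the repo's Config directory. As a rough idea of what it has to allow, here is a minimal sketch; the resource name comes from the config above, but the exact ports and CIDR ranges are assumptions based on the services we access later (SSH, Jenkins on 8080, SonarQube on 9000, and the app on 3000):

```hcl
# Illustrative sketch only - the repo's Config directory contains the real security group definition
resource "aws_security_group" "Jenkins-sg" {
  name        = "Jenkins-sg"
  description = "Allow SSH, Jenkins, SonarQube and app traffic"

  # One ingress rule per port we need to reach from the outside
  dynamic "ingress" {
    for_each = [22, 8080, 9000, 3000] # SSH, Jenkins, SonarQube, Amazon Clone app
    content {
      from_port   = ingress.value
      to_port     = ingress.value
      protocol    = "tcp"
      cidr_blocks = ["0.0.0.0/0"]
    }
  }

  # Allow all outbound traffic
  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}
```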
What is `install_tools.sh`?
Notice the `user_data` script in the Terraform config:
```hcl
user_data = templatefile("./install_tools.sh", {})
```
This Bash script will automatically install tools like:
- Docker
- Jenkins
- SonarQube
- Trivy
- Any other required dependencies
So no need to SSH into the instance and manually set them up - just sit back and let Terraform + cloud-init do the job!
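The actual script ships with the repo as `install_tools.sh`, so you don't need to write it yourself. Just to make the idea concrete, a user-data script of this kind usually looks roughly like the sketch below; the package sources and versions here are assumptions, not a copy of the repo's script:

```bash
#!/bin/bash
# Illustrative sketch of a user-data script - the repo's install_tools.sh is the real source of truth
set -e

# Docker
apt-get update -y
apt-get install -y docker.io
usermod -aG docker ubuntu

# Jenkins (needs Java)
apt-get install -y fontconfig openjdk-17-jre
wget -O /usr/share/keyrings/jenkins-keyring.asc https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key
echo "deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] https://pkg.jenkins.io/debian-stable binary/" > /etc/apt/sources.list.d/jenkins.list
apt-get update -y
apt-get install -y jenkins

# SonarQube, commonly run as a container on port 9000
docker run -d --name sonar -p 9000:9000 sonarqube:lts-community

# Trivy
apt-get install -y wget gnupg lsb-release
wget -qO - https://aquasecurity.github.io/trivy-repo/deb/public.key | gpg --dearmor -o /usr/share/keyrings/trivy.gpg
echo "deb [signed-by=/usr/share/keyrings/trivy.gpg] https://aquasecurity.github.io/trivy-repo/deb $(lsb_release -sc) main" > /etc/apt/sources.list.d/trivy.list
apt-get update -y
apt-get install -y trivy
```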
Step 4: Apply the Terraform Configuration
Now, let's deploy the infrastructure:
```bash
terraform apply --auto-approve
```
Give it a few minutes (around 5): Terraform will spin up the EC2 instance with the required configuration. Once it's up, you'll have an environment ready to run the Amazon Clone app inside a Docker container, with all the necessary tools installed.
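If you'd rather not dig through the AWS console for the instance's public IP, one small optional addition is an output block in the Terraform config (this assumes the `aws_instance.web` resource name from the block we edited earlier):

```hcl
# Optional: print the instance's public IP after apply (assumes the resource is named aws_instance.web)
output "instance_public_ip" {
  value = aws_instance.web.public_ip
}
```

After `terraform apply` completes, the IP appears in the outputs, and you can re-print it at any time with `terraform output instance_public_ip`.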
Step 5: Setting Up SonarQube and Jenkins
Now that our EC2 instance is up and running, and our `install_tools.sh` script has pre-installed Jenkins and SonarQube, let's start setting them up for use in our CI/CD pipeline.
Accessing SonarQube & Generating a Token
Go to your browser and visit:
```bash
http://<Your-EC2-Public-IP>:9000
```
You'll land on the SonarQube login screen.
Use the default credentials:
* **Username:** `admin`
* **Password:** `admin`
After the first login, SonarQube will prompt you to change the default password.
Once logged in, go to:
Administration → Security → Users → Tokens
Click Generate Token, give it a name like `jenkins`, and copy/save the token safely - we'll use it later to integrate with Jenkins.
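If you want to quickly confirm the token works before wiring it into Jenkins, SonarQube's web API can validate it from your terminal (an optional sanity check; replace the placeholders with your token and EC2 IP):

```bash
# Should print {"valid":true} - the token is passed as the username with an empty password
curl -u <YOUR_SONARQUBE_TOKEN>: http://<Your-EC2-Public-IP>:9000/api/authentication/validate
```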
Step 6: Setting Up Jenkins
Get Jenkins Admin Password
1. Head to your AWS EC2 Dashboard.
2. Select your running Amazon Clone instance.
3. Click Connect → EC2 Instance Connect.
Once inside the instance, run:
```bash
sudo su
cat /var/lib/jenkins/secrets/initialAdminPassword
```
Copy this password and go to your browser:
```bash
http://<Your-EC2-Public-IP>:8080
```
Paste the password to unlock Jenkins and click Continue.
Choose Install Suggested Plugins.
Set up your first admin user (I named mine `admin`, feel free to choose your own).
Install Required Jenkins Plugins
Go to:
Manage Jenkins → Plugins → Available Plugins
Install the following:
- Eclipse Temurin Installer
- SonarQube Scanner
- NodeJS Plugin
- Docker Pipeline
- Docker Commons
- Docker API
- Docker Build Step
Once installed, restart Jenkins.
Add SonarQube Token in Jenkins Credentials
Now let's securely store the token we generated from SonarQube.
Navigate to:
Manage Jenkins → Credentials → Global → Add Credentials
Choose:
* **Kind:** Secret text
* **Secret:** Paste the SonarQube token here
* **ID:** `jenkins`
Create Sonar Project & Token
Now go back to:
```bash
http://<Your-EC2-Public-IP>:9000
```
Create a new project manually.
Give it a name like `Amazon`, choose Locally, and generate another token for this project.
Add DockerHub Credentials to Jenkins
Let's store DockerHub credentials for pushing images from Jenkins.
Go to:
Manage Jenkins → Credentials → Global → Add Credentials
- Select:
* **Kind:** Username and Password
* **Username:** Your DockerHub username
* **Password:** Your DockerHub password
* **ID:** `docker`
Step 7: Install Tools in Jenkins
Let's configure all the necessary tools for our CI/CD pipeline:
JDK Installation
Go to:
Manage Jenkins → Tools → JDK installations
Click Add JDK:
* **Name:** `jdk17`
* Check **Install automatically**, then select:
* **Install from** [**adoptium.net**](http://adoptium.net)
* **Version:** `jdk17.0.9.1+1`
Node.js Installation
Still under Tools:
Add a NodeJS installation:
* **Name:** `node16`
* **Version:** `16.2.0`
Docker Installation
Add Docker:
* Enable **Install automatically**
* **Version:** `latest`
SonarQube Scanner Installation
Add SonarQube Scanner:
* **Name:** `sonar-scanner`
OWASP Dependency Check
Add:
* **Name:** `DP-Check`
* Check: **Install Automatically**
* **Install from:** [github.com](http://github.com)
Step 8: Configure Global SonarQube Settings
Now link Jenkins with your SonarQube server:
Go to:
Manage Jenkins → System
Scroll to SonarQube Servers
Add a new server:
* **Name:** `sonar-server`
* **Server URL:**
```bash
http://<Your-EC2-Public-IP>:9000
```
* **Authentication Token:** Choose the credential ID (`jenkins`) created earlier
This completes our SonarQube and Jenkins setup: all tools and integrations are ready for our CI/CD pipeline.
Step 9: Create Jenkins Pipeline for CI/CD
Now that Jenkins is fully configured with all the necessary tools and integrations, it's time to bring everything together in a Jenkins Pipeline.
Create a New Pipeline Job
1. Go to your Jenkins Dashboard.
2. Click New Item → select Pipeline → name it `amazon-clone`.
3. Scroll down to the Pipeline Script section.
Paste the following Jenkinsfile code:
```groovy
pipeline {
    agent any
    tools {
        jdk 'jdk17'
        nodejs 'node16'
    }
    environment {
        SCANNER_HOME = tool 'sonar-scanner'
    }
    stages {
        stage('Clean Workspace') {
            steps {
                cleanWs()
            }
        }
        stage('Checkout from Git') {
            steps {
                git branch: 'main', url: 'https://github.com/Pravesh-Sudha/amazon-clone.git'
            }
        }
        stage('SonarQube Analysis') {
            steps {
                withSonarQubeEnv('sonar-server') {
                    sh '''$SCANNER_HOME/bin/sonar-scanner \
                        -Dsonar.projectName=Amazon \
                        -Dsonar.projectKey=Amazon'''
                }
            }
        }
        stage('Quality Gate') {
            steps {
                script {
                    waitForQualityGate abortPipeline: false, credentialsId: 'jenkins'
                }
            }
        }
        stage('Install Dependencies') {
            steps {
                sh 'npm install'
            }
        }
        stage('OWASP FS Scan') {
            steps {
                dependencyCheck additionalArguments: '--scan ./ --disableYarnAudit --disableNodeAudit', odcInstallation: 'DP-Check'
                dependencyCheckPublisher pattern: '**/dependency-check-report.xml'
            }
        }
        stage('Trivy File System Scan') {
            steps {
                sh 'trivy fs . > trivyfs.txt'
            }
        }
        stage('Docker Build & Push') {
            steps {
                script {
                    withDockerRegistry(credentialsId: 'docker', toolName: 'docker') {
                        sh 'docker build -t amazon-clone .'
                        sh 'docker tag amazon-clone pravesh2003/amazon-clone:latest'
                        sh 'docker push pravesh2003/amazon-clone:latest'
                    }
                }
            }
        }
        stage('Trivy Image Scan') {
            steps {
                sh 'trivy image pravesh2003/amazon-clone:latest > trivyimage.txt'
            }
        }
        stage('Deploy to Container') {
            steps {
                sh 'docker run -d --name amazon-clone -p 3000:3000 pravesh2003/amazon-clone:latest'
            }
        }
    }
}
```
Don't forget to update the DockerHub username (`pravesh2003`) with your own Docker ID.
Quick Breakdown of What's Happening
This pipeline covers the entire DevSecOps lifecycle:
- Clean Workspace: Clears the previous build directory to avoid conflicts.
- Git Checkout: Pulls the latest code from your repository.
- SonarQube Analysis: Analyzes code quality and vulnerabilities.
- Quality Gate: Ensures the code meets SonarQube's quality thresholds before continuing.
- Install Dependencies: Installs Node.js dependencies via `npm`.
- OWASP FS Scan: Scans for known dependency vulnerabilities.
- Trivy FS Scan: Performs a filesystem security scan for additional security insights.
- Docker Build & Push: Builds the Docker image and pushes it to DockerHub.
- Trivy Image Scan: Scans the Docker image for security vulnerabilities.
- Deploy to Container: Runs the final image on the EC2 instance (see the note on re-runs just below).
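One caveat worth knowing about the Deploy stage: `docker run --name amazon-clone` will fail on the next pipeline run if a container with that name is still around. A small optional tweak (not part of the original pipeline above) is to remove any old container before starting the new one:

```groovy
stage('Deploy to Container') {
    steps {
        // Remove a leftover container from a previous run, if any, then start the new one
        sh 'docker rm -f amazon-clone || true'
        sh 'docker run -d --name amazon-clone -p 3000:3000 pravesh2003/amazon-clone:latest'
    }
}
```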
Step 10: Application Live on EC2
Once the pipeline runs successfully, head to your browser:
```bash
http://<Your-EC2-Public-IP>:3000
```
You'll see your Amazon Clone application live and ready to go!
You can also visit:
```bash
http://<Your-EC2-Public-IP>:9000
```
to view the SonarQube dashboard, including detailed reports on code quality, bugs, and vulnerabilities.
Step 11: Tear Down AWS Resources (To Save Cost)
Before we wrap up the project, let's clean up and destroy the infrastructure we created. This is an important habit when working with cloud services: you don't want to rack up unnecessary charges.
To do that, run the following command from the project directory:
```bash
cd amazon-clone/Config
terraform destroy --auto-approve
```
In a couple of minutes, Terraform will remove all the resources (EC2 instance, security groups, etc.) from your AWS account.
Final Thoughts
And that's a wrap!
In this hands-on project, you learned how to:
- Test and Dockerize a Node.js application locally
- Provision infrastructure on AWS using Terraform
- Set up a full CI/CD pipeline with Jenkins
- Build and push Docker images to DockerHub
- Scan for vulnerabilities with Trivy, OWASP Dependency Check, and SonarQube
- Deploy and run your app on an AWS EC2 instance
This end-to-end pipeline is an excellent demonstration of modern DevSecOps practices, combining infrastructure-as-code, CI/CD automation, container security, and static code analysis, all in one place!
If you enjoyed this guide and want to explore more projects around Cloud, DevOps, and AI, feel free to connect with me:
- GitHub: Pravesh-Sudha
- YouTube: pravesh-sudha
- Blog: blog.praveshsudha.com
- Twitter: praveshstwt
- LinkedIn: Pravesh Sudha
Thanks for reading!
Until next time, keep building, keep learning.