In this exercise we'll break free from the chains of point'n'click Jenkins by introducing pipeline as code in the form of a `Jenkinsfile`. Following this we will introduce some new Jenkins slaves that will be used in later exercises.
There are a number of ways pipeline as code can be achieved in Jenkins.
* The Job DSL Plugin - this is a slightly older but very functional DSL mechanism for creating reusable pipelines. You write a Groovy file using the Jenkins Domain Specific Language to define jobs, functions and other items. Jenkins then executes this file, which builds all of the `config.xml` files needed for each job.
* The Scripted Pipeline - the scripted pipeline introduced the `Jenkinsfile` and the ability for developers to write their Jenkins setup as Groovy code. Jenkins can be pointed at a repo with a `Jenkinsfile` in its root and it will automatically build out each of the stages described within. The scripted pipeline is ultimately Groovy at its core.
* The Declarative Pipeline - this approach aims to simplify and opinionate what you can do and when you can do it in a pipeline. It does this by giving you top-level blocks which define sections, directives and steps. The declarative syntax is not run as Groovy, but you can execute Groovy inside `script {}` blocks. Its advantages over the scripted pipeline are validation of the configuration and a lighter approach, without the requirement to understand all of the Groovy syntax.
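For illustration, a minimal declarative `Jenkinsfile` might look like the following (the agent label, stage and step contents are made up for this sketch, not taken from the exercise's pipeline):

```groovy
pipeline {
    // run the whole build on a named slave; "jenkins-slave-npm" is just an example label
    agent { label "jenkins-slave-npm" }
    environment {
        APP_NAME = "my-app"
    }
    stages {
        stage("test") {
            steps {
                sh "echo running tests for ${APP_NAME}"
            }
        }
    }
    post {
        // always runs regardless of build result
        always {
            echo "build finished"
        }
    }
}
```

The same build written as a scripted pipeline would be plain Groovy (`node { stage("test") { ... } }`) with no enforced structure or validation.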
As a learner you will be able to:
- Use a Jenkinsfile to create a declarative pipeline to build, bake and deploy the Todolist App
- Identify the differences between scripted, declarative and DSL pipelines
- Create Jenkins slave nodes for use in builds in future exercises
The goal of this exercise is to move to using the Jenkinsfile in the todolist-api and todolist-fe projects. Additionally we will create new slaves for use in the next exercise.
On Jenkins; create a multibranch pipeline project to scan the GitLab endpoint for each app. Use the Jenkinsfile provided to run the stages. Replace `<YOUR_NAME>` with the appropriate variable.
Create two new Jenkins slaves for the OWASP ZAP scanner and the Arachni WebCrawler.
This is a fairly structured guide with references to exact filenames and sections of text to be added.
In this exercise we'll replace the Pipeline we created in Exercise 2 with a Jenkinsfile approach.

On your terminal, navigate to your todolist-api project and checkout the pipeline feature branch that's already been created for you.

```bash
git checkout feature/jenkinsfile
```
Open up your todolist-api application in your favourite editor and move to the `Jenkinsfile` in the root of the project. The high-level structure of the file is shown collapsed below.
Some of the key things to note:

* `pipeline {}` is how all declarative Jenkins pipelines begin.
* `environment {}` defines environment variables to be used across all build stages.
* `options {}` contains specific job specs you want to apply globally across the jobs, e.g. setting the terminal colour.
* `stage {}` - all jobs must have at least one stage. This is the logical part of the build that will be executed, e.g. `bake-image`.
* `steps {}` - each `stage` has one or more steps involved. These could be execute shell, git checkout, etc.
* `agent {}` specifies the node the build should be run on, e.g. `jenkins-slave-npm`.
* `post {}` hook is used to specify the post-build actions. Jenkins declarative pipeline syntax provides very useful callbacks for `success`, `failure` and `always`, which are useful for controlling the job flow.
* `when {}` is used for flow control. It can be used at the stage level to stop the pipeline entering that stage, e.g. when the branch is master, deploy to the `test` environment.

The Jenkinsfile is mostly complete and does all the testing etc. that was done in previous exercises. Some minor changes are needed to orchestrate namespaces. Find and replace all instances of `<YOUR_NAME>` in the Jenkinsfile. Update the `<GIT_USERNAME>` to the one you login to the cluster with; this variable is used in the namespace of your git projects when checking out code etc. Ensure the `GITLAB_DOMAIN` matches your git host.
```groovy
environment {
    // Global Vars
    PIPELINES_NAMESPACE = "<YOUR_NAME>-ci-cd"
    APP_NAME = "todolist-api"

    JENKINS_TAG = "${JOB_NAME}.${BUILD_NUMBER}".replace("/", "-")
    JOB_NAME = "${JOB_NAME}".replace("/", "-")

    GIT_SSL_NO_VERIFY = true
    GIT_CREDENTIALS = credentials('jenkins-git-creds')
    GITLAB_DOMAIN = "gitlab.apps.<SOME_DOMAIN>.com"
    GITLAB_PROJECT = "<GIT_USERNAME>"
}
```

With these changes in place, push your changes to the feature/jenkinsfile branch.

```bash
git add Jenkinsfile
git commit -m "ADD - namespace and git repo to pipeline"
git push
```
When the changes have been successfully pushed, open Jenkins.

Create a `New Item` on Jenkins. Give it the name todolist-api and select `Multibranch Pipeline` from the bottom of the list as the job type.
On the job's configure page, set the Branch Sources to `git`. Fill in the Git settings with your todolist-api GitLab url and set the credentials as you've done before: https://gitlab.apps.lader.rht-labs.com/<YOUR_NAME>/todolist-api.git

Set the `Scan Multibranch Pipeline Triggers` to be periodic and the interval to 1 minute. This will poll the GitLab instance for new branches or change sets to build.
Save the Job configuration to run the initial scan. The log will show scans for the master and develop branches, which have no `Jenkinsfile` so are skipped. The resulting view will show the feature/jenkinsfile job, corresponding to the only branch that currently has one. The build should run automatically.
The pipeline file is set up to only run the bake and deploy stages when on the master or develop branch. This provides very fast feedback for team members working on feature or bug fix branches: each time someone commits or creates a new branch, a basic build with testing occurs to give very rapid feedback to the team. Let's now update our master and develop branches to include the Jenkinsfile and delete the feature branch.

```bash
git checkout develop
git merge feature/jenkinsfile # you may get merge conflicts at this point
git add .
git commit -m "Jenkinsfile updates"
git checkout master
git merge develop
git push -u origin --all
# this is to delete the branch from the remote
git push origin :feature/jenkinsfile
```
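The branch gating described above can be expressed in declarative syntax with a `when {}` block; a sketch of the idea (the exact condition in the provided Jenkinsfile may differ):

```groovy
stage("bake") {
    // only enter this stage on the main branches; BRANCH_NAME is set
    // automatically by multibranch pipeline jobs
    when {
        expression { env.BRANCH_NAME ==~ /(master|develop)/ }
    }
    steps {
        sh "echo baking image"
    }
}
```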
Back on Jenkins we should see that our todolist-api pipelines have changed, with develop and master now appearing. The feature branch's job should have gone away, since that branch was deleted.
With the builds running for develop and master, we can explore the Blue Ocean View for Jenkins. On the Job overview page, hit the `Open Blue Ocean` button on the side to see what modern Jenkins looks like.
We can move on to the todolist-fe job. The process is the same as before; checkout the feature branch:

```bash
cd todolist-fe
git checkout feature/jenkinsfile
```
Open up your todolist-fe application in your favourite editor and move to the `Jenkinsfile` in the root of the project. Update all `<YOUR_NAME>` and `<GIT_USERNAME>` instances as you did before, including in the prepare environment steps. Check the `GITLAB_DOMAIN` is set too.
Commit your changes to your feature branch as you did previously.

```bash
git add Jenkinsfile
git commit -m "ADD - namespace and git repo to pipeline"
git push
```

This time update your master and develop branches before creating the config in Jenkins.

```bash
git checkout develop
git merge feature/jenkinsfile # you may get merge conflicts at this point
git add .
git commit -m "Jenkinsfile updates"
git checkout master
git merge develop
# this is to delete the branch from the remote
git push origin :feature/jenkinsfile
git push -u origin --all
```
On Jenkins; create a new `Multibranch Pipeline` job called todolist-fe.

Add the todolist-fe git repository and set the credentials for git accordingly.
Set the trigger to scan every minute as done previously. Save the configuration and we should see the collection of Jobs as shown below.
Run the jobs and validate the app is working as expected in the `test` environment!
This exercise adds a new BuildConfig to our cluster for the todolist apps to run their pipelines in OpenShift using the OpenShift Jenkins Sync Plugin. We will use the OpenShift Applier to create the content in the cluster.

Open the todolist-fe app in your favourite editor. Move to the .openshift-applier directory and explore template/ocp-pipeline. This template creates an OpenShift BuildConfig with a Jenkinsfile from a given repo; in this case it will be the `Jenkinsfile` at the root of our application.
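The heart of such a template is a `BuildConfig` using the `JenkinsPipeline` build strategy; a minimal sketch of the shape (names and the repo url here are illustrative, not necessarily those used in the provided template):

```yaml
apiVersion: v1
kind: BuildConfig
metadata:
  name: todolist-fe-pipeline   # illustrative name
spec:
  source:
    type: Git
    git:
      uri: "https://gitlab.example.com/user/todolist-fe.git"  # assumption: your repo url
      ref: develop
  strategy:
    type: JenkinsPipeline
    jenkinsPipelineStrategy:
      # path to the Jenkinsfile relative to the repo root
      jenkinsfilePath: Jenkinsfile
```

When this BuildConfig is created, the Jenkins Sync Plugin mirrors it as a job in Jenkins, so triggering the build in OpenShift runs the pipeline in Jenkins.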
Open the params/ocp-pipeline file and update `PIPELINE_SOURCE_REPOSITORY_URL` with the git url of your project (don't forget to add the `.git` at the end). For example:

```
PIPELINE_SOURCE_REPOSITORY_URL=https://gitlab.apps.<SOME_DOMAIN>.com/<GIT_USERNAME>/todolist-fe.git
PIPELINE_SOURCE_REPOSITORY_REF=develop
NAME=todolist-fe
```
Create a new object in inventory/group_vars/all.yml to drive the ocp-pipeline template with the parameters file you've just created. It can be put under the existing todolist-fe-build object.
```yaml
# illustrative sketch - mirror the structure of the existing objects in all.yml
- name: "ocp-pipeline"
  template: "{{ playbook_dir }}/templates/ocp-pipeline.yml"
  params: "{{ playbook_dir }}/params/ocp-pipeline"
  tags:
  - pipeline
```
Use the OpenShift Applier to create the cluster content:

```bash
cd .openshift-applier
ansible-playbook apply.yml -i inventory/ \
    -e "filter_tags=pipeline"
```
With these changes in place, commit your changes to GitLab:

```bash
git add .
git commit -m "ADD - ocp pipeline in git repo"
git push
```
Login to your OpenShift Cluster and go to the <YOUR_NAME>-ci-cd namespace. On the side menu, hit Builds > Pipeline to see your newly created pipeline running in OCP Land.

Running the pipeline from here will run it in Jenkins. You can see the job sync between OpenShift and Jenkins if you login to Jenkins; you should see a folder named <YOUR_NAME>-ci-cd with your pipeline jobs inside of it.
With the configuration in place for the todolist-fe, repeat the process for the todolist-api. Update the todolist-api/.openshift-applier/inventory/group_vars/all.yml with a new object to drive the params and template:
```yaml
# illustrative sketch - mirror the todolist-fe object you created previously
- name: "ocp-pipeline"
  template: "{{ playbook_dir }}/templates/ocp-pipeline.yml"
  params: "{{ playbook_dir }}/params/ocp-pipeline"
  tags:
  - pipeline
```
Update the todolist-api/.openshift-applier/params/ocp-pipeline:

```
PIPELINE_SOURCE_REPOSITORY_URL=https://gitlab.apps.<SOME_DOMAIN>.com/<GIT_USERNAME>/todolist-api.git
PIPELINE_SOURCE_REPOSITORY_REF=develop
NAME=todolist-api
```
Use the OpenShift Applier to create the cluster content:

```bash
cd todolist-api/.openshift-applier
ansible-playbook apply.yml -i inventory/ \
    -e "filter_tags=pipeline"
```

Login to your OpenShift Cluster and go to the <YOUR_NAME>-ci-cd namespace. On the side menu, hit Builds > Pipeline to see your newly created pipeline running in OCP Land.

Commit your changes to GitLab:

```bash
git add .
git commit -m "ADD - ocp pipeline in git repo"
git push
```
This exercise focuses on updating the enablement-ci-cd repo with some new jenkins-slave pods for use in future exercises.
OWASP ZAP (Zed Attack Proxy) is a free open source security tool used for finding security vulnerabilities in web applications.
On your terminal, move to the enablement-ci-cd repo. We need to checkout a template for OpenShift to build our Jenkins Slave images and some parameters for the zap slave.

```bash
git checkout exercise4/zap-and-arachni params/jenkins-slave-zap templates/jenkins-slave-generic-template.yml
```

This should have created the following files, which we will fill out. We will use a ZAP image hosted on the rht-labs/ci-cd repo so there will be no Dockerfile needed:

* params/jenkins-slave-zap
Create an object in inventory/host_vars/ci-cd-tooling.yml called jenkins-slave-zap and add the following content:
```yaml
# illustrative sketch - mirror the structure of the other build objects in ci-cd-tooling.yml
- name: "jenkins-slave-zap"
  namespace: "{{ ci_cd_namespace }}"
  template: "{{ playbook_dir }}/templates/jenkins-slave-generic-template.yml"
  params: "{{ playbook_dir }}/params/jenkins-slave-zap"
  tags:
  - zap
  - jenkins-slaves
```
Run the ansible playbook filtering with tag `zap` so only the zap build pods are run.

```bash
ansible-playbook apply.yml -e target=tools \
    -i inventory/ \
    -e "filter_tags=zap"
```

Head to https://console.lader.rht-labs.com on OpenShift and move to your ci-cd project > builds. You should see jenkins-slave-zap has been built.
Arachni is a feature-full, modular, high-performance Ruby framework aimed towards helping penetration testers and administrators evaluate the security of web applications.
On your terminal, checkout the params and Dockerfile. The Dockerfile for the Arachni scanner is included here and we will point the build to it.

```bash
git checkout exercise4/zap-and-arachni params/jenkins-slave-arachni docker/jenkins-slave-arachni
```

Create an object in inventory/host_vars/ci-cd-tooling.yml called jenkins-slave-arachni with the following content:
```yaml
# illustrative sketch - mirror the jenkins-slave-zap object, pointing at the Arachni params
- name: "jenkins-slave-arachni"
  namespace: "{{ ci_cd_namespace }}"
  template: "{{ playbook_dir }}/templates/jenkins-slave-generic-template.yml"
  params: "{{ playbook_dir }}/params/jenkins-slave-arachni"
  tags:
  - arachni
  - jenkins-slaves
```
Update the jenkins-slave-arachni params file's `SOURCE_REPOSITORY_URL` to point to your GitLab's hosted version of the enablement-ci-cd repo.

```
SOURCE_REPOSITORY_URL=https://gitlab.apps.lader.rht-labs.com/<GIT_USERNAME>/enablement-ci-cd.git
SOURCE_CONTEXT_DIR=docker/jenkins-slave-arachni
BUILDER_IMAGE_NAME=registry.access.redhat.com/openshift3/jenkins-slave-base-rhel7:latest
NAME=jenkins-slave-arachni
SOURCE_REPOSITORY_REF=master
```
With these changes in place, push your changes to the master branch.

```bash
git add .
git commit -m "ADD - Arachni scanning image"
git push
```

Run the Ansible playbook filtering with tag `arachni` so only the arachni build pods are run.

```bash
ansible-playbook apply.yml -e target=tools \
    -i inventory/ \
    -e "filter_tags=arachni"
```

Head to https://console.lader.rht-labs.com on OpenShift and move to your ci-cd project > builds. You should see jenkins-slave-arachni.
Ideas for go-getters. Advanced topic for doers to get on with if they finish early. These will usually not have a solution and are provided for additional scope.
Jenkins S2I
- Add the multi-branch configuration to the S2I to have Jenkins come alive with the todolist-api and todolist-fe configuration cooked into it for future uses.
Jenkins Pipeline Extension
- Add an extension to the pipeline that promotes code to the UAT environment once the master job has been successful.
- Use a WAIT to allow for manual input to approve the promotion
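One possible shape for that manual gate is the declarative `input` step (a sketch under assumptions, not the exercise's solution; the stage name, message and promotion command are made up):

```groovy
stage("promote-to-uat") {
    steps {
        // pause the pipeline until a human approves the promotion
        input message: "Promote this build to UAT?"
        sh "echo promoting to UAT"
    }
}
```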
Jenkins e2e extension (blue/green)
- Add a step in the pipeline to only deploy to the `test` environment if the e2e tests have run successfully against whichever environment (blue or green) is not deployed.