Nowadays, containers are taking over the world. We still have big systems, legacy systems and, obviously, not every company out there can move to containerized solutions quickly but, wherever you look, people are talking about containers.
And, if you look in the opposite direction, people are talking about security: breaches, vulnerabilities, systems not properly patched, all kinds of problems that put enterprise security and user data at risk.
With all of this, and it is not something new, the number of projects involving both topics keeps growing. The ecosystem is huge, and the number of options is starting to be overwhelming.
We have projects like:
- Docker Bench for Security: The Docker Bench for Security is a script that checks for dozens of common best-practices around deploying Docker containers in production. The tests are all automated and are inspired by the CIS Docker Benchmark v1.2.0.
- Clair: Clair is an open-source project for the static analysis of vulnerabilities in application containers (currently including appc and docker).
- Cilium: Cilium is open source software for providing and transparently securing network connectivity and load-balancing between application workloads such as application containers or processes. Cilium operates at Layer 3/4 to provide traditional networking and security services as well as Layer 7 to protect and secure use of modern application protocols such as HTTP, gRPC and Kafka. Cilium is integrated into common orchestration frameworks such as Kubernetes and Mesos.
- Anchore Engine: The Anchore Engine is an open-source project that provides a centralized service for inspection, analysis and certification of container images. The Anchore Engine is provided as a Docker container image that can be run standalone or within an orchestration platform such as Kubernetes, Docker Swarm, Rancher, Amazon ECS, and other container orchestration platforms.
- OpenSCAP: The OpenSCAP ecosystem provides multiple tools to assist administrators and auditors with assessment, measurement, and enforcement of security baselines. We maintain great flexibility and interoperability, reducing the costs of performing security audits.
- Dagda: Dagda is a tool to perform static analysis of known vulnerabilities, trojans, viruses, malware & other malicious threats in docker images/containers and to monitor the docker daemon and running docker containers for detecting anomalous activities.
- Notary: The Notary project comprises a server and a client for running and interacting with trusted collections. See the service architecture documentation for more information.
- Grafeas: An open artifact metadata API to audit and govern your software supply chain.
- Sysdig Falco: Falco is a behavioural activity monitor designed to detect anomalous activity in your applications. Powered by sysdig’s system call capture infrastructure, Falco lets you continuously monitor and detect container, application, host, and network activity – all in one place – from one source of data, with one set of rules.
- Banyan Collector: Banyan Collector is a light-weight, easy to use, and modular system that allows you to launch containers from a registry, run arbitrary scripts inside them, and gather useful information.
As we can see, there are multiple tools within the container security scope. These are just some examples.
In this article, we are going to explore Anchore Engine a bit more. We are going to create a basic Jenkins pipeline to scan one container. For this, we are going to need:
- A repository on GitHub with a simple dockerized project. In my case, I will be using this one. It’s a simple Spring Boot app with a hello endpoint and a very simple ‘Dockerfile’ (there is a sketch of one right after this list).
- A Docker Hub repository to store our image. I will be using this one.
- Docker and docker-compose.
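As a reference, the kind of ‘Dockerfile’ I am talking about can be as small as the following sketch (the actual file in the repository may differ; the base image and jar path here are just placeholders):
FROM openjdk:11-jre-slim
COPY target/app.jar /app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "/app.jar"]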
And, that’s all. Let’s go.
We can see in the next image the pipeline we are going to implement:

Install Anchore Engine
We just need to execute a few commands to have Anchore Engine up and running.
mkdir -p ~/aevolume/config
mkdir -p ~/aevolume/db/
cd ~/aevolume/config && curl -O https://raw.githubusercontent.com/anchore/anchore-engine/master/scripts/docker-compose/config.yaml && cd -
cd ~/aevolume
curl -O https://raw.githubusercontent.com/anchore/anchore-engine/master/scripts/docker-compose/docker-compose.yaml
After that, we should see a folder ‘aevolume’ with content similar to:

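If you prefer to verify it from the terminal, a recursive listing should show the ‘config.yaml’ we downloaded, the (still empty) ‘db’ directory and the ‘docker-compose.yaml’ file:
ls -R ~/aevolume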
Running Anchore Engine
As we can see, the previous step has provided us with a docker-compose file that makes it easy to run Anchore Engine. We just need to execute the command:
docker-compose up -d
When docker-compose finishes, we should be able to see the two Anchore Engine containers running: one for the application itself and one for the database.
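We can also confirm it from the terminal; both services should show an ‘Up’ state (run the command from the ‘~/aevolume’ folder where the docker-compose.yaml lives):
cd ~/aevolume
docker-compose ps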

Install the Anchore CLI
It is not strictly necessary but, it is going to be very useful to debug integration problems if we have any (I had a few the first time). For this, we just need to execute a simple command that will make the ‘anchore-cli’ executable available on our system.
pip install anchorecli
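A quick way to check that the CLI can talk to the engine is to ask for its status. This is just a sketch assuming the default ‘admin’/‘foobar’ credentials that come with the sample config.yaml and the engine listening on localhost; adjust the values to whatever your installation uses:
anchore-cli --u admin --p foobar --url http://localhost:8228/v1 system status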
Install the Jenkins plugin
Now, we start working on the integration with Jenkins. The first step is to install the Anchore integration in Jenkins. We just need to go to the Jenkins plugin management area and install the one called ‘Anchore Container Image Scanner Plugin’.

Configure Anchore in Jenkins
There is one more step we need to take to configure the Anchore plugin in Jenkins: we need to provide the engine URL and the access credentials. These credentials can be found in the file ‘~/aevolume/config/config.yaml’.
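If it is not obvious where to look, a quick grep over the file should point us to the right lines (the exact key names can change between engine versions). The engine URL to configure in the plugin is, in a default installation like ours, http://localhost:8228/v1:
grep -i -n 'password' ~/aevolume/config/config.yaml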

Configure Docker Hub repository
The last configuration we need to do is to add the access credentials for our Docker Hub repository. I recommend generating an access token here instead of using our real credentials. Once we have them, we just need to add them to Jenkins.

Create a Jenkins pipeline
To be able to run our builds and to analyze our containers, we need to create a Jenkins pipeline. We are going to use the script feature for this. The script will look like this:
pipeline {
    environment {
        registry = "fjavierm/anchore_demo"
        registryCredential = 'DOCKER_HUB'
        dockerImage = ''
    }
    agent any
    stages {
        stage('Cloning Git') {
            steps {
                git 'https://github.com/fjavierm/demo.git'
            }
        }
        stage('Building image') {
            steps {
                script {
                    dockerImage = docker.build registry + ":$BUILD_NUMBER"
                }
            }
        }
        stage('Container Security Scan') {
            steps {
                // Write the list of images (and their Dockerfile) that Anchore should analyse
                sh 'echo "docker.io/fjavierm/anchore_demo:latest `pwd`/Dockerfile" > anchore_images'
                anchore name: 'anchore_images'
            }
        }
        stage('Deploy Image') {
            steps {
                script {
                    docker.withRegistry('', registryCredential) {
                        dockerImage.push()
                    }
                }
            }
        }
        stage('Cleanup') {
            steps {
                // Remove the local copies of the images listed in anchore_images
                sh '''
                    for i in `cat anchore_images | awk '{print $1}'`; do docker rmi $i; done
                '''
            }
        }
    }
}
This will create a pipeline like:

Execute the build
Now, we just need to execute the build and see the results:


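Besides the report Jenkins shows, once the build has gone through we can also query the analysis directly from the engine with the CLI we installed earlier. Again, this is a sketch assuming the default credentials and the image name used in the pipeline:
anchore-cli --u admin --p foobar --url http://localhost:8228/v1 image vuln docker.io/fjavierm/anchore_demo:latest all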
Conclusion
With this, we finish the demo. We have installed Anchore Engine, integrated it with Jenkins, run a build and checked the analysis results.
I hope it is useful.