Security in the build pipeline

I’ve already blogged about OWASP Dependency Check and its alternative SourceClear in the past. But there is more you can do about security in a typical (Jenkins) build. Although I’m calling this post “Security in the build pipeline”, I’m not actually using the Jenkins pipeline as code feature. Security scans are independent of that, so it doesn’t matter whether you are using pipeline as code, a freestyle or a Maven job (or any other type you can think of). I’m focused on Jenkins, since that’s the build server I’m used to. However, most of the tools should be available for other build servers as well.

This post is the start of a little series about security in the build pipeline. One important part of that – verifying that there are no known vulnerabilities in your dependencies – has been published already. Upcoming parts will cover a vulnerability scan of the running web application and a scan for vulnerabilities in your actual code.

The “problem” with all these scans is that they will slow down your build. Every single one. That’s why I recommend not scanning in a build job that has been started because of a source code change (a push to your Git repo). You usually want fast feedback for those jobs, and even some extra minutes for a simple scan will be too long. I recommend a nightly build for those scans, one which provides feedback (reports) in the morning for the architect or security champion who triages those reports and assigns the issues to fix to the team. A weekly security build is too rare; a nightly build is a good compromise.
It’s always helpful to assign a single person to validate all reports and prepare them for your daily stand-up or create bug tickets. There will be false positives, there will be duplicates, and not every finding is important (right away). This person should – whenever supported by the tool or plug-in – create and maintain ignore lists (blacklists) to get rid of false positives as soon as possible. These tasks require a little more security knowledge than is usually available, a perfect job for a security champion!
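With OWASP Dependency Check, for example, such an ignore list is a suppression XML file. A minimal sketch of what one might look like (the notes text, GAV pattern and CVE number below are placeholders for illustration, not real findings):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<suppressions xmlns="https://jeremylong.github.io/DependencyCheck/dependency-suppression.1.1.xsd">
    <suppress>
        <!-- Always document why a finding is suppressed -->
        <notes>False positive: CVE applies to a different artifact</notes>
        <gav regex="true">org\.example:demo-library:.*</gav>
        <cve>CVE-2999-0001</cve>
    </suppress>
</suppressions>
```

Checking a file like this into version control gives the security champion a reviewable history of every suppressed finding.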

Always keep in mind that all those scans aim for the low-hanging fruit. They can’t replace secure development, and they don’t find every security vulnerability. And they don’t make pen tests and code reviews superfluous. Use them as an additional defense mechanism. Removing the low-hanging fruit before a pen test forces the pen tester to hunt for the more complicated vulnerabilities.

This was the first post for more security in your build, watch out for the next one to come. In the meantime – if you haven’t done so already – have a look at my post on scanning for known vulnerabilities in third party dependencies within Jenkins.

SRC:CLR revisited

I was approached by SourceClear a little while ago to have another look at their dependency vulnerability scanner after my first blog post on their service.

Since I already liked the tool before, I was quite interested to see what they have changed and was happy to do another test. The test setup partly changed: while I scanned the same repositories again (JavaSecurity and ApplicationIntrusionDetection), I switched to their Travis CI integration so that a scan runs automatically after every successful build. I scanned the latest version of the pom files, so the listed dependencies (and their versions) have changed since my first scan in May 2016.

On the first run in May 2016, one critical vulnerability in Apache Commons BeanUtils was reported. Now there are two more: one in Apache Commons FileUpload and one in Xalan. Plus another medium one in FileUpload.

For the ApplicationIntrusionDetection repository, the scan changed from zero vulnerabilities to one medium severity finding in OGNL.

All discovered vulnerabilities had been disclosed before my first scan in May, so they did a nice job updating their scanner and detection capabilities since then.

What I extremely like is how they display the dependency graph of a vulnerable dependency, enabling you to easily figure out its origin:

There are plenty more options to find out details about an identified vulnerability and whether or not it has been fixed (including the fixed version). There seem to be some cases where this is not working correctly (see below with Spring AOP or Spring Beans, where the reported latest version is older than the version I’m using), but I’m sure they will fix this small issue in a future update.

Wrap up
Use it! It is free and easy to use. The Travis CI integration works smoothly, and the setup is described in their docs section for this and other scenarios. The reports provide a lot of useful details to help you either update the dependency or live with the vulnerability. The only thing I’m missing right now is a badge to display the scan results in a GitHub repository readme…

Finding vulnerabilities in third party libraries

I’ve already blogged about OWASP Dependency Check as a Jenkins plug-in a little while ago. With SRC:CLR, there is now a web-based alternative.

To use it, simply register via GitHub login and follow the installation instructions for your operating system. After installation, execute a scan via the command line, passing your repository URL: srcclr scan --url <repository-url>

The results are uploaded to SRC:CLR and can be examined on their website. For my repo, the report looks like this:


As you can see, only one critical vulnerability is reported. Let’s compare this to OWASP Dependency Check for the same repository and the same version. This scan was executed via dependency-check --project JavaSecurity --scan ./**/*.jar on the command line, and the result (on my own machine) looks like this:


OWASP Dependency Check identified three critical vulnerabilities: one identical to the SRC:CLR finding (Apache Commons BeanUtils, CVE-2014-0114) and two more, JSTL (CVE-2015-0254) and Xalan (CVE-2014-0107). Quite a difference, and both dependencies are actually there (SRC:CLR identifies them, but lists them green with no vulnerability). At least both tools identified the critical one in Apache Commons BeanUtils.

Second try, this time with my ApplicationIntrusionDetection repository. SRC:CLR identified no vulnerable dependency, OWASP Dependency Check one (tomcat-embed-core-8.0.33.jar with four vulnerabilities); none of them matters for my application and/or Tomcat version.

Simply looking at the sheer numbers, it is an OWASP Dependency Check victory. Both tools reported the critical vulnerability, but SRC:CLR skipped the more controversial ones. OWASP Dependency Check has the tendency to report more (everything), including false positives or CVEs that may only affect your application, and therefore forces the developer to decide about their relevance.

What I extremely like about SRC:CLR is their way of presenting the analysis, especially the Dependency Path column, which shows you whether it is a direct dependency or a transitive one. But of course it is an online tool, uploading everything to the cloud – which in my case doesn’t matter because of my public GitHub repos. This might be different in an enterprise environment.

Which one to choose? I’ll stick to OWASP Dependency Check on my Jenkins: the reports are fine, false positives can be excluded, and it keeps everything local. Without a local Jenkins build server I strongly recommend SRC:CLR, directly integrated into your GitHub repo. Vulnerable third party dependencies are a great risk, so scanning every (web) application is an absolute must.

Using OWASP Dependency Check as Jenkins plugin

OWASP Dependency Check is a great tool to check your third party dependencies in Java (web) applications. Besides using it as a command line tool, Maven plugin or Ant task, you should integrate it into all your Jenkins build jobs.

One downside is that by default, every build job downloads and regularly updates its own National Vulnerability Database file. To improve that, I recommend creating an update-only job that runs daily: create a freestyle project and add Invoke OWASP Dependency-Check NVD update only as build step:

Invoke OWASP Dependency-Check NVD update only

Enter a data directory which will be used by every job. This job should have a build trigger to run it periodically (like @daily). Now save and run the job.

After that you need to activate OWASP Dependency Check on every build you like. To do that, open all job configurations and add Invoke OWASP Dependency-Check analysis as a post-build step:

Invoke OWASP Dependency-Check analysis

Click on the Advanced button and enter the data directory configured in the update job before. Remember to activate the Disable NVD auto-update checkbox, since all updates are done by the update job. Now add Publish OWASP Dependency-Check analysis results and configure the status thresholds as you like. Since developers tend to hate failing jobs, I recommend configuring warning thresholds only:

Publish OWASP Dependency-Check analysis results

Time to save and run your job. As with any (security) scan results: these lists tend to contain false positives, verify each finding!

This configuration ensures fast job execution and minimizes the required update time for the National Vulnerability Database. Besides that, it reduces the required disk space by using only one common database.
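The same shared-database approach works on the command line as well. A sketch of the two steps, assuming the dependency-check CLI is installed and the data directory path and project name are just examples:

```shell
# Nightly job: only refresh the shared NVD database
dependency-check --updateonly --data /var/jenkins/dependency-check-data

# Per-project scan: reuse the shared database, skip the auto-update
dependency-check --project JavaSecurity --scan ./**/*.jar \
  --noupdate --data /var/jenkins/dependency-check-data
```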

Free web application vulnerability scanner for Eclipse

Contrast released Contrast for Eclipse 1.0 a little while ago. The Eclipse plug-in works as a runtime security scanner and checks for security vulnerabilities in your web application while executing it in Eclipse. On Eclipse Marketplace, Contrast promises an “Automated detection of OWASP Top 10 vulnerabilities”.

This is the first free Eclipse tool available that explicitly scans for security vulnerabilities. Other tools like FindBugs or PMD may find some security problems as well, but are focused on bugs and bad practices.

Running the Contrast test is easy: Instead of running or debugging your web application you simply launch your configured web server with Contrast in the Servers view:

Contrast in Eclipse Server view

The scanner detects possible vulnerabilities while you are using the web application (a.k.a. at runtime) and points to the source code line causing the vulnerability, extended with helpful information about the vulnerability and additional links.

I’ve used some of my intentionally vulnerable web applications in the JavaSecurity and Java-Web-Security repositories as test environment. These are some of the results I received while using the XSS sample application:

Contrast view with findings

The findings are all correct, but the important one is missing: the XSS vulnerability. So while Contrast tells me that no anti-caching response headers are in place and that my forms use auto-completion (both warnings are absolutely correct), it has missed the successful XSS attack that resulted in the following dialog visible in my browser:

XSS popup in the browser


Next stop, CSRF: Same findings (cache and auto-complete), no CSRF warning.

Final stop, SQL Injection: Same findings (cache and auto-complete), no SQL Injection warning.

XSS, CSRF and SQL Injection are – in my eyes – still some of the nastier problems we are facing in web applications (among others). And they have been a part of the OWASP Top 10 forever.

A countercheck with FindBugs (with the security and malicious code vulnerability checks manually enabled) produced several warnings on reflected cross-site scripting and SQL Injection vulnerabilities.

So, use the Contrast plug-in or not? Well, use it from time to time, it still might discover some vulnerabilities in your web application. But don’t expect too much, and definitely extend it with regular FindBugs scans. There is still a long way to go for open source security scanners.

Using security response headers with WordPress

I’ve added several security headers to my blog today. The first part was easy: I’ve created a .htaccess file in my blog’s root directory with the following content:

Header set X-XSS-Protection "1; mode=block"
Header set X-Frame-Options DENY
Header set X-Content-Type-Options "nosniff"
Header set Strict-Transport-Security "max-age=31556926"
Header set Cache-Control "no-store, no-cache, must-revalidate"

Only one header was missing: Content Security Policy (CSP). The header itself was easy to add, but caused some problems at first:

Header set X-XSS-Protection "1; mode=block"
Header set X-Frame-Options DENY
Header set X-Content-Type-Options "nosniff"
Header set Strict-Transport-Security "max-age=31556926"
Header set Cache-Control "no-store, no-cache, must-revalidate"
Header set Content-Security-Policy "default-src 'self'; img-src 'self' http: https: *; frame-ancestors 'none'"

This works perfectly for the pages visitors can access, but totally breaks the admin pages. Way too many JavaScript files are consumed from other domains, and a lot of these pages contain unsafe inline JavaScript, which would force me to add ‘unsafe-inline’ to the policy – and with that value, not much security is left. Fortunately, I’m talking about the admin area, which is only accessible by myself. So instead of creating a new policy in the wp-admin folder that would more or less allow anything, I’ve decided to deactivate the policy completely in this area by creating a .htaccess file in this folder with the following content:

Header unset Content-Security-Policy

The Content Security Policy header is great, but this is a typical example of the problems with older (I don’t wanna say legacy in this case) applications or application parts you don’t have under control.

But anyway, all security relevant headers are returned in my blog now. Please report any problems you might discover.

Keep Your X-Frame-Options header a little while longer

So Mozilla has decided to deprecate the X-Frame-Options header, which is used to avoid clickjacking or UI redressing attacks (have a look in the page history, the first version used much stronger language). This header was never standardized (as the leading X indicates), but is supported in all browsers (yes, in ALL browsers, with the exception of the special ALLOW-FROM value). The X-Frame-Options replacement is Content Security Policy Level 2 (CSP) with its frame-ancestors directive. I’m all for reducing the huge amount of headers and using CSP instead, even though a CSP for a normal web application might already be extensive.

There is one huge problem though: browser support. While most modern browsers support CSP Level 1 (which has no frame-ancestors directive) – Internet Explorer supports at least a subset – only Firefox supports the frame-ancestors directive. Give it a try, I’ve updated my security-headers web application which you can clone from GitHub. This of course leads to the situation that we developers have to return both headers at the same time, X-Frame-Options and CSP with the frame-ancestors directive. The browser shouldn’t have any problem with that: older browsers ignore the CSP, newer browsers should ignore X-Frame-Options if frame-ancestors is present. Doesn’t sound like a lot of fun for both parties, does it? But that’s the way it is. We simply cannot afford to replace the X-Frame-Options header with CSP right now and leave most web users unprotected. Especially in corporate environments with older browsers (even older Firefox versions do not support CSP Level 2), this will be the situation for quite a while. In the end, CSP will win, and it’s a win we all will benefit from. Just be patient, there is no need to hurry removing old headers right away.
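In the same .htaccess style as above, returning both headers side by side could look like this (frame-ancestors 'none' is the CSP equivalent of DENY):

```
Header set X-Frame-Options DENY
Header set Content-Security-Policy "frame-ancestors 'none'"
```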

JSF stateless views and CSRF protection

JavaServer Faces (JSF) – especially since version 2.2 – provides good Cross-Site Request Forgery (CSRF) protection. To achieve this, every form automatically receives a random hidden token:

JSF non-transient view

Nothing more to do for the developer, JSF takes care of comparing the token’s value against the one stored in the server side session. Without the correct token, the request won’t be processed.

JSF 2.2 also introduced stateless views, simply by marking one as transient:

<f:view transient="true">...</f:view>

Using a transient view may make sense e.g. before a user has logged in and no session needs to be stored on the server, or to avoid that nasty ViewExpiredException. But keep in mind that this influences the out of the box CSRF protection as well. Looking at the same form from before, but with transient set to true, the anti-CSRF token changes:

JSF transient view

An anti-CSRF token needs to be unpredictable and absolutely random to be of any use for CSRF protection. The static value stateless of course is not.

Is this a bug? No! The most common implementation for CSRF protection is called the Synchronizer Token Pattern. This pattern stores a hidden value in forms (like javax.faces.ViewState) and the corresponding value in the server side session. Using transient="true" avoids creating a server side session. No session, no Synchronizer Token Pattern, no CSRF protection. So when using stateless views you’ll have to implement CSRF protection yourself. The requirement is the same: a random token calculated per user and per session (or per request, depending on your security needs). But without the server side session the implementation needs to change. Luckily, there is another option called the Double Submit Pattern (see here for a description of both patterns). Double submit, as the name implies, submits the same token from two different places: one in a cookie and the other one, as before, in each form you want to protect. Whereas the cookie is always submitted – including with faked requests – the form token is missing in a forged request, so your backend – which has to compare the two submitted tokens – can recognize an invalid request. So by avoiding session creation for every visitor (user), you’ll be giving away the out of the box JSF CSRF protection.
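As a rough illustration of the Double Submit Pattern, here is a minimal token helper in plain Java. The class and method names are my own, not part of JSF, and wiring the token into a cookie and a hidden form field is left out:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.SecureRandom;
import java.util.Base64;

public class DoubleSubmitToken {

    private static final SecureRandom RANDOM = new SecureRandom();

    // Create an unpredictable token, e.g. when rendering the form;
    // the same value goes into a cookie and a hidden form field
    public static String generateToken() {
        byte[] bytes = new byte[32];
        RANDOM.nextBytes(bytes);
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }

    // Compare cookie and form token in constant time; a forged request
    // carries the cookie but cannot supply the matching form field
    public static boolean isValid(String cookieToken, String formToken) {
        if (cookieToken == null || formToken == null) {
            return false;
        }
        return MessageDigest.isEqual(
                cookieToken.getBytes(StandardCharsets.UTF_8),
                formToken.getBytes(StandardCharsets.UTF_8));
    }
}
```

On every response without a token cookie you would set generateToken() as the cookie value and render the same value into a hidden form field; every state-changing request is then rejected unless isValid(cookieToken, formToken) returns true.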

One usage scenario for stateless views might be a login form: after logging in, you want a session for the user, but not for every anonymous visitor. You might ask yourself what harm a login form that is prone to CSRF could do. Well, it may enable an attacker to automatically log in a user with prepared credentials. This way, the attacker may record and retrace the user’s actions. Therefore you should protect the login form against CSRF as well.

That’s the tradeoff you have to be aware of: JSF provides out of the box CSRF protection, as long as your view isn’t transient. If that is the case, you’ll have to take care of the CSRF protection yourself.