
Jenkins Build Monitoring with the ELK Stack

Jenkins is an extremely popular, open-source, continuous integration tool used for running tests, building code, and subsequently pushing to staging and then production.

In a previous post, I outlined instructions for collecting, analyzing, and visualizing Jenkins system logs. System logs are useful for monitoring the general health of a Jenkins setup, especially in a multi-node environment, and I highly recommend exploring that option as outlined in the above-mentioned article. In this piece, however, I will focus on Jenkins build logs.

Jenkins build logs contain a complete record of an execution’s output, including the build name, number, execution time, result, and more. If your pipeline is broken, this data provides a wealth of information for troubleshooting the root cause. Jenkins does expose console logs in its UI, but with a large number of running jobs it becomes difficult to keep track of all the activity, so collecting this data and shipping it into the ELK Stack gives you much better visibility.

Installing Jenkins

As a first step, and for those just getting started, let’s review how to install and set up a single Jenkins server. If you already have Jenkins up and running, skip right to the next step.

Jenkins can be installed in a variety of different ways, depending on your operating system and environment. In this case, I’ll be installing Jenkins using Ubuntu packages. I also recommend checking out system requirements before beginning the process.

Start by adding the repository key:

wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -

Next, add the package repository address to your ‘sources.list’:

echo deb https://pkg.jenkins.io/debian-stable binary/ | sudo tee /etc/apt/sources.list.d/jenkins.list

Run update so you can use the new repository:

sudo apt-get update

To install Jenkins and its dependencies (including Java 8, which is also required for running Elasticsearch), use:

sudo apt-get install jenkins

Start the Jenkins server using:

sudo systemctl start jenkins
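
Before moving on, you can verify that the service came up properly:

sudo systemctl status jenkins   # should report the service as active (running)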

To access Jenkins, open your browser and enter the following URL:

http://<yourServerIP>:8080

You will then be required to enter an initial admin password, which can be found in the default installation directory. On Linux:

cat /var/lib/jenkins/secrets/initialAdminPassword

Follow the rest of the setup steps (installing plugins and creating a new admin user), and you should be all set and ready to go.


Integrating with the ELK Stack

The integration with the ELK Stack, whether your own deployment or Logz.io (as demonstrated below), is done using a fork of a Jenkins plugin called logstash-plugin. The next steps describe how to download, build, and install this plugin.

First, clone the plugin:

git clone https://github.com/idohalevi/logstash-plugin

Next, use Maven to build it:

cd logstash-plugin
mvn package

The build process takes a while, so be patient. Tip: if you don’t have Maven installed, you can use the following Docker command to run the build:

sudo docker run -it --rm --name logstash-plugin -v "$(pwd)":/usr/src/mymaven -w /usr/src/mymaven maven:3.3-jdk-8 mvn package

The end result of this process is a logstash.hpi file located within the plugin directory at: logstash-plugin/target
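
Before heading to the Jenkins UI, it’s worth confirming the artifact was actually produced:

ls target/*.hpi   # run from inside the logstash-plugin directory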

Open Jenkins, and go to the Advanced tab on the Manage Jenkins → Manage Plugins page.


Upload the logstash.hpi file in the Upload Plugin section. Jenkins will display a success message when done.


Select the Restart Jenkins checkbox to apply the changes.

Once Jenkins has finished restarting, open the Manage Jenkins → Configure System page.

A section called Logstash appears in the middle of the page. Select the Enable sending logs to an indexer checkbox to open the configurations.


If you’re shipping to your own ELK deployment, enter the IP of your Elasticsearch instance and authentication details if necessary. To ship to Logz.io, open the Indexer type drop-down, and select Logz.io.

Enter the following details:

  • Logz.io host – enter the URL of the Logz.io listener. If you are in the EU region, use https://listener-eu.logz.io:8071. Otherwise, use https://listener.logz.io:8071. You can tell which region you are in by checking your login URL: if it says app.logz.io you are in the US, and if it says app-eu.logz.io you are in the EU.
  • Logz.io key – enter your Logz.io token, which can be found in your account settings in the Logz.io app.


Click Save to apply the configurations.

Verifying the pipeline

Now that we have the plugin installed and configured, it’s time to test that the integration with Logz.io is working and that build logs are actually indexed properly.

To test the pipeline, I will create a simple item in Jenkins that executes a bash script.
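
The script itself doesn’t need to do anything meaningful. Something minimal along these lines (purely illustrative) is enough to produce a few console lines and a SUCCESS or FAILURE result:

#!/bin/bash
# Illustrative build script for the test job
echo "Starting test build on $(hostname)"
echo "Running some work..."
sleep 2
# Exit with a non-zero code to simulate a failed build (0 = SUCCESS)
exit 0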

There are two options for sending the build logs to Logz.io: line by line as they are generated during the build, or in bulk once the build completes.

To send the logs to Logz.io line by line, simply select the Send console log to Logstash checkbox in the item’s General section. In this case, however, I’m going to send the logs in bulk post-execution.  

In the Post Build Actions section at the end of the configuration page, open the Add post-build action drop-down menu, and select Send console log to Logstash. You can then configure how many lines to send to Logz.io. To send all the data, enter ‘-1’.


That’s it. Save the configuration, run your build and within a few seconds you should be seeing build console logs in Logz.io.
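
If nothing shows up after a minute or two, a common culprit is outbound connectivity from the Jenkins machine to the Logz.io listener. Assuming netcat is installed, a quick check looks like this:

nc -vz listener.logz.io 8071   # use listener-eu.logz.io for EU accounts; success means port 8071 is reachable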


Analyzing Jenkins build logs in Kibana

Now that the logging pipeline is up and running, it’s time to look into the data with some simple analysis operations in Kibana.

I like adding some fields to the main display area in Kibana to get some visibility into the data. Adding, for example, the ‘buildNum’, ‘projectName’ and ‘result’ fields helps give us some context.
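
For reference, an individual build document looks something like the following. This is a simplified, hypothetical example; the exact fields and nesting depend on the plugin version, so check a real document in Kibana’s Discover view to see what is actually shipped:

{
  "projectName": "my-test-job",
  "buildNum": 17,
  "result": "FAILURE",
  "message": ["Starting test build on jenkins-master", "Running some work...", "..."]
}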


We can use a field-level Kibana query to look for specific builds, say failed builds:

result:FAILURE
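
Queries can be combined with other fields using standard Lucene syntax, for example to narrow failures down to a specific job (the job name here is just a placeholder):

result:FAILURE AND projectName:"my-test-job"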


Things get more interesting when visualizing the data. Using Kibana’s different visualization types, you can create a series of simple charts and metrics that give you a nice overview of your Jenkins builds.

Let’s start with a simple metric visualization showing the number of failed vs. successful builds.


In another example, this time using a line chart visualization, we can monitor the average execution time for builds, per project.


You can slice and dice the data in any way you want, and once you’ve got your visualizations lined up, add them into one comprehensive dashboard.


By the way, this dashboard is available in ELK Apps, Logz.io’s library of dashboards and visualizations, so if you’re shipping your Jenkins build logs to Logz.io you can hit the ground running by installing it instead of building your own from scratch.

Endnotes

If your Jenkins build pipelines are busy, visibility becomes an issue. In a microservices environment, with multiple Jenkins jobs running continuously, monitoring and troubleshooting failed builds is a challenge.

The benefit of using a centralized logging system is the ability not only to collect and store the data in a single location, but also to use best-in-class analysis and visualization tools to drill down into the root cause of failed builds. The Logstash plugin used here is an easy way to integrate a Jenkins deployment with the ELK Stack and enjoy these benefits.

