
Jenkins Pipeline setup for Backup, Version Compare and Validation (cpilint) of SAP Integration Suite Artifacts

source link: https://blogs.sap.com/2023/04/20/jenkins-pipeline-setup-for-backup-version-compare-and-validation-cpilint-of-sap-integration-suite-artifacts/

Introduction

This article explains, step by step, how to create a Jenkins pipeline that backs up source code, compares versions, and performs basic validation/checks (using the cpilint tool) for integration flows created in SAP Integration Suite. Most of the content is based on the two GitHub repositories below.

CICD-StoreIntegrationArtefact – Sunny Kapoor and the SAP team

cpilint – Morten Wittrock

Prerequisites:

  1. Docker should be installed (Docker Desktop as well, for a GUI).
  • Jenkins Installation (Using Docker)

Jenkins can be installed with a single command from any terminal or cmd.

docker run -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts-jdk11


Jenkins Docker Image Run

After installation, copy the temporary password from the terminal output, go to localhost:8080 in the browser, provide the password and click Next. Then create an admin user and press Next. On the next page choose the appropriate URL and click Save and Finish. Do not install all the plugins; only proceed with “Select plugins to install“.
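
If the temporary password has already scrolled out of the terminal, it can be read back from the running container. A small sketch, assuming the container was started with the command above (the container ID is a placeholder you need to fill in):

# Find the ID of the running Jenkins container
docker ps --filter "ancestor=jenkins/jenkins:lts-jdk11" --format "{{.ID}}"

# Print the one-time admin password (it is also visible in `docker logs <container-id>`)
docker exec <container-id> cat /var/jenkins_home/secrets/initialAdminPassword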

  • GitHub Repo Creation

  • Create a new private repository with an appropriate name in GitHub.
  • Create a “Jenkinsfile” with the code below.
pipeline {
	agent any
	
	parameters {
		string defaultValue: 'internalEventListener', description: 'Iflow Name', name: 'Name', trim: true
	}

	//Configure the following environment variables before executing the Jenkins Job	
	environment {
		IntegrationFlowID = "${Name}"
		CPIHost = "${env.CPI_HOST}"
		CPIOAuthHost = "${env.CPI_OAUTH_HOST}"
		CPIOAuthCredentials = "${env.CPI_OAUTH_CRED}"	
		GITRepositoryURL  = "${env.GIT_REPOSITORY_URL}"
		GITCredentials = "${env.GIT_CRED}"
		GITBranch = "${env.GIT_BRANCH_NAME}"
		GITFolder = "IntegrationContent/IntegrationArtefacts"
		GITComment = "Integration Artefacts update from CICD pipeline"
   	}
	
	stages {
		stage('download integration artefact and store it in GitHub') {
			steps {
			 	deleteDir()
				script {
					//clone repo 
					checkout([
						$class: 'GitSCM',
						branches: [[name: env.GITBranch]],
						doGenerateSubmoduleConfigurations: false,
						extensions: [
							[$class: 'RelativeTargetDirectory',relativeTargetDir: "."],
							//[$class: 'SparseCheckoutPaths',  sparseCheckoutPaths:[[$class:'SparseCheckoutPath', path: env.GITFolder]]]
						],
						submoduleCfg: [],
						userRemoteConfigs: [[
							credentialsId: env.GITCredentials,
							url: 'https://' + env.GITRepositoryURL
						]]
					])
					
					//get token
					println("Request token");
					def token;
					try{
					def getTokenResp = httpRequest acceptType: 'APPLICATION_JSON', 
						authentication: env.CPIOAuthCredentials, 
						contentType: 'APPLICATION_JSON', 
						httpMode: 'POST', 
						responseHandle: 'LEAVE_OPEN', 
						timeout: 30, 
						url: 'https://' + env.CPIOAuthHost + '/oauth/token?grant_type=client_credentials';
					def jsonObjToken = readJSON text: getTokenResp.content
					token = "Bearer " + jsonObjToken.access_token
				   	} catch (Exception e) {
						error("Requesting the oauth token for Cloud Integration failed:\n${e}")
					}
					//delete the old flow content so that only the latest content gets stored
					dir(env.GITFolder + '/' + env.IntegrationFlowID){
						deleteDir();
					}
					//download and extract artefact from tenant
					println("Downloading artefact");
					def tempfile = UUID.randomUUID().toString() + ".zip";
					def cpiDownloadResponse = httpRequest acceptType: 'APPLICATION_ZIP', 
						customHeaders: [[maskValue: false, name: 'Authorization', value: token]], 
						ignoreSslErrors: false, 
						responseHandle: 'LEAVE_OPEN', 
						validResponseCodes: '100:399, 404',
						timeout: 30,  
						outputFile: tempfile,
						url: 'https://' + env.CPIHost + '/api/v1/IntegrationDesigntimeArtifacts(Id=\''+ env.IntegrationFlowID + '\',Version=\'active\')/$value';
					if (cpiDownloadResponse.status == 404){
						//invalid Flow ID
						error("Received http status code 404. Please check if the Artefact ID that you have provided exists on the tenant.");
					}
					def disposition = cpiDownloadResponse.headers.toString();
					def index=disposition.indexOf('filename')+9;
					def lastindex=disposition.indexOf('.zip', index);
					def filename=disposition.substring(index + 1, lastindex + 4);
					def folder=env.GITFolder + '/' + filename.substring(0, filename.indexOf('.zip'));
					def zipfolder=env.GITFolder + '/ZipFiles';
					fileOperations([fileUnZipOperation(filePath: tempfile, targetLocation: folder)])
					fileOperations([fileRenameOperation(source: tempfile,  destination: filename)])
					fileOperations([fileCopyOperation(includes: filename,  targetLocation: zipfolder)])
					env.Filename = filename;
					cpiDownloadResponse.close();

					//remove the zip
					fileOperations([fileDeleteOperation(excludes: '', includes: filename)])
						
					dir(env.GITFolder){
						sh 'git add .'
					}
					println("Store integration artefact in Git")
					withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: env.GITCredentials ,usernameVariable: 'GIT_AUTHOR_NAME', passwordVariable: 'GIT_PASSWORD']]) {  
						sh 'git diff-index --quiet HEAD || git commit -am ' + '\'' + env.GITComment + '\''
						sh('git push https://${GIT_PASSWORD}@' + env.GITRepositoryURL + ' HEAD:' + env.GITBranch)
					}				
				}
			}
		}
		stage('Code Analysis') {
			steps {
				script {
					def zipcpilintfile = "cpilint-1.0.4.zip";
					def unzipcpilintfile = "cpilint";
					//unpack the cpilint distribution stored in the repo and remove the zip
					fileOperations([fileUnZipOperation(filePath: zipcpilintfile, targetLocation: unzipcpilintfile)])
					fileOperations([fileDeleteOperation(excludes: '', includes: zipcpilintfile)])
					sh "chmod -R a+rwx $WORKSPACE/cpilint"
					//run cpilint against the downloaded artefact zip using the rules file from the repo
					sh "$WORKSPACE/cpilint/cpilint-1.0.4/bin/cpilint -rules $WORKSPACE/rules.xml -files $WORKSPACE/IntegrationContent/IntegrationArtefacts/ZipFiles/${env.Filename}"
				}
			}
		}
	}
}
Jenkinsfile

  • Create a “rules.xml” file as per the cpilint rules schema. I have taken one sample XML file; you can modify it as per your requirements.
<?xml version="1.0" encoding="UTF-8"?>
<cpilint>
    <rules>
        <!-- Require that all iflows have a description. -->
        <iflow-description-required/>

        <!-- Don't allow the social media receiver adapters. -->
        <disallowed-receiver-adapters>
            <disallow>facebook</disallow>
            <disallow>twitter</disallow>
        </disallowed-receiver-adapters>

        <!-- Don't allow Router steps configured with both XML and non-XML conditions. -->
        <multi-condition-type-routers-not-allowed/>
        
        <!-- Message Mapping and XSLT are the two allowed mapping types. -->
        <allowed-mapping-types>
            <allow>message-mapping</allow>
            <allow>xslt-mapping</allow>
        </allowed-mapping-types>

        <!-- Make sure that all data store writes are encrypted. -->
        <unencrypted-data-store-write-not-allowed/>

    </rules>
</cpilint>
  • Then download the cpilint zip file and upload it into the GitHub repo. Now your repo should look like below. (A local test run of cpilint is sketched after the screenshot.)

GitHub Repository Files
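
If you want to sanity-check your rules.xml before committing it, cpilint can also be run locally against a previously exported iFlow zip, using the same -rules and -files options the pipeline uses later. A minimal sketch, assuming a local Java installation; MyFlow.zip is a placeholder for any exported iFlow:

# Unpack the cpilint distribution and make the launcher executable
unzip cpilint-1.0.4.zip -d cpilint
chmod -R a+rx cpilint

# Validate the iFlow zip against the rules file
./cpilint/cpilint-1.0.4/bin/cpilint -rules rules.xml -files MyFlow.zip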

  • Create a PAT (Personal Access Token) in your GitHub account and save the token.
  • Copy the GitHub repo HTTPS URL.

HTTP URL

  • SAP Process Integration API Plan Instance – Service key

Log in to the BTP cockpit and save the service key. The OAuth values in this key can be verified with the curl call sketched after the screenshot.


Service Key
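
The clientid, clientsecret and tokenurl values from this service key are what the pipeline uses to fetch an OAuth token. You can test them up front with a plain curl call; a sketch with placeholder values, mirroring the token request in the Jenkinsfile:

# Values from the service key (placeholders shown here)
CPI_OAUTH_HOST='xxxxxxxxxxx.authentication.ap10.hana.ondemand.com'   # tokenurl host, without https://
CLIENT_ID='<clientid from service key>'
CLIENT_SECRET='<clientsecret from service key>'

# Same client-credentials request the Jenkinsfile sends; the response contains access_token
curl -s -u "$CLIENT_ID:$CLIENT_SECRET" -X POST "https://$CPI_OAUTH_HOST/oauth/token?grant_type=client_credentials"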

  • Jenkins Basic Configuration

First, create credentials for CPI and GitHub on the Manage Jenkins -> Credentials page.

  • Use the CPI service key ClientId and ClientSecret as username and password, and give the ID “CPIOAuthCredentials” as below.

CPI Cred

  • Create GitHub credentials using the same method. Use your GitHub user and PAT token as username and password. Give the ID as “GIT_Credentials”.

GitHub Cred

  • Then go to Manage Jenkins -> Configure System and configure the environment variables below. (A quick manual test of these values is sketched after the screenshot.)
Name                 Value
CPI_HOST             {{url value from Service Key without https://}} e.g. xxxxxxxxxxx.it-cpi002.cfapps.ap10.hana.ondemand.com
CPI_OAUTH_CRED       CPIOAuthCredentials
CPI_OAUTH_HOST       {{tokenurl value from Service Key without https://}} e.g. xxxxxxxxxxx.authentication.ap10.hana.ondemand.com
GIT_BRANCH_NAME      Main
GIT_CRED             GIT_Credentials
GIT_REPOSITORY_URL   github.com/Asutosh-Integration/IFlowRepo.git

Environment Variable
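
To check that the CPI_HOST value and the OAuth client work end to end, you can manually download one artifact with the same OData call the pipeline uses. A sketch with placeholder values; ACCESS_TOKEN is the access_token from the token call shown earlier:

CPI_HOST='xxxxxxxxxxx.it-cpi002.cfapps.ap10.hana.ondemand.com'
ACCESS_TOKEN='<access_token from the token response>'
IFLOW_ID='<your iFlow ID>'

# Download the active design-time version of the iFlow as a zip
curl -s -H "Authorization: Bearer $ACCESS_TOKEN" -o "$IFLOW_ID.zip" \
  "https://$CPI_HOST/api/v1/IntegrationDesigntimeArtifacts(Id='$IFLOW_ID',Version='active')/\$value"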

  • Run the two commands below in the container terminal (from the Docker Desktop app, inside the Jenkins container). Put in your own name and email.
git config --global user.name "Your Name"
git config --global user.email "you@example.com"
  • Then install the three plugins below. From your Jenkins dashboard navigate to Manage Jenkins > Manage Plugins, select the Available tab, locate each plugin by searching for the terms below and install them without restart. (A container-based alternative is sketched after this list.)
    1. http_request
    2. pipeline-utility-steps
    3. file-operations
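
As an alternative to the UI, the same three plugins can be installed from inside the container with the jenkins-plugin-cli tool shipped in the official image; a sketch, assuming the container ID found earlier, with a restart afterwards so the plugins are loaded:

# Download the plugins straight into the Jenkins home plugin directory
docker exec <container-id> jenkins-plugin-cli -d /var/jenkins_home/plugins --plugins http_request pipeline-utility-steps file-operations

# Restart Jenkins so the new plugins are picked up
docker restart <container-id>
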
  • Jenkins Pipeline configuration

  • Click the New Item button on the dashboard, choose Multibranch Pipeline, give it an appropriate name and hit “OK”.

Multibranch Pipeline Configuration

  • Then scroll down to Branch Sources, select the GitHub credential and enter your GitHub repo URL in Repository HTTPS URL as per the config below. Then hit Save.

Branch Source

  • Build and Test

  • Then click the Build button on the main branch and provide the iFlow ID in the Name parameter, as in the screenshot below.

Build with Parameter

  • The build will then complete and show the status of each stage.

Build status

  • You can now go to your GitHub repo and see that the iFlow files are stored. Try changing the iFlow, saving it as a new version, and then building the pipeline again for the same iFlow as in step 1.

Change in Iflow and Save as a version

  • Then, after the build completes, you will be able to see the comparison between both versions in GitHub (a local git-based comparison is sketched below the screenshot).
Diff
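
Since every pipeline run commits the latest artifact content, the comparison is also possible locally with plain git after cloning the repo; <IFlowID> is a placeholder for the artifact folder name:

# List the commits that touched a particular iFlow
git log --oneline -- "IntegrationContent/IntegrationArtefacts/<IFlowID>"

# Compare the two most recent versions of that iFlow
git diff HEAD~1 HEAD -- "IntegrationContent/IntegrationArtefacts/<IFlowID>"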
  • You can also check the basic validation logs generated by the cpilint application in the Jenkins build logs. This gives you full flexibility to configure cpilint and automate validations via Jenkins + cpilint. Thanks, Morten, for this cool tool.

cpilint logs

Conclusion

This setup has neither been tested in any productive scenario, nor has any thorough risk analysis been done, so I would request you to do a proper analysis before implementing it. There are a lot of improvements required to have a fully automated pipeline. We could add more stages for automatic deployment to another tenant and for testing with automated test scripts.
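
As a possible starting point for such a deployment stage, SAP Cloud Integration exposes a deploy operation in the same OData API the pipeline already calls. A minimal sketch using curl, assuming the OAuth client has the roles required for deployment; this is not part of the pipeline above, and depending on tenant configuration additional CSRF token handling may be needed:

CPI_HOST='xxxxxxxxxxx.it-cpi002.cfapps.ap10.hana.ondemand.com'
ACCESS_TOKEN='<access_token from the token response>'
IFLOW_ID='<your iFlow ID>'

# Deploy the active design-time version of the iFlow to the runtime
curl -s -X POST -H "Authorization: Bearer $ACCESS_TOKEN" \
  "https://$CPI_HOST/api/v1/DeployIntegrationDesigntimeArtifact?Id='$IFLOW_ID'&Version='active'"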

