Table of Contents
I. goc-test-coverage
1. Parameterized build process
2. Writing the pipeline script (Pipeline script)
II. goc-get-coverage
1. Parameterized build process
2. Adding the shell command block
III. pid-goc-coverage
1. Parameterized build process
2. Pipeline script
3. Issues encountered in practice
Optimization 1: run the job module tests once and collect coverage for both the job and standard-compute services
Optimization 2: automatically seed the database with initial data
Optimization 3: merge multiple HTML files
Optimization 4: upload files to the designated directory first
Optimization 5: display the coverage fields intuitively in the bot message
IV. pid-docker-build
1. Pipeline script
2. Writing the makefile
The previous article showed how to start the services as a docker-compose cluster, with goc-server and each service communicating on the same network, a Jenkins pipeline running the test cases, and a Jenkins shell script collecting code coverage.
This article keeps optimizing that approach: is there a simpler way to obtain the test results and the code coverage together, ideally merged into a single pipeline?
I. goc-test-coverage
The original pipeline 1: used to obtain the test script execution results (pipeline job)
1. Parameterized build process
- Choice parameter: MODULE_NAME
- Choice parameter: PRIORITY
The test scripts are written with the suite package pattern, which keeps the data consistent before and after each run and decouples test data from the cases.
- Choice parameter: ENV
- String parameter: RECIPIENT_LIST
2. Writing the pipeline script (Pipeline script)
Start by defining a few variables:
#!groovy
pipeline {
agent {label 'develop'}
environment {
GO_BINARY = "go"
TEST_REPORT_PATH = "test-report.xml"
RECIPIENTS = "xxx.com" // email address that receives the test report
}
stages {
stage('checkout') {
steps {
sh """
export PATH="${arcPath}:${goRoot}:${kubectlRoot}:${makeRoot}:$PATH"
git clone --depth 1 ${repoURL} ${repoPath}
"""
echo "${JOB_NAME}"
}
}
stage('unit-test') {
steps {
echo 'yes'
}
}
stage('api-test') {
steps {
sh """
export CGO_ENABLED=1;GO111MODULE=on;GOINSECURE=git.xxx;GOPRIVATE=git.xxx;GOPROXY=https://goproxy.cn,direct
export PATH="${arcPath}:${goRoot}:${kubectlRoot}:${makeRoot}:$PATH"
go env -w GOINSECURE=git.xxx
go env -w GOPRIVATE=git.xxx
mkdir test-coverage
echo $ENV
cd ${repoPath}
echo 'cp test .env'
cp /home/jenkins/compose/conf/.env ${repoPath}/test
cp /home/jenkins/compose/conf/.env.storage ${repoPath}/test/storage/v1/.env
export PATH="/home/jenkins/go/bin:$PATH"
go mod tidy
if [ $MODULE_NAME = "merchandise" ] || [ $MODULE_NAME = "storage" ]
then
GOMAXPROCS=1 go test -v -timeout 30m ./test/$MODULE_NAME/... -run="^Test"+"/Test"+$PRIORITY -json | go-test-report
else
go test -v -timeout 30m ./test/$MODULE_NAME/... -run="^Test"+"/Test"+$PRIORITY -json | go-test-report
fi
mv test_report.html ${repoPath}/test-coverage
"""
echo 'Api-test pass.'
}
}
stage('Report') {
steps {
echo "report"
publishHTML (target: [
allowMissing: false,
alwaysLinkToLastBuild: false,
keepAll: true,
reportDir: "${BUILD_ID}/test-coverage",
reportFiles: 'test_report.html',
reportName: "HTML Report"
])
}
}
}
post {
always {
sh """
cd ${repoPath}
# some post-build cleanup steps
ls ${repoPath}/test-coverage
"""
echo "clean over..."
emailext body: genReportBody(),
subject: 'Test Report',
to: '${RECIPIENT_LIST}',
mimeType: 'text/html'
}
success {
echo 'Build && Test Succeeded.'
}
failure {
echo 'Build && Test Failed.'
}
}
}
// Below is how the report content is assembled; define whatever parameters you need.
For example:
// Generate the test report
def reportContent = """
<h2>OpenAPI Test Report (${MODULE_NAME})</h2>
<p>Environment: ${ENV}</p>
<p>Test Time: ${runtime}s</p>
<h3>Test Cases:</h3>
<ul>
<a href="https://jenkins.xxx/view/pid/job/${JOB_NAME}/$BUILD_ID/HTML_20Report/" target="_blank">https://jenkins.xxx/view/pid/job/${JOB_NAME}/$BUILD_ID/HTML_20Report/</a>
</ul>
<p>Pass Rate: ${passedRate}% </p>
<p>Test Range: ${PRIORITY}</p>
<h3>Failures: ${failedCount}</h3>
"""
II. goc-get-coverage
The original Jenkins shell script: used to collect code coverage (freestyle job)
1. Parameterized build process
- Choice parameter: SERVICE_NAME
- Choice parameter: MODULE_NAME
- Choice parameter: PRIORITY
2. Adding the shell command block
#!/bin/bash
GOC_SERVER_ADDRESS="goc-server:7788"
export GOC_SERVER_ADDRESS
TestURL="test-code-repo.git"
repoPath="/var/jenkins_work/workspace/${JOB_NAME}/${BUILD_ID}"
cov_file="${SERVICE_NAME}_TestCoverage.out"
cov_html="${SERVICE_NAME}_TestCoverage.html"
git clone $TestURL $repoPath
cd $repoPath
cp /home/jenkins/compose/conf/.env ${repoPath}/test
cp /home/jenkins/compose/conf/.env.storage ${repoPath}/test/storage/v1/.env
go mod tidy
# Run the test cases
cd $repoPath
go test -v -timeout 30m ./test/$MODULE_NAME/... -run="^Test"+"/Test"+$PRIORITY
# Jenkins runs non-interactively, so run docker exec without the -it flags
docker exec goc-server bash -c "goc list"
# Clear any existing coverage data
docker exec goc-server bash -c "goc clear --address=$GOC_SERVER_ADDRESS"
# Collect the coverage data
docker exec goc-server bash -c "goc profile --service $SERVICE_NAME -o ${cov_file}"
docker exec goc-server bash -c "go tool cover -func=${cov_file}"
# Generate the code coverage report
docker exec goc-server bash -c "gocov convert ${cov_file} | gocov-html > ${cov_html}"
cd ${repoPath}
# Create the directory for coverage reports
mkdir -p TestCoverage
# Copy the report files out of the docker container into the target directory
docker cp goc-server:/go/src/project-root/${cov_file} TestCoverage/
docker cp goc-server:/go/src/project-root/${cov_html} TestCoverage/
III. pid-goc-coverage
Given the two Jenkins jobs above, can they be merged into one pipeline that yields both the test script results and the code coverage results? Both are HTML, so how should that be written?
Rewrite the relevant parts of goc-get-coverage in pipeline syntax, fold them into the goc-test-coverage pipeline, and keep iterating.
1. Parameterized build process
Parameterized build: combine the parameters of the two original jobs. Not repeated here.
2. Pipeline script
The full pipeline script is as follows:
#!groovy
pipeline {
agent {label 'develop'}
environment {
GO_BINARY = "go"
TEST_REPORT_PATH = "test-report.xml"
RECIPIENTS = "xxx.com" // email address that receives the test report
GOC_SERVER_ADDRESS = "goc-server:7788"
COV_FILE = "${SERVICE_NAME}_TestCoverage.out"
COV_HTML = "${SERVICE_NAME}_TestCoverage.html"
WEBHOOK_URL = "https://qyapi.weixin.qq.com/cgi-bin/webhook/send?key=yourkey"
}
stages {
stage('checkout') {
steps {
sh """
export PATH="${arcPath}:${goRoot}:${kubectlRoot}:${makeRoot}:$PATH"
git clone --depth 1 ${repoURL} ${repoPath}
"""
echo "${JOB_NAME}"
}
}
stage('unit-test') {
steps {
echo 'yes'
}
}
stage('api-test') {
steps {
sh """
export CGO_ENABLED=1;GO111MODULE=on;GOINSECURE=git.xxx;GOPRIVATE=git.xxx;GOPROXY=https://goproxy.cn,direct
export PATH="${arcPath}:${goRoot}:${kubectlRoot}:${makeRoot}:$PATH"
go env -w GOINSECURE=git.xxx
go env -w GOPRIVATE=git.xxx
cd ${repoPath}
cp /home/jenkins/testdata/randomfile3G test/storage/upload/
cp /home/jenkins/testdata/randomfile100M test/storage/upload/
cp /home/jenkins/testdata/randomfile3G test/storage/Adminupload/
mkdir test-report
echo $ENV
cd ${repoPath}
echo 'cp test .env'
cp /home/jenkins/compose/conf/.env ${repoPath}/test
cp /home/jenkins/compose/conf/.env.storage ${repoPath}/test/storage/v1/.env
export PATH="/home/jenkins/go/bin:$PATH"
go mod tidy
if [ $MODULE_NAME = "merchandise" ] || [ $MODULE_NAME = "storage" ]
then
GOMAXPROCS=1 go test -v -timeout 30m ./test/$MODULE_NAME/... -run="^Test"+"/Test"+$PRIORITY -json | go-test-report
else
go test -v -timeout 30m ./test/$MODULE_NAME/... -run="^Test"+"/Test"+$PRIORITY -json | go-test-report
fi
mv test_report.html ${repoPath}/test-report
"""
echo 'Api-test pass.'
}
}
stage('Generate Test Coverage Report') {
steps {
sh """
export GOC_SERVER_ADDRESS=${GOC_SERVER_ADDRESS}
docker exec goc-server bash -c "goc list"
docker exec goc-server bash -c "goc clear --address=\${GOC_SERVER_ADDRESS}"
docker exec goc-server bash -c "goc profile --service \${SERVICE_NAME} -o \${COV_FILE}"
docker exec goc-server bash -c "go tool cover -func=\${COV_FILE}"
docker exec goc-server bash -c "gocov convert \${COV_FILE} | gocov-html > \${COV_HTML}"
mkdir -p ${repoPath}/TestCoverage
docker cp goc-server:/go/src/project-root/${COV_FILE} ${repoPath}/TestCoverage/
docker cp goc-server:/go/src/project-root/${COV_HTML} ${repoPath}/TestCoverage/
"""
}
}
stage('Test Coverage Report') {
steps {
echo "coverage report"
publishHTML (target: [
allowMissing: false,
alwaysLinkToLastBuild: false,
keepAll: true,
reportDir: "${BUILD_ID}/TestCoverage",
reportFiles: "${COV_HTML}",
reportName: "Test Coverage HTML Report"
])
}
}
stage('API Test Report') {
steps {
echo "test report"
publishHTML (target: [
allowMissing: false,
alwaysLinkToLastBuild: false,
keepAll: true,
reportDir: "${BUILD_ID}/test-report",
reportFiles: 'test_report.html',
reportName: "HTML Report"
])
}
}
}
post {
always {
sh """
cd ${repoPath}
go test -v ./test/job/... '-run=^TestJobList/TestSuccessJobList_Terminate'
go test -v ./test/job/... '-run=^TestJobList/TestSuccessJobListSuspended_Terminate'
if [[ $ENV = "coverage" && ($PRIORITY = "AdminP0" || $PRIORITY = "AdminP1") ]]
then
go test -v ./test/job/... '-run=^TestJobList/TestSuccessAdminJobListSuspended_Terminate'
go test -v ./test/job/... '-run=^TestJobList/TestSuccessAdminJobList_Terminate'
fi
ls ${repoPath}/test-report
ls ${repoPath}/TestCoverage
"""
echo "clean over..."
emailext body: genReportBody(),
subject: 'Test Report',
to: '${RECIPIENT_LIST}',
mimeType: 'text/html'
}
success {
echo 'Build && Test Succeeded.'
}
failure {
echo 'Build && Test Failed.'
}
}
}
def convertToJUnitXML(jsonFile, junitFile) {
sh "python - <<EOF\nimport json\nimport xml.etree.ElementTree as ET\n\ndef json_to_junit(json_file, junit_file):\n with open(json_file, 'r') as f:\n test_results = json.load(f)\n\n testsuites = ET.Element('testsuites')\n\n for package in test_results:\n testsuite = ET.SubElement(testsuites, 'testsuite', name=package['Package'])\n for test in package['Tests']:\n testcase = ET.SubElement(testsuite, 'testcase', name=test['Name'])\n if test['Action'] == 'fail':\n failure = ET.SubElement(testcase, 'failure', type='failure')\n failure.text = test['Output']\n elif test['Action'] == 'skip':\n skipped = ET.SubElement(testcase, 'skipped')\n skipped.text = test['Output']\n\n tree = ET.ElementTree(testsuites)\n tree.write(junit_file, xml_declaration=True, encoding='utf-8')\n\njson_to_junit('${jsonFile}', '${junitFile}')\nEOF"
}
def genReportBody() {
// Build the test report content
def apitestReport = readFile("${BUILD_ID}/test-report/test_report.html")
def codeCoverageReport = readFile("${repoPath}/TestCoverage/${COV_HTML}")
// Added temporarily in genReportBody() just to verify the files read correctly
def testApiReportContents = sh(script: "cat ${BUILD_ID}/test-report/test_report.html", returnStdout: true).trim()
echo testApiReportContents
def testCovReportContents = sh(script: "cat ${repoPath}/TestCoverage/${COV_HTML}", returnStdout: true).trim()
echo testCovReportContents
// Get the execution time
sh(script: 'pwd')
def duration = sh(script: 'grep "Duration:" '+"${BUILD_ID}/test-report/test_report.html"+' | awk \'{print substr($6,9,length($6)-17)}\'', returnStdout: true).trim()
echo duration
def runtime = duration.split("\\.")[0].trim()
echo runtime
// Get the total count
def total = sh(script: 'grep "Total:" '+"${BUILD_ID}/test-report/test_report.html"+' | awk \'{print substr($5,9,length($5)-26)}\'', returnStdout: true).trim()
// Get the pass rate
def passedCount = sh(script: 'grep "Passed:" '+"${BUILD_ID}/test-report/test_report.html"+' | awk \'{print substr($5,9,length($5)-17)}\'', returnStdout: true).trim()
def skippedCount = sh(script: 'grep "Skipped:" '+"${BUILD_ID}/test-report/test_report.html"+' | awk \'{print substr($5,9,length($5)-17)}\'', returnStdout: true).trim()
def failedCount = sh(script: 'grep "Failed:" '+"${BUILD_ID}/test-report/test_report.html"+' | awk \'{print substr($5,9,length($5)-17)}\'', returnStdout: true).trim()
def passedRate = String.format("%.2f", passedCount.toInteger()/(total.toInteger()-skippedCount.toInteger()) * 100)
// Generate the test report
def reportContent = """
<h2>OpenAPI Test Report (${MODULE_NAME})</h2>
<p>Environment: ${ENV}</p>
<p>Test Time: ${runtime}s</p>
<h3>Test Cases:</h3>
<ul>
<li>
API Test Report:
<a href="https://jenkins.xxx/view/pid/job/${JOB_NAME}/$BUILD_ID/HTML_20Report/" target="_blank">https://jenkins.xxx/view/pid/job/${JOB_NAME}/$BUILD_ID/HTML_20Report/</a>
</li>
<li>
Test Coverage Report:
<a href="https://jenkins.xxx/view/pid/job/${JOB_NAME}/$BUILD_ID/Test_20Coverage_20HTML_20Report/" target="_blank">https://jenkins.xxx/view/pid/job/${JOB_NAME}/$BUILD_ID/Test_20Coverage_20HTML_20Report/</a>
</li>
</ul>
<p>Pass Rate: ${passedRate}% </p>
<p>Test Range: ${PRIORITY}</p>
<h3>Failures: ${failedCount}</h3>
"""
def project_user = [
'job': '@aaa',
'storage': '@bbb',
'account': '@ccc',
'cloudapp': '@ddd',
'license': '@eee',
'merchandise': '@aaa']
def duty=project_user[MODULE_NAME]
//test
sh """
sh /home/jenkins/compose/conf/wxmsg.sh "https://qyapi.weixin.qq.com/cgi-bin/webhook/send?key=yourkey" ${MODULE_NAME} ${ENV} ${runtime} $BUILD_ID ${passedRate} ${failedCount} "$duty" ${JOB_NAME} ${PRIORITY}
"""
return reportContent
}
The pipeline above sends the results to an enterprise WeChat group via a bot, which needs to be configured and uses wxmsg.sh.
Its contents are as follows:
url=$1
MODULE_NAME=$2
ENV=$3
runtime=$4
BUILD_ID=$5
passedRate=$6
failedCount=$7
modOwner=$8
jobName=$9
testRange=${10}
curl "$url" \
-H 'Content-Type: application/json' \
-d '
{
"msgtype": "markdown",
"markdown": {
"content": "# OpenAPI Test Report <font color=\"warning\">('$MODULE_NAME')</font>\n
>### Environment: <font color=\"comment\">'$ENV'</font>
>### Test Time: <font color=\"comment\">'$runtime's</font>
>### API Test Report:\n<font color=\"comment\">[https://jenkins.xxx.cn/view/pid/job/'$jobName'/'$BUILD_ID'/HTML_20Report](https://jenkins.xxx.cn/view/pid/job/'$jobName'/'$BUILD_ID'/HTML_20Report)</font>
>### Test Coverage Report:\n<font color=\"comment\">[https://jenkins.xxx.cn/view/pid/job/'$jobName'/'$BUILD_ID'/Test_20Coverage_20HTML_20Report](https://jenkins.xxx.cn/view/pid/job/'$jobName'/'$BUILD_ID'/Test_20Coverage_20HTML_20Report)</font>
>### Pass Rate: <font color=\"info\">'$passedRate'%</font>
>### Test Range: <font color=\"comment\">'$testRange'</font>
>### Failures: <font color=\"red\">'$failedCount'</font>
>### Mod Owner: <font color=\"comment\">'$modOwner'</font>",
"mentioned_list":["@all"]
}
}
'
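The payload above is assembled by concatenating single-quoted fragments, which is easy to get wrong. A minimal alternative sketch (placeholder values; the real script receives these as positional arguments) builds the JSON with printf first and sends it in one piece:

```shell
#!/bin/sh
# Hypothetical example values; wxmsg.sh takes these as $2, $6, etc.
MODULE_NAME="job"
passedRate="98.75"

# %s substitutes the values, %% emits a literal '%', and \\n keeps a literal
# "\n" escape inside the JSON string (a markdown line break for the bot).
payload=$(printf '{"msgtype":"markdown","markdown":{"content":"# OpenAPI Test Report (%s)\\nPass Rate: %s%%"}}' \
  "$MODULE_NAME" "$passedRate")
echo "$payload"

# The real call would then be a single:
# curl "$url" -H 'Content-Type: application/json' -d "$payload"
```

Building the payload in one variable makes quoting failures visible with a plain echo before anything is posted to the webhook.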
3. Issues encountered in practice
Optimization 1: run the job module tests once and collect coverage for both the job and standard-compute services
In practice the project had this requirement: collect code coverage for both the job service and the standard-compute service while running the job module's test scripts only once. To support it, the "Generate Test Coverage Report" stage was modified as follows:
stage('Generate Test Coverage Report') {
steps {
script {
def generateCoverage = { serviceName, covFile, htmlFile, coverPath ->
sh """
docker exec goc-server bash -c "cd ${coverPath} && goc profile --service ${serviceName} -o ${GOC_PATH}/COVERAGE/${covFile}"
docker exec goc-server bash -c "cd ${coverPath} && go tool cover -func=${GOC_PATH}/COVERAGE/${covFile}"
docker exec goc-server bash -c "cd ${coverPath} && gocov convert ${GOC_PATH}/COVERAGE/${covFile} | gocov-html > ${GOC_PATH}/COVERAGE/${htmlFile}"
docker cp goc-server:${GOC_PATH}/COVERAGE/${htmlFile} ${repoPath}/TestCoverage/
""".trim()
}
sh "export GOC_SERVER_ADDRESS=${env.GOC_SERVER_ADDRESS}"
sh "cd ${repoPath} && mkdir -p ${repoPath}/TestCoverage"
sh "docker exec goc-server bash -c 'cd ${env.GOC_PATH} && mkdir -p ${env.GOC_PATH}/COVERAGE'"
sh "docker exec goc-server bash -c 'goc list'"
sh "docker exec goc-server bash -c 'goc clear --address=${env.GOC_SERVER_ADDRESS}'"
if (env.MODULE_NAME == "job") {
generateCoverage(env.SERVICE_NAME, env.COV_FILE, env.COV_HTML, env.PROJECT_ROOT_PATH)
generateCoverage('standard-compute', env.COMPUTE_COV_FILE, env.COMPUTE_COV_HTML, env.COMPUTE_COVER_PATH)
} else {
generateCoverage(env.SERVICE_NAME, env.COV_FILE, env.COV_HTML, env.PROJECT_ROOT_PATH)
}
}
}
}
Later, since mock files and idgen files should not be counted, the code was changed again:
stage('Generate Test Coverage Report') {
steps {
script {
def generateCoverage = { serviceName, covFile, htmlFile, coverPath ->
sh """
docker exec goc-server sh -c "cd ${coverPath} && goc profile --service ${serviceName} -o ${GOC_PATH}/COVERAGE/${covFile} || exit 1"
docker exec goc-server sh -c "cd ${coverPath} && go tool cover -func=${GOC_PATH}/COVERAGE/${covFile} || exit 1"
# New step: use awk to filter out mock and idgen files
docker exec goc-server sh -c "awk '!/mock/ && !/idgen/' ${GOC_PATH}/COVERAGE/${covFile} > ${GOC_PATH}/COVERAGE/filtered_${covFile} || exit 1"
docker exec goc-server sh -c "cd ${coverPath} && gocov convert ${GOC_PATH}/COVERAGE/filtered_${covFile} | gocov-html > ${GOC_PATH}/COVERAGE/${htmlFile} || exit 1"
docker cp goc-server:${GOC_PATH}/COVERAGE/${htmlFile} ${repoPath}/TestCoverage/ || exit 1
""".trim()
}
sh """
export GOC_SERVER_ADDRESS=${env.GOC_SERVER_ADDRESS}
cd ${repoPath}
mkdir -p TestCoverage
docker exec goc-server sh -c 'cd ${env.GOC_PATH} && mkdir -p ${env.GOC_PATH}/COVERAGE'
docker exec goc-server sh -c 'goc list'
docker exec goc-server sh -c 'goc clear --address=${env.GOC_SERVER_ADDRESS}'
"""
if (env.MODULE_NAME == "job") {
generateCoverage(env.SERVICE_NAME, env.COV_FILE, env.COV_HTML, env.PROJECT_ROOT_PATH)
generateCoverage('standard-compute', env.COMPUTE_COV_FILE, env.COMPUTE_COV_HTML, env.COMPUTE_COVER_PATH)
} else {
generateCoverage(env.SERVICE_NAME, env.COV_FILE, env.COV_HTML, env.PROJECT_ROOT_PATH)
}
}
}
}
Also, when the module under test is job, we want to send both the job and standard-compute coverage reports to the enterprise WeChat group, which requires adjusting part of the pipeline:
stage('Test Coverage Report') {
steps {
script {
// Check whether the COMPUTE_COV_HTML file exists
if (fileExists("${repoPath}/TestCoverage/${env.COMPUTE_COV_HTML}")) {
publishHTML(target: [
allowMissing: false,
alwaysLinkToLastBuild: false,
keepAll: true,
reportDir: "${BUILD_ID}/TestCoverage",
reportFiles: "${env.COV_HTML},${env.COMPUTE_COV_HTML}",
reportName: "Test Coverage HTML Report"
])
} else {
echo "Warning: COMPUTE_COV_HTML file does not exist."
publishHTML(target: [
allowMissing: false,
alwaysLinkToLastBuild: false,
keepAll: true,
reportDir: "${BUILD_ID}/TestCoverage",
reportFiles: "${env.COV_HTML}",
reportName: "Test Coverage HTML Report"
])
}
}
}
}
Additionally, once this was configured, running the pipeline could already collect the job and standard-compute coverage. But at this point, to make debugging easier, only a single case was run, and generating the test result file test_report.html failed with: Error when executing always post condition:
java.lang.ArithmeticException: Division undefined
This was traced to the following line:
def passedRate = String.format("%.2f", passedCount.toInteger()/(total.toInteger()-skippedCount.toInteger()) * 100)
The cause: the denominator can be 0 here, which raises the exception.
After the following change the problem was resolved:
def totalInt = total.toInteger()
def skippedCountInt = skippedCount.toInteger()
def passedRate
if (totalInt - skippedCountInt > 0) {
passedRate = String.format("%.2f", passedCount.toInteger() / (totalInt - skippedCountInt) * 100)
} else {
passedRate = "N/A" // or any other value indicating a zero denominator
}
Optimization 2: automatically seed the database with initial data
For this requirement, first identify the databases and the seed data you need and export them as SQL scripts, then write the supporting scripts:
(1) Write a database_list.txt file with one database name per line, for example:
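For illustration, a hypothetical database_list.txt (the actual names depend on your services):

```text
job_db
storage_db
account_db
```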
(2) Write an import_mysql.sh file that creates the databases and imports the schema and data files.
DB_USER="your_db_user"
DB_PASSWORD="password"
DB_HOST="your_db_host"
DB_PORT="port"
DB_LIST_FILE="/home/jenkins/compose/conf/mysql_init/database_list.txt"
SQL_FOLDER="/home/jenkins/compose/conf/mysql_init/mysqldata"
INSERT_SQL_FILE="/home/jenkins/compose/conf/mysql_init/mysqldata/insert_statements.sql"
create_database_if_not_exists() {
local dbname=$1
echo "Creating $dbname database if it doesn't exist..."
mysql -h "$DB_HOST" -P "$DB_PORT" -u "$DB_USER" -p"$DB_PASSWORD" -e "CREATE DATABASE IF NOT EXISTS \`$dbname\`;"
}
# Read database names from the file and create each database
while IFS= read -r dbname; do
create_database_if_not_exists "$dbname"
done < "$DB_LIST_FILE"
echo "All specified databases have been created."
import_database() {
local dbname=$1
local sql_file="${SQL_FOLDER}/${dbname}.sql"
if [ "$dbname" != "db1" ] && [ -f "$sql_file" ]; then
echo "Starting import for $dbname..."
mysql -h "$DB_HOST" -P "$DB_PORT" -u "$DB_USER" -p"$DB_PASSWORD" "$dbname" < "$sql_file" && echo "Imported data into $dbname successfully." || echo "Error occurred during data import into $dbname."
else
echo "SQL file $sql_file does not exist."
fi
}
# Kick off imports into all databases concurrently
while IFS= read -r dbname; do
import_database "$dbname" &
done < "$DB_LIST_FILE"
# Wait for all background jobs to finish
wait
echo "All databases have been imported."
# Check whether the SQL file exists
if [ -f "$INSERT_SQL_FILE" ]; then
echo "Starting to import INSERT statements..."
mysql -h "$DB_HOST" -P "$DB_PORT" -u "$DB_USER" -p"$DB_PASSWORD" < "$INSERT_SQL_FILE" && echo "INSERT statements imported successfully." || echo "Error occurred during INSERT statements import."
else
echo "SQL file $INSERT_SQL_FILE does not exist."
fi
The script originally defined:
DB_USER="your_db_user"
DB_PASSWORD="password"
DB_HOST="your_db_host:port"
Running ./import_mysql.sh then failed with: ERROR 2005 (HY000): Unknown MySQL server host 'localhost:3306'
The cause: DB_HOST was set to "localhost:3306", which is not a valid format for the mysql command-line client. The -h option accepts only a hostname or IP address and must not include a port; the port has to be passed separately with -P. Host and port therefore need to be split into separate variables, as already done in the script above.
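If the value arrives in combined "host:port" form, it can also be split inside the script. A minimal sketch (hypothetical helper, not part of the original script):

```shell
#!/bin/sh
# Split a combined "host:port" value into the separate -h/-P arguments
# that the mysql client expects.
split_host_port() {
  combined=$1
  host=${combined%%:*}   # everything before the first ':'
  port=${combined##*:}   # everything after the last ':'
  echo "-h $host -P $port"
}

split_host_port "localhost:3306"   # -> -h localhost -P 3306
```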
(3) Modify the pipeline
Add the following to the pid-goc-coverage pipeline:
Add the database initialization step:
stage('Initialize Database') {
steps {
script {
// Call the import_mysql.sh script to initialize the databases
sh 'cd ${MYSQL_INIT_PATH} && ./import_mysql.sh'
}
}
}
Add the final cleanup step that drops the databases:
script {
sh '''
cd ${MYSQL_INIT_PATH}
while read db_name; do
mysql -h "yourhost" -P "port" -u "user" -p"password" -e "DROP DATABASE IF EXISTS $db_name;"
done < database_list.txt
'''
}
Optimization 3: merge multiple HTML files
Initially this was done inside the api-test stage; later it was moved into the post section so that it always runs. The commands are:
ls -la ${repoPath}/test-report
ls ${repoPath}/test_report_*.html
cat ${repoPath}/test_report_*.html >> ${repoPath}/test-report/test_report.html
ls -la ${repoPath}/test-report
rm ${repoPath}/test_report_*.html
mkdir -p ${repoPath}/TestCoverage
ls -la ${repoPath}/TestCoverage
With that solved, the next problem surfaced: the .env files fetched for the different environment partitions were picked up from both the storage directory and the job directory at the same time. The changes:
stage('api-test') {
steps {
script {
// Expose the environment-file arrays as environment variables
env.ENV_FILES = ['CloudSlurm', 'CloudPbs', 'HpcSlurm', 'HpcPbs'].join(',')
env.STORAGE_ENV_FILES = ['Storage-CloudSlurm', 'Storage-CloudPbs', 'Storage-HpcSlurm', 'Storage-HpcPbs'].join(',')
// Select the environment-file array based on the module
def selectedEnvFiles = (env.MODULE_NAME == 'storage') ? env.STORAGE_ENV_FILES.tokenize(',') : env.ENV_FILES.tokenize(',')
for (int i = 0; i < selectedEnvFiles.size(); i++) {
def envFile = selectedEnvFiles[i]
sh """
#!/bin/bash
set -e
set -o pipefail
echo "Testing with ${envFile}"
# Clean up old environment config files before each iteration
rm -f ${repoPath}/test/.env*
rm -f ${repoPath}/test/storage/v1/.env*
# Check for and copy the new environment config file
if [ ! -f "/home/jenkins/compose/conf/job/${envFile}" ]; then
echo "Environment file /home/jenkins/compose/conf/job/${envFile} does not exist"
exit 1
fi
cp /home/jenkins/compose/conf/job/${envFile} ${repoPath}/test/.env
# When the module is storage, also handle the storage environment file
if [ "${env.MODULE_NAME}" == "storage" ]; then
if [ ! -f "/home/jenkins/compose/conf/storage/${envFile}" ]; then
echo "Storage environment file /home/jenkins/compose/conf/storage/${envFile} does not exist"
exit 1
fi
cp /home/jenkins/compose/conf/storage/${envFile} ${repoPath}/test/storage/v1/.env
fi
cd ${repoPath}
mkdir -p test-report
echo $ENV
export PATH="/home/jenkins/go/bin:$PATH"
go mod tidy
# Run the tests and write the report to a file
if [[ "${MODULE_NAME}" == "merchandise" || "${MODULE_NAME}" == "storage" ]]; then
GOMAXPROCS=1 go test -v -timeout 30m ./test/${MODULE_NAME}/... -run="^Test"+"/Test"+$PRIORITY -json | go-test-report -o test_report_${envFile}.html
else
go test -v -timeout 30m ./test/${MODULE_NAME}/... -run="TestZoneList" -json | go-test-report -o test_report_${envFile}.html
fi
"""
}
echo 'Api-test pass.'
}
}
}
Running the pipeline at this point failed with:
+ grep Failed: /var/jenkins_work/workspace/pid-goc-coverage/184/test-report/test_report.html
+ awk '{print substr($5,9,length($5)-17)}'
Error when executing always post condition:
java.lang.NumberFormatException: For input string: "4
4
4
4"
The cause: awk was extracting the failed-test count, but multiple matches were returned, so all the numbers were concatenated into one string with no separator. The correct behavior is to return a single value, i.e. to sum the multiple matches.
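The two behaviors can be reproduced in isolation with fabricated report lines (simplified to `Failed: N` here; the real go-test-report HTML additionally needs the substr offsets shown in the pipeline):

```shell
#!/bin/sh
# Hypothetical merged report: one "Failed:" line per environment file.
cat > /tmp/merged_report.txt <<'EOF'
Failed: 4
Failed: 4
Failed: 4
Failed: 4
EOF

# One value per match: Groovy's toInteger() then chokes on "4\n4\n4\n4".
grep 'Failed:' /tmp/merged_report.txt | awk '{print $2}'

# Summed inside awk: a single number comes back, safe to parse.
grep 'Failed:' /tmp/merged_report.txt | awk '{sum += $2} END {print sum}'   # -> 16
```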
Fix: redefine how the counts are extracted from the report:
def genReportBody() {
// Build the test report content
def apitestReport = readFile("${BUILD_ID}/test-report/test_report.html")
def codeCoverageReport = readFile("${repoPath}/TestCoverage/${COV_HTML}")
//def ComputeCoverageReport = readFile("${repoPath}/TestCoverage/${COMPUTE_COV_HTML}")
// Get the execution time
sh(script: 'pwd')
def duration = sh(script: 'grep "Duration:" '+"${repoPath}/test-report/test_report.html"+' | awk \'{print substr($6,9,length($6)-17)}\'', returnStdout: true).trim()
echo duration
def runtime = duration.split("\\.")[0].trim()
echo runtime
def total = sh(script: "grep 'Total:' ${repoPath}/test-report/test_report.html | awk '{sum += substr(\$5, 9, length(\$5)-26)} END {print sum}'", returnStdout: true).trim()
// Handle possible multi-line output
def totalCounts = total.split("\\n")
def totalsum = 0;
for (String count : totalCounts) {
totalsum += Integer.parseInt(count.trim())
}
def passedCount = sh(script: "grep 'Passed:' ${repoPath}/test-report/test_report.html | awk '{sum += substr(\$5, 9, length(\$5)-17)} END {print sum}'", returnStdout: true).trim()
// Handle possible multi-line output
def passedCounts = passedCount.split("\\n")
def passedsum = 0;
for (String count : passedCounts) {
passedsum += Integer.parseInt(count.trim())
}
def failedCount = sh(script: "grep 'Failed:' ${repoPath}/test-report/test_report.html | awk '{sum += substr(\$5, 9, length(\$5)-17)} END {print sum}'", returnStdout: true).trim()
// Handle possible multi-line output
def failedCounts = failedCount.split("\\n")
def failedsum = 0;
for (String count : failedCounts) {
failedsum += Integer.parseInt(count.trim())
}
def skippedCount = sh(script: "grep 'Skipped:' ${repoPath}/test-report/test_report.html | awk '{sum += substr(\$5, 9, length(\$5)-17)} END {print sum}'", returnStdout: true).trim()
// Handle possible multi-line output
def skippedCounts = skippedCount.split("\\n")
def skippedsum = 0
for (String count : skippedCounts) {
skippedsum += Integer.parseInt(count.trim())
}
def totalInt = totalsum
def skippedCountInt = skippedsum
def passedRate
if (totalInt - skippedCountInt > 0) {
passedRate = String.format("%.2f", passedsum / (totalInt - skippedCountInt) * 100)
} else {
passedRate = "N/A" // or any other value indicating a zero denominator
}
def failedCountInt = failedsum
Running the pipeline again, the report content displayed incorrectly.
This was traced to the earlier runtime definition; after the change:
post {
always {
...
script {
def reportBody = genReportBody()
echo "Report Body: ${reportBody}"
def emailBody = "${reportBody}The total duration is ${totalDurationStr} seconds"
echo "Email Body: ${emailBody}"
emailext (
subject: 'Test Report',
body: emailBody,
mimeType: 'text/html',
to: '${RECIPIENT_LIST}'
)
}
}
// other steps unchanged...
}
totalDuration = 0
// Use find to locate all test_report.html files and store them in an array
def reportFiles = sh(script: "find ${repoPath}/test-report -type f -name 'test_report.html'", returnStdout: true).trim().split('\n')
// Iterate over each file in the array
reportFiles.each { reportFile ->
// Get each file's duration and accumulate it
def durationString = sh(script: "grep 'Duration:' ${reportFile} | awk '{sum += substr(\$5, 9, length(\$5)-17)} END {print sum}'", returnStdout: true).trim()
def duration = durationString.toInteger()
totalDuration += duration
}
// Convert the total duration from milliseconds to a suitable unit (seconds)
totalDuration = totalDuration / 1000
// Print the total duration
echo "Total Duration in seconds: ${totalDuration}"
totalDurationStr = "${totalDuration}"
With that solved, a report was generated, but a new error appeared; the API Test Report was in fact not produced. The log showed: [htmlpublisher] Archiving HTML reports...
[htmlpublisher] Archiving at BUILD level /var/jenkins_work/workspace/pid-goc-coverage/290/test-report to /var/jenkins_home/jobs/pid-goc-coverage/builds/290/htmlreports/Test_20API_20HTML_20Report
ERROR: Directory '/var/jenkins_work/workspace/pid-goc-coverage/290/test-report' exists but failed copying to '/var/jenkins_home/jobs/pid-goc-coverage/builds/290/htmlreports/Test_20API_20HTML_20Report'.
Troubleshooting steps:
First add a few commands to check whether the files exist:
+ ls -la /var/jenkins_work/workspace/pid-goc-coverage/290/test-report
total 0
drwxrwxr-x 2 jenkins jenkins 6 Apr 23 13:49 .
drwxrwxr-x 7 jenkins jenkins 260 Apr 23 13:49 ..
[Pipeline] sh
+ whoami
jenkins
[Pipeline] sh
+ pwd
/var/jenkins_work/workspace/pid-goc-coverage
The log above shows that the report was being published before test_report.html had even been generated: the step that merges the HTML files sat in the post section, after publishing. With the cause identified, there are two ways to fix it:
Option 1: move the HTML-merging step into a stage that runs before the report is published
Option 2: move the report-publishing step into post as well, after the HTML-merging step
I went with option 2:
post {
always {
sh """
cd ${repoPath}
go test -v ./test/job/... '-run=^TestJobList/TestSuccessJobList_Terminate'
go test -v ./test/job/... '-run=^TestJobList/TestSuccessJobListSuspended_Terminate'
if [[ $ENV = "coverage" && ($PRIORITY = "AdminP0" || $PRIORITY = "AdminP1") ]]
then
go test -v ./test/job/... '-run=^TestJobList/TestSuccessAdminJobListSuspended_Terminate'
go test -v ./test/job/... '-run=^TestJobList/TestSuccessAdminJobList_Terminate'
fi
ls -la ${repoPath}
ls -la ${repoPath}/test-report
ls ${repoPath}/test_report_*.html
cat ${repoPath}/test_report_*.html >> ${repoPath}/test-report/test_report.html
ls -la ${repoPath}/test-report
rm ${repoPath}/test_report_*.html
mkdir -p ${repoPath}/TestCoverage
ls -la ${repoPath}/TestCoverage
ls -la ${repoPath}/test-report
"""
echo "clean over..."
script {
echo "Publish API TEST REPORT"
sh "ls -la ${repoPath}/test-report"
sh "whoami"
sh "pwd"
publishHTML (target: [
allowMissing: false,
alwaysLinkToLastBuild: false,
keepAll: true,
reportDir: "${BUILD_ID}/test-report",
reportFiles: "test_report.html",
reportName: "Test API HTML Report"
])
}
script {
def reportBody = genReportBody()
echo "Report Body: ${reportBody}"
def emailBody = "${reportBody}The total duration is ${totalDurationStr} seconds"
echo "Email Body: ${emailBody}"
emailext (
subject: 'Test Report',
body: emailBody,
mimeType: 'text/html',
to: '${RECIPIENT_LIST}'
)
}
}
success {
echo 'Build && Test Succeeded.'
}
failure {
echo 'Build && Test Failed.'
}
}
}
With that fixed, both the API Test Report and the Test Coverage Report were available. But since multiple HTML files are generated from the different environment-variable combinations, a new problem appeared: the API Test Report link did open and list the files, yet only the first HTML file shown could actually be clicked to view the case results; the other reports were listed but not clickable!
The earlier way of publishing the API Test Report was not quite right, so it was adjusted below so that every report is viewable under the same link.
Also, in the actual project only the job module needs to run under different environment-variable combinations; the other modules run identically, so to limit resource consumption the pipeline was changed as follows:
stage('api-test') {
steps {
script {
// Expose the environment-file arrays as environment variables
env.ENV_FILES = ['CloudSlurm', 'CloudPbs', 'HpcSlurm', 'HpcPbs'].join(',')
//env.STORAGE_ENV_FILES = ['AdminCloudPbs', 'AdminHpcPbs', 'AdminCloudSlurm', 'AdminHpcSlurm'].join(',')
// Select the environment-file array based on the module
def selectedEnvFiles = env.ENV_FILES.tokenize(',')
for (int i = 0; i < selectedEnvFiles.size(); i++) {
def envFile = selectedEnvFiles[i]
sh """
echo "Testing with ${envFile}"
# Clean up old environment config files before each iteration
rm -f ${repoPath}/test/.env*
rm -f ${repoPath}/test/storage/v1/.env*
# Check for and copy the new environment config file
if [ "${env.MODULE_NAME}" == "job" ]; then
if [ ! -f "/home/jenkins/compose/conf/job/${envFile}" ]; then
echo "Environment file /home/jenkins/compose/conf/job/${envFile} does not exist"
exit 1
fi
cp /home/jenkins/compose/conf/job/${envFile} ${repoPath}/test/.env
elif [ "${env.MODULE_NAME}" == "storage" ]; then
# When the module is storage, also handle the storage environment file
if [ ! -f "/home/jenkins/compose/conf/storage/CloudSlurm" ]; then
echo "Storage environment file /home/jenkins/compose/conf/storage/CloudSlurm does not exist"
exit 1
fi
cp /home/jenkins/compose/conf/storage/CloudSlurm ${repoPath}/test/storage/v1/.env
else
# For other modules, check /home/jenkins/compose/conf/job/CloudSlurm
if [ ! -f "/home/jenkins/compose/conf/job/CloudSlurm" ]; then
echo "The directory /home/jenkins/compose/conf/job/CloudSlurm does not exist"
exit 1
fi
cp /home/jenkins/compose/conf/job/CloudSlurm ${repoPath}/test/.env
fi
cd ${repoPath}
mkdir -p test-report
echo $ENV
export PATH="/home/jenkins/go/bin:$PATH"
go mod tidy
# Run the tests and write the report to a file
if [ "${MODULE_NAME}" == "merchandise" ] || [ "${MODULE_NAME}" == "storage" ]; then
GOMAXPROCS=1 go test -v -timeout 30m ./test/${MODULE_NAME}/... -run="^Test"+"/Test"+$PRIORITY -json | go-test-report -o test_report_${envFile}.html
else
go test -v -timeout 30m ./test/${MODULE_NAME}/... -run="^Test"+"/Test"+$PRIORITY -json | go-test-report -o test_report_${envFile}.html
fi
mv ${repoPath}/test_report_*.html ${repoPath}/test-report
"""
}
echo 'Api-test pass.'
}
}
}
}
post {
always {
sh "ls -la ${repoPath}/test-report"
sh "ls -la ${repoPath}/TestCoverage"
script {
echo "Publish API TEST REPORT"
def reportFiles = "test_report_CloudPbs.html,test_report_CloudSlurm.html,test_report_HpcPbs.html,test_report_HpcSlurm.html"
if (env.MODULE_NAME != "job") {
reportFiles = "test_report_CloudSlurm.html"
}
publishHTML (target: [
allowMissing: false,
alwaysLinkToLastBuild: false,
keepAll: true,
reportDir: "${repoPath}/test-report",
reportFiles: reportFiles,
reportName: "Test API HTML Report"
])
}
script {
emailext (
subject: 'Test Report',
body: genReportBody(),
mimeType: 'text/html',
to: '${RECIPIENT_LIST}'
)
}
sh """
cd ${repoPath}
if [[ $MODULE_NAME == "job" ]]; then
go test -v ./test/job/... '-run=^TestJobList/TestSuccessJobList_Terminate'
go test -v ./test/job/... '-run=^TestJobList/TestSuccessJobListSuspended_Terminate'
if [[ $ENV == "coverage" && ($PRIORITY == "AdminP0" || $PRIORITY == "AdminP1") ]]; then
go test -v ./test/job/... '-run=^TestJobList/TestSuccessAdminJobListSuspended_Terminate'
go test -v ./test/job/... '-run=^TestJobList/TestSuccessAdminJobList_Terminate'
fi
fi
"""
echo "clean over..."
}
success {
echo 'Build && Test Succeeded.'
}
failure {
echo 'Build && Test Failed.'
}
}
}
def convertToJUnitXML(jsonFile, junitFile) {
sh "python - <<EOF\nimport json\nimport xml.etree.ElementTree as ET\n\ndef json_to_junit(json_file, junit_file):\n with open(json_file, 'r') as f:\n test_results = json.load(f)\n\n testsuites = ET.Element('testsuites')\n\n for package in test_results:\n testsuite = ET.SubElement(testsuites, 'testsuite', name=package['Package'])\n for test in package['Tests']:\n testcase = ET.SubElement(testsuite, 'testcase', name=test['Name'])\n if test['Action'] == 'fail':\n failure = ET.SubElement(testcase, 'failure', type='failure')\n failure.text = test['Output']\n elif test['Action'] == 'skip':\n skipped = ET.SubElement(testcase, 'skipped')\n skipped.text = test['Output']\n\n tree = ET.ElementTree(testsuites)\n tree.write(junit_file, xml_declaration=True, encoding='utf-8')\n\njson_to_junit('${jsonFile}', '${junitFile}')\nEOF"
}
def genReportBody() {
// test report content variables
def testReport1, testReport2, testReport3, testReport4
// pick which reports to read based on the module name
if (env.MODULE_NAME == "job") {
testReport1 = readFile("${repoPath}/test-report/test_report_CloudPbs.html")
testReport2 = readFile("${repoPath}/test-report/test_report_CloudSlurm.html")
testReport3 = readFile("${repoPath}/test-report/test_report_HpcPbs.html")
testReport4 = readFile("${repoPath}/test-report/test_report_HpcSlurm.html")
} else {
testReport1 = readFile("${repoPath}/test-report/test_report_CloudSlurm.html")
}
def CoverageReport = readFile("${repoPath}/TestCoverage/${COV_HTML}")
// assemble the report body
def reportContent = """
<h2>OpenAPI Test Report (${MODULE_NAME})</h2>
<p>Environment: ${ENV}</p>
<h3>Test Cases:</h3>
<ul>
<li>
API Test Report:
<a href="https://jenkins.xxx.cn/view/pid/job/${JOB_NAME}/${BUILD_ID}/Test_20API_20HTML_20Report/" target="_blank">https://jenkins.xxx.cn/view/pid/job/${JOB_NAME}/${BUILD_ID}/Test_20API_20HTML_20Report/</a>
</li>
<li>
Test Coverage Report:
<a href="https://jenkins.xxx.cn/view/pid/job/${JOB_NAME}/${BUILD_ID}/Test_20Coverage_20HTML_20Report/" target="_blank">https://jenkins.xxx.cn/view/pid/job/${JOB_NAME}/${BUILD_ID}/Test_20Coverage_20HTML_20Report/</a>
</li>
</ul>
"""
def project_user = [
'job': '@aaa',
'storage': '@bbb',
'account': '@ccc',
'cloudapp': '@ddd',
'license': '@eee',
'iam': '@aaa',
'merchandise': '@bbb']
def modOwner = project_user[MODULE_NAME]
// the computing & scheduling group
sh """
sh /home/jenkins/compose/conf/wxmsg.sh "https://qyapi.weixin.qq.com/cgi-bin/webhook/send?key=XXXXXXXX" ${MODULE_NAME} ${ENV} ${BUILD_ID} "$modOwner" ${JOB_NAME} ${PRIORITY}
"""
return reportContent
}
Optimization 4: Upload files to the designated directory first
In the real project, job submission, computation and scheduling first require calling the storage module to upload the needed files into a fixed working directory. Doing this manually is easy to forget and tedious, so the step is added directly to the pipeline:
stage('PreUpload') {
steps {
script{
env.STORAGE_PreUpload_ENV_FILES = ['AdminCloudPbs', 'AdminHpcPbs', 'AdminCloudSlurm', 'AdminHpcSlurm', 'NormalCloudPbs', 'NormalHpcPbs', 'NormalCloudSlurm', 'NormalHpcSlurm'].join(',')
def selectedEnvFiles = env.STORAGE_PreUpload_ENV_FILES.tokenize(',')
for (int i = 0; i < selectedEnvFiles.size(); i++) {
def envFile = selectedEnvFiles[i]
sh """
echo "UploadFile with ${envFile}"
# check that the environment config file exists, then copy it
if [ ! -f "/home/jenkins/compose/conf/preupload/${envFile}" ]; then
echo "Environment file /home/jenkins/compose/conf/preupload/${envFile} does not exist"
exit 1
fi
cp "/home/jenkins/compose/conf/preupload/${envFile}" "${repoPath}/test/storage/v1/.env"
# run the preupload tests
cd ${repoPath}
export ENV=\$(ls ${repoPath}/test/storage/v1/.env*)
go test -v ./test/storage/... -run="TestUploadFile/TestPrepareUploadSmallFile"
echo "Prepare Upload Small File test completed."
"""
}
}
}
}
Previously this preparation lived in the Preparation stage, which copied the large test files into a fixed folder on every build, even though they are only needed when the storage module runs. To reduce that overhead, the stage is changed as follows:
stage('Preparation') {
steps {
sh """
export CGO_ENABLED=1;GO111MODULE=on;GOINSECURE=git.xxx;GOPRIVATE=git.xxx;GOPROXY=https://goproxy.cn,direct
export PATH="${arcPath}:${goRoot}:${kubectlRoot}:${makeRoot}:$PATH"
go env -w GOINSECURE=git.xxx
go env -w GOPRIVATE=git.xxx
cd ${repoPath}
if [ "${env.MODULE_NAME}" = "storage" ]; then
cp /home/jenkins/testdata/randomfile3G test/storage/upload/
cp /home/jenkins/testdata/randomfile100M test/storage/upload/
cp /home/jenkins/testdata/randomfile3G test/storage/Adminupload/
else
echo "Skipping copy operation because MODULE_NAME is not 'storage'."
fi
"""
}
}
Optimization 5: Show the coverage rate directly in the bot message
First locate the coverage value in the HTML report via the totalcov marker. The file contains several totalcov occurrences, some of them CSS class definitions; only the last matching line, which carries the displayed coverage figure, should be extracted.
To do this, define a CoverageRate variable inside the genReportBody() function introduced earlier:
// extract the coverage rate from the HTML report
def CoverageRate = sh(script: "grep 'totalcov' ${repoPath}/TestCoverage/${COV_HTML} | tail -n 1 | awk -F'[<>]' '{print \$3}'", returnStdout: true).trim()
echo "Coverage Result: ${CoverageRate}"
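As a quick sanity check outside Jenkins, the same pipeline can be exercised against a hypothetical HTML snippet (the file content below is made up to mimic a goc report, with totalcov appearing both in CSS and in the final summary row):

```shell
# Hypothetical miniature of the coverage HTML: "totalcov" shows up in a
# CSS rule first and in the final summary row last.
cat > /tmp/sample_cov.html <<'EOF'
<style>.totalcov { font-weight: bold; }</style>
<div class="totalcov">coverage:</div>
<span class="totalcov">85.2%</span>
EOF

# Keep only the LAST matching line, then take the text between the tags
# (field 3 when the line is split on '<' and '>').
rate=$(grep 'totalcov' /tmp/sample_cov.html | tail -n 1 | awk -F'[<>]' '{print $3}')
echo "Coverage Result: $rate"   # Coverage Result: 85.2%
```

This is why `tail -n 1` matters: without it, the CSS occurrences of totalcov would pollute the result.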
Finally, to send CoverageRate along, extend the bot invocation:
sh """
sh /home/jenkins/compose/conf/wxmsg.sh "https://qyapi.weixin.qq.com/cgi-bin/webhook/send?key=yourkey" ${MODULE_NAME} ${ENV} ${BUILD_ID} "$modOwner" ${JOB_NAME} ${PRIORITY} ${CoverageRate}
"""
The /home/jenkins/compose/conf/wxmsg.sh script is shown below (the display color of each field can be adjusted):
#!/bin/bash
url=$1
MODULE_NAME=$2
ENV=$3
BUILD_ID=$4
modOwner=$5
jobName=$6
testRange=${7}
CoverageRate=$8
curl "${url}" \
-H 'Content-Type: application/json' \
-d '
{
"msgtype": "markdown",
"markdown": {
"content": "# Coverage Test Report <font color=\"warning\">('${MODULE_NAME}')</font>\n
>### Environment: <font color=\"yellow\">'${ENV}'</font>
>### API Test Report:\n<font color=\"comment\">[Test API Report](https://jenkins.xxx.cn/view/pid/job/'${jobName}'/'${BUILD_ID}'/Test_20API_20HTML_20Report)</font>
>### Test Coverage Report:\n<font color=\"comment\">[Test Coverage Report](https://jenkins.xxx.cn/view/pid/job/'${jobName}'/'${BUILD_ID}'/Test_20Coverage_20HTML_20Report)</font>
>### Coverage Rate: <font color=\"green\">'${CoverageRate}'</font>
>### Test Range: <font color=\"comment\">'${testRange}'</font>
>### Mod Owner: <font color=\"comment\">'${modOwner}'</font>",
"mentioned_list":["@all"]
}
}
'
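The curl payload above relies on a shell quoting trick: the JSON body is one single-quoted string, and each '${VAR}' briefly closes the quote so the shell can substitute the variable, then reopens it. A minimal illustration (MODULE_NAME value is made up):

```shell
# Each '${MODULE_NAME}' steps out of the single-quoted string so the
# shell substitutes the value, then the quote resumes.
MODULE_NAME=job
payload='{"content": "# Coverage Test Report ('${MODULE_NAME}')"}'
echo "$payload"   # {"content": "# Coverage Test Report (job)"}
```

Note this splicing does no JSON escaping; if a field could ever contain quotes or newlines, building the body with a tool like jq would be safer.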
The bot then posts the formatted report in the group chat.
At this point the pipeline is fairly complete; what remains is tackling real project issues as they come up.
IV. pid-docker-build
The pipelines above already collect coverage and send reports, but the Docker service images were built from the code as it stood at the time. Reusing those images means the services under test are no longer running the latest code, so the test results are stale as well. To solve this, a new pipeline, pid-docker-build, pulls the latest code and rebuilds the images automatically.
For the project directory layout, see the earlier CSDN post "docker-compose启动微型集群获取集成测试代码覆盖率" (starting a mini cluster with docker-compose to collect integration-test coverage).
1. Pipeline script
#!groovy
pipeline {
agent any
environment {
// repository URLs
Project_Root_URL = "ssh://vcssh@xxx/source/project-root.git"
Standard_Compute_URL = "ssh://vcssh@xxx/source/standard-compute.git"
SSO_URL = "ssh://vcssh@xxx/source/sso.git"
DOCKER_SERVICES_DIR = "/home/jenkins/compose/project-root/internal"
RepoPath = "/var/jenkins_work/workspace/${JOB_NAME}/${BUILD_ID}"
CodePath = "/home/jenkins/compose/code"
arcPath = "/home/jenkins/phab/arcanist/bin"
kubectlRoot = "/opt/jenkins_deps/bin"
makeRoot = "/usr/bin"
goRoot = "/usr/local/go/bin"
}
stages {
stage('Preparation') {
steps {
sh "export PATH=${arcPath}:${goRoot}:${kubectlRoot}:${makeRoot}:$PATH"
}
}
stage('Checkout') {
steps {
sh """
mkdir -p ${CodePath}
if [ ! -d "${CodePath}/project-root" ]; then
git clone --depth 1 ${Project_Root_URL} ${CodePath}/project-root
else
git -C ${CodePath}/project-root fetch --all
git -C ${CodePath}/project-root reset --hard origin/master
fi
if [ ! -d "${CodePath}/standard-compute" ]; then
git clone --depth 1 ${Standard_Compute_URL} ${CodePath}/standard-compute
else
git -C ${CodePath}/standard-compute fetch --all
git -C ${CodePath}/standard-compute reset --hard origin/master
fi
if [ ! -d "${CodePath}/sso" ]; then
git clone --depth 1 ${SSO_URL} ${CodePath}/sso
else
git -C ${CodePath}/sso fetch --all
git -C ${CodePath}/sso reset --hard origin/master
fi
"""
}
}
stage('Build Services and Deploy') {
steps {
script {
// list of all the service names
def services = ['job','storage','standard-compute','merchandise','account_bill','license','iamserver']
// iterate over the services and build an image for each
for (service in services) {
dir("${env.DOCKER_SERVICES_DIR}/${service}") {
// run the Makefile in each service's directory
sh 'make all'
}
}
}
}
}
}
post {
always {
// clean up the workspace
sh "rm -rf ${RepoPath}"
echo 'Workspace cleaned up!'
// dir("${DOCKER_SERVICES_DIR}") {
//     sh "docker-compose up -d"
//     sh "docker-compose down"
//     sh "docker system prune -af"
// }
}
success {
echo 'Build and deployment successful!'
}
failure {
echo 'Build or deployment failed.'
}
}
}
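The clone-or-update logic in the Checkout stage is repeated verbatim for each repository; it could be factored into one small helper, sketched here (the URL in the usage comment is illustrative):

```shell
#!/bin/sh
# Shallow-clone a repo on first run; afterwards hard-reset it to
# origin/master, mirroring the Checkout stage above.
clone_or_update() {
    url=$1; dest=$2
    if [ ! -d "$dest/.git" ]; then
        git clone --depth 1 "$url" "$dest"
    else
        git -C "$dest" fetch --all
        git -C "$dest" reset --hard origin/master
    fi
}

# e.g. clone_or_update "${Project_Root_URL}" "${CodePath}/project-root"
```

Calling this once per repository keeps the stage short and makes it trivial to add a fourth repository later.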
2. Writing the Makefile
SRC_ROOT := /home/jenkins/compose/code/project-root/cmd
MODULE_DIR := job
OBJECT := $(MODULE_DIR)
IMAGE_OBJECT := $(MODULE_DIR)-test
AGENTPORT := 12321
.DEFAULT_GOAL := all
.PHONY: all goc-build image
all: goc-build image
goc-build:
@echo "== Building $(MODULE_DIR)... =="
cd $(SRC_ROOT)/$(MODULE_DIR) && CGO_ENABLED=0 goc build --center=http://goc服务地址 --agentport :$(AGENTPORT) . -o $(OBJECT)
@echo "== Moving $(MODULE_DIR) to current directory... =="
mv $(SRC_ROOT)/$(MODULE_DIR)/$(OBJECT) ./
@echo "== Build process completed successfully. =="
image:
@echo "== Building Docker image $(IMAGE_OBJECT)... =="
docker build . -t $(IMAGE_OBJECT)
@echo "== Docker image $(IMAGE_OBJECT) built successfully. =="
The config directory holds the configuration files; the Dockerfile was already covered in detail earlier and is not repeated here.
job: the binary produced by goc build .
But how should the Makefile be written when one module has several variants, e.g. storage with cloud-storage and hpc-storage? The version below handles that; compare it with the single-variant Makefile above to spot the changes.
SRC_ROOT := /home/jenkins/compose/code/project-root/cmd
MODULE_DIR := storage
OBJECT := $(MODULE_DIR)
IMAGE_OBJECT ?= $(OBJECT)-test
# default is the hpc port; the cloud target overrides it to 12329
AGENTPORT ?= 12326
.DEFAULT_GOAL := all
.PHONY: all goc-build image hpc cloud
all: hpc cloud
goc-build:
@echo "== Building $(MODULE_DIR)... =="
cd $(SRC_ROOT)/$(MODULE_DIR) && CGO_ENABLED=0 goc build --center=http://goc服务地址 --agentport :$(AGENTPORT) . -o $(OBJECT)
@echo "== Moving $(MODULE_DIR) to current directory... =="
mv $(SRC_ROOT)/$(MODULE_DIR)/$(OBJECT) ./
@echo "== Build process completed successfully. =="
image:
@echo "== Building Docker image $(IMAGE_OBJECT)... =="
docker build . -t $(IMAGE_OBJECT)
@echo "== Docker image $(IMAGE_OBJECT) built successfully. =="
hpc:
@echo "== Configuring for hpc build =="
export AGENTPORT=12326; \
export IMAGE_OBJECT=$(OBJECT)-hpc-test; \
$(MAKE) goc-build image
@echo "== hpc configuration completed =="
cloud:
@echo "== Configuring for cloud build =="
export AGENTPORT=12329; \
export IMAGE_OBJECT=$(OBJECT)-cloud-test; \
$(MAKE) goc-build image
@echo "== cloud configuration completed =="
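The hpc/cloud targets work because a Makefile `?=` assignment yields to a variable that already exists in the environment: the sub-make started after `export AGENTPORT=...` sees the exported value, not the default. A throwaway demo (the temp file path is arbitrary):

```shell
#!/bin/sh
# AGENTPORT ?= ... is skipped when AGENTPORT already exists in the
# environment, which is exactly how the hpc/cloud targets override it.
printf 'AGENTPORT ?= 12326\nport:\n\t@echo $(AGENTPORT)\n' > /tmp/demo.mk
make -f /tmp/demo.mk port                  # prints 12326 (default)
AGENTPORT=12329 make -f /tmp/demo.mk port  # prints 12329 (env wins)
```

The same mechanism drives IMAGE_OBJECT, which is why both variables are declared with `?=` rather than `:=`.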
Although cloud-storage and hpc-storage above share the same base image, for standard-compute (sc) even the images differ. How should that Makefile be written?
SRC_ROOT := /home/jenkins/compose/code
MODULE_DIR := standard-compute/cmd
OBJECT := standard-compute
IMAGE_OBJECT ?= $(OBJECT)-test
# default is the slurm port; the pbs target overrides it to 12331
AGENTPORT ?= 12328
.DEFAULT_GOAL := all
.PHONY: all goc-build slurm-image pbs-image slurm pbs
all: slurm pbs
goc-build:
@echo "== Building $(MODULE_DIR)... =="
cd $(SRC_ROOT)/$(MODULE_DIR) && CGO_ENABLED=0 goc build --center=http://goc服务地址 --agentport :$(AGENTPORT) . -o $(OBJECT)
@echo "== Moving $(MODULE_DIR) to current directory... =="
mv $(SRC_ROOT)/$(MODULE_DIR)/$(OBJECT) ./
@echo "== Build process completed successfully. =="
slurm-image:
@echo "== Building Docker image $(IMAGE_OBJECT)... =="
docker build -f slurm-Dockerfile . -t $(IMAGE_OBJECT)
@echo "== Docker image $(IMAGE_OBJECT) built successfully. =="
pbs-image:
@echo "== Building Docker image $(IMAGE_OBJECT)... =="
docker build -f pbs-Dockerfile . -t $(IMAGE_OBJECT)
@echo "== Docker image $(IMAGE_OBJECT) built successfully. =="
slurm:
@echo "== Configuring for Slurm build =="
export AGENTPORT=12328; \
export IMAGE_OBJECT=$(OBJECT)-slurm-test; \
$(MAKE) goc-build slurm-image
@echo "== Slurm configuration completed =="
pbs:
@echo "== Configuring for PBS build =="
export AGENTPORT=12331; \
export IMAGE_OBJECT=$(OBJECT)-pbs-test; \
$(MAKE) goc-build pbs-image
@echo "== PBS configuration completed =="