My 2018 SQL Server, Docker and Jenkins Presentation In Blog Form, Part I

The mainstay of my presentation material this year has been my deck on continuous integration, Docker and Jenkins. For people who have not had the chance to see this presentation, or who have seen it and wanted some more context around it, this is the first in a series of posts. Just about all of the material in this post features in other posts on my blog; the aim of this series is to present the material in a more digestible manner for people who might not be fully familiar with Docker and Jenkins.

This first post will cover an introduction to Jenkins and the use of the “sidecar pattern” as a deployment target for DACPACs. Subsequent posts will expand on this to include:

  • Multi-branch build pipelines
  • Parallelism
  • Unit testing with tSQLt
  • The management of database state via Docker volumes

Many people in the SQL Server community have displayed a great interest in containers, only to be left scratching their heads thinking “Well, that is nice, but what can I practically use them for?”. In my humble opinion, spinning up SQL Server inside a container as a deployment target for a continuous integration pipeline is one of the best, if not the best, ways to leverage SQL Server and Docker.

Continuous Integration Pipeline 101

[Image: diagram of a basic continuous integration pipeline]

This simple picture represents the most basic of continuous integration pipelines at a conceptual level, and the question is: how can we make this better? What leaps out is that instead of using a dedicated SQL Server instance, the deployment target can be spun up as a container. This is what is colloquially referred to as “the sidecar pattern” for using containers.

DACFx For The Uninitiated

DACFx – the Data-tier Application Framework, to give it its full name – is the engine that underpins Visual Studio when a database project is deployed. It provides three functions:

  • The extraction of databases
  • The migration of databases from SQL Server to Azure
  • The deployment of DACPACs and BACPACs

For the purposes of this post, we will focus on the use of DACFx for the deployment of DACPAC files via sqlpackage.exe. A DACPAC file encapsulates database objects, while a BACPAC encapsulates database objects and data. The most important thing to note is that DACPACs are declarative by nature: when a DACPAC is published, DACFx ensures that the end state of the target database reflects the state of the schema as encapsulated by the DACPAC (or BACPAC, for that matter).
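Because publishing is driven by a model diff, deploying the same DACPAC repeatedly is safe: DACFx only generates the change statements needed to bring the target into line with the model. A minimal sketch of such a publish step, written in the same Jenkins pipeline Groovy used later in this post (the sqlpackage path, parameter names and connection string here are illustrative placeholders, not a definitive implementation):

```groovy
// Sketch: publish a DACPAC with sqlpackage.exe from a pipeline step.
// DACFx diffs the DACPAC model against the target database and emits only
// the CREATE/ALTER/DROP statements required to reconcile the two.
def PublishDacpac(String dacpacPath, String connString) {
    // Path is typical for SQL Server 2017 tooling; adjust for your install
    def sqlPackage = "C:\\Program Files\\Microsoft SQL Server\\140\\DAC\\bin\\sqlpackage.exe"
    bat "\"${sqlPackage}\" /Action:Publish /SourceFile:\"${dacpacPath}\" /TargetConnectionString:\"${connString}\""
}
```

Because the operation is declarative, running this step against an already up-to-date database results in no schema changes.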

DACPAC creation requires that msbuild is installed with SQL Server as a target; one way of doing this is to install Visual Studio with the tooling for “Data storage and processing”:

[Image: Visual Studio installer with the “Data storage and processing” workload selected]

msbuild requires a SQL Server Data Tools project in order to produce a DACPAC; the easiest way to do this is to create a project of the type “SQL Server” in Visual Studio and then import an existing database schema into the project.

To utilise msbuild within a Jenkins pipeline, it must be installed and configured via the “Global Tool Configuration” section:

[Image: Jenkins “Global Tool Configuration” page with msbuild configured]

Why Use Jenkins?

Jenkins is free, and it is portable by virtue of the fact that it will run anywhere Java will run. Wherever you run it, it is a first-class citizen; unlike TFS, it does not lag behind VSTS for functionality. It has fantastic third-party plugin support: if you can think of anything that has traction in the developer community, the chances are that a Jenkins plugin exists for it. It also supports rich and mature pipeline-as-code, the ability to specify a continuous integration pipeline in Groovy script, of which there are two flavors:

[Image: declarative pipeline versus scripted pipeline]

“Declarative Pipeline” is opinionated, in that it is extremely rigid in the way it forces developers to code the pipeline; however, it is also very easy to pick up and leads to pipeline code that is highly readable.

“Scripted Pipeline” is intended for power users and offers more flexibility than its declarative counterpart. However, the scripted pipeline syntax has a steeper learning curve and is not conducive to code that is as readable as that produced using declarative pipeline.
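To make the contrast concrete, here is a minimal scripted-pipeline sketch of the checkout and build stages that appear in declarative form later in this post; treat it as an outline rather than a drop-in Jenkinsfile:

```groovy
// Scripted pipeline: plain Groovy with node/stage blocks in place of the
// declarative pipeline { stages { stage { steps { ... } } } } skeleton
node {
    stage('git checkout') {
        checkout scm
    }
    stage('build dacpac') {
        // Same msbuild invocation as in the declarative example
        bat "\"${tool name: 'Default', type: 'msbuild'}\" /p:Configuration=Release"
    }
}
```

The trade-off is visible even at this size: the scripted form is ordinary Groovy, so arbitrary control flow is available, but nothing enforces the stage/steps structure that makes declarative pipelines so easy to read.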

The Real Power Of Jenkins Pipeline As Code

Continuous integration pipeline-as-code is highly recommended, and there is a feature that makes it particularly powerful: the ability to store the pipeline code in a repository along with the source code that the pipeline will build, deploy and test. To do this, we specify that the source of the pipeline script is a source code management (SCM) system.

This is an example of a repo that contains a Jenkinsfile:

[Image: GitHub repository containing a Jenkinsfile]

A Tip

If you intend to use Jenkins pipeline-as-code stored in a GitHub repo in anger, it is recommended that you use a paid-for GitHub subscription; otherwise, you may find that when your pipeline makes calls to GitHub, it fails due to throttling. I have used the GitHub account of the company I work for in this instance in order to get around this.

Pipeline 1: Deploying To A Sidecar Container

Our very first CI pipeline will perform the following functions:

  • Check the code out of the GitHub repo
  • Create a DACPAC from the SQL Server data tools project
  • Spin up a SQL Server instance
  • Deploy the DACPAC to the instance
  • Remove the container

This is the pipeline code used to achieve this:

def StartContainer() {
    // Spin up the SQL Server container, then poll its log until the
    // instance reports that it is ready to accept client connections
    bat "docker run -e \"ACCEPT_EULA=Y\" -e \"SA_PASSWORD=P@ssword1\" --name ${CONTAINER_NAME} -d -i -p ${PORT_NUMBER}:1433 microsoft/mssql-server-linux:2017-GA"
    powershell "While (\$((docker logs ${CONTAINER_NAME} | select-string ready | select-string client).Length) -eq 0) { Start-Sleep -s 1 }"
}

def RemoveContainer() {
    // Force-remove the container, but only if it actually exists
    powershell "If (\$((docker ps -a --filter \"name=${CONTAINER_NAME}\").Length) -eq 2) { docker rm -f ${CONTAINER_NAME} }"
}

def DeployDacpac() {
    def SqlPackage = "C:\\Program Files\\Microsoft SQL Server\\140\\DAC\\bin\\sqlpackage.exe"
    def SourceFile = "${SCM_PROJECT}\\bin\\Release\\${SCM_PROJECT}.dacpac"
    def ConnString = "server=localhost,${PORT_NUMBER};database=SsdtDevOpsDemo;user id=sa;password=P@ssword1"
    // Retrieve the DACPAC stashed by the build stage, then publish it
    unstash 'theDacpac'
    bat "\"${SqlPackage}\" /Action:Publish /SourceFile:\"${SourceFile}\" /TargetConnectionString:\"${ConnString}\" /p:ExcludeObjectType=Logins"
}

pipeline {
    agent any

    environment {
        CONTAINER_NAME = "SqlLinux"
        PORT_NUMBER    = 15556
        SCM_PROJECT    = "Jenkins-Docker-Basic-Ci-Pipeline"
    }

The method StartContainer is used to spin up the container; note that the container name is hard coded as SqlLinux, with a hard coded external port on the host of 15556. We also have methods for deploying the DACPAC to the container and for removing the container. The rest of the script, from the pipeline keyword down, is as follows:

pipeline {
    agent any

    environment {
        CONTAINER_NAME = "SqlLinux"
        PORT_NUMBER    = 15556
        SCM_PROJECT    = "Jenkins-Docker-Basic-Ci-Pipeline"
    }

    stages {
        stage('git checkout') {
            steps {
                timeout(time: 5, unit: 'SECONDS') {
                    checkout scm
                }
            }
        }
        stage('build dacpac') {
            steps {
                bat "\"${tool name: 'Default', type: 'msbuild'}\" /p:Configuration=Release"
                stash includes: "${SCM_PROJECT}\\bin\\Release\\${SCM_PROJECT}.dacpac", name: 'theDacpac'
            }
        }

        stage('start container') {
            steps {
                RemoveContainer()
                timeout(time: 20, unit: 'SECONDS') {
                    StartContainer()
                }
            }
        }

        stage('deploy dacpac') {
            steps {
                timeout(time: 60, unit: 'SECONDS') {
                   DeployDacpac()
                }
            }
        }
    }
    post {
        always {
            RemoveContainer()
        }
        success {
            print 'post: Success'
        }
        unstable {
            print 'post: Unstable'
        }
        failure {
            print 'post: Failure'
        }
    }
}

Points Of Interest In The Body Of The Pipeline Script

  • the agent keyword
    This allows the entire build pipeline or different stages to be run on different hosts or inside containers.
  • environment
    Specifies key-value pairs that will be defined as environment variables for all pipeline steps.
  • timeout wrapper
    If for whatever reason something fails, we want things to fail fast; therefore a best practice is to wrap commands inside steps with timeouts. The timeout should be a value (seconds in the example) which represents the maximum amount of time within which the command should reasonably complete.
  • post section
    This section contains handlers we can flesh out with Groovy script in order to handle specific events:

    • always
      self-explanatory
    • success
      the pipeline completes without any errors
    • unstable
      tests run by the pipeline via a testing plugin have failed
    • failure
      the pipeline has bombed out with an error

Performing A Pipeline Build

All being well, when the initial build is performed something like the following should be visible:

[Image: Jenkins stage view of a successful pipeline build]

This is absolutely fine, right up to the point when we want to dig into the logging information associated with the build:

Started by user unknown or anonymous
Obtained Jenkinsfile from git https://github.com/PureStorage-OpenConnect/Jenkins-Docker-Basic-Ci-Pipeline
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] node
Running on Jenkins in C:\Program Files (x86)\Jenkins\workspace\Demo 1
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Declarative: Checkout SCM)
[Pipeline] checkout
 > git.exe rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git.exe config remote.origin.url https://github.com/PureStorage-OpenConnect/Jenkins-Docker-Basic-Ci-Pipeline # timeout=10
Fetching upstream changes from https://github.com/PureStorage-OpenConnect/Jenkins-Docker-Basic-Ci-Pipeline
 > git.exe --version # timeout=10
using GIT_ASKPASS to set credentials 
 > git.exe fetch --tags --progress https://github.com/PureStorage-OpenConnect/Jenkins-Docker-Basic-Ci-Pipeline +refs/heads/*:refs/remotes/origin/*
 > git.exe rev-parse "refs/remotes/origin/master^{commit}" # timeout=10
 > git.exe rev-parse "refs/remotes/origin/origin/master^{commit}" # timeout=10
Checking out Revision dc825448d7d101a07c769a182f085ae9c51424de (refs/remotes/origin/master)
 > git.exe config core.sparsecheckout # timeout=10
 > git.exe checkout -f dc825448d7d101a07c769a182f085ae9c51424de
Commit message: "Update README.md"
First time build. Skipping changelog.
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (git checkout)
[Pipeline] timeout
Timeout set to expire in 5 sec
[Pipeline] {
[Pipeline] checkout
 > git.exe rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git.exe config remote.origin.url https://github.com/PureStorage-OpenConnect/Jenkins-Docker-Basic-Ci-Pipeline # timeout=10
Fetching upstream changes from https://github.com/PureStorage-OpenConnect/Jenkins-Docker-Basic-Ci-Pipeline
 > git.exe --version # timeout=10
using GIT_ASKPASS to set credentials 
 > git.exe fetch --tags --progress https://github.com/PureStorage-OpenConnect/Jenkins-Docker-Basic-Ci-Pipeline +refs/heads/*:refs/remotes/origin/*
 > git.exe rev-parse "refs/remotes/origin/master^{commit}" # timeout=10
 > git.exe rev-parse "refs/remotes/origin/origin/master^{commit}" # timeout=10
Checking out Revision dc825448d7d101a07c769a182f085ae9c51424de (refs/remotes/origin/master)
 > git.exe config core.sparsecheckout # timeout=10
 > git.exe checkout -f dc825448d7d101a07c769a182f085ae9c51424de
Commit message: "Update README.md"
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (build dacpac)
[Pipeline] tool
[Pipeline] bat
[Demo 1] Running batch script

C:\Program Files (x86)\Jenkins\workspace\Demo 1>"C:\Program Files (x86)\MSBuild\14.0\Bin\msbuild.exe" /p:Configuration=Release 
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

Building the projects in this solution one at a time. To enable parallel build, please add the "/m" switch.
Build started 09/06/2018 23:14:20.
Project "C:\Program Files (x86)\Jenkins\workspace\Demo 1\Jenkins-Docker-Basic-Ci-Pipeline.sln" on node 1 (default targets).
ValidateSolutionConfiguration:
  Building solution configuration "Release|Any CPU".
Project "C:\Program Files (x86)\Jenkins\workspace\Demo 1\Jenkins-Docker-Basic-Ci-Pipeline.sln" (1) is building "C:\Program Files (x86)\Jenkins\workspace\Demo 1\Jenkins-Docker-Basic-Ci-Pipeline\Jenkins-Docker-Basic-Ci-Pipeline.sqlproj" (2) on node 1 (default targets).
PrepareForBuild:
  Creating directory "bin\Release\".
  Creating directory "obj\Release\".
GenerateSqlTargetFrameworkMoniker:
Skipping target "GenerateSqlTargetFrameworkMoniker" because all output files are up-to-date with respect to the input files.
CoreCompile:
  C:\Program Files (x86)\MSBuild\14.0\bin\csc.exe /noconfig /nowarn:1701,1702 /nostdlib+ /errorreport:prompt /warn:4 /define:TRACE /highentropyva+ /reference:"C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.6.1\mscorlib.dll" /reference:"C:\Program Files (x86)\Jenkins\workspace\Demo 1\Jenkins-Docker-Basic-Ci-Pipeline\Assemblies\tSQLtCLR.dll" /debug:pdbonly /optimize+ /out:obj\Release\Jenkins_Docker_Basic_Ci_Pipeline.dll /ruleset:"C:\Program Files (x86)\Microsoft Visual Studio 14.0\Team Tools\Static Analysis Tools\\Rule Sets\MinimumRecommendedRules.ruleset" /subsystemversion:6.00 /target:library /warnaserror- /utf8output "C:\WINDOWS\TEMP\.NETFramework,Version=v4.6.1.SqlClrAttributes.cs"
  Using shared compilation with compiler from directory: C:\Program Files (x86)\MSBuild\14.0\Bin
SqlBuild:
  Creating a model to represent the project...
  Loading project references...
  Loading project files...
  Building the project model and resolving object interdependencies...
  Validating the project model...

All the information we want to see as to what Jenkins is doing under the covers is there; the problem is that sifting through it can be a pain. Luckily, help is at hand in the form of the new “Blue Ocean” pipeline GUI:

[Image: Blue Ocean pipeline view]

The Blue Ocean GUI allows us to click on any stage of the pipeline, observe the steps associated with that stage, and then drill down into the logging associated with a specific step. This also comes with the added bonus that it has the look and feel of a piece of software actually produced in the 21st century!

In the next post in the series, I will expand upon this example to cover multi-branch pipelines.

 

 
