Jenkins Hybrid Windows Linux Build Pipelines For Docker and SQL Server

Consider a scenario in which you want to use DACPACs, but you want to spin up SQL Server in a container on Linux (say, Ubuntu) — perhaps because you wish to forgo the cost of licensing Windows, or because you want to use a Docker volume manager that is only available on Linux. What is the solution?

Jenkins Distributed Builds To The Rescue

One of the many nice features of Jenkins is its ability to perform distributed builds. Distributed builds provide a number of different ways in which a build pipeline can be executed:

  • part of a build running on Windows on premises
  • part of the build running in Linux on premises
  • part of the build running inside a container
  • a build that is performed both on premises and in the public cloud

The distributed build feature also allows builds to be scaled out.

Pre-Requisites

Setting up the build environment for this post is essentially the same as that detailed in my previous post; the only differences are:

  • A build slave needs to be configured
  • A subtly different Jenkins script needs to be used; this can be found in this git repo, along with a SQL Server Data Tools project for testing the build pipeline out

Jenkins Build Slaves 101

By default, when Jenkins is installed a Jenkins master is always present; to perform a distributed build we need to start using build slaves. In this post my build master runs on Windows 10 and my build slave runs on Ubuntu Server 16.04 in VirtualBox on the same machine:

[image: ubuntu-slave]

The “ubuntu-vbox” build slave is configured as follows:

[screenshot: node configuration]

For brevity I have elected not to use SSH keys. When installing Ubuntu, make sure that:

  • The OpenSSH server component is installed
  • The Java runtime is installed; this article acts as a good reference for how to do this
  • A directory exists for the master to install slave.jar into on the slave; I have used /var/jenkins in this example
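
On a stock Ubuntu 16.04 install, the prerequisites above can be put in place with a few commands. This is only a sketch under assumptions: package names assume Ubuntu's default repositories, and openjdk-8 is one suitable Java runtime among several.

```shell
# Install the OpenSSH server so the Jenkins master can reach the slave
sudo apt-get update
sudo apt-get install -y openssh-server

# Install a Java runtime (required to run slave.jar)
sudo apt-get install -y openjdk-8-jre-headless

# Create the remote root directory the master will copy slave.jar into
sudo mkdir -p /var/jenkins
sudo chown "$USER" /var/jenkins
```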

Making A Build Pipeline Distributed

I am going to take the build pipeline from my last blog post and turn it into a distributed build pipeline. To recap, this is the pipeline in its original state:

properties([
  parameters([
    string(name: 'linuxbuildslave', defaultValue: 'ubuntu-vbox'),
    string(name: 'slaveipaddress' , defaultValue: '192.168.56.101'),
  ])
])

def StartContainer() {
    sh "docker run -e \"ACCEPT_EULA=Y\" -e \"SA_PASSWORD=P@ssword1\" --name SQLLinux${env.BRANCH_NAME} -d -i -p 15565:1433 microsoft/mssql-server-linux && sleep 15"
}

def BranchToPort(String branchName) {
    def BranchPortMap = [
        [branch: 'master'   , port: 15565],
        [branch: 'Release'  , port: 15566],
        [branch: 'Feature'  , port: 15567],
        [branch: 'Prototype', port: 15568],
        [branch: 'HotFix'   , port: 15569]
    ]
    BranchPortMap.find { it['branch'] ==  branchName }['port']
}

def DeployDacpac() {
    def SqlPackage = "C:\\Program Files\\Microsoft SQL Server\\140\\DAC\\bin\\sqlpackage.exe"
    def SourceFile = "SelfBuildPlSlave\\bin\\Release\\SelfBuildPlSlave.dacpac"
    def ConnString = "server=${params.slaveipaddress},${BranchToPort(env.BRANCH_NAME)};database=SsdtDevOpsDemo;user id=sa;password=P@ssword1"

    unstash 'theDacpac'
    bat "\"${SqlPackage}\" /Action:Publish /SourceFile:\"${SourceFile}\" /TargetConnectionString:\"${ConnString}\" /p:ExcludeObjectType=Logins"
}

node('master') {
    stage('git checkout') {
        checkout scm
    }
    stage('build dacpac') {
        bat "\"${tool name: 'Default', type: 'msbuild'}\" /p:Configuration=Release"
        stash includes: 'SelfBuildPlSlave\\bin\\Release\\SelfBuildPlSlave.dacpac', name: 'theDacpac'
    }
}

node( params.linuxbuildslave ) {
    stage('start container') {
        StartContainer()
    }
}

node('master') {
    stage('deploy dacpac') {
        DeployDacpac()
    }
}

node( params.linuxbuildslave ) {
    sh "docker rm -f SQLLinux${env.BRANCH_NAME}"
}
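
Note that StartContainer relies on a fixed sleep 15 for SQL Server to come up, which can be flaky on a slow slave. A more robust alternative is to poll the published port until it accepts connections; this is a hypothetical bash helper, not part of the original pipeline:

```shell
#!/usr/bin/env bash
# Poll host:port until a TCP connection succeeds, or give up after N tries.
wait_for_port() {
    local host=$1 port=$2 tries=${3:-30}
    local i=0
    while [ "$i" -lt "$tries" ]; do
        # bash's /dev/tcp pseudo-device attempts a TCP connect
        if (echo -n > "/dev/tcp/${host}/${port}") 2>/dev/null; then
            return 0
        fi
        i=$((i + 1))
        sleep 1
    done
    return 1
}

# Example: wait for the containerised SQL Server instead of a fixed sleep:
# wait_for_port 127.0.0.1 15565 || { echo "SQL Server did not start"; exit 1; }
```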

node() specifies where an element of the build pipeline will run:

  • node('master') will execute build activity on the master node (Windows 10 in this example)
  • node(params.linuxbuildslave) will execute build activity on the host specified by the linuxbuildslave parameter (Ubuntu running in VirtualBox in this case)
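
As an aside, node() also accepts label expressions, so a step can target any slave carrying a given combination of labels rather than one named machine. A hypothetical sketch — the linux and docker labels are assumptions, not part of the original pipeline:

```groovy
// Run on any node labelled both "linux" and "docker"
node('linux && docker') {
    stage('start container') {
        StartContainer()
    }
}
```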

Build Pipeline Parameters

The first six lines of the Groovy script illustrate the use of what are known as build pipeline parameters; these provide a means of specifying values used by the build pipeline without having to change its code. Instead of the normal “Build now” option, a “Build with parameters” option is now present; select this and the following screen will appear:

[screenshot: Build with parameters]

The IP address of the slave is also passed in as a parameter; however, defaults can be set up both for this and for the Linux build slave.

To wrap things up, this post details a simple Jenkins build pipeline with build activity split between Windows and Linux (Ubuntu) nodes.
