My 2018 SQL Server, Docker and Jenkins Presentation In Blog Form, Part III: Adding tSQLt Testing To The Mix

Where Were We?

In part I of this series I set the scene for why you would want to use Docker and Jenkins for SQL Server continuous integration pipelines. The first post also covered how to use both of these technologies together to build a DACPAC and deploy it to a container. Following on from this, in part II I introduced Jenkins multi-branch pipelines and presented an elegant method for preventing containers from failing to start due to external port clashes.

Pipeline As Code – Why Is This Important?

When I presented this material at SQL Saturday in Paris, a great question came up from the audience:

                                                Do I need to use pipeline-as-code?

The answer is no; however, by not using pipeline-as-code you miss out on a lot of important value adds:

  • Anything stored as code is easier to reuse
  • Anything stored as code behaves in a consistent, repeatable manner
  • The pipeline can be kept under source control alongside the source code it builds
  • Spinning up new pipelines becomes simpler and faster
  • Multi-branch pipelines can be leveraged

The follow-on question from this is:

                        I don't want to learn Jenkins Groovy script, so what can I do?

All the examples I use in this series of blog posts have associated repos on GitHub, so you can take the Jenkinsfiles from these repos and use them as templates.

The Subject Of This Post . . . Testing

So far we have built a DACPAC from a SQL Server Data Tools project and spun up a SQL Server instance running inside a container to deploy it to. The one critical piece of the jigsaw we are missing is testing, and that is what this post covers. The GitHub repo for this blog can be found here. I work for Pure Storage and I'm using their repo so as not to get my usage throttled, which would be the case were I to use a public repo.

The testing framework that this post focuses on is tSQLt. I'm not going to go into the nuances of tSQLt in any great depth here; the most expedient way of getting up and running with it is via the tSQLt quick start guide. A key tenet of the tSQLt framework is the use of "test classes": these are essentially schemas containing the stored procedures that embody the tests. In this example two test classes will be used:

CREATE PROCEDURE [tSQLtHappyPath].[testTotalInvoiceAmountPass]
AS
BEGIN
    DECLARE  @ActualTotal   NUMERIC(10,2)
            ,@ExpectedTotal NUMERIC(10,2) = 12345678.90;

    SELECT @ActualTotal = ISNULL(SUM(Total), 12345678.90)
    FROM   [dbo].[Invoice];

    EXEC tSQLt.AssertEquals @ExpectedTotal , @ActualTotal;
END;
GO

CREATE PROCEDURE [tSQLtUnhappyPath].[testTotalInvoiceAmountFail]
AS
BEGIN
    DECLARE  @ActualTotal  NUMERIC(10,2)
            ,@ExpectedTotal NUMERIC(10,2) = 12345678.91;

    SELECT @ActualTotal = ISNULL(SUM(Total), 12345678.90)
    FROM   [dbo].[Invoice];

    EXEC tSQLt.AssertEquals @ExpectedTotal , @ActualTotal;
END;
GO
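Before these procedures can be created, each test class needs to exist; in tSQLt a test class is created with tSQLt.NewTestClass rather than a plain CREATE SCHEMA. A minimal sketch:

```sql
-- Create the two test classes (schemas) used by the tests above
EXEC tSQLt.NewTestClass 'tSQLtHappyPath';
EXEC tSQLt.NewTestClass 'tSQLtUnhappyPath';
GO
```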

The stored procedure in the tSQLtHappyPath test class is designed to pass and the procedure in tSQLtUnhappyPath is designed to fail. So far so good, but how can we integrate this into a continuous integration pipeline? The short answer is that the results from these tests are written to a table in the database; this table can be queried and the results rendered in a format the Jenkins JUnit plugin understands.
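Before involving Jenkins at all, the raw outcome of the most recent run can be inspected directly: tSQLt records it in the tSQLt.TestResult table (column names as per the tSQLt documentation):

```sql
-- Run the happy-path test class, then inspect the raw results
EXEC tSQLt.Run 'tSQLtHappyPath';

SELECT Class     -- test class (schema) name
      ,TestCase  -- test procedure name
      ,Result    -- e.g. 'Success' or 'Failure'
      ,Msg       -- failure message, if any
FROM   tSQLt.TestResult;
```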

On lines 42 and 43 of the Jenkinsfile, a parameter is specified to determine whether the pipeline goes down the happy or un-happy execution path:

parameters {
    booleanParam(defaultValue: true, description: '', name: 'HAPPY_PATH')
}

Further down the script, from line 80 onward, conditional execution is used to determine which path the pipeline execution goes down. In practice you would never branch your tests like this; it is done here purely for illustrative purposes:

stage('run tests (Happy path)') {
    when {
        expression {
            return params.HAPPY_PATH
        }
    }
    steps {
        bat "sqlcmd -S localhost,${PORT_NUMBER} -U sa -P P@ssword1 -d SsdtDevOpsDemo -Q \"EXEC tSQLt.Run \'tSQLtHappyPath\'\""
        bat "sqlcmd -S localhost,${PORT_NUMBER} -U sa -P P@ssword1 -d SsdtDevOpsDemo -y0 -Q \"SET NOCOUNT ON;EXEC tSQLt.XmlResultFormatter\" -o \"${WORKSPACE}\\${SCM_PROJECT}.xml\""
        junit "${SCM_PROJECT}.xml"
    }
}

stage('run tests (Un-happy path)') {
    when {
        expression {
            return !(params.HAPPY_PATH)
        }
    }
    steps {
        bat "sqlcmd -S localhost,${PORT_NUMBER} -U sa -P P@ssword1 -d SsdtDevOpsDemo -Q \"EXEC tSQLt.Run \'tSQLtUnhappyPath\'\""
        bat "sqlcmd -S localhost,${PORT_NUMBER} -U sa -P P@ssword1 -d SsdtDevOpsDemo -y0 -Q \"SET NOCOUNT ON;EXEC tSQLt.XmlResultFormatter\" -o \"${WORKSPACE}\\${SCM_PROJECT}.xml\""
        junit "${SCM_PROJECT}.xml"
    }
}

Using the new Blue Ocean GUI, the results of a successful execution look something like this:

[Screenshot: tSQLt success 1]

Drilling down into the test results reveals:

[Screenshot: tSQLt success 2]

Forcing the pipeline execution down the un-happy path produces something quite different. Firstly, as expected, the test can be seen to have failed:

[Screenshot: tSQLt fail 1]

This time when we drill down into the test result, we see something that adds tremendous value: the result the tSQLt unit test was expecting and the result it actually received:

[Screenshot: tSQLt fail 2]

To use Jenkins parlance, any build that fails unit testing has a status of 'Unstable'. The post section of the pipeline allows us to capture this fact and act accordingly. For example, the pipeline could send a message to the relevant development team requesting that whatever has broken the build be fixed ASAP.

Note the unstable keyword in the skeleton pipeline below:

[Screenshot: skeleton pipeline]
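A rough sketch of such a skeleton in declarative pipeline syntax (the notification step is a hypothetical placeholder, not taken from the original pipeline):

```groovy
pipeline {
    agent any
    stages {
        stage('run tests') {
            steps {
                // ... run the tSQLt tests and publish results via the junit step ...
            }
        }
    }
    post {
        unstable {
            // Runs only when the build is marked Unstable,
            // e.g. when the junit step records test failures
            echo 'Unit tests failed; notify the development team'
            // mail to: 'dev-team@example.com', ... (hypothetical notification)
        }
    }
}
```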

In the next post in this series we will cover leveraging parallelism in the pipeline.

 
