Linting is the process of running a code analyzer to find programmatic errors, bugs, stylistic issues, or anti-patterns. For example, Python has specific modules that lint your code when it is built. If the code is not indented correctly (or violates whatever rule your code has to obey in Python), the line is flagged in the build. Depending on the implementation, the build issues a warning or, if the rules are really strict, the build fails.
Wouldn’t it be nice if we could do the same for T-SQL? That when someone tries to push a SELECT * or a script with a NOLOCK hint to the server, the build fails and the T-SQL script never makes it to the server? Well, your dreams have come true! (well, if this is your dream at least)
ScriptDom is a free .NET library, part of the SQL Server DacFx, that can make this happen. It can parse and analyze your T-SQL scripts and find potential anti-patterns like the ones mentioned above. At SQLBits 2022, Mala Mahadevan and I are giving a session on this very topic. Check it out if you want a deep dive into this technology. Mala has written a couple of PowerShell scripts for the demos, which can be found on GitHub. The focus of this blog post is not explaining how those PS scripts work, but rather how you can integrate them into an Azure DevOps build pipeline (once the recording of the SQLBits session is available, I'll link to it if you're interested in more detail).
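To get a feel for what the library does, here's a minimal sketch of parsing a script and scanning it for a NOLOCK hint. The .dll path, the parser version (TSql150Parser), and the sample query are my assumptions; use whatever DacFx version and paths you have:

```powershell
# Sketch: load ScriptDom and parse a T-SQL snippet (dll path is an assumption)
Add-Type -Path "./Microsoft.SqlServer.TransactSql.ScriptDom.dll"

# $true = treat quoted identifiers as identifiers (QUOTED_IDENTIFIER ON)
$parser = New-Object Microsoft.SqlServer.TransactSql.ScriptDom.TSql150Parser($true)
$reader = New-Object System.IO.StringReader("SELECT * FROM dbo.Users WITH (NOLOCK);")

$parseErrors = $null
$fragment = $parser.Parse($reader, [ref]$parseErrors)

if ($parseErrors.Count -gt 0) {
    Write-Host "Script does not even parse: $($parseErrors[0].Message)"
}

# Crude check: walk the token stream looking for the NOLOCK hint
$nolock = $fragment.ScriptTokenStream | Where-Object { $_.Text -eq 'NOLOCK' }
if ($nolock) {
    Write-Host "NOLOCK found at line $($nolock[0].Line)"
}
```

Mala's Find-* scripts use more robust visitor-based analysis than this token scan, but the idea is the same: ScriptDom gives you the full parse tree and token stream of any T-SQL script to inspect.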
To make this work, you need the following:
This is what my repo looks like:
The .yml files are automatically added when a build pipeline is created (more on that later). The StackOverflow_UnsungHeroes.sql file is a SQL script where I’ve added a NOLOCK hint. Inside the UnitTests folder, I have a folder with the PS scripts and the .dll file:
All the scripts starting with "Find" are from Mala (again, check her GitHub). The CheckLintingErrors script is a wrapper I wrote: it takes the default working directory and searches it for all .sql files, and it loads the .dll file and the PS files.
```powershell
$folder = $env:System_DefaultWorkingDirectory
Add-Type -Path "./UnitTests/PowerShell/Microsoft.SqlServer.TransactSql.ScriptDom.dll"
. ./UnitTests/PowerShell/Find-LintingErrorsForAzure.ps1
. ./UnitTests/PowerShell/Find-4PartNameWithPattern.ps1
. ./UnitTests/PowerShell/Find-NoLockHintWithPattern.ps1
. ./UnitTests/PowerShell/Find-SelectStarWithPattern.ps1

$files = Get-ChildItem $folder -Recurse -Include *.sql
foreach ($f in $files) {
    Write-Host $f.FullName
    Find-LintingErrors $f
}
```
Once this is set up, we can create our build pipeline. In Azure DevOps, go to the Pipelines section.
In Pipelines, we can create a new build pipeline (you can create a release pipeline under the Releases tab; it's a slightly different pipeline with a different purpose, but you should be able to integrate ScriptDom into a release pipeline as well). Click the New Pipeline button and select Azure Repos Git as the source for your code.
Select the repository with your T-SQL scripts.
Azure DevOps will analyze the code in your repo and suggest a build template. As usual, these suggestions don't care about SQL, so you'll be presented with the option to either use a default starter pipeline or supply your own YAML file. The first option is what we need.
The build pipeline is now “ready”. It will run some standard Hello World PowerShell scripts.
You can get rid of those scripts, and when you click on a line below "steps:", you can insert a new PowerShell task from the Tasks menu:
In the settings, choose “File Path” and reference the CheckLintingErrors script.
The full path is "$(System.DefaultWorkingDirectory)/UnitTests/PowerShell/CheckLintingErrors.ps1". The YAML script should look like this:
```yaml
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
- dev

pool:
  vmImage: ubuntu-latest

steps:
- task: PowerShell@2
  inputs:
    filePath: '$(System.DefaultWorkingDirectory)/UnitTests/PowerShell/CheckLintingErrors.ps1'
```
And that's it. When you run the pipeline, it will check the .sql scripts for a couple of common anti-patterns. To make it really useful, though, we need some minor adjustments. In Mala's scripts, when something is found it's just written to the screen with the Write-Host command. If we did the same in the build pipeline, you would have to dig through the build log to find these messages. It would be better if they popped up in the automatic e-mail that Azure DevOps sends. Without any adjustments, the build succeeds and you would be none the wiser.
With the following line, we can add a warning that will be picked up by the build pipeline (I’ve added this in the Find-NoLockHintWithPattern.ps1 script):
```powershell
# Writes a warning to the build summary and to the log in yellow text
Write-Host "##vso[task.LogIssue type=warning;]NOLOCK DETECTED!"
```
The result:
And with the following line, we can add an error:
```powershell
# Writes an error to the build summary and to the log in red text
Write-Host "##vso[task.LogIssue type=error;]I'm serious, NOLOCK! You want ghost data?"
```
This already looks a bit more alarming when you get it in your inbox:
However, the build still succeeds. If you really want to make sure not a single NOLOCK makes it into your database server, we have to nuke that pipeline. Simply add the line "exit 1" to your PS script to make the build fail.
This will lead to 2 errors, and a failed build.
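If you only want the build to fail when issues are actually found (rather than exiting unconditionally), you can keep a counter in the wrapper script. This is a sketch of that variation; the $script:issueCount variable is a hypothetical counter that the Find-* functions would need to increment:

```powershell
# Sketch: only fail the build when at least one linting issue was found.
# $script:issueCount is a hypothetical counter, incremented by the Find-* checks.
$script:issueCount = 0

# ... run the Find-* checks over all .sql files here,
#     incrementing $script:issueCount on each hit ...

if ($script:issueCount -gt 0) {
    Write-Host "##vso[task.LogIssue type=error;]$script:issueCount linting issue(s) found"
    # A non-zero exit code marks the PowerShell@2 task, and thus the build, as failed
    exit 1
}
```

This way, clean scripts still produce a green build, while any detected anti-pattern fails it with a summary count in the error message.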
Mission accomplished!