22 DevOps automation

The previous chapters have given you a grounding in dbatools in various areas that relate to a classic database administrator’s role. This chapter will show some examples of ways that you can use dbatools within a DevOps process. First, though, we need to define DevOps. You can find a hundred different definitions and understandings of the term, but we like this one from Microsoft (sqlps.io/whatdevops):

A compound of development (Dev) and operations (Ops), DevOps is the union of people, processes, and technology to continually provide value to customers.

The quote resonates with us because we believe that being able to continually provide value to customers requires people and processes and technology. dbatools is not going to be able to solve DevOps for you, but it is a tool (technology) that you (a person) can use within your processes to provide that value.

Overall, though, DevOps means different things to different people, which is why you’ll often find varying definitions. In a conversation with PowerShell MVP and DevOps engineer Chris Gardner, Chris said, “I see a DevOps engineer as kind of a one-size-fits-all title, but the actual work involved will vary.”

Chocolatey solutions engineer Stephen Valdinger followed, saying, “DevOps isn’t something you do, but rather, it’s a way of doing things. What works for us here may not work for you there, so you adjust.” He then went on to say that DevOps is a way of working that reduces the time to introduce changes, while at the same time making changes traceable, accountable, and revertible. That’s where dbatools and PowerShell come in. We provide tools that help you move away from nonrepeatable GUI-based changes to machine-readable definition files, also known as Infrastructure as Code, a core tenet of DevOps.

In this chapter, you will learn how to extract and publish databases, and use the knowledge that you have gathered from the entire book, to create PowerShell tasks for use in CI/CD tools for your DevOps processes.

22.1 When should you use dbatools in DevOps?

What does DevOps look like in a database scenario? The possibilities are numerous, and the exact ones that you will use depend on your team’s specific needs. But the following list from Devart (sqlps.io/dataops) gives a good idea of what the implementation of DevOps within a database team could look like:

  • Development

    • Creating a new table in a development DB
  • Continuous integration

    • Building a DB from a scripts folder
    • Creating tSQLt tests on a test DB
    • Running unit tests
    • Formatting a SQL file
  • Continuous delivery

    • Defining a package name and version
    • Publishing a package to a local repository
    • Deploying a package to a database
    • Generating synchronizing reports
    • Syncing a test DB with a production DB
    • Publishing a package to a NuGet feed
    • Changes successfully deployed in production
  • Operation

    • Monitoring server performance

And the great news is that dbatools and PowerShell can be used for each of these tasks. PowerShell can be used noninteractively, so it is a perfect tool to help automate your process because it can be repeated without human intervention. Further, because it’s code, it’s traceable, accountable, and revertible.

To kick off this chapter about how we can help your DevOps goals, we’ll start with the most common method of automating database deployments: the SQL Server DAC package. Then, we’ll show you how to integrate DACPACs into a continuous integration and continuous delivery (CI/CD) platform. This will enable you to perform many of the tasks in the list we mentioned earlier. Although we don’t outline how to perform every task, knowing how to perform one allows you to just replace some code and perform all of them.

22.2 DACPAC

Managing changes to database schemas and data can be challenging, but it’s necessary throughout the software development lifecycle. Before the creation of modern-day database management tools, we’d record the CREATE, READ, UPDATE, or DELETE statements and use them to manually update the database. As you can imagine, this was a headache of a solution that left us vulnerable to mistakes.

To solve this problem, database professionals made the move to declarative database development, which autogenerates deployment scripts. A number of products do this, including a free component created by Microsoft known as the data-tier application component package, or DACPAC.

DACPACs are artifacts (like zip files filled with SQL files) that define all of the objects associated with a database and are created when a database project is built in Visual Studio or Azure Data Studio (ADS). Developers can use DACPACs to quickly return to the same version of a database. Teams that release database changes can also use DACPACs to deploy a consistent version of the database through both production and test environments.
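If you’re curious what actually lives inside one of these artifacts, you can look for yourself with stock PowerShell. Because a DACPAC is just a zip archive, copying it to a .zip extension is enough to unpack it. This is a quick sketch; the source path below is a placeholder for any .dacpac you have on disk.

```powershell
# A DACPAC is a zip archive under the hood; unpack a copy to look inside.
# The source path is hypothetical - point it at any .dacpac you have.
$dacpac = "C:\temp\ProdFactory_20201230.dacpac"
$peek   = Join-Path $env:TEMP "dacpac-peek"

Copy-Item $dacpac "$peek.zip"        # Expand-Archive expects a .zip extension
Expand-Archive "$peek.zip" -DestinationPath $peek -Force

# model.xml holds the declarative definition of every object in the database
Get-ChildItem $peek
```

You’ll typically find model.xml (the schema definition itself), along with Origin.xml and DacMetadata.xml describing how and from what the package was built.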

This deployment process typically includes four steps, each of which can be performed using dbatools. First, you need to export your changes to a DACPAC file. Then, you’ll likely want to set options such as “I want to exclude logins.” Once you’ve set your options, you’ll have to create a profile, and finally, you’ll publish the changes. You can accomplish this using the following:

  • Export-DbaDacPackage

  • New-DbaDacOption

  • New-DbaDacProfile

  • Publish-DbaDacPackage

These commands can be used within the development and deployment processes for databases and are most useful when you want to test against the schema but not real-world data. DACPACs are usually created by a build process, either manually within ADS or SQL Server Management Studio (SSMS), or within a pipeline using the command line with SqlPackage, DacFx, or dbatools.

A BACPAC, which is like a DACPAC with data included, can also be created from an existing database. This is particularly useful when you want to test against data from production databases.
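As a quick sketch, exporting a BACPAC uses the same Export-DbaDacPackage command with the -Type parameter set to Bacpac; the instance and file names here simply mirror the chapter’s earlier examples.

```powershell
# Sketch: export schema plus data as a BACPAC instead of a schema-only DACPAC
$splatExportBacPac = @{
    SqlInstance = "sql01:15595"
    Database    = "Factory"
    FilePath    = "C:\temp\ProdFactory_20201230.bacpac"
    Type        = "Bacpac"
}
Export-DbaDacPackage @splatExportBacPac
```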

In the next section, we’ll start with the first step in the DACPAC process: exporting a DACPAC.

22.2.1 Exporting a DACPAC from an existing database

Imagine the following situation, which we have experienced during our consultations: a company encounters an issue with a live database and assigns the issue to a developer to investigate and suggest mitigation options. The developer has been working on some new features and is unsure of the current schema within the production database, so they ask you for a DACPAC that represents it. Although it should be possible to do this from the deployment system, this may take time and require a lot of processing. To speed up the resolution of the issue, you may wish to create the DACPAC directly from the production system. You can export a DACPAC from an existing database with Export-DbaDacPackage, as shown next.

Listing 22.1 Exporting a DACPAC from the production database

PS> $splatExportDacPac = @{
  SqlInstance = "sql01:15595"
  Database = "Factory"
  FilePath = "C:\temp\ProdFactory_20201230.dacpac"
}
PS> Export-DbaDacPackage @splatExportDacPac
 
Database    : Factory
Elapsed     : 8.69 s
Path        : C:\temp\ProdFactory_20201230.dacpac
Result      : Extracting schema (Start)
              Gathering database credentials
              Gathering database options
              Gathering generic database scoped configuration option
              Gathering users
              Gathering roles
              Gathering application roles
              Gathering role memberships
              Gathering filegroups
              Gathering full-text catalogs
              Gathering assemblies
              Gathering certificates
              Gathering asymmetric keys
              Gathering symmetric keys
              Gathering encrypted symmetric keys
              Gathering schemas
              ~~~~~~~~~~~~
              Output Truncated
              ~~~~~~~~~~~~
              Gathering credentials
              Gathering logins
              Gathering server audits
              Extracting schema (Complete)
SqlInstance : sql01:15595

This command creates a DACPAC named ProdFactory_20201230.dacpac from the Factory database on the production instance. Listing 22.1 shows sample output that includes how long the export took, where the file was saved, and whether there were any errors. The command connected to the production database named Factory, extracted its schema, and created a DACPAC. You can now pass that DACPAC to the developer, whose next step will be to create a database from it so that they can investigate the issue.

Try it now 22.1

Export a DACPAC from a database on your instance.

22.2.2 Publishing a DACPAC

Imagine that you are a developer who needs to investigate and mitigate an issue in the production database. You need to create the database with the exact schema of the production database, and you have been given the DACPAC.

You can use the Publish-DbaDacPackage command to create the database from the DACPAC. The DACPAC contains the schema of the production database, and publishing the DACPAC creates the database if it does not exist or incrementally updates the database so that the schema matches the schema in the DACPAC.

As a developer, you might use your local development instance or maybe a container. We recommend Andrew Pruski’s SQL Server container series, which you can find at sqlps.io/containers, to learn about creating SQL Server containers.

The command in the next listing will create a database named FactoryIssue on the developer container using the provided DACPAC. You can then run this command as many times as you like, and it will return the database back to the same schema. This process can also be used within your deployment pipeline to deploy the database changes through the environments using your CI/CD tooling.

Listing 22.2 Publishing a DACPAC from the production database

PS> $splatPublishDacPac = @{
  SqlInstance = $developercontainer
  Database    = "FactoryIssue"
  Path        = "C:\temp\ProdFactory_20201230.dacpac"
}
PS> Publish-DbaDacPackage @splatPublishDacPac
 
ComputerName    : sql01
SqlInstance     : sql01:15595
Database        : FactoryIssue
Dacpac          : C:\temp\ProdFactory_20201230.dacpac
PublishXml      :
Result          : Initializing deployment (Start)
                  Initializing deployment (Complete)
                  Analyzing deployment plan (Start)
                  Analyzing deployment plan (Complete)
                  Updating database (Start)
                  ~~~~~~~~~~~~
                  Output Truncated
                  ~~~~~~~~~~~~
                  Updating database (Complete)

Listing 22.2 shows the output that you receive when you run the command. It has the details of the host, instance, database, filename, and the result. The result shows the deployment of the database: it initializes the deployment, analyzes the plan, scripts the actions, and then creates the database. After creating the database, it creates the database users and then the tables, views, indexes, and other objects in the database schema contained in the DACPAC.

Try it now 22.2

Using the DACPAC that you created in the previous “Try it now,” publish the DACPAC to a new database on your instance.

You now know how to extract a DACPAC from an existing database and use it to publish the schema to a new database. You have many options for configuring this process.

22.2.3 DACPAC options

The options for exporting and publishing DACPACs are great enough to be a book all by themselves, but we have only part of a chapter to explore them. Because of this, we will focus on the most common options we’ve used. Let’s start with two simple examples for publishing and exporting DACPACs. We’ll return to the database the developer created earlier in our story and focus in on the users. In figure 22.1, you can see a number of users from the Production environment.

You can see there are users called FactoryApi and Production3rdPartyAccount. These are users from the production system that are not likely to exist in the development environment. Because of this, you want to configure the deployment to exclude them. You will have to provide the developer with the dbatools code to deploy the database without the users included.

Figure 22.1  The developer’s database has the production users in it.

You can use the New-DbaDacOption command to create a configuration that will exclude those users from the database and use that with the Publish-DbaDacPackage command. You can alter 83 DeployOptions, like CreateNewDatabase and DoNotAlterReplicatedObjects. For this example, as shown in the next listing, you will use the ExcludeObjectTypes option and exclude Users, RoleMembership, and Logins, because these will not exist on the development environment.

Listing 22.3 Publishing a DACPAC without the production users

PS> $dacoptions = New-DbaDacOption -Type Dacpac -Action Publish
PS> $dacoptions.DeployOptions.ExcludeObjectTypes = "Users", "RoleMembership", "Logins"
PS> $splatPublishDacPacNoUsers = @{
  SqlInstance = $developercontainer
  Database = "FactoryIssue"
  Path = "C:\temp\ProdFactory_20201230.dacpac"
  DacOption = $dacoptions
}
PS> Publish-DbaDacPackage @splatPublishDacPacNoUsers

When you run this code, it will create a DACPAC options configuration that excludes the Users, RoleMembership, and Logins from the deployment. This configuration, shown in figure 22.2, will be used to create a database that matches the schema of the DACPAC (the production database schema) and does not include the production users.

Figure 22.2  The developer’s database has no production users in it.

In addition to enabling a developer to create a database that matches the production schema, you can use code like listing 22.3 to deploy your databases within your CI/CD processes. Some examples of why you’d want to deploy your databases within your CI/CD processes include automatically testing your application against new changes or synchronizing reports with production data.

Using code to define the options for extracting or publishing DACPACs is an excellent method of ensuring that the database is deployed with the same configuration each time. Within your pipeline to deploy your database code changes to the production database, the code will usually be deployed to many environments prior to reaching the production environment.

These pipelines will have automated build and test environments, user acceptance environments, quality assurance environments, and staging environments (we find these will have a myriad of different names and naming conventions). Each one of these environments may need to have the database configuration deployed with a different set of options. Database projects use a publish profile for this purpose, and dbatools can create publish profiles and use them for deployment as well. These profiles can then be included within the source-controlled code for the database project.

To replace the example in listing 22.3, where you excluded the users and logins from the published database, you can create a publish profile and use that for deployment. To ensure that the users and logins are excluded, you will need to set the following properties:

  • IgnoreUserLoginMappings

  • IgnorePermissions

  • ExcludeObjectTypes

  • ExcludeLogins

  • ExcludeUsers

  • IgnoreUserSettingsObjects

Try it now 22.3

If you’re wondering how we figured out this list, we did it using PowerShell! First, we explore all of the objects available from New-DbaDacOption:

New-DbaDacOption -Action Publish | Get-Member

After seeing the results, DeployOptions looks likely to contain the options we’re looking for. Let’s expand them:

New-DbaDacOption -Action Publish | Select -ExpandProperty DeployOptions

If you’d like more information about the different options, you can find it at sqlps.io/publish.

Now that you know about them, you can use these options to publish to your development environment using your own requirements. But first, let’s create a publish profile that addresses the task at hand: excluding the users and logins from the published database using the code shown here.

Listing 22.4 Creating a publish profile

PS> $splatNewDacProfile = @{
  SqlInstance = "sql01"
  Database = "Factory"
  Path = "c:\temp"
  PublishOptions = @{
      IgnoreUserLoginMappings = $true
      IgnorePermissions = $true
      ExcludeObjectTypes = 'Users;RoleMembership;Logins'
      ExcludeLogins = $true
      ExcludeUsers = $true
      IgnoreUserSettingsObjects = $true
  }
}
PS> New-DbaDacProfile @splatNewDacProfile
 
ConnectionString : Data Source=sql01;Integrated Security=True;MultipleAc...
Database         : Factory
FileName         : c:\temp\sql01-Factory-publish.xml
SqlInstance      : sql01

Note the FileName column, which contains the path to the publish profile file that you can save in source control. The file is just an XML file and should look something like the following XML code.

Listing 22.5 Viewing the XML output of New-DbaDacProfile

<?xml version="1.0" ?><Project ToolsVersion="14.0" 
 xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <TargetDatabaseName>Factory</TargetDatabaseName>
    <TargetConnectionString>Data Source=sql01;Integrated Security=True;
    MultipleActiveResultSets=False;Encrypt=False;
    TrustServerCertificate=False</TargetConnectionString>
    <ProfileVersionNumber>1</ProfileVersionNumber>
    <IgnorePermissions>True</IgnorePermissions>
    <IgnoreUserSettingsObjects>True</IgnoreUserSettingsObjects>
    <ExcludeUsers>True</ExcludeUsers>
    <ExcludeLogins>True</ExcludeLogins>
    <IgnoreUserLoginMappings>True</IgnoreUserLoginMappings>
    <ExcludeObjectTypes>Users;RoleMembership;Logins</ExcludeObjectTypes>
  </PropertyGroup>
</Project>

22.3 Running dbatools (and PowerShell) on a CI/CD system

Teams use a significant number of toolsets to deploy changes. Some of the most popular ones that we have seen when consulting include these:

  • GitHub Actions

  • Azure DevOps

  • Jenkins

  • Octopus Deploy

  • Bamboo

  • TeamCity

  • GitLab

  • CircleCI

All of these toolsets share a benefit for people like you who know dbatools (and, therefore, PowerShell): they can all run PowerShell scripts. This means that you can take the knowledge that you have gained from this book and apply it quickly and easily to create the pipeline or process that will meet the requirements of your team.
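To make that concrete, here is a hedged sketch of a GitHub Actions workflow step that runs dbatools; the instance, database, and file names are placeholders from this chapter’s examples, and your runner would need network access to the SQL Server.

```yaml
# Sketch of a GitHub Actions step; instance and database names are placeholders
- name: Export a DACPAC with dbatools
  shell: pwsh
  run: |
    Install-Module dbatools -Force -Scope CurrentUser
    Export-DbaDacPackage -SqlInstance "sql01:15595" -Database "Factory" -FilePath "Factory.dacpac"
```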

22.3.1 Creating a task

A task in the CI/CD process is like an additional job step. Examples include copying users to the CI/CD system, creating new databases, or hydrating current databases. Adding a new task to your CI/CD process deserves some consideration.

First, you should understand what the task is required to achieve and whether any available plugins accomplish your goals. This can be time consuming to figure out, because a huge number of plugins exist for the various CI/CD systems, but many are made with a limited scope to solve the author’s specific scenario. If you find that’s the case for the plugins in your CI/CD environment, we suggest using dbatools, likely within a PowerShell plugin. What’s really cool is, once you know how to use PowerShell within one CI/CD system, you’ll understand the concepts for pretty much all the others.

Once you have decided to use dbatools, you can then write the script that will achieve the task. If your process requires that you restore a database, apply data masking, deploy the changes, and create the correct logins/users, for example, you can use your learning from chapter 11, chapter 20, this chapter, and chapter 9 to create the scripts to do this.
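Pulling those chapters together, here is a hedged sketch of what such a script might look like; the server names, file paths, masking config, and login name are all placeholder assumptions, not values from the book.

```powershell
# Sketch of a composite pipeline task - all names and paths are placeholders
$instance = "SQL01\Test"

# Chapter 11: restore a known-good backup over the existing database
Restore-DbaDatabase -SqlInstance $instance -Path "\\BackupHost\SQLDeploymentBackups\Factory_Test.bak" -WithReplace

# Chapter 20: apply data masking using a previously saved masking config
Invoke-DbaDbDataMasking -SqlInstance $instance -FilePath "C:\temp\Factory.DataMaskingConfig.json" -Confirm:$false

# This chapter: deploy the schema changes from the DACPAC
Publish-DbaDacPackage -SqlInstance $instance -Database "Factory" -Path "C:\temp\ProdFactory_20201230.dacpac"

# Chapter 9: copy the logins the application needs
Copy-DbaLogin -Source "sql01" -Destination $instance -Login "FactoryApi"
```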

22.3.2 Ensuring the dbatools module is available

The CI/CD toolsets mentioned in this chapter normally use an agent to run the tasks for the pipeline. The terminology and the exact methodology may be slightly different for each one, but some basic principles will apply: you’ll need a plugin to accomplish your goals, and you’ll also need dbatools to exist on the CI/CD server. As we learned in chapter 2, you’ll also need to ensure that the user account for the process has access to the dbatools module.
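A minimal sketch of a guard you could place at the top of an agent task, assuming the account running the pipeline can reach the PowerShell Gallery:

```powershell
# Install dbatools for the pipeline account if it isn't already available
if (-not (Get-Module -Name dbatools -ListAvailable)) {
    Install-Module -Name dbatools -Scope CurrentUser -Force
}
Import-Module dbatools
```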

22.3.3 Understanding how to add parameters to the script

Different tooling and different team processes call for different methods. Whichever method you use, you will need to understand two things: how the parameters/variables are stored, and how they are referenced within the PowerShell that you write. You will have to examine the documentation for the toolset to understand this correctly, and it will often differ for sensitive variables and parameters compared to regular ones. In Azure DevOps, for example, you can find the documentation at sqlps.io/args.

If you are required to restore a different backup for each environment that your pipeline uses, you can write the dbatools script to restore the database and save it as restoredatabase.ps1 in the deploymentscript directory in your codebase, as shown in the following code listing.

Listing 22.6 DevOps pipeline script for restoring a database

Param($SqlInstance, $BackupFile)
Restore-DbaDatabase -SqlInstance $SqlInstance -Path $BackupFile -WithReplace

This script uses a Param block to allow parameters to be passed in to the script so that the same code can be used with different values.
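For example, the same script can be invoked from any PowerShell session with the values you need; the instance and backup path below are the chapter’s placeholder values.

```powershell
# Hypothetical invocation with chapter-style placeholder values
.\deploymentscripts\restoredatabase.ps1 -SqlInstance "SQL01\Test" `
    -BackupFile "\\BackupHost\SQLDeploymentBackups\Factory_Test.bak"
```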

Azure DevOps pipelines can be defined as YAML code. For each environment within your pipeline, you need to restore a different database backup. Teams do this to ensure that the data in the database complies with regulations preventing production data in nonproduction environments or to ensure that automated testing does not take too long. In your Azure DevOps pipeline code, you will define a task as follows.

Listing 22.7 Restoring a database to the Test environment

- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: $(System.DefaultWorkingDirectory)\deploymentscripts\restoredatabase.ps1
    arguments: > # Use this to avoid newline characters in multiline string
      -SqlInstance "SQL01\Test"
      -BackupFile "\\BackupHost\SQLDeploymentBackups\Factory_Test.bak"
  displayName: 'Restore the test factory database'

When the pipeline runs this task, it will run the dbatools code in the restoredatabase.ps1 script with the value SQL01\Test for the SQL instance parameter and \\BackupHost\SQLDeploymentBackups\Factory_Test.bak for the backup file. This will restore that backup file on the Test instance of SQL01 and replace the database if it exists, ensuring a clean, known state for the database prior to running the rest of the pipeline.

You can then use the same script to restore a different backup for a different environment. You may have a staging environment where you perform a “dry run” of the deployment using a database of a similar size and data complexity to the production database running on similar hardware. You can add the YAML to that pipeline to restore the database for that environment using the same dbatools script file, as shown next.

Listing 22.8 Restoring a database to the Staging environment

- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: $(System.DefaultWorkingDirectory)\deploymentscripts\restoredatabase.ps1
    arguments: > # Use this to avoid newline characters in multiline string
      -SqlInstance "SQL02\Staging"
      -BackupFile "\\BackupHost\SQLDeploymentBackups\Factory_Staging.bak"
  displayName: 'Restore the staging factory database'

This time, you have used the same script, restoredatabase.ps1, but have passed in the values for the staging instance and backup. This enables you to change the script in a single place and have those changes reflected in every environment, easily reducing complexity.

In this chapter, you have learned how to extract and publish database DACPAC files and how to use the knowledge that you have gathered from the entire book to create PowerShell tasks for use in CI/CD tools for your DevOps processes. In the next chapter, we will return to administration for your SQL estate and talk about how to trace activity on an instance with dbatools.

22.4 Hands-on lab

Try the following tasks:

  • Explore New-DbaDacOption with different -Type and -Action parameters using Get-Member.

  • Use your experience with looping or pipes to export more than one profile.
