Decrypting SQL Server Objects with dbatools


Reading Time: 4 minutes

There are lots of great articles that describe how to decrypt SQL Server objects when they're saved with encryption, like:

Encrypting and Decrypting SQL Server Stored Procedures, Views and User-Defined Functions

Simple Way to Decrypt SQL Server Stored Procedure

Most of them rely on T-SQL and SSMS, but I wanted to do the same thing with PowerShell. This way I can execute it for several objects at a time, and maybe even in multiple databases and instances.

Let me first say that I’m not an encryption guru, but I do understand the technology and how I could possibly develop some code to decrypt an object.

This article describes the inner workings of the command “Invoke-DbaDbDecryptObject”.

Why would you encrypt an object?

Encrypting objects was first introduced in SQL Server 2000 to protect objects. It's not my preferred method, because there are a couple of disadvantages to it.

The first disadvantage is that the security is not very good. It's very simple to get the original code when you have the right privileges. The code is also not protected when, for instance, SQL Server Profiler is run or when you capture executed procedures using Extended Events.

The second disadvantage, although you should probably have this anyway, is that you need to make sure that you have the original code in source control. Once the object is created with encryption there is no normal way to retrieve the code.

The third disadvantage, and this is becoming more and more relevant, is that there is no easy way to compare differences between objects, which makes encrypted objects harder to use with CI/CD (Continuous Integration / Continuous Delivery).

I would only use this method if the “VIEW DEFINITION” privilege is not sufficient and there are no other DBAs who can google the solution to retrieve the original code.

How is an object encrypted?

Objects are encrypted using “WITH ENCRYPTION” when creating or altering an object. This encrypts the object and the definition is no longer available using the general methods.
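As an illustration, a minimal sketch of creating an encrypted procedure through dbatools (the instance, database and procedure names are placeholders I made up):

# create a hypothetical procedure with an encrypted definition
$query = @"
CREATE PROCEDURE dbo.MyProc
WITH ENCRYPTION
AS
    SELECT 1 AS Result;
"@
Invoke-DbaQuery -SqlInstance SQL1 -Database DB1 -Query $query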

If you want to view the code you’ll get an error message like this:

error encrypted object

During the encryption process, SQL Server stores the encrypted definition in the column “imageval” in the table “sys.sysobjvalues”. This table can only be queried when you're connected using the DAC (Dedicated Administrator Connection). The value is a VARBINARY(MAX).

The encryption uses an XOR cipher. The XOR cipher works by applying an XOR with a key (B) to a value (A), generating a result (C). This results in the following formula: A ^ B = C.
This operation is also called modulo 2 addition. If we know the key and the encrypted value, we can decrypt the value (C ^ B = A).

Decrypting an object

To decrypt the database object we take the secret (A) and apply the XOR cipher to it with a known plain text (B) and a known encrypted text (C). Because the key cancels out, A ^ B ^ C returns the original definition.

Getting the needed values

(A) Get the secret

The secret is the imageval value in the sys.sysobjvalues table for the object. This can be retrieved like this:
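The original query isn't preserved in this copy of the post. A minimal sketch, assuming a DAC connection (note the ADMIN: prefix) and a hypothetical procedure dbo.MyProc in database DB1:

# sys.sysobjvalues is only visible over the DAC
$query = "SELECT imageval FROM sys.sysobjvalues WHERE objid = OBJECT_ID('dbo.MyProc')"
$secret = (Invoke-DbaQuery -SqlInstance 'ADMIN:SQL1' -Database 'DB1' -Query $query).imageval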


The known object will be an ALTER statement for that particular type of object, which we can use to calculate the key. This known statement needs to be valid T-SQL for that object type, like:
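For a stored procedure, for example, the known statement could look something like this (a made-up placeholder; the actual command builds and pads this statement itself):

# a hypothetical known statement for a stored procedure
$knownStatement = "ALTER PROCEDURE dbo.MyProc WITH ENCRYPTION AS RETURN 0"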

(B) Get the binary known object

Because we’re dealing with binary data we need to convert the known object to binary (known plain). This can be done by using the System.Text.Encoding class with the function “GetBytes”.

(C) Get the encrypted known object

To get the encrypted known object (known secret) we need to alter our object in the database.

We don’t want the original object to be replaced with our known object. To achieve that a transaction is used that’s rolled back right after the imageval value has been retrieved.

Start decrypting

To get our decrypted text we have to loop through the secret and apply the known plain and the known secret.

The function below demonstrates how the decryption is put in place:
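A minimal PowerShell sketch of that loop (the actual dbatools implementation differs in its details), assuming the three byte arrays from the previous steps, padded to the same length:

function Invoke-DecryptObject {
    param (
        [byte[]]$Secret,      # encrypted definition of the real object (A)
        [byte[]]$KnownPlain,  # binary version of the known statement (B)
        [byte[]]$KnownSecret  # encrypted version of the known statement (C)
    )

    $result = New-Object byte[] $Secret.Length

    # each character is stored as 2 bytes, so step through the arrays two bytes at a time
    for ($i = 0; $i -lt $Secret.Length; $i += 2) {
        $result[$i]     = $Secret[$i]     -bxor $KnownPlain[$i]     -bxor $KnownSecret[$i]
        $result[$i + 1] = $Secret[$i + 1] -bxor $KnownPlain[$i + 1] -bxor $KnownSecret[$i + 1]
    }

    # the decrypted data is still binary; convert it back to text
    return [System.Text.Encoding]::Unicode.GetString($result)
}

$decrypted = Invoke-DecryptObject -Secret $secret -KnownPlain $knownPlain -KnownSecret $knownSecret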

The loop increases the integer by two because each character in the secret has a size of 2 bytes.

In the end the decrypted data is still in binary form. To get the text we need to use the method “GetString” of the Encoding object.

Output Invoke-DbaDbDecryptObject

The object is decrypted. Now what?

First of all you should save the code somewhere and preferably in a source control system.

To make things easier I implemented a feature in the command to export all the results to a folder. It will separate objects based on the instance and the object type and create a separate file for each object.

To export the results to a directory you have to use the -ExportDestination parameter.
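A hedged example of what such a call could look like (instance, database, object and folder names are placeholders):

Invoke-DbaDbDecryptObject -SqlInstance SQL1 -Database DB1 -ObjectName MyProc -ExportDestination C:\Decrypted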

 

That’s the command to decrypt your encrypted objects.

As I said, there are several solutions for this problem using T-SQL which are very good.
I always like to use PowerShell for these kinds of problems because it makes it easier to go through multiple servers, databases and objects in one script.

If you want to try it out, make sure you have the latest version of dbatools.

Start to see with tSQLt


Reading Time: 6 minutes

What is tSQLt, why use it, and how can I make it a little easier by automating lots of tests with PowerShell?

This is going to be my first post on tSQLt. I will tell you why I love this framework and how using it changed the way I develop databases.

tSQLt is a unit testing framework for SQL Server databases. There are lots of articles out there that did a very good job describing what the framework is and what it does. The links below helped me out quite a bit:

How did I get started with tSQLt

A couple of months ago I got introduced to PowerShell unit testing with Pester. That has nothing to do with SQL Server databases, but I learned why it's important to make these kinds of tests.

The tSQLt framework had already existed for a while, but I never spent too much time on it because it seemed like yet another framework to learn and I did not have the time to dive into it. That changed when I got introduced to Pester and saw how well it worked for my projects.

The tSQLt framework is completely free and can be downloaded from here. You only need to change a couple of things in your development instance, install the framework into your development database and you're done.

As with many new things there is a big learning curve. You spend hours reading documentation and you try out several ways you could use the framework. It took me a while to understand the framework but once I got used to it I was able to create lots of different kinds of tests.

I work in a smaller environment where we have lots of development. We have not yet implemented continuous integration or continuous delivery. As we grow in the amount of projects and complexity of it all, we need to implement these practices to improve the quality of our releases.

Why I love tSQLt

The thing with unit testing is, you don't see the benefit of it until things go wrong. I remember creating hundreds of tests for the stored procedures in our database and wondering why I was actually doing it.
Developing the extensive tests with mocked tables and integration with other procedures was a tedious task.

But I persevered, I finished my tests, and it was not long after that that a change had to be made to that specific database. It was a change in a table to rename a certain column. I made the change, changed the procedures I knew relied on that column and ran my unit tests.
The result was that I had over 10 stored procedures that for some reason did not return the expected results. Apparently they relied on that column and I had totally overlooked them.

If you have over 400 stored procedures in your database, you don't remember all the code you've written and at some point you just forget. That's exactly why I love tSQLt.
If I had made the change and applied it to our testing environment, I would've broken a lot of code and our testers would have come back to me saying things didn't work.

With the test results returning the failed procedures I was able to create a quality release.

Creating the tests

The integration tests were the first unit tests I created and I was really satisfied with the result. The integration tests would get results from a stored procedure to test its workings. These kinds of tests can be quite complicated, but they give you a very good view of whether your code does what it's supposed to do.

But I wanted to take it further than that. With CI/CD (Continuous Integration/Continuous Delivery) in mind I wanted to create extra tests to assure a proper release.

I wanted to make the following tests:

  • Are all the tables present
  • Are all the stored procedures present
  • Check object permissions

This database does not use any views or user defined functions. I also did not want to go overboard to test for the existence of objects. You can create a test for any type of object.

Are all the tables present

Imagine a situation where a developer accidentally deletes a table and doesn't notice the change. He/she commits it to your source control (you have your database in source control, right?) and it goes to your build server.

I want to be sure all my tables are present before any other test gets executed. It may seem redundant because the integration tests would fail too, but it gives me a clear error that an object is missing.

Example of the code:
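The original test isn't reproduced in this copy of the post. A minimal sketch, assuming a tSQLt test class called TestDatabaseStructure (created with tSQLt.NewTestClass) and a hypothetical table dbo.Address:

# create an existence test for one table
$test = @"
CREATE PROCEDURE TestDatabaseStructure.[test that table dbo.Address exists]
AS
BEGIN
    EXEC tSQLt.AssertObjectExists @ObjectName = N'dbo.Address';
END;
"@
Invoke-DbaQuery -SqlInstance SQL1 -Database DevDB -Query $test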

Are all the stored procedures present

The same situation as with the tables applies to my stored procedures. If a stored procedure were deleted, I would want to know right away.

The difference with the tables is that deleting a stored procedure does not have to break any other part in the database. It would only be noticed when an application would rely on it.

Example of the code:
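Along the same lines, a sketch for a hypothetical procedure dbo.GetAddresses:

$test = @"
CREATE PROCEDURE TestDatabaseStructure.[test that procedure dbo.GetAddresses exists]
AS
BEGIN
    EXEC tSQLt.AssertObjectExists @ObjectName = N'dbo.GetAddresses';
END;
"@
Invoke-DbaQuery -SqlInstance SQL1 -Database DevDB -Query $test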

Check object permissions

I cannot count the times I created a new stored procedure and forgot to grant permissions on it. I would implement the change and it would be unusable because the application had no privileges on it.

If you know the permission needed for an object, it’s easy to get those permissions and test if they are what you expect them to be.

This one is a little different than the object existence tests. I first create the tables for the expected and actual values. I then insert the values I expect to be in the result.

In the act part I get the permissions from the object and insert them into the table for the actual values.
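A hedged sketch of such a test, assuming a test class TestPermissions, a hypothetical procedure dbo.GetAddresses and an application user named AppUser:

$test = @"
CREATE PROCEDURE TestPermissions.[test permissions on dbo.GetAddresses]
AS
BEGIN
    -- assemble: the permissions we expect and a table for the actual values
    CREATE TABLE #Expected (PermissionName NVARCHAR(128), PrincipalName NVARCHAR(128));
    CREATE TABLE #Actual   (PermissionName NVARCHAR(128), PrincipalName NVARCHAR(128));

    INSERT INTO #Expected VALUES (N'EXECUTE', N'AppUser');

    -- act: read the granted permissions on the object
    INSERT INTO #Actual
    SELECT perm.permission_name, prin.name
    FROM sys.database_permissions AS perm
    INNER JOIN sys.database_principals AS prin
        ON prin.principal_id = perm.grantee_principal_id
    WHERE perm.major_id = OBJECT_ID(N'dbo.GetAddresses')
        AND perm.state = 'G';

    -- assert: both tables should contain the same rows
    EXEC tSQLt.AssertEqualsTable @Expected = N'#Expected', @Actual = N'#Actual';
END;
"@
Invoke-DbaQuery -SqlInstance SQL1 -Database DevDB -Query $test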

Did you create all those tests manually?

OH HELL NO!

If you’ve read any automation posts from my blog before, you know that when it comes to repetitive tasks, I am as lazy as it comes. I do not do two things manually if I can automate it once.

The only tests that could not be generated were the integration tests. Besides those, I generated all of them.

The code to assert the existence of the objects is basically the same. The only thing that changes is the object name and maybe the schema.
The same goes for the tests to check the permissions. In most cases the permissions are the same (at least for the same schema). With some tinkering I was able to generate all the tests for that purpose too.

How did I generate all the tests?

Simple! I use dbatools and a loop to go through the objects.

  1. Generate the test you need but just for one object
  2. Go to PowerShell and let it retrieve all the objects needed
  3. Export the test you created in step one and copy it to a query variable in your PowerShell script
  4. Make a loop around the pasted script to replace the object name
  5. Run the PowerShell script

The result is the script below. The script tests for the existence of the stored procedures. You can of course change the types of objects.
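The script itself isn't preserved in this copy of the post. A minimal sketch of the idea, assuming instance SQL1, database DevDB and an existing test class TestDatabaseStructure:

# step 2: retrieve all the stored procedures (excluding the tSQLt schema)
$procs = Invoke-DbaQuery -SqlInstance SQL1 -Database DevDB -Query "
SELECT s.name AS SchemaName, p.name AS ProcName
FROM sys.procedures AS p
INNER JOIN sys.schemas AS s ON s.schema_id = p.schema_id
WHERE s.name <> 'tSQLt';"

foreach ($proc in $procs) {
    $objectName = "$($proc.SchemaName).$($proc.ProcName)"

    # steps 3 and 4: the exported test from step 1, with the object name replaced
    $test = @"
CREATE PROCEDURE TestDatabaseStructure.[test that procedure $objectName exists]
AS
BEGIN
    EXEC tSQLt.AssertObjectExists @ObjectName = N'$objectName';
END;
"@

    # step 5: create the test in the development database
    Invoke-DbaQuery -SqlInstance SQL1 -Database DevDB -Query $test
}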

 

I hope you enjoyed this article on creating some handy tests using tSQLt. This framework really helped me with my day-to-day development and has saved me numerous times.

Comments are always appreciated. If you want to stay informed about new blog posts, please subscribe.

Hide or Show Columns in SSRS Reports Using Parameters


Reading Time: 4 minutes

Regularly I have reports that have an extensive number of columns.
Because of the number of columns, reports tend to become inefficient and contain more information than we always need. Users may want to select certain columns to make the report easier to read.

Hiding and showing columns in SSRS reports using parameters is a solution to make reports more dynamic and easier to use.

At the time of writing of this article, SQL Server Reporting Services did not yet have the possibility to use checkbox parameters. Instead we will be using a multi-value text parameter to show or hide our columns.

The project file

For the demo I created a dummy project that points to the AdventureWorks2014 database. I have created a data set that returns the addresses from the Person.Address table.

The report contains a table that shows the data from the address data set.

If you want to play around with the demo, the project file can be downloaded from here. Make sure you use Visual Studio 2017 or higher.

Create the parameter

For us to have the ability to hide or show columns we have to create a parameter. We want to be able to filter on multiple columns so we want to create a multi-value parameter.

To do that, either right-click on “Parameters” in your “Report Data” pane, or right-click in the top pane for the parameter and click on “Add Parameter”.

Add Parameter

When you click “Add Parameter” you get a window that looks like this:

Add Parameter Window

Give the parameter a name, in this case I named it “ColumnFilter”, but you can name it whatever you want.
Select “Text” as the data type (you can choose any other data type but I like text) and make sure you select “Allow multiple values”.

For this example we’re going to hide or show the address fields. The data set has two address columns: AddressLine1 and AddressLine2.

Go to the tab “Available Values” and add the following fields:

Add Parameter Available Values

In this example we have two values in our parameter. The label and the value are pretty self explanatory.

We’re going to add the values to the “Default Values” tab to make sure our report shows the columns by default.

Add Parameter Default Values

Click “OK” and we have our parameter.

Show and hide the columns

Now that we have our parameter we can change the table to hide or show the address columns.

Right click on the column “Address Line 1” and click on “Column Visibility”.

Column Visibility

The following window will appear:

Column Visibility Show Or Hide

Click on the “Show or hide based …..”. Click on the “fx” button and the following window appears:

Show or Hide Expression Window

This window is really powerful. We can do a lot with expressions throughout our SSRS reports.

In this case we’re only going to enter a small line of code:
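The expression itself isn't preserved in this copy of the post; based on the description below, it would look something like this (assuming the parameter values “address1” and “address2” from the parameter setup):

=IIF(Join(Parameters!ColumnFilter.Value, ",").Contains("address1"), False, True)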

Expressions are a little different from what you might be used to in normal programming. We are not able to search through our multi-valued parameters like we do with arrays in other programming languages (at least I have not found a way yet).
Instead we will be joining all the values from the ColumnFilter parameter and check if it contains one of the values we like to search for.

What the expression does is combine all the values separated with a comma (“,”). If all the values are selected we’ll have a string like this “address1,address2”.
The next part will look if the string contains one of the values. The “.Contains” method returns true if a value is within that string and false if it’s not.
That entire expression is contained within an IIF that assesses if the statement is true or not. In this case, when the string contains the value we look for, we want to return false to not hide the column.

Add the code to the Expression window and click on “OK”. Click “OK” again to close the column visibility window.

Do the same thing with the column “Address Line 2” but change the value in the expression to “address2”.

Executing the report

Now that we’ve done all the work, open the report preview.

In this example I deselect the “Address Line 2” value and click on “View Report”. The result is the same table but with the column “Address Line 2” hidden.

The result can be seen below:

Hidden Column Address Line 1

Conclusion

Showing or hiding columns with parameters is a very useful way to make your reports more dynamic.

This is one way of doing it; I bet there are other ways of showing or hiding columns, but this approach works really well in my environment. The users are able to adjust their reports and I don’t have to create multiple reports to get to the same result.

I hope you enjoyed this post and comments are always appreciated.

 

 

 

Test Log Shipping With DbaChecks


Reading Time: 2 minutes

I’ve written quite a lot about log shipping over the last few months and this is another post on the subject.
This time I’m talking about how to test log shipping with dbachecks.

DbaChecks?! What is that?

dbachecks is a brand new module developed by the community to test your environment for all kinds of best practices. There are even possibilities to export the data in such a way that you can create amazing dashboards in Power BI.

The module uses Pester to test a whole lot of best practices. Because of that we can very quickly test what’s wrong with our server(s).

These tests are all included in the module:
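A quick way to list them from PowerShell (the exact output is shown in the screenshot below):

# list every check that ships with the dbachecks module
Get-DbcCheck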

get-dbacheck

Why this blog post

We want everyone to know about this module. Chrissy LeMaire reached out to me and asked if I could write some tests for the log shipping part and I did.

Because I wrote the log shipping commands for dbatools I was excited about creating a test that could be implemented into this module for everyone to use.

Test Log Shipping With DbaChecks

If you want to go all nuts you can use the following command and see all the tests fly by
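A hedged example (the instance name is a placeholder and the -AllChecks switch is my recollection of the parameter, so double-check the help):

# run every check against an instance and watch the results fly by
Invoke-DbcCheck -SqlInstance SQL1 -AllChecks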

invoke-dbccheck all checks

To make it easier to read I would only want to know about the things that failed
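Invoke-DbcCheck exposes Pester's -Show parameter for that; a hedged example:

# only display the failed tests
Invoke-DbcCheck -SqlInstance SQL1 -AllChecks -Show Fails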

To test for a certain group of tests you can use a tag, or specify a certain test:
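For the log shipping checks that would be something like this (the check name is my assumption, based on the screenshot below):

# run only the log shipping related checks
Invoke-DbcCheck -SqlInstance SQL1 -Check LogShipping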

invoke-dbccheck logshipping

 

Using this module will really help you check your entire SQL Server infrastructure with ease. So what are you waiting for? Download the module and go test your instances.

 

Log Shipping With dbatools – Part 4: Recover a Log Shipped Database


Reading Time: 3 minutes

In the third part of the series we discussed the way to get the log shipping errors using the command “Get-DbaLogShippingError”. This blog will be about how to recover a log shipped database.

Out of the box solutions

I can be very short in this paragraph: there is no out-of-the-box solution to recover a log shipped database.

Why this command

Initially log shipping was meant to be used as a warm standby. You have your data on another instance, but you still need some human intervention to get it all back up.

Imagine the following situation. You have setup log shipping using either the GUI or the commands in dbatools. You have about 15 databases and everything is working fine.

Until one day the primary instance goes down and is not recoverable. For the production to continue you have to bring the log shipped databases online fast.

You have to figure out what the last transaction log backup was. You have to check if it was copied to the secondary instance and if it has been restored.

Doing this by running a couple of queries, copying the files if needed and running the log shipping jobs takes time. I’d rather run a command to recover one or more databases and get back to the problem of the primary instance.

Invoke-DbaLogShippingRecovery

The Invoke-DbaLogShippingRecovery command will execute the following steps:

  1. Check the agent status and start the agent if it’s not in a started state.
  2. Retrieve the latest transaction log backup and try to copy it to the secondary instance if it’s not present. It will wait and check the log shipping status to see if the backup file is copied.
  3. Retrieve the last restored transaction log backup. Execute the restore process to get the database up-to-date.
  4. Disable the jobs after the copy and restore process.
  5. After all the actions it restores the database to a normal state.

To execute the command
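A hedged example for a single database (the instance and database names are placeholders):

# recover the log shipped database DB1 on the secondary instance
Invoke-DbaLogShippingRecovery -SqlInstance SQL2 -Database DB1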

The result of the command

Recover Log Shipping Result Command

The image below shows the database in a normal state after the command in the SQL Server Management Studio.

Recover Log Shipping Result GUI

The result of the jobs being disabled

Recover Log Shipping Result Jobs

More options

In my example I showed how I recovered a single database. The parameter does accept multiple databases.

Besides specifying individual databases, you can also let the command recover all the log shipped databases on the instance.

In some cases you want to run the recovery process but not bring the database back to a normal state.
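Hedged sketches of those variations; the exact parameter names are my assumptions (in particular recovering everything by omitting -Database, and the -NoRecovery switch), so check the command's help:

# multiple databases at once
Invoke-DbaLogShippingRecovery -SqlInstance SQL2 -Database DB1, DB2

# all log shipped databases on the instance
Invoke-DbaLogShippingRecovery -SqlInstance SQL2

# run the recovery steps but leave the database in a restoring state
Invoke-DbaLogShippingRecovery -SqlInstance SQL2 -Database DB1 -NoRecovery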

 

This concludes the command “Invoke-DbaLogShippingRecovery”. This was the final post in this series. If you want to look back at the other commands, follow the links below:

  1. Setup Log Shipping
  2. Test Log Shipping Status
  3. Get Log Shipping Errors

 

Log Shipping With dbatools – Part 3: Get Log Shipping Errors


Reading Time: 3 minutes

In the second part of the series we discussed the way to test the log shipping status using the command “Test-DbaLogShippingStatus”. This blog will be about how to get log shipping errors to analyze why your log shipping isn’t working.

Out of the box solutions

There are several options to get the log shipping errors right out-of-the-box. One of them is using queries and the other one is the event viewer.

Queries

There are queries that can help to get the log shipping errors. One that is commonly used queries the log_shipping_monitor_error_detail table.
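For example, something along these lines (a hedged sketch; the table lives in msdb on the instance running the log shipping jobs):

# return the most recent log shipping errors from the monitor table
Invoke-DbaQuery -SqlInstance SQL2 -Database msdb -Query "
SELECT *
FROM dbo.log_shipping_monitor_error_detail
ORDER BY log_time DESC;"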

Although this is a good way to get the log shipping errors, unless I’m using CMS or a monitoring server, I have no way to find errors on multiple instances with log shipped databases.
Furthermore, I want to keep all my tools together and make it easy to solve any errors.

Event Viewer

The other way to see if there are any errors in the log shipping is by using the event viewer.

Get Log Shipping Errors Event Viewer

Personally I don’t like using the event viewer because there could be other errors which I would have to filter through before getting to the root cause. The other reason it doesn’t give me a complete view is that it doesn’t register any errors that occurred during the restore of the backups.

Why this command

When there is something wrong with the log shipping I want to know about it as soon as possible. Using the above options doesn’t give me the flexibility to do that.
For instance I cannot check multiple instances at once, unless I’m using CMS or a monitoring server.

To make the tool set complete, this command was needed to make sure a user would be able to get an overview of the log shipping errors.

Get-DbaLogShippingError

By default the command will return all the errors that ever occurred. It collects all the information and returns it in a table structure to the user.

To execute the command:
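A hedged example against a secondary instance:

# return every log shipping error the instance has recorded
Get-DbaLogShippingError -SqlInstance SQL2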

The result is an overview of the errors.

Get Log Shipping Errors No Filter

In my play area I made sure I would have a lot of errors to test the command. It turned out I had more than 3100 errors!

Get Log Shipping Errors No Filter Count

This brings us to one of the reasons I created this command; filtering the results.

I may only want to see the errors that occurred in the last hour.
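A hedged example using the -DateTimeFrom parameter mentioned below:

# only the errors from the last hour
Get-DbaLogShippingError -SqlInstance SQL2 -DateTimeFrom (Get-Date).AddHours(-1)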

The result is still a quite big list but more manageable.

Get Log Shipping Errors Filter Date From

In this example we’re only using the parameter “-DateTimeFrom” but we can also use “-DateTimeTo” to filter between certain periods.

More filtering

Of course there are other filters to make it easier to zoom in into a specific problem.

It’s possible to filter on the databases:
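For example (the database name is a placeholder):

Get-DbaLogShippingError -SqlInstance SQL2 -Database DB1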

Get Log Shipping Errors Filter Database

To filter on the type of instance you can use the “-Primary” and “-Secondary” parameters

To filter on the specific actions use the “-Action” parameter:
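Hedged examples of both filters (the action values are my assumption, based on the three log shipping steps):

# only errors reported by the secondary instance
Get-DbaLogShippingError -SqlInstance SQL2 -Secondary

# only errors from the copy and restore actions
Get-DbaLogShippingError -SqlInstance SQL2 -Action Copy, Restore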

 

This concludes the command “Get-DbaLogShippingError”. The next post will be about the command “Invoke-DbaLogShippingRecovery”.

Log Shipping With dbatools – Part 2: Test Log Shipping Status


Reading Time: 3 minutes

In the first part of the series I described the command “Invoke-DbaLogShipping”. This makes it possible to set up log shipping. This blog post will be about the command to test log shipping status.

Before I describe the command, I want to discuss the options that are available in the SQL Server Management Studio (SSMS) and SQL Server.

Out of the box monitoring

Microsoft has made it possible to check the log shipping using SSMS. This can be done using queries or by using the “Transaction Log Shipping Status” report.

Queries

To return the log shipping status using T-SQL execute the following code in a query window in SSMS:
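The query itself isn't preserved in this copy of the post; one common option is the sp_help_log_shipping_monitor procedure, which can also be run through dbatools (a hedged example):

# summary of the log shipping status on this instance
Invoke-DbaQuery -SqlInstance SQL2 -Database master -Query "EXEC sp_help_log_shipping_monitor;"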

The result is as follows:

Log Shipping Status Query Result

This is a good method to find any possible issues, but you need to check multiple columns to find out if something is wrong.

Transaction Log Shipping Status Report

By default SSMS has a report that shows the status of the log shipping. To open the report, right-click on the server, go to “Reports”, go to “Standard Reports” and click on “Transaction Log Shipping Status”.

Log Shipping Status Report

The result is an overview of each database with the status of that process:

Log Shipping Status Report Result

The report shows red colored text if something is wrong, which is a great way to find any problems. Still, we need to go into SSMS and click around before we get to this information.

Why this command

Monitoring your log shipping processes is important. You need to know the synchronization status of the log shipped databases.

The log shipping process consists of three steps: backup, copy and restore. Log shipping tracks the status of these processes.
It registers the last transaction log backup, the last file copied and the last file restored. It also keeps track of the time since the last backup, copy and restore.

But that’s not all. Log shipping also checks if the threshold for the backup and restore has been exceeded.

During the log shipping setup, thresholds are set for the backup and restore process.
The default thresholds are 60 minutes for the backup and 45 minutes for the restore. There is no threshold for the copy process.
If, for example, the time since the last backup exceeds the backup threshold, an alert will be triggered.

That’s a lot of information to consider and that’s why this command was developed. It will enable the user to get a complete overview of the log shipping status without having to know all the details.

Test-DbaLogShippingStatus

The command will return a lot of information by default. It collects all the information and, based on that, returns a status.

To execute the command:
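A hedged example:

# detailed log shipping status for every log shipped database on the instance
Test-DbaLogShippingStatus -SqlInstance SQL2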

The result is a detailed overview of each database including the status.

Log Shipping Status Command Detailed

This looks a lot like the results you get from the queries or the report we talked about earlier. All this information can be a little overwhelming and it’s not always needed.
If you don’t need all that information there is an option to format and filter the output.

It will only show you the core information to know what the status of the log shipping is.
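If I remember correctly that condensed output is behind a -Simple switch, but treat that as an assumption and check the command's help:

# condensed output: instance, database, instance type and status
Test-DbaLogShippingStatus -SqlInstance SQL2 -Simple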

The result of the command:

Log Shipping Status Command Simple

As you can see there is a lot less information. It only shows the instance, database, instance type and the status. In most cases that’s all you need.

More filtering

The command also allows you to filter on just the primary or just the secondary databases.

There is a way to filter on specific databases using the “-Database” parameter.

Of course there is also an option to exclude certain databases by using the “-ExcludeDatabase” parameter.
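Hedged examples of these filters (the -Primary and -Secondary switch names are my assumptions; -Database and -ExcludeDatabase are mentioned above):

# only the secondary databases
Test-DbaLogShippingStatus -SqlInstance SQL2 -Secondary

# a specific database
Test-DbaLogShippingStatus -SqlInstance SQL2 -Database DB1

# everything except one database
Test-DbaLogShippingStatus -SqlInstance SQL2 -ExcludeDatabase DB2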

 

This concludes the command “Test-DbaLogShippingStatus”. The next post will be about the command “Get-DbaLogShippingError“.

Log Shipping With dbatools – Part 1: Setup Log Shipping


Reading Time: 7 minutes

This post is the first one of a series of four describing all the different commands. We’ll discuss the commands to set up log shipping with dbatools, how to check the status, check for errors and recover in case of an emergency.

What is log shipping

Before we go into the why I want you to know what log shipping is.

Log shipping dates back to the early versions of SQL Server. The first article that I could find dates back to the year 2000 and explains how to set up log shipping with SQL Server version 7.

Microsoft has done a lot of work on the log shipping setup process and we no longer have to set up the “poor man’s log shipping”. The setup contains a nice GUI and it makes it easier to set up log shipping for your database(s).

Log shipping is the process of creating a transaction log backup, copying that backup to a secondary server and restoring the log there to synchronize the databases. The image below shows the different elements of a log shipping architecture.

Log Shipping

https://docs.microsoft.com/en-us/sql/database-engine/log-shipping/media/ls-typical-configuration.gif

I’m not going to explain in detail what log shipping is or how to set it up using the traditional method. There are lots of websites that have already done a great job describing everything related to log shipping. Here are a couple of websites that could help

Why create a command set

Like I said before, Microsoft has done a great job of automating the log shipping process and making it easier for the user to set up. The GUI describes all the settings needed. Even someone inexperienced could set up log shipping in a short amount of time (given that all the requirements have been met).

Although Microsoft has simplified the process since the earlier versions of SQL Server, setting up log shipping still takes a lot of time. You have to set up each database individually and most of the time you are doing almost the same thing.
There are ways to automate this using dynamic T-SQL. However, I found that, at least in my situation, it was prone to errors and it took me more time to troubleshoot it than to set it up manually.

Why we needed to setup log shipping

A couple of months ago my company wanted to create a high availability solution for our production databases.
Researching solutions like failover clustering and Availability Groups, we looked at architecture decisions, licensing and costs. The outcome was that we weren’t able to use any SQL Server Enterprise HA solutions.
Log shipping is not a so-called hot standby that immediately recovers in case a service or database goes down. Instead it is a warm standby where human intervention is needed to recover.

We have about 20 production databases and, looking at the database sizes and the amount of transactions, in my opinion we are a small shop.
We decided to go with log shipping. It’s well established within SQL Server and, unlike mirroring, it’s not deprecated.

So on a Monday morning, after all the backups were done, I started to configure our databases for log shipping. For me the process of setting up log shipping was very tedious. I basically had to go through each of the databases and use the same settings over and over. The biggest difference would be the database name in the backup or copy directory.

The reason for the command set

In the end it took me about 3 to 4 days to set it up. This was mostly because I didn’t want to stress the network too much by setting up multiple databases at once. At some point I made some mistakes and had to start over again. I really don’t like to do repetitive tasks and in my daily work I automate most of those processes.

To set up log shipping for a single database in the GUI you need about 40 clicks and have to insert lots of information in different windows.

That’s where I contacted Chrissy LeMaire ( b | t ) from the dbatools project and asked if it would be interesting to create log shipping commands for the PowerShell module. She was really enthusiastic about it and I started to develop the commands.

The command set

Initially it started with the command “Invoke-DbaLogShipping”, but soon after a couple of commands followed. The command set contains the commands Invoke-DbaLogShipping, Test-DbaLogShippingStatus, Get-DbaLogShippingError and Invoke-DbaLogShippingRecovery.

This blog will be the first of four posts where we dive into “Invoke-DbaLogShipping” because we first need to have log shipping setup before we can use the other commands.

The Command Invoke-DbaLogShipping

During the development of the command I chose to make the process for setting up the log shipping as easy as possible for the user. My main requirement for the command was that as long as you have the minimal requirements for log shipping you would be able to set it up.

The main command is “Invoke-DbaLogShipping”.

Within log shipping there are a lot of decisions that can be made. All those decisions ended up with lots of parameters. The command has over 90 different parameters ranging from setting the backup folder to setting the name of schedules.

I will explain a couple of the most important ones that I think will be used the most:

General

  • Force: When this parameter is used, lots of assumptions will be made and you only need a minimum of settings to set up log shipping.

Server

  • SourceSqlInstance: The primary instance
  • DestinationSqlInstance: The secondary instance

Database

  • Database: Database (or databases) that need to be log shipped

Backup

  • BackupNetworkPath: Network path where the backups will be saved. This is the parent folder; a child folder for each database will be created in it.
  • BackupLocalPath: The local path for the backups. Only used by the primary instance and not mandatory
  • BackupRetention: How long to keep the backups. The default is 3 days.
  • BackupScheduleStartDate: Start date when the backup schedule needs to start
  • BackupScheduleStartTime: Start time of the backup schedule. Maybe the backups should only take place during a certain time period of the day.
  • BackupScheduleEndTime: End time of the backup schedule. Maybe the backups should only take place during a certain time period of the day.
  • CompressBackup: Do you want the backup to be compressed? By default it looks at the SQL Server edition and server settings.
  • FullBackupPath: Path to the full backup. It is only used when log shipping is being set up.
  • GenerateFullBackup: Instead of using an existing full backup you can use this parameter to create a full backup on the fly during setup.
  • UseExistingFullBackup: Besides FullBackupPath and GenerateFullBackup you can also set the option to let the command retrieve the last full backup made for the database.

Copy

  • CopyDestinationFolder: Where do the backups need to be copied to

Restore

  • DisconnectUsers: Important setting if your secondary database is in read-only mode. Users will be disconnected when the restore process starts and this parameter is set.
  • Standby: Do you want your databases to be in stand-by mode? By default the database will be in no-recovery mode.
  • StandbyDirectory: Directory where the TUF (transaction undo file) file needs to go for a database. This is needed when using the -Standby parameter.

Setup log shipping using Invoke-DbaLogShipping

Before we look into the command we need to go through the prerequisites of log shipping. The items below are the bare minimum to set up log shipping.

Prerequisites

  • At least SQL Server 2000 standard edition
  • The primary database needs to be in Full recovery mode.
  • A network share/drive that’s accessible for the primary and secondary server

A more sophisticated setup would be:

  • A primary server containing the primary databases
  • A secondary server containing the secondary databases
  • A monitoring server logging all the information
  • At least SQL Server 2000 standard edition
  • The primary database needs to be in Full recovery mode.
  • A separate network share for the backup process
  • A separate network share for the copy process

Technically you don’t need multiple servers to set up log shipping. You can set it up with just one single SQL Server instance. In an HA solution this wouldn’t make sense, but technically it’s possible.

Having a separate server acting as the monitoring server ensures that when one of the servers goes down, the logging of the actions still takes place.

Having a separate network share for both the backup and copy makes it easier to setup security and decide which accounts can access the backups. The backup share needs to be readable and writable by the primary instance and readable by the secondary instance.
The copy share needs to be accessible and writable for only the secondary instance.

Additional information

  • Backup Compression
  • Database Status
  • Schedules
  • Cleanup

Backup Compression

Backup compression was introduced in SQL Server 2008 Enterprise. Beginning in SQL Server 2008 R2, backup compression is supported by SQL Server 2008 R2 Standard and all higher editions. Every edition of SQL Server 2008 and later can restore a compressed backup.

Backup Compression in  SQL Server

Database Status

You can decide whether you want the database to be in a recovery or read-only state. Read-only gives you the ability to let other processes, like SSRS or ETL, read production data without interfering with the actual production instance.

Schedules

When to run the backup of the transaction log. Does it need to run every 15 minutes, which is the default, or every hour, once a day, etc.? The same goes for the copy and restore schedules.

Cleanup

How long do you need to hold on to the backups? The default is three days, which in some cases can be quite long.

Setup

This is where it’s going to be fun. As an example I’ll be using the command with the minimum of parameters. To do this a couple of the requirements need to be met. Let’s go through these prerequisites:

  1. Setup the shares for the backup and copy destination
  2. Privileges for primary and secondary instance to the backup and copy shares are set
  3. Database(s) that need to be log shipped
  4. Set the database in read-only or no-recovery mode
  5. Compress or not to compress the backup
  6. Generate or use an existing full backup

The command below will log ship a database from instance SQL1 to SQL2. The backups will be compressed, and to initialize the secondary database a new full backup is generated.
No other settings are made. To make sure all the remaining settings are filled in, the -Force parameter is used to enable all the assumptions.
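The exact command from the post isn't preserved in this copy; a hedged sketch using the parameters described above (instance, database and share names are placeholders):

$params = @{
    SourceSqlInstance      = 'SQL1'
    DestinationSqlInstance = 'SQL2'
    Database               = 'DB1'
    BackupNetworkPath      = '\\SQL1\LogShipping\Backup'
    CopyDestinationFolder  = '\\SQL2\LogShipping\Copy'
    CompressBackup         = $true
    GenerateFullBackup     = $true
    Force                  = $true
}
Invoke-DbaLogShipping @params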

The result:

logshipping example force

This is only a fraction of the parameters available. The remaining parameters make it possible to configure your log shipping exactly the way you want it to be.

Execute the following command to get more information:
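Most likely that was simply the built-in help:

# full parameter documentation for the command
Get-Help Invoke-DbaLogShipping -Detailed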

The next blog will be about the command “Test-DbaLogShippingStatus“. It’s responsible for testing if all of your log shipped databases are in a synchronized state.