Let’s get all Posh! Log Shipping


This month’s T-SQL Tuesday is brought to us by my good friend Rob Sewell (b | t) and the subject is “Let’s get all Posh – What are you going to automate today?”

My post will be about log shipping, because setting up log shipping using the GUI is a very tedious task and doing it with T-SQL scripting is no fun either. So let’s see how we can do this with PowerShell.

Log shipping explained

If you’ve been a SQL Server DBA for a while, you’ve probably heard of log shipping.

Log shipping makes it possible to increase a database’s availability by automatically backing up the transaction log, copying the backup file, and restoring it to another database instance (or the same instance with a different database name).

Log shipping is an older technology, but it still has a purpose in lots of environments, where it’s used for reporting, ETL, or to give developers and users access to production data without the risk of them breaking anything.

Why use PowerShell

If you’ve ever had to set up log shipping in SQL Server Management Studio (SSMS), then you know how tedious it can be. I actually counted the number of clicks to set up a single database for log shipping: you need about 40 of them to get the job done.

So one day we decided to create a warm standby for one of our production servers. I had set up log shipping before, and if you know what to do it’s not that difficult.
It becomes a problem when you have to set up log shipping for more than 20 databases ranging from 10 GB to 800 GB in size.
I ended up spending about a week setting it up (between other tasks; I do have more work to do), and for me that’s way too much time for something as simple as this.

So why use PowerShell? Because anything I can do with SSMS can be done with SMO (SQL Server Management Objects). This would save a lot of time for me and anyone using the functionality.

How did I do it

I contacted Chrissy LeMaire, the original creator of the dbatools PowerShell module, and she was instantly excited to have this functionality in the module.

The first thing I had to do was find out what SQL Server does in the background to set up log shipping when you use the GUI. Yeah, I know, using the GUI is bad when you use a lot of PowerShell, but sometimes I like to do it old school.

I quickly found out that SQL Server does a lot of things to set up log shipping. It creates jobs, job steps, and job schedules, uses stored procedures to set values in specific tables, and so on.
I went to the dbatools module and used the Find-DbaCommand function to see if there was any functionality to create the agent objects (jobs, steps, and schedules), but there wasn’t, so the first step was to create a full set of functions to manage those objects.
A couple of thousand lines later I finished the agent functionality and could continue developing the log shipping function.
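That search looked something like this (the pattern is just an example):

    # Search the dbatools module for agent-related commands
    Find-DbaCommand -Pattern "agent"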

That function started pretty small, but like the agent functionality, quickly grew to be some sort of monster with over 70 different parameters that would support every kind of situation.

After a couple of hundred iterations it was finally committed to the development branch of the dbatools module to get tested. I have great respect for the people who tested that functionality, because I didn’t envy them: I knew how complicated it was to build everything, and testing all of it must have been a lot of work. Thank you to all the testers who accepted the challenge.

The solution

The entire log shipping functionality is now split across five functions. Four of them are used internally and are not visible as public functions, because you can easily break stuff if they’re not used correctly.

The main function, Invoke-DbaLogShipping, is available in the dbatools module for anyone to use.

If you open the log shipping GUI you have lots of choices, but most of them are already supplied with defaults and you can decide whether to use them or not.
The whole idea behind the functionality was that it would allow you to quickly set up log shipping using a lot of defaults, like you can in the GUI, but if you’re more experienced you can change any setting to your preference.

It will take you about a minute to set all the parameters, and after that the only thing that will slow you down is the size of your database files when you restore.

Enough talk, I want to see some action!

I will demonstrate how to set up log shipping using the least amount of parameters. You actually need just five to make it work (the last one is a choice between two):

  1. SourceSqlInstance – The instance that needs to be used as the source server
  2. DestinationSqlInstance – The server the database needs to be log shipped to
  3. BackupNetworkPath – Where the backups need to be placed
  4. Database – The database, or databases, that need to be log shipped
  5. GenerateFullBackup or UseExistingFullBackup – Generate a full backup or use an existing one

The command below will log ship a database from server1 to server2.
It will log ship one database called db1.
The network path for the backups is \\server1\logshipping\backup.
The backups will be created with backup compression.
No new backup will be created; instead it will use the last full backup.
In this example the suffix for the database is “LS” because I want to differentiate the log shipped databases.
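Put together, the call looks roughly like this (a sketch based on the parameters described above; check Get-Help Invoke-DbaLogShipping for the exact parameter names in your version of dbatools):

    # Log ship db1 from server1 to server2 with compressed backups,
    # reusing the last full backup and suffixing the secondary database with "LS"
    $params = @{
        SourceSqlInstance       = 'server1'
        DestinationSqlInstance  = 'server2'
        Database                = 'db1'
        BackupNetworkPath       = '\\server1\logshipping\backup'
        CompressBackup          = $true
        UseExistingFullBackup   = $true
        SecondaryDatabaseSuffix = 'LS'
    }
    Invoke-DbaLogShipping @params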

The result is a log shipped copy of the database on server2, suffixed with “LS”, together with the jobs that take care of the backup, copy, and restore.

What else can be done with this function?

When I developed this function it went through a lot of iterations. I’m a developer who has a good idea of what to build and then, during the process, thinks of other functionality that needs to go in.
There were a lot of iterations where I constantly thought of new ideas that would make other people’s work easier.

A couple of examples:

  • To monitor or not to monitor log shipping
  • Enabled or disabled log shipping jobs
  • Naming convention of the jobs
  • Standby or read-only mode for the secondary database
  • To compress or not to compress the backup files
  • Generate a full backup for initialization
  • Log ship multiple databases in one command; these are processed serially
  • Force every setting and discard anything that’s there

The last one needs a little more explanation. I want this function to be really easy to use, and I want it to work even when there are existing settings. The -Force parameter is a really powerful parameter that assumes a lot of the defaults but also overwrites existing settings like jobs, schedules, database settings, and so on.
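Using the same hypothetical setup as before, that looks like this:

    # Rerun the setup and overwrite any existing log shipping objects
    Invoke-DbaLogShipping @params -Force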

Conclusion

I will never ever use the GUI to set up log shipping again, and I hope you won’t either. It’s just way too much work, and I’ve had it fail on me lots of times, not knowing what went wrong and having to set everything up from scratch.

There is more functionality coming for the log shipping functions, like status checks and failover initialization.

If you find that something is missing, works inefficiently, or doesn’t work at all (I hope not) in this function, please let us know. You can find us on Slack, GitHub, Twitter, YouTube, and LinkedIn. I’m on the Slack channel most of the time, so if you have any questions you know where to find me.

The last thing I want to say is that dbatools is a community-driven module that enables database administrators to work more efficiently. It’s because of the hard work of all the volunteers that this module is such a success.
It’s a group of very hardworking and smart people whom I have great respect for. It is because of the enthusiasm of all those people, and especially Chrissy, that I love working on the module.

Thank you again Rob for hosting this month’s T-SQL Tuesday, it was really fun.

Create Your Own PowerShell Profile


Being a fan of automation, I like to use my own PowerShell profile. It enables me to load various settings that would otherwise take more time to set up by hand. The PowerShell profile resides in your home directory, and if you work in an AD environment with roaming profiles you’ll have the same profile on every computer.

PowerShell profiles are not new and date back to PowerShell v2.0. Other people have written about this subject before, but I wanted to share my take on it.

Create the profile

To check if you have a profile, execute the following command:

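The check uses the built-in $PROFILE variable:

    # Returns $true if a profile script already exists for the current user and host
    Test-Path $PROFILE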

If this returns false, it means you don’t have a profile and you have to create one. The easiest way to do that is by executing the following:

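    # Create the profile file (New-Item -Force also creates missing parent folders)
    if (-not (Test-Path $PROFILE)) {
        New-Item -Path $PROFILE -ItemType File -Force | Out-Null
    }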

From this point on you can put in anything that helps you speed up.

PowerShell drives

I have several directories that I use regularly. Instead of having to type the entire path each time, I can create a PowerShell drive.

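Something like this, where the path is just an example:

    # Map a long path to a short drive name for the current session
    New-PSDrive -Name Scripts -PSProvider FileSystem -Root 'D:\Projects\PowerShell\Scripts' | Out-Null

    # Jump to it without typing the full path
    Set-Location Scripts: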

Now, instead of having to type the entire path, I can simply use the new drive. This not only speeds up the way you enter your directory but also simplifies the reference to your current working directory.

Credentials

The next thing I do applies when I’m part of a domain: I usually need a separate user account to access my database servers, so I call the next piece of code to enter the credentials for my administrative account:

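    # Prompt once and keep the credential for the rest of the session
    $Credential = Get-Credential -Message 'Administrative account'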

You can pass $Credential to other functions that need more privileges. For instance, if you use dbatools, a lot of the commands have a credential parameter to make sure you connect to the SQL Server instance with the right credential.
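For example, with dbatools (the instance name is just an example):

    # Connect to an instance with the administrative credential
    Get-DbaDatabase -SqlInstance server1 -SqlCredential $Credential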

Color schemes

Color schemes are important. There have been studies showing that certain fonts and colors make working in a console easier on the eyes.

I don’t like the default colors for errors and warnings, and if I have to go through a lot of them, the bright red on the blue background is hard to read.

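In the classic console host the colors can be changed through $Host.PrivateData; the values below are just my preference:

    # Tone down errors and warnings so they are easier to read
    $Host.PrivateData.ErrorForegroundColor   = 'Yellow'
    $Host.PrivateData.ErrorBackgroundColor   = 'Black'
    $Host.PrivateData.WarningForegroundColor = 'Cyan'
    $Host.PrivateData.WarningBackgroundColor = 'Black'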

At least I can read this better than the original color scheme. Play around with this and you’ll see it makes handling messages from the console a lot easier.

Window and buffer size

It’s possible to change these values to make sure all your data fits inside your console.
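For example, in the classic console host (the values are examples; the buffer must be at least as wide as the window):

    # Widen the screen buffer first, then the window
    $buffer = $Host.UI.RawUI.BufferSize
    $buffer.Width = 200
    $buffer.Height = 5000
    $Host.UI.RawUI.BufferSize = $buffer

    $window = $Host.UI.RawUI.WindowSize
    $window.Width = 150
    $Host.UI.RawUI.WindowSize = $window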

Setting the location

For me it’s not always a good thing that the working directory is set to my home folder, which happens by default when you open a PowerShell console. If you open it with administrative privileges, it points to the system32 folder of your Windows directory.
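For example, using the drive created earlier:

    # Start every session in the scripts folder instead of the default location
    Set-Location Scripts: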

Conclusion

In the end you can load all kinds of settings in your profile to make your life a little easier: from the settings above to creating aliases, starting other scripts, and so on.

I hope this was informative for you and maybe you’ll get started with profiles too.

Testing Log Shipping with PowerShell and Pester


Thanks to my good friend Rob Sewell (b | t) I got into testing with PowerShell and Pester.
He showed me how to use the Pester module to test just about anything you can test with PowerShell.

In my environment I have lots of databases being log shipped. To see the current status of the log shipping, I either have to execute a procedure or run a standard report from SSMS. I know there are other solutions, but in most cases it comes back to these two.

The procedure returns all the databases, both the primary and the secondary, with the values for the last backup time, copy time, restore time, and the thresholds. It doesn’t show me the problems right away, but it’s fast.
The report shows me in nice green and red colors what’s good and what’s not, but I’d have to run it for every server manually. So it’s pretty, but it’s slow.

Why isn’t there anything that has the best of both worlds where I have the speed of the query with the clear results of the report?

That’s where Pester comes in!

I can use Pester to test the values that come back from the procedure and do basically the same thing the report does.

As mentioned, the procedure returns all the databases with the last backup, copy, and restore times and their thresholds.
There is also an overall value called “status”, which shows a 0 when everything is fine and a 1 when something is wrong.

I wanted to keep the check as simple as possible. If everything is fine, I don’t need to go into much detail. If there is something wrong, I want to know which part of the log shipping failed: the backup, the copy, or the restore.

The Overall Test

This test only looks at the marker log shipping sets to indicate whether everything is OK or not.

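A stripped-down version of that idea looks something like this (a sketch that assumes sp_help_log_shipping_monitor as the data source, the dbatools and Pester v4 modules loaded, and an example instance name):

    # Get the log shipping monitor data; the status column is 0 when everything is fine
    $results = Invoke-DbaQuery -SqlInstance server1 -Database master -Query 'EXEC sp_help_log_shipping_monitor'

    Describe 'Log shipping overall status' {
        foreach ($result in $results) {
            It "database $($result.database_name) should have status 0" {
                $result.status | Should -Be 0
            }
        }
    }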

Uh oh! Something failed. Good thing we have our test.

The Detailed Test

The detailed test goes into the details of the last backup, last copy, and last restore times. We want to know what went wrong and where.

In this case I only want to zoom in on the database with the failed test, so I use the -Database parameter to supply the name of the database.

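A sketch of the detailed check for a single database, under the same assumptions as the overall test:

    $database = 'db1'

    # The monitor returns a row for the primary and one for each secondary
    $results = Invoke-DbaQuery -SqlInstance server1 -Database master -Query 'EXEC sp_help_log_shipping_monitor' |
        Where-Object database_name -EQ $database

    Describe "Log shipping details for $database" {
        foreach ($result in $results) {
            if ($result.is_primary) {
                It 'time since the last backup should be within the backup threshold' {
                    $result.time_since_last_backup | Should -BeLessThan $result.backup_threshold
                }
            }
            else {
                # The monitor only exposes a restore threshold for the secondary,
                # so the copy is checked against that as well
                It 'time since the last copy should be within the restore threshold' {
                    $result.time_since_last_copy | Should -BeLessThan $result.restore_threshold
                }
                It 'time since the last restore should be within the restore threshold' {
                    $result.time_since_last_restore | Should -BeLessThan $result.restore_threshold
                }
            }
        }
    }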

Of course it’s possible to execute the detailed test for all the databases. Just remove the -Database parameter and the same tests are run for every database.


The entire test took less than two seconds and made it possible for me to quickly see if everything is fine with the log shipping.

The code can be downloaded from GitHub. I hope this helps you out and makes your life a little easier, as it did for me.