Let’s get all Posh! Log Shipping


This month’s T-SQL Tuesday is brought to us by my good friend Rob Sewell (b | t) and the subject is “Let’s get all Posh – What are you going to automate today?”

My post will be about log shipping, because setting up log shipping using the GUI is a very tedious task and doing it with T-SQL scripting is no fun either. So let’s see how we can do this with PowerShell.

 

 

Log shipping explained

If you’ve been a SQL Server DBA for a while you’ve probably heard about log shipping.

Log shipping makes it possible to increase a database’s availability by automatically backing up the transaction log, copying the backup file and restoring the backup file to another database instance (or to the same instance with a different database name).

Log shipping is an older technology but still has a purpose in lots of environments where it’s being used for reporting, ETL or to give developers and users access to production without the risk of them breaking anything.

Why use PowerShell

If you’ve ever had to set up log shipping in SQL Server Management Studio (SSMS) then you know how tedious it can be. I actually counted the number of clicks needed to set up a single database for log shipping and you need about 40 of them to get the job done.

So one day we decided to create a warm standby for one of our production servers. I had set up log shipping before and if you know what to do it’s not that difficult.
It becomes a problem when you have to set up log shipping for more than 20 databases ranging from 10 GB to 800 GB in size.
I ended up spending about a week setting it up (between other tasks; I do have more work to do) and for me that’s way too much time for something as simple as this.

So why use PowerShell? Because anything I can do with SSMS can be done with SMO. This would save a lot of time for me and for anyone else using the functionality.

How did I do it

I contacted Chrissy LeMaire, whose brainchild the dbatools PowerShell module is, and she was instantly excited to have this functionality within the module.

The first thing I had to do was find out what SQL Server does in the background to set up log shipping when you use the GUI. Yeah, I know using the GUI is bad when you use a lot of PowerShell, but sometimes I like to do it old school.

I quickly found out that SQL Server does a lot of things to set up log shipping. It creates jobs, job steps and job schedules, uses stored procedures to set values in specific tables, etc.
I went to the dbatools module and used the Find-DbaCommand function to see if there was any functionality to create the agent objects (jobs, steps and schedules), but it wasn’t there, so the first step was to create a full set of functions to manage those objects.
A couple of thousand lines later I finished the agent functionality and could continue developing the log shipping function.

That function started pretty small, but like the agent functionality, quickly grew to be some sort of monster with over 70 different parameters that would support every kind of situation.

After a couple of hundred iterations it was finally committed to the development branch of the dbatools module to get tested. I have great respect for the people who tested that functionality because I didn’t envy them. I knew how complicated it was to build everything, and testing all of it must have been a lot of work. Thank you to all the testers who accepted the challenge.

The solution

The entire log shipping functionality is now split across five functions. Four of them are used internally and are not exposed as public functions, because you can easily break things if they’re not used correctly.

The main function, Invoke-DbaLogShipping, is available in the dbatools module for anyone to use.

If you open the GUI for log shipping you have lots of choices, but most of them are already supplied by the GUI itself and you can decide whether to use them or not.
The whole idea behind the functionality was that it would allow you to quickly set up log shipping using a lot of defaults, like you can in the GUI, but if you’re more experienced you can change any setting to your preference.

It will take you about a minute to set all the parameters and after that the only thing that will slow you down is the size of your database files when you use a restore.

Enough talk, I want to see some action!

I will demonstrate how to set up log shipping using the least amount of parameters. You actually only need five parameters to make it work:

  1. SourceSqlInstance – Which instance needs to be used as the source server
  2. DestinationSqlInstance – Which instance needs to be used to log ship the database to
  3. BackupNetworkPath – Where the backups need to be placed
  4. Database – Which database, or databases, need to be log shipped
  5. GenerateFullBackup or UseExistingFullBackup – Generate a full backup or use an existing one

The command below will log ship a database from server1 to server2.
It will log ship one database called db1.
The network path for the backups is \\server1\logshipping\backup.
The backups will be created with backup compression.
No new backup will be created; instead it will use the last full backup.
In this example the suffix for the database is “LS” because I want to differentiate the log shipped databases.
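
A minimal sketch of what that call could look like is shown below. The parameter names for the compression and suffix options are my assumption, so check Get-Help Invoke-DbaLogShipping for the exact names; the rest come straight from the list above.

$params = @{
    SourceSqlInstance       = 'server1'
    DestinationSqlInstance  = 'server2'
    Database                = 'db1'
    BackupNetworkPath       = '\\server1\logshipping\backup'
    CompressBackup          = $true   # assumed parameter name
    UseExistingFullBackup   = $true
    SecondaryDatabaseSuffix = 'LS'    # assumed parameter name
}
Invoke-DbaLogShipping @params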

Below is the result of the log shipping.

What else can be done with this function?

When I developed this function it went through a lot of iterations. I’m the kind of developer who has a good idea of what to build and, during the process, keeps thinking of other functionality that needs to go in.
I had a lot of iterations where I constantly thought of new ideas that would make other people’s work easier.

A couple of examples:

  • To monitor or not to monitor log shipping
  • Enabled or disabled log shipping jobs
  • Naming convention of the jobs
  • Standby or read-only mode for the secondary database
  • To compress or not to compress the backup files
  • Generate a full backup for initialization
  • Log ship multiple databases in one command. This will be executed serially.
  • Force every setting and discard anything that’s there

The last one needs a little more explanation. I want this function to be really easy to use and to work even when there are existing settings in place. The -Force parameter is a really powerful parameter which will assume a lot of the defaults, but will also overwrite existing settings like jobs, schedules, database settings, etc.

Conclusion

I will never ever use the GUI to set up log shipping again and I hope you won’t either. It’s just way too much work, and I’ve had it fail on me lots of times, not knowing what went wrong and having to set everything up from scratch.

There is more functionality coming for the log shipping functions, like status checks and failover initialization.

If you find that something is missing, works inefficiently or doesn’t work at all (I hope not), please let us know. You can find us on Slack, GitHub, Twitter, YouTube and LinkedIn. I’m on the Slack channel most of the time, so if you have any questions you know where to find me.

The last thing I want to say is that dbatools is a community-driven module that enables database administrators to work more efficiently. It’s because of the hard work of all the volunteers that this module is such a success.
It’s a group of very hard-working and smart people who I have great respect for. It is because of the enthusiasm of all those people, and especially Chrissy, that I love working on the module.

Thank you again, Rob, for hosting this month’s T-SQL Tuesday, it was really fun.

Create Your Own PowerShell Profile


Being a fan of automation, I like to use my own PowerShell profile. It enables me to load various settings that would normally take more time to set up. The PowerShell profile resides in your home directory, and if you work in an AD environment with roaming data you’ll have the same profile on every computer.

PowerShell profiles are not new and date back to PowerShell v2.0. Other people have written about this subject before, but I wanted to share my take on it.

Create the profile

To check if you have a profile execute the following command:

Profile Test Existence
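
The check comes down to a single line:

# Returns True if a profile script already exists for the current user and host
Test-Path $PROFILE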

If this returns false it means you don’t have a profile and you have to create one. The easiest way to do that is by executing the following:

Profile Create
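
Something along these lines does the trick:

# Create the profile file (and any missing folders) if it isn't there yet
if (-not (Test-Path $PROFILE)) {
    New-Item -Path $PROFILE -ItemType File -Force
}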

From this point on you can put in anything that helps you speed up.

PowerShell drives

I have several directories which I use regularly. Instead of having to type the entire path I can create a PowerShell drive.

Profile PS-Drive Example
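
For example (the path is just a placeholder for one of your own folders):

# Map a frequently used folder to its own drive
New-PSDrive -Name Scripts -PSProvider FileSystem -Root 'D:\Work\Scripts'

# Navigate to it like any other drive
Set-Location Scripts: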

Now, instead of having to type the entire path, I can simply use the new drive. This not only speeds up the way you enter your directory but also simplifies the reference to your current working directory.

Credentials

The next thing I do: because I’m part of a domain I usually need another user account to access my database servers, so I call the next piece of code to enter the credentials for my administrative account:

Get-Credential Window
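
In essence it is nothing more than:

# Prompt once at start-up and keep the credential around for the session
$Credential = Get-Credential -Message 'Administrative account'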

You can pass the $Credential variable to other functions that need more privileges. For instance, if you use dbatools, a lot of the commands have a credential parameter to make sure you connect to the SQL Server instance with the right credential.

Color schemes

Color schemes are important. Certain fonts and colors simply make working in a console easier on the eyes.

I don’t like the default colors for errors and warnings; if I have to go through a lot of them, the bright red on the blue background is hard to read.

Profile Color Example
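
In the classic console host these colors can be changed through the host’s PrivateData; a sketch (pick whatever colors read best for you):

# Tone down the error and warning colors
$Host.PrivateData.ErrorForegroundColor   = 'DarkRed'
$Host.PrivateData.ErrorBackgroundColor   = 'Black'
$Host.PrivateData.WarningForegroundColor = 'DarkYellow'
$Host.PrivateData.WarningBackgroundColor = 'Black'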

At least I can read this better than the original color scheme. Play around with this and you’ll see it makes handling messages from the console a lot easier.

Window and buffer size

It’s possible to change these values to make sure all your data fits inside your console.
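
A sketch for the classic console host; the numbers are just examples, and since the window can never be wider than the buffer, the buffer is set first:

$bufferSize        = $Host.UI.RawUI.BufferSize
$bufferSize.Width  = 120
$bufferSize.Height = 5000
$Host.UI.RawUI.BufferSize = $bufferSize

$windowSize        = $Host.UI.RawUI.WindowSize
$windowSize.Width  = 120
$windowSize.Height = 50
$Host.UI.RawUI.WindowSize = $windowSize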

Setting the location

For me it’s not always a good thing that the working directory is set to my home folder, which happens by default when you open a PowerShell console. If you open it with administrative privileges it points to the System32 folder of your Windows directory.
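
So the profile ends with a Set-Location to a folder of my own choosing (the path is an example):

# Start every session in the folder I actually work from
Set-Location -Path 'D:\Work\Scripts'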

Conclusion

In the end you can load all kinds of settings in your profile to make your life a little easier. From the settings above to creating aliases, starting other scripts etc.

I hope this was informative for you and maybe you’ll get started with profiles too.

Testing Log Shipping with PowerShell and Pester

Standard

Thanks to my good friend Rob Sewell (b | t) I got into testing PowerShell with Pester.
He showed me how to use the Pester module to test just about anything you can test with PowerShell.

In my environment I have lots of databases being log shipped. To see the current status of the log shipping I either have to execute a procedure or run a standard report from SSMS. And I know there are other solutions, but in most cases it comes back to these two.

The procedure returns all the databases, the primary and the secondary, with the values for last backup time, copy time, restore time and the thresholds. This doesn’t show me the problems right away, but it’s fast.
The report shows me in nice green and red colors what’s good and what’s not, but I’d have to run it for every server manually. So it’s pretty, but it’s slow.

Why isn’t there anything that has the best of both worlds where I have the speed of the query with the clear results of the report?

That’s where Pester comes in!

I can use Pester to test the values that come back from the procedure and do basically the same thing the report does.

Besides the values for last backup time, copy time, restore time and the thresholds, the procedure also returns an overall value called “status”, which shows a 0 when everything is fine and a 1 when something is wrong.

I wanted to keep the check as simple as possible. If everything is fine, I don’t need to go into much detail. If there is something wrong I want to know which part of the log shipping failed, like the backup, copy or restore.

The Overall Test

This test only looks at the marker the log shipping sets to indicate whether everything is OK or not.
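
A rough sketch of what such an overall test could look like, assuming Pester 4+, the dbatools Invoke-DbaQuery command and the standard sp_help_log_shipping_monitor procedure (whose result set contains a database_name and the status column mentioned above):

$sqlInstance = 'server2'   # example instance name
$results = Invoke-DbaQuery -SqlInstance $sqlInstance -Database master `
    -Query 'EXEC sp_help_log_shipping_monitor'

Describe "Log shipping overall status on $sqlInstance" {
    foreach ($result in $results) {
        It "database $($result.database_name) should not report a problem" {
            $result.status | Should -Be 0
        }
    }
}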

Log Shipping Pester Overall

Uh oh! Something failed. Good thing we have our test.

The Detailed Test

The detailed test goes into the detail of the last backup, last copy and last restore time. We want to know what went wrong and where.

In this case I only want to zoom in on the database with the failed test so I use the -Database parameter to give the name of the database.
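
Assuming the tests live in a script file that accepts a -Database parameter, Pester can pass that value through when invoking it (the file name here is just an example):

# Run the detailed tests only for the database that failed the overall check
Invoke-Pester -Script @{
    Path       = '.\LogShippingDetailed.Tests.ps1'
    Parameters = @{ Database = 'db1' }
}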

Log Shipping Pester Detailed Single DB

Of course it’s possible to execute the detailed test for all the databases. Just remove the database parameter and the same tests are run for all the databases.

Log Shipping Pester Detailed All DB

The entire test took less than two seconds and made it possible for me to quickly see if everything is fine with the log shipping.

The code can be downloaded from GitHub. I hope this helps you out and makes your life a little easier, as it did for me.

 

 

Why I love dbatools


I’ve been working on the dbatools project for a while now and I felt like telling you why I love this project.

A little background about me: I’m not a full-time programmer. I learned programming with Java years ago and did little personal projects with PHP, C# and a couple of other languages. I started with PowerShell about 7 years ago and thought I was capable of delivering solid code. That all changed with dbatools.

Being part of the team

For the last year I’ve been part of the project, from debugging code and adding new features to existing functions to writing new functions.

A world opened up when I first joined the project. I had no idea I would be learning this much in such a short time. I had to deal with Git, QA, advanced modules, styling, Pester, etc.

My first command was a little function called Find-DbaOrphanedFile. One time Chrissy LeMaire asked if someone could make a function to find orphaned database files. I jumped in because I knew this was something I could do and I hadn’t yet had the chance to help out with the project. In about two weeks I had the function done, as far as I knew. It did the job and now I wanted to present my awesome code to the other developers.

My first challenge was dealing with Git. I had never used Git and the only source control my company had at the time was Visual SourceSafe. Don’t judge me! I wasn’t the one who decided to use an out-of-date piece of software. Of course when you do things for the first time you’re going to fail, and I failed big time. I made a branch from the wrong fork, committed stuff but didn’t synchronize it, created a pull request (PR) against the wrong branch and more. I did everything you could possibly do wrong, and still Chrissy was nice as always, trying to help me get everything back on track.

After the Git hurdle I finally submitted the PR, and after about a day I got a kind but long comment back from one of the members who did the QA. Before I started I had read some of the standards the project put in place, but as a developer you want to get started and as a result I forgot some of them. The funny thing was that I learned more about PowerShell, modules, functions, standards, etc. in that one comment than I had in the previous 4 years.

What struck me was the way the members dealt with people like me who weren’t familiar with a more professional way of developing. The members understand that if they had reacted the wrong way, I would’ve quit helping out with the project because it would have been too overwhelming.

That’s one of the strengths of the project: it embraces everyone who wants to help out and finds a way to make everyone a functional member of the team, whether that’s as a developer, doing QA, writing articles, etc.

That made me more enthusiastic about the project and I started to contribute more. Now I’ve become one of the major contributors.

In the last year I learned more about PowerShell than I did in all my years of doing PowerShell before it. I’ve become more precise when it comes to my code, I go over my tests in a meticulous way and I try to keep to my coding standards. I looked back at some code I’d written over the years and imagined that some crazed-out monkey with a brain fart, high on crack, made it. Now I go through all the code I’ve written over the years and redo everything that’s no longer up to my new standards.

Being a user of the module

The other reason I love dbatools is that it has made my life so much easier. I see myself as one of the lazy DBAs who would rather spend a couple of hours automating their work than having to do the same thing over and over again. The project has about 200 different functions and it’s close to releasing version 1.0. This is a big deal due to a lot of standardizing, testing and new functions that are going to be released. With that amount of functionality in one single module, there is bound to be a solution in there to make your work easier. Nowadays I test my backups every day using the Test-DbaLastBackup function. I see the status of all the backups on all my database servers within seconds. I retrieve information about many servers without having to go through each one of them. And migrations have been a blast.

If you aren’t excited about the project yet, please visit the website and look at what has already been accomplished. Go see all the commands and then decide if it’s worth implementing in your daily work. If you’re wondering whether there is a command that could help you out, the command index can help you find more information. It’s still in beta, but we’re working on getting all the information in there.

Thank you!

I want to thank the dbatools team for making me feel welcome and for boosting my knowledge to a point where it has made a significant impact on the way I do things in my daily life.
I also want to thank all the contributors who put in the effort to get the project where it is today. Without all the people putting in their valuable time this wouldn’t have worked.

Testing of backups updated


Last week I showed how you can test your backups using the Test-DbaLastBackup function in the dbatools module.

The script makes it easier to iterate through multiple servers, export the data and send the results to you by e-mail.

My good friend Rob Sewell wrote a nice post to take the Test-DbaLastBackup function a little further. In his example he uses COM objects to create the Excel document. I personally don’t like COM objects because they’re slow and you need the application installed on the machine running the script. Although COM objects have these disadvantages, they’re pretty solid and work smoothly most of the time.

I wasn’t satisfied with my script exporting to CSV, because I would open the CSV file with Excel anyway, so why not export it to an Excel file right away? I usually use the ImportExcel module to export data to Excel.
There was also a need to add the data and log directories to the command, to make sure you can use other drives to temporarily store the database files.

To run the script you need the dbatools and ImportExcel module.

The last thing that was done is that this function is now available on GitHub for anyone to download via this link.

The function can also be seen below:
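
In broad strokes it comes down to something like the sketch below; server names, paths and mail settings are placeholders, and this is not the actual function from the repository:

Import-Module dbatools, ImportExcel

$excelFile = "C:\Temp\backuptest_$(Get-Date -Format 'yyyyMMdd').xlsx"

foreach ($instance in 'server1', 'server2') {
    # Restore the last backups on a scratch drive and run DBCC against them
    Test-DbaLastBackup -SqlInstance $instance `
        -DataDirectory 'E:\TempRestore\Data' -LogDirectory 'E:\TempRestore\Log' |
        Export-Excel -Path $excelFile -WorksheetName $instance -AutoSize
}

# Mail the workbook with the results
Send-MailMessage -From 'dba@contoso.com' -To 'dba@contoso.com' `
    -Subject 'Backup test results' -Attachments $excelFile -SmtpServer 'smtp.contoso.com'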

 

 

Testing your backups with dbatools

Backup Yourselves Data Loss Is Coming

 

It has always been said: you’re only as good as your last restore, not your last backup. How many of you make your backups and just assume that everything is OK? There comes a time when you have to restore your database from a backup and you find out that one or more backup files are corrupt.

Testing your backups is a tedious job and it takes a lot of time, which I as a DBA don’t have. I don’t have the time to restore a database and run a DBCC check for every database that’s backed up.

There is a solution and it’s called “Test-DbaLastBackup” which is part of the dbatools module.

It goes through the following steps:

  1. Restore the database with the name “dbatools-testrestore-[databasename]” by default. The prefix can be changed.
  2. Run a DBCC check on the database
  3. Remove the database
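
Running it for a single instance is as simple as this (the instance name is an example; Get-Help Test-DbaLastBackup shows the options, such as changing the prefix):

# Restore, check and clean up the most recent backup of every database on the instance
Test-DbaLastBackup -SqlInstance server1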

Test Backup Progress

You’ll see a progress bar showing how far along the database restore is.
After the restore, the DBCC command is executed. You won’t see any progress for that step.

When the entire process is complete the command will output the results:

Test Backup Progress

But for me that’s not enough. This is one database, and some of my servers have more than 20 databases on them, with sizes ranging from 50 GB to 500 GB (not that large yet, but large enough). I want to create a job that executes the test on all the databases and sends the results to me.

It’s not going to be a SQL Server Agent job but a Windows Scheduled Task. I don’t need the SQL Server Agent for this and it makes more sense to do this outside of SQL Server.

To start testing I created a file called “backuptest.ps1” and put in the following code:
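
The original code has since been replaced by the updated version mentioned further down, but it was roughly along these lines (the server name and output path are examples):

# backuptest.ps1 - a minimal sketch
try {
    Import-Module dbatools

    # Test the last backups and write the results to a file for later review
    Test-DbaLastBackup -SqlInstance 'server1' |
        Export-Csv -Path 'C:\Temp\backuptest.csv' -NoTypeInformation
}
catch {
    # Keep the error so a failed run can be investigated
    $_ | Out-File -FilePath 'C:\Temp\backuptest_error.txt' -Append
}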

I added a try/catch block to make sure I would be able to catch what went wrong.

If you don’t know how to execute a PowerShell script from the Windows Task Scheduler please read the following blog post: Use the Windows Task Scheduler to Run a Windows PowerShell Script.

After the setup my action looks like this:

Test Backup Task
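
In essence the action just runs powershell.exe with the script as an argument, something like this (the path is an example):

powershell.exe -ExecutionPolicy Bypass -NoProfile -File "C:\Scripts\backuptest.ps1"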

Make sure your task is set to run under an account that has privileges to access the SQL Server and write the file.

Test Backup Privs

A couple of things that could be added to this script:

  1. Execute the script over multiple servers
  2. Mail the results
  3. Add checks and error catching to make sure everything runs

Unfortunately the “Test-DbaLastBackup” command doesn’t allow us to supply multiple servers at once. I could copy the row that tests the backup for each server, but that’s not me. I want things to run in one go; repetitive things don’t work out for me.

I don’t want to log into my server and look up the results. I want them in my e-mail inbox when I check my e-mail in the morning. For that you can use the Send-MailMessage cmdlet.
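
Combining the loop over multiple servers with the e-mail could look something like the sketch below; the server names and mail settings are placeholders and the output property names may differ slightly between dbatools versions:

$allResults = foreach ($instance in 'server1', 'server2', 'server3') {
    Test-DbaLastBackup -SqlInstance $instance
}

# Turn the results into a simple HTML table for the mail body
$body = $allResults |
    ConvertTo-Html -Property SourceServer, Database, RestoreResult, DbccResult |
    Out-String

Send-MailMessage -From 'dba@contoso.com' -To 'dba@contoso.com' `
    -Subject 'Backup test results' -Body $body -BodyAsHtml -SmtpServer 'smtp.contoso.com'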

The original script is removed because it has been updated and is now available on GitHub via this link. Also check the new post about the updated version of the script.

Executing the script:

The script works well, although I would have liked to put in functionality like accepting values from the pipeline, etc.

I hope you can use this script to your advantage and that testing your backups is no longer something to avoid but something to embrace.

 

Export-DMVInformation Module Update

Last Friday I had the chance to show the Export-DMVInformation module to the Dutch PowerShell user group. After the presentation I got a couple of suggestions and wanted to put them into the module.

Changes:

  1. Possibility to execute the module using the pipeline
  2. Get all the databases in one statement by assigning the value “ALL” to the database parameter.
  3. Replaced messages with verbose output

Changes 1 and 2 reduce the amount of time needed to get information from multiple instances and multiple databases. Before, you had to execute the module in an external loop, once for each database. Now you can get the information from all the databases across multiple instances in a single line!

Change 3 gives you the ability to choose whether or not to show all the messages. Use the “-Verbose” switch to see the various messages.

An example of how the new module works is shown below:
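
For instance, piping a couple of instance names straight into the cmdlet (the instance names are examples, and I’m assuming the instance is what gets bound from the pipeline):

# Get the DMV information for all databases on two instances in one go
'server1', 'server2\sql2016' | Export-DMVInformation -database 'ALL' -Verbose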

Export DMV result

Please take a look how to install or use the module on the module page.
 
Happy exporting!

 

Automate Glenn Berry’s DMV Scripts With PowerShell


Last week I attended one of the 24 hours of PASS sessions by Glenn Berry about diagnosing performance problems with DMV’s. If you don’t know what the 24 hours of PASS is, it’s an initiative by PASS where there are 24 sessions, one session every hour for an entire day.

Glenn Berry is also known as Dr. DMV because he has made some genius scripts to analyze your instances and databases using the dynamic management views (DMVs).

I’ve used Glenn’s DMV scripts for years but always found them tedious to execute, because there are about 70 individual scripts that either query the instance or retrieve database information. Glenn did make it easier for you by creating Excel templates to save the information in.
There are separate scripts for each version of SQL Server, updated every month. Glenn only updates the versions for 2012 to 2016 with new features. The scripts are very well documented and even contain hints about how to solve some issues.

Like almost every DBA, I’m very lazy when it comes to repetitive tasks, and when something takes a couple of minutes to do manually I’ll make sure I don’t have to do it again by automating it.

So that’s what I did and what I created is a PowerShell module to export all the information from the DMV queries to an Excel sheet automatically.

The hardest part of building this module was parsing the entire script into individual queries. Fortunately Glenn has made this easy for me by placing a marker at the end of each query.

Note: because the scripts are only updated from version 2012 onward, I made some changes to the older scripts to make sure the files could be parsed. If you let the module download the DMV files you’ll have no issues executing it.

How it works

The module has one cmdlet called “Export-DMVInformation”.

The module is dependent on the module “ImportExcel” to make it possible to export the data to an Excel file. You can find the ImportExcel module here.

The module can be executed simply by supplying the name of the instance, but there are other options to make the end result more satisfying.

The cmdlet has the following parameters:

  1. instance: name of the instance. If it’s a default instance just use the server name
  2. database: the name of the database to query. If no name is given the master database will be used.
  3. username: The username in case sql authentication is needed
  4. password: The password that goes with the username if sql authentication is needed
  5. dmvlocation: The location of the dmv queries. The default is “My Documents\dmv\queries”
  6. destination: The destination where the Excel files need to be written to. The default is “My Documents\dmv\results”.
  7. excludeinstance: A flag to exclude the instance queries. Default is 0 or $false.
  8. querytimeout: Timeout in seconds for the query, so a single query can’t take up too much time. The default is 300 seconds (5 minutes).

The heart of the cmdlet is the DMV file, because without it nothing works. You don’t have to worry about which version needs to be used, because the cmdlet will find that out itself.
If there are no DMV files, or you couldn’t be bothered to download them, don’t worry. The cmdlet will initiate a download and save the DMV files for you to the default location set in the “dmvlocation” parameter.
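
A call that uses most of the parameters above could look like this (instance, database, paths and timeout are examples):

# Query a specific database and write the Excel files to a custom location,
# allowing slow queries up to 10 minutes
Export-DMVInformation -instance 'server1\sql2016' -database 'db1' `
    -destination 'D:\DMV\results' -querytimeout 600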

The result

A partial result of the cmdlet can be seen below:

Export-DMVInformation Result

When the cmdlet is finished there will be an Excel file with data for the instance and a separate file for the database.

How to install

The easiest method to install the module is by copying the code below and entering it in a PowerShell window:
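
The snippet comes down to something like the sketch below; the download URL is a placeholder, so use the actual link further down in this post:

# Download and unpack the module into the user's module folder, then load it
$url  = '<download-link-from-this-post>'   # placeholder, not a real URL
$dest = "$HOME\Documents\WindowsPowerShell\Modules\Export-DMVInformation"

Invoke-WebRequest -Uri $url -OutFile "$env:TEMP\Export-DMVInformation.zip"
Expand-Archive -Path "$env:TEMP\Export-DMVInformation.zip" -DestinationPath $dest -Force
Import-Module Export-DMVInformation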

Alternative installation method

Alternatively you can download the module from here.

Unzip the file.

Make a directory (if not already present) named “Export-DMVInformation” in one of the following standard PowerShell Module directories:

  • $Home\Documents\WindowsPowerShell\Modules (%UserProfile%\Documents\WindowsPowerShell\Modules)
  • $Env:ProgramFiles\WindowsPowerShell\Modules (%ProgramFiles%\WindowsPowerShell\Modules)
  • $Systemroot\System32\WindowsPowerShell\v1.0\Modules (%SystemRoot%\System32\WindowsPowerShell\v1.0\Modules)

Place both the “psd1” and “psm1” files in the module directory created earlier.

Execute the following command in a PowerShell command screen:

Import-Module Export-DMVInformation

To check if the module is imported correctly execute the following command:

Get-Command -Module Export-DMVInformation or Get-Module -Name Export-DMVInformation

If you see a list with the functions, then the module is installed successfully. If nothing shows up, something has gone wrong.

I hope you enjoy the module and that it can help you to get performance information fast.