Testing Log Shipping with PowerShell and Pester


Thanks to my good friend Rob Sewell (b | t) I got into testing with PowerShell and Pester.
He showed me how to use the Pester module to test just about anything you can test with PowerShell.

In my environment I have lots of databases being log shipped. To see the current status of the log shipping I either have to execute a procedure or run a standard report from SSMS. I know there are other solutions, but in most cases it comes back to these two.

The procedure returns all the databases, both the primary and the secondary, with the values for the last backup time, copy time, restore time and the thresholds. This doesn’t show me the problems right away, but it’s fast.
The report shows me in nice green and red colors what’s good and what’s not, but I’d have to run it for every server manually. So it’s pretty, but it’s slow.

Why isn’t there something that gives me the best of both worlds: the speed of the query combined with the clear results of the report?

That’s where Pester comes in!

I can use Pester to test the values that come back from the procedure and do basically the same thing the report does.

Besides those values, the procedure returns an overall value called “status”, which shows a 0 when everything is fine and a 1 when something is wrong.

I wanted to keep the check as simple as possible. If everything is fine, I don’t need to go into much detail. If there is something wrong I want to know which part of the log shipping failed, like the backup, copy or restore.

The Overall Test

This test only looks at the marker that log shipping sets to indicate whether everything is OK or not.
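As a rough sketch of the idea (this is not the author's actual code; it assumes the monitoring procedure is something like sp_help_log_shipping_monitor, uses Invoke-Sqlcmd from the SqlServer module and the classic Pester “Should Be” syntax), the overall test could look like this:

# Overall log shipping test - minimal sketch, procedure and instance name are assumptions
$SqlInstance = 'SQL01'   # replace with your own instance

Describe "Log shipping status on $SqlInstance" {
    $results = Invoke-Sqlcmd -ServerInstance $SqlInstance -Query "EXEC sp_help_log_shipping_monitor"

    foreach ($result in $results) {
        It "$($result.database_name) should have an overall status of 0" {
            $result.status | Should Be 0
        }
    }
}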

Log Shipping Pester Overall

Ooh oh! Something failed. Good thing we have our test.

The Detailed Test

The detailed test goes into the detail of the last backup, last copy and last restore time. We want to know what went wrong and where.

In this case I only want to zoom in on the database with the failed test so I use the -Database parameter to give the name of the database.
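Purely to illustrate the call (the test file name and parameter names below are assumptions, not the author's actual script), running the detailed test for a single database could look like this:

# Run the detailed tests for one database only - file and parameter names are illustrative
Invoke-Pester -Script @{
    Path       = '.\LogShippingDetailed.Tests.ps1'
    Parameters = @{ SqlInstance = 'SQL01'; Database = 'DB1' }
}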

Log Shipping Pester Detailed Single DB

Of course it’s possible to execute the detailed test for all the databases. Just remove the database parameter and the same tests are run for all the databases.

Log Shipping Pester Detailed All DB

The entire test took less than two seconds and made it possible for me to quickly see if everything is fine with the log shipping.

The code can be downloaded from GitHub. I hope this helps you out and makes your life a little easier, as it did mine.

 

 

T-SQL Tuesday: The times they are a-changing


This month’s T-SQL Tuesday is inspired by the blog post Will the Cloud Eat My DBA Job? by Kendra Little. Technology has changed a lot in the past years, especially with the cloud, globalization and automation. What impact has this had on your job? Do you feel endangered?

Over the years I’ve seen a lot of things change with SQL Server. I can remember that somebody told me when I started with SQL Server 2000 that T-SQL was never going to be a big thing. How wrong was that guy?!

I personally have not yet had the chance to do a lot with the cloud, like Azure SQL Database. I did get the chance to fiddle around with a trial period of Azure to see what I could do and to get a feel for the interface. Unfortunately this was just a trial, and after it ended I didn’t pick it up again. I should, because I like to learn new skills.

What impact has this had on your job?

Some things have had a great impact on my work. Take PowerShell, for instance. If I hadn’t started with it when it first came out, I would still be clicking around in SSMS like crazy. I like to automate everything that’s repetitive, because I want to spend my time efficiently on work that gives me energy instead of doing the same things over and over again.

At this moment the cloud has not had an impact on me. Past employers did not see any benefit in it at the time, and my current employer does not yet see a use for it either. Maybe something will change in the near future, but for now it’s not going to happen 🙁
I would really love to start working with the cloud and move some of our databases, because I think it is a valuable addition to traditional architectures.

Do you feel endangered?

No! And I’ll tell you why.

When I first started to work with servers and databases you had a physical server where you had to put in a CD or DVD and run the install from a console to get Windows Server installed. As soon as everything was set up you could remotely log in and do the rest of the work from the office.

The server room would look a little like this:

Servers in serverroom

Nowadays you, or your system administrator, log in to VMware or Hyper-V to create a new server, and if they work smart it’s done with cloning or scripting, so the server is ready within minutes.
The last time I physically touched a server to do database administration work was about 8 years ago. If you’re in the same situation, you already kind of work in a cloud-like environment.

The second reason I don’t feel endangered is that as a DBA I have to deal with everything within and around SQL Server. Bluntly put, a VM to me is nothing more than a box of resources where my instances do their work. I know people will disagree with me, but if you’re not the system administrator, the underlying hardware layer is invisible to you.

If your employer is a company with one or more database servers there will always be a need for a DBA. There will be performance issues, new installations, high availability, reporting etc etc.

Or do you have more exciting features/toys to work with?

In the last couple of years so much has changed in SQL Server that it’s impossible to comprehend everything.

I’m excited to work with SQL Server vNext on Linux for example. There are a lot of new features for Business Intelligence with Analysis Services and Power BI.

You can create stretched databases where parts of the database are in the cloud and others are on premise. Imagine a hybrid environment where parts of the network are locally/on premise and other parts are deployed in the cloud.

Microsoft has embraced PowerShell for SQL Server and has now made it open-source. How cool is that! There are more and more people developing in PowerShell and creating modules like dbatools.

There is so much new stuff that I have to cherry pick what to do first.

Do you embrace the change and learn new skills?

I embrace change and love to learn new skills.

The world is changing for the DBA and we as DBAs have to change with it. We’re no longer the strange guy in the cubicle who only shows up when something goes wrong. I see more and more situations where we have to be at the forefront, taking action and being visible to our colleagues.

If your employer wants to work in the cloud, don’t be afraid of it; embrace it and learn everything you can about it. If you have processes that are inefficient, automate and optimize them and make your life easier.

Be the one with the vision to get the company to a higher level and you’ll see that everything will work out.

 

 

Testing of backups updated


Last week I showed how you can test your backups using the Test-DbaLastBackup function in the dbatools module.

The script makes it easier to iterate through multiple servers, export the data and send the results to you by e-mail.

My good friend Rob Sewell wrote a nice post to take the Test-DbaLastBackup function a little further. In his example he uses COM objects to create the Excel document. I personally don’t like COM objects because they’re slow and you need the application installed on the machine running the script. Although COM objects have these disadvantages, they’re pretty solid and work smoothly most of the time.

I wasn’t satisfied with my script exporting to CSV, because I would open the CSV file with Excel anyway, so why not export it to an Excel file right away? I usually use the ImportExcel module to export data to Excel.
There was also a need to add the data and log directory to the command, to make sure you would be able to use other drives to temporarily store the database files.

To run the script you need the dbatools and ImportExcel modules.

The last thing that was done is that this function is now available on GitHub, for anyone to download via this link.

The function can also be seen below:
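The embedded code is no longer shown here, but the core of the idea comes down to something like the sketch below (this is not the full function from GitHub; the instance name, paths and parameter names are placeholders and may differ between dbatools versions):

# Sketch: test the last backups and write the results to an Excel file
# Requires the dbatools and ImportExcel modules
Import-Module dbatools, ImportExcel

$results = Test-DbaLastBackup -SqlInstance 'SQL01' `
                              -DataDirectory 'D:\TempData' `
                              -LogDirectory 'D:\TempLog'

$results | Export-Excel -Path 'C:\Temp\backuptest.xlsx' -WorksheetName 'SQL01' -AutoSize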

 

 

Testing your backups with dbatools

Backup Yourselves Data Loss Is Coming

 

It has always been said that you’re only as good as your last restore, not your last backup. How many of you make your backups and assume that everything is OK? Then there comes a time that you have to restore your database from a backup and you find out that one or more backup files are corrupt.

Testing your backups is a tedious job and it takes a lot of time, which I as a DBA don’t have. I don’t have the time to restore a database and run a DBCC check for every database that’s backed up.

There is a solution and it’s called “Test-DbaLastBackup” which is part of the dbatools module.

It goes through the following steps:

  1. Restore the database with the name “dbatools-testrestore-[databasename]” by default. The prefix can be changed.
  2. Run a DBCC check on the database
  3. Remove the database
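In its simplest form a single run looks something like this (the instance names are placeholders and parameter names may differ slightly between dbatools versions):

# Test the last backup of every database on one instance
Test-DbaLastBackup -SqlInstance 'SQL01'

# Optionally restore on another server and keep the temporary files on a different drive
Test-DbaLastBackup -SqlInstance 'SQL01' -Destination 'SQLTEST01' -DataDirectory 'D:\Temp'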

Test Backup Progress

You’ll see a progress bar showing how far along the database restore is.
After the restore, the DBCC check is executed. You’ll not see any progress for that step.

When the entire process is complete the command will output the results:

Test Backup Progress

But for me that’s not enough. This is one database, and some of my servers have more than 20 databases on them, with sizes ranging from 50 GB to 500 GB (not that large yet, but large enough). I want to create a job that executes the test on all the databases and sends the results to me.

It’s not going to be a SQL Server Agent job but a Windows Scheduled Task. I don’t need the SQL Server Agent for this and it makes more sense to do this outside of SQL Server.

To start testing I created a file called “backuptest.ps1” and put in the following code:
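The original gist is no longer embedded, but a minimal sketch of what backuptest.ps1 contained could look like this (the instance name and output paths are placeholders, not the author's values):

# backuptest.ps1 - minimal sketch, requires the dbatools module
try {
    Import-Module dbatools -ErrorAction Stop

    $results = Test-DbaLastBackup -SqlInstance 'SQL01'
    $results | Export-Csv -Path 'C:\Temp\backuptest.csv' -NoTypeInformation
}
catch {
    # Write the error to a file so the scheduled task leaves a trace when something fails
    $_ | Out-File -FilePath 'C:\Temp\backuptest_error.txt' -Append
}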

I added a try/catch block to make sure I would be able to catch what went wrong.

If you don’t know how to execute a PowerShell script from the Windows Task Scheduler please read the following blog post: Use the Windows Task Scheduler to Run a Windows PowerShell Script.

After the setup my action looks like this:

Test Backup Task

Make sure your task is set to run under an account that has privileges to access the SQL Server and write the file.

Test Backup Privs

A couple of things that could be added to this script:

  1. Execute the script over multiple servers
  2. Mail the results
  3. Add checks and error catching to make sure everything runs

Unfortunately the command “Test-DbaLastBackup” doesn’t allow us to supply multiple servers. I could copy the line that tests the backup for each server, but that’s not me. I want things to run in one go; repetitive things don’t work out for me.

I don’t want to log into my server and look up the results. I want them in my e-mail inbox when I check my e-mail in the morning. For that you could use the Send-MailMessage cmdlet.
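A rough sketch of how those two ideas could be combined (the server names, e-mail addresses and SMTP server are all placeholders):

# Run the backup test for a list of servers and mail the combined results
$servers = 'SQL01', 'SQL02', 'SQL03'

$results = foreach ($server in $servers) {
    Test-DbaLastBackup -SqlInstance $server
}

$body = $results | Format-Table -AutoSize | Out-String

Send-MailMessage -From 'dba@example.com' -To 'dba@example.com' `
                 -Subject 'Backup test results' -Body $body `
                 -SmtpServer 'smtp.example.com'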

The original script has been removed because it was updated and is now available on GitHub via this link. Also check the new post about the updated version of the script.

Executing the script:
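From a PowerShell prompt, or as the action of the scheduled task shown earlier, that comes down to something like the following (the path is a placeholder):

# Run the script manually once to verify it before scheduling it
powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\backuptest.ps1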

The script works well, although I would have liked to put in functionality like accepting values from the pipeline.

I hope you can use this script to your advantage and that testing your backups is no longer something to avoid but something to embrace.

 

Export-DMVInformation Module Update

Last Friday I had the chance to show the Export-DMVInformation module to the Dutch PowerShell user group. After the presentation I got a couple of suggestions and wanted to put them into the module.

Changes:

  1. Possibility to execute the module using the pipeline
  2. Get all the databases in one statement by assigning the value “ALL” to the database parameter.
  3. Replaced messages with verbose output

Changes 1 and 2 reduce the amount of time needed to get information from multiple instances and multiple databases. Before, you had to execute the module in an external loop for each instance and each database. Now you can get the information from all the databases across multiple instances in a single line!

Change 3 gives the ability to choose whether to show all the messages or not. Use the switch “-Verbose” to see the various messages.

An example of how the new module works is shown below:
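A call along these lines (the instance names are placeholders, not the author's exact command) exercises both the pipeline support and the “ALL” option:

# Pipe multiple instances into the cmdlet and export the DMV information for all their databases
'SQL01', 'SQL02\INST01' | Export-DMVInformation -database 'ALL'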

Export DMV result

Please take a look at the module page to see how to install and use the module.
 
Happy exporting!

 

My SQL Server journey with Linux


Linux, where did it all begin

A couple of years ago there was an April Fools’ joke on SQLServerCentral.com about SQL Server coming to Linux. Now we know this is no longer a joke and SQL Server on Linux is a fact.

I started to explore Linux around 1998 on my AMD K6 266 MHz PC. I bought Linux because back then a dial-up connection only did about 5 KB a second; if I had downloaded an entire distribution over that, the phone bill would have blown your mind.

My first Linux distro was Red Hat Linux 5.1 (Manhattan), and back then there was no nice GUI for the install, so it took me a while to get things started. It wasn’t like I had loads of documentation on the internet back then, so I did what I could using books and the small pieces of information I could find online.

I didn’t have the luxury of owning multiple computers like some classmates had, so I decided to do a dual-boot install next to my Windows 98 installation (oh yeah, I was really early to upgrade from Windows 95).

Nowadays I can download a Linux distro in a couple of minutes and within 30 minutes I have a virtual machine installed and ready to go.

Preparation

Where do we start? Let’s first download a distribution that’s supported by Microsoft. I decided to download Ubuntu Server 16.04 because I don’t need all the fancy GUI stuff that’s in the Desktop version and I don’t want to have too much software I’m never going to use.

I installed the distribution in a VM and installed the OpenSSH server to be able to connect to my VM using PuTTY. I use PuTTY because I want to copy and paste easily and I don’t like being connected to the console all the time.

Installing SQL Server

I could give you a complete overview of what I did to install SQL Server, but Microsoft has already done that for you.

Installing SQL Server using the “Install SQL Server on Linux” article from Microsoft is as boring as can be, which is a good thing. If you go through the setup from that page you’ll notice that it all just works.

Conclusion

Microsoft did a really good job providing SQL Server for Linux, because the installation was as boring as can be. I know, of course, that this is still a CTP and that most of the configuration was done for us, but this is still a really good introduction of SQL Server to a completely different platform.

So if you haven’t already tried to install SQL Server on Linux go try it out because it’s a piece of cake.

 

 

 

 

 

Automate Glenn Berry’s DMV Scripts With PowerShell


Last week I attended one of the 24 Hours of PASS sessions by Glenn Berry about diagnosing performance problems with DMVs. If you don’t know what 24 Hours of PASS is: it’s an initiative by PASS with 24 sessions, one session every hour for an entire day.

Glenn Berry is also known as Dr. DMV because he made some genius scripts to analyze your instances and databases using the dynamic management views (DMVs).

I’ve used Glenn’s DMV scripts for years but always found them tedious to execute, because there are about 70 individual queries that either retrieve instance or database information. Glenn did make it easier by creating Excel templates to save the information in.
There are separate scripts for each version of SQL Server, and they are updated every month. Glenn only updates the versions for 2012 to 2016 with new features. The scripts are very well documented and even contain hints about how to solve some issues.

Like almost every DBA, I’m very lazy when it comes to repetitive tasks, and when something takes a couple of minutes to do manually, I’ll make sure I don’t have to do it again by automating it.

So that’s what I did: I created a PowerShell module that exports all the information from the DMV queries to an Excel sheet automatically.

The hardest part of automating this was parsing the entire script into individual queries. Fortunately Glenn made this easy for me by placing a marker at the end of each query.
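Just to illustrate the approach (a simplified sketch; the file name and the marker pattern below are made up and differ from the real module), the parsing boils down to splitting the file on that marker:

# Split the diagnostic script into individual queries using the end-of-query marker
$script  = Get-Content -Path 'C:\dmv\queries\SQLServer2016DiagnosticQueries.sql' -Raw
$queries = $script -split '-- End of query --' | Where-Object { $_.Trim().Length -gt 0 }

Write-Output "Found $($queries.Count) individual queries"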

Note: Because the scripts are only updated from version 2012 and forward I made some changes to the older scripts to make sure the files could be parsed. If you let the script download the DMV files you’ll have no issues executing the module.

How it works

The module has one cmdlet called “Export-DMVInformation”.

The module is dependent on the module “ImportExcel” to make it possible to export the data to an Excel file. You can find the ImportExcel module here.

The module can be executed simply by supplying the name of the instance, but there are other options to make the end result more satisfying.

The cmdlet has the following parameters:

  1. instance: name of the instance. If it’s a default instance just use the server name
  2. database: the name of the database to query. If no name is given the master database will be used.
  3. username: The username in case sql authentication is needed
  4. password: The password that goes with the username if sql authentication is needed
  5. dmvlocation: The location of the dmv queries. The default is “My Documents\dmv\queries”
  6. destination: The destination where the Excel files need to be written to. The default is “My Documents\dmv\results”.
  7. excludeinstance: A flag to exclude the instance queries. Default is 0 or $false.
  8. querytimeout: Timeout in seconds to prevent a query from taking up too much time. The default is 300 seconds (5 minutes).
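For example, a call that uses a few of these parameters could look like this (the server, database and destination are placeholders):

# Query a named instance for a specific database and write the Excel files to a custom folder
Export-DMVInformation -instance 'SERVER1\INST01' `
                      -database 'AdventureWorks' `
                      -destination 'C:\Temp\dmv\results' `
                      -querytimeout 600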

The heart of the cmdlet is the DMV file, because without it the cmdlet doesn’t work. You don’t have to worry about which version needs to be used, because the cmdlet will figure that out itself.
If there are no DMV files, or you couldn’t be bothered to download them, don’t worry. The cmdlet will download the DMV files for you to the default location set in the parameter “dmvlocation”.

The result

A partial result of the cmdlet can be seen below:

Export-DMVInformation Result

When the cmdlet is finished there will be an Excel file with data for the instance and a separate file for the database.

How to install

The easiest method to install the module is by copying the code below and entering it in a PowerShell window:
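The original snippet is no longer embedded here, but as a sketch of what it does (the download link is a placeholder for the one on the module page, and Expand-Archive requires PowerShell 5 or later):

# Download the module and copy it to the user module path - sketch only
$url        = '<download link from the module page>'
$zipFile    = Join-Path $env:TEMP 'Export-DMVInformation.zip'
$modulePath = "$HOME\Documents\WindowsPowerShell\Modules\Export-DMVInformation"

Invoke-WebRequest -Uri $url -OutFile $zipFile
New-Item -Path $modulePath -ItemType Directory -Force | Out-Null
Expand-Archive -Path $zipFile -DestinationPath $modulePath -Force

Import-Module Export-DMVInformation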

Alternative installation method

Alternatively you can download the module from here.

Unzip the file.

Make a directory (if not already present) named “Export-DMVInformation” in one of the following standard PowerShell Module directories:

  • $Home\Documents\WindowsPowerShell\Modules (%UserProfile%\Documents\WindowsPowerShell\Modules)
  • $Env:ProgramFiles\WindowsPowerShell\Modules (%ProgramFiles%\WindowsPowerShell\Modules)
  • $Systemroot\System32\WindowsPowerShell\v1.0\Modules (%SystemRoot%\System32\WindowsPowerShell\v1.0\Modules)

Place both the “psd1” and “psm1” files in the module directory created earlier.

Execute the following command in a PowerShell command screen:

Import-Module Export-DMVInformation

To check if the module is imported correctly execute the following command:

Get-Command -Module Export-DMVInformation or Get-Module -Name Export-DMVInformation

If you see a list with the functions, then the module is installed successfully. If nothing happens, something has gone wrong.

I hope you enjoy the module and that it can help you to get performance information fast.

 

 

Enable changes that require table re-creation


The Problem

When you’re developing on an instance you might want to change something in a database where the change requires the table to be re-created. By default, SQL Server Management Studio (SSMS) prevents saving changes that require the table to be re-created.

Examples of changes that require table re-creation:

  • Changing a column to no longer allow NULL values
  • Adding a column before an existing column (in the middle of the table)
  • Moving a column

In earlier editions it was possible to make some of these changes directly using the GUI. What SSMS does in the background is unload the data of the table into a temporary table (not a # temp table), then delete the original table and rename the temporary table to the original table name.
You can imagine that actions like that would have a huge impact on a large (maybe production) table.

When you apply such a change you’ll get the following error message:

prevent changes error

The solution

Again, be very careful enabling this option when you regularly work in production environments. I very much hope that you never make these kinds of changes using the GUI, especially not in production. Just don’t!

To enable the changes in SSMS go to the menu “Tools” and click on “Options”.

Go to option “Designers” and uncheck the checkbox at “Prevent saving changes that require table re-creation”.

prevent_changes_solution

Click on “OK” and you’ll be able to apply the changes you couldn’t do before.

Remember that this option is set for your SSMS installation and that this will be used for every instance you connect to.

Happy developing!