Testing Log Shipping with PowerShell and Pester


Thanks to my good friend Rob Sewell (b | t) I got into testing with PowerShell with Pester.
He showed me how to use the Pester module to test just about anything you can test with PowerShell.

In my environment I have lots of databases being log shipped. To see the current status of the log shipping I either have to execute a procedure or execute a standard report from SSMS. And I know there are other solutions but in most cases it comes back to these two.

The procedure returns all the databases, the primary and the secondary, with the values for last backup time, copy time, restore time and the thresholds. This doesn’t show me the problems right away but it’s fast.
The report shows me in nice green and red colors what's good and what's not, but I'd have to run it for every server manually. So it's pretty, but it's slow.

Why isn’t there anything that has the best of both worlds where I have the speed of the query with the clear results of the report?

That’s where Pester comes in!

I can use Pester to test the values that come back from the procedure and do basically the same thing the report does.

As mentioned, the procedure returns the last backup, copy and restore times with their thresholds for each database.
There is also an overall value called "status" which shows a 0 when everything is fine and a 1 when something is wrong.

I wanted to keep the check as simple as possible. If everything is fine, I don’t need to go into much detail. If there is something wrong I want to know which part of the log shipping failed, like the backup, copy or restore.

The Overall Test

This test only looks at the status marker log shipping sets to indicate whether everything is OK or not.
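The actual test script is on GitHub, but the idea can be sketched roughly like this. It builds on sp_help_log_shipping_monitor, which returns a row per log shipped database with a status column; the instance name and the Pester v3 syntax are assumptions on my part:

```powershell
# Sketch of the overall check - the instance name is a placeholder
$instance = 'SQL01'

# sp_help_log_shipping_monitor returns one row per log shipped database,
# including a "status" column: 0 = OK, 1 = something is wrong
$results = Invoke-Sqlcmd -ServerInstance $instance -Query 'EXEC sp_help_log_shipping_monitor'

Describe "Log Shipping Status on $instance" {
    foreach ($row in $results) {
        It "$($row.database_name) should have status 0" {
            $row.status | Should Be 0
        }
    }
}
```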

Log Shipping Pester Overall

Uh oh! Something failed. Good thing we have our test.

The Detailed Test

The detailed test looks at the last backup, last copy and last restore times in detail. We want to know what went wrong and where.

In this case I only want to zoom in on the database with the failed test so I use the -Database parameter to give the name of the database.
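A rough sketch of what such a detailed test could look like; the column names come from sp_help_log_shipping_monitor, the instance and database names are placeholders, and the threshold comparisons are simplified here:

```powershell
# Sketch of the detailed check for a single database (names are placeholders)
$instance = 'SQL01'
$database = 'DB1'

$result = Invoke-Sqlcmd -ServerInstance $instance -Query 'EXEC sp_help_log_shipping_monitor' |
    Where-Object { $_.database_name -eq $database }

Describe "Log Shipping Details for $database on $instance" {
    It 'last backup should be within the threshold' {
        $result.time_since_last_backup | Should BeLessThan $result.backup_threshold
    }
    It 'last restore should be within the threshold' {
        $result.time_since_last_restore | Should BeLessThan $result.restore_threshold
    }
}
```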

Log Shipping Pester Detailed Single DB

Of course it’s possible to execute the detailed test for all the databases. Just remove the database parameter and the same tests are run for all the databases.

Log Shipping Pester Detailed All DB

The entire test took less than two seconds and made it possible for me to quickly see if everything is fine with the log shipping.

The code can be downloaded from GitHub. I hope this helps you out and makes your life a little easier, as it did for me.



My SQL Server journey with Linux


Linux, where did it all begin

A couple of years ago there was an April Fools' joke on SQLServerCentral.com about SQL Server coming to Linux. Now we know this isn't a joke and SQL Server for Linux is a fact.

I started to explore Linux around 1998 on my AMD K6 266 MHz PC. I bought a boxed copy, because back then downloading an entire distribution over a dial-up connection at 5 KB a second wasn't realistic. If I had downloaded it from the internet, the phone bill would have blown your mind.

My first Linux distro was Redhat Linux version 5.1 (Manhattan) and back then there was no nice GUI for the install so it took me a while to get things started. It wasn’t like I had loads of documentation on the internet back then so I did what I could using books and small pieces of information I found on the internet.

I didn't have the luxury of owning multiple computers like some classmates had, so I decided to do a dual install next to my Windows 98 installation (oh yeah, I was really early to upgrade from Windows 95).

Nowadays I can download a Linux distro in a couple of minutes and within 30 minutes I have a virtual machine installed and ready to go.


Where do we start? Let's first download a distribution that's supported by Microsoft. I decided to download Ubuntu Server 16.04 because I don't need all the fancy GUI stuff that's in the Desktop version and don't want to have too much software I'm never going to use.

I installed the distribution in a VM and installed the OpenSSH server to be able to connect to my VM using PuTTY. I use PuTTY because I want to copy and paste easily and I don't like being connected to the console all the time.

Installing SQL Server

I can give you a complete overview of what I did to install SQL Server, but Microsoft already did that for you.

Installing SQL Server using the "Install SQL Server on Linux" article from Microsoft is as boring as can be, which is a good thing. If you go through the setup from that page you'll notice that it all just works.
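For reference, at the time the steps from that article boiled down to roughly the following. Repository URLs and package names change between releases, so treat this as a sketch and check the article for the current ones:

```shell
# Register the Microsoft package signing key and the SQL Server repository
curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
curl https://packages.microsoft.com/config/ubuntu/16.04/mssql-server.list | \
    sudo tee /etc/apt/sources.list.d/mssql-server.list

# Install SQL Server and run the initial configuration (sa password, EULA)
sudo apt-get update
sudo apt-get install -y mssql-server
sudo /opt/mssql/bin/mssql-conf setup

# Verify the service is running
systemctl status mssql-server --no-pager
```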


Microsoft did a really good job providing SQL Server for Linux, because the installation was as boring as can be. I know of course that this is still the CTP and that most configuration was done for us, but it's still a really good introduction of SQL Server to a completely different platform.

So if you haven’t already tried to install SQL Server on Linux go try it out because it’s a piece of cake.






Create indexes without DATAAREAID in Dynamics 2009


The problem

Because of a performance issue I wanted to create indexes without the DATAAREAID column in Dynamics AX 2009.

The problem is that Dynamics AX automatically adds the DATAAREAID column as the first column of every index it creates.

Because the new indexes shouldn't have this column in the first position, I had a problem.

The second problem is that every time an update or synchronization takes place in Dynamics AX 2009, database objects that are not part of the application's definition are removed.

The solutions

I came up with two solutions:

  1. Manually place the DATAAREAID column at the end of the index
  2. Create a job that checks if the indexes still exist in the database and recreates them if needed

I chose option 2 because of several reasons:

  1. I don't know the impact of the index change (the index already existed, but with the DATAAREAID column)
  2. I want to be sure the queries actually benefit from the new index
  3. AX 2009 doesn't support included columns

The last reason was the decisive one why the indexes weren't created in AX 2009:
the indexes I wanted to create used included columns, which apparently isn't supported in AX 2009.

I created a job in SQL Server with the following code:
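The original job code isn't reproduced here, but the idea can be sketched like this. The table, column and index names below are placeholders, not the ones I actually used:

```sql
-- Recreate the custom index if an AX update or synchronization removed it
-- (table, column and index names are placeholders)
IF NOT EXISTS (SELECT 1
               FROM sys.indexes
               WHERE name = N'IX_CUSTTABLE_Custom'
                 AND object_id = OBJECT_ID(N'dbo.CUSTTABLE'))
BEGIN
    CREATE NONCLUSTERED INDEX IX_CUSTTABLE_Custom
        ON dbo.CUSTTABLE (ACCOUNTNUM)
        INCLUDE (NAME);
END
```

The job simply runs a block like this on a schedule, so even if a synchronization drops the index it comes back shortly after.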

It all looks very simple, and it is!

In the next version, AX 2012, included columns are supported, so I may have to adjust the solution when we upgrade.

Get Active Session per database


Sometimes a database is locked because someone is using it for no apparent reason.

The same happened to me when I wanted to restore a database but failed because someone was using it. I wanted a script that showed me the sessions for a specific database and the login name.

The code below helps you with that:
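The original script isn't included here, but a minimal sketch could look like this. I'm using the old sys.sysprocesses view so it works on older versions as well; the database name is a placeholder:

```sql
-- Show the sessions and login names for one specific database
-- (the database name is a placeholder)
SELECT sp.spid     AS session_id,
       sp.loginame AS login_name,
       sp.hostname AS host_name,
       sp.program_name
FROM sys.sysprocesses AS sp
WHERE sp.dbid = DB_ID(N'AdventureWorks');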

To see all the connections for all databases you can use the following script:
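Again a sketch rather than the original script, using the same view:

```sql
-- Show all connections grouped per database
SELECT DB_NAME(sp.dbid) AS database_name,
       sp.spid          AS session_id,
       sp.loginame      AS login_name
FROM sys.sysprocesses AS sp
WHERE sp.dbid > 0
ORDER BY database_name, session_id;
```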

Get SQL Server Logs with Powershell


In my last posts I talked about retrieving the Windows event logs with PowerShell. I had to do the same thing for the SQL Server logs with the same output.

The script works with three parameters:

  1. ServerList
  2. Output
  3. Rows

1. ServerList

This is a mandatory parameter and has to point to a comma-separated file with the following format:

2. Output

This is the output directory for the results. The parameter is not mandatory, but it's very useful when you have many servers.

3. Rows

This is the number of rows you want to search through. Some SQL Server logs tend to get pretty big (over 100,000 lines a day on some of my servers). The parameter is not mandatory and defaults to 100 if no value is given.

The script uses a script that I posted earlier to ping the host. This script can be found here.

In case the output parameter is used, the script will create a CSV file with the following format:

The server name is taken from the server list. The second part indicates whether an error, a warning or no error was found. The last part shows why a certain warning occurred.

The code:
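The original script isn't reproduced here, but a rough sketch of the approach might look like this. The server-list column name, the log filtering and the output file name are assumptions on my part:

```powershell
# Sketch of the SQL Server log check - column names and filtering are assumptions
param(
    [Parameter(Mandatory = $true)][string]$ServerList,
    [string]$Output,
    [int]$Rows = 100
)

# Load SMO to read the SQL Server error log
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO')

foreach ($server in (Import-Csv -Path $ServerList)) {
    $instance = New-Object Microsoft.SqlServer.Management.Smo.Server $server.ServerName

    # Read the current error log (0) and keep only the last $Rows entries
    $entries = $instance.ReadErrorLog(0) | Select-Object -Last $Rows
    $issues  = $entries | Where-Object { $_.Text -match 'error|fail|warning' }

    if ($Output) {
        # Write the findings to a CSV file per server
        $issues | Export-Csv -Path (Join-Path $Output "$($server.ServerName).csv") -NoTypeInformation
    }
    else {
        $issues
    }
}
```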

Hope you enjoy the script!


Check SQL Server Jobs with Powershell


I like to automate as much as possible using scripts with SQL Server. So one of the things I want to know is if any job has failed on a list of servers. I'm a fan of PowerShell because it's so powerful and easy to develop with.

This script uses Excel to write the status of the different jobs on a server.

The script can be run using two arguments:

  1. ServerList
  2. Output


The server list parameter is a mandatory list that holds all the different instances that need to be checked. The file is a text file with one instance on every line. Make sure the full instance name is used for a named instance, like SQL01\SQLEXPRESS or SQL01\name.

In case of a default instance only the name of the server needs to be put in.


The output parameter is used when you don't want to see the Excel sheets. Instead the script writes the file directly to the given folder.


The script can be used like this:
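Something along these lines, where the script file name and paths are assumptions:

```powershell
# Script name and paths are placeholders
.\Check-SqlJobs.ps1 -ServerList 'C:\Scripts\servers.txt' -Output 'C:\Reports'
```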

The code can be seen below:
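The original script isn't reproduced here, but the core of the job check can be sketched like this; the Excel part is left out, and the output columns are assumptions:

```powershell
# Sketch of the job status check - output columns are assumptions
param(
    [Parameter(Mandatory = $true)][string]$ServerList,
    [string]$Output
)

# Load SMO so we can query the SQL Server Agent jobs
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO')

foreach ($instanceName in (Get-Content -Path $ServerList)) {
    $instance = New-Object Microsoft.SqlServer.Management.Smo.Server $instanceName

    foreach ($job in $instance.JobServer.Jobs) {
        # One result object per job with its last run outcome
        [pscustomobject]@{
            Instance    = $instanceName
            Job         = $job.Name
            Enabled     = $job.IsEnabled
            LastOutcome = $job.LastRunOutcome   # Succeeded / Failed / Unknown
            LastRun     = $job.LastRunDate
        }
    }
}
```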



My two cents on becoming a DBA


On various websites you can read how to become a DBA, and there are several discussions on how to start. In this post I'll tell the story of how I became a DBA and give some tips on what you can do to become one.

My first chance

I've had the privilege of starting as a junior DBA at a company where I was introduced to pretty advanced features of SQL Server, like clustering and log shipping.
The only problem was that the person responsible for teaching me quit his job a few weeks after my arrival.

The only knowledge I had came from the Microsoft books I used to pass the SQL Server 2005 MCTS exam.

The company didn’t want to hire a new DBA so all the tasks were handed to me. I must say that was a very busy time.
Try to understand that my predecessor didn't document anything; all the knowledge he transferred to me was no more than a single page of users and passwords to log in to the 40+ servers the company had.

Finding documentation

The first thing I do at every site I arrive at is look through the documentation the department created. In my case the former DBA hadn't documented anything, so I went to the system engineers. Fortunately they had documentation on how the servers were installed and where they were placed in the server rooms (there were several).

The thing I always tell people is to create documentation about everything on your server(s). There is no excuse to say that you don't have the time: MAKE TIME!! In the end you'll thank me, because when the shit hits the fan you'll have the information you need.

The documentation gave me a start but was not enough to get me fully started. What to do??

Where to start

In my situation I had more than 40 production servers, several of them configured for log shipping, clustering or replication. None of these were covered by the certification I had, so I decided not to start with those features until I had a good grasp of the configuration of all the servers. I had to maintain the services in the meantime, but fortunately SQL Server is very robust and we had a few good system engineers who kept everything running smoothly.

For every server I created a document with the following information:

  • Servername
  • Server version (service packs etc)
  • Databases
    • Size of the data file (MDF)
    • Size of the log file (LDF)
    • Recovery model
    • Compatibility level
  • Security options
  • Server Objects
    • Backup devices
    • Linked servers
  • Maintenance plans
  • Jobs

I created a template for Word where all this information can be placed. You can download it at the bottom of the post.

The document comes in handy when you have to search for changes or other information. It may take some time to do this for all your servers, but it helps. The document is meant as a starting point; you may want to add additional information, and you're welcome to do that.

Create baselines

The next thing I recommend is to create a baseline for every server. A baseline is data you can use to compare against when performance issues arise, to test installations, and as input for the DTA (Database Engine Tuning Advisor).

The following articles can help you create good baselines. I don’t want to get this article to be huge, that’s the reason why I use other (good) articles.

Creating a Performance Baseline – Part 1

Depending on the amount of data generated by SQL Server Profiler (nowadays I don't use Profiler anymore, but we're talking about more than 10 years ago), I'd let the baseline run for one to two days. Keep in mind that on very busy systems this can cause performance problems.

Start learning

After I got myself adjusted to the situation, and I had a start with the information I needed, I could go on making an inventory of the situation.

In order to do this I had to learn a lot about high availability, because this company used it in every way possible and nobody else had a clue how the former DBA had created the solutions.

I started studying for the MCITP SQL Server Administrator certification. After I passed the exams I still didn't have a good grasp of the situation. I had no experience with high availability and I wanted to get familiar with it.

I learned the most by surfing the web, reading articles and answering questions on forums.
Answering questions on forums gave me the most experience. I saw questions that I’d never have thought of. While doing this I got further and further and after a good year I could finally say that if a problem existed I could at least find the solution to the problem or solve it right on the spot.

Thanks to the documentation of all the servers I was able to anticipate (possible) problems.

Monitor everything

As a DBA you're not just solving problems when they occur. You've got to monitor your servers constantly. This is not a task that's done once a day.

You have to monitor the CPUs, memory, growth of the databases and so on. Anything that could possibly create a problem needs to be monitored.

There are several products out there that can help you with that. I don't have shares in any of them; these are just some I've used in the past:

If you’re starting somewhere and there is no logging on SQL Server level you’ve got the responsibility to advise your superior to get some kind of tool, free or commercial, that will help you do your work.

My predecessor didn't have a tool that monitored everything. Instead he checked every server several times a day, logging in and reading the error logs to see if any problems had occurred. This task got so time consuming that it took a good part of the day and left him unable to do much else.


Join the community

There's nothing stronger than the community. Being part of the SQL community makes it possible to meet new, interesting people and learn from them. Everybody started as a beginner, and you can probably help others get started.

Helping people on forums makes you aware of issues that might happen to you, and if your answer helped someone, that's a win.

Try getting in contact with people from the SQL Saturdays and see if you can volunteer.

Closing words

I've been a DBA for over 13 years now, and I personally don't like to call myself a senior DBA, just because there are a lot of people so much further along the path than I am. But I do think working in the structured way I described above helped me and gave me the opportunities I have had until this day.

I hope this helped you, and if you have any comments, please do leave them 😉


Prepare SQL Server 2005 in a clustered environment for SP3



Recently I was asked to create a new high availability solution for MS SQL Server 2005. When I finished the installation of SQL Server 2005, I found myself troubled as to why the installation of SP3 failed with the update of the database services. After a lot of searching I found the solution to my problem.

Because I spent a lot of time figuring out why my installation failed, I decided to write this article so other people won't have to spend so much time and frustration installing the service pack. This article also covers the problem for Service Pack 2. I haven't tested Service Pack 1 with this method.

What exactly goes wrong?

When I looked closely at the installation I saw that at a certain moment the setup program starts some actions for services and databases. After watching the databases being created and deleted, I noticed that something went wrong with the "mssqlresource" database (its actual name on disk is mssqlsystemresource). The setup program tries to overwrite the "mssqlresource" database but for some reason fails, which results in a rollback of that part of the installation. It looks like the "mssqlresource" database is still in use by SQL Server 2005, which causes the error.

Below are the steps that you can take to install SP3 on your SQL 2005 cluster nodes.

1: Create Backups

As any DBA knows, you should always back up your databases before you start the installation of updates or complete service packs. In my case I didn't have many databases on the server because it was a fresh installation. I created backups of all the present databases and copied the "master" and "mssqlresource" backup files to a different location.

2: Take the SQL Services offline

Before you stop the services for SQL Server, you should take that part of the cluster down. If you don't do this, the cluster will try to start the service as soon as it goes down. Go to the "Cluster Administrator" and select the resource group where the services for that node reside. Right-click "SQL Server (instancename)" and click "Take offline". Do this for every SQL service.

3: Copy and rename the mssqlresource database

Now that the services are down, we can copy and rename the “mssqlresource” database.

The "mssqlresource" database isn't visible in SSMS (SQL Server Management Studio), so I had to work with the database files directly.

Go to the shared array where the database files are stored, copy the MDF and LDF files, and paste the copies in the same directory. In my case I renamed them to "mssqlresource_old.mdf" and "mssqlresource_old.ldf", but you can use any name you want.

4: Start SQL Server in single-user mode

Before we can make any changes to the system databases we put the SQL Server in the single-user mode so nobody except the admins can connect to the server. Open a command prompt and type the following command:
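For a default instance this is presumably along these lines, where the /m switch starts the service in single-user mode:

```shell
net start MSSQLSERVER /m
```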

For servers with multiple instances you can use the following command:
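For a named instance the service name includes the instance name, shown here as a placeholder:

```shell
net start MSSQL$INSTANCENAME /m
```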

5: Alter the mssqlresource database

Through the SQLCMD program we can send queries to the SQL Server. In the command prompt type the following command:
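The command was presumably similar to this, where the server name is a placeholder and -E uses Windows authentication:

```shell
sqlcmd -S SQLCLUSTER01 -E
```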

If you’re authenticated you’ll see a prompt like “1>”. Now we can send the ALTER queries. Type the following commands for the MDF and the LDF files and press the Enter-key after each command. For the MDF File:
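The exact statement isn't shown here, but based on the Technet article it would look like this. The database's real name is mssqlsystemresource, its logical data file is called "data", and the path is a placeholder for your shared-array location:

```sql
ALTER DATABASE mssqlsystemresource
    MODIFY FILE (NAME = data, FILENAME = 'S:\MSSQL\Data\mssqlresource_old.mdf');
```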

For the LDF File:
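Presumably the same statement for the log file, whose logical name is "log" (the path is again a placeholder):

```sql
ALTER DATABASE mssqlsystemresource
    MODIFY FILE (NAME = log, FILENAME = 'S:\MSSQL\Data\mssqlresource_old.ldf');
```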

After the completion of both commands you’ll get a message that the changes will take effect after the restart of the SQL Server.

6: Restart the SQL Server

Stop the service with the following command:
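Presumably, for a default instance:

```shell
net stop MSSQLSERVER
```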

For servers with multiple instances you can use the following command:
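With the instance name as a placeholder:

```shell
net stop MSSQL$INSTANCENAME
```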

Start the service with the following command:
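Presumably, for a default instance (now without the /m switch, so everyone can connect again):

```shell
net start MSSQLSERVER
```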

For servers with multiple instances you can use the following command:
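With the instance name as a placeholder:

```shell
net start MSSQL$INSTANCENAME
```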

If you completed all these steps, you've done the necessary preparations for installing SP2 or SP3 in a clustered environment. The renamed database files will be replaced by new files with the original filenames during the installation.

This article is based on the Technet article at http://technet.microsoft.com/en-us/library/ms345408.aspx. I wrote this so that you don't have to search for the solution yourself when you install SP3.