Log Shipping With dbatools – Part 4: Recover a Log Shipped Database


In the third part of the series we discussed how to retrieve the log shipping errors using the command “Get-DbaLogShippingError”. This blog will be about how to recover a log shipped database.

Out of the box solutions

I can be very short in this paragraph: there is no out-of-the-box solution to recover a log shipped database.

Why this command

Initially log shipping was meant to be used as a warm standby. You have your data on another instance, but you still need some human intervention to get it all back up.

Imagine the following situation. You have set up log shipping using either the GUI or the commands in dbatools. You have about 15 databases and everything is working fine.

Then one day the primary instance goes down and is not recoverable. For production to continue, you have to bring the log shipped databases online fast.

You have to figure out what the last transaction log backup was. You have to check whether it was copied to the secondary instance and whether it has been restored.

Doing this by running a couple of queries, copying the files if needed and running the log shipping jobs takes time. I’d rather run a single command to recover one or more databases and get back to the problem of the primary instance.

Invoke-DbaLogShippingRecovery

The Invoke-DbaLogShippingRecovery command executes the following steps:

  1. Check the SQL Server Agent status and start the agent if it is not running.
  2. Retrieve the latest transaction log backup and copy it to the secondary instance if it’s not present. It will wait and check the log shipping status to see if the backup file has been copied.
  3. Retrieve the last restored transaction log backup and execute the restore process to bring the database up-to-date.
  4. Disable the copy and restore jobs.
  5. After all these actions, restore the database to a normal (recovered) state.

To execute the command:
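A minimal sketch of the call (SQL2 and DB1 are placeholder names for the secondary instance and the database):

```powershell
# Recover the log shipped database DB1 on the secondary instance SQL2
Invoke-DbaLogShippingRecovery -SqlInstance SQL2 -Database DB1 -Verbose
```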

The result of the command:

[Image: Recover Log Shipping Result Command]

The image below shows the database back in a normal state in SQL Server Management Studio after running the command.

[Image: Recover Log Shipping Result GUI]

The result of the jobs being disabled:

[Image: Recover Log Shipping Result Jobs]

More options

In my example I showed how to recover a single database, but the -Database parameter accepts multiple databases.

Besides naming individual databases, you can also let the command recover all the log shipped databases on the instance.

In some cases you may want to copy and restore the latest backups but not execute the final recovery to a normal state.
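The sketch below shows these variations. The instance and database names are placeholders, and the -NoRecovery switch for skipping the final recovery is an assumption on my part, so check Get-Help Invoke-DbaLogShippingRecovery to confirm the exact parameter names.

```powershell
# Recover several databases in one call
Invoke-DbaLogShippingRecovery -SqlInstance SQL2 -Database DB1, DB2, DB3

# Recover all log shipped databases on the secondary instance
# (here I assume -Force is enough to confirm processing all of them)
Invoke-DbaLogShippingRecovery -SqlInstance SQL2 -Force

# Copy and restore the latest backups but leave the database in a restoring state
Invoke-DbaLogShippingRecovery -SqlInstance SQL2 -Database DB1 -NoRecovery
```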

 

This concludes the command “Invoke-DbaLogShippingRecovery”. This was the final post in this series. If you want to look back at the other commands, follow the links below:

  1. Setup Log Shipping
  2. Test Log Shipping Status
  3. Get Log Shipping Errors

 

Log Shipping With dbatools – Part 3: Get Log Shipping Errors


In the second part of the series we discussed how to test the log shipping status using the command “Test-DbaLogShippingStatus”. This blog will be about how to get the log shipping errors to analyze why your log shipping isn’t working.

Out of the box solutions

There are several options to get the log shipping errors right out-of-the-box. One of them is using queries and the other is the Event Viewer.

Queries

There are queries that can help to get the log shipping errors. One that is commonly used queries the log_shipping_monitor_error_detail table in msdb.

Although this is a good way to get the log shipping errors, unless I’m using a CMS or a monitoring server, I have no way to find errors on multiple instances with log shipped databases.
Furthermore, I want to keep all my tools together and make it easy to solve any errors.

Event Viewer

The other way to see if there are any errors in the log shipping is by using the event viewer.

[Image: Get Log Shipping Errors Event Viewer]

Personally I don’t like using the Event Viewer because there could be other errors I would have to filter through before getting to the root cause. The other reason it doesn’t give me a complete view is that it doesn’t register any errors that occurred during the restore of the backups.

Why this command

When there is something wrong with the log shipping I want to know about it as soon as possible. Using the above options doesn’t give me the flexibility to do that.
For instance I cannot check multiple instances at once, unless I’m using CMS or a monitoring server.

To make the tool set complete, this command was needed so a user can get an overview of the log shipping errors.

Get-DbaLogShippingError

By default the command will return all the errors that ever occurred. It collects all the information and returns it in a table structure to the user.

To execute the command:
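Something along these lines, where SQL2 is a placeholder for the secondary instance:

```powershell
# Return every log shipping error recorded on the instance
Get-DbaLogShippingError -SqlInstance SQL2
```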

The result is an overview of the errors.

[Image: Get Log Shipping Errors No Filter]

In my play area I made sure I would have a lot of errors to test the command. It turned out I had more than 3100 errors!

[Image: Get Log Shipping Errors No Filter Count]

This brings us to one of the reasons I created this command: filtering the results.

I may only want to see the errors that occurred the last hour.
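A sketch of that filter (the instance name is a placeholder):

```powershell
# Only the errors that occurred in the last hour
Get-DbaLogShippingError -SqlInstance SQL2 -DateTimeFrom (Get-Date).AddHours(-1)
```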

The result is still quite a big list, but more manageable.

[Image: Get Log Shipping Errors Filter Date From]

In this example we’re only using the parameter “-DateTimeFrom” but we can also use “-DateTimeTo” to filter between certain periods.
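For example (the dates are made up):

```powershell
# Only the errors between two points in time
Get-DbaLogShippingError -SqlInstance SQL2 -DateTimeFrom '2017-07-01 00:00' -DateTimeTo '2017-07-02 00:00'
```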

More filtering

Of course there are other filters to make it easier to zoom in on a specific problem.

It’s possible to filter on the databases:
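For instance (database names are placeholders):

```powershell
# Only the errors for specific databases
Get-DbaLogShippingError -SqlInstance SQL2 -Database DB1, DB2
```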

[Image: Get Log Shipping Errors Filter Database]

To filter on the type of instance you can use the “-Primary” and “-Secondary” parameters.

To filter on the specific actions use the “-Action” parameter:
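A sketch of both filters; the value “Restore” for -Action is my assumption of how the actions are named, so verify it with Get-Help:

```powershell
# Only the errors registered on the secondary side
Get-DbaLogShippingError -SqlInstance SQL2 -Secondary

# Only the errors for a specific action, for example the restore step
Get-DbaLogShippingError -SqlInstance SQL2 -Action Restore
```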

 

This concludes the command “Get-DbaLogShippingError”. The next post will be about the command “Invoke-DbaLogShippingRecovery”.

Log Shipping With dbatools – Part 2: Test Log Shipping Status


In the first part of the series I described the command “Invoke-DbaLogShipping”. This makes it possible to set up log shipping. This blog post will be about the command to test log shipping status.

Before I describe the command, I want to discuss the options that are available in the SQL Server Management Studio (SSMS) and SQL Server.

Out of the box monitoring

Microsoft has made it possible to check the log shipping using SSMS. This can be done using queries or by using the “Transaction Log Shipping Status” report.

Queries

To return the log shipping status using T-SQL, you can execute the sp_help_log_shipping_monitor system procedure or query the log shipping monitor tables in msdb from a query window in SSMS.

The result is as follows:

[Image: Log Shipping Status Query Result]

This is a good method to find possible issues, but you need to check multiple columns to find out if something is wrong.

Transaction Log Shipping Status Report

By default SSMS has a report that shows the status of the log shipping. To open the report, right-click on the server, go to “Reports”, then “Standard Reports”, and click on “Transaction Log Shipping Status”.

[Image: Log Shipping Status Report]

The result is an overview of each database with the status of that process:

[Image: Log Shipping Status Report Result]

The report shows red colored text when something is wrong, which is a great way to spot problems. Still, we need to go into SSMS and click around before we get to this information.

Why this command

Monitoring your log shipping processes is important. You need to know the synchronization status of the log shipped databases.

The log shipping process consists of three steps: backup, copy and restore. Log shipping tracks the status of each of these processes.
It registers the last transaction log backup, the last file copied and the last file restored. It also keeps track of the time since the last backup, copy and restore.

But that’s not all. Log shipping also checks if the threshold for the backup and restore has been exceeded.

During the log shipping setup, thresholds are set for the backup and restore process.
The default thresholds are 60 minutes for the backup and 45 minutes for the restore. There is no threshold for the copy process.
If, for example, the time since the last backup exceeds the backup threshold, an alert will be triggered.

That’s a lot of information to consider and that’s why this command was developed. It will enable the user to get a complete overview of the log shipping status without having to know all the details.

Test-DbaLogShippingStatus

The command returns a lot of information by default. It collects all the information and, based on that, returns a status.

To execute the command:
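Something like this (SQL2 is a placeholder instance name):

```powershell
# Full status overview for all log shipped databases on the instance
Test-DbaLogShippingStatus -SqlInstance SQL2
```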

The result is a detailed overview of each database including the status.

[Image: Log Shipping Status Command Detailed]

This looks a lot like the results you get from the queries or the report we talked about earlier. All this information can be a little overwhelming and it’s not always needed.
If you don’t need all that information, there is an option to format and filter the output.

It will only show you the core information to know what the status of the log shipping is.
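A sketch of the condensed call; -Simple is how I recall the switch being named, so verify it with Get-Help Test-DbaLogShippingStatus:

```powershell
# Condensed output: instance, database, instance type and status
Test-DbaLogShippingStatus -SqlInstance SQL2 -Simple
```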

The result of the command:

[Image: Log Shipping Status Command Simple]

As you can see there is a lot less information. It only shows the instance, database, instance type and the status. In most cases that’s all you need.

More filtering

The command also allows you to filter on just the primary or just the secondary databases.

There is a way to filter on specific databases using the “-Database” parameter.

Of course there is also an option to exclude certain databases by using the “-ExcludeDatabase” parameter.
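A few hedged examples of these filters (instance and database names are placeholders):

```powershell
# Only the primary or only the secondary databases
Test-DbaLogShippingStatus -SqlInstance SQL1 -Primary
Test-DbaLogShippingStatus -SqlInstance SQL2 -Secondary

# Include or exclude specific databases
Test-DbaLogShippingStatus -SqlInstance SQL2 -Database DB1
Test-DbaLogShippingStatus -SqlInstance SQL2 -ExcludeDatabase DB2
```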

 

This concludes the command “Test-DbaLogShippingStatus”. The next post will be about the command “Get-DbaLogShippingError“.

Log Shipping With dbatools – Part 1: Setup Log Shipping


This post is the first one of a series of four describing all the different commands. We’ll discuss the commands to set up log shipping with dbatools, how to check the status, how to check for errors and how to recover in case of an emergency.

What is log shipping

Before we go into the why I want you to know what log shipping is.

Log shipping dates back to the early versions of SQL Server. The first article that I could find dates back to the year 2000 and explains how to set up log shipping with SQL Server version 7.

Microsoft has done a lot of work on the log shipping setup process and we no longer have to set up the “poor man’s log shipping” ourselves. The setup has a nice GUI and it makes it easier to set up log shipping for your database(s).

Log shipping is the process of creating a transaction log backup, copying that backup to a secondary server and restoring the log there to synchronize the databases. The image below shows the different elements of a log shipping architecture.

[Image: Log shipping typical configuration]

https://docs.microsoft.com/en-us/sql/database-engine/log-shipping/media/ls-typical-configuration.gif

I’m not going to explain in detail what log shipping is or how to set it up using the traditional method. There are lots of websites that have already done a great job describing everything related to log shipping.

Why create a command set

Like I said before, Microsoft has done a great job of automating the log shipping process and making it easier for the user to set up. The GUI describes all the settings needed. Even someone inexperienced could set up log shipping in a short amount of time (given that all the requirements have been met).

Although Microsoft has simplified the process since the earlier versions of SQL Server, setting up log shipping still takes a lot of time. You have to set up each database individually and most of the time you are doing almost the same thing.
There are ways to automate this using dynamic T-SQL. However, I found that, at least in my situation, it was prone to errors and it took me more time to troubleshoot than to set it up manually.

Why we needed to set up log shipping

A couple of months ago my company wanted to create a high availability solution for our production databases.
While researching a solution like clustering for Availability Groups, we looked at architecture decisions, licensing and costs. The outcome was that we weren’t able to use any of the SQL Server Enterprise HA solutions.
Log shipping is not a so-called hot standby that immediately recovers in case a service or database goes down. Instead it is called a warm standby, where human intervention is needed to recover.

We have about 20 production databases and, looking at the database sizes and the amount of transactions, in my opinion we are a small shop.
We decided to go with log shipping. It’s well established within SQL Server and, unlike mirroring, not deprecated.

So on a Monday morning, after all the backups were done, I started to configure our databases for log shipping. For me the process of setting up log shipping was very tedious. I basically had to go through each of the databases and use the same settings over and over. The biggest difference would be the database name in the backup or copy directory.

The reason for the command set

In the end it took me about 3 to 4 days to set it up. This was mostly because I didn’t want to stress the network too much by setting up multiple databases at once. At some point I made some mistakes and had to start over again. I really don’t like to do repetitive tasks and in my daily work I automate most of those processes.

To set up log shipping for a single database in the GUI you need about 40 clicks and have to enter lots of information in different windows.

That’s where I contacted Chrissy LeMaire ( b | t ) from the dbatools project and asked if it would be interesting to create log shipping commands for the PowerShell module. She was really enthusiastic about it and I started to develop the commands.

The command set

Initially it started with the command “Invoke-DbaLogShipping”, but soon after a couple of commands followed. The command set contains the commands “Invoke-DbaLogShipping”, “Test-DbaLogShippingStatus”, “Get-DbaLogShippingError” and “Invoke-DbaLogShippingRecovery”.

This blog is the first of the four posts and dives into “Invoke-DbaLogShipping”, because we first need to have log shipping set up before we can use the other commands.

The Command Invoke-DbaLogShipping

During the development of the command I chose to make the process for setting up the log shipping as easy as possible for the user. My main requirement for the command was that as long as you have the minimal requirements for log shipping you would be able to set it up.

The main command is “Invoke-DbaLogShipping”.

Within log shipping there are a lot of decisions that can be made. All those decisions ended up with lots of parameters. The command has over 90 different parameters ranging from setting the backup folder to setting the name of schedules.

I will explain a couple of the most important ones that I think will be used the most:

General

  • Force: When this parameter is used, lots of assumptions will be made and you only need a minimum of settings to set up log shipping.

Server

  • SourceSqlInstance: The primary instance
  • DestinationSqlInstance: The secondary instance

Database

  • Database: Database (or databases) that need to be log shipped

Backup

  • BackupNetworkPath: Network path where the backups will be saved. This is the parent folder, a child folder for each database will be created in this folder.
  • BackupLocalPath: The local path for the backups. Only used by the primary instance and not mandatory
  • BackupRetention: How long to keep the backups. The default is 3 days.
  • BackupScheduleStartDate: Start date when the backup schedule needs to start
  • BackupScheduleStartTime: Start time of the backup schedule. Maybe the backups should only take place during a certain period of the day.
  • BackupScheduleEndTime: End time of the backup schedule. Maybe the backups should only take place during a certain period of the day.
  • CompressBackup: Whether you want the backup to be compressed. By default it looks at the SQL Server edition and server settings.
  • FullBackupPath: Path to the full backup. Only used when log shipping is being set up.
  • GenerateFullBackup: Instead of using an existing full backup you can use this parameter to create a full backup on the fly during setup.
  • UseExistingFullBackup: Besides FullBackupPath and GenerateFullBackup you can also set the option to let the command retrieve the last full backup made for the database.

Copy

  • CopyDestinationFolder: Where the backups need to be copied to

Restore

  • DisconnectUsers: Important setting if your secondary database is in read-only mode. Users will be disconnected when the restore process starts and this parameter is set.
  • Standby: Whether you want your databases to be in standby mode. By default the database will be in no-recovery mode.
  • StandbyDirectory: Directory where the TUF (transaction undo file) file needs to go for a database. This is needed when using the -Standby parameter.

Set up log shipping using Invoke-DbaLogShipping

Before we look into the command we need to go through the prerequisites of log shipping. The items below are the bare minimum to set up log shipping.

Prerequisites

  • At least SQL Server 2000 standard edition
  • The primary database needs to be in Full recovery mode.
  • A network share/drive that’s accessible for the primary and secondary server

A more sophisticated setup would be:

  • A primary server containing the primary databases
  • A secondary server containing the secondary databases
  • A monitoring server logging all the information
  • At least SQL Server 2000 standard edition
  • The primary database needs to be in Full recovery mode.
  • A separate network share for the backup process
  • A separate network share for the copy process

Technically you don’t need multiple servers to set up log shipping. You can set it up with just one single SQL Server instance. In an HA solution this wouldn’t make sense, but technically it’s possible.

Having a separate server acting as the monitoring server ensures that when one of the servers goes down, the logging of the actions still takes place.

Having a separate network share for both the backup and the copy makes it easier to set up security and decide which accounts can access the backups. The backup share needs to be readable and writable by the primary instance and readable by the secondary instance.
The copy share needs to be accessible and writable for only the secondary instance.

Additional information

  • Backup Compression
  • Database Status
  • Schedules
  • Cleanup

Backup Compression

Backup compression was introduced in SQL Server 2008 Enterprise. Beginning in SQL Server 2008 R2, backup compression is supported by SQL Server 2008 R2 Standard and all higher editions. Every edition of SQL Server 2008 and later can restore a compressed backup.

Backup Compression in SQL Server

Database Status

You can decide whether you want the secondary database to be in a no-recovery (restoring) or a standby (read-only) state. Read-only gives you the ability to let other processes, like SSRS or ETL, read production data without interfering with the actual production instance.

Schedules

When to run the backup of the transaction log. Does it need to run every 15 minutes, which is the default, or every hour, once a day, etc.? The same goes for the copy and restore schedules.

Cleanup

How long do you need to hold on to the backups? The default is three days, which in some cases can be quite long.

Setup

This is where it’s going to be fun. As an example I’ll be using the command with a minimum of parameters. To do this, a couple of requirements need to be met. Let’s go through these prerequisites:

  1. Set up the shares for the backup and copy destination
  2. Set the privileges for the primary and secondary instance on the backup and copy shares
  3. Pick the database(s) that need to be log shipped
  4. Decide whether the secondary database should be in read-only or no-recovery mode
  5. Decide whether or not to compress the backups
  6. Decide whether to generate a new full backup or use an existing one

The command below will log ship the database from instance SQL1 to SQL2. The backups will be compressed, and to initialize the secondary database a new full backup is generated.
No other settings are made. To make sure all the requirements are met, the -Force parameter is used to enable all the assumptions.
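A sketch of such a call, using only the parameters discussed above (the instance names, database name and backup share are placeholders):

```powershell
# Log ship DB1 from SQL1 to SQL2, generating and compressing a new full backup,
# and let -Force assume sensible defaults for everything else
$params = @{
    SourceSqlInstance      = 'SQL1'
    DestinationSqlInstance = 'SQL2'
    Database               = 'DB1'
    BackupNetworkPath      = '\\SQL1\LogShipping\Backup'   # hypothetical share
    GenerateFullBackup     = $true
    CompressBackup         = $true
    Force                  = $true
}
Invoke-DbaLogShipping @params
```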

The result:

[Image: logshipping example force]

This is only a fraction of the parameters available. The remaining parameters make it possible to configure your log shipping exactly the way you want it to be.

Execute the following command to get more information:
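For example:

```powershell
Get-Help Invoke-DbaLogShipping -Detailed
```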

The next blog will be about the command “Test-DbaLogShippingStatus“. It’s responsible for testing if all of your log shipped databases are in a synchronized state.

Let’s get all Posh! Log Shipping


This month’s T-SQL Tuesday is brought to us by my good friend Rob Sewell (b | t) and the subject is “Let’s get all Posh – What are you going to automate today?”

My post will be about log shipping, because setting up log shipping using the GUI is a very tedious task and doing it with T-SQL scripting is no fun either. So let’s see how we can do this with PowerShell.

 

 

Log shipping explained

If you’ve been a SQL Server DBA for a while, you’ve probably heard about log shipping.

Log shipping makes it possible to increase the database’s availability by automatically backing up the transaction log, copying the backup file and restoring the backup file to another database instance (or the same instance with a different database name).

Log shipping is an older technology but still has a purpose in lots of environments where it’s being used for reporting, ETL or to give developers and users access to production without the risk of them breaking anything.

Why use PowerShell

If you’ve ever had to set up log shipping in SQL Server Management Studio (SSMS), then you know how tedious it can be. I actually counted the number of clicks to set up a single database for log shipping and you need about 40 of them to get the job done.

So one day we decided to create a warm standby for one of our production servers. I had set up log shipping before and if you know what to do it’s not that difficult.
It becomes a problem when you have to set up log shipping for more than 20 databases ranging from 10 GB to 800 GB in size.
I ended up spending about a week setting it up (between other tasks, I do have more work to do) and for me that’s way too much time for something as simple as this.

So why use PowerShell? Because anything I can do with SSMS can be done with SMO. This would save a lot of time for me and anyone using the functionality.

How did I do it

I contacted Chrissy LeMaire, the creator of the dbatools PowerShell module, and she was instantly excited to have this functionality within the module.

The first thing I had to do was find out how SQL Server set up the log shipping in the background when I used the GUI. Yeah, I know using the GUI is bad when you use a lot of PowerShell, but sometimes I like to do it old school.

I quickly found out that SQL Server does a lot of things to set up log shipping. It creates jobs, job steps and job schedules, uses stored procedures to set values in specific tables, etc.
I went to the dbatools module and used the Find-DbaCommand function to see if there was any functionality to create the agent objects (jobs, steps and schedules), but it wasn’t there, so the first step was to create a full set of functions to manage those objects.
A couple of thousand lines later I finished the agent functionality and could continue developing the log shipping function.

That function started pretty small, but like the agent functionality, quickly grew to be some sort of monster with over 70 different parameters that would support every kind of situation.

After about a couple of hundred iterations it was finally committed to the development branch of the dbatools module to get tested. I have great respect for the people who did the testing of that functionality because I didn’t envy them. I knew how complicated it was to build everything, and testing all of that must have been a lot of work. Thank you to all the testers who accepted the challenge.

The solution

The entire log shipping functionality is now separated into 5 functions. Four of them are used internally and are not visible as public functions because you can easily break stuff if they’re not used correctly.

The main function, Invoke-DbaLogShipping, is available in the dbatools module for anyone to use.

If you open the GUI for the log shipping you have lots of choices, but most of them are already supplied in the GUI itself and you can decide whether to use them or not.
The whole idea behind the functionality was that it would allow you to quickly set up log shipping using a lot of defaults, like you can in the GUI, but if you’re more experienced you can change any setting to your preferences.

It will take you about a minute to set all the parameters and after that the only thing that will slow you down is the size of your database files when the restore runs.

Enough talk I want to see some action!

I will demonstrate how to set up log shipping using the fewest possible parameters. You actually need six parameters to make it work:

  1. SourceSqlInstance – Which instance needs to be used as the source server
  2. DestinationSqlInstance – Which server the database needs to be log shipped to
  3. BackupNetworkPath – Where the backups need to be placed
  4. Database – Which database, or databases, need to be log shipped
  5. GenerateFullBackup or UseExistingFullBackup – Generate a full backup or use an existing one

The command below will log ship a database from server1 to server2.
It will log ship one database called db1.
The network path for the backups is \\server1\logshipping\backup.
The backups will be created with backup compression.
No new full backup will be created; instead it will use the last full backup.
In this example the suffix for the database is “LS” because I want to differentiate the log shipped databases.
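Put together, the call would look something like the sketch below. The names are the ones from the description above; -SecondaryDatabaseSuffix is my best guess for the suffix parameter, so check Get-Help Invoke-DbaLogShipping before relying on it.

```powershell
$params = @{
    SourceSqlInstance       = 'server1'
    DestinationSqlInstance  = 'server2'
    Database                = 'db1'
    BackupNetworkPath       = '\\server1\logshipping\backup'
    CompressBackup          = $true
    UseExistingFullBackup   = $true
    SecondaryDatabaseSuffix = 'LS'   # assumption: parameter name for the database suffix
}
Invoke-DbaLogShipping @params
```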

Below is the result of the log shipping.

What else can be done with this function?

When I developed this function it went through a lot of iterations. I’m a developer who has a good idea of what to build and who, during the process, thinks of other functionality that needs to be put in.
I constantly thought of new ideas that would make other people’s work easier.

A couple of examples:

  • To monitor or not to monitor log shipping
  • Enabled or disabled log shipping jobs
  • Naming convention of the jobs
  • Standby or read-only mode for the secondary database
  • To compress or not to compress the backup files
  • Generate a full backup for initialization
  • Log ship multiple databases in one command. This will be executed in serial.
  • Force every setting and discard anything that’s there

The last one needs a little more explanation. I want this function to be really easy to use and to work with existing settings. The -Force parameter is a really powerful parameter which will assume a lot of the defaults, but will also overwrite existing settings like jobs, schedules, database settings, etc.

Conclusion

I will never ever use the GUI to set up log shipping again and I hope you won’t either. It’s just way too much work, and I’ve had it fail on me lots of times without knowing what went wrong, having to set everything up from scratch.

There is more functionality coming for the log shipping functions like status checks and fail-over initialization.

If you find that something is missing, works inefficiently or doesn’t work (I hope not) in this function, please let us know. You can find us on Slack, GitHub, Twitter, YouTube and LinkedIn. You can find me on the Slack channel most of the time. If you have any questions you know where to find me.

The last thing I want to say is that the dbatools module is a community-driven module that enables database administrators to work more efficiently. It’s because of the hard work of all the volunteers that this module is such a success.
It’s a group of very hard-working and smart people who I have great respect for. It is because of the enthusiasm of all those people, and especially Chrissy, that I love working on the module.

Thank you again Rob for hosting this month’s TSQL Tuesday, it was really fun.

Create Your Own PowerShell Profile


Being a fan of automation I like to create my own PowerShell profile. It enables me to load various settings that normally take more time. The PowerShell profile resides in your home directory and if you work in an AD environment with roaming data you’ll have the same profile on every computer.

PowerShell profiles are not new and date back to PowerShell v2.0. Other people have written about this subject before, but I wanted to share my take on it.

Create the profile

To check if you have a profile execute the following command:
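That check is simply:

```powershell
# Returns True if a profile script already exists for the current user and host
Test-Path $PROFILE
```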

[Image: Profile Test Existence]

If this returns false it means you don’t have a profile and you have to create one. The easiest way to do that is by executing the following:
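For example:

```powershell
# Create an empty profile script (the -Force switch also creates missing folders)
New-Item -Path $PROFILE -ItemType File -Force
```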

[Image: Profile Create]

From this point on you can put in anything that helps you speed up.

PowerShell drives

I have several directories which I use regularly. Instead of having to type the entire path I can create a PowerShell drive.
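A sketch of how that looks; the drive name and path are just examples:

```powershell
# Map a frequently used folder to its own drive
New-PSDrive -Name Repos -PSProvider FileSystem -Root 'C:\Users\sander\Documents\Repos' | Out-Null

# Jump to it without typing the full path
Set-Location Repos:
```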

[Image: Profile PS-Drive Example]

Now, instead of having to type the entire path, I can simply use the new drive. This not only speeds up navigating to the directory, but also simplifies referencing your current working directory.

Credentials

The next thing I do: when I’m part of a domain I usually need another user account to access my database servers, so I call the next piece of code to enter the credentials for my administrative account:
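Something along these lines (the account name is a placeholder):

```powershell
# Prompt once for the administrative account and keep it for the session
$Credential = Get-Credential -UserName 'DOMAIN\admin-sander' -Message 'Administrative account'
```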

[Image: Get-Credential Window]

You can pass the stored credential to other functions that need more privileges. For instance, if you use dbatools, a lot of the commands have a credential parameter to make sure you connect to the SQL Server instance with the right account.

Color schemes

Color schemes are important. There have been studies that show that certain fonts and colors make working on a console easier for the eyes.

I don’t like the colors for errors and warnings and if I have to go through a lot of them it’s hard to read the bright red on the blue background.
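A sketch of the kind of change I mean; it works in the classic console host and the exact colors are personal preference:

```powershell
# Tone down the default error and warning colors
$Host.PrivateData.ErrorForegroundColor   = 'Yellow'
$Host.PrivateData.ErrorBackgroundColor   = 'Black'
$Host.PrivateData.WarningForegroundColor = 'DarkCyan'
```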

[Image: Profile Color Example]

At least I can read this better than the original color scheme. Play around with this and you’ll see it makes handling messages from the console a lot easier.

Window and buffer size

It’s possible to change these values to make sure all your data fits inside your console.
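For example (the sizes are just examples and the console may cap them to what fits on your screen):

```powershell
# Widen the buffer first; the window cannot be wider than the buffer
$buffer = $Host.UI.RawUI.BufferSize
$buffer.Width  = 200
$buffer.Height = 5000
$Host.UI.RawUI.BufferSize = $buffer

$window = $Host.UI.RawUI.WindowSize
$window.Width  = 200
$window.Height = 50
$Host.UI.RawUI.WindowSize = $window
```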

Setting the location

For me it’s not always a good thing that the working directory is set to my home folder, which happens by default when you open a PowerShell console. If you open it with administrative privileges it points to the System32 folder of your Windows directory.
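So the last line of my profile sets the location to a folder of my choosing (the path is an example):

```powershell
# Start every session in my scripts folder instead of the default location
Set-Location -Path 'C:\Scripts'
```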

Conclusion

In the end you can load all kinds of settings in your profile to make your life a little easier. From the settings above to creating aliases, starting other scripts etc.

I hope this was informative for you and maybe you’ll get started with profiles too.

Testing Log Shipping with PowerShell and Pester


Thanks to my good friend Rob Sewell (b | t) I got into testing with PowerShell and Pester.
He showed me how to use the Pester module to test just about anything you can test with PowerShell.

In my environment I have lots of databases being log shipped. To see the current status of the log shipping I either have to execute a procedure or run a standard report from SSMS. I know there are other solutions, but in most cases it comes back to these two.

The procedure returns all the databases, the primary and the secondary, with the values for the last backup time, copy time and restore time, and the thresholds. This doesn’t show me the problems right away, but it’s fast.
The report shows me in nice green and red colors what’s good and what’s not, but I’d have to run it for every server manually. So it’s pretty, but it’s slow.

Why isn’t there anything that has the best of both worlds where I have the speed of the query with the clear results of the report?

That’s where Pester comes in!

I can use Pester to test the values that come back from the procedure and do basically the same thing the report does.

As mentioned, the procedure returns all the databases with the last backup, copy and restore times and the thresholds.
There is also an overall value called “status” which shows a 0 when everything is fine and a 1 when something is wrong.

I wanted to keep the check as simple as possible. If everything is fine, I don’t need to go into much detail. If there is something wrong I want to know which part of the log shipping failed, like the backup, copy or restore.

The Overall Test

This test only looks at the marker log shipping sets to indicate whether everything is OK or not.
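The original script is on GitHub (see the end of this post); the sketch below only illustrates the idea. It assumes dbatools’ Invoke-DbaQuery is available and uses the sp_help_log_shipping_monitor procedure, which returns a status column per database, together with Pester 4 syntax.

```powershell
# A minimal overall check: every log shipped database should report status 0
$instance = 'SQL1'   # hypothetical instance name

Describe "Log shipping status on $instance" {
    $results = Invoke-DbaQuery -SqlInstance $instance -Database master -Query 'EXEC sp_help_log_shipping_monitor'

    foreach ($result in $results) {
        It "$($result.database_name) should not be in a failed state" {
            $result.status | Should -Be 0
        }
    }
}
```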

[Image: Log Shipping Pester Overall]

Uh oh! Something failed. Good thing we have our test.

The Detailed Test

The detailed test goes into the detail of the last backup, last copy and last restore time. We want to know what went wrong and where.

In this case I only want to zoom in on the database with the failed test so I use the -Database parameter to give the name of the database.

[Image: Log Shipping Pester Detailed Single DB]

Of course it’s possible to execute the detailed test for all the databases. Just remove the database parameter and the same tests are run for all the databases.

[Image: Log Shipping Pester Detailed All DB]

The entire test took less than two seconds and made it possible for me to quickly see if everything is fine with the log shipping.

The code can be downloaded from GitHub. I hope this helps you out and makes your life a little easier, as it did for me.

 

 

Why I love dbatools


I’ve been working on the dbatools project for a while now and I felt like telling you why I love this project.

A little background about me: I’m not a full-time programmer. I learned programming with Java years ago and did little personal projects with PHP, C# and a couple of other languages. I started with PowerShell about 7 years ago and thought I was capable of delivering solid code. That all changed with dbatools.

Being part of the team

So for the last year I’ve been part of the project, from debugging code and adding new features to existing functions, to adding new functions.

A world opened up when I first joined the project. I had no idea I would be learning this much in such a short time. I had to deal with GIT, QA, advanced modules, styling, Pester etc.

My first command was a little command called Find-DbaOrphanedFile. One time Chrissy LeMaire asked if someone could make a function to find orphaned database files. I jumped in because I knew this was something I could do, and I hadn’t yet had the chance to help out with the project. In about two weeks I had the function done, as far as I knew. It did the job and now I wanted to present my awesome code to the other developers.

My first challenge was dealing with GIT. I had never used GIT and the only source control my company had at the time was Visual SourceSafe. Don’t judge me! I wasn’t the one who decided to use an out-of-date piece of software. Of course, when you do things for the first time you’re going to fail, and I failed big time. I made a branch from the wrong fork, committed stuff but didn’t synchronize it, created a pull request (PR) in the wrong branch and more. I did about everything you could do wrong, and still Chrissy was nice as always, trying to help me out and get everything back on track.

After the GIT hurdle I finally submitted the PR and after about a day I got a kind but long comment back from one of the members who did the QA. Before I started, I had read some of the standards the project put in place, but as a developer you want to get started and as a result I forgot some of them. The funny thing was, though, that I learned more about PowerShell, modules, functions, standards etc. in that one comment than I had in the last 4 years.

What struck me was the way the members dealt with people like me who weren’t familiar with a more professional way of development. The members understand that if they had reacted the wrong way, I would’ve quit helping out with the project because it would have been too overwhelming.

That’s one of the strengths of the project: to embrace everyone who wants to help out and find a way to make everyone a functional member of the team, whether as a developer, doing QA, writing articles, etc.

That made me more enthusiastic about the project and I started to contribute more. Now I’ve become one of the major contributors.

In the last year I learned more about PowerShell than I did in my entire history of doing PowerShell. I’ve become more precise when it comes to my code, I go over my tests in a meticulous way and I try to keep to the coding standards. I looked back at some code I’d written over the years and imagined that some crazed-out monkey with a brain fart, high on crack, made it. Now I go through all the code I’ve written over the years and redo everything that’s no longer up to my new standards.

Being a user of the module

The other reason I love dbatools is because it has made my life so much easier. I see myself as one of the lazy DBAs who would rather spend a couple of hours automating his work than having to do the same thing over and over again. The project has about 200 different functions and is close to releasing version 1.0. This is a big deal due to a lot of standardizing, testing and new functions that are going to be released. With that amount of functionality in one single module there is bound to be a solution in there to make it easier to do your work. Nowadays I test my backups every day using the Test-DbaLastBackup function. I see the status of all my backups on all my database servers within seconds. I retrieve information about many servers without having to go through each one of them. And migrations have been a blast.

If you aren’t excited about the project yet, please visit the website and look at what has already been accomplished. Go see all the commands and then decide if it’s worth implementing in your daily work. If you’re wondering whether there is a command that could help you out, the command index can help you find more information. It is still in beta, but we’re working on getting all the information in there.

Thank you!

I want to thank the dbatools team for making me feel welcome and for boosting my knowledge to a point where it has made a significant impact on the way I do things in my daily life.
I also want to thank all the contributors who put in the effort to get the project where it is today. Without all the people putting in their valuable time this wouldn’t have worked.