Why I love dbatools


I’ve been working on the dbatools project for a while now and I felt like telling you why I love this project.

A little background about me, I’m not a full time programmer. I learned programming with Java years ago and did little personal projects with PHP, C# and a couple of other languages. I started PowerShell about 7 years ago and thought I was capable of delivering solid code. That all changed with dbatools.

Being part of the team

So for the last year I’ve been part of the project, from debugging code and adding new features to existing functions, to writing new functions.

A world opened up when I first joined the project. I had no idea I would be learning this much in such a short time. I had to deal with Git, QA, advanced modules, styling, Pester, etc.

So my first command was a little one called Find-DbaOrphanedFile. One time Chrissy LeMaire asked if someone could make a function to find orphaned database files. I jumped in because I knew this was something I could do and I hadn’t yet had the chance to help out with the project. In about two weeks I had the function done, as far as I knew. It did the job, and now I wanted to present my awesome code to the other developers.

My first challenge was dealing with Git. I had never used Git, and the only source control my company had at the time was Visual SourceSafe. Don’t judge me! I wasn’t the one who decided to use an out-of-date piece of software. Of course, when you do things for the first time you’re going to fail, and I failed big time. I made a branch from the wrong fork, committed stuff but didn’t synchronize it, created a pull request (PR) against the wrong branch and more. I did everything you could possibly do wrong, and still Chrissy was nice as always, trying to help me get everything back on track.

After the Git hurdle I finally submitted the PR, and after about a day I got a kind but long comment back from one of the members doing the QA. Before I started, I had read some of the standards the project put in place, but as a developer you want to get started, and as a result I forgot some of them. The funny thing, though, was that I learned more about PowerShell, modules, functions, standards, etc. in that one comment than I had in the last four years.

What struck me was the way the members dealt with people like me who weren’t familiar with a more professional way of development. The members understood that if they had reacted the wrong way, I would’ve quit helping out with the project because it would have been too overwhelming.

That’s one of the strengths of the project: it embraces everyone that wants to help out and finds a way to make everyone a functional member of the team, whether as a developer, doing QA, writing articles, etc.

That made me more enthusiastic about the project and I started to contribute more. Now I’ve become one of the major contributors.

In the last year I learned more about PowerShell than I did in all my previous years of doing PowerShell. I’ve become more precise when it comes to my code, I go over my tests in a meticulous way and try to keep to my coding standards. I looked back at some code I’d written over the years and imagined that some crazed-out monkey with a brain fart, high on crack, made it. Now I go through all the code I’ve written over the years and redo everything that’s no longer up to my new standards.

Being a user of the module

The other reason I love dbatools is that it has made my life so much easier. I see myself as one of the lazy DBAs that would rather spend a couple of hours automating his work than do the same thing over and over again. The project has about 200 different functions and is close to releasing version 1.0. That’s a big deal, due to all the standardizing, testing and new functions that are going to be released. With that amount of functionality in one single module, there is bound to be something in there to make your work easier. Nowadays I test my backups every day using the Test-DbaLastBackup function. I see the status of all the backups on all my database servers within seconds. I retrieve information about many servers without having to go through each one of them. And migrations have been a blast.
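That daily check can be as short as a one-liner; a minimal sketch, assuming the dbatools module is loaded (the instance name is a placeholder):

```powershell
# Test the most recent backup of every database on the instance;
# 'SQL1' is a placeholder — the module does the restore and DBCC check for you
Test-DbaLastBackup -SqlServer 'SQL1' |
    Select-Object Database, RestoreResult, DbccResult |
    Format-Table -AutoSize
```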

If you aren’t excited about the project yet, please visit the website and see what has already been accomplished. Go see all the commands and then decide if it’s worth implementing in your daily work. If you’re wondering whether there is a command that could help you out, the command index can help you find more information. It’s still in beta, but we’re working on getting all the information in there.

Thank you!

I want to thank the dbatools team for making me feel welcome and for boosting my knowledge to a point where it has made a significant impact on the way I do things in my daily life.
I also want to thank all the contributors that put in all the effort to get the project where it is today. Without all the people putting in their valuable time this wouldn’t have worked.

T-SQL Tuesday: The times they are a-changing


This month’s T-SQL Tuesday is inspired by the blog post Will the Cloud Eat My DBA Job? by Kendra Little. Technology has changed a lot in the past years, especially with the cloud, globalization and automation. What impact has this had on your job? Do you feel endangered?

Over the years I’ve seen a lot of things change with SQL Server. I can remember somebody telling me, when I started with SQL Server 2000, that T-SQL was never going to be a big thing. How wrong was that guy?!

I personally haven’t yet had the chance to do a lot with the cloud, like Azure SQL Database. I did get the chance to fiddle around with a trial of Azure to see what I could do and to get a feel for the interface. Unfortunately it was just a trial, and after it ended I didn’t pick it up again. I should, because I like to learn new skills.

What impact has this had on your job?

Some things have had a great impact on my work. Take PowerShell, for instance. If I hadn’t started with it when it first came out, I would still be clicking around SSMS like crazy. I like to automate everything that’s repetitive, because I like to spend my time efficiently, doing things that give me energy, instead of doing the same things over and over again.

At this moment the cloud has no impact on my job. Past employers didn’t see any benefit in it at the time, and my current employer doesn’t yet see a use for it either. Maybe something will change in the near future, but for now it’s not going to happen 🙁
I would really love to start working with the cloud and move some of our databases, because I think it’s a valuable addition to traditional architectures.

Do you feel endangered?

No! And I’ll tell you why.

When I first started to work with servers and databases you had a physical server where you had to put in a CD or DVD and run the install from a console to get Windows Server installed. As soon as everything was set up you could remotely log in and do the rest of the work from the office.

The server room would look a little like this:

Servers in a server room

Nowadays you, or your system administrator, log in to VMware or Hyper-V and create a new server; if they work smart, it’s done with cloning or scripting and the server is ready within minutes.
The last time I physically touched a server to do database administration work was about 8 years ago. If you’re in the same situation, you kind of work in a cloud-like environment already.

The second reason I don’t feel endangered is that as a DBA I have to deal with everything within and around SQL Server. Bluntly put, a VM to me is nothing more than a box of resources where my instances do their work. I know people will disagree with me, but if you’re not the system administrator, the underlying hardware layer is invisible to you.

If your employer is a company with one or more database servers, there will always be a need for a DBA. There will be performance issues, new installations, high availability, reporting, etc.

Or do you have more exciting features/toys to work with?

In the last couple of years so much has changed in SQL Server that it’s impossible to comprehend everything.

I’m excited to work with SQL Server vNext on Linux for example. There are a lot of new features for Business Intelligence with Analysis Services and Power BI.

You can create stretched databases where parts of the database are in the cloud and others are on-premises. Imagine a hybrid environment where parts of the network are local/on-premises and other parts are deployed in the cloud.

Microsoft has embraced PowerShell for SQL Server and has now made it open-source. How cool is that! There are more and more people developing in PowerShell and creating modules like dbatools.

There is so much new stuff that I have to cherry pick what to do first.

Do you embrace the change and learn new skills?

I embrace change and love to learn new skills.

The world is changing for the DBA and we as DBAs have to change with it. We’re no longer the strange guy in the cubicle who only shows up when something goes wrong. I see more and more situations where we have to be at the forefront, taking action and being visible to our colleagues.

If your employer wants to work in the cloud, don’t be afraid of it; embrace it and learn everything you can about it. If you have processes that are inefficient, automate or optimize them and make your life easier.

Be the one that has the vision to get the company to a higher level and you’ll see that everything will work out.



Testing of backups updated


Last week I showed how you can test your backups using the Test-DbaLastBackup function in the dbatools module.

The script makes it easier to iterate through multiple servers, export the data and send the results to you by e-mail.

My good friend Rob Sewell wrote a nice post taking the Test-DbaLastBackup function a little further. In his example he uses COM objects to create the Excel document. I personally don’t like COM objects because they’re slow and you need the application installed on the machine running the script. Although COM objects have these disadvantages, they’re pretty solid and work smoothly most of the time.

I wasn’t satisfied with my script exporting to CSV, because I would open the CSV file with Excel anyway, so why not export to an Excel file right away? I usually use the ImportExcel module to export data to Excel.
There was also a need to add the data and log directory parameters to the command, to make sure you can use other drives to temporarily store the database files.

To run the script you need the dbatools and ImportExcel modules.
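Both are available from the PowerShell Gallery, so installing them is a one-time step (assuming PowerShell 5 or PowerShellGet is available):

```powershell
# Install the two required modules for the current user
Install-Module -Name dbatools -Scope CurrentUser
Install-Module -Name ImportExcel -Scope CurrentUser
```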

The last thing that was done is that this function is now on GitHub and available for anyone to download via this link.

The function can also be seen below:

function Test-MyBackup {
    <#
    .SYNOPSIS
        Function to automatically test backups for SQL Server

    .DESCRIPTION
        The function is a wrapper around the Test-DbaLastBackup function
        from the dbatools module. It makes it easier to collect the information
        and send the results.

    .PARAMETER SqlServerSource
        The instance where the backups should be tested for

    .PARAMETER SqlServerDestination
        The instance where the backups should be tested on

    .PARAMETER DataDirectory
        Directory used to place the data files for the database

    .PARAMETER LogDirectory
        Directory used to place the transaction log files for the database

    .PARAMETER OutputPath
        Path where the results and log file should be saved

    .PARAMETER Databases
        Filter for the databases. This can be more than one

    .PARAMETER SMTP
        E-mail server to use to send the e-mail

    .PARAMETER From
        E-mail address from

    .PARAMETER To
        List of people that need to receive the e-mail

    .PARAMETER MailCredential
        Credential for the e-mail server if a credential is needed

    .EXAMPLE
        Test-MyBackup -SqlServerSource 'SQL1' -SqlServerDestination 'SQL2' -OutputPath 'C:\BackupTest\Log' -SMTP 'mail.corp1.com' -From 'backuptest@corp1.com' -To 'user1@corp1.com'
    #>
    param (
        [Parameter(Mandatory=$true, ValueFromPipeline=$false)]
        [Alias("ServerInstance", "SqlInstance")]
        [string[]]$SqlServerSource,
        [string]$SqlServerDestination = $null,
        [string]$DataDirectory = $null,
        [string]$LogDirectory = $null,
        [string]$OutputPath = $null,
        [string[]]$Databases = $null,
        [string]$SMTP = $null,
        [string]$From = $null,
        [string]$To = $null,
        [System.Management.Automation.PSCredential]$MailCredential = $null
    )

    # Test if the output path is available
    if (!(Test-Path $OutputPath)) {
        try {
            New-Item -ItemType Directory $OutputPath | Out-Null
        }
        catch {
            Write-Error $_.Exception.Message
        }
    }

    # Clean up the output path variable
    if (-not $OutputPath.EndsWith("\")) {
        $OutputPath += "\"
    }

    # Create a timestamp
    $TimeStamp = Get-Date -Format 'yyyyMMddHHmmss'

    # Set up the error log
    $ErrorLog = "$($OutputPath)BackupTest_$($TimeStamp).log"

    foreach ($s in $SqlServerSource) {
        if ($s) {
            # Set the destination
            if ($null -eq $SqlServerDestination) {
                $SqlServerDestination = $s
            }

            Write-Host "Start testing backups for $($s)"

            # Check if the instance is a named instance
            $ServerName, $InstanceName = $s.Split("\")

            if ($null -eq $InstanceName) {
                $InstanceName = 'MSSQLSERVER'
            }

            # Execute the test
            try {
                # Set up the output file
                $OutputFile = "$($OutputPath)BackupTest_$($ServerName)_$($InstanceName)_$($TimeStamp).xlsx"
                $OutputFileCsv = "$($OutputPath)BackupTest_$($ServerName)_$($InstanceName)_$($TimeStamp).csv"

                # Test it!
                try {
                    if ($null -eq $Databases) {
                        $results += Test-DbaLastBackup -SqlServer $s -Destination $SqlServerDestination -DataDirectory $DataDirectory -LogDirectory $LogDirectory
                    }
                    else {
                        $results += Test-DbaLastBackup -SqlServer $s -Destination $SqlServerDestination -Databases $Databases -DataDirectory $DataDirectory -LogDirectory $LogDirectory
                    }
                }
                catch {
                    # Oh oh error
                    $ErrorMessage = $_.Exception.Message
                    $TS = Get-Date -Format 'yyyyMMddHHmmss'
                    "$($TS): $ErrorMessage" | Out-File $ErrorLog -Append
                }

                # Export the results
                if ($null -ne $results) {
                    $results | Export-Excel $OutputFile -Show -AutoSize -AutoFilter -FreezeTopRow -ConditionalText $(
                        New-ConditionalText Skipped -BackgroundColor Gray -ConditionalTextColor Black
                        New-ConditionalText Success -BackgroundColor Green -ConditionalTextColor Black
                        New-ConditionalText True -BackgroundColor Green -ConditionalTextColor Black
                        New-ConditionalText False -BackgroundColor Red -ConditionalTextColor Black
                        New-ConditionalText Failed -BackgroundColor Red -ConditionalTextColor Black
                        New-ConditionalText -ConditionalType Last7Days -BackgroundColor LawnGreen -ConditionalTextColor Black
                    )

                    #$results | Export-Csv $OutputFileCsv -NoTypeInformation
                }

                # Check if the results need to be e-mailed
                if (($SMTP.Length -ge 1) -and ($From.Length -ge 1) -and ($To.Length -ge 1)) {
                    try {
                        Write-Host "Sending e-mail"
                        Send-MailMessage -From $From -To $To -Subject "Backup Test '$($ServerName)\$($InstanceName)' Completed" -Body "Backup Test for server $($ServerName)\$($InstanceName)" -Attachments $OutputFile -Priority High -SmtpServer $SMTP #-Credential $MailCredential
                    }
                    catch {
                        # Oh oh error
                        $ErrorMessage = $_.Exception.Message
                        $TS = Get-Date -Format 'yyyyMMddHHmmss'
                        "$($TS): $ErrorMessage" | Out-File $ErrorLog -Append
                    }
                }
                else {
                    "Couldn't send e-mail for backup test on instance $($ServerName)\$($InstanceName). Missing values in the e-mail parameters." | Out-File $ErrorLog -Append
                }
            }
            catch {
                # Oh oh error
                $ErrorMessage = $_.Exception.Message
                $TS = Get-Date -Format 'yyyyMMddHHmmss'
                "$($TS): $ErrorMessage" | Out-File $ErrorLog -Append
            }
        }
        else {
            Write-Error "No instance given."
        }

        $results = $null
    }

    Write-Host "Done Testing."
}


## Load module and run functions now.. 
Import-Module dbatools
Import-Module ImportExcel

<#Test-MyBackup `
    -SqlServerSource 'SQL1' `
    -SqlServerDestination 'SQL2' `
    -DataDirectory "C:\BackupTest\Temp\Data" `
    -LogDirectory "C:\BackupTest\Temp\Log" `
    -OutputPath 'C:\BackupTest\Log' `
    -SMTP 'mail.corp.com' `
    -From 'backuptest@corp.com' `
    -To 'user1@corp.com' #>



Testing your backups with dbatools

Backup Yourselves Data Loss Is Coming


It has always been said: you’re only as good as your last restore, not your last backup. How many of you make your backups and assume that everything is OK? Then there comes a time when you have to restore your database from a backup and you find out that one or more backup files are corrupt.

Testing your backups is a tedious job and it takes a lot of time, which I as a DBA don’t have. I don’t have the time to restore a database and run a DBCC check for every database that’s backed up.

There is a solution and it’s called “Test-DbaLastBackup” which is part of the dbatools module.

It goes through the following steps:

  1. Restore the database with the name “dbatools-testrestore-[databasename]” by default. The prefix can be changed.
  2. Run a DBCC check on the database
  3. Remove the database

Test Backup Progress

You’ll see a progress bar showing how far along the database restore is.
After the restore, the DBCC command is executed. You won’t see any progress for that step.

When the entire process is complete the command will output the results:

Test Backup Progress

But for me that’s not enough. This is one database, and some of my servers have more than 20 databases on them, with sizes ranging from 50 GB to 500 GB (not that large yet, but large enough). I want to create a job that executes the test on all the databases and sends the results to me.

It’s not going to be a SQL Server Agent job but a Windows Scheduled Task. I don’t need the SQL Server Agent for this and it makes more sense to do this outside of SQL Server.

To start testing I created a file called “backuptest.ps1” and put in the following code:

## Load module and run functions now.. 
Import-Module dbatools

## Execute the test
try {
    Test-DbaLastBackup -SqlServer SSTAD-PC -Databases AdventureWorks2014, AdventureWorks2014_2, AdventureWorks2014_3 | Export-Csv C:\temp\backuptest.csv -NoTypeInformation
}
catch {
    $_.Exception.Message | Out-File C:\Temp\backuptest.log
}
I added a try/catch block to make sure I would be able to catch what went wrong.

If you don’t know how to execute a PowerShell script from the Windows Task Scheduler please read the following blog post: Use the Windows Task Scheduler to Run a Windows PowerShell Script.

After the setup my action looks like this:

Test Backup Task

Make sure your task is set to run under an account that has privileges to access the SQL Server and write the file.

Test Backup Privs

A couple of things that could be in this script:

  1. Execute the script over multiple servers
  2. Mail the results
  3. Add checks and error catching to make sure everything runs

Unfortunately the command “Test-DbaLastBackup” doesn’t allow us to supply multiple servers. I could copy the row that tests the backup, but that’s not me. I want things to run in one go; repetitive things don’t work for me.

I don’t want to log into my server and look up the results. I want them in my e-mail box when I check my e-mail in the morning. For that you can use the Send-MailMessage cmdlet.
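A minimal sketch of that e-mail, reusing the CSV from the script above; the server name, addresses and path are all placeholders:

```powershell
# Mail the CSV with the test results; every value below is a placeholder
Send-MailMessage -SmtpServer 'mail.corp.com' `
    -From 'backuptest@corp.com' `
    -To 'dba@corp.com' `
    -Subject 'Backup test results' `
    -Body 'The results of the backup test are attached.' `
    -Attachments 'C:\temp\backuptest.csv'
```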

The original script has been removed because it has been updated and is now available on GitHub via this link. Also check the new post about the updated version of the script.

Executing the script:

$pass = ConvertTo-SecureString "blabla" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ("your@emailaddress", $pass)

Test-MyBackup -SqlServerSource 'yourserver', 'yourserver2' -OutputPath 'yourpath' -SMTP 'yoursmtpserver' -From 'youremailfrom' -To 'youremailto' -MailCredential $cred

The script works well, although I would have liked to put in functionality like accepting values from the pipeline.

I hope you can use this script to your advantage and that testing your backups is no longer something to avoid but something to embrace.


Export-DMVInformation Module Update

Last Friday I had the chance to show the Export-DMVInformation module to the Dutch PowerShell user group. After the presentation I got a couple of suggestions and wanted to implement them in the module.


  1. Possibility to execute the module using the pipeline
  2. Get all the databases in one statement by assigning the value “ALL” to the database parameter.
  3. Replaced messages with verbose

Changes 1 and 2 reduce the amount of time needed to get information from multiple instances and multiple databases. Before, you had to execute the module in an external loop, once for each instance and each database. Now you can get the information from all the databases across multiple instances in a single line!
'server1', 'server2' | Export-DMVInformation -database 'ALL'

Change 3 gives the ability to choose whether to show all the messages or not. Use the switch “-Verbose” to see the various messages.

An example how the new module works is shown below:

Export DMV result

Please take a look how to install or use the module on the module page.
Happy exporting!


Pretty up your KeePass

It's So Beautiful

Having a password manager these days is almost required, especially for people working in IT. One password manager that I’ve been using for years is KeePass. You can secure the database with a master password and, additionally (and recommended), with a key file.
You can create folders to differentiate the entries and it has a ton of features to make password management easy.

To make things even easier several browsers like Firefox and Chrome support using KeePass to handle passwords for websites.

If you don’t already have some sort of password manager, and I don’t mean the Post-its on your monitor, I would tell you to at least give it a try. There are other solutions out there, but this one is free and has made my life a lot easier.

So back to the original subject, prettying up KeePass. You might have noticed when you use KeePass that you can assign different icons to an entry.

Entry Details

And when you click the icon button you can select many standard icons.

Icon selection

KeePass makes it possible to add a custom icon and save it into the database. You don’t have to have the icons available on your computer.

As you can see in the image above, I also have a selection of custom icons that I’ve added for my entries. For example, I have an entry for LinkedIn for which I have the official icon.

These days almost every bigger company’s website has its logo in the browser’s tab or address bar. That’s called a “shortcut icon”. Most websites name it “favicon.ico” or something similar.

If you open the source code of the website and look for, for example, “shortcut icon”, you will get a URL that points to the icon. Copy the URL, open the password entry, click the icon button, click the add button for custom icons and paste the URL into the filename textbox.

Add new icon

Click the “Open” button and give it a couple of seconds to download the icon file and import it into the database. After the import, the icon is selected in the icon picker window. Click “OK” and you’ll see that your entry has its own icon.

Icon selection result

Now what are you waiting for, go pretty up your KeePass entries.

No One Listens To Me! Why? Part 2 – Colleagues


In part 1 of the “No One Listens To Me! Why?” series I described a situation where I had to convince my manager to act on a number of issues.

This story will again be a true story where this time I had to convince my colleagues.


I went on the SQL Cruise, which is a fantastic way to learn and meet new people. I attended a session by Argenis Fernandez that touched on the subject of page verification.

Coming back to the office I wanted to start implementing my newly acquired knowledge. I first investigated some of our main databases. What I found was that some of them were not configured with the right page verification; they still had it set to “NONE”. These databases were not installed by me and I hadn’t considered that this could be set this way.
If you know anything about corruption you’ll know that this setting is crucial for finding corruption in your database.
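Checking which databases are affected is a quick query against sys.databases; a sketch using the SqlServer module’s Invoke-Sqlcmd (the instance and database names are placeholders):

```powershell
# List databases whose page verification is not set to CHECKSUM;
# 'SQL1' is a placeholder instance name
Invoke-Sqlcmd -ServerInstance 'SQL1' -Query "
    SELECT name, page_verify_option_desc
    FROM sys.databases
    WHERE page_verify_option_desc <> 'CHECKSUM';"

# Fixing a database is a single statement; only pages written
# after the change get a checksum
Invoke-Sqlcmd -ServerInstance 'SQL1' -Query "
    ALTER DATABASE [YourDatabase] SET PAGE_VERIFY CHECKSUM;"
```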

Due to the impact this could have, I went to the application engineers and arranged a meeting with them. There was a reluctance to come to the meeting, but it was important to discuss, so with upside-down smiles they arrived 10 minutes late.

The following conversation took place:

Me: I did some investigating on a couple of databases and found that there are some settings that need changing. We’re now on SQL Server 2012 and we have databases with settings dating back to SQL Server 2000. One of the things I found is that our main database is not being checked for corruption.

Application Engineer (AE) 1: What do you mean it doesn’t get checked for corruption? We do a weekly DBCC check, don’t we?

Me: Yes we do, but that has little use because the pages in the database are not marked with a checksum. Besides, it’s only once a week, and only when it doesn’t get skipped because of the weekly and monthly processes. There is a setting that enables the verification of pages in the database, and this helps us find corruption.

AE 2: Ok but do we need it? Why should we enable it?

Me: As I just told you, it’s for finding out whether corruption took place in the database. Any corruption in the database is a really bad thing and should be avoided at any cost. If we don’t enable this, we won’t know about corruption at an early stage. We would only find it if the page is severely corrupted or if someone happens to select the affected data. Let me ask you a question: do you want to vouch for a database that could potentially give you corrupt information?

AE 1: When was this setting first released? And you still haven’t answered why we should enable it.

Me: You’re not listening to what I’m saying. The feature was first released back in SQL Server 2005 and was considered so important that Microsoft made it the default setting for all new databases. As I’ve answered this question many times: we need this to find corruption at the earliest possible moment.

AE 1: What’s the downside to this? I don’t know if our processes will be hit if we change it.

Me: There is no downside; it’s a setting that helps us. The only thing it’s going to do is, from this point on, check all the data that’s written and create a checksum for it. The next time a page is read, it will compare the stored checksum with the page and see if it matches.
But I understand you; I want this setting to be enabled and tested thoroughly before we put it in production.

AE 1 and 2: We’re still not convinced.

Me: You know what I’ll do? I’ll set up a demonstration to show you how this process works and how easy it is to corrupt some data in a database. Based on that, we can decide what to do from there.

As you may gather from the conversation, I had to deal with people who were scared of change and reluctant to listen to anything I told them. I still had to cooperate with them to get this change done, so I moved on.

Like I said in the previous article, you have to have the facts to build your case and that was the reason I wanted to show them how a corruption could work and why this setting was so important.

I created a demo and gave the presentation to the application engineers.
I showed them a healthy database, corrupted it with the setting we were currently using, and we didn’t find the corruption. I did the same thing with a database with the new setting and, of course, it showed me the corruption.

The outcome was far from satisfying because they were still not convinced.

Up to this point, let’s check what went wrong:

  1. Strong arguments did not work
  2. Nobody cared that the situation could actually happen.

Strong arguments did not work

Even with the evidence in front of them I couldn’t convince my colleagues. The question they asked me in the end was: “But how many times would this actually happen?”.

If I had wanted to, I could’ve just changed the setting and let nobody know. In previous testing I saw no performance loss and no other symptoms either.

But that’s not what I wanted to do. If something would go wrong for some reason I would get the blame and that wasn’t what I had in mind.

From this point on I should’ve gone higher up the chain of command to my manager.

Nobody cared that the situation could actually happen.

This hit me right after I did the corruption demo: they didn’t care, and as long as everything worked they weren’t keen on changing anything. I understand the last part, because I don’t want to change something that works, but this was different because we hadn’t done any thorough checks.

You’ll never be able to convince these people. If you get to this point and you’ve done all the work, laying out the facts (and even demos), then take your losses and go higher up.

I did everything but still they won’t listen

Ok, so this situation was an extreme one. I was not able to convince my colleagues, and this was a scenario I couldn’t win.

Instead I went higher up the chain, because in the end my supervisor would be responsible if something went wrong. My supervisor first reacted the same way as my colleagues. To make sure I got my point across, I went to the IT manager.
I explained the situation and showed him the possible downtime and how much it would cost if we didn’t put this setting in place.

In the end the IT manager was not pleased with the whole situation, and my colleagues and supervisor were called in to explain their reasons. That last thing came back to bite me.
That didn’t stop me, though, because if we didn’t put the setting in place, I would have been the one responsible for fixing everything, even though my colleagues were the ones who decided not to change anything.


Like I said, this situation was an extreme one. In normal circumstances, with people with the right attitudes, this would never happen. I’ve had other situations that were the opposite of this, where my colleagues would listen to reason.

In the end it was my responsibility to make sure the data is protected. You’ll always have people who won’t agree with you about something. In most situations you’ll be able to convince them with the cold hard facts.

If that still doesn’t work you have to go higher up the chain of command. I’ve always been rigorous when it came to my work and if someone was in the way of me doing my work properly I would go around them.

That doesn’t mean you have to do that all the time. Choose your battles wisely and put your effort where it makes the biggest impact, so you can deliver the best work you can.

Work After Hours

After hours email

It’s 8 PM, the kids are in bed and my wife and I finally have some time for ourselves, until I get an e-mail from work. Do I open the e-mail after hours or do I get to it in the morning? But what if it’s important and I have to act now? These days we have to deal with blurring boundaries between work and life, which has an impact on the so-called work/life balance.

I recently read an article about France introducing a new law under which French workers are no longer obliged to respond to work-related e-mails or phone calls after hours.

What’s not really clear is whether this bill limits the communication to e-mail and phone calls only, or whether messaging apps are included too. These days we also use apps like Slack, HipChat and WhatsApp to communicate with colleagues, which could be a loophole.

The thing that struck me when I read this is that in most situations it won’t work, because you’re removing all the flexibility. The other thing is that many companies evaluate employees based on their availability and flexibility.

There is also a side note that employers are allowed to make different arrangements with employees.
Employers will probably adjust contracts from this point on so that, if you’re in a position where availability is important, you’re obliged to answer, which will render the law useless in a lot of situations.

I’ve never had a problem responding to work-related e-mails after hours, because I think that in my profession as a DBA it’s part of the job. Usually you’re the lone DBA (or part of a small team) and when shit hits the fan there’s nobody else who can fix it. That’s not unique to the DBA field; it applies to almost all the colors of the IT rainbow.

In my opinion you’re responsible for setting your own boundaries, to make sure your work/life balance doesn’t get distorted in such a way that it affects your personal life in a bad way.

One thing I’ve always done when I was in a situation where I had to be available after hours is have a separate phone. If I’m on vacation and I’m not able to respond, or don’t want to respond, I leave that phone at home. I would give my personal phone number to a very limited group of people, so that only in a really bad situation could they try to contact me.
This has always worked out for me, and fortunately I’ve had employers that respected that arrangement.

Still, France passed this law on January 1, and the thought behind it is admirable, but whether it will ever work is something we’ll see in the future.