T-SQL Tuesday #110 – Automate All the Things

Standard

Reading Time: 4 minutes

This month’s T-SQL Tuesday is about automating all the things.

I love automating processes, especially when it comes to managing SQL Server, and I think that most processes should be automated.

So technically there are two tasks for this month:

  • What is your go-to technology for automation?
  • What do you want to automate or what automation are you proud of completing?

For me, a process can be automated when:

  • It’s repetitive
  • Takes a long time to execute
  • And needs human interaction

I will go to great lengths to automate a process. I’d rather spend a week developing an automated solution than execute the same task manually twice and lose hours each time.

What is your go-to technology for automation?

In the last couple of years I’ve automated a whole bunch of processes using various technologies. My go-to technologies for automation are PowerShell, DevOps, tSQLt and dbatools.

You might say, “dbatools is PowerShell, right?”, and you’re correct, but this module has grown so much that it has to be mentioned on its own.

PowerShell

I’ve been working with PowerShell since version 1. I fell in love with it because it filled a gap: we needed something other than a GUI to manage our infrastructure.

PowerShell is my hammer for automating just about anything. It crosses multiple domains within the Windows universe, from managing your domains and users to your servers and services.
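A quick sketch of that versatility: the same query-filter-act pipeline pattern applies everywhere (the service name filter below is just an example).

```powershell
# Find stopped SQL Server services and start them:
# query (Get-Service), filter (Where-Object), act (Start-Service)
Get-Service -Name 'MSSQL*' |
    Where-Object Status -eq 'Stopped' |
    Start-Service
```

Swap the first cmdlet for `Get-ADUser`, `Get-Process` or anything else and the same pattern keeps working.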

The PowerShell team has done a great job to make this platform as versatile as possible for anyone to use.

DevOps

This part is still fairly new to me, and I find that DevOps combines several technologies to get the job done.

I used Jenkins and Octopus Deploy within my CI/CD pipeline to automate releases. Within that pipeline I combined several technologies: Groovy for the pipeline script, PowerShell to execute certain tasks, and dbatools to execute others.

I like Jenkins because it lets me define exactly what I want to do, and it doesn’t have a big learning curve.

Octopus Deploy picks up where Jenkins finishes. Creating releases and deploying them to different servers completes the CI/CD process.

tSQLt

This technology has saved my life. I never did a lot of development, but I know how to develop a database. In the past I made changes, pushed them to a development server and crossed my fingers, hoping I hadn’t broken anything.

With tSQLt I have the ability to create unit tests for SQL Server databases to test every little aspect of my database.

If a small change in the schema breaks a procedure I will know before I release it.
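As an illustration, a minimal tSQLt test might look something like this (the class, table and procedure names are hypothetical, not from a real project):

```sql
-- Create a test class (a schema that groups related tests)
EXEC tSQLt.NewTestClass 'testOrders';
GO

-- Hypothetical test: verify a procedure still returns the expected total
CREATE PROCEDURE testOrders.[test GetOrderTotal returns sum of order lines]
AS
BEGIN
    -- Fake the table so the test is isolated from real data
    EXEC tSQLt.FakeTable 'dbo.OrderLines';

    INSERT INTO dbo.OrderLines (OrderId, Amount)
    VALUES (1, 10.00), (1, 5.00);

    DECLARE @actual money;
    EXEC dbo.GetOrderTotal @OrderId = 1, @Total = @actual OUTPUT;

    EXEC tSQLt.AssertEquals @Expected = 15.00, @Actual = @actual;
END;
GO

-- Run every test in the class
EXEC tSQLt.Run 'testOrders';
```

If someone later changes the schema and breaks `dbo.GetOrderTotal`, this test fails before the release ever ships.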

I will be presenting on this subject in the near future and helping people along the way, because I think tSQLt deserves to be used a lot more.

dbatools

Last but not least, dbatools. This project changed my life in several ways.

First, it taught me how to develop in a large open-source project: how to follow standards and how to handle commits. I did not know Git that well before I started contributing, and now I’m one of the major contributors.

Second, it gave me the ability to help others. I had several scripts lying around that could easily be transformed into proper functions/commands for other people to use. If I ran into a problem, there was bound to be someone else with the same one, and I could save that person a lot of headaches.

Third, it made me automate a lot of processes for myself. Think of the backup testing command Test-DbaLastBackup. This command tests your backups by restoring them, running a DBCC check and returning the results to you. You will know whether your backups can be restored and whether the data within them is corrupted.
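A minimal call is all it takes (the instance name here is a placeholder):

```powershell
# Restores the most recent backup of each database on sql1 to a scratch copy,
# runs DBCC CHECKDB against it, then drops the copy
Test-DbaLastBackup -SqlInstance sql1 |
    Select-Object Database, RestoreResult, DbccResult
```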

If you have not worked with dbatools yet, make sure you start with it today. It will change your life as a database administrator or developer.

Fourth, and this one comes with contributing to the project: I met a lot of incredible people from the community. I love how this community shares its knowledge and helps people out.

What do I want to automate

I’ve been working on automating our releases for database development.

This process has taken a considerable amount of time because I had no knowledge and experience in that area.

I have almost finished that process and I’m glad to say that this year I’ll have a CI/CD pipeline with Visual Studio, Git, Jenkins and Octopus Deploy.

My next project will be to implement data masking making it possible to ship databases to other servers and people without exposing production data.

What automation are you proud of completing

There are several projects I’m proud of, and the pattern is obvious: automate repetitive, painstaking processes.

Change your mindset to embrace automation and you never want to go back.

I have less and less time to get things done. My agenda is filled with meetings and project deadlines, and more and more lands on my plate.

Start automating your work today. You have to put some energy into automating your processes up front, but you’ll never look back.


TSQL-Tuesday #108: A Journey Into CI/CD


Reading Time: 4 minutes

This month’s TSQL Tuesday is hosted by Malathi Mahadevan (w | t) and she asks us to write about one thing you want to learn that is not SQL Server.

As a SQL Server database administrator it’s easy to think we only deal with that technology. But the role of the database administrator has been changing for a while now, and we need to know about other technologies besides SQL Server.

In my daily job I’m not just the database administrator, but I’m also a database developer. As a developer I had to make the shift from developing databases in SSMS to Visual Studio, which I still have to get used to but I’m getting there.

The reason I had to make that shift was timing: our IT team was about to switch from SVN to Git, and we needed more collaboration, standards and processes to deploy new releases.

For me that was a BIG learning curve. As a DBA I never had to deal with source control for our databases. At previous jobs I never did a lot of development, mostly maintenance, and the responsibility for the databases was mostly with the third parties and development teams.

So now our projects are developed using the right tools and the source code is saved in a way that lets others collaborate, but the deployment process is still done manually.

Why use CI/CD?

CI/CD, or Continuous Integration/Continuous Delivery, is the cornerstone of DevOps. We’ve heard about DevOps for a while now, but it never stuck with me until now.

If you know me, you know I’m a real advocate for automation. I like to automate just about everything that’s repetitive, to make sure things run as smoothly as they can without the obvious PEBCAK (Problem Exists Between Chair And Keyboard) situations.

Handling all the releases on my own makes the development process prone to mistakes and delays. I want a process where I can wear a single hat, either the developer’s or the DBA’s. I do not want to be responsible for both the development and pushing the releases through the pipeline.

CI/CD can help me with that, but throwing around a fancy buzzword is not going to get me there.

How to learn CI/CD?

First of all, this was another big learning curve for me. Going from database development in SSMS to a CI/CD pipeline felt like climbing Mount Everest. But as cliché as this might sound: just take one step, then another, and in the end you’ll get there.

The first thing I had to do was learn what CI/CD actually was. What does it entail? How does it integrate with my tools? What tools do I need to implement to get it to work?

I found that the best way to learn CI/CD is to just start and learn from my mistakes.

The following steps made me familiar with CI/CD:

  1. Read a lot
  2. Migrate database development to Visual Studio
  3. Build the architecture

Step 1: Read a lot

The first thing I did was spend a lot of time reading articles on the internet about what CI/CD actually is. A quick Google search gives you more than enough content to get started.

I always want to know the ins and outs of a technology, at least enough to get started. I find that diving into new technologies lets me make mistakes, which makes me learn at a faster pace and makes things easier in the end.

Step 2: Migrate database development to Visual Studio

I had already started developing tSQLt unit tests for all my objects, and I separated the data model and the data from the tests into separate projects within the solution.

This article helped me setup the solution to make sure everything was nicely separated and organized.

Step 3: Build the architecture

After investigating what I needed I decided to install Jenkins for the build server. Jenkins is free and it’s easy to install.

I set up a virtual machine so I could create snapshots at every step and revert if something went wrong. And I can assure you that I made a lot of mistakes.

This article helped me understand how to setup projects in Jenkins for SSDT.

I actually made some other changes to the project in Jenkins to be able to push the results to a server in conjunction with dbatools and a remote Git server. That’s going to be another article, to be released soon.

What’s the next step?

The next step is to implement a service that will be responsible for the delivery of the artifacts to different destinations like a test, acceptance and finally production server.

I’m still trying out a couple of solutions but I think I’ll probably go with Octopus Deploy.

After that it’s time to put all the knowledge into production: getting the projects loaded and the process implemented within the company. The latter is going to be the most difficult part, because it’s a huge change, but I’m confident that in the end it will make things go smoother.

DevOps is not going to go away and CI/CD will make your life easier in the long run.

To Fail Is To Succeed


Reading Time: 2 minutes

There must be a hundred articles about this subject, but I always like to talk about things that happen to me or that I have an opinion about.

I was reading a blog post by Ben Kubicek about allowing failure, and I agree with a lot of it.

I’m a parent of two adorable kids. The oldest is 4 years old and the youngest is almost 2. The youngest is just walking around, but the oldest will not stop running, climbing on stuff and seeing how far she can go.

I’m not one of those parents who constantly warns their kid that something can happen. If she wants to climb to the highest point of the jungle gym, go ahead. I will of course make sure she doesn’t take a dangerous fall if she slips, but I do tend to let my kids fail.
If they don’t fail, they will not learn. It’s hard, though, because you want to protect them as much as you can, but protecting them all the time does not help them either.

If Thomas Edison had believed in failure… we would still be living in darkness. If Henry Ford had given up, we would still be riding on horseback…if Alexander Graham Bell had given in to the clutches of failure, we would be spending less time staring at those small plastic things we call phones that rule our lives (which might not be a bad thing!).

Did I ever fail

Oh yes I did!

In my career I had my share of fails and from each and every one of them I learned something. Sometimes it was a hard lesson and sometimes it was something small, but every time I tried to learn something from it.

I did things like updating rows in a production database without a WHERE clause. I remember executing a change script in a hurry because the boss was watching over my shoulder the whole time, and I didn’t pay attention to the code.
This was in my early years; nowadays I don’t get flustered under pressure.

That’s all well and good, but you just can’t afford to fail when you’re dealing with production issues in environments with high-volume transactions or when lives are at stake.

If you’re scared to fail in those environments, then there is another problem. You probably don’t have a good disaster recovery plan (or haven’t tested it), you’re not confident about your backups, or you’re just not confident enough in yourself.

There is going to be a time when something goes wrong and there is failure. Be prepared for it, so that when disaster strikes you know what to do.

There are plenty of solutions that will test your backups, articles on how to set up a disaster recovery plan, and processes to prevent unwanted changes (like Continuous Integration/Continuous Delivery).

So don’t be scared to fail. Learn from your mistakes and prepare yourself so it doesn’t happen again.

Working hard pays off


Reading Time: 2 minutes

On Monday, October 1st, 2018 I was awarded the Cloud and Datacenter Management MVP award. I did not know about it until a few days later, because I was busy renovating my office. During the renovation I did not regularly check my e-mail, and the notification slipped through the cracks.

Two days later, at 6 AM, I checked my e-mail and jumped up. I saw an email from the MVP Global Administrator that started with “Congratulations…”. I got the MVP award.

How did I get here

I remember when I first started contributing to the community. It was after SQL Cruise, now Tech Outbound, that I got introduced to some of the famous people, like Grant Fritchey (t), Argenis Fernandez (t) and Tim Ford (t), the organizer of the event.

During the cruise I met Cathrine Wilhelmsen (t), who asked me why I was not presenting about all the things I knew. I did not think I was that knowledgeable compared to the people on the cruise, but she thought otherwise. She, and the others on the cruise, suggested I start presenting, and I got into contact with Aaron Nelson, who gave me the opportunity to present at a PASS virtual chapter.

I got into contact with Chrissy LeMaire (t), the godmother of dbatools. My first task was a new command, and that got me started developing for dbatools and contributing more to the community.

I started doing more presentations and spent countless hours developing PowerShell commands to help me and others simplify DBA work. It was a lot of work, but the reward I got from the community was worth it.

Over the years I’ve met some wonderful people and that certainly was a reason to continue.

I want to thank my boss, Joe Sheffield (t), of Waypoint Analytical LLC, for supporting my contributions to the community. This would not have been possible without him.

I’m not going to stop; this only lets me be more active in the community.

Thank you to everyone who supported me, and keep working hard, because it pays off in the end.

TSQL-Tuesday #104: Code You Would Hate To Live Without


Reading Time: 3 minutes

This month’s TSQL Tuesday is hosted by Bert Wagner (w | t) and he asks us to write about code you’ve written that you would hate to live without.

Over the years I’ve regularly used other people’s code, whether it came from GitHub, Stack Overflow, blogs, etc.

I have always appreciated the way the SQL Server community (and other communities) share their code to help others.

I have worked alongside lots of people who were not very keen on sharing. People actually told me that their code was theirs and only theirs and that nobody is to see it.

Although I don’t agree, I do understand people not wanting to share their code. They may have worked hard on it, and it could give them an advantage over people who don’t have that “special” piece of code.

There are a couple of caveats to that:

  1. Your code could have bugs that you never noticed
  2. Your code could be improved by other people with different views

I am completely the opposite. If my code can help one person make their life easier, I have already won. The last few years have proven that, because I have been contributing to dbatools. Through contributing I have learned a lot about open source projects and about SQL Server, and I have met people from all over the world who in turn shared their code to make my life easier.

Code that I couldn’t live without

Back to the task at hand: the code I’ve written that I cannot live without.

Anything I create that is usable for other DBAs tends to end up in the dbatools module. The reason I do that is that it’s an amazing platform, and it makes it easy for me to reach a large audience.

Testing Backups

One piece of code I cannot live without builds on the dbatools command Test-DbaLastBackup. Testing your backups is one of the most important things you can do as a DBA. I did not write that command, but I did use it in one of mine:
it’s basically a wrapper that sends me an email with all the tested backups and the results in a CSV file. The code is freely available on my blog.
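The full command is on my blog; as a rough sketch of the idea (the instance name, paths and addresses below are placeholders, not my production values):

```powershell
# Test the last backups, export the results to CSV and mail the report
$csv = 'C:\Reports\BackupTests.csv'

Test-DbaLastBackup -SqlInstance sql1 |
    Export-Csv -Path $csv -NoTypeInformation

Send-MailMessage -From 'dba@example.com' -To 'dba@example.com' `
    -Subject 'Backup test results' -Body 'The test results are attached.' `
    -Attachments $csv -SmtpServer 'smtp.example.com'
```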

Log Shipping

Some other things I’ve written are the log shipping commands in dbatools. At my current workplace we have log shipping enabled, and everybody who has set up log shipping knows it’s a PITA if you have lots of databases.

I recently presented a session about the log shipping commands on the PASS DBA Virtual Chapter. The session was recorded and if you’re interested you can watch it through this link.

One log shipping command in particular that I use regularly is Test-DbaLogShippingStatus. This command makes it easy to check up on my log shipping servers and see the status of each database.
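A call is as simple as this (the instance and database names are placeholders):

```powershell
# Status of every log shipped database on the instance
Test-DbaLogShippingStatus -SqlInstance sql2

# Or limit it to a single database
Test-DbaLogShippingStatus -SqlInstance sql2 -Database DB1
```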

SQL Server Agent

I don’t like to click around and if I can do something in a console I choose to do that. In most cases it is faster than the alternative.

The same goes for the SQL Server Agent. You have your jobs, schedules, job steps etc. For all these objects I have created commands in dbatools. The commands enable you to create, edit or remove any of those objects with ease.
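As a sketch of what that looks like, creating a job with a step and a schedule goes roughly like this (the job, step and object names are made up, and the parameters are from memory, so check the dbatools documentation):

```powershell
# Create a job, add a T-SQL step to it and attach a daily 2 AM schedule
New-DbaAgentJob -SqlInstance sql1 -Job 'NightlyCleanup' `
    -Description 'Cleans up staging tables'

New-DbaAgentJobStep -SqlInstance sql1 -Job 'NightlyCleanup' `
    -StepName 'Run cleanup' -Subsystem TransactSql `
    -Command 'EXEC dbo.CleanupStaging;'

New-DbaAgentSchedule -SqlInstance sql1 -Job 'NightlyCleanup' `
    -Schedule 'Daily at 2 AM' -FrequencyType Daily -StartTime '020000'
```

The matching Set-* and Remove-* commands let you edit or drop the same objects without ever opening SSMS.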

So these are the scripts I cannot live without.

Maybe these scripts can help you in your journeys through SQL Server.


Reflect on 2017 And My Learning Goals for 2018


Reading Time: 4 minutes

Unfortunately I missed the last T-SQL Tuesday of 2017, but I’ll share my learning goals for 2018 anyway. I also want to reflect on last year, because I think it’s good to see whether I was able to improve.

Reflect on last year

At the end of 2016 I set myself the following goals:

  • Do more presentations
  • Get better in PowerShell development
  • Play a bigger part in the SQL Server community

I got really excited in 2016 when I was drawn into the SQL Server community by presenting and helping people out. That gave me a lot of energy, and I wanted it to continue in 2017. And oh boy, did it continue!

I got really involved with the dbatools project. I developed a bunch of functions to support the SQL Server Agent, log shipping and more. With that I killed two birds with one stone: playing a bigger part in the SQL Server community and getting better at PowerShell development.

I also wanted to do more presentations. I submitted sessions to lots of events and got to present at a couple, and not the smallest ones either: I got a session at SQLSaturday Oregon and a lightning talk at the PASS Summit.

One thing I really wanted to get better at was unit testing. As a DBA with development experience I had never come into contact with the concept, but I got excited about it thanks to Rob Sewell (b | t), who showed me some interesting uses of PowerShell unit testing with Pester.
The unit testing concept was a real eye-opener and a big learning curve. By the end of 2017 I was able to create sophisticated unit tests for SQL Server (tSQLt) and PowerShell (Pester), which helped me a lot with my development.

Goals for next year

So the T-SQL Tuesday post had the following questions:

  • What do you want to learn?
  • How and when do you want to learn?
  • How do you plan to improve on what you learned?

What do you want to learn?

I always want to learn a lot of things, but if I had to make a shortlist it would contain the following:

  • Query Store
  • SQL Server 2017
  • Continuous Integration
  • More consistent blogging

I’m a database administrator with knowledge of database development and related processes. I’ve done small projects with business intelligence, but that subject does not really suit me. I understand the technology and the thought process, and that’s good, but I could not switch to it full time.
I get excited when I see an inefficient process or inefficient code and am able to improve it through automation or by redeveloping the procedures.
That’s where my strength lies, and that’s where I want to continue.

Query Store

I went to PASS Summit 2017, and there were a lot of talks about Query Store. I have not yet had the chance to take a deep dive into the material, but I will very soon. I think this is one of the things that will impact my work the most.

SQL Server 2017

That’s a big topic and something I’ve been looking forward to. I did tests with SQL Server 2017 when it was first released, but this year we’re going to put it into production.

Continuous Integration

You already read about my work on unit testing with tSQLt and Pester. Now I want to take the next step and implement CI for database development.

It costs me a lot of time to create new release scripts and to make sure everything works as expected. I want to automate that and make my life, and that of the other developers, easier.

More consistent blogging

Last year was a really busy year for me, both at work and personally. Because of all the pressure I was not able to blog as much as I wanted; the only thing I managed was to create some quick drafts. I have about 50 drafts lying around that I could finish, and I probably should.

My goal for this year is to publish a new blog post at least every two weeks.

How and when do you want to learn?

Like many other people in the tech industry, I have a full-time job, and finding time to study can be tough. We never have a 40-hour work week, and sometimes we put in 10 or more hours a day.

I have the privilege that my employer enables me to do self-study. That means that I can spend about 10% of my time learning new things or improve my current skills.

But that’s often not enough. When I want to take a deep dive, I get into the details of a subject: I ask questions that in most cases would not make sense, or I try to break things to see what would happen in a production environment.

That answers the question of when I learn, but not how. How I learn largely depends on the topic at hand. In most cases I search for already published articles and google for any information I can find.

In most cases that gives me enough information to get started. Another place with a lot of content is the Microsoft Virtual Academy. If all of that still does not satisfy me, which does not happen very often, I turn to the SQL Server community and ask questions. One place with lots of IT professionals present is the SQL Server Community Slack channel.

How do you plan to improve on what you learned?

The subjects I described earlier are important to me because they can make my life easier. They will improve my skills as a DBA and ensure fewer mistakes are made.

I already mentioned that I want to spend more time on my blog. I think blogging is an important part of my career, and it has enabled me to connect with the rest of the world.
If you blog about something, you need to know all the ins and outs of the subject. You cannot write about something without being confident about the information you’re presenting. Diving into each subject to make sure I have all the information will make me better at that.

Someone once told me that if you can explain a particular subject in children’s language, you know enough about it to present it to others. That’s true for both blogging and presenting a session.

Talking about presenting: I’m also going to spend more time developing presentations and submitting them to events. I love to present about the stuff I do, and if I can help just one person, that’s already a win for me.
All the new stuff I’ll be learning could also end up in a session so expect me to submit lots of sessions this year.

This is going to be a busy year but I like that. Lots of things to get better at.

Thoughts on PASS without a SQL Saturday


Reading Time: 2 minutes

I was kind of surprised when I heard about PASS’s plans to no longer spend money on SQL Saturday and I want to share my thoughts about it.

Constantine wrote a blog post about this matter, and I felt I should write one too.

There is a quote in the #SQLSaturday channel in the SQL Community Slack that blew my mind.

They [the PASS board] are already signaling they don’t want to spend $$ on SQL Sat and a few of the board members would just as soon SQL Sats die.

Let me be clear, if there were no local user groups and/or SQL Saturdays I do not think that PASS would exist and let me explain why.

I went to my first SQL Saturday a long time ago and I immediately felt that I was part of something big. The amount of effort that was put in by all the volunteers and sponsors to let us professionals grow was incredible.

For any data professional to grow you need content that’s up-to-date. You need interaction with other experts to get new ideas.
The SQL Saturday volunteers, the presenters, the organizers, the sponsors and all the people in the background make it possible to get the most up-to-date content there is.
There is no other organization in the world that offers that amount of content for free (or sometimes a small fee) and lets you interact with professionals in such a pleasant manner, all without having to spend thousands of dollars to attend the PASS Summit.

Although PASS does offer lots of content through virtual chapters, I think that’s only part of what matters to data professionals. SQL Saturdays give us a platform to connect with other people in our field, and they provide networking and job opportunities.

I can say with certainty that without SQL Saturdays I would not be where I am today.

The volunteers have a really tough job arranging venues, food and drinks, and everything else needed to organize these events, and they do it all out of love for the SQL Server community.
Some SQL Saturdays would not survive without the help of the PASS organization; if PASS cut the funding for those events, I bet they would disappear really fast.

To conclude: cutting funds for SQL Saturdays will cause events to disappear.
Disappearing events will leave the organization without the platform it had to reach data professionals.
Without that platform, people will find other ways to get content and will probably skip the PASS Summit, causing a loss in revenue, and that in the end could be the end of the PASS organization.

Looking back at my first lightning talk


Reading Time: 2 minutes

I had the opportunity to speak at the PASS Summit 2017 with a lightning talk.

This was the first time I ever did a lightning talk and it was different than a normal session.

It all boils down to the fact that you only have about 10 to 15 minutes for the entire talk.
This brings a couple of complications, because suddenly you have to make sure your content fits within that time frame. With that amount of content you also have to make sure you don’t go too fast. Going too fast in a lightning talk is disastrous, because attendees will not be able to follow you.

I normally do 60-minute sessions, where I have time to dig a little deeper than originally planned.

So how can I do such a short session and still make sure that the audience gets a bang for their buck?

After thinking about it, I took a couple of steps:

  1. Write down the subjects I wanted to talk about
  2. Make the content for the subject and made sure it was short and to the point
  3. Present the content out loud and record it

During the recording I would watch the timer in PowerPoint to see when I would hit the 10 minute mark.

If I had gone over, I would go through the content again and adjust it.
If I made it within the 10 minutes, I would watch the recording and pay attention to how fast I was talking. I had to adjust multiple times to make sure I wasn’t going too fast.

After a couple iterations I was satisfied with the preparation and I could go into the presentation with confidence.

How did the session go?

This was actually really funny. I noticed on the itinerary for my flight that I would have to leave for the airport before 10:00 AM. The lightning talk sessions ran from 9:30 AM to 10:30 AM, so I had to make sure I was the first one to present; otherwise I wouldn’t make my flight.

Fortunately the other presenters were so kind to let me go first and I thank them for it.

Because I had prepared this session pretty well, everything went smoothly. I was able to do my talk, show a couple of demos and finish within the 10-minute frame.

A couple of weeks later I got feedback from several people in the audience, and I was excited about it. Lots of people liked the content and the overall session, so that was a big win for me.

Conclusion

Looking back, this was a very good experience for me. I find a lightning talk far more difficult than a normal session; it all comes down to preparation and putting yourself in the audience’s shoes.

If you attended the lightning talks at PASS Summit 2017, please leave a comment, because I’d really like to know what went well and what didn’t. There is always an opportunity to learn, and I like to get better with every session I do.