The yes/no e-mail and the rubber duck


Reading Time: 4 minutes

I recently read an article about delegating work, and it gave me an idea: the yes/no e-mail and the rubber duck.

What is this article about

The article is about delegating work. When you manage a team, you can get lots of e-mails from people asking all sorts of questions for you to figure out.

Mostly those questions are along the lines of “What do I need to do?”

The article describes that situation and how you can handle it more efficiently for both you and your employees.

What is the problem

The problem is that at some point you end up with a ton of e-mail asking you the what-to-do question, and this can consume a considerable amount of time.
There is nothing wrong with those kinds of e-mails, but I like to work efficiently and try to avoid them.

Some questions may take 15 minutes, but some could take up to an hour and that’s where it becomes a problem.

At that point you’re no longer managing; you’re doing all the work, which in turn does not benefit the rest of the people you’re supposed to manage.

What do you know? You’re not a manager

That’s correct, I’m not a manager, I’m a database administrator. But I have managed junior database administrators in the past, and that’s where I started to get experience with management.

I too would receive the what-to-do questions and dive into them, explaining everything, which led me to abandon my own work and get into trouble with my deadlines.

The junior DBAs would also come into my office asking all sorts of things without explaining the details. As you may understand, when you get interrupted it takes some time to get back to whatever you were doing.

From that moment on I spent time improving myself to make sure things ran smoothly. So I do have some experience in that field.

How did I solve this problem

The yes/no e-mail

There were days when I would get a ton of e-mail with all sorts of questions about how to solve certain scenarios.

I would get a question like this from one of the DBAs:

I noticed in the monitoring that one of the data disks only has 15% of disk space left. What should we do?

This is not a complex question, but I would have to investigate the disk and talk to the SAN admin (if there is one), which would probably take about an hour of my time.

Instead, the DBA could have done this:

I noticed in the monitoring that one of the data disks only has 15% of disk space left. I looked at the databases which caused the growth and there wasn’t anything special going on. A normal growth event has happened because the data file was full.
I talked to the SAN admin to see if we can increase the disk space, and he/she says we can add another 500GB to the 1TB disk.
Would that be a good idea?

Do you see the difference? Instead of asking what to do, the DBA went out of his/her way and learned something. He/she investigated the problem and came up with a solution.
Whether or not I agree with it is up to me, but now I can just reply with a YES or a NO, perhaps with some explanation, and I do not have to spend lots of time on it.

Do not get me wrong, in some situations this will not work. If you have employees who do not have enough knowledge and/or privileges to do things, this will be harder to pull off.

It will take some time to adjust to the new way of communication, but in the end it will be worth it.

This approach does enable the employee to grow. It gives the person more responsibility and makes their work more efficient. It saves time for both the manager and the employee which makes it a win-win situation.

Even if the solution provided by the employee wasn’t the right one, I would not shoot it down. I would praise him/her for their effort and make this a teachable moment.

The rubber duck

This is another method I used, mostly for people coming to my desk to ask a question.

The rubber duck, you say, what does that do? The rubber duck was literally a rubber duck on my desk which acted as a symbol.

Whenever one of the DBAs would come and ask a question, I would point them to the rubber duck. He/she would have to explain the problem in such detail that even the rubber duck would understand it.

What did that do to the person asking the question? The person was forced to explain the problem in such a way that everybody would understand it. In about 90% of the situations the answer would become clear to them and they didn’t need me to answer it.

This worked so well that at some point they would enter my office, look at the duck and leave, already having figured out what to do.

Remember, I never turned them away, I only asked them to provide enough information to let anyone understand the problem. There was never a feeling they couldn’t ask me anything and that’s important.

Conclusion

The methods described above can:

  • increase productivity
  • let a department work more efficiently
  • help give responsibility
  • let employees be more responsible for their work

I hope this gave you some ideas; it certainly worked for me.

 

The double-edged sword of open-source


Reading Time: 4 minutes

I got involved in a discussion about open-source software. Apparently a maintainer of an open-source project handed over the reins to another person, and that person changed the software to include a coin-mining exploit. This really got me thinking about the double-edged sword of open-source.

Where did open-source originate from?

A little history lesson about open-source projects. The term open-source came to be in 1998, after Netscape announced that the source code for its Navigator software was going to be publicly released.

The term got more momentum during the Freeware Summit organized by Tim O’Reilly in April 1998.

That gathering was later renamed the Open Source Summit and is known as the birth of open-source.

So what’s the problem with open-source?

At the same time there is one and there isn’t one.

Open-source is a blessing in my opinion. It enables developers to help out with projects and collaborate. It gives companies the choice to buy proprietary software or to go with open-source, which is mostly free.

The problem with open-source, which most people forget, is that as soon as the software is released, the maintainer has no responsibilities towards anyone. The software is provided “AS IS”, which is mentioned very clearly in the license.

THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.

This also is both a good and a bad thing.

It’s a good thing because it releases the maintainer from any duties that have to do with the software product. Other people can contribute and create their own features, which the maintainer is not liable for. I certainly would not want to be responsible for a bug in one of my open-source projects and have to deal with large claims from companies because they lost revenue.

On the other hand, because the maintainer has no responsibilities, you are the one responsible from the moment you download the software.

The maintainer of an open-source project doesn’t even have to reply to issues, give you support, or have any other interaction with the software’s users. Everything a maintainer does beyond releasing the software is a big bonus and shows the character and passion of the maintainer.

What happened in the discussion?

Apparently the new maintainer changed the code of the project and included a bitcoin miner in it. So everyone who actually used that software and updated it would get a bitcoin miner installed, which would cause a lot of problems.

Immediately you would think that the original maintainer would be the person responsible, because he didn’t vet the person who wanted to take over the reins.

It is very scary that open-source projects are vulnerable to these kinds of practices, because at some point we trust the software and put it into our environments. That poses a threat to our production environments.

In reality, maintainers of an open-source project are in no way responsible for these kinds of issues. As soon as a piece of open-source software is released, the maintainer has done their work.

How can we avoid this?

I don’t think we can actually avoid these kinds of problems. If for some reason such a thing happens, most of the time people will find out quickly.

News like this travels fast, especially with social media platforms like Facebook and Twitter; projects that are compromised will be flagged very soon.

I think we need to be diligent when we start using open-source software. Make sure you use software that is widely used or is developed by someone with a good reputation.

It’s pretty easy to look into a GitHub repository, get basic information like who the maintainer is, and do some research on the person. If you don’t trust it, don’t install it. Get more information, go ask on forums and ask questions before you start to rely on these projects.
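
To give an idea of that first check, here is a minimal PowerShell sketch that pulls some basic facts from the public GitHub REST API. The owner and repository names are only examples; swap in the project you are vetting.

# Placeholders: replace with the owner/repository of the project you are vetting
$owner = 'sqlcollaborative'
$repo  = 'dbatools'

# Basic facts about the repository and its owner from the public GitHub REST API
$repoInfo  = Invoke-RestMethod -Uri "https://api.github.com/repos/$owner/$repo"
$ownerInfo = Invoke-RestMethod -Uri "https://api.github.com/users/$owner"

[PSCustomObject]@{
    Maintainer   = $repoInfo.owner.login
    AccountSince = $ownerInfo.created_at
    Stars        = $repoInfo.stargazers_count
    Forks        = $repoInfo.forks_count
    OpenIssues   = $repoInfo.open_issues_count
    LastPush     = $repoInfo.pushed_at
}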

My two cents on the responsibilities

I’m a contributor to a couple of open-source projects and I rely heavily on them in my day-to-day work.

I’ve contributed a bunch of commands to the dbatools project. I feel responsible for these commands, but I’m completely fine if someone else finds a bug and fixes it. When someone finds a bug in a command and notifies me, I will try to fix it. I feel responsible because I created them, but it’s in no way only my responsibility.

Maintainers and contributors have no obligation to help after a piece of software is released. Everything that’s done after the release is a big bonus.

I do think that the way a project is run reflects the type of people who are involved. Regularly I get questions from people all over the world about a command I created.

I will always reply to these questions, because I think it’s important to give support in these projects and it helps keep the projects alive.

Conclusion

Open-source software is great and most of the time you get a reliable version for free or a small amount of money. You, as the user, have to make sure that the project can be trusted.

Closed-source software is great too, but it can be costly. The cost of the software gives a sense of security, and a lot of companies prefer to pay for software instead of using open-source.

Both have their pros and cons, and you must decide what’s best for your use case. I use both open-source and closed-source software, each where it fits.

 

T-SQL Tuesday #110 – Automate All the Things


Reading Time: 4 minutes

This month’s T-SQL Tuesday is about automating all the things.

I love automating processes, especially when it comes to managing SQL Server, and I think that most processes should be automated.

So technically there are two tasks for this month:

  • What is your go-to technology for automation?
  • What do you want to automate or what automation are you proud of completing?

For me, a process can be automated when:

  • It’s repetitive
  • It takes a long time to execute
  • It needs human interaction

I will go a long way to automate a process. I’d rather spend a week developing a process to automate it than execute it twice and spend lots of time on it.

What is your go-to technology for automation?

In the last couple of years I’ve automated a whole bunch of processes using different technologies to accomplish that. My go-to technologies for automating are PowerShell, DevOps, tSQLt and dbatools.

You might say, “dbatools is PowerShell, right?”, and you’re correct, but this module has grown so much that it has to be mentioned on its own.

PowerShell

I’ve been working with PowerShell since version 1. I fell in love with it because it filled a gap: we needed something other than a GUI to manage our infrastructure.

PowerShell is my hammer to automate just about anything. It crosses multiple domains within the Windows universe, from managing your domains and users to servers and services.

The PowerShell team has done a great job to make this platform as versatile as possible for anyone to use.
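
To give an idea of what that looks like in practice, here is a small sketch of the kind of cross-server check I mean. The server names are made up.

# A made-up list of servers; replace with your own
$servers = 'sql01', 'sql02', 'sql03'

# Check the state of the SQL Server services on every machine in one go
Invoke-Command -ComputerName $servers -ScriptBlock {
    Get-Service -Name 'MSSQL*', 'SQLAgent*' | Select-Object Name, Status
} | Select-Object PSComputerName, Name, Status | Sort-Object PSComputerName, Name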

DevOps

This part is still fairly new to me, and I find that it combines different technologies to accomplish the job.

I use Jenkins and Octopus Deploy within my CI/CD pipeline to automate releases, and within that pipeline I use several technologies to get the work done.

Technologies like Groovy for the pipeline script, PowerShell to execute certain tasks, and dbatools to execute others.

I like Jenkins because it lets me define exactly what I want to do, and it doesn’t have a big learning curve.

Octopus Deploy picks up where Jenkins finishes. Creating releases and deploying them to different servers makes the CI/CD process complete.
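
As an illustration of the PowerShell part, here is a sketch of a build-step script that a Jenkins stage could call. It assumes the repository has a tests folder with Pester tests and a recent version of Pester installed.

Import-Module Pester

# Run the Pester tests and keep the result object
$result = Invoke-Pester -Path '.\tests' -PassThru

if ($result.FailedCount -gt 0) {
    # A non-zero exit code is what tells Jenkins to mark the build as failed
    Write-Error "$($result.FailedCount) test(s) failed"
    exit 1
}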

tSQLt

This technology has saved my life. I never did a lot of development, but I know how to develop a database. In the past I made changes, pushed them to a development server, and crossed my fingers hoping I didn’t break anything.

With tSQLt I have the ability to create unit tests for SQL Server databases to test every little aspect of my database.

If a small change in the schema breaks a procedure I will know before I release it.
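
To show how that fits into an automated process, here is a sketch of running the tSQLt tests from PowerShell with dbatools. The instance and database names are made up.

$instance = 'sql01'
$database = 'MyDatabase_Dev'

try {
    # tSQLt.RunAll raises an error at the end of the run when one or more tests fail
    Invoke-DbaQuery -SqlInstance $instance -Database $database -Query 'EXEC tSQLt.RunAll'
}
finally {
    # The outcome of the last run is stored in the tSQLt.TestResult table
    Invoke-DbaQuery -SqlInstance $instance -Database $database -Query 'SELECT Class, TestCase, Result, Msg FROM tSQLt.TestResult' | Format-Table -AutoSize
}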

I will be presenting on this subject in the near future and helping people along the way, because I think it needs to be used a lot more.

dbatools

Last but not least, dbatools. This project changed my life in several ways.

First, it taught me how to develop in a large open-source project: how to follow standards and how to handle commits. I did not know Git that well before I started contributing, and now I’m one of the major contributors.

Second, it gave me the ability to help others. I had several scripts lying around that could easily be transformed into proper functions/commands that other people could use. If I ran into a problem, there was bound to be someone else with the same one, and I could save that person a lot of headache.

Third, it made me automate a lot of processes for myself. Think of the backup testing command “Test-DbaLastBackup“. This command tests your backups by restoring them, running a DBCC check and returning the results back to you.
You will know whether your backup can be restored and whether the data within it is corrupted.
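
Here is a minimal example; the instance names are placeholders, and the selected properties are the ones I remember the command returning.

# The backups of sql01 are restored and checked on sql02
Test-DbaLastBackup -SqlInstance 'sql01' -Destination 'sql02' |
    Select-Object SourceServer, Database, RestoreResult, DbccResult, BackupDates |
    Format-Table -AutoSize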

If you have not worked with dbatools yet, make sure you start with it today. It will change your life as a database administrator or developer.

Fourth, and this one comes with contributing to the project, I met a lot of incredible people from the community. I love how the community shares its knowledge and helps other people out.

What do I want to automate

I’ve been working on automating our releases for database development.

This process has taken a considerable amount of time because I had no knowledge or experience in that area.

I have almost finished that process and I’m glad to say that this year I’ll have a CI/CD pipeline with Visual Studio, Git, Jenkins and Octopus Deploy.

My next project will be to implement data masking making it possible to ship databases to other servers and people without exposing production data.

What automation are you proud of completing

There are several projects that I’m proud of, and the pattern is getting obvious: automate repetitive, painstaking processes.

Change your mindset to embrace automation, and you will never want to go back.

I have less and less time to get things done.
My agenda is filled with meetings and project deadlines, and more and more things end up on my plate.

Start automating your work today. You have to put in some energy to get your processes automated, but you’ll never look back.

 

TSQL-Tuesday #108: A Journey Into CI/CD


Reading Time: 4 minutes

This month’s TSQL Tuesday is hosted by Malathi Mahadevan (w | t) and she asks us to write about one thing you want to learn that is not SQL Server.

As a SQL Server database administrator it’s easy to think we only deal with that technology. But the role of the database administrator has been changing for a while now, and we need to know more about other technologies besides SQL Server.

In my daily job I’m not just the database administrator, but I’m also a database developer. As a developer I had to make the shift from developing databases in SSMS to Visual Studio, which I still have to get used to but I’m getting there.

The reason I kind of had to do that was because it was time. Our IT team was going to switch from SVN to Git. We needed more collaboration, standards and processes to deploy new releases.

For me that was a BIG learning curve. As a DBA I never had to deal with source control for our databases. At previous jobs I never did a lot of development, mostly maintenance, and the responsibility for the databases was mostly with the third parties and development teams.

So now our projects are developed using the right tools and the source code is saved in a way that others can collaborate, but the deployment process is still done manually.

Why use CI/CD?

CI/CD, or Continuous Integration/Continuous Delivery, is the cornerstone of DevOps. We’ve been hearing about DevOps for a while now, and it never stuck with me until now.

If you know me, I’m a real advocate of automation. I like to automate just about everything that’s repetitive to make sure things run as smoothly as they can without the obvious PEBCAK (Problem Exists Between Chair And Keyboard) situations.

Handling all the releases on my own makes the development process prone to mistakes and delays. I want a process where I can wear a single hat, either the developer’s or the DBA’s. I do not want to be responsible for both the development and pushing the releases through the pipeline.

CI/CD can help me with that, but just saying a fancy buzzword is not going to get me there.

How to learn CI/CD?

First of all, this was another big learning curve for me. Going from database development in SSMS to a CI/CD pipeline felt like climbing Mount Everest. But as cliché as this might sound, just take one step and then another, and in the end you’ll get there.

The first thing I had to do was learn what CI/CD actually was. What does it entail? How does it integrate with my tools? What tools do I need to implement to get it to work?

I found that the best way to learn CI/CD is to just start and learn from my mistakes.

The following steps made me become familiar with CI/CD:

  1. Read a lot
  2. Migrate database development to Visual Studio
  3. Build the architecture

Step 1: Read a lot

The first thing I did was spend a lot of time reading articles on the internet about what CI/CD actually is. If you google for that you get enough content to learn what it actually is.

I always want to know the ins and outs of a certain technology, and I want to know enough to get me started. I find that diving into new technologies lets me make mistakes, which makes me learn at a faster pace and makes things easier in the end.

Step 2: Migrate database development to Visual Studio

I had already started developing tSQLt unit tests for all my objects, and I separated the data model and the data from the tests into separate projects within the solution.

This article helped me set up the solution to make sure everything was nicely separated and organized.

Step 3: Build the architecture

After investigating what I needed I decided to install Jenkins for the build server. Jenkins is free and it’s easy to install.

I set up a virtual machine to make sure I was able to create snapshots at every step and revert back if something went wrong. And I can assure you that I made a lot of mistakes.

This article helped me understand how to set up projects in Jenkins for SSDT.
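
In essence, the Jenkins project ends up running a build step along these lines. This is only a rough sketch: all paths and names are assumptions, and it expects msbuild and sqlpackage to be available on the build server.

$solution = 'C:\projects\MyDatabase\MyDatabase.sln'

# Building the SSDT solution produces a dacpac in the project's output folder
& msbuild $solution "/t:Build" "/p:Configuration=Release"

# The dacpac can then be published to a build/test instance
& sqlpackage "/Action:Publish" "/SourceFile:C:\projects\MyDatabase\MyDatabase\bin\Release\MyDatabase.dacpac" "/TargetServerName:sql-build01" "/TargetDatabaseName:MyDatabase_CI"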

I actually made some other changes to the project in Jenkins to be able to push the results to a server in conjunction with dbatools and a remote Git server. That’s going to be another article that will be released soon.

What’s the next step?

The next step is to implement a service that will be responsible for the delivery of the artifacts to different destinations like a test, acceptance and finally production server.

I’m still trying out a couple of solutions but I think I’ll probably go with Octopus Deploy.

After that it’s time to put all the knowledge into production: getting the projects loaded and the process implemented within the company.
The latter is going to be the most difficult part because it’s a huge change, but I’m confident that in the end it will make things go more smoothly.

DevOps is not going to go away and CI/CD will make your life easier in the long run.

To Fail Is To Succeed


Reading Time: 2 minutes

There must be a hundred articles about this subject, but I always like to talk about things that happen to me or that I have an opinion about.

I was reading a blog post by Ben Kubicek about allowing failure and I agree with a lot of things.

I’m a parent of two adorable kids. The oldest is 4 years old and the youngest is almost 2. The youngest one is walking around, but the oldest will not stop running, climbing on stuff and seeing how far she can go.

I’m not one of those parents who constantly warns their kid that something can happen. If she wants to climb to the highest point of the jungle gym, go ahead. I will of course make sure she doesn’t take a dangerous fall if she slips, but I do tend to let my kids fail.
If they don’t fail, they will not learn. It’s hard though, because you want to protect them as much as you can, but protecting them all the time does not help them either.

If Thomas Edison had believed in failure… we would still be living in darkness. If Henry Ford had given up, we would still be riding on horseback…if Alexander Graham Bell had given in to the clutches of failure, we would be spending less time staring at those small plastic things we call phones that rule our lives (which might not be a bad thing!).

Did I ever fail

Oh yes I did!

In my career I had my share of fails and from each and every one of them I learned something. Sometimes it was a hard lesson and sometimes it was something small, but every time I tried to learn something from it.

I did things like updating rows in a production database without a WHERE clause. I remember executing a change script in a hurry because the boss was watching over my shoulder the whole time, and I didn’t pay attention to the code.
That was in my early years; nowadays I don’t get flustered when there’s a lot of pressure.

That’s all well and good, but you just can’t afford to fail when you’re dealing with production issues in environments with high-volume transactions or when lives are at stake.

If you’re scared to fail in those environments, then there is another problem. You probably don’t have a good disaster recovery plan (or haven’t tested it), you’re not confident about your backups, or you’re just not confident enough in yourself.

There is going to be a time when something goes wrong and there is failure. Be prepared for it and make sure you know what to do when disaster strikes.

There are a lot of solutions that will test your backups, articles on how to set up a disaster recovery plan, and processes to prevent unwanted changes (like Continuous Integration/Continuous Delivery).

So don’t be scared to fail. Learn from your mistakes and prepare yourself so it doesn’t happen again.

Working hard pays off


Reading Time: 2 minutes

On Monday, October 1st, 2018 I was awarded the Cloud and Datacenter Management MVP award. I did not know about it until a few days later, because I was busy renovating my office. During the renovation I did not regularly check my e-mail and the notification slipped through the cracks.

Two days later, at 6 AM, I checked my e-mail and jumped up. I saw an e-mail from the MVP Global Administrator that started with “Congratulations…”. I got the MVP award.

How did I get here

I remember when I first started contributing to the community. It was after SQL Cruise, now Tech Outbound, that I got introduced to a couple of famous people like Grant Fritchey (t), Argenis Fernandez (t) and Tim Ford (t) (the organizer of the event).

During the cruise I met Catherine Wilhelmsen (t), who asked me why I was not presenting about all the things I knew. I did not think I was that knowledgeable compared to the people on the cruise, but she thought otherwise. She, and the others on the cruise, suggested that I start presenting, and I got into contact with Aaron Nelson, who gave me an opportunity to present at the virtual chapter of PASS.

I got into contact with Chrissy LeMaire (t) who is the godmother of dbatools. My first task was a new command and that got me started developing for dbatools and contributing more to the community.

I started doing more presentations and spent countless hours developing PowerShell commands to help me and others simplify DBA work. It was a lot of work, but the reward I got from the community was worth it.

Over the years I’ve met some wonderful people and that certainly was a reason to continue.

I want to thank my boss, Joe Sheffield (t), from Waypoint Analytical LLC for supporting me to contribute to the community. This would not have been possible without him.

I’m not going to stop, and this only lets me be more active in the community.

Thank you to everyone who supported me, and keep working hard because it pays off in the end.

TSQL-Tuesday #104: Code You Would Hate To Live Without

Standard

Reading Time: 3 minutes

This month’s TSQL Tuesday is hosted by Bert Wagner (w | t) and he asks us to write about code you’ve written that you would hate to live without.

Over the years I’ve used other people’s code regularly. It could have come from GitHub, Stack Overflow, blogs, etc.

I have always appreciated the way the SQL Server community (and other communities) share their code to help others.

I have worked alongside lots of people who were not very keen on sharing. People actually told me that their code was theirs and only theirs and that nobody is to see it.

Although I don’t agree, I do understand people not wanting to share their code. They may have worked hard on it, and it could give them an advantage over other people who don’t have that “special” piece of code.

There are a couple of caveats to that:

  1. Your code could have bugs that you never noticed
  2. Your code could be improved by other people with different views

I am completely the opposite. If my code can help one person make their life easier, I have already won. The last few years have proved that, because I have been contributing to dbatools. By contributing I have learned a lot about open-source projects and a lot about SQL Server. I met new people from all over the world who in turn shared their code to make my life easier.

Code that I couldn’t live without

Back to the task at hand: telling you about code I’ve written that I cannot live without.

Anything I create that is usable for other DBAs tends to end up in the dbatools module. The reason I do that is because it’s an amazing platform and it makes it easy for me to reach a large audience.

Testing Backups

One command that I cannot live without is the dbatools command called Test-DbaLastBackup. Testing your backups is one of the most important things you can do as a DBA. I did not write that command, but I did use it in one of my own commands.
Mine is basically a wrapper around it that sends me an e-mail with all the tested backups and the results in a CSV file. The code is freely available on my blog.
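
A simplified sketch of what such a wrapper can look like (this is not the actual code from my blog; the instance names and mail settings are made up):

$results = Test-DbaLastBackup -SqlInstance 'sql01' -Destination 'sql02'

# Dump the results to a csv file and mail it
$csvPath = Join-Path -Path $env:TEMP -ChildPath 'backuptest-results.csv'
$results | Export-Csv -Path $csvPath -NoTypeInformation

Send-MailMessage -SmtpServer 'smtp.example.com' -From 'dba@example.com' -To 'dba@example.com' `
    -Subject 'Backup test results for sql01' `
    -Body "Tested $(@($results).Count) backup(s). See the attached csv for the details." `
    -Attachments $csvPath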

Log Shipping

Some other things I’ve written are the log shipping commands in dbatools. At my current workplace we have log shipping enabled. Everybody who has set up log shipping knows it’s a PITA if you have lots of databases.

I recently presented a session about the log shipping commands on the PASS DBA Virtual Chapter. The session was recorded and if you’re interested you can watch it through this link.

One log shipping command in particular that I use regularly is the Test-DbaLogShippingStatus. This command makes it easy for me to check up on my log shipping servers and see the status of each database.
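
Using it is as simple as pointing it at an instance; the instance name below is a placeholder.

# Returns the log shipping status of the log shipped databases on the instance
Test-DbaLogShippingStatus -SqlInstance 'sql02' | Format-Table -AutoSize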

SQL Server Agent

I don’t like to click around and if I can do something in a console I choose to do that. In most cases it is faster than the alternative.

The same goes for the SQL Server Agent. You have your jobs, schedules, job steps etc. For all these objects I have created commands in dbatools. The commands enable you to create, edit or remove any of those objects with ease.
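
As a small sketch of what that looks like, this creates a job with a single T-SQL step. The instance, job and step names are made up, and dbo.IndexOptimize is only an example procedure.

New-DbaAgentJob -SqlInstance 'sql01' -Job 'Nightly - Index maintenance' -Description 'Created from PowerShell'

New-DbaAgentJobStep -SqlInstance 'sql01' -Job 'Nightly - Index maintenance' `
    -StepName 'Run maintenance' -Subsystem TransactSql `
    -Command 'EXEC dbo.IndexOptimize @Databases = N''USER_DATABASES'''

# A schedule can be attached to the job afterwards with New-DbaAgentSchedule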

So these are the scripts I cannot live without.

Maybe these scripts can help you in your journeys through SQL Server.

 

 

 

 

 

Reflect on 2017 And My Learning Goals for 2018


Reading Time: 4 minutes

Unfortunately I missed the last T-SQL Tuesday of 2017, but I’ll share my learning goals for 2018. I also want to reflect on last year, because I think it’s good to see if I was able to improve.

Reflect on last year

At the end of 2016 I set myself the following goals:

  • Do more presentations
  • Get better in PowerShell development
  • Play a bigger part in the SQL Server community

I got really excited in 2016 by getting drawn into the SQL Server community, presenting and helping people out. That gave me a lot of energy and I wanted that to continue in 2017. And oh boy, did it continue!

I got really involved with the dbatools project. I developed a bunch of functions to support the SQL Server Agent, log shipping and a bunch of other things. With that I killed two birds with one stone: playing a bigger part in the SQL Server community and getting better at PowerShell development.

I also wanted to do more presentations. I submitted sessions to lots of events and got to present a couple of them. Not the smallest ones either, because I got a session at SQL Saturday Oregon and a lightning talk at the PASS Summit.

One thing I really wanted to get better at was unit testing. As a DBA with development experience I had never come into contact with this concept, but I got excited about it thanks to Rob Sewell (b | t), who showed me some interesting uses of PowerShell unit testing with Pester.
The unit testing concept was a real eye-opener and a big learning curve. At the end of 2017 I was able to create sophisticated unit tests for SQL Server (tSQLt) and PowerShell (Pester), which helped me a lot with my development.
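
To give a flavor of such a Pester test, here is a tiny made-up example. Get-ServerList is a hypothetical function; the tests only illustrate the style.

Describe 'Get-ServerList' {
    It 'returns only servers that are marked active' {
        $servers = Get-ServerList -Active
        $servers.IsActive | Should -Not -Contain $false
    }

    It 'throws when the configuration file is missing' {
        { Get-ServerList -Path 'C:\does\not\exist.json' } | Should -Throw
    }
}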

Goals for next year

So the T-SQL Tuesday post had the following questions:

  • What do you want to learn?
  • How and when do you want to learn?
  • How do you plan to improve on what you learned?

What do you want to learn?

I always want to learn a lot of things, but if I had to make a shortlist it would contain the following:

  • Query Store
  • SQL Server 2017
  • Continuous Integration
  • More consistent blogging

I’m a database administrator with knowledge of database development and other processes. I’ve done small projects with business intelligence, but that subject does not really suit me. I understand the technology and the thought process, and that’s good, but I could not switch to it full time.
I get excited when I see an inefficient process or inefficient code and when I’m able to improve that process through automation or by redeveloping the procedures.
That’s where my strength lies and that’s where I want to continue.

Query Store

I went to PASS Summit 2017 and there were a lot of talks about Query Store. I have not yet had the chance to take a deep dive into the material, but I will very soon. I think this is one of the things that will impact my work the most.

SQL Server 2017

That’s a big topic and something I’ve been looking forward to. I did some testing with SQL Server 2017 when it was first released, but this year we’re going to put it into production.

Continuous Integration

You already read about me being active with unit testing with tSQLt and Pester. Now I want to take the next step and implement CI for the database development.

It costs me a lot of time to create new release scripts and to make sure everything works as expected. I just want to automate that and make my life and that of the other developers easier.

More consistent blogging

Last year was a really busy year for me, both at work and personally. Because of all the pressure I was not able to blog as much as I wanted, and the only thing I managed to do was create some quick drafts. I have about 50 drafts lying around that I could finish, and I probably should.

My goal for this year is to publish a new blog post at least every two weeks.

How and when do you want to learn?

As with many other people in the tech industry, we have full-time jobs and finding time to study can be tough. We never have a 40-hour work week, and sometimes we put in 10 or more hours a day.

I have the privilege that my employer enables me to do self-study. That means that I can spend about 10% of my time learning new things or improve my current skills.

But that’s mostly not enough. When I want to take a deep dive I go and get into the details of a subject. I’m going to ask questions that in most cases would not make sense or I’ll try to break things to see what would happen in a production environment.

That answers the question of when I learn things, but not how. How I learn largely depends on the topic at hand. In most cases I search for already published articles and google for any information that I can find.

In most cases that will give me enough information to get started. Another place that has a lot of content is the Microsoft Academy. If all of that still does not satisfy me, which does not happen very often, I will turn to the SQL Server community and ask questions. One place with lots of IT professionals present is the SQL Server Community Slack channel.

How do you plan to improve on what you learned?

The subjects I explained earlier are important to me because they can make my life easier. They will improve my skills as a DBA and will make sure fewer mistakes are made.

I already mentioned that I want to spend more time on my blog. I think blogging is an important part of my career, and it has enabled me to connect with the rest of the world.
If you blog about something, you need to know all the ins and outs of the subject. You cannot just write about something and not be confident about the information you’re presenting. Making sure I have all the information by diving into the subject will make me better at that.

Someone once told me: if you can explain a particular subject in children’s language, you know enough about it to present it to others. That’s true for both blogging and presenting a session.

Talking about presenting: I’m also going to spend more time developing presentations and submitting them to events. I love to present about the stuff I do, and if I can help just one person, that’s already a win for me.
All the new stuff I’ll be learning could also end up in a session, so expect me to submit lots of sessions this year.

This is going to be a busy year but I like that. Lots of things to get better at.