The Ozar Effect

Back in February, one of my blog posts was highlighted on Brent Ozar Unlimited’s Weekly Links. Wow, what a bump in traffic! I usually receive a handful of views a week, but on February 26th the count reached 1,212. The cool part was that it was actually the #2 link in Brent Ozar Unlimited’s newsletter for the day. If you don’t currently subscribe, here is a link to start.

This particular blog post was about my daily routines under the Getting Things Done (GTD) time management methodology. Based on the hit count and the activity on Brent’s newsletter, I realized that time management and personal productivity are topics I need to spend more time writing about. Like other DBAs, I wear multiple hats, and keeping all my projects moving forward along with daily DBA operations can be difficult. I’m guessing a few other DBAs have the same issues, so I’m hoping I can help.

If you’re looking for other #sqlfamily members who have posted about time management and personal productivity, check out K. Brian Kelley (b|t) and Marlon Ribunal (b|t). They each have their own technical blogs, but head over to their other sites focusing on personal productivity. Brian is also known as the Goal Keeping DBA, and Marlon has posted about improving the life of knowledge workers. I suggest following both of them and reading back through the archives.

Doug Purnell
@SQLNikon
sqlnikon@gmail.com
https://sqlnikon.com

SQL Life 2026

This month’s #tsql2sday is being hosted by Adam Machanic (t|b):

Your mission for month #100 is to put on your speculative shades and forecast the bright future ahead. Tell us what the world will be like when T-SQL Tuesday #200 comes to pass. If you’d like to describe your vision and back it up with points, trends, data, or whatever, that’s fine. If you’d like to have a bit more fun and maybe go slightly more science fiction than science, come up with a potential topic for the month and create a technical post that describes an exciting new feature or technology to your audience of June 2026. (Did I do the math properly there?)

Here is my vision…

As I step into my home office and close the door, the walls and ceiling automatically bring up a live landscape of Grand Teton National Park. My priority tasks are displayed, and I see today is the day we migrate our entire production environment to the latest update of SQL Server to take advantage of the new AI engine built on top of Azure Machine Learning.

The AI engine is going to be like a DBA in our back pocket. We will no longer have to deal with monitoring and performance tuning. Ever since the initial automatic database tuning release back in 2017, Microsoft has been freeing up DBA resources with each update. Notifications for transactions, indexing, backups, DR/HA, and security tuning are displayed in the integrated Power BI console on one of the walls of my office. I remember the days when I had multiple screens on my desk; now I just move around the room to the various virtual workspaces depending on the project I’m working on.

The migration has already been run by the older AI engine countless times on our dev and test instances. With dbatools having been integrated into the SQL engine years ago and expanded to over 3,000 commands, PowerShell is now my primary interface to SQL Server. The developers are still coding in T-SQL when the ORM tool doesn’t behave properly, but for me, PowerShell is where I’m comfortable.

As I start the production migration, open transactions are frozen, databases are moved between memory spaces, and then the transactions are brought back online. Users never notice even the tiniest hiccup within their applications. The new SQL AI is already online and bringing up the status of the new instances on my wall; all is well.

The update cycle of SQL Server is still bringing value to us. The AI engine takes care of the weekly cumulative updates and leaves the monthly feature releases for me to approve. Automatic DR/HA with Azure makes the older AGs seem like the stone age. Our database, application, and web servers still reside under the control of our data centers, but DR/HA is handled for us.

Now that the migration is done, it’s time for a break. Catch you on the next #tsql2sday!

Doug Purnell

I’m Speaking at SQL Saturday Raleigh

I’m honored to have been selected to speak at SQL Saturday Raleigh #721 on April 14th. I’m speaking about choosing the right high availability (HA) and/or disaster recovery (DR) solution for SQL Server. I’ve given this talk just once before, at the Companero Conference back in the fall of 2017. With the release of SQL Server 2017, there are a few more HA/DR options available to us. There are plenty of other great sessions being offered that day, so it is not too late to sign up!

For those of you not familiar with SQL Saturday events, they are organized by SQL Server professionals with help from sponsors and volunteers within the SQL Server community. The full day of training is free; lunch is generally 10-15 dollars, or you can choose to bring your own. There are multiple events every Saturday, so find one near you!

I also signed up for Kalen Delaney’s (b|t) pre-con on SQL Server internals. I’ve always enjoyed digging into the internals of SQL Server; a better understanding of what’s under the covers helps you build and support better solutions.

You’ll see me roaming the event with my camera, so be sure to say hello. I hope to see you there!

Doug Purnell
@SQLNikon
sqlnikon@gmail.com
https://sqlnikon.com

Daily Routines

I’m a big fan of Getting Things Done (GTD) by David Allen (t|b). I follow the GTD principles both at work and in my personal life. The key to making GTD successful is routines. In this post, I’m going to review the daily routine that keeps my GTD system running like a well-oiled machine.

For the daily routine, the morning fits my personality best. I love getting up early and setting a goal for the day. You have to find the time that works best for you; others like the evening, to reflect on the day and set goals for the next. Regardless of the time of day, the daily routine needs to hit a few key areas: inbox zero, your calendar, and tasks for the day.

Inbox Zero
I’ve been a fan of this approach for over a year. I used to keep ‘to-do’ items in my email inbox that were usually just read emails requiring some additional action. It got to be overwhelming, and I found myself with hundreds of emails that required my time. By using Inbox Zero, I treat my email inbox as just another entry point into my trusted system (OmniFocus). I have multiple inboxes in my life: mail at home, drive-by tasks at work, meetings, and blog posts or websites that I want to read. Once you realize that there are virtual inboxes everywhere in your life, you quickly understand that you need one system to organize them all.

Calendar
Each day, your time is in demand from multiple sources. 8 am to 5 pm is usually work commitments, and after 5 pm is family/personal time. By reviewing your calendar ahead of the day, you can better estimate what you will be able to accomplish. If you have a big block of time available, that might be the perfect place to work on your big project or start on one of your goals for the month/quarter/year. Smaller blocks of time might be best for quick tasks or phone calls that need to be returned.

Tasks
Now that you have an understanding of what time you have available, you can decide which tasks you can work on for the day. Most people have hundreds of little tasks that need to be completed, but which ones are the key tasks that keep you moving in the direction of your goals? Those are the tasks that you want to prioritize first. By reviewing your tasks each day, you stay current on your most important tasks.

If you would like to know more about GTD, start with David’s book. He released a revised edition a few years ago, and it will give you the foundation of GTD and the advantages it brings to your life. The key advantage of GTD for me is clearing my head of those ‘things’ I can’t forget to do, the ones that would otherwise keep me up at night or send me into firefighting mode when they come due. GTD gives me the freedom to be creative with my brain instead of using it as the storage device it wasn’t meant to be.

Doug Purnell
@SQLNikon
sqlnikon@gmail.com
https://sqlnikon.com

Taking a photo walk to a BBQ

This month’s #tsql2sday is being hosted by Aaron Bertrand (t|b). Aaron gives us a choice this month to talk about our passions outside of the SQL Server community or our T-SQL bad habits. I don’t have enough time to bore you with all my T-SQL bad habits so I’ll try to inspire you with my #SQLBBQ & #SQLPhoto passions.

When I’m not architecting SQL Server solutions at Elon University, I’m usually behind a camera or a BBQ cooker. Both of these passions allow me to exercise the creative part of my brain when I’m away from the office.

I got the photography bug right before my son was born. I started with a point-and-shoot Sony Cybershot but was frustrated with the delay in taking a picture. I loved the digital photos when they turned out, but getting consistent focus was difficult. My next purchase was a Nikon D70 DSLR. That camera was a life changer for me. Having the ability to shoot 3 frames per second was eye-opening; I no longer had to deal with missed shots or not knowing where the focus point was set. I used the D70 for another 6 years before upgrading to my current setup, the Nikon D300s.


I continued taking photos over the years, but with the onset of smartphones I don’t pull out my Nikon as much. The iPhone has been an asset to my photography simply because it is always with me. What I really enjoy is taking photos at PASS events. Back in 2012, before my first PASS Conference, I reached out to Pat Wright (b|t) about joining the photo walk during the conference. What ended up happening was a full-day “pre-con” of photography on the Sunday before the conference. We ended up at Snoqualmie Falls and the surrounding area, and I had a blast taking photos with Pat. We continue to have our photo “pre-cons” before PASS and other SQL Saturday events. It is a great time to talk shop about photography and SQL Server. I’m looking forward to PASS 2018 in Seattle and future photo walks.

Photo Albums:

2012 PASS Summit
https://www.flickr.com/photos/purnellphotography/albums/72157632003357227

2014 PASS Summit
https://sqlnikon.smugmug.com/2014/2014-11-02-SQL-Photo-Walk/
https://sqlnikon.smugmug.com/2014/2014-11-05-Denny-SQL-Karaoke/
https://sqlnikon.smugmug.com/2014/2014-11-06-SQL-PASS-Karaoke/

2016 PASS Summit

https://sqlnikon.smugmug.com/2016/2016-10-23-SQL-PASS-Photo-Walk/
https://sqlnikon.smugmug.com/2016/2016-10-25-SQL-Photo-Walk/
https://sqlnikon.smugmug.com/2016/2016-10-27-SQL-Summit-SQL-Dinner/

2017 SQL Sat Charlotte
https://sqlnikon.smugmug.com/2017/2017-10-14-SQL-Sat-Charlotte/


My other passion is cooking, specifically BBQ. I own a few grills, but my favorite is a cooker with a 3×6 foot cooking surface. It is fueled by propane but has a shelf for a piece of wood like hickory or oak. I bought the cooker from a friend about 15 years ago and have loved experimenting with different cuts of meat. My go-to meal is whole chickens, baby back ribs, and mac & cheese. The best part about the mac & cheese is that it is just store-bought Stouffer’s, but when it is left on the cooker for a few hours, it comes out with an amazing smoky flavor. During last year’s #sqlvacation, Kevin Kline made a stop at our user group in Greensboro, NC, and I brought in the cooker for the meeting. I love sharing the food and wish I had a bigger cooking surface so I could cook for an entire SQL Saturday event.

Hope to see you on a photo walk or cookout very soon!

Doug Purnell
@SQLNikon
sqlnikon@gmail.com
https://sqlnikon.com

MySQL RPC with Linked Server

Regular blogging is one of my goals this year. One of the areas I struggle with is what content to blog about. While I was listening to SQL Server Radio (Episode 84) with Guy Glantser (t) and Matan Yungman (t), they mentioned a new blog post series they are starting titled “Something Cool Learned Today”. These are meant to be short 1-2 paragraph blog posts about things you learned. This is the spark I needed to build additional content for my blog.

We have a door access system that uses MySQL. We interact with the data via a SQL Server linked server so we can join MySQL tables with other SQL Server tables. Because of a bug in the vendor application, I needed a way to update certain records every few minutes based on specific column values. I knew I wanted to build a Stored Procedure (SP), similar to what I would do in SQL Server, but I didn’t know the MySQL syntax for doing so. I also didn’t know the syntax for executing a MySQL SP from SQL Server.

Off to ‘The Google’. My first stop was the MySQL documentation for CREATE PROCEDURE. I didn’t know the exact MySQL version, but I wasn’t trying to do anything complicated, so I figured it would suffice. MySQL enforces the semicolon delimiter for all commands, something SQL Server says it will enforce in future releases (start using them now). The issue with creating a new SP in MySQL is that the statements inside the SP also end with semicolons, so the client can’t tell where the CREATE PROCEDURE statement itself ends. To get MySQL to recognize the new SP, you temporarily change the delimiter to something like //, create the SP, and then change the delimiter back to a semicolon. I wonder how this will pan out with SQL Server?
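As a rough sketch of the pattern (the procedure body here is hypothetical; my actual fix ran against the vendor’s schema, so the table and column names below are stand-ins):

```sql
-- Swap the delimiter so the semicolons inside the body don't end
-- the CREATE PROCEDURE statement early.
DELIMITER //

CREATE PROCEDURE BugFix()
BEGIN
    -- Hypothetical cleanup: reset rows the vendor app left in a bad state
    UPDATE door_events
    SET    processed = 1
    WHERE  processed = 2;
END //

-- Restore the standard delimiter when you're done
DELIMITER ;
```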

Now that my SP was built and execution rights were set, I needed to figure out how to call the MySQL SP through a SQL Server linked server. Most of the time, we interact with MySQL via OPENQUERY. This lets us write the statements in native MySQL so they execute faster. I found that you execute MySQL SPs with ‘call’ instead of ‘exec’ as in SQL Server, but when I tried the CALL syntax, the query threw an error.
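For reference, a typical read through the linked server looks something like this (assuming the linked server is named MySQL, as in the errors below; the table is a hypothetical stand-in for the vendor schema):

```sql
-- The query inside OPENQUERY runs natively on MySQL,
-- so only the result set crosses the wire.
SELECT e.event_id, e.door_id, e.event_time
FROM OPENQUERY([MySQL],
     'SELECT event_id, door_id, event_time
      FROM door_events
      WHERE processed = 2') AS e;
```

OPENQUERY expects a result-set-returning query, which is why wrapping a CALL in it fails.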

Msg 7357, Level 16, State 2, Line 13
Cannot process the object “CALL BugFix();”. The OLE DB provider “MSDASQL” for linked server “MySQL” indicates that either the object has no columns or the current user does not have permissions on that object.

Off to ‘The Google’. Searching for “executing MySQL stored procedures from SQL Server linked servers” landed me on a sqlservercentral.com thread with the exact issue (score!). Buried in the replies was the syntax I was looking for: EXEC(‘CALL BugFix();’) AT MySQL. I was not familiar with using EXEC this way. Looking through Books Online for EXECUTE, there it was: you can execute a pass-through command against a linked server. Feeling good, I headed over to SQL Server to run the command and, bam, error:
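Roughly, the pass-through call looks like this (again assuming the linked server is named MySQL):

```sql
-- The string is sent to MySQL verbatim, so MySQL's CALL syntax
-- works here where OPENQUERY could not.
EXEC ('CALL BugFix();') AT [MySQL];
```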

Msg 7411, Level 16, State 1, Line 7
Server ‘MySQL’ is not configured for RPC.

I remembered that linked servers have lots of options under the hood. I headed over to the linked server definition and found a few RPC settings, but which ones should I enable? A quick search took me to a blog post that answered the exact question. I ran the following statement:

EXEC master.dbo.sp_serveroption @server = N'MYSERVER', @optname = N'rpc out', @optvalue = N'true';

I re-ran my EXEC statement (no restart required) and it returned the results I was looking for: (2 rows affected).

Going into this problem, I was a little hesitant to get started; procrastination set in because of the fear of the unknown. I’ve recognized this pattern before, and the best way to get over it is to start with a small task, like a quick Google search. Most times, the momentum of small tasks helps me get over the fear and get excited about the task at hand.

The other part of this exercise was the various error messages along the way. You can’t just give up and assume it can’t be solved; someone has usually run into the same issue before. Thanks to search engines, product documentation, and community bloggers, solutions are out there waiting for you to find them.

Doug Purnell
@SQLNikon
sqlnikon@gmail.com
https://sqlnikon.com

Spectre/Meltdown Baselining

This month’s #tsql2sday is being hosted by Arun Sirpal (b|t). The topic this month is how we conquered a technical challenge. The area I’ve been spending the most time in recently is baselining for the Spectre/Meltdown patches that are coming. We have a monitoring tool from Idera, SQL Diagnostic Manager, on our key production boxes but I wanted something consistent across all boxes that we could use for tracking and baselining wait types.

The Spectre/Meltdown patches can have a significant performance impact on servers running a Windows Server OS with IO-intensive applications, and SQL Server falls right into this area. We needed a way to identify CPU changes across all our instances (dev/test/prod) while we applied the patches.

There are plenty of queries to help identify wait types but I was looking for something that could be put into place quickly as patches from VMware and Microsoft are being released this week. I found the perfect one, sp_BlitzFirst from Brent Ozar Unlimited.

Their solution gives you the wait stat breakdown and reporting to help you identify baseline changes. When you have limited time to put something in place, these types of solutions from community partners are a great place to start. Having baseline wait stats has been on my list for years. There was always another project that was pushed to the front of the list, but now Spectre/Meltdown is at the top of my list!

I used the following blog post to get me started. We have all of our instances registered and organized under a Central Management Server. This feature allows us to run a query across multiple instances at once. I pulled down the latest First Responder Kit, which includes the script to schedule and a Power BI report to analyze the results.
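The scheduled collection step looks roughly like this (the output database and table names here are my own placeholders; pick whatever fits your environment):

```sql
-- Sample wait stats and log them to a table so the Power BI
-- report can trend the results across the patching cycle.
EXEC dbo.sp_BlitzFirst
     @OutputDatabaseName = N'DBAtools',
     @OutputSchemaName   = N'dbo',
     @OutputTableName    = N'BlitzFirst';
```

Run on an Agent job schedule, this builds the baseline history to compare before and after each patch.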

The next phase was figuring out a process to baseline against. We don’t have any automated tools to regression test our apps (something else to work on), so I needed a way to stress our SQL Server instances and record the wait types throughout the patching process. Once again, the fine folks at Brent Ozar Unlimited had a blog post for this problem as well. Using SQLQueryStress, we can simulate a load on our SQL Server instances and record the wait types. After each patch, we can execute the same load and compare the results using Power BI.

When faced with a technical challenge like Spectre/Meltdown, having a process in place to baseline your SQL Server Instances is important to show how changes to your environment can have lasting effects on performance.


T-SQL Tuesday #97 – I am an Obliger

Goal Setting
This month’s T-SQL Tuesday (#tsql2sday) host is Malathi Mahadevan (b|t).
Goal setting has always been a difficult process for me. I always do a great job of researching and organizing my goals for the upcoming year but fall short in the execution phase. I tend to help others with their projects and tasks before my own; I have a hard time holding myself accountable. For this reason, I decided to take advantage of a professional development program at work that allows us to talk with a career/life coach for a few sessions.

The timing of Malathi’s T-SQL Tuesday post was perfect. I had my coaching session scheduled, and it’s review time at work, so all these things coincided to make this the perfect time to understand why I have a hard time setting goals before I started building another list for 2018.

During my coaching session with Barbara Demarest (http://www.barbarademarest.com), she mentioned a book by Gretchen Rubin (b|t), “The Four Tendencies”. Each of us falls into one of four categories when it comes to completing goals; I’m an Obliger. You can take a 5-minute quiz and see which category you fall into. Go ahead, I’ll wait…

Any surprises? Now that you have an understanding of your tendencies (Upholder, Obliger, Questioner, & Rebel), will you change your strategy for setting/completing your goals for 2018? I know I’m going to. I need external accountability to complete my goals. Because of my tendencies, I now know that I just can’t build a list of goals and hope to complete them. For 2018 I’m working on finding others to help me be accountable. It may be my supervisor, friend, or relative. By having external accountability I’ll be more successful in completing my goals. How about you?

Doug Purnell

T-SQL Tuesday #51 – Don’t bet against CHECKSUM when performing backups

This month’s T-SQL Tuesday is being hosted by Jason Brimhall (blog | Twitter). Jason’s theme is gambling with SQL Server. What areas have you gambled with? Are there tasks you should be doing that you are betting won’t be needed? One came to mind for me: CHECKSUM.

I’ve always been a fan of Maintenance Plans. They are an easy way to get basic maintenance tasks established on an instance of SQL Server. The major downside of the built-in maintenance plans is no support for the CHECKSUM media option of BACKUP DATABASE. If you perform backups without this option, you run the risk of the pages in the backup media not matching the pages stored in the database, because without CHECKSUM, SQL Server doesn’t verify that each page was written to the backup media correctly. The CHECKSUM option checks every page as it is written to your backup media and also records a checksum over the entire backup that the RESTORE statement can take advantage of. There is a performance impact to using CHECKSUM, but as everyone says, your backup is only as good as your restore. If you bet against your restore, your odds of a resume-updating event just increased.
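Here is what that looks like in practice (database name and path are placeholders for your environment):

```sql
-- Verify page checksums as each page is written to the backup,
-- and record a checksum over the whole backup in the media header.
BACKUP DATABASE AdventureWorks
TO DISK = N'D:\Backup\AdventureWorks.bak'
WITH CHECKSUM;

-- RESTORE can re-verify those checksums without restoring the data.
RESTORE VERIFYONLY
FROM DISK = N'D:\Backup\AdventureWorks.bak'
WITH CHECKSUM;
```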

Now that you understand the benefits of CHECKSUM for your backups, how do you integrate it into your backup process? It’s time to graduate from the built-in Maintenance Plans and move to Ola Hallengren’s SQL Server Maintenance Solution. It is by far the best set of maintenance scripts available in the SQL Server community. Do yourself a favor: pull the scripts down to a test instance and give them a try. I made the move a few months ago and have never looked back.
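With Ola’s solution, enabling CHECKSUM is a single parameter. A minimal sketch (the directory is a placeholder, and I’m only showing a handful of the available parameters):

```sql
-- Full backups of all user databases, with page/backup checksums
-- and a RESTORE VERIFYONLY of each backup after it completes.
EXECUTE dbo.DatabaseBackup
        @Databases  = 'USER_DATABASES',
        @Directory  = N'D:\Backup',
        @BackupType = 'FULL',
        @CheckSum   = 'Y',
        @Verify     = 'Y';
```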

Don’t bet against CHECKSUM when performing your backups!

T-SQL Tuesday #50 – Automating Tasks

This month’s T-SQL Tuesday topic is the reason I love being a DBA: automating repetitive tasks. Hemanth Damecharla (b|t) picked a timely topic. As you start the new year, are there certain tasks you need to automate that tend to sit on the corner of your desk yelling at you?

This topic also reminds me of a chart I came across in the past describing the time savings associated with automating repetitive tasks. Like most places, we are all asked to do more with fewer resources. This often leads to additional workloads without the time in the day to complete them. How many of these can you automate?

Automate some tasks!

As I approach each task, or even large projects, I like to start with a manual approach. This gives me time to understand the business problem I’m solving and plan my attack for automation. Some tasks don’t need to be automated; a simple T-SQL statement may solve the problem. As soon as I run across tasks that involve multiple data sources, SSIS is my go-to tool. SSIS allows me to build modular packages that can be reused and organized for automation.

Back in 2012, when PASS was in Seattle, WA, I attended a pre-con on SSIS Design Patterns hosted by Andy Leonard, Tim Mitchell, Matt Masson, Michelle Ufford, and Jessica Moss. It really opened my mind to what’s possible with SSIS. I recommend the book that sparked the pre-con if you’re interested in learning more.

As you start your automation effort, realize that the upfront time spent may seem counterproductive, but the time invested now will pay off down the road. Instead of manually running scripts (and having to be reminded to do so), your automated process will run in the background, making you look like a rock star!

Happy Automating


Doug Purnell