Virtual Book Club: SQL Server Execution Plans, Chapter 2

SQL Server Execution Plans

Everyone loves reading a good book, especially about SQL Server.  I’ve been on a training kick lately and the timing of this book club was perfect.  Denis Gobo (b|t) had experience starting a book club where he works and decided to bring the idea to the SQL Server community.  The goal of the club is to read a chapter each week and discuss the things you learned, liked and disliked.  You can read Denis’s original post here.  The first book chosen was written by Grant Fritchey (b|t) and is titled “SQL Server Execution Plans, Second Edition”.  You can get a free electronic copy or purchase a print copy from simple-talk.com.

Continuing from my review of Chapter 1, here are my thoughts on Chapter 2.  You could think of Chapter 1 as walking into the shallow end of the pool and Chapter 2 as swimming into the deep end, but in a good way.  I never felt like I was drowning in the material; all the topics were presented in a way that made them easy to comprehend.

Things I learned

Blocking versus non-blocking determines when an operator can begin processing data.  Blocking operators need to consume all of their input before they can begin outputting data (hash join is an example of a blocking operation).  Non-blocking operators, on the other hand, can begin producing output as soon as data starts arriving (merge join is an example of a non-blocking operation).  This subtle idea can help you better understand how data flows between your operators.
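As a sketch of how this plays out, you can force each physical join with a hint and compare the resulting plans side by side.  The table and column names here are made up for illustration:

```sql
-- Hash join (blocking): the build input must be fully consumed
-- before any rows flow out of the operator.
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders AS o
JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID
OPTION (HASH JOIN);

-- Merge join (non-blocking): rows stream out as soon as both
-- inputs start arriving, but both inputs must be sorted, so the
-- optimizer may add Sort operators to the plan.
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders AS o
JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID
OPTION (MERGE JOIN);
```

Comparing the two actual plans makes the Sort operators (and the blocking behavior of the hash build) easy to spot.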

Sorting is something I always used as a presentation technique, but in the world of query plans and table joins it can have a big impact on the optimizer’s plan generation.  For example, a merge join will only be used on sorted data, period.  But if you have two non-sorted inputs, a hash join may be a better option.  Because hash joins build temporary hash tables that can spill into tempdb, too many of them can cause tempdb performance issues.

Depending on the data sets, a hash join may be a good fit (think small data sets).  But hash joins should also be a warning sign.  Pay attention to missing indexes, a missing WHERE clause, or calculations in the WHERE clause that can make it non-sargable (unable to use an index seek).
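For example, wrapping a column in a function hides it from the index, while rewriting the predicate as a range over the bare column keeps it sargable.  The table and index are hypothetical:

```sql
-- Non-sargable: YEAR() must be evaluated for every row, forcing a
-- scan even if dbo.Orders has an index on OrderDate.
SELECT OrderID
FROM dbo.Orders
WHERE YEAR(OrderDate) = 2012;

-- Sargable rewrite: the bare column compared against a range lets
-- the optimizer seek on an OrderDate index instead.
SELECT OrderID
FROM dbo.Orders
WHERE OrderDate >= '20120101'
  AND OrderDate <  '20130101';
```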

When using SET STATISTICS IO ON to better understand logical and physical page reads, scan count was something I was aware of but didn’t fully understand.  There is a great blog post on MSDN that explains when you will see a scan count of 0, 1 or N.  Just think of it as the number of times a scan was started to look for data in a data set.
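A minimal session for watching those numbers looks like this (the table name is a placeholder; the counters appear on the Messages tab in SSMS):

```sql
SET STATISTICS IO ON;

-- Run the query you want to measure.
SELECT OrderID
FROM dbo.Orders
WHERE CustomerID = 42;

-- The Messages tab then shows a line similar to:
-- Table 'Orders'. Scan count 1, logical reads 3, physical reads 0, ...

SET STATISTICS IO OFF;
```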

When generating a query plan for an UPDATE statement, the optimizer first has to determine which rows need to be updated.  This is accomplished with a scan or seek operation.  Pay attention to the WHERE clause of your UPDATE statement, as it determines how efficient the read portion of the UPDATE will be.
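You can see this for yourself by comparing the plans for two updates that differ only in their WHERE clause (again, a hypothetical table):

```sql
-- If OrderID is the clustered key, the read side of this UPDATE
-- is a cheap Clustered Index Seek.
UPDATE dbo.Orders
SET    ShipDate = GETDATE()
WHERE  OrderID = 42;

-- A non-sargable predicate on the same table turns the read side
-- into a full scan before any row is touched.
UPDATE dbo.Orders
SET    ShipDate = GETDATE()
WHERE  UPPER(OrderStatus) = 'PENDING';
```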

Likes

Even though there was plenty more information presented in Chapter 2 than in Chapter 1, everything was clear and concise.  Following along with the scripts on my local instance of SQL Server made the query plans easier to follow, and I spent more time understanding how the optimizer behaves.

Dislikes

There were only a few things that I would have liked to see presented better.  Under the Nested Loops Join section, the tool tip defines it as “For each row in the top (outer) input, scan the bottom (inner) input, and output matching rows.”  But Grant explained it as “… scanning the outer data set (the bottom operator in a graphical execution plan) once for each row in the inner set”.  Grant’s explanation seemed to contradict the tool tip, so I had to spend extra time reading through this section to understand what was actually happening within the nested loops join.

Scan count was referenced under the Filtering Data section, but no time was given to explaining whether a low or high scan count is a sign of possible problems.  But, as I do with other blog posts, when I hit a term I don’t understand, I search for it.  I was able to find a blog post on MSDN that did a great job of explaining how it works, with samples showing what causes scan counts of 0, 1, and N.

On to chapter 3!


Virtual Book Club: SQL Server Execution Plans, Chapter 1

SQL Server Execution Plans

Everyone loves reading a good book, especially about SQL Server.  I’ve been on a training kick lately and the timing of this book club was perfect.  Denis Gobo (b|t) had experience starting a book club where he works and decided to bring the idea to the SQL Server community.  The goal of the club is to read a chapter each week and discuss the things you learned, liked and disliked.  You can read Denis’s original post here.  The first book chosen was written by Grant Fritchey (b|t) and is titled “SQL Server Execution Plans, Second Edition”.  You can get a free electronic copy or purchase a print copy from simple-talk.com.

This is actually my first book review.  Following Denis’s lead, I’m going to discuss the things I learned, what I liked about the chapter and areas I disliked.  As with speakers, feedback is important for authors.  Hopefully our discussion of the book will help Grant with his third edition.

Things I learned

Did you know that you can save an XML version of an execution plan and share it with others?  Well, neither did I.  This would be a great way to get feedback on better ways to build queries.  In fact, SQLPerformance.com offers the ability to upload your plan directly from SQL Sentry’s Plan Explorer to their Q&A site.  The SQL Server community has a wealth of information and there are always folks available to help you with your questions.
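One way to capture a shareable plan from a query window is shown below; SSMS can also do this by right-clicking a graphical plan and choosing Save Execution Plan As, which writes a .sqlplan file:

```sql
-- Return the actual execution plan as an XML column alongside the
-- query results; clicking the XML link opens the graphical plan,
-- which can then be saved as a .sqlplan file and shared.
SET STATISTICS XML ON;

SELECT TOP (10) name
FROM sys.objects;

SET STATISTICS XML OFF;
```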

I’m a big fan of Idera’s SQL Diagnostic Manager.  This tool makes it easy to see trouble areas within my SQL Server instances.  The only downside is that I’m not using the DMOs on a daily basis.  With DMOs you’re able to pull a specific plan from the plan cache.  This is useful when you’re developing queries and need to view a plan that may have been saved from a previous execution of your query.  To interact with some DMOs, you need to use the APPLY operator because those DMOs are table-valued functions.  I’ve known about APPLY but have not had the need to use it in my daily work.
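Because sys.dm_exec_sql_text and sys.dm_exec_query_plan are table-valued functions that take a handle per row, CROSS APPLY is the natural way to join them to the plan cache.  A sketch of the kind of query the book is describing:

```sql
-- Top cached plans by CPU time, with their query text and XML plan.
SELECT TOP (10)
       qs.execution_count,
       qs.total_worker_time,
       st.text        AS query_text,
       qp.query_plan  AS xml_plan   -- click to open graphically
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)    AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY qs.total_worker_time DESC;
```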

Likes

Grant did a great job introducing the query engine and how a plan is ultimately built and saved for later use.  Having a strong understanding of how the execution plan is built will help me write better queries.  Grant didn’t throw too much material at me during the first chapter.  This allowed me to comprehend the material better without feeling overwhelmed.

Dislikes

I didn’t find anything in this chapter that came across as a dislike and I’m guessing this trend will continue for the remainder of the book.

I’m looking forward to Chapter 2 next week and the rest of the book.  My goal is to be able to read execution plans by more than just seeing how thick the lines are between operators.


Working smarter, not harder, as a DBA

The butterflies were flying!  This was my first presentation for the Triad SQL PASS user group, and for any group for that matter.  I started attending our local SQL PASS chapter back in the fall of 2011 while I was between jobs.  Kevin Goode was the one who got me thinking about doing a presentation, and after I had a few months under my belt in my current position as a DBA, I came up with a topic: how to work smarter as a DBA.  I found that starting a new DBA position requires understanding your environment quickly without disrupting the day-to-day business.  This presentation gives you insight into the tools that keep our servers running smoothly and the training that keeps me up to date on the latest trends.

Here is the link to the PowerPoint slides.

Here is the link to the DBA Standards Document.

Here is the link to the server specific install settings and configuration.

Here is the link to the backup script I use each night.

Here is the link to the server side security audit script.

Here is the link to the database side security audit script.

Here are the links to the setup scripts that are used after the install of SQL Server 2008 R2.  Just edit the files based on your environment:

Database Mail Configuration Error 264

As most of you know, the Database Mail feature in SQL Server 2008 is the lifeblood of any notification system for your instances.  If Database Mail fails to send notifications, you may not be warned when a job fails, or worse, when a backup job fails.  We had a server last week that was exhibiting weird behavior within Database Mail.  I was receiving mail from all parts of the Database Mail environment except for Agent job notifications.  I was able to send test emails from Database Mail, receive maintenance plan summary reports and even receive emails from within maintenance plans using the notification task.  Why were Agent job notifications the only area that was misbehaving?  As I started diagnosing the issue I checked the main areas that usually stumped me in the past:

  • Verified database mail profile was enabled under SQL Server agent (Alert System Page)
  • Verified profile security was public and default
  • Restarted SQL Server Agent

The above three items usually resolve any issue I have with Database Mail.  I did some more testing with my Agent job and, sure enough, it was still not working.  Then I started digging into the logs; under SQL Server 2008 you can see logs from SQL Server, SQL Server Agent, Database Mail, and Windows all on the same screen.  I dove into the Database Mail log and there was an error: “[264] An attempt was made to send an email when no email session has been established”.  I had never seen this one before.  I started searching Google and found similar issues, but none seemed to fit my situation or solve my problem.  Because we have a simple setup for Database Mail, one account and one profile, I decided to rebuild the configuration.

I deleted the profile and account, and then went back in and did a complete rebuild of the configuration.  I used the same account and same profile settings as before.  After I had the Database Mail configuration complete, I needed to make sure the profile was enabled under SQL Server Agent.  Right-click on SQL Server Agent and click Properties.  Under the Alert System page, make sure Enable Database Mail is checked and that your mail system is selected along with the correct profile name.  With that done, I needed to restart the SQL Server Agent.  Upon restart, the 264 error did not appear and I was able to receive emails from my Agent job notifications.  I don’t know what caused the issue in the first place, but the rebuild seemed to clear things up.
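For anyone chasing a similar error, these are the checks I would sketch out first; the profile name and address below are placeholders for your own:

```sql
-- Is the Database Mail queue actually running?
EXEC msdb.dbo.sysmail_help_status_sp;

-- Send a test message through the rebuilt profile.
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'YourProfile',      -- placeholder
     @recipients   = 'dba@example.com',  -- placeholder
     @subject      = 'Database Mail test';

-- Review recent Database Mail events, including 264-style errors.
SELECT TOP (20) log_date, event_type, description
FROM msdb.dbo.sysmail_event_log
ORDER BY log_date DESC;
```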

#meme15 for Twitter

Jason Strate (blog|Twitter) has started a new meme project about social networking for SQL Server professionals.  This month’s theme is all about Twitter.  We have been asked to answer the following two questions:

  1. Why should the average Jane or Joe professional consider using Twitter?
  2. What benefit have you seen in your career because of Twitter?

I consider myself an average Joe SQL Server professional.  I’m a DBA for a private university and enjoy solving business problems with data.  Up until June of 2010, I did not have a Twitter account and was not involved in the #SQLFamily.  When I was getting back into the DBA field after going to the dark side (management) for a few years, I found Twitter to be the catalyst for learning.  I was able to follow a few big hitters in the #SQLFamily, and within the first day I started reading tweets about webcasts and blog posts.

This is the key to my use of Twitter: there is so much good information posted every day to keep you informed about SQL Server topics.  The beauty of Twitter is that most of the #SQLFamily don’t post the way your friends do on Facebook.  Twitter has become a medium for disseminating information.  Now, there are plenty of tweets from celebrities and even your friends that are just plain noise; this is where the #SQLFamily stands out.  Most of the people I follow tweet about topics related to SQL Server.  I enjoy reading new blogs and being notified of webcasts where I can learn new things.

As I stated earlier, I found Twitter to be my means of getting back into the DBA world.  I started with Brent Ozar’s Twitter book and created an account.  From there it was as easy as following Brent Ozar (blog|Twitter), Glenn Berry (blog|Twitter), Thomas Larock (blog|Twitter) and Aaron Bertrand (blog|Twitter).  From just those four people, I was introduced to so many more and started following them.  Each time I started following someone new, I was introduced to a few other people and the Twitter snowball kept growing.  As of January 2012, I’m following ~150 people; it’s hard to keep up during the day, so I usually spend nights catching up on all the tweets.

The benefits of Twitter helped me land a great DBA job in higher education.  I had plenty of experience with SQL Server 2000 & 2005 but only limited direct interaction with SQL Server 2008 & 2008 R2.  This is where the webcasts and blogs I was notified about on Twitter helped me fill the gaps and put that knowledge into practice in my home lab.  Without Twitter, I would not have had the #SQLFamily behind me and would have struggled getting back in the game.

2012 To Do’s for #SQLFamily

When I sat down and wrote out my 2012 goals for my job, they didn’t take into account my blogging, user groups and #SQLFamily.  That’s what makes this a perfect #mememonday project.  You can read more about #mememonday here from Thomas Larock (Blog, Twitter).

I’m fairly new to the #SQLFamily.  I didn’t start getting involved until June 2011 and, up to now, didn’t participate in monthly blogging like #mememonday or #tsql2sday.  For the past 8 months I have just been digesting information like I was eating at Golden Corral.  Twitter, webcasts and RSS feeds can take hours a day to stay on top of.  I still find it hard to get through all the Twitter feeds each day and finally had to limit the number of RSS subscriptions in Google Reader so those didn’t fall behind as well.  As I write this, I can see TweetDeck notifications going off every few minutes.

So here are my personal goals list for #SQLFamily in 2012:

  1. Continue to attend Triad PASS and Triad BI PASS.  The part that I’m going to work on this month is blogging about my experiences.  I’ve seen a few other posts related to this and thought it was a great idea.  This will help others that have not started attending their local PASS chapters see the topics discussed and how much fun they are.
  2. Start attending the virtual PASS chapters.  There are some great virtual chapters held every week that discuss topics you may not otherwise get at your local events.  Here too, I aim to blog about each one to keep my writing skills fresh.
  3. Present at Triad PASS in June 2012.  Our local Triad PASS chapter leader, Kevin Goode, asked me to present and I decided to jump in.  This will be my first time ever presenting anything outside of lunch-and-learns at work.  I have not picked a topic yet, but because I started a new job in November 2011, I’m thinking about walking through my management processes for my new SQL Server instances.
  4. Stay involved in #mememonday events.  This was started by Thomas Larock (Blog, Twitter) and I’m looking forward to these monthly blogs.  This is a great way to blog about something that you otherwise may not write about.  I feel that too many times I get into my comfort zone of daily DBA tasks and don’t stretch my boundaries to learn new things within SQL Server.
  5. Start blogging for #tsql2sday.  This was started by Adam Machanic (blog|Twitter) and, just like #mememonday, will give me the opportunity to participate in blogs about items I’m unfamiliar with.  This will require researching a topic and writing about it.  I’m looking forward to this one.
  6. Midnight DBA.  Sean (blog|Twitter) and Jen (blog|Twitter) make this live broadcast a blast.  I’m guilty of falling asleep before the show starts, but my goal this year is to catch at least 2 shows a month live.  The pre and post shows are just as fun as the main show.  They cover technical topics with a fun twist and their sense of humor keeps me laughing through the entire show.  This will be another event that I plan to blog about.
  7. Brent Ozar PLF’s Tech Triage Tuesday.  This is a great half-hour weekly webcast that covers all aspects of SQL Server.  Brent Ozar (blog|Twitter) and his team cover topics from performance tuning to SANs; each week is a must-see.  As you can guess, I’m going to blog about these as well.

My goals this year for #SQLFamily are all around bringing awareness to the training opportunities that are available to everyone.  Whether you are able to attend these events live or watch them from the comfort of your home, my goal is to spark interest in all the events.  This is how I was introduced to #SQLFamily back in June 2011 and I hope I can do the same for other SQL Server DBAs and developers out there.

Have a great year #SQLFamily,

2011 was the year of #SQLFamily

As the name suggests, SQL Family feels like family.  There is no other professional organization in the world that supports a product line as well as #SQLFamily.  My introduction to #SQLFamily was in the summer of 2011 when I decided to get back into SQL Server full-time after going to the dark side, management, for the two previous years.  I had a strong background in SQL Server 2000 & 2005 but not the full-time experience under 2008 & 2008 R2.

As I started searching for training opportunities for SQL Server, I came across Pragmatic Works.  Every Tuesday and Thursday throughout the year, they have a one-hour webcast on all areas of SQL Server.  This allowed me to catch up on what was new in 2008 and brush up on the daily DBA tasks I was accustomed to.  Each of the presenters had a personal blog and Twitter address with even more content beyond the session that was offered.  This got me interested in blogging for myself and starting to use Twitter.  As I searched for ways to get started with blogging and Twitter, I came across Brent Ozar (blog, twitter).  He built a great guide to help understand what Twitter was all about.  I wasn’t interested in following celebrities or sports figures; I just wanted to use it for SQL Server.  It just so happened that Brent was a DBA and a photographer.  This one-two punch was just the right mix to start me on my way into WordPress and Twitter.

As I started following Brent, I read posts from him and other SQL Server professionals about the passion the SQL Server community has for helping others.  This was perfect for me as I started on my way to becoming a full-time DBA.  Each new blog entry or Twitter post gave me a new understanding of SQL Server and how strong the community was.  This also introduced me to PASS and the local user groups in my area.  As I started attending the local events, that same passion within the online community was equally strong at the local level.  This allowed me to network with other SQL Server DBAs and get their input on ways to get back into the field full-time.

The local PASS events led me to SQL Saturday.  I was able to attend the Atlanta #89 event in the fall of 2011.  This was very eye-opening for me.  There were over 400 people gathered for a full day of free training on a Saturday.  I was finally able to meet a few folks I had only met via Twitter.  The highlight for me was hearing Bob Ward from Microsoft talk about wait types.  His session was level 500 and then some.  It was cool to see the inner workings of SQL Server from one of the people who has access to the source code.

By the fall of 2011, I was already talking to a few companies about DBA positions and felt confident about finding the perfect DBA job.  As I accepted my current DBA role, I thought back to the family that got me there.  Without #SQLFamily, this would not have been possible.  This has given me the drive to give back to the community so others out there can find their perfect DBA role like I did.  My first step is our local PASS chapter, where I’ll be presenting during the summer of 2012.  I’m also working on blogging more regularly throughout the month, so others can learn from my view of being a SQL Server DBA.

I love being a part of the #SQLFamily.  Looking forward to a great year in 2012!

September’s Inaugural Triad SQL BI PASS User Group

I attended the inaugural Triad SQL BI PASS user group last week, hosted by Inmar in Winston-Salem, NC.  DataMasters provided the pizza before the meeting, and Wayne Snyder [twitter] from Mariner USA was the guest speaker, who talked about the top 10 reasons to upgrade to SQL Server Denali.  Melissa Wittner [triadsqlbi@gmail.com] is the chapter leader for the new group and did a wonderful job getting the group started and the meeting organized.  You can visit the chapter’s website here.

Future meetings:

  • October 23rd Performance Point
  • November 29th Dashboards
  • December 8th Social @ Foothills Brewing Company

Top 10 Reasons to Upgrade List:
  1. SSIS Enhancements
  2. Master Data Services Enhancements
  3. Data Quality Services
  4. Columnstore Indexes
  5. SSRS w/ SharePoint
  6. SSRS Data Alerts
  7. PowerPivot Enhancements
  8. DAX (Data Analysis Expression) Language
  9. SSAS Tabular Model
  10. Project Crescent

Wayne spent the next hour doing a deep-dive demo of some of the items in the top 10 list.  Wayne has spent the last 9-12 months working in Denali and it shows.  He was able to show the group all aspects of SSIS, columnstore indexes, and PowerPivot.  This has energized my interest in Denali and in trying a few of these new features in my lab.  I’m looking forward to attending this user group on a regular basis.

September’s Triad SQL PASS User Group

I attended a great session on September 15th, 2011 at the Triad SQL PASS user group, with Jason Hall from SQL Sentry demonstrating the advantages of SQL Sentry’s Plan Explorer over the built-in SSMS plan viewer.  You can find Jason as @SQLSaurus on Twitter.  The meeting was hosted by Dynamic Quest, and everyone’s favorite pizza and soda was served before the meeting started at 6:30pm.

Jason went through 7 example queries to show how the SQL Sentry Plan Explorer results compared to SSMS.  Sentry’s product is free and integrates directly into SSMS.  Here are some other advantages of Sentry’s Plan Explorer:

  • Features color scaling from yellow to red so you can easily identify problem areas.
  • Features separate I/O and CPU costs to identify disk vs processor intense queries.
  • The plan viewer is much more condensed than SSMS’s view, which makes it much easier to see the whole plan.

You can download Plan Explorer for yourself here and test things out.
Doug Purnell
sqlnikon@gmail.com

Understanding SQL Server Instances

I was raised on SQL Server 6.5 & 7.0, where instances were not available.  I never fully understood the advantage of having multiple instances on the same server, so I never embraced it.  I always had a single, default instance per physical server.  Once I got involved with virtualization a few years ago, having multiple instances of an OS running on a single physical server finally got me thinking about multiple instances of SQL Server.

My lab at home has been the perfect place to install, configure and understand multiple SQL Server instances.  I have two servers running ESXi 4.1 U1 and VMs to support a small Windows domain.

I started with a copy of Windows Server 2008 R2 64-bit as the guest OS.  I installed my first copy of SQL Server 2005 Standard under a named instance, SQL03\SQL2005Standard (I currently don’t have a default instance of SQL Server installed on this VM).  I then proceeded to install SQL Server 2005 Enterprise, SQL Server 2008 Standard and SQL Server 2008 Enterprise, all as named instances on the same VM, as SQL03\SQL2005Ent, SQL03\SQL2008Standard and SQL03\SQL2008Ent, respectively.

Here is what I learned about supporting multiple instances of SQL Server from a DBA point of view:

  • The default instance of SQL Server, known as MSSQLSERVER, listens on TCP port 1433
  • Under Windows Server 2008, the firewall is enabled by default so you need to create inbound rules to allow outside connections to be able to reach SQL Server.
  • SQL Server Browser service is responsible for taking requests for named instances and forwarding them to the correct port, whether it’s dynamic or statically configured.  If there are no named instances of SQL Server, the Browser service is not required.
  • The SQL Server Browser service listens on UDP port 1434.  This port will need to be opened in the firewall.
  • Without the SQL Server Browser service running, you would need to provide the IP and port number (xxx.xxx.xxx.xxx,yyyy) in order to connect to a named SQL Server instance.
  • To connect to a SQL Server instance by name, you use the <computer name>\<instance name> syntax.
  • By default, named instances are set to dynamic ports.  This causes an issue trying to open ports on the Windows firewall because each time SQL Server is restarted a new port can be set.  With the firewall enabled, it’s best to switch to static ports for all named instances and then create inbound rules for those TCP ports.
  • If you prefer to use dynamic ports, you can also exclude the sqlserver.exe process from the firewall.  This would allow inbound connections for just the SQL Server process and thus all clients to connect to dynamic or static ports without having to create individual rules for each TCP port number.
  • To set an instance to a static port, open SQL Server Configuration Manager (SSCM), drill into SQL Server Network Configuration and expand the Protocols tree for your named instance.  Right-click on TCP/IP and select Properties.  Click on the IP Addresses tab and scroll all the way to the bottom of the list until you see ‘IPAll’.  Clear out the TCP Dynamic Ports box and, under TCP Port, provide the desired TCP port number.  To switch back to dynamic port allocation, just clear out the TCP Port box and place a zero in the TCP Dynamic Ports box.  Any time you make a change to the protocol settings, you will need to restart the SQL Server service for the named instance.
  • TCP and UDP ports are registered with IANA.  You can reference them using this Wikipedia article.  I chose to use TCP ports 1435 through 1438 for my instances.  Well-known ports are 0 – 1023, registered ports are 1024 – 49151, and dynamic ports are 49152 – 65535.
  • To see a list of ports the server is using, run ‘netstat -an’ from the command prompt.  Use this data along with available ports from Wikipedia to determine what ports are available on your system.
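From inside SQL Server you can also ask the instance which TCP port your current connection landed on, which is handy for confirming that a static-port change took effect:

```sql
-- Shows the port the current session is connected through.
-- Returns NULL for shared-memory connections, so connect via TCP
-- (e.g. tcp:SQL03\SQL2008Ent) to see a port number.
SELECT local_tcp_port
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```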

I hope you find this useful when you install SQL Server with multiple instances while the Windows Firewall is enabled.