Challenge Accepted

For the last two years my life has been a constant battle of putting out system performance fires; user complaints have finally pushed getting this resolved to the top of my priority list.

Let’s see how I tackled the problem…

Symptoms:

  • Very high disk latency: spikes as high as 300,000 milliseconds (ms) are not unusual
  • Average latency: 900 to 15,000 ms
  • Memory Pressure
  • Slow User Experience

Problem:

  • Bad hardware
  • Over-provisioned VM hosts (what happens on one VM affects the others)
  • Old NetApp SAN
  • No infrastructure budget for new hardware

Challenge: Make the system viable with no hardware changes or tweaks

Step 1: Brainstorming (in no particular order)

  • Reduce I/O
    • I can probably tune a ton of old stored procedures
    • I need to do a full review of all indexes
  • Reduce blocking
  • Investigate daily data loads
    • How is the data loaded?
    • Can it be improved?

Step 2: Reduce I/O & Investigate Daily Data Loads

After doing some research, I found that we were truncating 48 tables, totaling over 120 million records, as part of our morning load. The process took over two hours to complete each morning and would often cause blocking. During that window users would run reports and complain that data was not returning in a timely manner, so this seemed like a great place to start.

I also noticed we were reloading 8 tables once every hour to keep them “real time” for reports. This meant 9.6 million records were truncated and reloaded each hour, a process that took approximately 17 minutes out of every hour.

Solution: Implement transactional replication in place of the hourly and morning truncate-and-reload processes.

Outcome: Once implemented, disk I/O dropped drastically and disk latency fell to an average of 200 ms. The morning load time dropped from 2 hours to 9 minutes, and the hourly load went from 17 minutes down to 5 seconds. The disk latency still is not optimal (best practice says it should be below 20 ms), but it is much better.
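
If you want to check latency numbers like these yourself, the sys.dm_io_virtual_file_stats DMV is one quick way to see per-file read and write latency. This is only a minimal sketch; the figures it returns are cumulative since the last SQL Server restart, not a live average.

SELECT  DB_NAME(vfs.database_id) AS database_name,
        mf.physical_name,
        vfs.num_of_reads,
        vfs.num_of_writes,
        -- average latency per I/O, guarding against divide-by-zero on idle files
        CASE WHEN vfs.num_of_reads = 0 THEN 0
             ELSE vfs.io_stall_read_ms / vfs.num_of_reads END AS avg_read_latency_ms,
        CASE WHEN vfs.num_of_writes = 0 THEN 0
             ELSE vfs.io_stall_write_ms / vfs.num_of_writes END AS avg_write_latency_ms
FROM    sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN    sys.master_files AS mf
        ON  mf.database_id = vfs.database_id
        AND mf.file_id = vfs.file_id
ORDER BY avg_read_latency_ms DESC;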

This solution took a lot of work to accomplish. Once the replicated tables were stable, I first identified which stored procedures were using those tables (I used Idera’s SQL Search for this). Then I changed each procedure to read the tables from the new location.

Next, I had to change any SSRS reports that had hard-coded calls to the old tables (note: don’t do this; always use a stored procedure). Finally, I looked for any views that referenced the tables and adjusted those as well.

In two weeks’ time, over 500 stored procedures, reports and views were manually changed.

It is probably worth noting that this was all done in production, simply because we do not have a test environment for this system. Yes, I took a few bumps and bruises from missed table calls in stored procedures, typos, and nasty collation errors that arose. These were bound to happen, and some changes I was not able to test during the day. All in all it went really well. A test environment would have prevented most of these, but not all of us have that luxury.


The OOPS: Unfortunately, not long after I implemented the first couple of tables I began to notice blocking, and when I investigated I found replication was the cause. I had forgotten a very important step, which, thanks to a blog post by Kendra Little, I was able to quickly identify and solve: I needed to turn on Allow Snapshot Isolation and Read Committed Snapshot. Her blog was a HUGE help; you can read all the details on why this is important here: http://www.littlekendra.com/2016/02/18/how-to-choose-rcsi-snapshot-isolation-levels/ . Once those two options were enabled, replication ran seamlessly and the blocking disappeared.
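
For reference, the two settings boil down to a couple of ALTER DATABASE statements. This is just a minimal sketch run against the subscriber database; the database name is a placeholder, and turning on READ_COMMITTED_SNAPSHOT needs exclusive access, hence the ROLLBACK IMMEDIATE.

-- [SubscriberDB] is a placeholder name, not the real database.
ALTER DATABASE [SubscriberDB] SET ALLOW_SNAPSHOT_ISOLATION ON;
ALTER DATABASE [SubscriberDB] SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;

-- Verify both settings took effect.
SELECT name, snapshot_isolation_state_desc, is_read_committed_snapshot_on
FROM   sys.databases
WHERE  name = N'SubscriberDB';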

Step 3: Index Review

First of all, as a Lone DBA I always preach: don’t waste your time reinventing the wheel, use what is already out there. So I turned to the trusted scripts from Glenn Berry (B|T). You can find them here: https://sqlserverperformance.wordpress.com/2016/06/08/sql-server-diagnostic-information-queries-for-june-2016/ . I am not going to supply snippets of his code; feel free to download them directly from his site to review.

I started by reviewing duplicate indexes and deleted or adjusted them where needed. Then I moved on to looking for missing indexes (where some magic happens). This reduced I/O because proper indexing lessened the number of records that had to be read.

Just because the scripts reported an index as missing didn’t mean I blindly created it; I evaluated each suggestion’s usefulness and decided whether it was worth the extra storage space and overhead. Glenn’s script gives you a lot of information to help judge an index’s effectiveness. For example, the first suggestion in my result set showed that over 45,000 user seeks would have used the index and the query cost would have dropped by 98.43% on average. Even then, I didn’t add it arbitrarily just because it was in the list. Only after confirming I would not be creating a duplicate or near-duplicate index on the table, and weighing the potential performance gain, was the index added.
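
I won’t reproduce Glenn’s code, but for context, that kind of information comes from the missing index DMVs. The query below is only a bare-bones sketch of those views, not his script:

SELECT  mid.statement AS table_name,
        mid.equality_columns,
        mid.inequality_columns,
        mid.included_columns,
        migs.user_seeks,
        migs.avg_user_impact   -- estimated average % drop in query cost if the index existed
FROM    sys.dm_db_missing_index_groups AS mig
JOIN    sys.dm_db_missing_index_group_stats AS migs
        ON migs.group_handle = mig.index_group_handle
JOIN    sys.dm_db_missing_index_details AS mid
        ON mid.index_handle = mig.index_handle
ORDER BY migs.user_seeks * migs.avg_user_impact DESC;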


Oh one more OOPS…(why not, learn from my mistakes)

After going through the index exercise and adding indexes to the tables (on the subscriber), I lost all of them except the primary keys. Yep, one change to a replicated table caused replication to reinitialize, and all of my added indexes were dropped. Needless to say, I was not a happy camper that day. Luckily, each index I added had been scripted out and attached to a help desk ticket, so I was able to go back through my tickets and resurrect each one I needed. To be smarter about it, I have now scripted all of them into one file so I can re-add them all if needed in the future. I haven’t found a way around this yet, so if anyone has any information on how to avoid it, feel free to let me know.
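
If you find yourself in the same spot, a starting point is simply listing the non-clustered indexes on the replicated tables so they can be scripted out and saved somewhere safe. A rough sketch (the table name is a placeholder):

SELECT  i.name AS index_name,
        c.name AS column_name,
        ic.key_ordinal,
        ic.is_included_column
FROM    sys.indexes AS i
JOIN    sys.index_columns AS ic
        ON  ic.object_id = i.object_id
        AND ic.index_id = i.index_id
JOIN    sys.columns AS c
        ON  c.object_id = ic.object_id
        AND c.column_id = ic.column_id
WHERE   i.object_id = OBJECT_ID(N'dbo.SomeReplicatedTable')  -- placeholder table name
  AND   i.type_desc = N'NONCLUSTERED'
  AND   i.is_primary_key = 0
ORDER BY i.name, ic.key_ordinal;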

Step 4: Performance Tune Slow Stored Procedures (the fun part for me)

Armed with Grant Fritchey’s (B|T) book on execution plans for reference, I began tuning any stored procedure I knew of that took more than 2 minutes to run. In total I tuned about 77 of them; most were report related or part of data loads. Many benefited from indexes being placed on temp tables within the procedures. Others were doing far too many reads because of bad WHERE clauses or joins.

Another thing I ran across was functions used in WHERE clauses or joins. One example was date conversion functions applied to the From and To dates used in a BETWEEN statement. Each date value had to be processed by the function before the WHERE clause could be evaluated, causing many more reads than necessary. To work around this, I read the data in and converted the dates into a temp table, then did my JOINs and WHEREs against the already-converted data. Alternatively, depending on the statement, I converted the value once and placed it in a variable for later evaluation.
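
Here is an illustration of the pattern; the table, column, and function names below are made up for the example, not the actual code from this system:

-- Before: the function runs against the column for every row, so an index seek is off the table.
SELECT OrderID, OrderTotal
FROM   dbo.Orders
WHERE  dbo.fn_JulianToDate(JulianOrderDate) BETWEEN @FromDate AND @ToDate;

-- After: convert the boundary values once and compare the raw column instead.
DECLARE @FromJulian INT = dbo.fn_DateToJulian(@FromDate),
        @ToJulian   INT = dbo.fn_DateToJulian(@ToDate);

SELECT OrderID, OrderTotal
FROM   dbo.Orders
WHERE  JulianOrderDate BETWEEN @FromJulian AND @ToJulian;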

There were many more things I came across and tuned, such as implicit conversions, table spools, and sorts that were not optimal. All of these were fixed with small code changes. I won’t go into all of them because this post would get quite long, but you get the point.

Happy side effects: After cleaning up the tables and implementing replication, I actually freed up 300 GB of storage and greatly reduced our backup and restore times.

Summary:

Things are running much better now; introducing replication reduced enough disk I/O to keep the system viable. Latency now hovers on average between 2 and 200 milliseconds, which is a vast improvement. I do, however, still see spikes in the thousands of milliseconds, and users still complain of slowness when they run large ad-hoc queries within the application (JD Edwards E1). Unfortunately, that goes back to the hardware and the application itself, which are things I cannot improve upon. The good news is I am hearing a rumor that we will be installing a SimpliVity solution soon, which I am very excited about. I’ll blog again once that solution is in place and let you know how it goes.

The Shield

How many of you are known as the “Grumpy DBA” or have a bad reputation with users because you are always saying no or making them wait? I know many DBAs who have this reputation. To avoid it, I use my manager as a shield, and I suggest you do too. As a Lone DBA with an extremely full plate, I have learned that having that shield is necessary. It keeps me from being seen as the bad guy and protects me from work overload.

We all experience what I call “drive-bys,” when people ask for stuff on the fly. Telling someone “no” while they are standing in your office can be hard to do and can reflect poorly on you. So how do you avoid that? You probably cannot prevent the drive-by, but you can fix the perception the user walks away with. When drive-bys occur, I take the time to listen to the user’s needs, let them know I will look into it, and then follow up with my manager without giving a yes or no to the work. I’ve found this is not only the best way to keep from becoming a “yes man” trying to fulfill every request, it also keeps me from having to say no.

Using your manager as a shield puts management of the workload on their shoulders instead of your own. This, in turn, keeps them apprised of the workload and prevents your plate from getting too full without creating negative user perception. My manager has no issue saying no to users or prioritizing requests appropriately. This keeps you from being the bad guy and prevents the impression that the users’ needs aren’t important to you.

The key to maintaining a healthy user relationship is making sure their needs are heard and that you are doing your best to give them what they need to be effective at their jobs. It’s easy to become the Grumpy DBA when you’re forced to be the naysayer. With my shield in place, I can tell users that I passed the request along and their work is being prioritized. If they have any questions, they can follow up with my manager to see where their request stands.

So far this works well for me as a Lone DBA and has become vital in preventing me from becoming overworked, overwhelmed, and burnt out. If you don’t already have a shield in place, I recommend talking to your manager and seeing if you can work towards one.

Good luck!

SQL Family: The Wonder Years

Last week, Bill Wolf, aka @SQLWareWolf, and I somehow got onto the topic of high school pictures. So, in jest, I decided to post mine on Twitter with the hashtag #SQLHSPics and challenged others to do the same, only really expecting @SQLWareWolf to respond in kind. I was floored by over 100 picture responses from #SQLFamily. Many people searched through attics and yearbooks, called relatives, and went to other great lengths to be part of it. As always, the response was heartwarming and hysterical, to say the least.


The reason I am taking the time to blog about it is to reiterate how great it is to be part of this amazing community of SQL professionals. If you’re not already involved, I encourage you to get involved. These wonderful people provide me not only with mentoring, education, laughter, and mental breaks, but also a true sense of family. Not many know this, but I am going through some big things in my life, and that week was more difficult than most. The #SQLFamily unknowingly helped me get through it with a smile, and I am more grateful than you know.

Exhibit A: David Klee, @kleegeek (our winner for most laughs, re-tweets, and memes by far)

THEN &  NOW

After this picture was posted, it led to a slew of responses and new hashtags, including #myfirstklee, which showed pictures of people’s reactions to David’s picture.



Love to all my #SQLFamily and thank you!

Where else would you find highly professional people posting pictures of their most awkward growing years for us all to comment on freely?

You know I can’t end this without posting some of the pictures from that week!

Enjoy!


Back to Basics: Why not parameterize?

I think sometimes those of us who have been doing database administration or development for a while take it for granted that everyone knows the basics. One such basic is parameterizing stored procedures, which lets us consolidate multiple stored procedures into a single procedure. It’s a simple thing to do, yet many don’t do it.

I try to parameterize as many stored procedures as possible. This not only minimizes the number of procedures I need to maintain, it is, in my opinion, a much cleaner way to code. It bothers me when I see multiple stored procedures that pull the exact same data with only slight differences between them. Whether it’s a sort, a WHERE clause, or just an extra field or two, some developers think you need a different procedure for each variation. Why not consolidate and parameterize?

Exhibit A

The example below is based on a real work scenario. Originally there were 8 stored procedures with 8 corresponding reports. By simply adding a Report Type parameter, I was able to consolidate them into one stored procedure and a single report.
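
The sketch below shows the general shape of the consolidated procedure. Only the procedure name, its parameters, and the idea of branching on the report type come from the real report; the table and the logic in each branch are stand-ins.

CREATE PROCEDURE dbo.rptTrackMonthlyStats
    @ReportType VARCHAR(5),
    @year       INT,
    @startdate  DATE,
    @enddate    DATE
AS
BEGIN
    SET NOCOUNT ON;

    IF @ReportType = 'T'        -- totals-only layout (type codes are illustrative)
        SELECT StatMonth, SUM(StatValue) AS MonthTotal
        FROM   dbo.MonthlyStats            -- placeholder table
        WHERE  StatYear = @year
        GROUP BY StatMonth;
    ELSE IF @ReportType = 'TD'  -- totals-with-detail layout
        SELECT StatMonth, StatDate, StatValue
        FROM   dbo.MonthlyStats
        WHERE  StatDate BETWEEN @startdate AND @enddate
        ORDER BY StatDate;
    ELSE                        -- default detail layout
        SELECT StatMonth, StatDate, StatValue
        FROM   dbo.MonthlyStats
        WHERE  StatYear = @year;
END;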

To add a new dataset, just right-click on Datasets and choose Add Dataset. Since the report runs a stored procedure, we set the dataset query text to the stored procedure name and its parameters. This is just my preferred method; you can also choose the stored procedure from the drop-down.


rptTrackMonthlyStats @ReportType, @year, @startdate, @enddate


For the Report Type parameter, choose Add under Available Values. I typed in each option so the user could pick which report layout/data they wanted to see from a drop-down. That parameter is passed to the stored procedure on execution, and the proper dataset is returned. The users never see the T, TD, etc.; they only see the label, so it makes no difference to them what those values are.


You can even go as far as using these parameters to hide and show different report elements, but that’s for another time. Stay tuned for more Back to Basics posts.

NOTE: There are some reasons not to do this, such as execution plan reuse and parameter sniffing, but in this case consolidating is not an issue since the procedures use the same parameters.

QUICK & DIRTY: Table name change for all stored procedures

Recently, I was tasked with finding all stored procedures in a database that use a particular table and changing them to use another table. I needed to get this done quickly, so I decided to think simple. Since it was a quick and dirty answer, I figured I would share it; it’s not rocket science, but it’s useful nonetheless.

The quick answer I came up with was to script out all of the stored procedures into a single query window, which can be done easily through the GUI. Once that is complete, a simple “Find & Replace” on the table name and we’re done!

Let me show you how!

Step 1: Script out all stored procedures for a database

Right-click on the database > choose Tasks > choose Generate Scripts


If you have not used Generate Scripts before, it has a great wizard to walk you through it. You can create scripts for the entire database or just certain objects, even narrowing it down to a single stored procedure. In this case I wanted all stored procedures.


Step 2: Choose Stored Procedures (all of them or specific ones)


Step 3: Set the Advanced Options

Then I went to the Advanced options and set Continue on Error to TRUE so it would skip any issues that might surface; if there are any, I will address them at a later time. I also chose to send the scripts to a new query window, which makes it very easy to do a find & replace on the table names.


Step 4: Once the new query window was populated, I did a simple find and replace of the table name I wanted to change. I also changed CREATE PROCEDURE to ALTER PROCEDURE for all of the procedures so that I would not need to drop and recreate them.

Step 5: Back up your database!!! Another option is to simply keep a copy of the script before any modifications are made; in the event of an error, I can rerun the original script to put things back the way they were.

Step 6: Run Script

In this case, 365 stored procedures were updated and I had the process completed in about 15 minutes. Not too shabby.
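
As a sanity check afterwards, a quick query against sys.sql_modules will show whether any procedure still references the old table (the table name here is a placeholder):

SELECT  OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
        OBJECT_NAME(m.object_id) AS procedure_name
FROM    sys.sql_modules AS m
JOIN    sys.objects AS o
        ON o.object_id = m.object_id
WHERE   o.type = 'P'                          -- stored procedures only
  AND   m.definition LIKE N'%OldTableName%'   -- placeholder table name
ORDER BY schema_name, procedure_name;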

*Remember, it’s always best to do this in a test environment first.

November #SQLChat – How to Build your Name Recognition and SQL Network

For those who don’t know, Idera Software sponsors a #SQLChat on Twitter once a month. This month I had the privilege of hosting and came up with the topic and questions for discussion. Below are the questions, my answers, and some answers from others who chimed in. I think these are worth noting and offer valuable information, especially for newcomers to the SQL community.

Q1: How have you benefited from networking with SQL professionals?

My Answer: As a Lone DBA, networking has gained me “co-workers.” I now have people to bounce ideas off of, and I use these connections daily. Many people have gained job opportunities just from networking and getting to know other SQL professionals; I actually know several companies that hire based on SQL networking relationships instead of using recruiters. Networking also exposes you to many facets of SQL you might not have otherwise looked into. My follow-up responses are shown in italics.

A1: I have met incredible people who opened doors for my career. I wouldn’t be where I am today without them – @SQLDBA Kendal Van Dyke

I too have experienced this. Getting connected and familiar with the SQL community can really launch and expand your career.

A1: If you don’t talk to folks in the same prof. you in a silo in many ways that can be a serious obstacle to progress – ‏@sqlmal  Malathi Mahadevan

I totally agree with this one. Before I started networking, I found myself in my own little world, not really expanding my knowledge.

A1: I’ve built my network & ID’d a Go-To SQL Person for a variety of problems: Backups, mirroring, index, optimize, etc – @IrishSQL Rie Irish

I actually talk about this in my SQL Saturday presentation. Having go-to experts in all aspects of SQL Server is key, especially for a Lone DBA like me. You cannot be an expert in everything, but you can build a network of people who are experts in their own realm of SQL Server.

A1: I was burnt out & wanted to change professions before I went to my first UG mtg and started meeting people and it re-energized me –@sqlgator Ed Watson

It is so easy in this field to get burnt out. I truly love how inspiring and motivating our community is.

Q2: What avenues have you used to build your SQL network?

My Answer: Twitter, first and foremost. I am on it almost every day talking to #SQLFamily and building those relationships. Even if it is just to say “Good morning!”, people get used to seeing you every day and you become more involved. I also blog now and speak at SQL Saturdays, and now that I am an Idera ACE for 2016 you will be seeing more of me this year as I travel to more SQL events. I am extremely excited to get even more involved. Another great aspect of this community is that it’s easy for introverts to mingle their way in because it’s so inviting; we have lots of introverts in this community, and there is a place for everyone.

A2: I’ve found that while SQL Sats, Summit etc are helpful, adding in Twitter is like rocket fuel for the process- @sqlstudent144 Kenneth Fisher

This is very true; add Twitter to your networking tools and you will see what an impact it has.

A2: On Twitter since an amazing #SQLPass at the recommendation of @GlennAlanBerry been blown away by the SQL Love and support on Twitter – @_adamnichols Adam Nichols

I love this one. It goes to show how inviting our community is to newcomers. The passion for what we all do shines through, even in just 140 characters.

A2: Twitter! SQLSaturdays, user groups and events like PASS Summit and SQLBits. Newest is Slack – @cathrinew Cathrine Willhelmsen

The WIT (SQL Women in Technology) group has just started a new Slack channel. For those ladies who want to get involved, drop me an email or a direct message and I’ll get you invited.

A2: I joined the #sqlchat today and from reading the Tweets, it feels like a great SQL resource – @crhanks Cary Hanks

This is exactly why we do these types of things. It helps to get more involved and share our experiences with each other.

Q3: How can newcomers get started on networking within the SQL community?

My Answer: Get a Twitter account! Just start interacting; don’t be afraid to jump into conversations. We don’t bite, and I hear we have cookies. When you set up your Twitter account, use SQL in your handle or at least in your bio; it helps us recognize family members. In addition, change your avatar to an actual picture of yourself and start getting yourself out there. It’s great to put a face with a name.

A3:  Advice for to newcomers is to start with your local user group, attend SQL Saturdays and talk to the organizer and Tweet. – @LindsayOClark Lindsay Clark

If you don’t know whether your town has a local user group, visit the PASS website and look it up. If there is not one near you, try a virtual chapter; those too are a great resource and a good way to get involved.

A3: Agreed if it is on Twitter and you can see it, it’s a public conversation –jump in. – @DanielGlenn

This is a great tip to remember, and several others chimed in with the exact same thing. Jump into conversations freely; people will respond to you. Don’t be afraid to do so.

A3: Getting out of lurker mode on Twitter helps. Introduce yourself! I  often suggest new users of Twitter give a look to @BrentO’s free ebook on the topic brentozar.com/twitter/book@vickyharp Vicky Harp

This is a fantastic resource for those new to Twitter. Great advice, Vicky! I also completely agree with getting out of lurker mode. You can gain a lot by watching conversations and reading the information shared, but you gain even more by participating.

Q4: Do you have name recognition? Why do you think that’s important?

My Answer: I am working on building name recognition; I’ve begun using SQLEspresso on my blog, cards, and emails. I think it’s easier to remember than a name. Building name recognition simply opens doors; it is not about becoming “SQLFamous.” I’ll admit it floors me when someone recognizes me as SQLEspresso, and I get a kick out of it.

A4: I heard “Oh, hey, you’re @AMtwo!” more than once at PASS Summit. Name recognition helps build relationships – @AMTwo  Andy

I’ve had the same thing happen, and many others in the chat said they did too. Since we are located all over the globe, social media, blogs, etc. are our personalities and our only interaction with many #SQLFamily members. It’s important to build that name recognition and keep building it so people remember you despite the lack of in-person interaction.

A4: I have some name recognition. Enough for me. It helps when I need answers & gives more weight when I give answers – @IrishSQL Rie Irish

A little goes a long way. I agree with Rie: the more your name is out there, the more credence your responses and questions get.

A4: I try to use the same photo of myself everywhere to help with self-brand recognition –  @johnsterrett  John Sterrett

This too is great advice. If you are trying to build a brand or name recognition, consistency across all platforms is a must. I’ve actually just started doing this myself.

Q5: What names or brands do you recognize? Why do you think that is?  

My Answer: There are so many names in the SQL community I recognize because they make themselves visible and give back to the community. They also encourage others to get involved; you can see their passion for SQL Server and its family.

A5: Brent Ozar (wicked marketing chops) Paul Randal, Kimberly Tripp etc. They all have excellent branding –@sqlrus John Morehouse

These are a few of the “big” names you see every day. Why are they big names? Because they give their time and knowledge to our community. They are active and consistent in their brand’s image.

A5: I recognize people who write books/blogs, who speak at UGs & SQL Saturday type events, & engage, on Twitter. –@SQLDBA Kendal Van Dyke

I think Kendal’s response reinforces the idea of noticing those who get involved and give back.

A5: Leaders are “created” by their efforts and community acknowledgement. The most referenced names are that was for a reason. – @tomsql Tom Staab

A5: I think cheerful helpers in the SQL community gain name recognition whether they seek it or not. –@vickyharp Vicky Harp

I couldn’t agree more with Tom and Vicky. You don’t have to seek name recognition; it is created naturally. You probably have more name recognition than you think.

Q6: How do you find time to network and build your personal brand? Are you able to do it as much as you’d like?

My Answer: I make time. Even just a little here and there makes a difference. I take a minute every day to pop into Twitter and say hi. I have started writing a weekly blog as time allows, and I give my time to my SQL user group. Those who know me personally know I have a crazy schedule, and as a Lone DBA my workload is tremendous, but the SQL community is important to me and I find time to network and get involved. It’s worth every minute of my time.

A6: I spend time building my brand and networking without knowing I am doing it. I focus on things I am passionate about so it’s really just my hobby time – @johnsterrett  John Sterrett

Many people have a brand and don’t even know it. It’s just something that develops freely and can be cultivated if you want it to. I also find, and I think most people will agree, that if you’re passionate about SQL Server it becomes a hobby and you make time for it. It’s one of the great things about our careers; we find it fun and don’t think of it as a job.

A6: I wish I could spend time participating in #SQLChat today. A meeting’s preventing me from it. Just wanted to say I <3 #SqlFamily –@DBAArgenis Argenis Fernandez

This is exactly my point. You make time. Argenis wanted to support me in this #SQLChat and made it a point to make time. Thanks Argenis!

All in all, the chat session went really well. There are a lot of takeaways from this, and the few I’ve highlighted here help drive home my point. My Twitter notifications blew up with so many responses that I wish I could include more in this post. We actually broke an Idera record for the most tweets and involvement in a #SQLChat, with over 370 tweets. Thanks to all who played a part in the conversation; I hope it was as fun for you as it was for me. I am looking forward to next month’s topic.

Everything is coming up ACEs

I am thrilled to announce that I have been chosen as one of the 2016 Idera ACEs. It is truly an honor to be part of this great program and to give back to the SQL community.

What is an Idera ACE?

According to Idera:

“ACEs (Advisors & Community Educators) are active community members who have shown a passion for helping the community and sharing their knowledge. We help the ACEs pursue that passion by sponsoring travel to select events and offering guidance for soft skill training.”

Requirements to become an Idera ACE:

  • Enthusiastic members & leaders of the SQL community
  • Accomplished contributors to the SQL community
  • Good speaker, writer and presenter
  • Demonstrated a passion for educating fellow community members

My SQL Saturday Addiction

Recently I joked on Twitter that I am now addicted to SQL Saturdays and need a GoFundMe just to pay for them all. Well, that is no longer the case. As an ACE, Idera will generously sponsor some of my 2016 speaking engagements. This year I will have attended 5; next year, with Idera’s help, I hope to attend even more. This amazing gift will allow me not only to grow in my career but also to help others do the same.

What I Love About This

Being an ACE doesn’t mean we have to be salespeople for Idera. Instead, we are given the means to enrich our knowledge of Idera’s products along with opportunities to give feedback. We get to participate in beta testing and tell them how we have used their products in the past to help them continually improve. I cannot wait to start working with the product teams.

The Need for Tools

As a Lone DBA, I rely on products such as Idera’s to juggle my daily workload. As I say in my session on survival techniques for a Lone DBA, I have to be an octopus to get all the work done, and products like Idera’s act as my extra arms. They allow me to quickly monitor my servers, perform administrative tasks, and run health checks, among many other things, without having to write my own scripts. The time these tools save me is invaluable, so I am happy to contribute my input on them.

Thanks

I thank Idera for investing in me; I will take full advantage of it and make the most of it. I am humbled to have been chosen as a 2016 Idera ACE. Congrats to the other newly appointed ACEs; I look forward to working with you!


Initial SQL Server Configurations

Wonder if I Do Things Differently?

I am always wondering what other DBAs do and whether I do things differently. One such thing is my initial server setup: basically, what I configure on each of my new servers. So why not blog about it and see what others chime in with after they read this, keeping in mind that everyone has different requirements and different ways they like to do the actual configuration.

For now, I am not going to go into what each of these configurations does and why I choose the values I do; that’s for another time. If you want that information, you can always go to Books Online or just Google it. In this case, I am simply giving you a running list, with scripts, that I’ve added to over the years based on best practices and experience.

How Does Yours Compare?

I’d really love to hear what others do and whether I’ve missed something you like to implement that might benefit me as well. So leave a comment, tweet at me, or send me an email, and let’s compare notes.

The List

So here are the basic setups I do on every server post-install, in no particular order. A sample script covering a few of them follows the list.

* Value varies based on server configuration

  1. Min and Max Memory *
  2. Enable and Configure Database Mail (this only enables it; the full script will be in a later post)
  3. Set Default Database Locations
  4. Set SQL Agent Job History Retention
  5. Set Cost Threshold for Parallelism *
  6. Set Max Degree of Parallelism *
  7. Set Optimize for Ad Hoc Workloads
  8. Change Number of Error Logs
  9. Create Cycle Error Log Job
  10. Add Additional TempDB Files, All With the Same Size and Growth Rates *
  11. Set Media Retention
  12. Set Backup Compression Default On
  13. Change to Audit Both Successful and Failed Logins
  14. Set Default Growths in Model to Not Be Percentages
  15. Set Ad Hoc Distributed Queries Off
  16. Set CLR Enabled Off
  17. Set Ole Automation Procedures Off
  18. Set Scan for Startup Procs Off
  19. Set xp_cmdshell Off
  20. Set Up Operators
  21. Set Up Alerts 17-25 and Error Codes 823, 824, 825 (remember to add the alerts to the operator)
Note: Most of these can be set using the GUI as well as with scripts like the one above. Also, in addition to these configurations, I make sure the server is brought up to the most current stable CU or Service Pack. Everyone’s environment is different; my list may not be right for yours.

Master of None

Being a Lone DBA gives you exposure to so many facets of SQL Server. Since I am just one person, I get to work on replication, administration, security, business intelligence, disaster recovery, Reporting Services, Integration Services, Analysis Services, database design, development, performance… you name it, I get to dabble in it. However, being able to work on every facet also means I will never be a master of any of it, and that’s okay by me.

For a Type A personality like me, this is a hard thing to come to terms with. Over time I’ve learned to be fine with not knowing everything. I relish the fact that I get to do and experience MUCH more than most. Those who are not Lone DBAs have to divide and conquer, or are responsible for just a handful of areas (like security, DR, or change management). In our line of work, however, there is always a need for go-to experts. Through networking, I have gained several friends who have become my experts. I have an expert for things like PowerShell, database internals, storage, Availability Groups, T-SQL, and more. If I need expert knowledge on something, they are always willing to lend a hand. If you don’t have a network of go-to experts, whether you are a Lone DBA or not, I strongly suggest you start building those relationships.

That being said, I will never be one of those go-to experts. However, if someone asks whether I have ever done something or hit a particular issue, in most cases the answer is yes. How do I accomplish that? By building a broad skill set. I self-teach by dabbling in things, and I am not afraid of trial and error. I learn all the SQL tools I can and use them where appropriate. I attend as many SQL training events as I can manage, and I am always trying to further diversify my knowledge base. I attend my user group meetings (and now run them), join virtual training sessions, watch 24HOP sessions, get the Summit sessions on USB every year to watch when I have time, and attend SQL Saturdays. All of these avenues are great ways to further my knowledge base.

The most important tip I can give is to learn just what you need to get most jobs done and not try to master everything. It’s okay to be a master of none; revel in it and embrace the fact that you get to work on so many things. It will make you very marketable; not many of us are given that opportunity.

Admit You Can’t Do Everything

As most of you know, I have been a Lone DBA for 15+ years, and during that time I have learned a thing or two about surviving on my own in relatively large environments. One of those things is knowing when to admit you cannot do it all. Working alone on 56 servers, you can imagine how the workload can seem insurmountable; there are weeks when I complete 70+ tasks, not including project work and daily monitoring. To get this kind of workload accomplished, you have to learn to work smarter, not harder. That’s when you have to enlist help and hire consultants.

But I am Afraid

Many people think that hiring consultants is admitting you are incapable of doing your job. Some think that hiring consultants opens the door for the company to decide it may not need an in-house DBA and could just hire a consulting company to do the work. At the last company I worked for, employees frowned and complained every time a consultant was brought in for anything; some even refused to share knowledge, hoping to somehow protect their jobs. I think this is nonsense. You shouldn’t worry about being replaced by consultants. A consultant has only superficial knowledge of the company. You are the one who knows the whys and hows and understands the needs of the business; the consultants don’t. Don’t let it scare you.

Free Up Your Time

The biggest opponent I contend with as a Lone DBA is time. I have none; every minute of my workday is used. My world is all about prioritizing what needs to get done, and sometimes there is just not enough time in the day. Hiring a consultant doesn’t mean you can’t do the work; it means you are managing your workload.

I hire consultants from time to time to clear my plate and cover some of the workload so I can focus on higher priorities. At times I use them for the routine, repetitive admin work, little things that add up to a lot of time in a week. On other occasions, I admittedly give them stuff I don’t want to do or am tired of doing (though if you know me, I never really get “tired” of doing anything DBA related; I am just making a point). I also give them the big projects that take too much time. Time is invaluable. For example, I may need to build a new cube, which, as you may know, takes a lot of time. I know how to build and design cubes, but why should I spend hundreds of hours on that when I can farm it out?

Do You Want to Take Vacation Ever?

Vacation, what’s that? Most DBAs can take vacation without having to work, because there is someone to cover and share their responsibilities. When it’s just you, you take work on vacation with you. One of the best benefits of hiring a consultant or a DBA service is being able to leave that workload at work and take a real vacation. It took me years to realize this; I took my first vacation without work earlier this year, and it was wonderful to hand over the reins for a week and not have to worry about it.

Gotchas

However, there are a few gotchas to admitting you can’t do everything and hiring a consultant. One of the main ones for me is giving up what I like to do. I love the core DBA work; turning it over to someone else is not easy for me, and relinquishing it can be very tough. Another gotcha is the time spent hand-holding the consultant. Consultants do not know the ins and outs of your environment, and getting them started on a project can take time away from you, but in the end it’s worth it.

Embrace It

The moral of the story: it’s hugely important to admit to yourself that you can’t do it all. It took me years to realize that I don’t have to. If you are juggling the workload of many when you are just one, consider hiring help. You’ll thank me for it.