Why you have to be on top of your game

I just heard from a DBA at my last gig who was keeping me in the loop about things I had done while I was there.
He said they had to take away the two 1TB LUNs I'd had them put on our DB server because they were causing performance problems. The SAN guys had warned us that LUNs that big would hurt us, and I guess they were right. Well, not really. That was the report I got back from the DBA, so not my words. When I grilled him about it, he admitted that each of those LUNs had been carved from a single 1TB spindle instead of several drives in the array.
So let me get this right… I asked for 1TB LUNs, and instead of doing it right you gave me a single drive and used it as justification for showing me you know more about disks than I do? I find that incredible.

So guys, yeah… you really have to be on top of your game and know your stuff so that when they pull stuff like this you can call them on it. And I must admit I'm not sure it would have even crossed my mind to check that it was a single spindle. Why would it? These are enterprise SAN guys… they know their way around and should definitely know better. What I didn't realize was that they were into playing games. It just goes to show that you can try as hard as you like to be effective in some places and they just won't let you.

Database Science kicks in

OK, I’m a DBA down this week because one of my guys ‘resigned’… but I’m making time to blog because honestly I need a short break this morning.
I'd like to talk about database science today and why it beats burying yourself in application knowledge.

For starters, concentrating on the DB is a skill you can carry over to the next place you go. It’s highly doubtful that you’ll be supporting the same apps at your new gig. So it really shouldn’t matter to you whether the app counts warts on a giant’s butt or processes orders for fake vomit. A DB is a DB.

At no time has that rung truer than at my current gig. And that's one thing the DBA I just lost couldn't come to grips with. He wanted the apps he supported and nothing else. He thought you had to know the app team and the product intimately in order to support it well, and that's just not the case. In fact, I honestly believe it can hurt you. It doesn't have to, but it can. If you get too close to an app you start making assumptions about performance and usage, and you stop relying on numbers. And you should always investigate an issue before applying a known fix. Even doctors in the heart of Africa, in the middle of a TB outbreak, still run the test when someone comes in with TB symptoms, because it might not be TB and they could really hurt the patient by giving them the wrong medicine. Of course, a doctor could say, I'm in the middle of TB country and everyone around me has TB, so this guy must have it too. But he doesn't, because he knows to rely on the numbers, not on what he's become familiar with. So yeah, getting familiar with the problems an app has can speed you to a solution, but never stop confirming before you do anything.

So this has helped today in another way. By not getting too close to my apps I'm able to step right in and pick up where he left off. I don't have to know the nature of the app or how the app team likes to do things to set up a good backup routine. I don't have to know what the app does to diagnose a blocked spid, etc. It's all just DB work.

I don’t know why some people think that you have to be intimate with an app in order to support it. There can be some mild benefit sometimes, but in general I find it clouds my thinking.

A sad goodbye to Database Underground

For many years now I've considered InfoWorld Magazine my home. I've blogged and raised many issues that have gotten both good and bad responses from the database community. However, as of very recently InfoWorld has pulled my access and cancelled my blog of several years because they didn't think I was writing often enough. And while it's true that I could write more often, quite often when I went to blog something the editor would reject it because it didn't fit the image he was trying to mold InfoWorld into. That 'image' is not the image that made InfoWorld what it was, and it's not the image that built my blog as big as it was.

This kind of treatment for someone who has worked hard for you for many years is just ridiculous.

So Database Underground is officially dead.  The blog may stay online for a little while, but there’ll be no more updates.  I’m told by a fairly reliable source that the guy who killed it is likely to delete all the content.  I don’t understand thinking like that.  Why wouldn’t you want to keep the content that has gotten so many hits over the years?

Anyway, I'm currently in talks with another publication to move my blog over there. I'll let you guys know when that happens.

Oh, and if you want to write a nastygram to the “admin” at InfoWorld who thinks good content should come down just because he wants it to then you can send as many emails as you like to Galen_Gruman@infoworld.com.

24-hrs of PASS is Online

If you haven’t gotten your email yet, or if it’s just gone into your spam, the 24-hrs of PASS recordings are ready.

You'll find the main page here (it'll make you log in): http://www.sqlpass.org/LearningCenter/24Hours.aspx

And don’t forget to look specifically for my session here: http://passfiles.sqlpass.org/uploads/24Hours/05_20_2010_05AM_downloadHFP/Engine/Default.htm?http%3A%2F%2Fpassfiles.sqlpass.org%2Fuploads%2F24Hours%2F05_20_2010_05AM_downloadHFP%2F

If the link above doesn’t work then mine is Session #18.

Exciting News

Hey guys I’ve got some exciting news!! As it turns out you can use powershell to admin your servers.  That’s right!

And to prove it, here’s a nice little script I wrote this morning to change the service acct and passwords on all the boxes I wanted.

Actually, I'm just going to show you the single-box version, but you all know how easy it is to turn this into a multi-box script. Here, it's the method that's important.

I’ll show you 2 ways to do it and I think they’re both equal really.  It just depends on your preference.

# Grab the MSDTC service through WMI; -filter does the filtering on the server side
$service = gwmi win32_service -filter "name='MSDTC'"
# Change() takes its params in a fixed order: 6 placeholders, then the new acct and password
$service | %{$_.Change($null,$null,$null,$null,$null,$null,"NewUserAcct","NewPassword")}


That’s it.  That’s all you need to change the user acct and password for a service.  But we’ve still got a few things to talk about so you understand everything that’s happening here.

  1. If you want to use this against a remote box, then just use the -computername parameter (there's a quick multi-box sketch right after this list).
  2. Notice I used the -filter param also.  This filters the results on the server itself so you don't pull all the services across the wire and then filter them on your end.  That slower method would look like this: $service = gwmi win32_service | ?{$_.Name -eq "MSDTC"}
  3. The Change() method takes a fixed number of params so you have to put in all the params at least as placeholders.  That’s why all the $null values are in there.
  4. I’ve seen other methods for doing this where they name the params and don’t actually pass in all of them physically, but I haven’t had any luck with these.
  5. Here's a site that explains all the other params and the error codes you can get when using this method.  http://www.scriptinternals.de/new/us/Support/Internal/WMI_Win32_Service_Change.htm
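
Just so you can see how little changes for the multi-box version, here's a rough sketch. Treat it as an assumption-heavy example: the server list file, the account, and the password are placeholders for whatever your shop uses.

$servers = Get-Content "C:\scripts\servers.txt"   # hypothetical file with one box name per line

foreach ($server in $servers)
{
    # -computername points the WMI query at each box in turn; -filter still does the work server-side
    $service = gwmi win32_service -computername $server -filter "name='MSDTC'"

    # same Change() call as above: 6 placeholders, then the new acct and password
    $result = $service.Change($null,$null,$null,$null,$null,$null,"NewUserAcct","NewPassword")

    # 0 means it worked; anything else maps to the error codes on the site in #5 above
    "$server returned code $($result.ReturnValue)"
}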

Ok, I know I promised you 2 ways to do this but I’m out of time so this one will have to do.  The other one uses the same method, it’s just a different way to script it so you’re not really losing anything.

Good Press is Awesome

I know Jen's already blogged on this, but this is my little contribution to the whole 'how cool are we' theme.

Brad McGehee just came to speak at our user group and, to my surprise, he thought enough of what we're doing to write a blog post about it.

So I thought I’d take this time just to explain a little more so maybe some more of you can start doing it at your user groups too.

I started doing this last year when it really started bothering me that the user group model is so limited.  The problem is that you bring in a speaker to give a session on a topic, and next month you do the same thing with a different speaker and a different topic.  That means you never have time to go deeper into any of the topics.  You only barely touch on a subject and there's no time to really learn it.  So I decided to start up a progressive class before the user group.  It lasts like 6-8 months and each session builds off of the previous ones.  The downside is that it takes you 6 months to complete the class, but it's better than nothing and it's a free class.  I actually treat it like a real class you would go to at a training center, only I like to think I sometimes go into more depth.

It’s a really good model because if you hold it before or after your main user group meeting the people are coming anyway so you’ve got a built-in audience.  And you’d be surprised how many people show up to the group that never did before.  So anyway, that’s all I’ve got on that.  I hope some more of you start doing this and if you do I’d like to hear how it works for you.  And I’d like to also give a special shout-out to Brad for recognizing my genius.

Petri for us all

The Petri IT KnowledgeBase just opened up its SQL section and Jen and I are the authors.  I know, Jen wrote about this too… what can I say, I have nothing original so I have to steal others' material.  Anyway, it's really a nice site they've put together for us so come check it out.  Currently we've got some nice barebones "I know nothing about SQL" articles, so if you're a complete newbie you can start here and we'll explain what all this stuff actually means.

I've also got a couple of techie blogs planned, so I'll get back on the horse here in a bit and start giving you some real content again.

Data Explosion

Hey everybody… it’s been a while since I’ve blogged, but that’s what starting a new gig will do to you. I’ve had so many new things to rant about here and I’m sure I’ll be getting to those in the future, but right now I want to share a funny story with you.

We had a group of devs who were trying to run a fairly simple insert and it kept filling up the disk. And when I say it was filling up the disk, I mean 50K rows were filling up something like 200GB. So they came to me to see if I could fix it. This is where it gets fun.

It really didn't take me too long in my investigation to find that they had the data file set to autogrow by 131,000%. That's right, I said it! 131,000 percent. So now that I'd found the problem, I was able to set it to something more reasonable, shrink the file, and let them get on with their insert.

So it got me thinking about what other DBs in the place had similar issues, because I've been hearing about disk space problems here and there. So I wrote a powershell script to go out and check the file growth rates on every SQL box, and I found a lot of the same kind of stuff. There were a lot of them set for several thousand percent growth, several set for 1MB growth, and everything in between. In general, as a good generic setting to start from, I like to grow my files 1GB at a time. It's a good round number that works in a lot of cases, and then you can snipe the ones that need something else. So I altered the powershell to go out and change the growth rates of all the DB files out there to 1GB. Life is good again and we have the beginnings of a happy shiny environment.
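
I'm not going to post the exact script I used, but here's a stripped-down sketch of the idea using SMO. The instance list path is made up, it only touches data files (log files live under LogFiles), and you'd obviously want to run it in report-only mode first before letting it change anything.

[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")

$instances = Get-Content "C:\scripts\instances.txt"   # hypothetical list of SQL instances

foreach ($instance in $instances)
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $instance

    # skip the system DBs; they don't need fleet-wide growth settings
    foreach ($db in ($srv.Databases | ?{!$_.IsSystemObject}))
    {
        foreach ($fg in $db.FileGroups)
        {
            foreach ($file in $fg.Files)
            {
                # report what it's set to now, then switch to a fixed 1GB growth (SMO takes KB)
                "$instance  $($db.Name)  $($file.Name)  $($file.GrowthType)  $($file.Growth)"
                $file.GrowthType = "KB"
                $file.Growth = 1048576
                $file.Alter()
            }
        }
    }
}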

Oh yeah, and in the same discovery I also found 110 DBs set to autoshrink, and I took care of those too.
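
And in case you're wondering, killing autoshrink in bulk is the same kind of trick. Here's a rough sketch against a single instance; the instance name is made up and it assumes the same SMO assembly as above is loaded.

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server "MyServer"   # hypothetical instance name

# flip AutoShrink off for every user DB that has it on, then persist with Alter()
$srv.Databases | ?{!$_.IsSystemObject -and $_.DatabaseOptions.AutoShrink} | %{
    $_.DatabaseOptions.AutoShrink = $false
    $_.Alter()
}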

Powershell 2.0 Get-Member Enhancements

The other day I blogged on a cool way to alter database file properties.  And I promised I would film the solution for you.  Well, I got home that night and started filming but when I pulled up sqlps I was stopped dead in my tracks by something I couldn’t explain.  The methods to get and set were missing.  I checked my other box and they were there, so what happened?  The only difference was that the box I wrote the blog from was XP and the box I was filming from was Win7.

So I pinged a couple of really smart guys at MS (and I cc'd Buck).  After a couple of days we have an answer.  There's a new feature in powershell 2.0 that hides the getter and setter methods from you by default in get-member.  They're officially calling it an enhancement.

There are 2 new parameters that get-member supports that allow you to control what you see.  You can now use -view and -force.

-view takes the values Extended, Adapted, Base, and All.

-force doesn't take any additional parameters.

However if you're looking specifically for getter and setter methods, you'll have to use -force because -view doesn't display them.  Here's what that'll look like:

dir | gm -force

Now you can see everything.  What's interesting though is that by default, PS still loads all the methods; it just doesn't show them to you.
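
If you want to prove it to yourself, a quick sanity check is to look for the getters directly. This is just a throwaway snippet, nothing official:

dir | gm | ?{$_.Name -like "get_*"}          # default view: nothing comes back
dir | gm -force | ?{$_.Name -like "get_*"}   # -force: get_Length, get_FullName, etc.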

Here’s the MSDN blog that explains this and so much more.

And I’d like to give a special thanks to Robert Hutchinson at MS for chasing this down for me while Buck and I sat on the sidelines helpless.

Powershell to Alter Database File Properties

Man, I still just get so jazzed when I work with powershell. Every time I need to do something long and hard (that was for you, Buck), I look in powershell and it lets me just lay back and enjoy the long and hard.

So this time we've got like 85 DB files for our main DB. I know that's a lot, but it's been spread across so many LUNs for so long and it's just grown and grown as it's filled up the LUNs. Now we've finally got a couple of really large LUNs and we're trying to consolidate the files. And now that we've moved most of them to a single LUN, we want to stop the autogrowth on most of them so they don't grow anymore before we empty them and then drop them. Because what's the point, really, of putting them all on a single LUN if you're not going to get rid of some of them? So we definitely don't want them to grow anymore before we can empty them.

The task at hand was to set all of these files (most of them really) to not autogrow. The choices were do it in the GUI, or write a T-SQL cursor to go through all the files and create an alter database command in dynamic SQL for each one. Neither of those is very attractive considering that there are so many files. Of the 2 of them though, the cursor is the more attractive solution. Then I thought of powershell. I hadn’t specifically done anything like this in powershell before, but I figured I’d be able to get some love. I’m not going to teach you powershell here, or explain the script too much. I’m just gonna give you the code and if you have any questions you’re free to write, but it’s pretty straightforward. I’ll be making a video on this tonight for those of you who like to see things in action.

PS SQLSERVER:\SQL\MYSQLBox\DEFAULT\Databases\MyDB\filegroups\data\files> gci | where{$_.Name -ne "MyDB_Data"} | %{$_.set_GrowthType("None"); $_.Alter();}

OK, so a quick glance at the important parts.

1. Make sure you’re in the right node. Notice I’m in data\files.
2. Here I'm filtering out the one file I don't want to be affected. I still want the main data file to be able to grow. I'll talk about that more in a min.
3. Don't forget the Alter() at the end. Usually whenever an object has an Alter() method it wants you to use it. If you don't, it'll change the object in cache in your powershell session, but you won't see the change persisted to the DB. So you're not really making the change to the object in SQL, just in your powershell pipeline. You can tell if something has an Alter() method by doing 'gci | gm' at the cmdline. If there's an Alter() method there, then use it.

OK, you don’t have to filter. If you find it too much trouble or if your files are named in such a way that it would be too hard to snipe a single file for some reason, then you can change them all and then just go back to the GUI to change the last one back to what it was.

So there you go. We have a nice 1-line script to alter as many files as you need. And you can clearly tack as many file changes onto it as you like; there's a quick sketch of that below. And the best part is that the code doesn't really change if you need to do another file operation. All you do is change the method call. Ahhh, the power of powershell.
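
And just so you can see what I mean about tacking on more changes, here's the same pipeline with a second property thrown in. The MaxSize bit is just an illustration (in SMO it's in KB and -1 means unlimited), so swap in whatever property you actually need:

gci | where{$_.Name -ne "MyDB_Data"} | %{
    $_.set_GrowthType("None")   # stop autogrowth, same as before
    $_.set_MaxSize(-1)          # hypothetical second change: -1 = unlimited max size
    $_.Alter()                  # one Alter() persists both changes to the DB
}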

Instead of working, I blog.