Thursday, December 22, 2011
I don't think I'm alone in having wished time and again for a good SQL formatting option from within SSMS. I've posted about using Oracle's SQL Developer as a decent SQL formatting tool, but that's a second-rate option, given that it doesn't understand all of TSQL's unique syntax.
I stumbled across a better option yesterday, one that (a) is free and (b) integrates nicely within SSMS.
Wednesday, August 17, 2011
Estimate recent activity in a SQL Server Database
It's a common problem SQL Server (especially) DBAs have: how do I find out when the last time this database was used? SQL Server is so prone to sprawl, and so many applications install so many databases; it's hard to keep track of what is in use and what is not. This is especially true when you're new on the job.
Here we'll look at a quick-and-dirty method for getting a guess at whether a database has been used. This comes with a lot of caveats and cautions, but if you're looking for some kind of evidence that a system has been used, index usage stats are one place to look.
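As a rough illustration of the idea (a generic sketch, not necessarily the exact query from the post), something like the following summarizes the most recent recorded read and write activity per database from sys.dm_db_index_usage_stats. Keep the big caveat in mind: this DMV is cleared every time the instance restarts, so it only reflects activity since the last restart.
-- Most recent recorded activity per database since the last instance restart.
-- Illustrative sketch: it can't see activity from before the last restart,
-- and some access paths never touch index usage stats at all.
SELECT DB_NAME(database_id) AS database_name,
       MAX(last_user_seek)   AS last_seek,
       MAX(last_user_scan)   AS last_scan,
       MAX(last_user_lookup) AS last_lookup,
       MAX(last_user_update) AS last_update
FROM sys.dm_db_index_usage_stats
GROUP BY database_id
ORDER BY database_name;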
Labels: SQL Server
Monday, April 4, 2011
Keeping a Minimum Number of SQL Server Backups Online using a SQL Agent Job
First, a plug: there's a great "maintenance solution" that is a collection of stored procedures available at http://ola.hallengren.com/. This is the basis for a lot of my database maintenance jobs. If you haven’t taken a look at this, I highly recommend that you do; it’s free, under active maintenance, robust, and easy to implement.
When you run the script from that site, you get a variety of stored procedures and you can have it create SQL Agent jobs for you, as well. I’d recommend doing that, as it can give you a good idea of how the stored procedures work.
The backup job
When we perform a backup in a SQL Server instance, we want to perform a number of tasks:
- Delete old backups
- Backup all of the current databases
- Zip up the backup files
- Copy the zipped backup file(s) to a share on a backup server
I prefer a retention policy based on redundancy: I want to keep at least n copies online at all times, regardless of how old they are. It’s true that a combination of the two policies would be the best-case scenario: keep five days’ worth of backups online, and make sure that we never have fewer than five backup files available at any given time. This would allow us to, for instance, have five backups run in a single day without deleting the older backups that we also want to have available.
But I’ve gone for the simpler route in this case: I want five backup files online at all times. SQL Server doesn’t give us an easy built-in way to do this, so we’ll turn to PowerShell for our process.
Delete old backups
Our first step in the job is to delete the backups that aren't used anymore. In subsequent steps, we turn off the archive bit on the files we no longer need, so here we simply delete the files that don't have the archive bit set. The PowerShell script is below:
$backup_dir = "path:\to\Backup\dir"
$files = get-childitem -path $backup_dir
# we'll delete all files that don't have the archive bit set
if ($files -ne $null)   # only do the delete if the backup dir exists and contains files
{
    foreach ($file in $files)
    {
        if ((Get-ItemProperty -Path $file.fullname).attributes -band [io.fileattributes]::archive)
        {
            write-output "$file is set to be retained"
        }
        else
        {
            write-output "$file does not have the archive bit set. Deleting."
            remove-item -recurse $file.fullname
        }
    }   # end foreach
}   # end if
Note that the $backup_dir variable needs to be set to the correct directory for the backups. Any file or directory in the backup directory that does not have the archive bit set will be removed, so do pay attention to this fact. You can put a file mask in the get-childitem call to modify that behavior, if you choose.
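For example (an illustrative variation, not part of the job as written), you could limit the deletion pass to just the zipped backup files:
# Hypothetical variation: only consider the zipped backups when deciding what to delete,
# leaving anything else in the backup directory alone
$files = get-childitem -path $backup_dir -filter "SQLBACKUP*.zip"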
Run DatabaseBackup
DatabaseBackup is the name of the stored procedure (in the master DB) that backs up each of the databases on the instance. It is installed as a part of the maintenance solution referenced at the beginning of the page. Usage is as follows:
EXECUTE [dbo].[DatabaseBackup]
@Databases = 'USER_DATABASES',
@Directory = @backup_dir,
@BackupType = 'FULL',
@Verify = 'Y',
@CleanupTime = 24,
@CheckSum = 'Y'
@Databases can be one of:
- 'USER_DATABASES' backs up all user databases
- 'SYSTEM_DATABASES' backs up all system databases (master, model, msdb)
@BackupType can be one of:
- 'FULL' performs a full backup of the database data files
- 'LOG' backs up the transaction log files
- 'DIFF' creates a differential backup from the last full backup
Here's the T-SQL we use in the backup job step:
-- change the backup directory/drive appropriately
declare @backup_dir varchar(100) ='path:\to\backup\dir'
EXECUTE [dbo].[DatabaseBackup]
@Databases = 'SYSTEM_DATABASES',
@Directory = @backup_dir,
@BackupType = 'FULL',
@Verify = 'Y'
EXECUTE [dbo].[DatabaseBackup]
@Databases = 'USER_DATABASES',
@Directory = @backup_dir,
@BackupType = 'FULL',
@Verify = 'Y'
EXECUTE [dbo].[DatabaseBackup]
@Databases = 'USER_DATABASES',
@Directory = @backup_dir,
@BackupType = 'LOG',
@Verify = 'Y'
The verify switch is quite nice: after each backup, it runs a RESTORE VERIFYONLY FROM DISK = ... to ensure that each file is recoverable.
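Roughly speaking (the path below is just a placeholder, not the exact statement the procedure builds), the check it runs against each backup file looks like this:
-- Illustrative: what @Verify = 'Y' does, conceptually, for each backup file
RESTORE VERIFYONLY FROM DISK = N'path:\to\backup\dir\SERVERNAME\SomeDatabase\SomeDatabase_FULL.bak';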
Note: the DatabaseBackup stored procedure appends \computername to the backup directory by default, but if you're using a named instance, it uses \computername$instancename, instead. This makes sense, because you can have multiple instances on a server. If you're using a named instance, you'll need to account for this in the scripts below.
Much thanks to Porro for pointing this out in the comments below!
Basically, we’re backing up all of the system databases, all of the user databases, and then the t-log files from all of the user databases, and we’re saving those backups to a directory structure at the @backup_dir variable location we specified at the beginning.
Note that this stored procedure puts its files in a directory structure starting in @backup_dir. The start of this structure is the server name, with directories under it for each database.
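For a default instance, the layout ends up looking something like this (the FULL/LOG subfolders and file names shown here are just illustrative of the pattern, not copied from the post):
path:\to\backup\dir\
    SERVERNAME\
        master\
            FULL\    (full backups of master)
        MyAppDb\
            FULL\    (full backups of MyAppDb)
            LOG\     (transaction log backups of MyAppDb)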
Zip the backup files
We don’t want to keep the uncompressed backups online all the time, so we’ll compress them using 7-Zip.
This PowerShell script is more complicated, so we'll go through it in some more detail. Note that it needs the 7-Zip executable (and associated .dll); the script will look for it in the c:\utils directory. You can copy just 7z.exe and 7z.dll to a directory; you don’t have to install the entire package in order to use the 7-Zip command line.
First, here's the whole of our PowerShell script:
$backup_dir = "path:\to\backup\dir"
$day = get-date -format "yyyyMMdd_HHmm"
# Turn on the archive bit on the current backups directory
# (so it won't get deleted at the next run if the zip process fails)
attrib $backup_dir\$env:computername +a
# Zip up the current backup(s)
# destination for the zip file is $backup_dir\SQLBACKUP-<servername>-DATE_TIME.zip
C:\utils\7z.exe -tzip -mx1 a $backup_dir\SQLBACKUP-$env:computername-$day.zip $backup_dir\$env:computername
# if 7zip succeeded, we'll continue
if ($LASTEXITCODE -gt 0)
{
    Throw "7Zip failed"
}
else
{
    # When the zip is complete, turn off the archive bit on the current backup directory
    attrib $backup_dir\$env:computername -a
    # Now let's change the archive bit, such that only
    # the last five zipped backups will be kept online
    $delfiles = 0
    $delfiles = @(dir $backup_dir\SQLBACKUP*.zip).count - 5
    if ($delfiles -gt 0)
    {
        # If there are more than 5 zipped backups, turn off the archive bit on the oldest ones
        dir $backup_dir\SQLBACKUP*.zip | sort-object -property CreationTime |
            select-object -first $delfiles |
            foreach-object { attrib $_.FullName -A }
    }
}
So. The first line sets the backup directory to use. Next, we set a variable to hold today's date and time (for example, 20110404_1530) to use in creating the zip file.
Next we turn on the archive bit for the directory created during the previous step, and then we run 7z.exe to create the zip file. All pretty straightforward up to this point, though do note that we're using the -mx1 switch in 7-Zip. This is important because 7-Zip is optimized for compression, not for speed. Using the -mx1 switch tells 7-Zip to use its fastest (and least CPU-intensive) compression routines. Especially for large files, this is really important.
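For reference, the -mx switch accepts values from 0 (store only, no compression) up to 9 (maximum compression). The paths and archive names below are placeholders, just to show the trade-off:
# Illustrative comparison of 7-Zip compression levels (placeholder paths)
C:\utils\7z.exe -tzip -mx1 a D:\backups\fast.zip D:\backups\MYSERVER       # fastest, least CPU
C:\utils\7z.exe -tzip -mx9 a D:\backups\smallest.zip D:\backups\MYSERVER   # smallest file, slowest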
Our next step is to check to make sure that 7zip succeeded. We do that with the $LASTEXITCODE variable:
if ($LASTEXITCODE -gt 0) {Throw "7Zip failed" }
This says: if 7-Zip failed (returning an error code that is greater than zero), end (throw) with the failure text "7Zip failed". If the exit code is zero, then we know 7-Zip succeeded, and we'll continue.
The next step is to turn off the archive bit on the directory we just zipped up; that way it'll be deleted when the job runs next.
We also want to keep the five most recent backups online on the server. We don't want to simply delete files that are older than five days, though: if the backup job had been failing for the last few days, there would be no backups from days 1-4, and an age-based purge would suddenly delete the only backups we have left. So we sort the files by date and, if there are more than five files in the directory, we take the oldest ones (files 6 through n) and turn off the archive bit on them. That way, those files will be deleted the next time the job runs.
This is the code that does this (thanks, BTW, to Spiceworks for the example script on which this is based):
$delfiles = 0
$delfiles = @(dir $backup_dir\SQLBACKUP*.zip).count - 5
if ($delfiles -gt 0)
{
    # If there are more than 5 zipped backups, turn off the archive bit on the oldest ones
    dir $backup_dir\SQLBACKUP*.zip | sort-object -property CreationTime |
        select-object -first $delfiles |
        foreach-object { attrib $_.FullName -A }
}
What this does is the following:
- Count the number of .zip files in the backup directory
- If the number of files in the backup directory is > 5, then:
- Sort the directory (SQLBACKUP*) by creation time (oldest first)
- Take the first n files in the sorted list (where n is the number of files beyond the five we want to keep) and turn off the archive attribute on them.
Copy files to the backup server
Finally we'll copy the files to the backup server:
# Make sure you change the backup directory appropriately
$backup_dir= "path:\to\backup\dir"
$day= get-date -format "yyyyMMdd_"
# This will copy all of today's backups to the backup server
copy-item $backup_dir\SQLBACKUP-$env:computername-$day*.zip \\server\sharename -force
Note that the SQL Agent service account needs to have access to the share in order for this to succeed. Note, too, that this job will copy all files from today, so if there were multiple runs today, all of those files will get copied again (and overwritten; that's why we need the -force switch).
So now we have a backup job that keeps backups around based not on age but on the number of copies. When you put the scripts above together in a SQL Agent job, the steps look something like this:
- Delete old backups (PowerShell)
- Run DatabaseBackup for the system databases, user databases, and transaction logs (T-SQL)
- Zip the backup files (PowerShell)
- Copy the zipped backups to the backup server (PowerShell)
Labels: Powershell, SQL Server
Wednesday, March 2, 2011
Formatting SQL Server TSQL with Oracle's SQL Developer
Note that in the time since I wrote this post, another (much better) option for formatting SQL Server script has come up.
If you do much troubleshooting on Microsoft SQL Server, you inevitably will end up having to deal with a poorly-formatted (and hard-to-read) SQL statement from a query using sys.dm_exec_sql_text or the like.
There are lots of online formatters out there, though I've had decidedly mixed results with them. There also are a lot of add-in and standalone products available that will do a good job at this. Here's another one to add to your list, until SSMS includes a formatting feature: Oracle's SQL Developer.
SQL Developer is a free download (here) that will, in fact, connect to SQL Server instances. While I do not use it for my day-to-day SQL Server administration tasks, I use it regularly to reformat SQL that I've pulled from the DMVs.
Friday, February 25, 2011
How to Change the Owner of All SQL Agent Jobs in a SQL Server Instance
Each job in a SQL Server instance has an owner, and you may run into a situation in which that owner needs to be changed.
If there are a lot of jobs that were created by that owner, this can be a tedious task.
Here we’re opening a cursor and looping through the SQL Agent Jobs in the instance that are owned by the old user (@olduser) and executing the sp_update_job stored procedure to change that to match @newuser.
USE MSDB
GO
declare @jobname varchar (200)
declare @oldusername varchar (30)
declare @newusername varchar (30)
set @oldusername='DOMAIN\oldusername'
set @newusername='DOMAIN\newusername'
declare cur_jobname cursor LOCAL
for select name
from sysjobs where suser_sname(sysjobs.owner_sid) = @oldusername
open cur_jobname
fetch next from cur_jobname
into @jobname
While @@FETCH_STATUS = 0
begin
EXEC msdb.dbo.sp_update_job @job_name=@jobname, @owner_login_name=@newusername
fetch next from cur_jobname
into @jobname
end
close cur_jobname
deallocate cur_jobname
GO
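If you want to see which jobs would be affected before making any changes, a quick check like this (illustrative, using the same ownership lookup as the cursor above) does the trick:
-- List the jobs currently owned by the old login
SELECT name
FROM msdb.dbo.sysjobs
WHERE suser_sname(owner_sid) = 'DOMAIN\oldusername';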
Labels: SQL Server
How to Change the Owner of All Databases in a SQL Server Instance
Each database in a SQL Server instance has an owner, and you may run into a situation in which that owner needs to be changed. One example of this would be a case when a DBA moves departments, but stays in the organization. In that case, the account would still be active, but you’d probably want to change the database owner.
If there are a lot of databases that were created by that owner, this can be a tedious task.
Here, we’re opening a cursor and looping through the databases in the instance that are owned by the old user (@olduser) and executing the sp_changedbowner stored procedure to change that to match @newuser.
USE MASTER
GO
declare @dbname varchar (50)
declare @oldowner varchar (30)
declare @newowner varchar (30)
declare @sql varchar (300)
set @oldowner='DOMAIN\oldusername'
set @newowner='DOMAIN\newusername'
SET @sql=''
declare cur_dbname cursor LOCAL
for SELECT name
FROM master.sys.databases where SUSER_SNAME(owner_sid)=@oldowner
open cur_dbname
fetch next from cur_dbname
into @dbname
While @@FETCH_STATUS = 0
begin
set @sql='exec ['+@dbname+'].sys.sp_changedbowner ''' + @newowner + ''''
-- PRINT @sql
EXEC (@sql)
fetch next from cur_dbname
into @dbname
end
close cur_dbname
deallocate cur_dbname
GO
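As an aside (not part of the script above): on newer versions of SQL Server, ALTER AUTHORIZATION is the documented replacement for sp_changedbowner, so a one-off change for a single database would look something like this (the database name and login below are placeholders):
-- Illustrative one-off equivalent of sp_changedbowner
ALTER AUTHORIZATION ON DATABASE::[SomeDatabase] TO [DOMAIN\newusername];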
Labels: SQL Server