
Backup disk space

Backups are, by their nature, going to eat up disk space whenever they run. Most of
the backup routines we build involve full backups (once or maybe twice a week), differential
backups (daily on non-full-backup days), and/or transaction log backups, which run frequently
and back up changes since the last log backup (the basic syntax for each type is sketched after
the list below). Regardless of the specifics of your environment, there are a few generalizations
that we can make:

1. Data will get larger over time, and hence backups will increase in size.
2. Anything that causes significant data change will also cause transaction log backup sizes
to increase.
3. If a backup target is shared with other applications, then those applications could
potentially interfere with backups or use up space.
4. The more time that has passed since the last differential/transaction log backup, the
larger they will be and the longer they will take.
5. If cleanup of the target backup drive does not occur regularly, it will eventually fill up,
causing backup failures.
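
For reference, here is the basic syntax for each of the three backup types discussed above. This is
a minimal sketch; the database name and file paths are placeholders to adjust for your environment:

-- Full backup: a complete copy of the database.
BACKUP DATABASE AdventureWorks2014
TO DISK = 'C:\SQLBackups\AdventureWorks2014_Full.bak';

-- Differential backup: only the extents changed since the last full backup.
BACKUP DATABASE AdventureWorks2014
TO DISK = 'C:\SQLBackups\AdventureWorks2014_Diff.bak'
WITH DIFFERENTIAL;

-- Transaction log backup: log records written since the last log backup.
-- Requires the FULL or BULK_LOGGED recovery model.
BACKUP LOG AdventureWorks2014
TO DISK = 'C:\SQLBackups\AdventureWorks2014_Log.trn';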

Each of these situations lends itself to possible solutions, such as not sharing the backup
drive with other programs, or testing log growth in release scripts prior to the final production
deployment. While we can mitigate risk, the potential always exists for drives to fill up. If they do,
then all further backups will fail, leaving holes in the backup record that could prove detrimental
in the event of a disaster or a backup data request.
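
One way to put numbers behind these generalizations is to look at what SQL Server already
records: every backup taken writes a row to msdb.dbo.backupset, including its size. The following
is a minimal sketch that reviews the last 30 days of backups, largest first; adjust the date range
as needed:

-- Review recent backup sizes per database.
-- Type codes: D = full, I = differential, L = transaction log.
SELECT
    backupset.database_name,
    backupset.type,
    backupset.backup_finish_date,
    CAST(backupset.backup_size / (1024.0 * 1024 * 1024) AS DECIMAL(10,2)) AS backup_size_gb
FROM msdb.dbo.backupset
WHERE backupset.backup_finish_date >= DATEADD(DAY, -30, CURRENT_TIMESTAMP)
ORDER BY backupset.backup_size DESC;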

As with log space, we can monitor backup size and usage in order to make intelligent decisions
about how a job should proceed. This can be managed from within a backup stored procedure
using xp_cmdshell, if usage of that system stored procedure is tolerated. Alternatively, PowerShell
can be used to monitor drive space. An alternative solution that I am particularly fond of is
to create a tiny unused (or minimally used) database on the server you're backing up and put its
data and log files on the backup drive. This allows you to use sys.dm_os_volume_stats to monitor
disk usage directly within the backup process without any security compromises.

For an example of this solution, we will use my local C drive as the backup drive and the F drive
as the target for all other database data files. Since our data files are on the F drive, we can easily
view the space available like this:

SELECT
    CAST(CAST(available_bytes AS DECIMAL) / (1024 * 1024 * 1024) AS BIGINT) AS gb_free
FROM sys.master_files AS f
CROSS APPLY sys.dm_os_volume_stats(f.database_id, f.file_id)
WHERE f.database_id = DB_ID()
AND f.type_desc = 'ROWS';


This returns the free space on the volume holding the data file of the database I am querying
from, in this case AdventureWorks2014. The result is exactly what I am looking for: with 14.5TB
free, we're in good shape for quite a while. How about our backup drive? If we are willing to use
xp_cmdshell, we can gather that information fairly easily:

DECLARE @results TABLE (output_data NVARCHAR(MAX));

INSERT INTO @results
    (output_data)
EXEC xp_cmdshell 'DIR C:';

SELECT
    *
FROM @results
WHERE output_data LIKE '%bytes free%';
The result of this query is a single row containing the directory count and the number of bytes
free on the drive.
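
Note that xp_cmdshell returns this figure embedded in a text string rather than as a number, so a
bit of string manipulation is needed before it can be compared against a threshold. The following
is a rough sketch that assumes the standard English DIR output format:

DECLARE @results TABLE (output_data NVARCHAR(MAX));
DECLARE @free_text NVARCHAR(MAX);
DECLARE @bytes_free BIGINT;

INSERT INTO @results
    (output_data)
EXEC xp_cmdshell 'DIR C:';

-- The target row looks like: "12 Dir(s)  165,836,636,160 bytes free".
SELECT @free_text = output_data
FROM @results
WHERE output_data LIKE '%bytes free%';

-- Extract the number between "Dir(s)" and "bytes free", then strip the commas.
SELECT @bytes_free = CAST(REPLACE(LTRIM(RTRIM(SUBSTRING(@free_text,
    CHARINDEX('Dir(s)', @free_text) + 6,
    CHARINDEX('bytes free', @free_text) - CHARINDEX('Dir(s)', @free_text) - 6))), ',', '') AS BIGINT);

SELECT @bytes_free / (1024 * 1024 * 1024) AS gb_free;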

Unfortunately, xp_cmdshell is a security hole, allowing direct access to the OS from SQL Server.
While some environments can tolerate its use, many cannot. As a result, let’s present an
alternative that may feel a bit like cheating at first, but provides better insight into disk space
without the need to enable any additional features:

CREATE DATABASE DBTest
ON
(   NAME = DBTest_Data,
    FILENAME = 'C:\SQLData\DBTest.mdf',
    SIZE = 10MB,
    MAXSIZE = 10MB,
    FILEGROWTH = 10MB)
LOG ON
(   NAME = DBTest_Log,
    FILENAME = 'C:\SQLData\DBTest.ldf',
    SIZE = 5MB,
    MAXSIZE = 5MB,
    FILEGROWTH = 5MB);

This creates a database called DBTest on my C drive, with some relatively small data and log file
sizes. If you plan on creating a more legitimate database to be used by any actual processes, then
adjust the file sizes and autogrow settings as needed. With a database on this drive, we can run
the DMV query from earlier and get free space on this drive:

USE DBTest;

SELECT
    CAST(CAST(available_bytes AS DECIMAL) / (1024 * 1024 * 1024) AS BIGINT) AS gb_free
FROM sys.master_files AS f
CROSS APPLY sys.dm_os_volume_stats(f.database_id, f.file_id)
WHERE f.database_id = DB_ID()
AND f.type_desc = 'ROWS';

The result is exactly what we were looking for earlier, with no need for any OS-level commands
via xp_cmdshell or PowerShell. I currently have 154GB free, and the only cost of this data was
the creation of a tiny database on the backup drive. With this tool in hand, we can look at a
simple backup stored procedure and add in logic to manage space while it is running:

USE AdventureWorks2014;
GO

IF EXISTS (SELECT * FROM sys.procedures WHERE procedures.name = 'full_backup_plan')
BEGIN
    DROP PROCEDURE dbo.full_backup_plan;
END
GO

CREATE PROCEDURE dbo.full_backup_plan
    @backup_location NVARCHAR(MAX) = 'C:\SQLBackups\' -- Default backup folder
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @current_time TIME = CAST(CURRENT_TIMESTAMP AS TIME);
    DECLARE @current_day TINYINT = DATEPART(DW, CURRENT_TIMESTAMP);
    DECLARE @datetime_string NVARCHAR(MAX) = FORMAT(CURRENT_TIMESTAMP, 'MMddyyyyHHmmss');
    DECLARE @sql_command NVARCHAR(MAX) = '';

    DECLARE @database_list TABLE
        (database_name NVARCHAR(MAX) NOT NULL, recovery_model_desc NVARCHAR(MAX));

    INSERT INTO @database_list
        (database_name, recovery_model_desc)
    SELECT
        name,
        recovery_model_desc
    FROM sys.databases
    WHERE databases.name NOT IN ('msdb', 'master', 'TempDB', 'model');

    SELECT @sql_command = @sql_command + '
    BACKUP DATABASE [' + database_name + ']
    TO DISK = ''' + @backup_location + database_name + '_' + @datetime_string + '.bak'';
    '
    FROM @database_list;

    PRINT @sql_command;
    EXEC sp_executesql @sql_command;
END

This simple stored procedure will perform a full backup of all databases on the server, with the
exception of msdb, tempdb, model, and master. What we want to do is verify free space before
running the backups, as we did earlier. If space is unacceptably low, the job should end and
notify the correct people immediately. By maintaining enough space on the drive, we prevent
running out completely and causing regular transaction log backups to fail. The test for space
on the backup drive incorporates our sys.dm_os_volume_stats query from earlier and assumes
that we must maintain 25GB free at all times:

IF EXISTS (SELECT * FROM sys.procedures WHERE procedures.name = 'full_backup_plan')
BEGIN
    DROP PROCEDURE dbo.full_backup_plan;
END
GO

CREATE PROCEDURE dbo.full_backup_plan
    @backup_location NVARCHAR(MAX) = 'C:\SQLBackups\', -- Default backup folder
    @backup_free_space_required_gb INT = 25 -- Minimum free GB required on the backup drive
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @current_time TIME = CAST(CURRENT_TIMESTAMP AS TIME);
    DECLARE @current_day TINYINT = DATEPART(DW, CURRENT_TIMESTAMP);
    DECLARE @datetime_string NVARCHAR(MAX) = FORMAT(CURRENT_TIMESTAMP, 'MMddyyyyHHmmss');
    DECLARE @sql_command NVARCHAR(MAX) = '';

    DECLARE @database_list TABLE
        (database_name NVARCHAR(MAX) NOT NULL, recovery_model_desc NVARCHAR(MAX));

    INSERT INTO @database_list
        (database_name, recovery_model_desc)
    SELECT
        name,
        recovery_model_desc
    FROM sys.databases
    WHERE databases.name NOT IN ('msdb', 'master', 'TempDB', 'model');

    SELECT @sql_command = @sql_command + '
    DECLARE @backup_drive_space_free BIGINT;
    DECLARE @current_db_size BIGINT;
    DECLARE @error_message NVARCHAR(MAX);';

    SELECT @sql_command = @sql_command + '
    USE [DBTest];
    SELECT
        @backup_drive_space_free = CAST(CAST(available_bytes AS DECIMAL) / (1024 * 1024 * 1024) AS BIGINT)
    FROM sys.master_files AS f
    CROSS APPLY sys.dm_os_volume_stats(f.database_id, f.file_id)
    WHERE f.database_id = DB_ID()
    AND f.type_desc = ''ROWS'';

    USE [' + database_name + '];
    SELECT
        @current_db_size = SUM(size) * 8 / 1024 / 1024
    FROM sysfiles;

    IF @backup_drive_space_free - @current_db_size < ' + CAST(@backup_free_space_required_gb AS NVARCHAR(MAX)) + '
    BEGIN
        SELECT @error_message = ''Not enough space available to process backup on ' + database_name + ' while executing the full backup maintenance job.  '' + CAST(@backup_drive_space_free AS VARCHAR(MAX)) + ''GB are currently free.'';
        RAISERROR(@error_message, 16, 1);
        RETURN;
    END

    BACKUP DATABASE [' + database_name + ']
    TO DISK = ''' + @backup_location + database_name + '_' + @datetime_string + '.bak'';
    '
    FROM @database_list;

    PRINT @sql_command;
    EXEC sp_executesql @sql_command;
END
Within the dynamic SQL, and prior to each backup, we check the current free space on the
backup drive, the size of the database we are about to back up, and compare those values (in GB)
to the allowable free space set in the stored procedure parameters. In the event that the backup
we are about to take is too large, an error will be thrown. We can, in addition, take any number of
actions to alert the responsible parties, such as emails, pager services, and/or additional logging.
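
For example, if Database Mail is configured on the instance, a notification could be sent before
the error is raised. This is a hypothetical sketch; the mail profile and recipient address below
are placeholders, not values from this environment:

-- Requires Database Mail to be enabled and a valid mail profile.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'DBA Mail Profile',
    @recipients = 'dba-team@yourcompany.com',
    @subject = 'Backup job halted: low disk space on backup drive',
    @body = 'The full backup maintenance job stopped because free space on the backup drive fell below the configured threshold.';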

In the event that I try to back up a particularly large database, the expected error will be thrown:

Msg 50000, Level 16, State 1, Line 656 Not enough space available to
process backup on AdventureWorks2014 while executing the full backup
maintenance job. 141GB are currently free.

Since backup failures are far more serious than an index rebuild not running, we would want to
err on the side of caution and make sure the right people are notified as quickly as possible.
The parallel job solution from earlier could also be used to monitor backup jobs and, in the event
that free space ran too low, send out alerts as needed and/or end the job.
