SQL backup to Azure using PowerShell

Tags: powershell, azure, backup, sql

2016.11.19 update: I just created a new GitHub repository which will hold all my StuffLeftAtParty, so all the files (SQL and PS) are right there for you to grab.

Everybody knows you should be making backups of everything, especially if it's your startup's database. Without database backups you'll probably be bald soon. And you don't want to be bald.

So my plan to keep my hair as long as I can was to create automated SQL backups that go straight to the cloud.

SQL backup script

I won't lie, I'm no SQL backup expert, so I went and found this very nice SQL backup script directly from Microsoft itself. Go grab it now and install it in your SQL Server database.

It can create full or transaction-log-only backups for one or more databases. I chose the Full recovery model for my database, so I'll back up my transaction log every hour and execute a full backup once a day. This way, if something goes wrong, I should theoretically lose at most 1 hour of data, which for my business should be acceptable.
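With the script installed, the two backup modes boil down to two calls of the `sp_BackupDatabases` stored procedure. A sketch (the parameter names match the ones used in the PowerShell script later in this post; the database name is a placeholder):

```sql
-- Once a day: full backup ('F')
EXEC sp_BackupDatabases @backupLocation = 'c:\_SQLBACKUP\', @databaseName = '<your_db_name>', @backupType = 'F';

-- Every hour: transaction-log backup ('L'); needs the Full recovery model
EXEC sp_BackupDatabases @backupLocation = 'c:\_SQLBACKUP\', @databaseName = '<your_db_name>', @backupType = 'L';
```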

Oh, that reminded me that I should go and check, just in case, that the log file isn't langoliering my disk. It shouldn't, but, you know, you never know.
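If you want to check too, one built-in T-SQL command shows how big each transaction log is and how much of it is actually in use:

```sql
-- Log size (MB) and percent used, for every database on the instance
DBCC SQLPERF(LOGSPACE);
```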

Anyway, we now have the ability to do either a Full or a Log SQL backup. Time to PowerShell it.

PowerShell script

I like PowerShell. I'm not using it that much, but it's a really "cool tool"™. For me, it's better than bash on Linux, and since they're saying it works on Linux now, I'll definitely have to verify that.

OK, so the script should do the following things:

  1. Execute the SQL backup script we just installed, in either Full or Log-only mode, and save the output to the proper folder
  2. ZIP any *.bak or *.trn files we have in this folder, then delete the *.bak and *.trn files (*), because since we just zipped them, there's no need to store them uncompressed anymore
  3. Send the ZIP file to Azure storage
  4. Write some info to a CSV file
  5. If it's a full backup, send an email to you with this CSV file and then remove the CSV file

Before we go further, you should also get Azure PowerShell first and create a new container in Azure that will hold your files.
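In case you'd rather create the container from PowerShell than from the portal, here's a minimal sketch using the same classic Azure module cmdlets the script below relies on (account name and key are placeholders):

```powershell
# Build a storage context from the account credentials (placeholders)
$ctx = New-AzureStorageContext -StorageAccountName "<azure_account_name>" -StorageAccountKey "<azure_account_key>"
# Create a private container named "db-backup" to hold the ZIP files
New-AzureStorageContainer -Name "db-backup" -Permission Off -Context $ctx
```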

Here's the PowerShell script itself:

param([ValidateSet("full", "log")][string]$BackType = "full")

$sqlBackupLocation = "c:\_SQLBACKUP\" # local path for backup files
$dbName = "<your_db_name>"
$backupFilesExtensions = "*.bak","*.trn"
$zipFileName = (Get-Date -Format yyyyMMdd_HHmmss) + ".zip"
$csvFileName = "log-info.csv"
$zipFilePath = $sqlBackupLocation + $zipFileName
$csvFilePath = $sqlBackupLocation + $csvFileName

$csvObjProps = @{ 'Date' = (Get-Date -Format "yyyy-MM-dd HH:mm:ss"); 'Type' = $BackType; 'Filename' = $zipFileName }

class EmailConfig {
    [string]$To = "<log_receiver@email>"
    [string]$Subject = "[<your_app>] Backup log"
    [string]$From = "<sender@email>"
    [string]$Body = "SQL backup log is in the attachment."
    [string]$SMTPServer = "<sender_server>"
    [int]$SMTPPort = 587 # or 25
    [string]$SMTPPassword = "<sender_pass>"
}
$emailConfig = [EmailConfig]::new()

class AzureConfig {
    [string]$AccountName = "<azure_account_name>"
    [string]$AccountKey = "<azure_account_key>"
    [string]$BackupContainer = "db-backup" # be sure to have this container created
}
$azureConfig = [AzureConfig]::new()

$startTime = Get-Date

function BackupSQL() {
    $sqlBackupType = 'F'
    if ($BackType -eq "log") {
        $sqlBackupType = 'L'
    }

    sqlcmd -S .\SQLEXPRESS -E -Q "EXEC sp_BackupDatabases @backupLocation='$sqlBackupLocation', @databaseName='$dbName', @backupType='$sqlBackupType'"
}

function ListZipAndDeleteSQLFiles() {
    # list files to be zipped
    $files = Get-ChildItem -Include $backupFilesExtensions -Path "$sqlBackupLocation*"
    $files | Compress-Archive -DestinationPath $zipFilePath

    # delete src sql files
    $files | Remove-Item -Force
}

function UploadZipToCloud() {
    $ctx = New-AzureStorageContext -StorageAccountName $azureConfig.AccountName -StorageAccountKey $azureConfig.AccountKey
    # upload
    Write-Host "Uploading to cloud: $zipFileName"
    Set-AzureStorageBlobContent -Context $ctx -Container $azureConfig.BackupContainer -File $zipFilePath
}

function AddInfoToCSV() {
    # create CSV object and write it to file
    $csvObjProps.FileSizeKB = (Get-ChildItem $zipFilePath).Length / 1024
    $csvObjProps.TimeSpentSecs = (New-TimeSpan -End (Get-Date) -Start $startTime).TotalSeconds
    New-Object -TypeName PSObject -Property $csvObjProps | Export-Csv -Delimiter "," -Path $csvFilePath -NoTypeInformation -Encoding UTF8 -Append
}

function Send-EMail {
    Param (
        [Parameter(Mandatory=$true)][String]$EmailTo,
        [Parameter(Mandatory=$true)][String]$Subject,
        [Parameter(Mandatory=$true)][String]$Body,
        [Parameter(Mandatory=$true)][String]$EmailFrom,
        [Parameter(Mandatory=$true)][String]$SMTPServer,
        [Parameter(Mandatory=$true)][int]$SMTPPort,
        [Parameter(Mandatory=$true)][String]$Password,
        [String]$attachment
    )

    $SMTPMessage = New-Object System.Net.Mail.MailMessage($EmailFrom, $EmailTo, $Subject, $Body)
    $SMTPattachment = $null
    if ($attachment -ne $null) {
        $SMTPattachment = New-Object System.Net.Mail.Attachment($attachment)
        $SMTPMessage.Attachments.Add($SMTPattachment)
    }
    $SMTPClient = New-Object Net.Mail.SmtpClient($SMTPServer, $SMTPPort)
    $SMTPClient.EnableSsl = $true
    $SMTPClient.Credentials = New-Object System.Net.NetworkCredential($EmailFrom, $Password)
    $SMTPClient.Send($SMTPMessage)
    if ($SMTPattachment -ne $null) {
        $SMTPattachment.Dispose()
        Remove-Variable -Name SMTPattachment
    }
    Remove-Variable -Name SMTPMessage
    Remove-Variable -Name SMTPClient
    Remove-Variable -Name Password
}


function SendEmailWithCSV() {
    try {
        Send-EMail -EmailTo $emailConfig.To -EmailFrom $emailConfig.From -Subject $emailConfig.Subject -Body $emailConfig.Body -SMTPServer $emailConfig.SMTPServer `
                   -SMTPPort $emailConfig.SMTPPort -attachment $csvFilePath -Password $emailConfig.SMTPPassword
    }
    catch {
        Write-Host "Error sending email: $_"
    }
    Remove-Item $csvFilePath -Force
}

Write-Host "Performing backup: $BackType"
# execute SQL backup first
BackupSQL
Write-Host "Backup created"
# create files to zip file
ListZipAndDeleteSQLFiles
Write-Host "Files zipped"
# upload to azure storage
UploadZipToCloud
Write-Host "Files uploaded to cloud"
# add to info csv
AddInfoToCSV
Write-Host "CSV info updated"
# send CSV to email if we are doing full backup
if ($BackType -eq "full") {
    SendEmailWithCSV
    Write-Host "Email with CSV sent"
}

Write-Host "Done!"

The script isn't anything fancy but it does its job. Go ahead, customize it for your needs, configure it, and check if it works. It should. It works on my machines.

Finishing up

We have the script; now we should set it up for periodic execution. Use Task Scheduler for it and remember these few things:

  1. Set the task Action to Run a program, pointing at PowerShell, so something like: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
  2. As program arguments, set your script: -Command "& c:\Scripts\execute-backup.ps1 -BackType 'log'" (change log to full for the once-a-day backup)
  3. Be sure to run the task even when the user is not logged in
  4. Create a new user for this task and allow this user to write into the destination folder. I'm also not sure whether you should set some permissions for your PS script... oh well, you'll figure it out if something doesn't work.
  5. I also like to run a full backup when the machine starts, so that I have a "fresh start" with transaction logs.

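The whole Task Scheduler setup above can also be scripted with schtasks.exe instead of clicking through the GUI. A sketch, assuming the script sits in c:\Scripts and runs under a dedicated "backupuser" account (both hypothetical):

```
:: Hourly transaction-log backup
schtasks /Create /TN "SQL Log Backup" /SC HOURLY ^
    /TR "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -Command \"& c:\Scripts\execute-backup.ps1 -BackType 'log'\"" ^
    /RU backupuser /RP <password>

:: Daily full backup at 03:00, plus one at machine start
schtasks /Create /TN "SQL Full Backup" /SC DAILY /ST 03:00 ^
    /TR "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -Command \"& c:\Scripts\execute-backup.ps1 -BackType 'full'\"" ^
    /RU backupuser /RP <password>
schtasks /Create /TN "SQL Full Backup OnStart" /SC ONSTART ^
    /TR "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -Command \"& c:\Scripts\execute-backup.ps1 -BackType 'full'\"" ^
    /RU backupuser /RP <password>
```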
And finally we're done. From now on, everything goes to the cloud, so your data should be fairly safe. Assuming the cloud won't go down... nah, it won't. It will never go down, right? RIGHT? RIGHT?!?!11

(*) I just realized that it might be a better idea to delete those *.bak and *.trn files right at the end, so that if something goes wrong, we won't lose them that easily.