Backing up PostgreSQL databases in Azure

I was exposed to PostgreSQL about two years ago through a few deployment projects. I blogged about my experience configuring and running Azure Database for PostgreSQL instances. At the time, I didn’t need to worry much about backups or recovery, as everything was built for tests and rapidly refreshed data from external data sources.

But then the day came when someone declared that the databases were becoming pretty crucial – so were we backing them up? It turns out this is not a trivial task. This post aims to walk through the challenging bits of backing up PostgreSQL databases in Azure.

The big picture

To back up PostgreSQL databases, you need a Backup Vault, which is where the backups are stored. In addition, the Azure Backup service needs a Key Vault (to hold the database credentials) and a backup policy.

Once you have these in place, the procedure is as follows:

  • Decide which account will perform the backups within the PostgreSQL instance
  • Run a script to grant that account the proper permissions throughout the instance
  • Store the access credentials in Azure Key Vault as a secret
  • Configure backup in Backup Vault
  • Execute the backup and verify the results

It’s a bit cumbersome, as you need to get the permissions right before backups work. Let’s take a closer look at each step.

Configuring permissions for backups

Pick up the PowerShell script from Microsoft (here), which grants all the necessary permissions to your databases. It’s a bit of a hassle to get running initially, but you can re-run the script as many times as needed. The script requires the psql command-line tool, which you can download here.
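The Microsoft script does the heavy lifting, but at its core it boils down to grants like the ones below. A minimal sketch, run per database – the instance, database, and account names are placeholders, and the real script covers every database and schema for you:

```shell
# Placeholder names - substitute your own instance, database, and accounts.
PGHOST="myinstance.postgres.database.azure.com"   # assumed instance hostname
PGADMIN="adminuser@myinstance"                    # admin login used to grant
BACKUP_USER="backupuser"                          # account Azure Backup will use

# Grant the backup account read access to one database (sketch only;
# the Microsoft script repeats this for all databases in the instance).
psql "host=$PGHOST port=5432 dbname=mydb user=$PGADMIN sslmode=require" <<'SQL'
GRANT CONNECT ON DATABASE mydb TO backupuser;
GRANT USAGE ON SCHEMA public TO backupuser;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO backupuser;
SQL
```

Because these are plain GRANT statements, re-running them is harmless – which is why re-running the script as many times as needed is safe.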

Next, once your permissions are set within the database instance (and its databases), provision a new Key Vault. It’s advisable to dedicate this vault to your PostgreSQL backups, so refrain from reusing a shared Key Vault instance you might have lying around. Once the Key Vault is running, pick up the connection string of your PostgreSQL instance under PostgreSQL instance > Connection strings. It’s the first one, titled ADO.NET, and it looks like this:

Server={instance};Database={your_database};Port=5432;User Id={user}@{instance};Password={your_password};Ssl Mode=Require;

Replace {instance} with the PostgreSQL instance name, and apply the username and password for your backup account. Store this text string as a secret in Key Vault.
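Storing the secret can be done in the portal or with a one-liner. A hedged sketch with the Azure CLI – the vault name, secret name, and connection string values are all placeholders:

```shell
# Store the filled-in connection string as a Key Vault secret.
# All names and credential values below are example placeholders.
az keyvault secret set \
  --vault-name "pgsql-backup-kv" \
  --name "pgsql-backup-connstring" \
  --value "Server=myinstance.postgres.database.azure.com;Database=mydb;Port=5432;User Id=backupuser@myinstance;Password=S3cure.Passw0rd;Ssl Mode=Require;"
```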

Note! If the password contains a semicolon (;), the backup will fail – so make sure the password is complex but not too complicated 🙂
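Since the ADO.NET connection string is semicolon-delimited, a stray semicolon in the password silently corrupts it. A quick sanity check you can run before storing the secret (the example password is, of course, a placeholder):

```shell
# Fail fast if the backup account's password would break the
# semicolon-delimited connection string. Placeholder password below.
PASSWORD='S3cure.Passw0rd'

case "$PASSWORD" in
  *\;*) echo "password contains ';' - backup will fail" ;;
  *)    echo "password is safe to embed in the connection string" ;;
esac
```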

Also, make sure you grant the proper access permissions in Key Vault. See details here.
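The permission that matters here is letting the Backup Vault’s managed identity read the secret. A hedged sketch using the dataprotection CLI extension – resource group and resource names are placeholders, and this assumes a vault-access-policy (rather than RBAC) Key Vault:

```shell
# Look up the Backup Vault's system-assigned managed identity
# (requires the 'dataprotection' CLI extension; names are placeholders).
PRINCIPAL_ID=$(az dataprotection backup-vault show \
  --resource-group "rg-backups" \
  --vault-name "pgsql-backup-vault" \
  --query identity.principalId -o tsv)

# Allow that identity to read secrets from the Key Vault.
az keyvault set-policy \
  --name "pgsql-backup-kv" \
  --object-id "$PRINCIPAL_ID" \
  --secret-permissions get list
```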

If you choose to use a different account to perform the backups, you must provision that account within the PostgreSQL instance and re-run the PowerShell script to grant the necessary permissions. Make sure you grant proper permissions by following the additional script-based guidance here.

Once done, head over to the Backup Vault in Azure and configure the backup. The wizard guides you through selecting the PostgreSQL instance, connecting with the Key Vault, finding the connection string, and authenticating to the PostgreSQL instance and databases. You can also set the backup policy while doing this.
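The portal wizard is the path this post follows, but the backup policy part can also be scripted. A hedged sketch with the dataprotection CLI extension – the resource names are placeholders, and the exact parameters may vary between extension versions:

```shell
# Fetch the default policy template for PostgreSQL and create a policy
# from it in the Backup Vault (names are placeholders; requires the
# 'dataprotection' CLI extension).
az dataprotection backup-policy get-default-policy-template \
  --datasource-type AzureDatabaseForPostgreSQL > policy.json

az dataprotection backup-policy create \
  --resource-group "rg-backups" \
  --vault-name "pgsql-backup-vault" \
  --backup-policy-name "pgsql-daily" \
  --policy policy.json
```

You can then edit policy.json to tune the schedule and retention before creating the policy.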

If the last bit fails – as it did for me several times – consult the troubleshooting guide, which explains the numerous different errors.

In closing

Currently, the capability to back up PostgreSQL databases in Azure is in preview. After spending a few hours setting up the permissions, I was able to get this to work – but the external PowerShell script added to the complexity, as I didn’t have proper permissions to the database to begin with. With the additional guidance on granting sufficient permissions to my PostgreSQL backup account, I connected to the database through the secret stored in Key Vault. Once this was done, everything else was a breeze.