Migrating SQL Server 2008 databases to Azure SQL

“This is a quick one,” I thought. A customer asked me to have a look at their old CRM environment, which was running on a local physical server in their datacenter. The plan was to decommission the server but rescue the CRM database.

I knew I could lift-and-shift the database or databases to Azure SQL quite easily. This post walks through my experience and some of the challenges I encountered.

Provisioning Azure SQL

I received the source database as a traditional SQL Server backup, a .BAK file. It was compressed but still considerably big at 11.6 GB.

I provisioned a new Azure SQL logical server. Not a Managed Instance, as I feared it would be too expensive for keeping this archive database around for a longer period of time. A single database is the easiest option, and it’s also very cost effective.

Nowadays you can choose between vCore-based and DTU-based pricing models. DTU is the traditional one: you allocate a fixed number of Database Transaction Units, and the higher the DTU count, the more performant the database. vCore allows more flexibility and control, but comes at a higher price – a single Gen4 vCore costs about 164 €/month, while the lowest DTU-based database is only about 4 €/month.

The backup is 11 gigabytes and I didn’t know how big the database would be once uncompressed and running. Luckily, Azure SQL made this easy for me – the Basic (B) tier is limited to a 2 GB maximum database size, and the next tier, Standard S0, includes 250 GB of storage. S0 includes 10 DTUs and is priced at 12.42 €/month (you can review the pricing and details here).

I provisioned a Standard S0 database in Azure and verified everything worked. I didn’t go for scripted provisioning, as I only needed this one database, and for such needs I find the Azure Portal to be excellent.
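
That said, if you ever want the scripted route, provisioning the same database is a short T-SQL statement run against the master database of the logical server. A minimal sketch, with the database name being just a placeholder:

```sql
-- Run against the master database of the Azure SQL logical server.
-- "CrmArchive" is a placeholder name; adjust edition and objective to taste.
CREATE DATABASE CrmArchive
(
    EDITION = 'Standard',
    SERVICE_OBJECTIVE = 'S0',
    MAXSIZE = 250 GB
);
```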

I then downloaded SQL Server Management Studio (version 17.9.1 at the time of writing) and installed it locally on one of my spare virtual machines. With SSMS I can easily access both local and remote SQL Servers at the same time. To connect to my Azure SQL database I need the connection details from the Azure SQL blade, under Properties:

Connecting to <servername>.database.windows.net using SQL Authentication in SSMS shows me my server and database are up and running.
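
A couple of quick queries confirm what you’re actually connected to – DATABASEPROPERTYEX can even report the current service objective:

```sql
-- Quick sanity check after connecting with SQL Authentication in SSMS.
SELECT @@VERSION AS engine_version;                                -- should mention "Microsoft SQL Azure"
SELECT DATABASEPROPERTYEX(DB_NAME(), 'ServiceObjective') AS tier;  -- e.g. S0
```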

Restoring to Azure SQL using a .BAK file

I’ve worked occasionally with SQL Server since version 6.5. Many things have changed since then, but the traditional tools and approaches remain the same.

I had the 11 GB .BAK file in C:\Temp, and I tried to restore it directly to my Azure SQL database. Turns out this doesn’t work – or at least it isn’t evident how it would. Nothing in SSMS implies that restoring a backup to Azure SQL is supported, but importing data is. As I didn’t know what was in the backup file, I didn’t want to start guessing what to import – I just wanted an identical database in the cloud.

I found guidance on how to restore directly from a .BAK file. While reading through it and replicating some of the steps to figure out how it works, I realized this approach requires Azure SQL Managed Instance. The way it works is that you provision a Storage account in Azure, upload your file there as a blob, and then restore using Shared Access Signature-based access. A T-SQL statement can be executed as a sanity check for this process – and for me it complained about a missing Managed Instance.
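
For reference, this is roughly what the Managed Instance approach looks like in T-SQL – a sketch only, with the storage account, container and file names as placeholders:

```sql
-- Works on Azure SQL Managed Instance, not on a single database.
-- Assumes the .BAK was uploaded as a blob and a SAS token was generated for the container.
CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/backups]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = '<SAS token without the leading ?>';

-- Sanity check: can the engine read the backup at all?
RESTORE FILELISTONLY
    FROM URL = 'https://mystorageaccount.blob.core.windows.net/backups/crm.bak';

-- The actual restore.
RESTORE DATABASE CrmArchive
    FROM URL = 'https://mystorageaccount.blob.core.windows.net/backups/crm.bak';
```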

So I turned down this option and went back to the drawing board.

Migrating to Azure SQL from a local SQL Server 2017

Since I already had a VM running Windows Server 2019 and SSMS, I decided to install SQL Server 2017 on it as well. This way, I reasoned, I could restore the database locally and then migrate it using the database engine itself. I had a mild pang in my forehead, remembering I’d done exactly this back in 1997.

Installing SQL Server 2017 only takes a few minutes, and restoring the database took about 10 minutes. I allocated a few cores and enabled dynamic memory for my VM. Turns out this was a good choice, as Task Manager showed me a bit later on.
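
The restore itself is the classic RESTORE DATABASE statement – a sketch of the idea, with the logical file names and paths as placeholders for my environment:

```sql
-- Inspect the logical file names inside the backup first.
RESTORE FILELISTONLY FROM DISK = 'C:\Temp\crm.bak';

-- Restore locally to SQL Server 2017; logical names and target paths are placeholders.
RESTORE DATABASE CrmArchive
    FROM DISK = 'C:\Temp\crm.bak'
    WITH MOVE 'CRM_Data' TO 'C:\SQLData\CrmArchive.mdf',
         MOVE 'CRM_Log'  TO 'C:\SQLData\CrmArchive_log.ldf',
         STATS = 10;
```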

The restored database expanded from 11 GB to 30 GB. Everything looked great, as I had a local copy of the database running and I was able to verify the .BAK file was intact.
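
A quick way to check where those gigabytes went is to ask the engine for the file sizes (the database name is a placeholder again):

```sql
-- Data vs. log size of the restored database, in megabytes.
SELECT name, type_desc, size * 8 / 1024 AS size_mb
FROM sys.master_files
WHERE database_id = DB_ID('CrmArchive');
```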

Having a local SQL Server instance also gave me new tools to try out in SSMS.

As I had no clue what the Data-tier approach would require, I chose the Deploy Database to Microsoft Azure SQL Database… task. Finally, I had evidence SQL Server understood my intentions!

Unfortunately, the migration tool insisted on provisioning a new database; I couldn’t simply select the Standard S0 database I already had. I opted to create a new database and left the migration running (there isn’t anything else to configure).

Each time, the migration failed. I got numerous errors about missing accounts and unresolved references. This was frustrating, as I had hoped the tool would let me bypass anything it couldn’t migrate. Instead it ran for a few hours, produced the errors one by one, and left my database in an inconsistent state.

I love challenges, but sometimes I wish there were a few fewer of them for trivial things such as “move my database to the cloud” 🙂 But that’s part of the journey of figuring out how migration works in the modern age.

It seemed I couldn’t use SSMS and SQL Server together to perform the migration. I briefly considered exporting the whole database as T-SQL scripts but realized I’d probably lose a lot of data, so that wasn’t an option.

Bring in the modern tools: Data Migration Assistant

I stopped working on this problem for a few hours, tried to mimic my Swedish friends by having a fika, and stumbled upon the Data Migration Assistant. It’s a free, lightweight tool for upgrading to modern data platforms (read: Azure SQL). I walked back to my study room, gave a caressing touch to my LEGO Porsche 911 GT3 RS on the way (it often brings me good luck), and fired up DMA.

DMA allows for both assessment and migration. I performed the assessment first, which took about 30 minutes: it assessed the viability of migrating the locally restored database to my Standard S0 Azure SQL database. That produced no errors, so I was more than good! Thank you, LEGO.

I then performed the migration, and there is not much to configure within the tool. You select the source, the target and what to migrate. It first creates a script of the schema, provisions it on the target and then migrates all the data.

When you confirm the migration, DMA suggests upgrading the database from S0 to Premium P15. This is the largest and most performant database tier in Azure SQL – and it costs a staggering 13 238 €/month! Per hour it’s only about 18 €, but I didn’t know how long this would run. And if I accidentally forgot to scale the database back down to S0, I would end up mortgaging my house and selling my car to finance all this.

I initiated the migration on the Standard S0 tier, but after 2 hours and barely any progress I changed the database tier to Premium P6. That’s only about 4 € per hour, and I set an alarm on my phone to remind me of this in two hours.
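
Scaling the tier up and down can be done from the portal, but it’s also a one-liner in T-SQL against the master database – something along these lines, database name assumed:

```sql
-- Scale up for the duration of the migration...
ALTER DATABASE CrmArchive MODIFY (EDITION = 'Premium', SERVICE_OBJECTIVE = 'P6');

-- ...and remember to scale back down afterwards.
ALTER DATABASE CrmArchive MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S0');
```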

Using Azure SQL metrics monitoring, I was able to follow the DTU usage. I started the DMA-based migration at 8:34 AM (far left in the graph above), and it completed at 11:40 AM. On the Standard S0 tier the database was lagging, yet it wasn’t constantly hitting the 10 DTU maximum. At 10:31 AM I changed the tier to Premium P6, and that’s where you see the DTU spiking at 100 %. From there, it only took about an hour more until the migration was completed.
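
The same picture is also available from inside the database itself: sys.dm_db_resource_stats keeps roughly an hour of resource usage at 15-second granularity.

```sql
-- Recent resource usage of the current Azure SQL database (last ~hour, 15-second samples).
SELECT TOP (20)
    end_time,
    avg_cpu_percent,
    avg_data_io_percent,
    avg_log_write_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;
```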

This brief bump from Standard S0 to Premium P6 brought the total cost to around 5 €, based on the Resource Group Cost Analysis.

Final thoughts

The database is now happily living in the cloud, and I have no need for the local copy anymore. I can still use SSMS locally to manage the database if I need to. I also remembered to scale it back down to Standard S0, thankfully.

The traditional tools produced such different results that I was very happy to see DMA being available. It was a refreshing experience to simply migrate everything in one go, and the suggestion to upgrade to the Premium tier was interesting, yet slightly worrying because of the cost factor. Next time I know to start directly with DMA instead of SSMS.