Over the last nine months, I've been leading a project to migrate 60 Data Analyst users and all their objects from a legacy SQL Server database to a fit-for-purpose, scalable AWS Aurora database.

What I hope you take away from this post is not only that it's very easy to get started with AWS, but that migration is a team sport. This project would not have been a success without the buy-in and support from the Data Analyst users.

1 - Growing Pains
2 - Community and Communication
3 - Design Challenges
4 - Migration
5 - Testing & Go Live
6 - The Future

Part One: Growing Pains

Analytics has been growing incredibly quickly at Xero. There are now over 60 Analysts using the platforms that my team, Data Services, build and maintain. But it hasn't always been this way.

Back in 2015, there were four Marketing Analysts in Wellington and another four in the regional offices in London, San Francisco and Melbourne. Data Science wasn’t a hot job yet, and the Data Analysts role was primarily to support marketing automation by using the Data Services platform to pull lists for email campaigns.

By the time we reached 2017 the community had grown to around 20 Analysts and the cracks were starting to show in the SQL Server environment. The volume of stored data was growing every few months, and there was demand for faster processing.

Fast forward to today and there are now over 60 Data Analysts and Scientists who use the Data Services platforms for everything from sales reporting and analysing sentiment in customer tickets to tracking customer journeys through the Xero app.

Life with SQL Server

The SQL Server instance had been deployed as a sandpit environment: it was designed for a much smaller workload and was never intended to be a production space. As usage grew, it became much harder to monitor, debug and deliver a quality service on.

We had reports of:

  • Tables being empty, and not enough visibility of when the 12-hour process that updated the SQL Server environment would complete.

  • Regional Analysts losing work time as the update window cut into their workday.

  • Database locking as multiple queries hit the same tables during busy times.

Data Services had already moved to Amazon Redshift for our data warehousing, so we made the call to use another AWS option to take the Analysts' database from 'state of emergency' to 'state of the art'.

Deciding on Aurora

Aurora provided the efficiencies we were looking for:

  • It’s a fully managed service at 25% of the cost of SQL Server licensing and disk.

  • It uses the same PostgreSQL syntax as the Redshift data warehouse, which speeds up copy times because the two speak the same language.

  • It can scale with us as we grow. Scaling up with AWS is as simple as pressing a button.

Aurora is a better platform for Analysts as it’s 5x faster than SQL Server and 3x faster than standard PostgreSQL, with queries running in seconds, not minutes.

Part Two: Community and Communication

Before we could start designing a new environment, we wanted to make sure there was a shared understanding between the Analyst community and Data Services.

Migrations within our own team had been tricky enough, but this one had added complexity: we have users in four different timezones, which makes it a challenge to schedule meetings and keep communication flowing.

Due to Xero's growth, there are new Analysts starting each month. Some have only Excel experience; others are used to crunching through big data sets.

To get everyone on board we have held monthly workshops with key users, created a dedicated Slack channel and focused on making the environment as accessible as possible.

Part Three: Design Challenges

In mid-2018, with the community on board, we launched a Proof of Concept Aurora database and took stock of the SQL Server environment we were dealing with.


The first challenge was to see how many tables we would be tasked with migrating and, hopefully, reduce that number to the bare minimum.

Because the SQL Server environment was designed as a sandbox there were legacy tables, temp tables and schemas of objects no one seemed to own.

Through a huge amount of effort, and a little bit of pestering from me, we managed to clear out 2,500 tables and free up 2TB of disk.

Moving Data Around

Challenge number two was to replace the mechanism used to move the data from our Data Warehouse to the Analyst Environment.

Some tables are billions of rows deep and others hundreds of columns wide, which the SSIS process wasn’t coping with.

Translating all that data from a Postgres data warehouse to a SQL Server database reduced efficiency, and updating the SSIS packages that moved the data around was tedious and fiddly.

By using dblink, with Apache Airflow as our scheduler, we’ve removed the complexity of transferring between the Data Warehouse and the Analyst database. And the copy time has gone from over 12 hours to just two.
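The post doesn't show the transfer itself, but a dblink copy from Redshift into Aurora PostgreSQL can be sketched roughly as below. The host, credentials, schema, table and column names here are all hypothetical, and in practice the connection string would come from a secrets store rather than being written inline.

```sql
-- Sketch only: connection details and object names are hypothetical.
CREATE EXTENSION IF NOT EXISTS dblink;

-- Open a named connection from Aurora PostgreSQL to the Redshift cluster
-- (Redshift speaks the Postgres wire protocol, so dblink can reach it).
SELECT dblink_connect(
    'redshift',
    'host=warehouse.example.redshift.amazonaws.com port=5439 dbname=dw user=copy_user password=secret'
);

-- Pull a warehouse table straight into the Analyst database,
-- declaring the column types locally.
INSERT INTO analytics.sales_summary (region, total_sales)
SELECT *
FROM dblink('redshift', 'SELECT region, total_sales FROM public.sales_summary')
     AS t(region text, total_sales numeric);

SELECT dblink_disconnect('redshift');
```

In a setup like this, Airflow's role is simply to run statements like the INSERT above on a schedule and alert if a copy fails.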

Searching for a Scheduler

Our third challenge was to find a scheduling tool that the Analysts could use to schedule their stored procedures.

Aurora does not come with a built-in job agent so we needed to work through other possibilities.

We considered cron jobs, but not all the Analysts are familiar with the command line. We then considered standing up an Airflow server for the Analysts, but with each new job scheduled, the server needs to be restarted. This could have been a disaster, with Analysts in four timezones potentially interrupting running jobs to schedule their own.

In the end, we decided to use what was already there: Windows Task Scheduler. It caused little disruption and was a tool the Analysts were already familiar with.
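The post doesn't include the Analysts' setup, but one common pattern for this is a small batch file that calls psql, registered with Task Scheduler. Everything here is a hypothetical sketch: the server, database, function and file names are made up, and the password is assumed to live in the user's pgpass.conf rather than in the script.

```
:: refresh_sales.bat - sketch only; names and paths are hypothetical.
:: Calls a routine in the Analyst database; credentials come from pgpass.conf.
psql -h analyst-db.example.com -d analytics -U analyst ^
     -c "SELECT analytics.refresh_sales_summary();"
```

Registering it as a daily job could then be a one-liner:

```
schtasks /Create /TN "Refresh sales summary" /SC DAILY /ST 06:00 /TR "C:\jobs\refresh_sales.bat"
```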

Separation and Security

The last major consideration we made was around security and separating functional groups.

In SQL Server everyone could see everything, so everyone could potentially delete anything.

We made the conscious decision to create a schema for each function so each has a dedicated place to write its tables. Analysts then grant access to the tables they want to share, because everyone shouldn’t have access to everything.
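In PostgreSQL terms, the schema-per-function pattern comes down to a few statements like the ones below. The schema, role and table names are hypothetical examples, not the actual setup.

```sql
-- Sketch only: schema, role and table names are hypothetical.
-- One schema per functional group, owned by that group's role.
CREATE SCHEMA marketing AUTHORIZATION marketing_analysts;

-- Nothing is visible to other groups by default;
-- sharing a table is an explicit, per-table grant.
GRANT USAGE ON SCHEMA marketing TO sales_analysts;
GRANT SELECT ON marketing.campaign_results TO sales_analysts;
```

Because grants are explicit, the blast radius of an accidental DROP or DELETE is limited to the group's own schema.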

Part Four: Migration

Unlike a traditional database migration, we had some tools from AWS to make things faster and easier.

But just like a traditional migration not everything went according to plan.

The Schema Conversion Tool and Database Migration Service

The Schema Conversion Tool works hand in hand with the Database Migration Service, but together they presented the biggest challenges of the whole migration.

The Schema Conversion Tool is designed to seamlessly convert the entire schema with all its keys, constraints and indexes. Once configured with a set of transformation rules, the Database Migration Service then moves the data, automagically landing it perfectly in the new destination database. From reading the documentation it sounds like it should be perfect and seamless, almost like magic.

But in practice, a lot more manual intervention was needed. The further we got through the process, the more fiddly it became: sometimes it would run all the transformation rules so the data moved as expected, sometimes it would pick some, and sometimes none at all.

A lot of the time this resulted in multiple tables landing that needed to be cleaned up.
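For context, the transformation rules mentioned above are defined as a JSON table mapping passed to the Database Migration Service. A minimal, hypothetical example that includes every table in a source schema and lowercases table names on the way in looks something like this (the schema name is an assumption, not our actual configuration):

```json
{
  "rules": [
    {
      "rule-type": "selection",
      "rule-id": "1",
      "rule-name": "include-analyst-tables",
      "object-locator": { "schema-name": "dbo", "table-name": "%" },
      "rule-action": "include"
    },
    {
      "rule-type": "transformation",
      "rule-id": "2",
      "rule-name": "lowercase-table-names",
      "rule-target": "table",
      "object-locator": { "schema-name": "dbo", "table-name": "%" },
      "rule-action": "convert-lowercase"
    }
  ]
}
```

When rules like these are applied inconsistently, the same source table can land more than once under different names, which is exactly the clean-up problem described above.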

Part Five: Testing & Go Live

Despite some challenges in designing the right environment and moving the data around, we went from gathering requirements to User Testing in six months. But the challenges don’t end here.

Building an environment in isolation is very different from letting over 60 users loose: there’s new syntax to get to grips with, less technical users needing more help, and queries hitting the database all at once. It’s still early days, but the feedback has been great.

“Great job commissioning Postgres and migrating all the data. Not an easy task. 
Thank you for all the support!”

We’ve had feedback on how smoothly everything has gone, even though there were bumps in the road to get to where we are now.

“Wooo hooooo! Aurora is excellent.

Glad we are using Postgres now”

The Analysts have enthusiastically taken to the new system and are getting to grips with the new UI, the new tools and the quirks that go along with a new database.

“It feels so much faster!
I’m sure we will encounter a few challenges but that’s part of every new system”

Copy times have dropped from 12 tedious hours to just over two, Analyst queries return much faster, and ownership is a lot clearer between Data Services and the Analysts.

“I just had a question about an Aurora thing, so I looked on Confluence and FOUND THE ANSWER. Awesome work on that page”

We’re documenting everything we can in a user guide that both Data Services and the Analyst community add to, and the Slack channel will stay open for the Analysts to help each other.

As one of the first teams to make the move to Aurora we’ve found the process fairly smooth and having such positive feedback has been a huge win.


Part Six: The Future

If you would like to get started with AWS, the first step is to check out the free tier, which gives you one year of services to explore. Aurora comes in MySQL and PostgreSQL flavours, and Serverless is also on the roadmap. After that, it's a matter of clicking a few buttons to get going.

But it’s not just about tooling: migration is a team sport.

We have the community of Data Analysts and Scientists working with us to build out a roadmap, and it’s encouraging to see them helping each other out.

By listening to the community's ideas and taking a slow and steady approach, we have tackled the planning, development, migration and training of 60 users in just nine months.

We’re now in a really good position to build the platform further for the Analysts we support and contribute to their success.

Cover photo by David Dibert from Pexels


Guest Blogger


Special thanks to our guest blogger Helen Anderson, a BI Data Analyst and Technical Consultant for her contribution to the Ronald James Blog this week.

Twitter: Helen | GitHub: helenanders26 | Website: helenanderson.co.nz

