Azure Data Sync Issues

Azure Data Sync is a great tool for anyone looking for a quick and easy way to move data from on-premises systems to the cloud. It works well when you have a fairly simple table structure and no transformations are needed before the data is loaded to the destination. Recently we came across a few edge cases when using Azure Data Sync to sync a fairly old SQL Server 2008 database to Azure.

These issues are fairly obvious but for some reason don’t seem to be documented in the Azure Data Sync documentation.

First Issue

Azure Data Sync can only work with databases that have 500 or fewer tables. When configuring the tables to sync, you need to refresh the schema in order to list the available tables and columns. If the database has more than 500 tables, the schema refresh fails.
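If you are not sure how close you are to that limit, a quick count against the system catalog will tell you. This is a generic check against sys.tables, not anything specific to Data Sync:

-- Count the user tables in the current database
SELECT COUNT(*) AS table_count
FROM sys.tables;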

We couldn’t delete tables without too much effort, so we decided to use Azure Data Factory to move data from this database instead.

Second Issue

Special characters like (.) or ([) should not be present in column or table names. There are other special characters as well that could cause issues, and as a best practice you should avoid using them. In our case the problem tables were legacy, unwanted, or user-created tables, and we had the freedom to rename them, so that fixed the issue.
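A simple way to spot the offending names up front is to search the system catalog for tables or columns containing characters such as a dot or an opening bracket. The pattern below only checks for those two characters and would need extending for any other characters you want to flag:

-- Tables whose names contain a dot or an opening bracket
SELECT name AS table_name
FROM sys.tables
WHERE name LIKE '%[.[]%';

-- Columns whose names contain a dot or an opening bracket
SELECT t.name AS table_name, c.name AS column_name
FROM sys.columns c
JOIN sys.tables t ON c.object_id = t.object_id
WHERE c.name LIKE '%[.[]%';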

Third Issue

Azure Data Sync works only with databases on SQL Server 2008 or later, since it uses the MERGE command to sync data between source and destination. However, even though the database was running on a SQL Server 2008 installation, the sync failed because the database compatibility level was still set to SQL Server 2005. So make sure you check this property before starting the sync.
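You can check both the server version and the database compatibility level directly, and raise the compatibility level if it is still on an older value (90 corresponds to SQL Server 2005, 100 to SQL Server 2008). Replace YourDatabase with your own database name:

-- Check the server version and the database compatibility level
SELECT @@VERSION;
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'YourDatabase';

-- Raise the compatibility level to SQL Server 2008 (100) if it is still on 90 (2005)
ALTER DATABASE YourDatabase SET COMPATIBILITY_LEVEL = 100;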

Another issue we faced, and this one is documented, is that user-defined data types are not supported in Azure Data Sync, so beware of tables that use them and ensure they are addressed before starting the sync.
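To find columns that rely on user-defined data types before you configure the sync, you can query the catalog views. This only lists the affected columns; how you address them (converting to built-in types, excluding the tables, and so on) will depend on your schema:

-- Columns whose data type is a user-defined type
SELECT t.name AS table_name, c.name AS column_name, ty.name AS type_name
FROM sys.columns c
JOIN sys.tables t ON c.object_id = t.object_id
JOIN sys.types ty ON c.user_type_id = ty.user_type_id
WHERE ty.is_user_defined = 1;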
