Rules of successful data migration

Rule three: no one needs, wants or will pay for perfect data

Applications are only as good as the data available to them. We also know that many a data migration has been scuppered by overestimating the quality of, or simply not understanding, the source data. Oh, the joy of legacy data, with its gaps, inconsistencies and redundancies.

However, while enhancing data quality is a worthy goal, it is important not to go off on a tangent mid-project in the quest for perfect data. Like over-specification of an application, the quest for data perfection can drag a project down; it is where many, many projects run aground, inflating both the cost and the time to deliver. To avoid this trap, data owners and users need to determine the level of quality they actually need at the start of the project, so the technologists have an appropriate goal to aim at. It is also why project managers need to understand the true quality of their legacy data and allow adequate time and budget to achieve the requisite level.
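
One lightweight way to make that goal concrete is to record the agreed quality targets in a form the migration tooling can test against. The sketch below is a hypothetical example in Python; the field names and thresholds are illustrative, not drawn from any real project:

    # Hypothetical quality targets agreed with data owners at the
    # start of the project. Each entry maps a field to the minimum
    # share of records that must hold a valid value before sign-off.
    QUALITY_TARGETS = {
        "customer_name": 0.999,  # near-complete: needed for billing
        "email_address": 0.95,   # important, but gaps are tolerable
        "fax_number":    0.50,   # legacy field: not worth perfecting
    }

    def meets_target(field: str, valid_ratio: float) -> bool:
        """True if the measured ratio satisfies the agreed target."""
        return valid_ratio >= QUALITY_TARGETS.get(field, 0.0)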

A successful migration strategy also needs to incorporate a range of cleanse strategies at different points in the life of the programme – sometimes pre-migration, sometimes in-flight, and sometimes post-migration – but always as a conscious decision by a business manager. Too many traditional migration approaches bring the whole process to a halt while a data quality issue is explored and resolved; a modern platform provides the flexibility to keep migrating while that happens.
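
As a rough illustration of that staged approach, here is a minimal Python sketch; the stage lists, the example rule and the quarantine behaviour are assumptions made for illustration, not any particular product's API:

    from typing import Callable, Iterable

    Record = dict
    Cleanser = Callable[[Record], Record]

    # Illustrative cleanse stages; a business manager decides which
    # rules run at which point in the programme.
    PRE_MIGRATION: list[Cleanser] = []   # fix in the source system
    IN_FLIGHT: list[Cleanser] = []       # fix while the data moves
    POST_MIGRATION: list[Cleanser] = []  # fix in the target system

    def strip_whitespace(record: Record) -> Record:
        """Example in-flight rule: trim stray spaces from values."""
        return {k: v.strip() if isinstance(v, str) else v
                for k, v in record.items()}

    IN_FLIGHT.append(strip_whitespace)

    def migrate(records: Iterable[Record]):
        """Apply in-flight rules without halting the migration: a
        record that fails a rule is quarantined for a business
        decision while the rest keep moving."""
        migrated, quarantined = [], []
        for record in records:
            try:
                for cleanse in IN_FLIGHT:
                    record = cleanse(record)
                migrated.append(record)
            except Exception:
                quarantined.append(record)
        return migrated, quarantined

The quarantine list is the flexibility described above in miniature: a bad record is set aside for a decision while the migration itself keeps moving.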

Rule four: if you can’t count it, it doesn’t count

Another challenge is how to measure data quality, both to assess the state of your legacy data and to determine the level of quality your business users require. To make matters worse, data quality is not static: it erodes and improves over time. It's really important that the measures you use make sense to business users, not just to technologists. This enables you to measure deliverables, perform gap analyses, and monitor and improve ongoing data quality. It also ensures you concentrate your effort where business users see value and can quantify the benefits.
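
A measure a business user can read might be as simple as "what share of customer records carries a usable value in a given field?". A minimal Python sketch follows; the fields and format rules you would plug in are assumptions for illustration:

    import re

    def completeness(records: list[dict], field: str) -> float:
        """Share of records where the field is present and non-empty."""
        filled = sum(1 for r in records if r.get(field))
        return filled / len(records) if records else 0.0

    def validity(records: list[dict], field: str, pattern: str) -> float:
        """Share of records where the field matches an agreed format."""
        valid = sum(1 for r in records
                    if re.fullmatch(pattern, str(r.get(field, ""))))
        return valid / len(records) if records else 0.0

    # "92% of customers have a valid postcode" is a statement a
    # business user can act on; a parser error count is not.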

Reconciliation of data migrated from source(s) to target(s) is always a critically important activity: how do you know when you're done? In dynamic environments where you can't freeze the data, this becomes even more challenging, as you need to be able to handle a shifting scope. A flexible data model and a closely coupled reporting capability are key to understanding and driving this process.
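
In practice, reconciliation usually starts with record counts and row-level fingerprints compared between source and target. Here is a simplified Python sketch; it assumes both sides fit in memory and share a key field, which a real migration platform would not rely on:

    import hashlib

    def fingerprint(record: dict) -> str:
        """Stable hash of a record's contents for comparison."""
        canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
        return hashlib.sha256(canonical.encode()).hexdigest()

    def reconcile(source: list[dict], target: list[dict], key: str):
        """Report records missing from, or altered in, the target."""
        src = {r[key]: fingerprint(r) for r in source}
        tgt = {r[key]: fingerprint(r) for r in target}
        missing = sorted(src.keys() - tgt.keys())
        altered = sorted(k for k in src.keys() & tgt.keys()
                         if src[k] != tgt[k])
        return missing, altered

Run against a shifting scope, the same comparison becomes a rolling report rather than a one-off check, which is where the closely coupled reporting capability earns its keep.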

Achieving a business-driven migration

Having put a business-driven migration project in place, the trick is then to select methodologies and technologies that can deliver against these requirements.

Business-driven migration involves decoupling the technical problem of moving data from the business processes that use it. This requires a migration solution that lets you easily encapsulate the business problems you face, while remaining flexible enough to cope when those requirements change. It ensures that the ROI from new application investment is maximised and that operations are enhanced rather than adversely affected.
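
One way to picture that decoupling, as an illustrative Python sketch rather than any vendor's API: the engine that moves records knows nothing about the business rules, which are supplied, and can be swapped, by the business side:

    from typing import Callable, Iterable

    BusinessRule = Callable[[dict], dict]

    def move(records: Iterable[dict],
             rules: list[BusinessRule],
             write: Callable[[dict], None]) -> None:
        """Generic engine: applies whatever rules the business side
        supplies, then writes each record to the target. A changed
        requirement means swapping a rule, not rewriting the engine."""
        for record in records:
            for rule in rules:
                record = rule(record)
            write(record)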

The importance of getting migration right from a business perspective was addressed at a recent British Computer Society meeting by BT’s Phil Dance: “Increasingly our business case will depend on how good we are at getting our data across. A bad data migration ultimately means a bad customer migration, in a competitive market that’s very bad news.”