Is it just me, or is Data Migration suddenly interesting?
I recently joined a big Australian bank and noticed there are many data migration exercises to come, as a result of a recent merger and the major system replacement programs in the works. I thought that with so much data migration work in the pipeline, it would make sense to standardize the way data migrations are performed and perhaps even set up a competency centre or the like.
I know a small consulting firm in Sydney named Lucsan Capital that specializes in helping financial markets firms with implementations of systems such as Murex and Calypso. They have been engaged on many system implementation projects and found a need for a tool that combines Data Migration, Reconciliation and Process Management into a single easy-to-use platform. They've built this tool, called LMIG, and use it as a practice aid on their implementation projects.
What’s the competition doing?
Though I have been pretty impressed with what I've seen of LMIG and the Lucsan people, before going too far I thought I'd better do a bit of research to see what the competition has to offer. I looked at Informatica and IBM InfoSphere. Both are leading ETL products and obvious candidates. But there is definitely a distinction between the requirements of large, mission-critical ETL platforms – the things that populate your data warehouse or act as information gateways – and the needs of a project team working to quickly and safely migrate data from one or more legacy systems to the target environment.
Informatica's Data Migration Solution describes the work Informatica has been doing on both the tool and their Velocity Methodology to adapt to the requirements of Data Migration. There is also a fair bit of research by Bloor in this space, looking at the market opportunity and the competing products.
Both these products appear to be world class. Where they seem to fall down is in being almost too good. They both have many modules, options and moving parts in a full deployment – developer studios, process servers, schedulers, etc. – and all of these have to be identified and costed in your final solution. That may be suitable when building a stable ETL environment. But when working in Data Migration, you need a bit more agility and simplicity – something that gets the job done but doesn't become the focus of your entire project. Keep in mind, Data Migration is really just a necessary evil on the way to a strategic objective such as system consolidation or an upgrade from a legacy system to a shiny new one 🙂 You need to be sure your DM solution doesn't divert your attention from the real objective.
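Since reconciliation keeps coming up as part of the migration story, here is a minimal sketch of what an automated post-migration reconciliation check might look like. To be clear, this is my own illustration, not how LMIG, Informatica or InfoSphere actually work: the table name, the MD5 row fingerprints and the in-memory sqlite databases are all invented for the example.

```python
# Lightweight reconciliation: after migrating a table, compare the source and
# target by hashing each row and diffing the results keyed on primary key.
import sqlite3
import hashlib

def row_fingerprints(conn, table):
    """Map each primary key to an MD5 hash of the full row."""
    cur = conn.execute(f"SELECT * FROM {table} ORDER BY id")
    return {
        row[0]: hashlib.md5("|".join(map(str, row)).encode()).hexdigest()
        for row in cur
    }

def reconcile(source, target, table):
    src = row_fingerprints(source, table)
    tgt = row_fingerprints(target, table)
    missing = src.keys() - tgt.keys()                           # lost in migration
    extra = tgt.keys() - src.keys()                             # appeared from nowhere
    changed = {k for k in src.keys() & tgt.keys() if src[k] != tgt[k]}
    return missing, extra, changed

# Demo: two in-memory databases standing in for the legacy and target systems.
legacy = sqlite3.connect(":memory:")
new = sqlite3.connect(":memory:")
for db in (legacy, new):
    db.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, amount REAL)")
legacy.executemany("INSERT INTO trades VALUES (?, ?)", [(1, 100.0), (2, 250.5)])
new.executemany("INSERT INTO trades VALUES (?, ?)", [(1, 100.0), (2, 999.0)])

missing, extra, changed = reconcile(legacy, new, "trades")
print(missing, extra, changed)  # prints: set() set() {2}
```

In a real migration you would obviously reconcile far more than one table, and a hash diff only tells you *that* a row changed, not *why* – but even this much, run automatically after every load, catches the silent data loss that makes migrations scary.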
How about open source?
I generally love open source and Java for everything. A few of my old colleagues at Macquarie Bank turned me on to Talend, an open-source Data Integration platform. It has a data profiling engine, which sounds very interesting. If I had a development team working for me, I'd probably be keen to go open source. But at the moment I'm looking for something that works out of the box and is easy for Business Analysts to use. I'll look at Talend a little later.
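To make "data profiling" concrete, here is a toy sketch of the kind of per-column summary a profiling engine produces automatically across every table you point it at. This is plain Python of my own devising, not Talend's actual API:

```python
# Toy column profiler: the basic statistics a data profiling engine
# computes for each column (counts, nulls, distinct values, range).
def profile_column(values):
    present = [v for v in values if v is not None]
    return {
        "count": len(values),                      # total rows
        "nulls": len(values) - len(present),       # missing values
        "distinct": len(set(present)),             # cardinality
        "min": min(present) if present else None,
        "max": max(present) if present else None,
    }

ages = [34, 29, None, 41, 29, None, 57]
print(profile_column(ages))
# prints: {'count': 7, 'nulls': 2, 'distinct': 4, 'min': 29, 'max': 57}
```

The value for a migration project is that you run this against the legacy system *before* writing any mapping rules, so surprises like unexpected nulls or out-of-range values show up in analysis rather than in production.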
I'd love to tell you the conclusion, but I'm afraid the jury is still out. I'm really looking forward to getting past the analysis stage and on with delivering some benefits to the business, in the form of greatly reduced lead times for Data Analysis results and for the migration of data itself. Let me know if you have any views on this topic.
- I guess a blog post isn't complete without a reference to the Wikipedia definition of Data Migration
- Johnny's DM Blog gives insight into the problem
- Tony Sceales' blog on DM
- Data Migration Pro offers online resources