At Heineken, we dream of becoming the best-connected brewer. To achieve that dream, one of the key things our digital & technology team needs to guarantee is the availability of high-quality, accurate data, on which every decision made across the company should rely. Business areas such as planning and sales are accustomed to data-driven decisions, and that practice usually extends to others like finance, controlling, and so on. But how does all of that relate to data migration?
Companies are living organisms, and the only certainty is change. Beyond conceptual thinking, everything should start with an assessment: what kind of information is needed, what is already available, and which guidelines and standards will be adopted. This job is easily explained but not so easily done. It involves defining governance, frequently at the field level, and often even creating a dedicated team to manage and maintain this information, at least for the company's key data (e.g., pricing strategy, recipes, financial scores).
How do we tackle the challenges?
Consider a real situation: a company has just acquired another in the same business segment, but with different ways of working. The two will have to coexist during the data migration, reporting results in a way that treats the company as a whole while the two different processes run in parallel (at this point, you've got it: data migration takes time).
Data migration projects must be viewed as a foundation and enabler for other initiatives
Start by identifying the challenges, and there are plenty. The main ones: high complexity, high volume, data integrity, compliance, conceptual divergences among areas, and downtime. The next step is to define the phases, taking the destination system as the model: data assessment, preparation for extraction and upload, assessment of the available tools and which best fits the job, and testing. Note that some databases offer tools to help with data migration (e.g., Oracle GoldenGate), effectively supporting the information exchange.
In addition, it is also necessary to assess the connected legacy systems, especially those that will be decommissioned.
What else is needed?
One thing that must be done alongside the data migration is to set up process standards (including for the legacy systems) and assess which ones should be adopted as-is and which should be adapted to the destination system. All of that leads to a gentler learning curve for the affected processes and tools, and for the information flow within the destination system, aiming for a common view and common usage among the users. A key action here is defining the data owner (business driven).
Bringing all of that to the real world: during the merger of the two companies, both had the same ERP but in different versions, and our first step was to assess the differences between processes, always bearing in mind that everything had to move to the newer version. That analysis revealed, besides a huge amount of data to be moved (e.g., work in progress), a substantial number of transactions that needed changes tied to processes specific to the new system; most of those changes were about adopting the new processes, leading to considerable organizational change management. Technically speaking, we had to create a data staging area to make the EDM work properly, perform the data transformations required by the destination system, and also change programs on the destination side.
The Execution
We had assessed technical tools to handle the migration, but given the circumstances it was faster to take a different route: extracting the data, cleansing it, and uploading it to the destination system, using the data staging area and programs developed for that purpose (we ran three mock migrations to validate the process). We had less than 48 hours of downtime within an operation that runs 24x7, and during that window we completed a full data migration from an SAP 4.6 instance to a 6.0 instance, with all CORE processes standardized in the destination system. Some of the 'satellite' processes were standardized accordingly, but only after the downtime, since their criticality was manageable during regular operation. In other words, it is often impossible to do everything in a brief window, and we had to choose our battles wisely.
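To make the extract-cleanse-stage-upload flow concrete, here is a minimal sketch of that kind of pipeline. This is not Heineken's actual tooling; all names (cleanse_record, the field names, the sample data) are hypothetical, and a real migration would work against the ERP's extraction and load interfaces rather than in-memory lists.

```python
# Hypothetical sketch: cleanse legacy records into a staging area,
# rejecting unrecoverable rows for manual review instead of loading them.
from typing import Optional


def cleanse_record(raw: dict) -> Optional[dict]:
    """Normalize one legacy record; return None if a mandatory key is missing."""
    material = str(raw.get("material", "")).strip().upper()
    if not material:
        return None  # cannot load without the key field
    return {
        "material": material,
        "plant": str(raw.get("plant", "")).strip(),
        # Assume the legacy system stored quantities as strings with commas.
        "quantity": float(str(raw.get("quantity", "0")).replace(",", ".")),
    }


def migrate(legacy_rows: list[dict]) -> tuple[list[dict], list[dict]]:
    """Cleanse rows into staging; return (staged, rejected)."""
    staged, rejected = [], []
    for row in legacy_rows:
        cleaned = cleanse_record(row)
        if cleaned is None:
            rejected.append(row)
        else:
            staged.append(cleaned)
    return staged, rejected


legacy = [
    {"material": " fg-1001 ", "plant": "BR01", "quantity": "12,5"},
    {"material": "", "plant": "BR02", "quantity": "3"},  # missing key field
]
staged, rejected = migrate(legacy)
print(len(staged), len(rejected))  # → 1 1
print(staged[0]["quantity"])       # → 12.5
```

Running the same pipeline in mock cycles, as described above, lets you measure how many records land in the rejected pile before committing to the real downtime window.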
All that work enabled other projects, such as unified financial closing, WMS standardization, and S&OP standardization, besides minor enhancements within the business processes, such as applying different excise rules and sales procedures, to name just a few.
Takeaways and Conclusion
Data migration projects must be viewed as a foundation and enabler for other initiatives. They will always serve future projects of every magnitude: bigger ones, such as an S&OP implementation, or smaller ones, like the one in which we had to standardize the software that integrates the weighbridges with the ERP; in that particular case, even some fields had to be used differently. This is tricky to deal with, because data transformation also needs change management due to process standardization.
We can foresee some of these analyses soon being supported by AI, speeding up the job, but we still need people to assess and evolve the models and standards, and upskilling remains a key factor.
Adopt good practices for data migration such as planning, data assessment, cleansing, validation and testing strategies, data mapping, and defining data ownership.
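As an illustration of the "validation and testing strategies" practice, below is a small, hedged sketch of the kind of reconciliation check you might run after each mock migration: compare record counts and key sets between the source extract and the target load. The function and field names are hypothetical, not taken from any specific tool.

```python
# Hypothetical reconciliation check between source extract and target load.
def reconcile(source_rows: list[dict], target_rows: list[dict], key: str) -> dict:
    """Report count mismatches and keys missing from, or unexpected in, the target."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
    }


source = [{"id": "A"}, {"id": "B"}, {"id": "C"}]
target = [{"id": "A"}, {"id": "B"}]  # one record failed to load
report = reconcile(source, target, "id")
print(report)
# → {'count_match': False, 'missing_in_target': ['C'], 'unexpected_in_target': []}
```

A non-empty "missing_in_target" list after a mock run is exactly the kind of signal that should send you back to cleansing and mapping before the real cutover.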
Analyzing data availability and quality is a key foundation for a company's future goals, especially for one adopting data science as a strategy. Since this is a new and still-developing area, it pays to combine existing knowledge and experience with the new, to drive improvements and start collecting benefits from different insights.
In other words, the data quality prepares us for the future.