Businesses need data in analytics systems and processes in order to make business decisions, but two of the biggest challenges they face are timeliness and agility, says Mervyn Mooi, director at Knowledge Integration Dynamics.
It can take too long to bring the data together and ensure that it is of good quality, correct, integrated, and ready for manipulation and consumption, which leaves the business unable to respond quickly to market demands.
But data practitioners have long preached that data quality and master data management are the Holy Grail of data projects and ongoing data programmes. Anyone who suggests that organisations can allow people to take the alternative tack, introducing their own data or going straight to source systems, could well be burned for heresy.
The reasons come back to why the data is being used in the first place: people who go straight to source systems, bring their own data, or rely on spreadsheets and other non-integrated, non-normalised, unverified and potentially incorrect sources could end up making poor business decisions.
Depending on the criticality of the decision, risk exposure could range from a minor issue to the company closing its doors.
Essentially, the more important the business decision and the more crucial the data it relies upon, the greater the potential risk.
However, waiting for data to be integrated, normalised, corrected, enriched and quality-assured can also keep companies and decision-makers in the dark: unable to make decisions, and unable to achieve the flexibility and agility that businesspeople need to make the organisation a success in competitive marketplaces.
Do you expose the organisation to risk by relying on potentially flawed data, or do you expose it to the risk of sliding market share and revenue erosion at the hands of more agile competitors?
Somewhere between the treacherous route of anything-goes data furnishing and rigid disciplinarian conformity lies a slender thread of hope for businesses that want accuracy, reliability and dependability snug alongside flexibility, adaptability and agility.
Limited application of analysts supplying their own data and performing minimal corrections, just enough to meet the business need, is a practical possibility; I have seen it work in practice. The trick is getting it right, with certain checks and balances in place to minimise risk exposure.
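As a purely illustrative sketch, and not a prescription from any particular governance framework, those minimal checks and balances might amount to something like the routine below: a hypothetical Python script that accepts an analyst-supplied CSV extract and applies just enough validation (required columns present, no duplicate keys, values within sensible ranges) to surface obvious problems before the data is used. The file name, column names and rules are assumptions made for the example only.

```python
import csv

# Hypothetical, minimal "checks and balances" for an analyst-supplied CSV extract.
# The file name, column names and rules below are illustrative assumptions, not a standard.

REQUIRED_COLUMNS = {"customer_id", "region", "revenue"}

def check_supplied_data(path):
    """Run lightweight validation and return a list of human-readable issues."""
    issues = []
    seen_ids = set()

    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            return [f"missing required columns: {sorted(missing)}"]

        for line_no, row in enumerate(reader, start=2):  # header is line 1
            cid = row["customer_id"].strip()
            if not cid:
                issues.append(f"line {line_no}: blank customer_id")
            elif cid in seen_ids:
                issues.append(f"line {line_no}: duplicate customer_id {cid}")
            seen_ids.add(cid)

            try:
                if float(row["revenue"]) < 0:
                    issues.append(f"line {line_no}: negative revenue {row['revenue']}")
            except ValueError:
                issues.append(f"line {line_no}: non-numeric revenue {row['revenue']!r}")

    return issues

if __name__ == "__main__":
    problems = check_supplied_data("analyst_extract.csv")
    if problems:
        print("Data supplied with known issues:")
        for p in problems:
            print(" -", p)
    else:
        print("No obvious issues found; data is still not formally quality-assured.")
```

The point of such a check is not to replace the formal quality process but to catch the gross errors that would otherwise feed straight into a decision.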
Run-of-the-mill data analysts are more technologically savvy today than ever before, and many of them have a strong grasp of their function, their role, their tools, their organisations and their ability to help achieve business goals.
Those with a history of success can be trusted to supply their own data in pursuit of rapid and agile business decision-making, bypassing the warehouses and other trusted repositories created for that purpose, at least until there is time to complete the formal process.
However, it remains prudent in the meantime to reconcile their efforts against the trusted sources and to state explicitly the condition of their product, its correctness and any deviations, so that users can quickly grasp how reliable the results are before they are used to make business decisions. That approach is conducive to business continuity and commits organisations to addressing any deviations as time permits and as requirements demand.
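To make that reconciliation concrete, the hypothetical sketch below compares analyst-supplied totals with figures from a trusted repository and produces a simple statement of deviation per category, so a decision-maker can see at a glance how far the quick-turnaround numbers stray from the governed ones. The figures, category names and the five percent tolerance are assumptions chosen purely for illustration.

```python
# Hypothetical reconciliation of analyst-supplied figures against a trusted repository.
# Figures, categories and the tolerance threshold are illustrative assumptions.

analyst_figures = {"north": 1_020_000, "south": 485_000, "west": 310_000}
warehouse_figures = {"north": 1_000_000, "south": 500_000, "west": 310_000, "east": 120_000}

TOLERANCE = 0.05  # flag anything deviating more than 5% from the trusted figure

def reconcile(supplied, trusted, tolerance=TOLERANCE):
    """Return per-category deviation and a flag for anything beyond tolerance."""
    report = []
    for category in sorted(set(supplied) | set(trusted)):
        s, t = supplied.get(category), trusted.get(category)
        if s is None or t is None:
            report.append((category, None, "present in only one source"))
            continue
        deviation = (s - t) / t if t else float("inf")
        status = "OK" if abs(deviation) <= tolerance else "REVIEW"
        report.append((category, deviation, status))
    return report

if __name__ == "__main__":
    print("Condition of analyst-supplied product vs trusted repository:")
    for category, deviation, status in reconcile(analyst_figures, warehouse_figures):
        if deviation is None:
            print(f"  {category:<6} {status}")
        else:
            print(f"  {category:<6} deviation {deviation:+.1%}  {status}")
```

Publishing that kind of deviation statement alongside the results is one way of letting the business move quickly while being honest about how far the data has strayed from the governed version.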