With many of our clients, we see issues of data fragmentation: data that is duplicated in many places, often updated on a casual basis, and with little clarity as to which copy of the data is the most current. This data could be anything from product stock levels, through to customer contact information or retail sales figures.
We would describe this data as not having a single source of truth, and we see organisations encumbered by a number of common challenges that stem from not investing in a single source of truth for their business intelligence:
Workflows are slowed down
All too often we see that people’s day-to-day workflow is slowed down by not having confident access to business data. Time is often spent checking with others in the organisation as to the validity of a particular dataset, and the employee may wait hours or days for a response before they can continue with their original task.
Decision making is impaired
Organisations should be driving their decision making with as much data as is available. Where there is uncertainty as to the currency or validity of data, organisations are often unable to answer what should be relatively trivial, though important, questions such as “how many units of Widget D did we sell in the UK in February 2014?” or “how many products were returned due to quality issues in the second quarter of 2012?”.
When these simple questions aren’t easily answerable by an organisation, we typically find that it is at best slow, and at worst unable, to use data to make effective decisions. Particularly for organisations that rely heavily on forecasting, such as those with an element of slower manufacturing as part of their process, there can be significant costs in terms of their ability to bring products to market fast enough, or to bring the right products to market at all.
Ability to innovate is slowed
An organisation that suffers from data fragmentation often finds that its ability to innovate and implement new systems is slowed. For example, an organisation that is not in control of its customer data would find it harder to adopt a new CRM system that perhaps offers exciting new functionality, such as capturing customer data at the point of sale in third party retail stores. There would be a lack of clarity as to where that customer data should ultimately reside, how to match offline customer sales to online customer sales, how to match retail purchases made by the same customer in different physical stores, and so forth.
People spend too much time copying data
While not always the case, the symptoms detailed above usually hint that people are spending too much of their time performing manual, error-prone copying of data from one system to another. Every manual data entry is another opportunity for errors to be introduced. It’s also seldom a good use of your employees’ time, from both a cost and a motivation perspective.
On their own and combined, these challenges too often put businesses at a disadvantage, slowly nibbling at the edges of operational efficiency, offering up real barriers to an organisation making timely decisions, or preventing the adoption of new innovations.
With the increasing adoption of “big data” (or the re-labelling of existing initiatives as such), the importance of data integrity, and of an organisation’s ability to mine its data for insight, becomes ever more apparent. By ignoring, or working around, source of truth challenges in your organisation, you’re likely falling into an ever less competitive position.
In a later post we’ll look at techniques we’ve employed to move organisations from fragmented business data to the benefits of a single source of truth. At the very least, by defining which copy of the data is the “master” (if data absolutely must exist in more than one place), and by automating as much of the duplication work as possible, an organisation can start to make real progress in its business intelligence efforts.
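To make the “define a master and automate the duplication” idea concrete, here is a minimal sketch in Python of “master wins” synchronisation. The data shapes and names (SKUs mapped to stock records) are hypothetical, purely for illustration: the point is that a downstream copy is refreshed from the designated master automatically, rather than being updated by hand.

```python
def sync_from_master(master, replica):
    """Overwrite the replica so it matches the master copy.

    master, replica: dicts mapping a record key (e.g. a SKU) to its data.
    Returns which keys were added, updated, or removed in the replica.
    """
    changed = {"added": [], "updated": [], "removed": []}
    for key, record in master.items():
        if key not in replica:
            replica[key] = record
            changed["added"].append(key)
        elif replica[key] != record:
            replica[key] = record  # master wins over the stale copy
            changed["updated"].append(key)
    for key in list(replica):
        if key not in master:  # record no longer exists in the master
            del replica[key]
            changed["removed"].append(key)
    return changed


# Hypothetical example: the replica has a stale stock level and an orphan record.
master = {"WIDGET-D": {"stock": 120}, "WIDGET-E": {"stock": 40}}
replica = {"WIDGET-D": {"stock": 95}, "WIDGET-X": {"stock": 7}}

changes = sync_from_master(master, replica)
print(changes)           # {'added': ['WIDGET-E'], 'updated': ['WIDGET-D'], 'removed': ['WIDGET-X']}
print(replica == master)  # True: the replica can no longer drift from the master
```

In a real system the same principle would apply to database replication or scheduled ETL jobs; what matters is that the direction of truth is fixed and the copying is no longer a manual task.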