Category Whitepapers and Guides
Did you know that only 12% of data is analysed in the average organisation (Leftronic)?
Consumers and businesses are demanding data-powered experiences to drive meaningful customer interactions and data monetisation opportunities. However, the legacy data ecosystem isn’t fit for this purpose and needs modernising.
This blog looks at the demand for data modernisation, the challenges businesses face in addressing that demand, and the importance of a North Star for those on their journey. For the purposes of this piece, data modernisation means ‘the application of modern data approaches to an organisation’s current and legacy data and the associated technology, organisation and ways of working’.
Many trends are coalescing to drive massive demand for data insights and for the modernisation of data. This demand has been prompted by:
As organisations look to satisfy these trends, they’re coming up against a range of challenges.
There are multiple complex reasons for this.
The legacy analytical ecosystem – where most of the data processing and analytics workloads still exist – can’t keep up with the demand. Legacy data systems have shown the significant value of consolidating and integrating transactions, interactions, system data and many other data sources to drive better, more timely decisions.
However, they have now become too complex and too large to understand, manage or adapt. An entire industry surrounds them, with thousands of ETL (extract, transform, load) jobs, tables and roles that are generally poorly understood and difficult to manage. In every large organisation, many hundreds of highly skilled people are involved in just keeping the lights on. Change and release cycles have become longer and longer, with massive backlogs of outstanding work building up.
But it’s not just the legacy data platforms that are a drag on insight and innovation generation.
The data world has created multiple barriers for end users and applications wanting to access and use data. Data for reports and analytics can pass through a minimum of five hops before it reaches its final access layer. On top of that, there are data masking, privacy checks, and security and governance sign-offs in the process. In some large organisations, a basic lack of data ownership and accountability significantly slows down access and approval governance.
These barriers are difficult to navigate when producing reports, analytics or feeds for digital use cases. Analysts and data scientists are extremely frustrated, with only 14% of them having open access to their own company data (Forbes).
The legacy ecosystem is also expensive and out of date from a licensing, hardware, people and climate perspective. In ECS’s experience, most organisations are spending many tens of millions just keeping the data lights on, and only 12% of stored data is actually analysed (Leftronic). A key question is: who is measuring the ROI from that 12%, and does it cover the industry of technology, process and people who service the other 88% – data that is sourced and stored in company mega vaults?
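To make the 12% point concrete, a quick back-of-the-envelope calculation shows how dramatically the effective cost per unit of *useful* data diverges from the headline cost per unit stored. All figures below (the annual spend, the estate size) are hypothetical, used purely for illustration; only the 12% analysed fraction comes from the statistic above.

```python
# Illustrative only: hypothetical spend and estate figures, with the
# 12% analysed fraction cited above (Leftronic).

def cost_per_analysed_tb(annual_spend_gbp: float,
                         stored_tb: float,
                         analysed_fraction: float = 0.12) -> float:
    """Annual spend divided by the volume of data actually analysed."""
    analysed_tb = stored_tb * analysed_fraction
    return annual_spend_gbp / analysed_tb

# Hypothetical example: £20m per year to run a 1,000 TB estate
naive = 20_000_000 / 1_000                         # cost per stored TB
effective = cost_per_analysed_tb(20_000_000, 1_000)  # cost per analysed TB
print(f"£{naive:,.0f} per stored TB vs £{effective:,.0f} per analysed TB")
```

Under these assumptions the cost per analysed terabyte is over eight times the cost per stored terabyte, which is one way of framing the question of who is paying for the other 88%.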
It’s also worth noting that on-premises data centres are 30% less efficient than cloud data centres, meaning they can jeopardise critical net zero targets (Accenture).
With only 20% of insight making a business impact and only 12% of data analysed in the average organisation, there is a massive opportunity to liberate data to meet the new demand. There is also a significant risk of repeating history and creating data mega factories in the cloud, driving the same issues with complexity, speed to market and cost.
When you consider that 80% of data migrations to the cloud run over budget or over time (Bloor Group), there is also the opportunity to get things right from the outset.
As an industry, we must come up with solutions to the following questions:
How can businesses deliver data modernisation in a way that secures incremental value, minimises disruption to operations and embeds learnings from data history?
How can we navigate the cultural issues of data fiefdoms, and assess and implement new ways of working?
How do we democratise the use of data in organisations while still meeting rigorous security and governance standards?
How do we realise cost savings from decommissioning and cloud adoption while also securing gains in staff productivity?
ECS believes one of the critical steps to creating a data-powered organisation is to prioritise data objectives and stay focused – in other words, to define a data modernisation North Star.
This sets out the:
If built and socialised well, this North Star will help excite and rally staff, customers, shareholders and partners. It also allows you to stop or de-prioritise things you have done in the past that don’t support your purpose and value.
There are already some concepts and designs available to help create a data North Star from scratch, including the Competing on Analytics DELTA model, DAMA, EDMC, data fabrics, lakehouse architecture and data mesh, to name a few. They all have some compelling reusable features, but data mesh has been gaining significant traction recently.
Data mesh prioritises domain ownership over centralisation, data products over collecting data, self-service capabilities over central management, and federated governance over top-down control. The most exciting part of data mesh is the hypothesis that the core principles of distributed architectures and application microservices can be applied to the data world. Microservices enable rapid deployment and scalability, and have been a game changer for application development. However, there are watch-outs: complexity increases, and standardisation becomes more important, as do end-to-end system monitoring and service resolution.
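As a minimal sketch of how the data mesh principles above might look in code: a data product carries its owning domain, a published schema contract, and policy tags that let each domain check global governance rules locally rather than waiting on a central sign-off. The `DataProduct` class, tag names and policy set here are all hypothetical, invented for illustration, and not part of any real data mesh framework.

```python
# Illustrative sketch only: hypothetical names, not a real framework's API.
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """A domain-owned data product: owned, discoverable, governed."""
    name: str
    owner_domain: str            # domain ownership over centralisation
    schema: dict                 # published contract for consumers
    tags: set = field(default_factory=set)

    def passes_federated_governance(self, required_tags: set) -> bool:
        # Federated governance: global policies (e.g. PII masked,
        # retention set) are checked locally by each product rather
        # than by a central team signing off every release.
        return required_tags.issubset(self.tags)

# A hypothetical global policy every product must satisfy
GLOBAL_POLICY = {"pii-masked", "retention-set"}

orders = DataProduct(
    name="orders_daily",
    owner_domain="sales",
    schema={"order_id": "string", "amount": "decimal"},
    tags={"pii-masked", "retention-set", "gdpr-reviewed"},
)

print(orders.passes_federated_governance(GLOBAL_POLICY))  # True
```

The design choice mirrors the microservices analogy in the text: each domain team ships and governs its own product against shared standards, which is exactly why standardisation and end-to-end monitoring become more, not less, important.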
Whilst promising, data mesh is not a panacea and is still very early in its adoption. There are some gaps and risks such as:
ECS found the existing frameworks did not cover our end-to-end modernisation experiences, so we defined our own approach to creating a data modernisation North Star. We have developed this from our data modernisation engagements with clients, our experience of implementing various established frameworks, and our own extensive experience of digital transformations, which are generally at a more mature stage of execution and benefits realisation.
This approach provides a holistic view of the data strategy and value, big-picture data design options, organisational design options, future ways of working and data modernisation roadmaps. The approach is also dynamic and modular, and can be created and updated at pace to suit an organisation’s schedule.
The approach is made up of five pillars:
1. Data strategy and value
What is an organisation’s core strategy and how does data support this?
What is the incremental ROI for cloud native data and data migrations initiatives?
What is your data modernisation North Star metric?
What are the cultural, business, and technical barriers to data modernisation adoption?
2. Data design big picture
What are the big picture target data design options?
How do you assess the current estate for ETL, data, platform, queries and data models to name a few key areas?
What options and engineering compromises will you have to make?
3. Organisational design options
How do you design and execute the right hub-and-spoke model for your organisation?
How do you professionalise your data people to meet the upcoming modernisation?
4. Future ways of working to support modernisation
How do you weave practices like agile and DevOps into new ways of working for DataOps, BIOps, MLOps and FinOps?
How do you build self-sufficient capabilities and culture?
5. Data modernisation roadmaps
What are the successful approaches to large scale data migrations?
How will you organise the modernisation programme?
How does ECS use the incept, evolve, and scale approach to deliver iterative value, learn fast and manage risks?
Even if you have started your modernisation journey, you can use this blog as a health check on your transformation. If you are just starting, this will be a useful guide in your thinking and development.
Sean Robertson has spent his entire career attempting to make sense of data. He started out in hands-on roles building predictive machine learning models in the energy and banking industries. He then progressed into leadership roles, in both client and consultancy settings, imagining, shaping and leading large data transformation programmes. His current data interests include data modernisation and industrial IoT.