Context

Background

Our first step in the project was to work out what data maturity means and what models and frameworks already existed. In May 2016 we published a short report on what we found.

This led us to the conclusion that we needed to create our own framework and set out some important definitions around the research:

– Data – When we say data we have a broad definition: all the types of information an organisation might collect, store, analyse and use.

– Social Sector – In the context of this project we defined the social sector as being charities and social enterprises (businesses trading for social and environmental purposes).

– Data maturity – The journey towards improvement and increased capability in using data.

 

What we found out about Data Maturity in other sectors

Sian Basker and Madeleine Spinks, Data Orchard, 26th April 2016

Over the past couple of months we’ve been finding out what existing work has been done on Data Maturity and what we can learn from it. We’ve been trawling the internet, reading the literature, and interviewing some leading people with relevant knowledge and expertise (see Appendix 1 and Appendix 2 for details).

1. Data Maturity Frameworks

The concept of data maturity is relatively new and seems to be most widely used and understood in the data science community. Whilst there’s no listing for the term on Wikipedia, we found around 40-50 different models/frameworks and related theories; indeed there are reported to be hundreds. Many of these focused on a particular industry or aspect of data: information maturity, analytics, business intelligence, data governance, open data, data warehousing, IT architecture, or big data. Various examples are listed in Appendix 4.

The earliest references we found to ‘data maturity’ were around 2005-2007, when both Gartner and IBM developed models for data quality and data analytics maturity. IBM built its first Analytics Quotient model in 2010, offering a quiz which identifies businesses as ‘novice, builder, leader or master’. Many other models emerged around 2012-13, when ‘big data’ first gained a high media profile. As knowledge and understanding have advanced, more sophisticated and updated versions have appeared.

Many data maturity models have been created by specialist vendors and consultancies as a means of selling products and services. They offer varying levels of simplicity or technical detail to enable potential clients to understand where they are and where they might be going. Published examples include an updated 2014 IBM model, Cardinal Path, Adobe Applied AI, and Accenture. Typically they offer an organisational diagnostic and, in some cases, an assessment report: for example, The Data Warehouse Institute’s 2014 analytics maturity tool assesses five dimensions (organization, analytics, data management, infrastructure and governance). Much of the published literature is written in fairly high-tech language, largely focused on data architecture and data governance, and tends to be aimed at more technical audiences within large enterprise environments.

In essence, what many models describe is the journey from retrospective, ad hoc analysis to explain the past, to a more continuous ‘current/real-time’ understanding of the here and now, then to optimizing for efficiency and effectiveness, and ultimately to predicting and creating the future. Some models use analogies to human development (pre-natal/infant/child/teenager/adult/sage) or, with reference to machine power, crawling/walking/running/riding a bike; others focus on practical processes, tasks or actions.

Most models we found were conceived and applied within private sector markets, where the primary drivers have been efficiency, risk management, maximizing revenues, and competitive advantage. They tend to be aimed at large or very large enterprises, where turnovers below $10M are regarded as small. The sectors where data capabilities are advancing most rapidly and innovatively appear to be competitive markets: retail, technology, energy, health, insurance, and increasingly, banking and finance.

The only commercial sector benchmarking research around analytics maturity we found was published by Accenture in the Netherlands in 2015. Based on the DELTA model by Tom Davenport of the International Institute of Analytics, it surveyed 250 companies in 2012 and again in 2015, using key indicators along the themes of Data, Enterprise, Leadership, Targets and Analysts. It identifies a move from the earlier use of analytics for improved efficiency towards a more recent use of data to support new and improved ways of working and decision-making. It also shows data analytics maturity to be strongest in sales and marketing roles.

Data analytics has become an integral component of the service offer amongst leading business consultancies, e.g. PwC, Deloitte, KPMG. Whilst these primarily serve the private sector, they also count government and charity organisations amongst their clients.

The public sector also has its own data maturity story. Whilst efficiency, security, and risk management were early drivers, increased transparency and public accountability are also key. The Environment Agency developed its Data Maturity Model in 2011 and now uses it alongside the 2015 Open Data Institute (ODI) model. The latter has been used to assess government departments: in November 2015, for example, the ODI published a scored assessment of DEFRA against five themes. The primary departments involved in supporting this work appear to be the Cabinet Office, DEFRA and BIS. Nesta is also currently undertaking research on data in the public sector.

To date we’ve found two models relevant to the social sector, both from the US. The first is Educause, which has benchmarked analytics in hundreds of higher education organisations (many of them charities) since 2012. It offers some great resources and useful insights into how organisations are progressing, and not progressing. The second is a data maturity model developed for social impact. Published in April 2016 by the Centre for Data Science and Public Policy, University of Chicago, it offers a framework around two aspects: data and technology; and organizational readiness (Appendix 6). They plan to collect data from non-profit and government organisations and use this to benchmark.

There’s general agreement that whatever the sector, very few organisations are operating at the very advanced levels. Indeed most are at the early stages.

Much of the discourse and literature suggests the journey to maturity is fairly long and challenging, though there’s evidence to show it’s worthwhile in the end. The most recent well-rounded and applicable resource we found was Carl Anderson’s 2015 book “Creating a Data-Driven Organization: Practical Advice from the Trenches”. Experts suggest it can take five years or more to develop, implement and reap the rewards of becoming a data-mature and data-driven organization. However, there seems to be no single ‘truth’ about data maturity: those we spoke to had very different perspectives.

“I don’t see any use of formal rubrics inside of most companies. There’s a wide range of sophistication around how people document data, provide access, and think about data governance, but I don’t see a standard way of thinking about this. This is likely due to the different regulatory standards and norms in each industry, and each company has their own point-of-view around those norms. We work with folks in insurance, banking, and legal, and they are all very different.”

Hilary Mason, Data Scientist, Fast Forward Labs.

Whilst many of the maturity models are fairly simple, many recognize the complexity of, and interrelationships with, other key aspects of organizational development: notably leadership, business planning and strategy, and culture, as well as policy, security, data governance, and the underlying infrastructure of digital tools and systems. There’s no evidence to suggest how widely used, or how useful, the various maturity models are; indeed some commentators suggest they’re not worth doing at all. However, comparing with peers and market leaders through benchmarking appears to be a popular approach to raising aspirations and understanding stages of development.

2. Do the existing frameworks concur with DataKind’s theoretical 5 stages?

DataKind’s theoretical model in Appendix 3 (based on one from Teradata via Duncan Ross, Chair of DataKind UK) suggests a pathway of: Nascent, Explanatory, Exploratory, Developing, Mastering.

Many of the other frameworks we identified had 3, 4 or 5 stages that are not dissimilar in principle. However, the detail and language do vary significantly and will need further research and development. The typical model in the private sector is unlikely to resonate with the vast majority of charities and social enterprises, though many of the larger ones may already be operating at that level to some extent, i.e. the 1.2% of charities and 6% of social enterprises with turnovers of £5m+.

3. What enables organisations to become more data driven?

The seven key factors that appear most influential and effective in enabling organisations to grow and develop in their data maturity are:

– Data people at the heart/centre of the organisation, adjacent to the leadership team.

– Data recognized and valued as a key asset, with a data culture established as a collective effort, i.e. data is a team sport, not the responsibility of just one data person; data becomes an intrinsic skill and asset for every team in the organization.

– Data must be accessible to many in the organisation, so people need to be able to query, join/relate and share the data across the organisation.

– Quality data: the organisation must be collecting the right data, relevant to the question at hand, and be able to trust it.

– Skills: people with the right skills to steward and query the data, asking the right questions.

– Time to absorb, discuss and challenge using data.

– Forward looking: moving from reporting on the past (what happened?) to the present (what’s happening now) to the future (extrapolation, modeling, recommended action, prediction/simulation).

According to Anderson, the top barriers stopping organisations making effective use of data are:

– Lack of understanding of how to use analytics to improve what they do

– Lack of management capacity (competing priorities)

– Lack of internal skills

– Existing culture doesn’t encourage sharing.

(For full list of barriers: see chart on p. 16 of Anderson, 2015).

What next?

This research will be used by Data Orchard and DataKind UK as part of the Data Evolution project to explore social sector data maturity. The findings will be shared as a blog post, and feedback and comments will be welcome. Discussions about this research will also be held with social sector organisations at various events and workshops during 2016. More specifically, the research will help shape our questions for charities and social enterprises as part of a national survey.

Appendices

View the appendices.
