
By Alexander Igelsböck  

Five years from now, companies will reach a state of “data ubiquity” where data is hard-wired into all processes, interactions, and systems: at least, that’s what McKinsey predicts.   

Getting to this point will take some work, especially given slow progress against the consultancy’s last major forecast. By 2025, employees across sectors were meant to start enhancing everything they do based on data, in addition to expertly wielding smart machines. However, the continuing need for data leaders to build workforce data literacy suggests effortless daily data use isn’t advancing as quickly as hoped.

So, while investment in new technology will keep growing this year, with 91% of global tech decision-makers due to boost IT spending, there will also be greater consideration of immediate value. More specifically, we can expect firms to zoom in on whether data management tools and practices are fuelling tangible benefits and moving them closer to meeting McKinsey’s data-empowered vision – and if not, why not?

Making data more accessible

Companies increasingly recognise that embedding data-driven working as standard means ensuring relevant information is freely available to the right people: a concept known as “data democratisation”. As noted by McKinsey, achieving that will involve mastering two core bases: making data easy to use and trust.

For now, let’s focus on usability. Over recent years, many companies have fallen into the common trap of bringing in specialised solutions each time a new need crops up. Such piecemeal adoption, however, creates multiple issues. First, it can quickly lead to tech stack overload as more niche tools are leveraged, particularly with an ever-expanding range of analytics and intelligence tools on offer. Second, there is a high chance that isolated implementations won’t link smoothly to existing tech, meaning data isn’t actually simple for users to access or activate.  

In 2025, we will see more organisations looking at the wider data management picture. In general terms, that will include reviewing setups as a whole to determine whether they cover all of the crucial elements that drive efficient data handling, from collecting and connecting multi-source information to delivering neatly organised and analysis-ready datasets.

In tandem with increased prioritisation of practical value, additions to tech stacks will also become subject to sharper scrutiny. Looking beyond purely enticing features and their suitability for solving specific problems, firms will evaluate how potential new solutions can fit into current systems, and most importantly, whether they will fulfil promises of enabling simpler data use.

Winning trust with persistent quality  

So, what about trust? Establishing slick data processing engines will ensure key components are in place to quickly get data where it’s needed. Yet amid fast-rising adoption of artificial intelligence (AI), there is also growing awareness that data-fuelled activity can be sent off course if the insights teams and tools are running on aren’t reliable. In fact, Gartner estimates one in three AI projects will fail this year due to subpar data, alongside several other issues.  

Unsurprisingly, data quality is becoming more important: recently ranked as the second biggest data and business intelligence trend for 2025, just behind data security and privacy. Over the next 12 months, increased interest in enhancing data trustworthiness will lead to a much stronger emphasis on tightening governance approaches. 

As part of wider data management assessment, companies will look at where improvements are needed to consistently safeguard data accuracy. During initial data sorting, for example, this might mean swapping any error-prone manual processing with automated cleansing and deduplication. When it comes to sustaining long-term data quality, firms may also opt to follow Gartner’s advice on frequently analysing the information they get from all data sources, in addition to embracing machine assistance to automatically check for and flag any suspect data.
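To make the cleansing and deduplication step concrete, here is a minimal sketch in plain Python. The record fields (“name”, “email”) and the normalisation rules are illustrative assumptions, not any particular vendor’s pipeline; a production setup would typically use a dedicated data quality tool or library.

```python
# Minimal sketch: automated cleansing and deduplication of
# contact-style records. Field names and rules are hypothetical.

def normalise(record):
    """Trim whitespace and lowercase the email so near-duplicates match."""
    return {
        "name": record["name"].strip(),
        "email": record["email"].strip().lower(),
    }

def cleanse_and_dedupe(records):
    """Drop records missing an email, then keep the first copy per email."""
    seen = set()
    cleaned = []
    for record in map(normalise, records):
        if not record["email"]:
            continue  # incomplete row: discard (or flag for review)
        if record["email"] in seen:
            continue  # duplicate of an earlier row
        seen.add(record["email"])
        cleaned.append(record)
    return cleaned

raw = [
    {"name": "Ada Lovelace ", "email": "ADA@example.com"},
    {"name": "Ada Lovelace", "email": "ada@example.com "},
    {"name": "No Email", "email": ""},
]
print(cleanse_and_dedupe(raw))
```

Even a rule set this simple removes the inconsistencies (stray whitespace, mixed casing, missing values) that manual processing tends to miss at scale.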

Companies are long past the time when data-supported proficiency was just a nice to have. In the volatile modern business environment, ensuring team decisions and smart tech outputs are informed by precise, relevant, and up-to-date insight is crucial to keeping performance on track and avoiding missteps. While each organisation’s data maturity will always evolve at its own pace, striving to attain McKinsey’s “data ubiquity” ideal is now a vital goal for all, as is enabling the usability and accessibility it requires.

About the Author 

Alexander Igelsböck is the Chief Executive Officer and Founder at Adverity. Since founding the business in 2016, he has been responsible for driving the growth and development of the company, as well as establishing a global presence for Adverity in the industry. Under his leadership, Adverity has secured over $165m in seed funding and expanded into a global company with offices in Vienna, London, and New York. Prior to joining Adverity, Alex was Managing Partner and Investment Committee member at i5invest. Now, he is an active member of the Forbes Technology Council and ICOM Global Advisory Board.
