Why a data strategy underpins a successful AI strategy

AI and machine learning offer exciting innovation capabilities for businesses, from next-level predictive analytics to human-like conversational interfaces for functions such as customer service. But despite these tools’ undeniable potential – Deloitte research indicates that 74% of firms are already testing AI technologies – many enterprises today are unprepared to fully leverage AI’s capabilities because they lack a prioritised data strategy.  

Unstructured data typically makes up over 80% of a company’s data landscape, yet much of this potentially game-changing data remains siloed, underutilised, and harder to find over time – it is, in the words of analyst firm Gartner, “dark data.”

For example, oil and gas firms can drive greater efficiencies in upstream activities by consolidating and better analysing disparate seismic data sources; manufacturers can achieve leaner processes by improving the accessibility of design files, inventory, and quality data; and media companies can transform their content options by gaining a single view of graphics, video, images and post-production files.

Siloed and far-flung unstructured data repositories are one of the biggest enterprise inhibitors to utilising AI effectively – MuleSoft and Deloitte Digital research indicates that 81% of companies assessed believe data silos are holding their company back. By bringing this data into a single, accessible source, organisations of all types can achieve crucial operational and competitive benefits, including:

– Making “dark data” visible for analytics and AI tools 

– Gaining a single source of truth from unstructured data  

– Improving decision-making by reducing information blind spots  

– Enabling organisation-wide collaborations, insights and efficiencies 

– Simplifying regulatory compliance  

– Decreasing management costs by retiring legacy file systems.

Leading hybrid cloud platform providers have identified a set of four key actions – a framework for becoming ‘Fit for AI’ – that helps organisations consolidate their file data and deliver the enterprise intelligence needed for the AI age.

Let’s explore what these steps comprise:  

1) Assessing file data silos for business value and risks

An expert technology partner can help an organisation implement a data assessment that weighs the business value and risks of consolidating file silos across four dimensions:

– Capital costs: the cost of consolidation compared with keeping current arrangements

– Operational costs: the IT time and resources needed for unified data, set against current costs

– Business productivity and revenue: the workforce constraints and negative revenue impacts if siloed data isn’t unified

– Business continuity: the relative risks of consolidated business continuity versus retaining existing file infrastructures.

This approach enables CIOs to fully understand their file data storage environment, assess migration risks, and plan the data migration process.
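As a simple illustration of the inventory work that feeds such an assessment, the sketch below (a hypothetical Python script, not a feature of any particular platform) walks a file share and totals its size and the “cold” data that hasn’t been accessed within a given period:

```python
import os
import time
from collections import defaultdict

def assess_silo(root, cold_days=365):
    """Summarise a file share: total size, file count, and 'cold' data
    (not accessed within cold_days) to feed a consolidation assessment."""
    now = time.time()
    stats = defaultdict(int)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip unreadable files rather than abort the scan
            stats["files"] += 1
            stats["bytes"] += st.st_size
            if now - st.st_atime > cold_days * 86400:
                stats["cold_files"] += 1
                stats["cold_bytes"] += st.st_size
    return dict(stats)
```

Run across each silo (for example, `assess_silo("/mnt/legacy_share")` for a hypothetical mount), the resulting totals give a rough baseline for the capital and operational cost comparison described above.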

2) Rationalising file storage

Expert partners can also help organisations identify the best path to data consolidation. This approach builds an architecture that provides not only full visibility of unstructured file data but also the single source of truth required to adopt AI services successfully – a foundation that ultimately underpins an organisation’s evolving business processes.

3) Securing and protecting consolidated data

As malicious attacks such as ransomware exploits become more sophisticated, CIOs also need to re-evaluate security in the context of AI applications accessing unified data sets, to ensure multi-layered protection around their data assets. Today’s hybrid cloud platforms incorporate a full complement of ransomware protection services. Using such tools, detection starts at the network edge, notifying IT teams of suspicious file patterns, malicious file extensions, and any anomalous behaviour across the organisation, while mitigation policies reduce business impacts before an attack can spread.

Point-in-time recovery is as important as mitigation: it ensures any impacted files can be rapidly recovered, and AI-automated business processes that rely upon the underlying data quickly brought back online. Meanwhile, SIEM integration, audit logs and incident reports keep comprehensive records of threat events.
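To make the detection idea concrete, here is a toy Python sketch – purely illustrative, with made-up extension names, since real platforms use far richer signals – that flags suspect file extensions and directories showing sudden mass modification:

```python
import os
import time

# Illustrative only: example extensions of the kind ransomware appends.
SUSPECT_EXTENSIONS = {".locked", ".encrypted", ".crypt"}

def scan_for_anomalies(root, window_hours=1, churn_threshold=0.5):
    """Flag suspect extensions, plus directories where a large share of
    files changed within a short window (a crude mass-encryption signal)."""
    now = time.time()
    alerts = []
    for dirpath, _, filenames in os.walk(root):
        recent = 0
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.splitext(name)[1].lower() in SUSPECT_EXTENSIONS:
                alerts.append(("suspect_extension", path))
            try:
                if now - os.stat(path).st_mtime < window_hours * 3600:
                    recent += 1
            except OSError:
                continue
        if filenames and recent / len(filenames) >= churn_threshold:
            alerts.append(("high_churn", dirpath))
    return alerts
```

In practice this kind of check runs continuously at the platform level and feeds alerts into the SIEM and incident-report trail described above.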

4) Curating data for AI use

An effective AI strategy requires consolidated, well-governed data foundations. By leveraging specialised data intelligence tools, organisations can refine the data sets used by AI, resulting in higher-quality outputs and interactions. As the adage goes, ‘garbage in, garbage out’!

Integrated dashboards provided by today’s hybrid cloud storage tools can quantify storage consumption down to department or file-type level, helping earmark infrequently accessed data for future archival.
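A minimal sketch of such a per-file-type breakdown – assuming plain filesystem access rather than any vendor dashboard – might look like:

```python
import os
from collections import Counter

def consumption_by_type(root):
    """Break down storage consumption (bytes) by file extension,
    mimicking the per-file-type view a storage dashboard provides."""
    usage = Counter()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower() or "(none)"
            try:
                usage[ext] += os.stat(os.path.join(dirpath, name)).st_size
            except OSError:
                continue  # skip files that vanish or are unreadable mid-scan
    return usage.most_common()  # largest consumers first
```

Sorting largest-first makes it easy to see which file types dominate consumption and which are candidates for archival.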

Modern AI-ready search tools simplify data curation with powerful indexing and efficient structuring of content for actionable insights, and can further validate the curated dataset to ensure its quality and usability for downstream applications.
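As an illustration of the indexing step, this toy Python function builds a minimal inverted index (term to file paths) over text files; production search tools are of course far more sophisticated, but the principle is the same:

```python
import os
import re
from collections import defaultdict

def build_index(root):
    """Build a minimal inverted index mapping each term to the set of
    text files that contain it, so curated data becomes searchable."""
    index = defaultdict(set)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8") as f:
                    text = f.read()
            except (OSError, UnicodeDecodeError):
                continue  # skip binary or unreadable files
            for term in re.findall(r"[a-z0-9]+", text.lower()):
                index[term].add(path)
    return index
```

Looking up `index["seismic"]`, say, returns every indexed file mentioning that term – the kind of retrieval step AI applications rely on once data is curated.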

Today’s data management tools integrate fully with organisations’ existing identity management systems. This helps IT teams review group permissions and access control lists, and build effective company-wide security protocols as AI tools are tested and adopted.
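A simple stand-in for one part of such a permissions review – assuming a POSIX filesystem, since real ACL and identity-system integration is much richer – could flag world-writable files before AI tools are given access:

```python
import os
import stat

def audit_permissions(root):
    """Flag world-writable files (o+w mode bit), a basic hygiene check
    to run before AI tools gain access to a consolidated data set."""
    findings = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.stat(path).st_mode
            except OSError:
                continue
            if mode & stat.S_IWOTH:
                findings.append(path)
    return findings
```

Anything this flags is writable by any user on the system, and so worth tightening before a pilot AI deployment touches the data.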

Effective data strategies must also accommodate new unstructured data generated and accessed at the “edge” daily. When data is consolidated from the edge to the core, AI algorithms can build predictive models based on comprehensive data profiles while receiving real-time edge data and historical context from the unified repository. This enables more accurate real-time insights and operational decision-making. 

‘Fit for AI’

A ‘Fit for AI’ framework can underpin a data management strategy that enables organisations to prepare their dispersed and unstructured file data for AI use cases, while ensuring that risks are contained and that data is secure for AI implementations. As data volumes grow exponentially and AI tools proliferate, effective data management is an enabler of AI success, delivering new insights from consolidated corporate data that can transform companies’ processes and their ability to compete.

Jim Liddle

Jim is Chief Innovation Officer Data Intelligence and AI at Nasuni. A seasoned entrepreneur and executive leader, Jim has 25+ years’ experience and is an expert in big data and AI innovation.
