2024 – the year we need to focus on less

With each new year we all receive predictions for more of something. In 2024 there will be more AI, more Cloud, more Cyber attacks, more use of this technology or that. More, more, more… How about next year we focus on less?

Cheap compute and storage, increasing Hybrid Cloud adoption, and the exponential growth of AI are leading to data volumes spiralling dangerously out of control. Data volumes are already growing by an average of 50 percent per year in more than half of all companies, and the majority of organisations have infrastructure crammed with data of which, on average, 70 percent of the content is completely unknown.

All that data requires power, and AI needs even more: a ChatGPT query, for example, consumes as much as 10x the power of a standard data search, yet data centre efficiency (PUE) has not improved in line with increased workloads. We know that we live in a time of climate emergency, and yet there is no concerted effort amongst enterprises or the IT industry to drive down those volumes of data. Efficiency and management alone do not solve the underlying problem: we are simply storing too much of everything, for too long. If data were paper, we’d be buried under an Everest of it, but it’s out of sight and out of mind.

Organisations must start the year with resolutions to go on a data diet, cut the fat, get compliant. Their first two actions should be:

Consolidate their data on a common platform instead of operating dozens or even hundreds of separate silos. Once consolidated, the data can be further reduced using standard techniques such as deduplication and compression, with reduction rates of up to 96 percent.

Use AI to index and classify data according to its content and value for the company. Everything that is without value can be deleted.

Although energy-intensive, AI is proving a considerable help in clarifying the content and value of data, enabling organisations to automatically identify obsolete, orphaned and redundant data that can be deleted immediately.

In this unstructured dark data, cat videos sit alongside the menu from the last Christmas party, aged copies of databases and research results, all mixed with data that must be retained for regulatory and commercial purposes. To meet modern energy challenges while continuing to amass data, companies need to focus on indexing, classification, and lifecycle-centric data management.

This data needs to be cleaned, and not only to reduce litigation risk. Anyone who clears out and disposes of data waste will be able to feed their AI with high-quality content and free up space for new data. To do this, the data must be indexed and classified according to its content and value for the company; AI plays a key role here too, classifying content accurately and at pace.


2024 should be the year when we don’t end up with more, but take responsibility to reach the end of the year with less. Far less:

Index the data, and enable reporting to provide accurate curation and empower decisions. Everything that is without value can be deleted: obsolete data, duplicate systems, orphaned files, outdated test systems. Say goodbye to things you really don’t need.

Reduce data volumes using technology; deduplicate and compress data to eliminate redundant copies and lighten certain structures, automatically replacing original data with a thin version. Depending on its type, the amount of data can be reduced by up to 97 percent. Getting thin in the new year.
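As a sketch of the mechanics, deduplication typically hashes content and keeps a single copy per unique hash, while compression shrinks what remains. The following minimal illustration is not any vendor’s implementation; the file contents are purely hypothetical:

```python
import hashlib
import zlib

def deduplicate_and_compress(blobs):
    """Keep one compressed copy per unique content hash."""
    store = {}  # content hash -> compressed bytes
    for blob in blobs:
        digest = hashlib.sha256(blob).hexdigest()
        if digest not in store:  # redundant copies are skipped
            store[digest] = zlib.compress(blob)
    return store

# Ten identical log files reduce to a single compressed entry.
blobs = [b"2024-01-01 INFO backup completed\n" * 100] * 10
store = deduplicate_and_compress(blobs)
raw_size = sum(len(b) for b in blobs)
stored_size = sum(len(c) for c in store.values())
print(f"kept {len(store)} of {len(blobs)} blobs, {stored_size} of {raw_size} bytes")
```

On highly redundant data such as identical copies and repetitive logs, the combined savings are dramatic, which is where headline reduction figures of this kind come from.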

Classify that data; for data owners to make the right decisions, the type, content, and value of the data must be crystal clear. Classify according to your Relevant Records policy. This enables defensible deletion decisions: the decisions you have needed to take but could not, through lack of data intelligence. You will keep only what you need to keep, for the prescribed period, and then automatically delete it. This will reduce your mountains of data, and it will also give you strong intelligence when you experience a cyber event and need to know what has been compromised, encrypted, or taken. AI and machine learning can then be truly enabled to defuse complex problems, their LLMs empowered by solid data.
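To make the idea of defensible deletion concrete, here is a minimal rule-based sketch; the record classes and retention periods are invented for illustration, and any real schedule must come from your own Relevant Records policy:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: record class -> retention period in days.
# A real schedule comes from your organisation's Relevant Records policy.
RETENTION_DAYS = {
    "financial_record": 365 * 7,  # regulators often require multi-year retention
    "project_document": 365 * 2,
    "personal_media": 0,          # no business value: eligible for deletion
}

def deletion_decision(record_class: str, last_modified: date, today: date) -> bool:
    """Return True when the record's retention period has expired."""
    retention = timedelta(days=RETENTION_DAYS.get(record_class, 0))
    return today - last_modified > retention

# A cat video from last week can go; a 2020 financial record must stay until 2027.
today = date(2024, 1, 1)
print(deletion_decision("personal_media", date(2023, 12, 25), today))   # True
print(deletion_decision("financial_record", date(2020, 1, 1), today))   # False
```

The value of classification is precisely that it turns deletion from a judgment call into a policy lookup like this one, which can then run automatically at scale.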

What each individual can do

Every user can also help reduce overall power consumption and slow data growth, because everyone can search through their data in the cloud and delete what is useless. That might be umpteen versions of the same photo, each from a slightly different angle, or videos you once found funny and haven’t watched since. That cat video, perhaps. Every bit we save by reducing our stored data reduces energy consumption. So let’s start cleaning up.

Technological innovations such as AI should also be approached as tools to optimise on-premises and cloud storage through a better understanding of the data they host. When integrated directly into a data management solution, AI can reduce the amount of data stored, and therefore the energy resources consumed.

Mark Molyneux

Mark Molyneux is CTO for EMEA at Cohesity
