
Can Your Data Connectivity Strategy Handle Data Gravity?

If data connectivity is an airport, data gravity threatens to shut it down. Here is the emerging tech that can keep it open.

Dr. Ofer Markman
February 19, 2024

Data connectivity, the digital world’s infrastructure, influences everything from internet speed to cloud computing efficiency. It’s key for a data center’s operation, with speed and availability being determining factors. However, the pressure on data connectivity infrastructure around the world has become rather intense. 

Perhaps too intense.

Data Connectivity’s Key Challenge

Let’s compare data connectivity to an airport.

In this busy “airport” of data connectivity, large data sets—akin to an airport’s passengers, planes, maintenance crews and warehouses—continue to grow exponentially in volume, creating a phenomenon known as data gravity. Data gravity refers to the inherent difficulty of moving large-scale data across different platforms or locations, which takes a toll on connectivity. Like mass, data gravity draws more data toward it, making the movement of data—the “air traffic”—ever more complex to handle.

AI, Large Language Models (LLMs), and IoT are significant contributors to data gravity’s effect on connectivity. It is estimated that by 2024, with the proliferation of AI tools, enterprises will generate 1.1 million gigabytes per second, requiring over 16,000 exabytes of storage annually. Similarly, the 41.6 billion IoT devices expected to be live and connected to a network by 2025 are projected to generate 79 zettabytes of data. With AI applications projected to grow at an annual rate of 37.3% through 2030, the volume of data generated will only continue to increase.

How to Handle This Data Overflow?

The effects of such data gravity on connectivity highlight the importance of stepping into 2024 with a data connectivity strategy that is AI- and IoT-ready. Without such a strategy, we risk finding ourselves attempting supersonic flights with propeller planes. Companies should take advantage of two emerging technologies to soften the effects of data gravity on their connectivity. The first is edge computing, which mitigates data gravity by processing data closer to its source, reducing how far data must travel. The second is data size optimization, which can be achieved through advanced data compression.

  1. Computing On The Edge

Edge computing, similar to an airport control tower, is a method that processes and stores data near its source. This proximity improves speed and saves bandwidth, allowing for quick data creation and decision-making, thereby reducing latency and network congestion.

The importance of edge computing is seen in the growth of IoT devices, which produce vast amounts of data. Edge computing uses this data for real-time insights and predictions, reducing the need for constant communication with a central server or cloud. Without it, the data from edge devices would overload most networks. 
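To make the idea concrete, here is a minimal sketch of edge-side aggregation in Python. The function name, window size, and summary fields are illustrative assumptions, not any specific vendor's API: the point is that a device can reduce many raw sensor samples to a few compact statistics locally, so only the summaries cross the network.

```python
from statistics import mean

def summarize_at_edge(readings: list[float], window: int = 10) -> list[dict]:
    """Aggregate raw sensor readings into per-window summaries on the device,
    so only compact statistics are sent upstream instead of every sample."""
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({
            "min": min(chunk),
            "max": max(chunk),
            "mean": mean(chunk),
        })
    return summaries

# 100 raw samples shrink to 10 summaries before leaving the edge.
raw = [20.0 + 0.1 * i for i in range(100)]
print(len(summarize_at_edge(raw)))  # → 10
```

A 10x reduction like this is the simplest form of the bandwidth savings described above; real edge deployments layer filtering, anomaly detection, and local inference on top of the same pattern.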

  2. Lower Size, Lower Gravity

Optimizing for data size, not just transfer speed, becomes more important as data volumes expand. With networks from 2024 onward expected to deliver over 100 gigabits per second, ultra-low latency, and high reliability, reducing the size of the “baggage” we haul in our “planes” matters all the more. A practical approach is a content-aware data compression engine that reduces file size (our “baggage”) in a lossless manner. How fast our planes fly depends on how light they travel, and that is where data compression comes into play, optimizing our load for a smoother, more efficient journey.
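The defining property of lossless compression is that the original bytes are recoverable exactly. The details of any content-aware engine are proprietary, so as a generic illustration, here is a sketch using Python's standard-library zlib (DEFLATE); the function names and payload are assumptions for the example.

```python
import zlib

def compress_lossless(data: bytes, level: int = 9) -> bytes:
    """Compress bytes with DEFLATE at maximum effort; nothing is discarded."""
    return zlib.compress(data, level)

def decompress(blob: bytes) -> bytes:
    """Restore the exact original bytes from the compressed blob."""
    return zlib.decompress(blob)

# Redundant machine-generated data (logs, sensor feeds) compresses well.
payload = b"sensor_id=42,temp=21.5;" * 1000
packed = compress_lossless(payload)

assert decompress(packed) == payload  # lossless: perfect round trip
print(f"{len(payload)} bytes -> {len(packed)} bytes")
```

A content-aware engine goes further than this general-purpose codec by choosing its strategy per file type, but the contract is the same: lighter baggage on the wire, identical baggage on arrival.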

A Path Forward

We are at a critical juncture in the world of data connectivity, where the strong effects of data gravity are an opportunity in disguise. The increasing gravity of AI, LLMs and IoT is a call to action to assess whether a company’s infrastructure can not only cope, but deliver without interruption. By considering increased reliance on edge computing, further investment in network infrastructure, and optimization for data size, companies can mitigate the powerful attraction of data gravity on their connectivity.

So, let’s not just fasten seatbelts and stay in our seats, because in these rapidly evolving airways, staying grounded is not an option.


Dr. Ofer Markman is FILO Systems' VP of Business Development.

