An introduction to data gravity



A couple of months ago, our President and Co-Founder John Tkaczewski had the honour of presenting an introduction to data gravity at the NAB virtual conference. While the concept of data gravity has been around for a few years, it isn’t talked about very often, especially in relation to our business – file transfer.

Coined in 2010 by Dave McCrory, the basic concept of data gravity is that as data accumulates, it attracts more applications, services, and tools. Once the data grows large enough, it becomes almost impossible to move, so the services and applications are pulled towards the data instead.

As throughput to the data increases and latency decreases, the gravitational pull of the data mass also increases, so applications and services are drawn ever closer to the data.
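Data gravity is an analogy rather than an exact formula, but the relationship above can be sketched in a few lines. The function below is purely illustrative (the name, units, and the Newtonian-style inverse-square latency term are our own assumptions, not McCrory's published definition): pull grows with data mass and throughput, and falls off sharply as latency rises.

```python
# Illustrative sketch only: data gravity is an analogy, not a precise formula.
# The function name, units, and inverse-square latency term are hypothetical,
# chosen to loosely mirror Newtonian gravity.

def data_gravity_pull(data_mass_gb: float,
                      throughput_gbps: float,
                      latency_ms: float) -> float:
    """Toy score: bigger data and faster links pull harder; latency weakens the pull."""
    if latency_ms <= 0:
        raise ValueError("latency must be positive")
    return data_mass_gb * throughput_gbps / latency_ms ** 2

# The same 500 GB dataset exerts a far stronger pull over a low-latency link
# than over a high-latency WAN link:
near = data_gravity_pull(data_mass_gb=500, throughput_gbps=10, latency_ms=5)
far = data_gravity_pull(data_mass_gb=500, throughput_gbps=10, latency_ms=100)
assert near > far
```

This is why reducing effective latency (as acceleration software aims to do) loosens the pull: in the sketch, shrinking `latency_ms` raises the score quadratically, while the data mass itself stays put.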

Since FileCatalyst is in the business of accelerated file transfer, it's important to note how data gravity fits into all of this. Our file transfer acceleration software reduces the overall effects of data gravity, giving you more flexibility in where you put your data and how you move it.

Data gravity will still exist, but its effect is reduced by eliminating the latency component. The gravitational pull will exist towards every storage location, but with faster-moving data the owner will have more choices about where to store it. The need for faster file transfers will only grow as cloud services continue to expand and as data volumes and network links get bigger.

To learn more about data gravity and FileCatalyst, I encourage you to take a few minutes to watch John's presentation, and click here to learn more about FileCatalyst's accelerated file transfer solutions.