The Role of the Shipping Parcel in a Data Management Strategy

May 27, 2016 Pieter Kinds

The most basic function of IT is moving data from place to place: getting information from where it is to where it needs to be. And, as IT professionals, we are expected to transfer that data as quickly and reliably as possible.

IT professionals have a toolbox full of protocols and transports to get data moving. “Alphabet soup” options like HTTP/S, FTP, SFTP, SCP, SSH, rsync, and Wi-Fi are all well known and widely used, but even they just scratch the surface of the range of technological choices available.

The trick is picking the right tool at the right time, and knowing how to use it. Such choices affect performance and the ultimate success or failure of a given task. These results, like it or not, come back to reflect on your company’s core competencies.

It might surprise you, then, that perhaps the biggest data-transfer method that my company, Datto, uses daily isn’t taught in a computer science classroom. There is no pool of guru candidates with highly sought-after certifications and expertise for us to tap. The acronyms associated with our data-transfer channel of choice would stand out like a proverbial sore thumb among the ranks of officially sanctioned information technology protocols. Even 150 years ago, Charles Babbage himself surely used a version of this familiar method.

The parcel. A simple shipping box. Why, in this age of broadband Internet, fast fiber, and wireless connections, are we relying heavily on a mechanical method with a transfer rate measured in miles per hour?

Parcels usually involve FedEx for our team, but other alphabet-soup carriers such as UPS, DHL, the USPS, and even Australia Post play a part, handling their fair share of Datto-bound cardboard boxes. Collectively, these carriers help us move hundreds of physical hard drives and terabytes (TB) of data between our clients and data centers every single day. They follow a simple protocol: deliver these parcels in a timely manner, with no damage to the data.

Mirroring the data that clients store locally in an offsite data center is the linchpin of a disaster-recovery strategy. However, mirroring this data in the cloud presents a classic “chicken or egg” scenario:

  • If all of your initial data is local on your device, how do you move it quickly offsite to an empty server?

  • If your offsite server is empty, will you bog down your local network resources while trying to move base image information into the cloud via the Internet?
The sooner you can achieve parity between local and offsite data sets, the better your overall backup strategy will be. On many device deployments, where the local data set is small enough and the Internet bandwidth good enough, normal transfer protocols can quickly achieve local and offsite parity on their own, with no outside help needed. This is the best-case scenario.

But best case is not usually the case. Backup providers have to figure out a way to quickly plant the data where it needs to be. After putting aside magic, rainbows, pixie dust, and the instant appearance of universal fiber broadband connections, there really is only one way to do it. Moving terabytes of data quickly, safely, reliably, and cheaply means moving it physically.
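The arithmetic behind that claim is easy to check. Here is a minimal back-of-the-envelope sketch; the drive size (8 TB), uplink speed (100 Mbps), and overnight transit time are illustrative assumptions, not Datto’s actual figures:

```python
def upload_seconds(data_bytes, mbps):
    """Time to push data over a link of the given megabits-per-second rate."""
    return data_bytes * 8 / (mbps * 1_000_000)

def effective_mbps(data_bytes, transit_seconds):
    """Throughput a shipped drive achieves, averaged over its transit time."""
    return data_bytes * 8 / transit_seconds / 1_000_000

DRIVE = 8 * 10**12   # one 8 TB drive, in decimal bytes
DAY = 86_400         # seconds in a day

# Uploading 8 TB over a 100 Mbps uplink ties up the line for about a week:
print(upload_seconds(DRIVE, 100) / DAY)   # ≈ 7.4 days

# Shipping the same drive overnight (24 h door to door) averages out to:
print(effective_mbps(DRIVE, DAY))         # ≈ 740 Mbps
```

The parcel wins by a wide margin on these assumptions, and the gap only grows with the data set, since transit time stays roughly constant while upload time scales linearly with volume.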

Enter the parcel.

One way to solve the data-transfer problem is to create a fleet team responsible for the logistics behind the preparation, testing, stocking, shipping, receiving, syncing, wiping, billing, and maintenance of parcels. Companies can ship several petabytes of raw storage capacity this way. As more and more people store more and more data, the need for cloud storage grows with it. Unless we all magically wake up with fiber broadband connections tomorrow morning, the “need for seed” will surely continue.

Bill Chellis is RoundTrip manager at Datto. He has been with the company for four years, having previously served as technical support engineer, evening support manager and cloud operations manager. Prior to Datto, Chellis worked at CBIZ Network Solutions, Cablevision and the United States Postal Service. Bill loves Linux, the Beatles, and cooking. He currently resides in Connecticut with his wife and stepson.

