Is Parcel Protocol Right for Your Data Transfer Needs?
Even in a cloud-dominated world, the parcel remains perhaps the simplest, most common-sense answer for moving bulk data between a client and an off-site data center.
June 24, 2016
Bill Chellis is the RoundTrip seed drive manager at Datto.
In a constantly connected world, we take for granted the methods for getting data from one place to another. It just happens so fast, seemingly in the blink of an eye. And the vast number of data transfer protocols IT professionals use to make this magic possible are improved every day.
It is safe to say that HTTP and HTTPS do much of the heavy lifting for applications on the web. But there is a truly exhaustive menu of protocol choices available, each tailored to a specific use and benefit. Companies can pick whichever option works best for their circumstances and move forward with confidence.
Frequently, though, the best option for file replication and off-site backup isn’t a matter of using the fastest Internet connection or consuming the most bandwidth. Bulk data bound for the data center can often find a simpler, more cost-effective ride. And that ride takes advantage of a tried-and-true technology that’s been in use for centuries.
It’s the parcel – which, even in a cloud-dominated world, remains perhaps the simplest, most common-sense answer for moving bulk data between a client and an off-site data center.
Off-site backup strategies work best when parity between local data sets and off-site servers is achieved as quickly as possible. When a local data set is compact enough to move quickly with an Internet-based transfer protocol, off-siting it that way is ideal. But that isn’t always the case. Larger data sets call for careful planning, and you need to answer two conflicting questions:
How can you best move all of your local data to off-site storage quickly?
Can you do this transfer without overwhelming your local network resources?
The transfer needs to happen quickly, but it can’t limit the speed of your local network; employees need to keep working, after all. Moreover, downtime is never acceptable, even if it happens in the name of creating an off-site backup. Shipping data off-site in parcels means companies can move entire hard drives – data buckets measured by the terabyte – to wherever they need to go, quickly.
It would, of course, be faster if entire servers could be replicated at the push of a button. The likelihood that this can be achieved safely and quickly without any issues or network delays is small, though. After all, how many companies have end-to-end fiber broadband connections capable of handling this workload at the desired speed?
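To put the bandwidth question in perspective, it helps to run the numbers. The sketch below is a rough, illustrative calculation, not a benchmark; the data set sizes, the 100 Mbps uplink and the 80 percent efficiency factor are all assumptions chosen to show the shape of the problem.

```python
# Back-of-the-envelope comparison: pushing a full initial seed over a shared
# business uplink vs. putting a drive in an overnight parcel.
# All figures below are illustrative assumptions, not measurements.

def transfer_days(data_tb: float, uplink_mbps: float, efficiency: float = 0.8) -> float:
    """Days needed to upload `data_tb` terabytes over an `uplink_mbps` link.

    `efficiency` discounts protocol overhead and the share of the link
    left over after normal daytime business traffic.
    """
    data_bits = data_tb * 1e12 * 8                  # decimal terabytes -> bits
    effective_bps = uplink_mbps * 1e6 * efficiency  # usable bits per second
    return (data_bits / effective_bps) / 86_400     # seconds -> days


if __name__ == "__main__":
    for tb in (2, 10, 40):
        days = transfer_days(tb, uplink_mbps=100)
        print(f"{tb:>3} TB over a 100 Mbps uplink: ~{days:.1f} days "
              "(an overnight parcel takes roughly 1-2 days regardless of size)")
```

Even with generous assumptions, a multi-terabyte seed ties up the connection for a week or more, which is exactly the window a shipped drive closes.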
Using traditional shipping methods means companies can get their off-site backups up and running in a matter of days. FedEx, DHL and the like are not usually counted among IT vendors, and they are not the names that come to mind when you talk about transferring data files. Yet they’ve helped countless organizations get their backup plans in place.
Off-site replication is a core service for most backup providers. It makes sense to use standard shipping methods for data seeding when the amount of client data is large enough. After the initial replication has occurred, smaller follow-up incremental backups can typically be sustained through the plethora of data transfer protocols we usually associate with the Internet.
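The flip side of that calculation is why the hybrid model works: once the seed drive has arrived and been synced, only the changed data has to travel over the wire. The figures in this minimal sketch (a 2 percent daily change rate on a 10 TB data set, 2:1 compression and the same 100 Mbps uplink) are again assumptions for illustration, not vendor specifications.

```python
# Minimal sketch: once the initial seed has been shipped, a night's worth of
# incremental changes is small enough to send over the same connection.
# Change rate, compression ratio and link speed are assumed, illustrative values.

def nightly_window_hours(changed_gb: float, uplink_mbps: float,
                         compression_ratio: float = 0.5) -> float:
    """Hours needed to upload one night's incremental of `changed_gb` gigabytes,
    assuming compression/deduplication shrinks the payload by `compression_ratio`."""
    payload_bits = changed_gb * 1e9 * 8 * compression_ratio
    seconds = payload_bits / (uplink_mbps * 1e6)
    return seconds / 3_600


if __name__ == "__main__":
    # Roughly 2% of a 10 TB data set changing per day yields ~200 GB of deltas.
    hours = nightly_window_hours(changed_gb=200, uplink_mbps=100)
    print(f"~200 GB of daily changes over 100 Mbps: about {hours:.1f} hours, "
          "well inside an overnight backup window")
```

In other words, ship the mountain once, then trickle the changes over the wire.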
Backup and disaster recovery vendors need to specialize in whichever cloud-based protocols their customers prefer, of course. Simple backups and replications need to happen almost instantly. But vendors also must be realistic: sometimes shipping data devices to and from specific locations is the best answer. So as much as they need the expertise and capability to move data with a mouse click, they also need logistics teams – fleet groups responsible for preparing, testing, stocking, shipping, receiving, syncing, wiping, billing and managing parcels. When this concept is baked into the way a backup vendor operates, customers can be certain their larger file transfers are handled with care and precision.
Hybrid approaches to data storage and backup have emerged as the standard for many companies. Companies need a significant portion of their file transfers and replication to happen instantly, and backup and disaster recovery vendors have built cloud options into their offerings to account for this market demand. Taking the time to build out a parcel protocol, however, is just as important. Vendors with a dedicated, defined standard for shipping larger data sets demonstrate a clear commitment to customer satisfaction and premium service.
It’s funny to think about, really. Moving data from one location to another has been a hallmark of IT for decades, and the methods for doing so have become increasingly advanced. People can now send more data, in less time, farther around the globe than ever before. Despite all of this, having a formalized process and standards for shipping data sets and devices in parcels can still differentiate one backup vendor from another. Have you considered taking advantage of the parcel protocol? It just might be the “whole package” answer you’ve been looking for.