Description

Fast Data Transfer is a tool for fast upload of data into Azure – up to 4 terabytes per hour from a single client machine. It moves data from your premises to Blob Storage, to a clustered file system, or directly to an Azure VM. It can also move data between Azure regions.

The tool works by maximizing utilization of the network link. It efficiently uses all available bandwidth, even over long-distance links. On a 10 Gbps link, it reaches around 4 TB per hour, which makes it about 3 to 10 times faster than competing tools we’ve tested. On slower links, Fast Data Transfer typically achieves over 90% of the link’s theoretical maximum, while other tools may achieve substantially less.

For example, on a 250 Mbps link, the theoretical maximum throughput is about 100 GB per hour. Even with no other traffic on the link, other tools may achieve substantially less than that. In the same conditions (250 Mbps, with no competing traffic) Fast Data Transfer can be expected to transfer at least 90 GB per hour. (If there is competing traffic on the link, Fast Data Transfer will reduce its own throughput accordingly, in order to avoid disrupting your existing traffic.)
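
As a rough sanity check on those figures (this back-of-envelope arithmetic is ours, not taken from the Fast Data Transfer documentation), the theoretical ceiling of a link follows directly from its bit rate:

    # Convert a link's nominal bit rate into a gigabytes-per-hour ceiling.
    # Ignores protocol overhead, so achievable figures land somewhat lower.
    def gb_per_hour(link_mbps):
        bytes_per_second = link_mbps * 1_000_000 / 8
        return bytes_per_second * 3600 / 1_000_000_000

    print(gb_per_hour(250))     # ~112 GB/hour ceiling; roughly 100 GB/hour once overhead is counted
    print(gb_per_hour(10_000))  # ~4,500 GB/hour (4.5 TB) ceiling; ~4 TB/hour is the figure quoted above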

Fast Data Transfer runs on Windows and Linux. Its client-side portion is a command-line application that runs on-premises, on your own machine. A single client-side instance supports up to 10 Gbps. Its server-side portion runs on Azure VM(s) in your own subscription. Depending on the target speed, between 1 and 4 Azure VMs are required. An Azure Resource Manager template is supplied to automatically create the necessary VM(s).
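
For illustration only, an Azure Resource Manager template is typically deployed from the Azure CLI roughly as follows; the resource group name, template file, and parameter file below are placeholders, not the actual artifacts shipped with Fast Data Transfer:

    # Placeholder names; substitute the template and parameters supplied with the tool.
    az group create --name fdt-servers --location westus2
    az deployment group create \
        --resource-group fdt-servers \
        --template-file FastDataTransfer.json \
        --parameters @FastDataTransfer.parameters.json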

Use Fast Data Transfer if any of the following apply:

  • You know, or suspect, that your existing data transfer tool is not fully utilizing the capacity of your network connection.
  • You have a very high-bandwidth link (e.g. 10 Gbps).
  • You want to load directly to the disk of a destination VM (or to a clustered file system). Most Azure data loading tools can’t send data directly to VMs. Tools such as Robocopy can, but they’re not designed for long-distance links. We have reports of Fast Data Transfer being over 10 times faster.
  • You are reading from spinning hard disks and want to minimize the overhead of seek times. In our testing, we were able to double disk read performance by following the tuning tips in Fast Data Transfer’s instructions.
  • You want to better understand and mitigate the constraints on your upload performance – Fast Data Transfer can show you whether the bottleneck is local disk, network, or destination storage. For each possible bottleneck, Fast Data Transfer offers you appropriate performance-tuning options.
  • You want to throttle your transfers to use only a set amount of network bandwidth.
  • You want fully automatic retry-after-failure.
  • You need to transfer a large number of very small files.
  • You have an ExpressRoute with private peering.

Meet the team

"Fast Data Transfer addresses data transfer challenges that all other tools have struggled with for many years."
Dave Fellows, Azure Batch

Garage Team: George Pollard, John Rusk, Dave Fellows
Azure Batch, New Zealand

Backstory

Thanks to Azure, it’s easy to scale up compute capacity in the cloud. But first you have to get the data into the cloud, and that was a pain point for many customers. So we set out to look for a faster solution, starting with a survey of open-source and commercial products. None was as fast as we wanted, so we started experimenting to see what we could build ourselves. We developed a number of optimization techniques, and achieved upload speeds faster than all the other tools we evaluated. After a private beta program, we’re now launching publicly through the Garage.