Effects of Bandwidth in Cloud Computing

October 9, 2015 | By David

Grazed from BackupTechnology. Author: Editorial Staff.

The term “bandwidth” has been used in electrical engineering for years to mean “the difference between the upper and lower frequencies in a continuous set of frequencies, measured in hertz”. In the early 1990s, telcos started to use “bandwidth” to describe the volume of data handled and defined it as “the transmission rate of data across a network”. Bandwidth in data transmission is measured in bits per second and represents the capacity of a network connection. An increase in capacity generally means improved performance, provided other factors such as latency do not get in the way. We will discuss how bandwidth utilisation affects the challenges associated with cloud computing.

Cloud computing providers usually calculate a customer’s required bandwidth by considering the quantity of bandwidth available as well as the mean bandwidth utilisation needed by the customer’s various applications. In addition, providers factor in transmission latencies to estimate the time required to upload both the initial backup and all subsequent backups. For that reason, Internet-based cloud backup service providers work hard to make the best possible use of the available Internet bandwidth. They also do everything within their power to reduce the amount of data that flows through their pipes…
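To make that calculation concrete, here is a minimal sketch of the kind of back-of-the-envelope estimate described above. The data sizes, link speed and “usable fraction” of the link are hypothetical figures chosen purely for illustration, not values from any particular provider.

```python
# Rough estimate of backup upload time from link capacity and utilisation.
# All figures below are hypothetical and only illustrate the calculation.

def upload_time_hours(data_gb: float, link_mbps: float, usable_fraction: float) -> float:
    """Approximate hours needed to push `data_gb` over a link of `link_mbps`,
    of which only `usable_fraction` is free for backup traffic."""
    data_bits = data_gb * 8 * 1000**3            # decimal gigabytes to bits
    effective_bps = link_mbps * 1000**2 * usable_fraction
    return data_bits / effective_bps / 3600      # seconds to hours

# Initial full backup: 500 GB over a 100 Mbps link with ~60% headroom.
print(f"Initial backup: {upload_time_hours(500, 100, 0.6):.1f} h")
# Subsequent incremental backup: only 5 GB of changed data.
print(f"Incremental backup: {upload_time_hours(5, 100, 0.6):.1f} h")
```

The gap between the two results (roughly 18.5 hours versus about 11 minutes in this example) is exactly why providers care so much about shrinking the data that follows the initial backup.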

Cloud service providers do a number of things to achieve these goals. They can use incremental backup technologies, link load balancing, or binary patching that transmits only a file’s changes and applies them at the other end, so as to reduce or balance the amount of data transmitted. In addition, de-duplication and file compression techniques may be used to decrease the volume of data that is transmitted over the network…
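The following toy sketch shows how block-level de-duplication and compression can shrink what actually crosses the wire. The fixed 4 KB chunk size and the repetitive sample data are assumptions for illustration only; production backup products use far more sophisticated (often variable-length) chunking, delta encoding and indexing.

```python
# Toy illustration of block-level de-duplication followed by compression.
import hashlib
import zlib

def dedupe_and_compress(data: bytes, chunk_size: int = 4096) -> bytes:
    seen: set[str] = set()
    unique = bytearray()
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in seen:           # transmit each distinct block only once
            seen.add(digest)
            unique.extend(chunk)
    return zlib.compress(bytes(unique))  # then compress whatever remains

# Highly repetitive sample: 1 MB built from the same 4 KB block.
sample = (b"A" * 4096) * 256
payload = dedupe_and_compress(sample)
print(f"{len(sample)} bytes reduced to {len(payload)} bytes on the wire")
```

Real workloads are rarely this repetitive, but the principle is the same: the less data that has to traverse the provider’s pipes, the less the available bandwidth becomes a bottleneck.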

Read more from the source @ http://blog.backup-technology.com/14845/effects-bandwidth-cloud-computing/