Google offers data crunching in the cloud

November 16, 2011 | By David
Grazed from Tech.Blorge.   Author: Editorial Staff.

Google has publicly launched BigQuery, a cloud-based data analysis service. It could be another example of cloud computing's economies of scale slashing costs, though security and privacy concerns have already been raised…

The logic behind BigQuery is relatively simple: by running the service in the cloud, Google can spread hardware and start-up costs across all customers, meaning it can charge them based on the amount of data they want analyzed, rather than imposing the huge minimum costs that make such services uneconomical for smaller firms. The service can currently cope with datasets of up to 70 terabytes.

Google hasn’t yet announced pricing. The service has now moved from beta testing to free public access, though for the moment it is available by application only. The company says it will give 30 days’ notice before introducing fees.

In response to feedback from the beta tests, Google says it has built a new interface that makes results easier to work with, so users don’t necessarily have to download files. It has also created a new API that makes it simpler to incorporate the service into a company’s own code and to run multiple jobs at once.
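For developers, using that API boils down to sending a SQL-style query over an authenticated HTTP request and reading back JSON results. As a rough sketch (the endpoint shape follows BigQuery's published REST API; the project ID, sample table, and access token below are placeholders, not details from this article, and specifics may have changed since publication), assembling a query submission might look like this:

```python
import json

# Placeholder project ID; a real call would use your own Google project.
PROJECT_ID = "example-project"

# Synchronous-query endpoint, per the BigQuery v2 REST API layout.
ENDPOINT = (
    "https://www.googleapis.com/bigquery/v2/"
    "projects/" + PROJECT_ID + "/queries"
)


def build_query_request(sql, timeout_ms=10000):
    """Assemble URL, headers, and JSON body for a BigQuery query call.

    Returns the pieces without sending anything, so the request can be
    inspected or dispatched with any HTTP client.
    """
    body = {"query": sql, "timeoutMs": timeout_ms}
    headers = {
        # Placeholder token; real requests need an OAuth2 access token.
        "Authorization": "Bearer <OAuth2-access-token>",
        "Content-Type": "application/json",
    }
    return ENDPOINT, headers, json.dumps(body)


url, headers, payload = build_query_request(
    "SELECT word, COUNT(*) AS n FROM publicdata:samples.shakespeare "
    "GROUP BY word ORDER BY n DESC LIMIT 10"
)
print(url)
print(payload)
```

Separating request construction from dispatch like this also makes it easy to queue several such payloads and submit them concurrently, which is the "multiple jobs at once" scenario the new API is meant to support.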

There’s certainly some skepticism about the service, however. One industry analyst questioned the wisdom of companies trying the service out and becoming reliant on it, only to be left in the lurch if the eventual pricing turns out higher than expected.

Another suggested Google might find itself stuck between two types of audience: those who want complete control to carry out highly specialized analysis that Google can’t provide, and “lighter” users who may need more help understanding the results and could find that lacking in Google’s more automated service.

There’s also the issue that the usual concerns about security in cloud computing are heightened by wariness of Google’s general appetite for the data it handles.