Virtualization continues to play the good-cop/bad-cop game in the enterprise, offering tremendous promise alongside significant challenges on the road to a more efficient and effective data ecosystem. At the same time, the hiccups in virtual deployment are hampering that other significant IT development: cloud computing.
After only three months of life, the OpenStack open source cloud computing initiative is out with its first public release of production-quality code.
The first OpenStack release is codenamed Austin and includes both storage and cloud compute fabric technologies that can be used by enterprises to deliver cloud services. Originally an effort kick-started by NASA and Rackspace, OpenStack is now benefiting from the support and contributions of more than 35 technology vendors.
I was floored today when the director of BGI told me they would soon reach a sequencing rate of 1000 (human) genomes per day (so, 10^5 to 10^6 genomes per year is right on the horizon). According to him, they can be profitable at a price of $5k per genome! [Clarification: I later learned this might mean at 10x coverage … not exactly sure, although I tried to get a more precise statement.]
There are not many chief executives who can boast a workforce of half a million people around the globe.
But then Lukas Biewald’s workforce is not your traditional one.
As boss of San Francisco-based CrowdFlower, he says that his company offers "labour on demand".
His employees are crowdsourced – people who work from home, when needed, on specific projects.
"It doesn’t make sense to build a box around people, put in internet and plumbing and everything else, make them drive to work and have managers for them," Mr Biewald says.
A large number of companies could benefit from deploying deduplication solutions in their businesses.
This is according to Data Storage Connection columnist Charles Butler, who explained that many organisations are reporting rapid data growth of a more complex and dispersed nature than previously seen.
He said that deduplication solutions can play a part in tackling this problem, by dramatically reducing bandwidth and storage requirements, as well as centralising backup data to make disaster recovery planning easier to manage.
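The storage and bandwidth savings Butler describes come from storing each unique block of data only once and replacing repeats with references. A minimal sketch of that idea in Python is below, using fixed-size chunks and SHA-256 fingerprints; the function names are hypothetical, and production systems typically add refinements such as variable-size chunking and reference counting.

```python
import hashlib

def dedupe(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks; keep one copy per unique chunk."""
    store = {}    # chunk fingerprint -> chunk bytes, stored once
    recipe = []   # ordered fingerprints needed to rebuild the original
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # only new chunks are stored
        recipe.append(digest)
    return store, recipe

def restore(store, recipe) -> bytes:
    """Reassemble the original data from the chunk store and recipe."""
    return b"".join(store[h] for h in recipe)

# Backup streams are highly repetitive, so the store stays small:
data = b"A" * 4096 * 10 + b"B" * 4096 * 5   # 15 chunks, 2 unique
store, recipe = dedupe(data)
assert restore(store, recipe) == data
stored_bytes = sum(len(c) for c in store.values())
```

In this toy example only two 4 KB chunks are stored (or shipped over the network to a central backup site) instead of fifteen, which is the effect behind the reduced bandwidth and storage requirements described above.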
GigaSpaces Integrates With Citrix OpenCloud Platform To Ease Creation Of Elastic Data Center Environments
Read the headlines and talk to some of the vendors, and it’s easy to believe that every organization is deploying applications to the cloud.
IDC, long a champion of cloud computing, certainly believes that’s where the future lies. More than 18 months ago, Senior Vice President and Chief Analyst Frank Gens claimed that although the adoption rate for clouds at the time was around 15 percent, cloud would account for 25 percent of the net growth of technology from 2011 to 2012, and 30 percent of growth from 2012 to 2013.
Like politics or economics, computing goes through cycles. Corporate computing started out highly centralized on mainframes and other room-size machines, but since the introduction of the PC has been going through swings between the autonomy of the individual computer user and centralized management by the IT department.
A conversation with Nati Shalom, chief technology officer of GigaSpaces, got me thinking about another pendulum swing: the one between making applications optimized for specific hardware vs. being independent of it.
Cloud computing is taking center stage at this week’s Interop New York 2010, with dozens of cloud players showcasing their latest and greatest solutions to make the leap to the cloud a smooth one.