June 15, 2013

Cloud Computing: Google’s balloon-powered Internet takes flight – behind the scenes with Project Loon

By David

Grazed from ITWorld.  Author: Tom Spring.

Google is bringing new meaning to the phrase "cloud computing." No, Google is not rolling out a new SaaS solution. Instead, it is launching Project Loon, which aims to bring Internet access to every corner of the globe via high-altitude balloons. Yes, that’s right: it’s called Project Loon, as in "a crazy person," as Merriam-Webster defines the word. But it’s June, and this is not an elaborate April Fool’s joke.

Google’s Project Loon is an ambitious experiment to use a network of high-altitude balloons to bring Internet access to parts of New Zealand that would otherwise not have Web access. It’s a test of Google’s larger ambitions to pioneer efforts to bring the Internet to other parts of the world including Africa where millions do not have access to the Internet…

June 15, 2013

What the data reveals about how to make SaaS secret sauce

By David

Grazed from PandoDaily.  Author: Ben Sesser.

Life would be much easier if a great product were the only requirement for a great business. Of course, talk to any vineyard owner or long-time New York Times shareholder and they can attest that this is not the case. Software-as-a-service companies are no different. As challenging as it is to build great software someone is willing to buy, other pieces must fall into place for success.

In SaaS, three critical pieces must fall into place to create long-term value: price, customer acquisition cost, and churn. You need a sustainable mix of what customers will pay for your product, how much it costs to acquire them, and how long you can keep them…
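The interplay of those three numbers can be sketched in a few lines. The figures below are hypothetical, and the simple lifetime-value formula (monthly price divided by monthly churn rate) is a common back-of-the-envelope model, not anything prescribed by the article:

```python
def lifetime_value(monthly_price, monthly_churn):
    """Expected revenue per customer. With 3% monthly churn the
    average customer sticks around ~33 months, so a $100/month
    product is worth roughly $3,333 over its lifetime."""
    return monthly_price / monthly_churn

def ltv_to_cac(monthly_price, monthly_churn, cac):
    """Ratio of lifetime value to customer acquisition cost; a common
    rule of thumb holds that a sustainable SaaS business keeps this
    comfortably above 3."""
    return lifetime_value(monthly_price, monthly_churn) / cac

# Hypothetical product: $100/month, 3% monthly churn, $1,000 CAC
ltv = lifetime_value(100, 0.03)
ratio = ltv_to_cac(100, 0.03, 1000)
```

Raising price, cutting acquisition cost, or reducing churn each moves the ratio in the same direction, which is why the article treats the three as one interlocking system.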

June 15, 2013

Regain your visibility and control over cloud computing costs

By David

Grazed from Network World.  Author: Linda Musthaler.

When cloud computing was in its infancy, proponents raised interest in the technology by telling companies they could save money by running their applications on infrastructure they didn’t own or operate. In fact, some companies wouldn’t even need their own data center anymore! Amazon, Rackspace or some other service provider could provide all the computing capacity a company could ever want. And SaaS providers could deliver ready-to-use enterprise applications that companies could rent by the month.

With a value proposition like that, some companies started to jump on the bandwagon, using cloud computing in all its forms. Software, security, infrastructure, platforms — all delivered as a convenient service.  But the downside involved losing visibility and control over computing. Now, departments and even individuals can engage a cloud service without the approval or even the knowledge of the IT department…

June 14, 2013

Microsoft secures Azure cloud services with multi-factor authentication

By David

Grazed from V3.co.uk. Author: Daniel Robinson.

Microsoft has added much-needed multi-factor authentication to its Windows Azure cloud computing platform, enabling organisations to secure access to any Azure services used by workers, partners and customers. Available now, Active Authentication enables multi-factor authentication for identities in Windows Azure Active Directory, the cloud-based service that provides identity and access capabilities for applications and other resources on Windows Azure itself.

Active Authentication requires users to authenticate themselves at sign-in using an app on their mobile device or via an automated phone call or text message. This extra step helps prevent unauthorised access to data and applications in the cloud, Microsoft said…
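Microsoft has not published the internals of Active Authentication, but the app-based second factor it describes is typically a time-based one-time password. As a generic illustration only, not Microsoft’s implementation, here is a minimal RFC 6238 TOTP generator and verifier using nothing but the Python standard library:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """RFC 6238 time-based one-time password: the kind of rotating
    code a mobile authenticator app displays."""
    key = base64.b32decode(secret_b32)
    counter = int((for_time if for_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

def verify(secret_b32, submitted, window=1, step=30):
    """Accept codes from adjacent time steps to tolerate clock drift
    between the server and the user's phone."""
    now = time.time()
    return any(hmac.compare_digest(submitted, totp(secret_b32, now + i * step, step=step))
               for i in range(-window, window + 1))
```

The RFC 6238 test vectors confirm the arithmetic: with the ASCII secret `12345678901234567890` (base32 `GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ`), the 8-digit code at Unix time 59 is `94287082`.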

June 14, 2013

Does location matter when it comes to cloud infrastructure?

By David

Grazed from PR NewsWire. Author: PR Announcement.

You might think that in cyberspace, the servers might as well be anywhere. But even in the era of cloud computing, physical location matters. So where are your IT services truly safe? Cloud computing has taken the world by storm in recent years. More and more companies are transferring their data and applications to external providers, saving infrastructure costs and freeing up IT staff for strategic tasks. And the Internet is making geographical distances irrelevant – opening up a global market for cloud solutions. So as long as the service is attractively priced and available when required, does it really matter where the servers are located?

Offshore can be unsure

Well, yes, it does. There’s a firm consensus among experts that mission-critical enterprise data needs to be stored in a secure data centre, where it is safe from cybercrime and industrial espionage. But the site of the data centre is important too. When it comes to weighing up the risks, the old real-estate adage – location, location, location – is a good rule of thumb…

June 14, 2013

European and US cloud providers go head-to-head after NSA revelations

By David

Grazed from ITWorld. Author: Mikael Ricknäs.

European cloud providers think the U.S. spy scandal will result in more enterprises choosing local alternatives over the likes of Amazon Web Services and Rackspace, which, on the other hand, are adamant that they aren’t taking part in programs such as Prism.

The debate over U.S. access to cloud data that the Patriot Act helped fuel has once again become a hot topic in the wake of revelations about surveillance programs such as Prism, under which the U.S. government is said to have access to data on servers supplied by Google, Facebook, Microsoft, Yahoo, Apple and Skype…

June 14, 2013

Red Hat Escalates Private Cloud Fight With VMware

By David

Grazed from InformationWeek. Author: Charles Babcock.

Red Hat is offering companies with a big stake in Linux an alternative to building their private clouds with either VMware or Windows Server. It has combined its Red Hat Enterprise Linux (RHEL) with the open source code modules of OpenStack to produce its own cloud computing platform.

In effect, Red Hat would like its success with an enterprise version of Linux to translate into a second generation of success in private cloud computing. At its Red Hat Summit user group meeting this week in Boston, it announced the combination of RHEL and OpenStack as "Red Hat Enterprise Linux OpenStack Platform."…

June 14, 2013

First Social Cloud Management Tool Aims to Lessen Impact of Cloud Silos

By David

Grazed from Huffington Post. Author: Kevin Ducoff.

While plenty of enterprises have now stored their information in the cloud, data silos still loom as a major problem. According to a recent Oracle survey, 54 percent of IT executives have experienced downtime in the past six months, forced to stop work when cloud applications were not properly integrated with other apps across the enterprise.

Cloud silos emerge when data is stored in separate servers or data centers and can’t interact with other systems. The reduction in efficiency means enterprises may be failing to meet their potential, but a new cloud management tool aims to change that…

June 14, 2013

Cloud Computing: IBM to Support Linux KVM Virtualization on Power Systems

By David

Grazed from eWeek. Author: Jeffrey Burt.

IBM officials are looking to accelerate the adoption of Linux in the data center and are taking a number of new steps to push along the effort. At the Red Hat Summit in Boston, IBM officials said the company will support the Kernel-based Virtual Machine (KVM) virtualization hypervisor technology in the Power servers that run Linux. In addition, IBM in July will open two Power Systems Linux Centers in the United States, which will help software developers to more easily build applications that leverage Linux and IBM’s Power 7+ chip technology.

The new KVM support and the new Linux centers, announced June 11, are the latest steps by IBM officials to enhance the use of Linux on their Power systems. Big Blue’s efforts in this area date back more than 10 years, when officials started pushing their Linux-on-Power initiative…

June 14, 2013

Samplify APAX Storage Library Accelerates Disk Throughput & Storage Capacity for HPC, Big Data, and Cloud Computing

By David

Grazed from IT News Online. Author: PR Announcement.

Samplify, the leading intellectual property company for accelerating memory, storage, and I/O bottlenecks in computing, consumer electronics and mobile devices, announces the availability of its APAX HDF (Hierarchical Data Format) Storage Library for high-performance computing (HPC), Big Data, and cloud computing applications. With APAX HDF, HPC users can accelerate disk throughput by 3-8X and reduce the storage requirements of their HDF-enabled applications without having to modify their application software. The APAX HDF Storage Library works with Samplify’s APAX Profiler tool to analyze the inherent accuracy in each dataset being stored, and applies the recommended encoding rate to maximize acceleration of algorithms with no effect on results.
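APAX itself is proprietary, but the trade-off it exploits, encoding numeric data at only the bit-rate its real accuracy requires, can be illustrated with a toy fixed-rate quantizer. This sketch is not Samplify’s algorithm; it only shows how cutting 32-bit float samples down to 8-bit codes yields a 4:1 size reduction at a bounded loss of precision:

```python
def encode(samples, bits):
    """Map each sample onto a uniform grid of 2**bits levels spanning
    the data range; return the integer codes plus the parameters
    needed to reconstruct approximate values."""
    lo, hi = min(samples), max(samples)
    levels = (1 << bits) - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return [round((s - lo) / scale) for s in samples], lo, scale

def decode(codes, lo, scale):
    """Reconstruct approximate samples from the integer codes."""
    return [lo + c * scale for c in codes]

# 8-bit codes in place of 32-bit floats: a 4:1 size reduction
samples = [i / 100 for i in range(100)]
codes, lo, scale = encode(samples, 8)
recon = decode(codes, lo, scale)
max_error = max(abs(a - b) for a, b in zip(samples, recon))
# Quantization error is bounded by half a grid step (scale / 2)
```

Where this sketch hard-codes 8 bits, the real library’s Profiler analyzes each dataset to pick an encoding rate that keeps the error below what the application can tolerate.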

"Our engagements with government labs, academic institutions, and private data centers reveal a continuous struggle to manage an ever-increasing amount of data," says Al Wegener, Founder and CTO of Samplify. "We have been asked for a simpler way to integrate our APAX encoding technology in Big Data and cloud applications. By using plug-in technology for HDF, we enable any application that currently uses HDF as its storage format to get the benefits of improved disk throughput and reduced storage requirements afforded by APAX."…