Big Data

Cloud Computing: SAP CEO Bill McDermott - 'It's A Hana World'

Grazed from TalkinCloud. Author: CJ Arlotta.

SAP CEO Bill McDermott took center stage today at Sapphire Now 2015 to expand on the enterprise resource planning (ERP) company's vision of making the world a better place and improving people's lives by introducing data-driven, seamless solutions into the marketplace.

"Data-driven businesses must be seamless," McDermott told conference attendees. "When you think seamlessly, you can be a champion." With more than 20,000 customers in attendance, Sapphire Now, being held this week in Orlando, Florida at the Orange County Convention Center (OCCC), promoted the concept of digitally transforming businesses by keeping things simple for its customers, something McDermott harped on during his keynote address last year at SAP Sapphire Now...

Big Data is Driving HPC to the Cloud

Grazed from ScientificComputing. Author: Leo Reiter.

Once upon a time, high performance computing (HPC) was mainly about one thing: speed. It started with fast processors, memory, bus fabrics and optimized software algorithms to take advantage of them. We ran FORTRAN-based computational benchmarks powered by LINPACK, which to this day still factors into the TOP500 list of supercomputers.

We soon learned of limiting factors such as heat, power and the pesky speed of light. Seymour Cray realized that “anyone can build a fast CPU. The trick is to build a fast system.” We responded with massively parallel systems made up of lots and lots of very fast components. All was good in the world...

Cloud, Mobility & BI: Interop's Applications Track

Grazed from InformationWeek. Author: Andrew Conry Murray.

Mobility and cloud computing are transforming application delivery and development. IT has to keep up with these transformations to ensure that end users and customers have a good experience regardless of where the application runs. At the same time, IT is responsible for security, governance, compliance and the reams and reams of data these applications consume and generate. Interop’s Applications track helps IT meet this tall order with a mix of real-world case studies and in-depth workshops and tech sessions. Here’s a sample of what you’ll find.

Transforming Data Into Information And Knowledge

This workshop delves into data analytics and how it can be applied to change the way work is planned, accomplished and improved. The workshop looks at industry examples, including healthcare, to show how raw data can be transformed into actionable information. It will also provide an overview of the different data analytics technologies available, and share tips on how to get a data analytics project off the ground...

Using the Cloud to Manage Large Data Sets

Grazed from ProductDesignDevelopment. Author: Dan O'Neill.

Having the ability to gather large quantities of data does not necessarily translate into also having the ability to use that data in a meaningful way. While technological advances have made it less expensive to collect data, a need remains for a way to store and analyze it.

Virtually every industry collects data on a regular basis for analyzing and monitoring various conditions. Industrial manufacturers, aerospace and defense companies, commercial businesses – they are all constantly monitoring and collecting information from systems to help with equipment health, performance envelopes, and business operations...

Data Captured by IoT Connections to Top 1.6 Zettabytes in 2020, As Analytics Evolve from Cloud to Edge, Says ABI Research

Grazed from BusinessWire. Author: Editorial Staff.

A new report from ABI Research estimates that the volume of data captured by IoT-connected devices exceeded 200 exabytes in 2014. The annual total is forecast to grow seven-fold by the decade’s end, surpassing 1,600 exabytes—or 1.6 zettabytes—in 2020. Principal Analyst Aapo Markkanen says, “The data originating from connected products and processes follows a certain journey of magnitudes.

The yearly volumes that are generated within endpoints are counted in yottabytes, but only a tiny fraction of this vast data mass is actually being captured for storage or further analysis. And of the captured volume, on average over 90% is stored or processed locally without a cloud element, even though this ratio can vary greatly by application segment. So far, the locally dealt data has typically been largely inaccessible for analytics, but that is now starting to change.”...
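To put the quoted magnitudes in perspective, here is a rough back-of-the-envelope check of the growth rate these figures imply. It is only a sketch: the 2014 volume is taken as roughly 200 exabytes (the report says only that it "exceeded" that figure) and the 2020 forecast as 1,600 exabytes.

    # Back-of-the-envelope check of the ABI Research figures quoted above.
    # Assumption: 2014 volume taken as ~200 EB (the report only says it
    # "exceeded" that figure); 2020 forecast taken as 1,600 EB (1.6 ZB).
    eb_2014 = 200
    eb_2020 = 1_600
    years = 2020 - 2014

    growth_factor = eb_2020 / eb_2014            # ~8x on these rounded numbers
    cagr = growth_factor ** (1 / years) - 1      # implied compound annual growth

    print(f"total growth: {growth_factor:.1f}x")
    print(f"implied CAGR: {cagr:.0%}")           # roughly 40% per year
    print(f"2020 volume: {eb_2020 / 1000:.1f} zettabytes")

On these rounded numbers, the forecast works out to roughly 40 percent compound annual growth in captured IoT data between 2014 and 2020.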

Cloud-Based Analytics - Coming Clash of the Titans

Grazed from DataScienceCentral. Author: Naagesh Padmanaban.

From all indications, 2015 is well on its way to becoming the year of cloud computing. The fever pitch of activity among key players, along with data and observations from industry pundits, affirms this. There are apparently a handful of reasons keeping IT industry leaders awake at night.

For starters, 2014 revenues for cloud services reportedly grew by 60 percent. The global cloud computing market, per Forrester, is expected to grow to over $191 billion by 2020. IDC also foresees a robust market for cloud computing services. The cloud computing stacks - Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS) - are all driving the healthy cloud computing market growth...

Cloud Computing: Microsoft Is Crunching Huge Data To Foresee Traffic Jams Up To An Hour Before

Grazed from UberGizmo.com. Author: Editorial Staff.

Traffic jams are part of our everyday problems, and there is no denying that much of our time is spent suffering through them. What if we told you that soon you may be able to foresee these jams? And no, it is not an April Fools' gimmick. Reportedly, Microsoft is working on research that could forecast traffic jams.

The company has collaborated with the Federal University of Minas Gerais, which is among Brazil's largest universities, to initiate the research effort, known as the Traffic Prediction Project. If all goes well, then we could foresee a jam up to an hour before it happens...
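The article gives no detail on the methodology behind the Traffic Prediction Project, but the general idea, predicting whether a road segment will be congested an hour from now based on recent observations, can be sketched with a toy classifier. Everything below (the features, the labelling rule, the data itself) is synthetic and purely illustrative; it is not the actual research model.

    # Toy sketch of hour-ahead congestion prediction. All features, labels and
    # data are synthetic and hypothetical; this is NOT the actual research model.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5_000

    # Hypothetical per-segment observations: current average speed (km/h),
    # vehicle count, hour of day, and whether it is raining.
    speed = rng.uniform(10, 110, n)
    volume = rng.integers(50, 2_000, n)
    hour = rng.integers(0, 24, n)
    rain = rng.integers(0, 2, n)
    X = np.column_stack([speed, volume, hour, rain])

    # Synthetic label: "congested one hour from now" when traffic is already
    # slow and heavy during busier hours, or when it is raining.
    y = (((speed < 40) & (volume > 1_000) & ((hour % 12) >= 7)) | (rain == 1)).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))

In a real system the features would come from streaming road-sensor, GPS and incident data rather than random numbers, and the labels from congestion actually observed an hour later; the point here is only the shape of the problem, features now, label one hour ahead.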

Cloud Computing: New report shows MongoDB to be leader of the NoSQL database pack

Grazed from CloudTech. Author: James Bourne.

A report from United Software Associates (USAIN) has found MongoDB to be top of the pile of NoSQL database providers in benchmark testing. The research tested three leading products – Cassandra, Couchbase and MongoDB – through the Yahoo! Cloud Serving Benchmark (YCSB). USAIN wanted to assess the durability of each, on the theory that most applications should prioritise durability over performance rather than accept data loss. The databases were put through the wringer in three configurations: throughput optimised, durability optimised, and balanced.

In workload A (50% read, 50% update) with the throughput-optimised configuration, MongoDB hit 160,719 operations per second under the YCSB benchmark, ahead of Cassandra (134,839) and Couchbase (106,638). With workload B's 95% read and 5% update mix, MongoDB again came out on top with 196,498, ahead of Couchbase (187,798) and Cassandra (144,455)...
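For readers curious what a workload-A-style mix looks like in practice, below is a minimal, single-threaded sketch of a 50% read / 50% update loop against MongoDB using pymongo. It is not the YCSB harness itself, and the connection string, collection name and document layout are assumptions chosen purely for illustration.

    # Minimal sketch of a YCSB workload-A-style mix (50% reads, 50% updates).
    # Not the YCSB harness; connection string, collection name and document
    # layout are illustrative assumptions. Requires a reachable mongod.
    import random
    import time

    from pymongo import MongoClient

    coll = MongoClient("mongodb://localhost:27017")["ycsb_demo"]["usertable"]

    NUM_KEYS = 1_000
    NUM_OPS = 10_000

    # Preload documents so reads have something to find.
    for i in range(NUM_KEYS):
        coll.update_one({"_id": f"user{i}"}, {"$set": {"field0": "x" * 100}}, upsert=True)

    start = time.time()
    for _ in range(NUM_OPS):
        key = f"user{random.randrange(NUM_KEYS)}"
        if random.random() < 0.5:        # 50% reads
            coll.find_one({"_id": key})
        else:                            # 50% updates
            coll.update_one({"_id": key}, {"$set": {"field0": "y" * 100}})
    elapsed = time.time() - start

    print(f"{NUM_OPS / elapsed:,.0f} ops/sec (single-threaded, unscientific)")

A durability-optimised run, which is what the report was really probing, would additionally use stronger write concerns (for example a majority write concern with journalling) so that acknowledged writes survive node failures, typically at some cost to raw throughput.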

The cloud for clouds: IBM and The Weather Company work on big data weather forecasts

Grazed from ZDNet. Author: Colin Barker.

IBM and The Weather Company want to use big data, the cloud, and the Internet of Things to improve weather forecasting for businesses. As part of a new deal between the companies, The Weather Company will shift its massive weather data services platform to the IBM Cloud and integrate its data with IBM analytics and cloud services.

The deal reflects how competition in the cloud market is heating up too: The Weather Company is a close partner with Amazon Web Services (AWS) and Bryson Koehler, the CIO/CTO for The Weather Company, told ZDNet: "I believe in the multi-cloud story and believe that any serious cloud-based business or application needs to be built in a cloud-agnostic way."...

Why big data's big promises are finally within reach

Grazed from CloudTech. Author: Adam Spearing.

Let’s face it - until very recently big data has been a big letdown. Data warehouses and data analytics tools have historically proven difficult to design, build, and maintain. How much storage space will be necessary? How much data is there? What data management tools can the organisation afford and, just as important, what expertise is available in-house to build and run the data warehouse or data analytics platform?

InformationWeek recently outlined eight reasons why big data projects often fail. The article cited a survey from Gartner that found an astonishing 92% of organisations are stuck in neutral when it comes to their big data initiatives. Why? Because enterprises are spending a lot of money on big data technologies, or plan to, but don’t have the right skills or strategies in place to drive the initiatives forward...