Amazon super-sizes instances to lure Hadoop users to Web Services

April 1, 2014 By David

Grazed from ITWorld. Author: Mikael Ricknäs.

Amazon Web Services hopes to entice more Hadoop users to its Elastic MapReduce service with new virtual servers, one of which has 262GB of memory and 6.4TB of storage for big-data analytics. On Tuesday, the company launched 12 new virtual servers, or instances, that organizations can use to run their applications on Elastic MapReduce clusters. Potential applications include Web indexing, data mining, log file analysis, financial analysis, scientific simulation and bioinformatics research.

Hadoop is an open-source platform that allows for the distributed processing of large data sets across clusters of computers. The MapReduce framework assigns work to nodes in the cluster. Amazon’s compute-optimized c3.8xlarge virtual server is aimed at tasks such as image processing. It has 32 vCPUs (virtual CPUs), 64GB of memory, two 320GB SSD volumes and 10Gbps network connectivity…
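To make the MapReduce model concrete, here is a minimal single-process word-count sketch in Python. The sample input and function names are illustrative only; a real Hadoop job distributes the map, shuffle and reduce phases across the nodes of the cluster rather than running them in one process.

```python
from collections import defaultdict

# Illustrative input: lines of text, as a Hadoop job might read from HDFS.
lines = ["the quick brown fox", "the lazy dog", "the fox"]

# Map phase: emit (word, 1) pairs for each input line.
def map_phase(line):
    return [(word, 1) for word in line.split()]

# Shuffle phase: group intermediate pairs by key (the word).
grouped = defaultdict(list)
for line in lines:
    for word, count in map_phase(line):
        grouped[word].append(count)

# Reduce phase: sum the counts collected for each word.
def reduce_phase(word, counts):
    return (word, sum(counts))

result = dict(reduce_phase(w, c) for w, c in grouped.items())
print(result["the"])  # 3
```

In a real cluster the framework, not the programmer, handles the grouping and the assignment of map and reduce tasks to nodes, which is what lets the same job scale from a laptop-sized data set to the multi-terabyte workloads these instances target.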

The price tag is US$0.270 per hour, plus, from $1.680 per hour, the cost of the corresponding EC2 (Elastic Compute Cloud) instance…
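Using the article's figures, the total hourly cost of a cluster can be estimated by adding the Elastic MapReduce surcharge to the EC2 on-demand rate per instance; the sketch below is a back-of-the-envelope calculation only, and actual prices vary by region and pricing model.

```python
# Rates quoted in the article for the c3.8xlarge instance (US$ per hour).
EMR_RATE = 0.270  # Elastic MapReduce surcharge
EC2_RATE = 1.680  # starting on-demand EC2 price

def hourly_cost(nodes):
    """Estimated total hourly cost for a cluster of `nodes` c3.8xlarge instances."""
    return nodes * (EMR_RATE + EC2_RATE)

print(round(hourly_cost(10), 2))  # 19.5
```

So a ten-node c3.8xlarge cluster would run roughly $19.50 per hour at these rates, before any storage or data-transfer charges.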

Read more from the source @ http://www.itworld.com/cloud-computing/412492/amazon-super-sizes-instances-lure-hadoop-users-web-services
