Amazon S3

N2WS Extends Data Protection into Amazon S3, Saving Customers up to 40%

N2WS, a Veeam company, introduces Amazon Elastic Block Store (Amazon EBS) snapshot decoupling and the N2WS-enabled Amazon Simple Storage Service (Amazon S3) repository. With the release of N2WS Backup & Recovery v2.4, customers can reduce storage costs by up to 40 percent, and Amazon Web Services (AWS) users can choose from different storage tiers to cut the cost of data retained for longer terms.

"We are excited that N2WS is introducing snapshot decoupling into the Amazon S3 repository. This will help us to drastically reduce our storage costs," said Jamie MacDonald, head of platform and security at ZoneFox. "We manage over 700 snapshots with varying life cycling requirements, so this integration allows us to increase our backup retention period while paying less."

Enterprises can realize significant cost savings with backup lifecycle management and the ability to move snapshots into an N2WS-enabled Amazon S3 repository. Companies archiving data for compliance can also benefit from a cost-effective approach to long-term data retention on AWS. For Managed Service Providers (MSPs), N2WS v2.4 provides an opportunity to lower storage costs for their clients while improving overall service delivery through effective data management.
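
N2WS's repository format is proprietary, but the cost mechanics behind this kind of lifecycle management are visible in plain S3. As a rough illustration only (the bucket name and transition schedule below are hypothetical), a boto3 lifecycle rule can move aging backups into progressively cheaper storage classes:

```python
import boto3

# Illustration only: the bucket and schedule are hypothetical, and this
# uses native S3 lifecycle rules rather than N2WS's own repository.
s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-repository",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-long-term-backups",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},
                # Step aging backups down to cheaper storage classes.
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```

Moving long-retention data out of S3 Standard and into infrequent-access or archive classes is where savings of this magnitude typically come from.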

Amazon Web Services Announces AWS Ground Station

Today at AWS re:Invent, Amazon Web Services, Inc. (AWS) announced AWS Ground Station, a new service that makes it easy and cost-effective for customers to download data from satellites into AWS Global Infrastructure Regions using a fully managed network of 12 ground station antennas located around the world. Once customers receive satellite data at a ground station, they can immediately process it in an Amazon Elastic Compute Cloud (Amazon EC2) instance, store it in Amazon Simple Storage Service (Amazon S3), apply AWS analytics and machine learning services to gain insights, and use Amazon's network to move the data to other regions and processing facilities. Getting started with AWS Ground Station takes just a few clicks in the AWS Management Console to schedule antenna access time and launch an Amazon EC2 instance to communicate with the satellite. There are no up-front payments or long-term commitments, no ground infrastructure to build or manage, and customers pay by the minute for antenna access time used. To get started with AWS Ground Station, visit https://aws.amazon.com/ground-station.
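
Scheduling antenna time is a console workflow in the announcement, but the same step can be scripted. The sketch below is a hypothetical example using the boto3 groundstation client; the mission profile and satellite ARNs are placeholders that a real account would get from Ground Station onboarding:

```python
import datetime

import boto3

# Hypothetical sketch: the ARNs below are placeholders, and a real
# contact reservation requires an onboarded satellite and mission profile.
gs = boto3.client("groundstation", region_name="us-east-2")

# List the managed antenna sites available to the account.
stations = gs.list_ground_stations()["groundStationList"]
print([s["groundStationName"] for s in stations])

# Reserve a "contact": a window of antenna time for a satellite pass.
start = datetime.datetime(2019, 1, 1, 12, 0)
gs.reserve_contact(
    missionProfileArn="arn:aws:groundstation:us-east-2:111122223333:mission-profile/EXAMPLE",
    satelliteArn="arn:aws:groundstation::111122223333:satellite/EXAMPLE",
    groundStation=stations[0]["groundStationName"],
    startTime=start,
    endTime=start + datetime.timedelta(minutes=10),
)
```

During the reserved window, an Amazon EC2 instance running the customer's modem and processing software receives the downlinked data.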

Satellites are being used by more and more businesses, universities, and governments for a variety of applications, including weather forecasting, surface imaging, and communications. To do this today, customers must build or lease ground antennas to communicate with the satellites. This is a significant undertaking and expense, because customers often need antennas in multiple countries to download data when and where they want it, without waiting for the satellite to pass over a desired location. The antennas are just the beginning of the infrastructure requirements: customers also need servers, storage, and networking close to the antenna to process, store, and transport the satellite data, and then business rules and workflows to organize, structure, and route the data to employees or customers before it can deliver insight. All of this requires significant capital investment and ongoing operational cost to build, manage, and maintain antennas, compute infrastructure, and business logic at each antenna location.

Chaos Sumo Releases Industry Report on AWS S3 Blind Spots and New Data Lake Use Cases

Grazed from Chaos Sumo

Chaos Sumo, a cloud-based log data retention and analytics service for object storage, today released the findings of The State of Object Storage 2018 Report: The Emergence of the AWS S3 Data Lake. As object storage such as AWS S3 continues to gain enterprise momentum, with over 70 percent of companies reporting that they use it today, it offers untapped opportunities for promising new use cases such as historical log analytics and application and media hosting. More than one third of respondents to the survey, conducted by Chaos Sumo in December-January 2018, are also looking to object storage to streamline and enable data lake usage for historical trend analysis and machine learning. The study also found that the top barriers to S3 innovation are the lack of tools that enable data access and visibility, and the cost of moving data around in order to analyze growing volumes of disparate object storage data with accuracy and at scale.

"The current inability of businesses to perform consistent, longitudinal and easy trend and predictive analysis in object storage, including log analytics, is resulting in critical business information being thrown away or archived in an inaccessible manner," says Thomas Hazel, founder and CTO of Chaos Sumo. "This hidden culprit - the increasing costs of storing data for real- or near-time analysis, is the core impediment to doing more with the growing amount of data stored in object storage such as AWS S3, and Chaos Sumo is here to tackle this head on."

Lacework Enables AWS Customers to Rapidly Implement Security Best Practices and Proactively Identify S3 Buckets at Risk

Grazed from Lacework

Lacework, the industry's first solution to bring automation, speed and scale to cloud security, today announced new features that enable Amazon Web Services (AWS) customers to easily and continuously maintain an AWS cloud configuration that is compliant with proven security best practices. Lacework now automatically reports on the configuration's adherence to the Center for Internet Security (CIS) Benchmark for AWS.

Lacework has also introduced security controls targeted at AWS S3 buckets, enabling AWS customers to rapidly identify S3 buckets at risk or compromised due to misconfiguration. Through targeted auditing of S3 configurations, Lacework verifies that all buckets follow best practices for logging, encryption and versioning, and then provides continuous monitoring using AWS CloudTrail events and workload activity analysis.
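
Lacework's checks themselves are proprietary, but the S3 settings it audits (logging, encryption, versioning) are all readable through the public AWS API. A rough sketch of that style of audit in plain boto3:

```python
import boto3
from botocore.exceptions import ClientError

# A rough sketch of an S3 configuration audit: flag any bucket missing
# versioning, access logging, or a default-encryption rule.
s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]

    versioned = s3.get_bucket_versioning(Bucket=name).get("Status") == "Enabled"
    logged = "LoggingEnabled" in s3.get_bucket_logging(Bucket=name)

    try:
        s3.get_bucket_encryption(Bucket=name)
        encrypted = True
    except ClientError as err:
        # Raised when no default-encryption configuration exists.
        if err.response["Error"]["Code"] != "ServerSideEncryptionConfigurationNotFoundError":
            raise
        encrypted = False

    if not (versioned and logged and encrypted):
        print(f"{name}: versioning={versioned}, logging={logged}, encryption={encrypted}")
```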

"Deploying new initiatives to the public cloud brings a spectrum of new security challenges that many organizations are not yet familiar with," said Dan Hubbard, Chief Security Architect, Lacework. "Starting with the daily validation of the AWS configuration (AWS accounts and AWS resources such as S3 buckets), to the continuous monitoring of workloads deployed on AWS, the Lacework cloud security platform enables organizations to safely migrate data to AWS and deploy applications in AWS."

What have we learned from the Amazon AWS and Microsoft Azure outages?

Article Written by David Marshall

In the past month, both AWS and Microsoft Azure have experienced lengthy outages due to issues with the storage supporting their clouds. These outages rippled through companies globally and brought into focus the potential downsides of putting all your eggs in the public cloud basket. After both incidents, the question is: how reliable is the public cloud, and what options and alternatives are available?

A few technology experts have offered their opinion:

The best of both worlds is a multi-cloud strategy

Amazon Corrects Massive AWS S3 Cloud Outage While Vendors React

Article Written by David Marshall

Last Tuesday, parts of the Internet came to a grinding halt when the servers that powered them suddenly vanished. The vanished servers were part of Amazon S3, Amazon's popular cloud storage service.

When that incident happened, several big and popular services and Web sites were disrupted, including DraftKings, Gizmodo, IFTTT, Quora, Slack and Trello.

According to the Web site monitoring firm Apica, 54 of the largest online retailers experienced performance impairments on their Web sites, with some slowing down by more than 20 percent; three sites went down completely (Express, Lulu Lemon, One Kings Lane); and for affected websites, the average slowdown was 29.7 seconds, with pages taking 42.7 seconds to load.

What happened?

"At 9:37 a.m. PST, an authorized S3 team member using an established playbook executed a command which was intended to remove a small number of servers for one of the S3 subsystems that is used by the S3 billing process," Amazon said.  "Unfortunately, one of the inputs to the command was entered incorrectly and a larger set of servers was removed than intended.  The servers that were inadvertently removed supported two other S3 subsystems."

Those subsystems are important. One of them, the index subsystem, "manages the metadata and location information of all S3 objects in the region," according to Amazon; without it, services that depend on it couldn't perform basic data retrieval and storage tasks. The second, the placement subsystem, allocates storage for new objects and "manages allocation of new storage and requires the index subsystem to be functioning properly to correctly operate."

In Search of S3: Read the Fine Print

Article Written by Jon Toor, CMO of Cloudian  

For businesses using and creating applications for the Internet, it is becoming more difficult to ignore Amazon Simple Storage Service (S3). S3 is the massively scalable, cost-effective cloud storage solution developed specifically to house the huge influx of data created by organizations around the world. Amazon S3 commands twice the market share of all its closest competitors combined and is likely to be the storage platform of choice for on-premises hybrid or private cloud deployments for years to come. 

S3 has become the standard for cloud storage. Almost every application connects to S3, and most storage vendors have either announced S3 connectivity or are working toward it. In addition to Amazon, a number of competing storage implementations are S3-compliant, including Google Cloud Storage, OpenStack Swift, Rackspace's Cloud Files and Ceph. These services use the same standard programming interface but have different underlying technologies and business models.
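
Because these implementations speak the same S3 wire protocol, switching providers can be as simple as pointing the client at a different endpoint. A hedged sketch with boto3, where the endpoint and credentials are placeholders for whichever S3-compliant service is in use:

```python
import boto3

# Placeholder endpoint and credentials: any S3-compliant service
# (OpenStack Swift gateways, Ceph RGW, etc.) can stand in here.
client = boto3.client(
    "s3",
    endpoint_url="https://objects.example-provider.com",
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

# The application code is identical to talking to Amazon S3 itself.
client.put_object(Bucket="demo-bucket", Key="hello.txt", Body=b"hello world")
print(client.get_object(Bucket="demo-bucket", Key="hello.txt")["Body"].read())
```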

The rise of S3, which Amazon describes as "cost-effective object storage," has also helped to drive the adoption of object storage. In addition to Amazon, household brands such as Facebook, Netflix, Dropbox and Twitter all use object storage. It's also deployed by enterprises for applications that require massive amounts of unstructured data, including content media storage, bioinformatics, data analytics, private cloud, file distribution and sharing, and backup and archiving. 

Minio Introduces Cloud Native Object Storage Server

Grazed from Minio

Minio today announced the general availability of its distributed object storage server built for cloud applications and DevOps. The solution enables applications to manage massive quantities of unstructured data, and lets cloud and SaaS application developers adopt emerging cloud hosting providers such as Digital Ocean, Packet and Hyper.sh with Amazon S3-like capabilities. Minio's object storage server is now production ready, with major features such as erasure coding, bitrot detection and lambda notification, and has grown in popularity among the Docker, Mesos and Kubernetes communities due to its cloud-native architecture.
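
Working against a Minio server looks much like working against S3 itself. A minimal sketch with Minio's Python SDK, where the host and credentials are placeholders for your own deployment:

```python
from minio import Minio  # pip install minio

# Placeholder host and credentials for a self-hosted Minio server.
client = Minio(
    "minio.example.com:9000",
    access_key="EXAMPLE_KEY",
    secret_key="EXAMPLE_SECRET",
    secure=True,
)

# Create a bucket if needed, then upload a file as an object.
if not client.bucket_exists("backups"):
    client.make_bucket("backups")

client.fput_object("backups", "db-dump.tar.gz", "/tmp/db-dump.tar.gz")
```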

"Minio is a valued partner of Mesosphere and a leading voice on the topic of storage in the DC/OS community," said Florian Leibert, CEO of Mesosphere. "With its future-proof and developer-friendly distributed object storage offering, Minio solves a real problem for our joint customers, and this latest release continues their history of innovation."

Completing the Storage Stack

AWS Launches Amazon Athena

Grazed from Amazon Web Services, Inc.

Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), today announced Amazon Athena, a serverless query service that makes it easy to analyze data directly in Amazon Simple Storage Service (Amazon S3) using standard SQL. With a few clicks in the AWS Management Console, customers can point Amazon Athena at their data stored in Amazon S3 and begin using standard SQL to run queries and get results in seconds. With Amazon Athena there are no clusters to manage and tune, no infrastructure to set up or manage, and customers pay only for the queries they run. Amazon Athena scales automatically – executing queries in parallel – so results are fast, even with large datasets and complex queries. To get started with Amazon Athena, visit https://aws.amazon.com/athena.
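
The console flow described above is also available programmatically. A small sketch of the Athena query lifecycle with boto3; the database, table, and result location are placeholders:

```python
import time

import boto3

# Placeholders: "example_db", "access_logs", and the output bucket
# would be your own Athena database, table, and S3 results location.
athena = boto3.client("athena")

query_id = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) FROM access_logs GROUP BY status",
    QueryExecutionContext={"Database": "example_db"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)["QueryExecutionId"]

# Athena runs the query serverlessly; poll until it finishes.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

# Results land in S3 and are also pageable through the API.
if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```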

Unitrends Boomerang Named 2015 Cloud Computing Product of the Year

Grazed from Unitrends

Unitrends, the leader in cloud empowered continuity solutions, today announced that TMC, a global, integrated media company, has named Unitrends Boomerang a 2015 Cloud Computing Product of the Year. This award, presented by Cloud Computing magazine, honors vendors with the most innovative, useful and beneficial cloud products and services that have been brought to market in the past year.

Boomerang is an easy-to-use and cost-effective virtual appliance for replicating VMware virtual machines (VMs) to the Amazon Web Services (AWS) cloud for low-cost backup and disaster recovery. Boomerang also automatically converts replicated VMs into native Amazon Machine Images (AMIs) for fast execution, providing an ideal solution for migrating from on-premises systems to the cloud and for Disaster Recovery as a Service (DRaaS).
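
Boomerang's conversion pipeline is its own, but AWS's native VM Import offers a rough analogue to the VM-to-AMI step. The sketch below assumes a VMDK disk image has already been uploaded to S3; the bucket and key are placeholders:

```python
import boto3

# Rough analogue only: Boomerang's mechanism is proprietary. This uses
# AWS VM Import with a placeholder bucket/key holding an uploaded VMDK.
ec2 = boto3.client("ec2")

task = ec2.import_image(
    Description="Replicated VMware VM",
    DiskContainers=[
        {
            "Description": "Primary disk",
            "Format": "VMDK",
            "UserBucket": {
                "S3Bucket": "example-replica-bucket",
                "S3Key": "replicas/app-server.vmdk",
            },
        }
    ],
)

# Import runs asynchronously; the task eventually yields a native AMI.
print(task["ImportTaskId"])
```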