Benchmarks Don't Tell Whole Story About AWS and Public Cloud Competitors

By Tim Zeller | March 07, 2014

Organizations have an increasingly diverse set of choices for public cloud infrastructure. When assessing different solutions, however, buyers should take into account factors other than raw CPU speed. More specifically, they should assess the entire ecosystem of APIs, services and redundancy that surrounds each offering.

On these merits, Amazon Web Services is often the preferred choice over public cloud competitors such as Windows Azure and Google Compute Engine. In addition to high-level services for developers, support for many languages and thousands of available APIs, AWS provides proven scalability that has supported numerous Web-scale operators.

AWS has become so ingrained in the workflows of development and test teams that many of them still want to tap into its core functionality even after shifting some operations to a hybrid or private cloud. With software such as Eucalyptus, they can do just that, continuing to use familiar AWS APIs and making the infrastructure and platform transition much easier and more cost-effective than it would be coming from a different public cloud.
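To make that concrete, here is a minimal sketch of what this compatibility looks like from a developer's point of view. It uses the boto3 Python SDK, and the endpoint URL, region name, credentials and image ID are placeholders rather than values from a real Eucalyptus deployment:

    # Illustrative sketch: the same boto3 code can target AWS or an
    # AWS-compatible private cloud simply by swapping the endpoint.
    # Endpoint URL, region, credentials and image ID are placeholders.
    import boto3

    def make_ec2_client(use_private_cloud):
        if use_private_cloud:
            # Hypothetical EC2-compatible endpoint exposed by the private cloud.
            return boto3.client(
                "ec2",
                endpoint_url="http://cloud.internal.example.com:8773/services/compute",
                region_name="eucalyptus",
                aws_access_key_id="YOUR-PRIVATE-CLOUD-ACCESS-KEY",
                aws_secret_access_key="YOUR-PRIVATE-CLOUD-SECRET-KEY",
            )
        # Public AWS, with credentials taken from the environment.
        return boto3.client("ec2", region_name="us-east-1")

    # Identical API calls work against either environment.
    ec2 = make_ec2_client(use_private_cloud=True)
    response = ec2.run_instances(ImageId="emi-12345678", InstanceType="m1.small",
                                 MinCount=1, MaxCount=1)
    print(response["Instances"][0]["InstanceId"])

The point is not the specific calls but that the client configuration, rather than the application code, decides which cloud the requests go to.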

Speed Tests Don't Tell the Whole Story

InfoWorld recently ran a series of speed tests comparing different public cloud services, including Amazon EC2, Azure and GCE. It used the open source DaCapo benchmarks on three separate Linux configurations - one for each cloud - running the default Java Virtual Machine.
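For readers unfamiliar with the suite, DaCapo ships as a single JAR that is launched directly with the JVM. A minimal timing wrapper along the following lines (the JAR path and benchmark name are assumptions for illustration, not details of the InfoWorld setup) shows the kind of measurement involved:

    # Sketch: time one DaCapo workload with an external wall clock.
    # Assumes a local dacapo.jar and that the "eclipse" workload is included.
    import subprocess
    import time

    def run_dacapo(benchmark, jar_path="dacapo.jar"):
        start = time.perf_counter()
        # DaCapo benchmarks are launched as: java -jar dacapo.jar <benchmark>
        subprocess.run(["java", "-jar", jar_path, benchmark], check=True)
        return time.perf_counter() - start

    elapsed = run_dacapo("eclipse")
    print("eclipse finished in %.1f s" % elapsed)

DaCapo also prints its own internal timings, so in practice one would parse those rather than rely solely on wall-clock time.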

While GCE had the quickest response time, the test's coordinators cautioned that end user results could vary depending on how their public cloud provider was divvying up resources. Buyers have to be sure that they're getting into a cloud that has enough redundancy and services to produce real efficiency gains for dev/test teams.

"[A] close look at the individual [speed] numbers will leave you wondering about consistency," wrote InfoWorld's Peter Wayner, who performed the tests. "Some of this may be due to the randomness hidden in the cloud. While the companies make it seem like you're renting a real machine that sits in a box in some secret, undisclosed bunker, the reality is that you're probably getting assigned a thin slice of a box. You're sharing the machine, and that means the other users may or may not affect you."

Wayner noticed similar variance in areas such as bursting behavior, where 2-CPU AWS machines saw their Eclipse benchmark scores improve by at least 3x, easily outperforming even 8-CPU servers running on other public infrastructure. CPU performance fluctuation was one of the more interesting revelations of the study: Wayner found that buying extra cores didn't always provide as much of a speed boost as anticipated.

Moreover, he argued that there wasn't necessarily a clear correlation between price and performance, and that organizations could find ways to save money while still running a highly efficient, scalable environment. He cited Amazon's reserved instances, which are purchased up front for a one- or three-year term, as a viable way to achieve this goal. The arrangement lowers the effective cost per hour while still letting operators turn machines on and off as needed.
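As a rough illustration of how that trade-off can be inspected programmatically, the EC2 API exposes reserved instance offerings with their upfront and recurring prices. The instance type, platform and region below are arbitrary examples:

    # Sketch: list reserved-instance offerings for one instance type and
    # print the term length, upfront price and recurring hourly charge.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    offerings = ec2.describe_reserved_instances_offerings(
        InstanceType="m3.large",
        ProductDescription="Linux/UNIX",
    )["ReservedInstancesOfferings"]

    for o in offerings:
        hourly = sum(c["Amount"] for c in o.get("RecurringCharges", []))
        years = o["Duration"] // 31536000  # Duration is reported in seconds
        print("%-15s %d yr  upfront $%.2f  recurring $%.4f/hr"
              % (o["OfferingType"], years, o["FixedPrice"], hourly))

Comparing those numbers against on-demand rates for the expected utilization is how the cost-per-hour savings Wayner describes actually get quantified.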

On the cost side, some companies have gone even further and shifted many operations over to an AWS-compatible private cloud via Eucalyptus, which offers granular control over hardware and security. The result is infrastructure that taps into the best of both worlds - AWS's uniquely varied and rich ecosystem on one hand, and the predictable economic model and dedicated performance of the private cloud on the other. Workloads can be shifted between environments, and developers can continue to use EC2, S3, EBS and other services while taking advantage of AWS's global redundancy.
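A hedged sketch of what that portability can look like in practice follows; the object-storage endpoint, credentials and bucket names are placeholders, but the same S3 client API moves data between the private cloud's S3-compatible store and AWS S3 itself:

    # Illustrative only: endpoint URL, credentials and bucket names are placeholders.
    import boto3

    # S3-compatible object store exposed by the private cloud.
    private_s3 = boto3.client(
        "s3",
        endpoint_url="http://objects.internal.example.com:8773/services/objectstorage",
        aws_access_key_id="YOUR-PRIVATE-CLOUD-ACCESS-KEY",
        aws_secret_access_key="YOUR-PRIVATE-CLOUD-SECRET-KEY",
    )
    # Public AWS S3, with credentials taken from the environment.
    public_s3 = boto3.client("s3", region_name="us-east-1")

    # Pull an artifact from the private cloud and push it to AWS with the same API.
    body = private_s3.get_object(Bucket="dev-artifacts", Key="build/app.tar.gz")["Body"].read()
    public_s3.put_object(Bucket="prod-artifacts", Key="build/app.tar.gz", Body=body)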

AWS's scalability is often the tipping point for buyers, especially if they're running a large operation that requires significant resources and consistent uptime. AWS gives them a way to run numerous workloads, while its widely supported APIs ensure that teams retain the flexibility to leverage its advantages even if they shift some tasks to private infrastructure. It's unsurprising that AWS continues to be the central area of interest for current and would-be cloud users.

"Ultimately we didn't feel that the [Azure] roadmap was anywhere near as strong as AWS and just the sheer size of the offering and how many businesses they were supporting, we just didn't feel as confident about Azure as we did about AWS," Just Eat CTO Carlos Morgado told Computing.

Get Started with Eucalyptus

Use FastStart to easily deploy a private cloud on your own machine from a single command!