Cloud Computing Cost Benefits – Cash Flow Impact

We’re heavy users of cloud computing technologies, primarily Amazon Web Services (AWS) and services built on top of it. The decision to embrace this technology has been very important to our client projects for many reasons; here is the first of the major ones (we’ll cover the others in later posts).

Cloud Computing Reduces Costs & Improves the Cash Flow of an Uncertain Project

Many of the projects we work on begin with some kind of prototype to confirm the client’s expectations or project goals. If the project goes well, the client moves on to a larger roll-out and larger subsequent phases; if it doesn’t, the project ends. Because of this, there is a lot of variability and uncertainty in any estimates used to project infrastructure requirements or pricing models. Our preferred method of managing this risk is to use a variable-cost compute model and leading services (IaaS, PaaS, SaaS, etc.). While doing this, we develop everything to be standards-compliant so we can switch to a better option in the future as the client’s needs change.

Examples:

  • Recently we completed a prototype/proof-of-concept project with an expected internal user base of around 300 users. About a week after launch we ended up with 750+ internal users due to the popularity of the program. We were able to scale up transparently once there was sufficient demand (not before), with zero downtime. We also did this using a variable-cost model, so the ‘estimation risk’ to the client was greatly reduced.
  • We designed the infrastructure for a site that hosted and displayed user-submitted videos for a large internet-based comedy contest. There was no accurate way to predict how popular the contest would be, so we used an entirely variable-cost model to build and deploy the solution. The solution used a few different AWS services, allowing the site to scale up and down based on actual demand and traffic, which we believed was a reasonable proxy for revenue.
  • We’re currently working on an advanced prototype project that does some complex analysis on ~210GB of structured and unstructured data. Again, the project is a prototype with an uncertain future. Instead of taking a leap and estimating what size hardware to buy, or making any other decisions based on limited information, we’re using Amazon Web Services to build a cluster of servers to test our product. Our projections show we’ll be able to build the cluster for about 80-90% less than the cost of a single server. This estimate doesn’t factor in other items necessary to make a complete comparison, but it is a good starting point; a rough sketch of this kind of comparison follows this list.
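To make the arithmetic behind that kind of projection concrete, here is a minimal back-of-the-envelope sketch. Every price, node count, and run time in it is an assumed placeholder, not an actual figure from the project above; the point is only the shape of the calculation.

```python
# Hypothetical comparison: renting a short-lived analysis cluster on demand
# vs. buying one physical server up front. All numbers are placeholders.

NODE_HOURLY_RATE = 0.85        # assumed on-demand price per node, USD/hour
NODE_COUNT = 8                 # assumed cluster size for the prototype
HOURS_PER_RUN = 6              # assumed length of one analysis run
RUNS_DURING_PROTOTYPE = 20     # assumed runs before a go/no-go decision

SERVER_PURCHASE_PRICE = 7_000.0  # assumed cost of a comparable single server

cluster_cost = NODE_HOURLY_RATE * NODE_COUNT * HOURS_PER_RUN * RUNS_DURING_PROTOTYPE
savings = 1 - cluster_cost / SERVER_PURCHASE_PRICE

print(f"On-demand cluster cost for the prototype: ${cluster_cost:,.2f}")
print(f"Single-server purchase price:             ${SERVER_PURCHASE_PRICE:,.2f}")
print(f"Savings vs. buying hardware:              {savings:.0%}")
```

With these illustrative inputs the cluster comes in around 88% cheaper than the hardware purchase, which is how a figure in the 80-90% range falls out of this type of calculation.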

Conclusion

There are certainly some situations where ‘the cloud’ doesn’t make sense financially or operationally, but generally, if the pricing model fully factors everything in and takes into account a measure of uncertainty, the benefit is clearly on the side of cloud services like AWS. We’ll go through an example cost model in the future to further clarify this. Until then, the AWS pricing tool should give some good ideas on pricing comparisons, especially in situations needing lots of computational power for a short time. It’s pretty clear that for certain use cases, the cloud is the only way to go from a pricing standpoint.

It’s a whole different animal to estimate the requirements and cost model for a well-defined and well-modeled project. If the project you’re working on has high variability (i.e., a high standard deviation around the demand estimate), then it’s almost certainly more cost-effective to go with a cloud solution. Even if the costs are equal or slightly higher, we feel it’s cheap insurance and safety to go with a more adaptive solution; the sketch below shows why in expected-cost terms.
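Here is a minimal sketch of that point under stated assumptions: the demand scenarios, probabilities, and per-server prices are all made up for illustration, and the cloud price per server is deliberately set higher than the fixed price to show that variability, not unit price, is what drives the result.

```python
# Sketch of why high demand variability favors a variable-cost model.
# Scenarios, probabilities, and prices are assumed for illustration only.

scenarios = [
    # (probability, servers actually needed)
    (0.50, 2),    # prototype stays small
    (0.35, 6),    # moderate adoption
    (0.15, 20),   # the project takes off
]

FIXED_COST_PER_SERVER = 250.0   # assumed monthly cost of owned hardware
CLOUD_COST_PER_SERVER = 300.0   # assumed monthly on-demand cost (set higher)

# Fixed infrastructure must be sized to the worst case up front.
peak_servers = max(need for _, need in scenarios)
fixed_cost = peak_servers * FIXED_COST_PER_SERVER

# Variable-cost infrastructure only pays for what each scenario actually uses.
expected_cloud_cost = sum(p * need * CLOUD_COST_PER_SERVER for p, need in scenarios)

print(f"Fixed build-out sized for peak demand: ${fixed_cost:,.2f}/month")
print(f"Expected on-demand spend:              ${expected_cloud_cost:,.2f}/month")
```

Even with the higher per-server price, paying only for the demand that actually materializes comes out well ahead of building out for the peak.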


Great New Amazon Web Services (AWS) Announcement – DynamoDB

Solid Logic has been using Amazon Web Services (AWS) since 2008 with great results. Today was a big day for the AWS team: they launched a new NoSQL service, DynamoDB, around noon. Like the other AWS offerings (EC2, S3, etc.), it is a scalable, variable-cost service. Here is the product listing page and other relevant info: http://aws.amazon.com/dynamodb/

Werner Vogels’ blog: http://www.allthingsdistributed.com/2012/01/amazon-dynamodb.html

Announcement Video:

http://www.youtube.com/watch?v=3I5PZv6vmZY

DynamoDB Overview Video:

We primarily use EC2, S3, CloudFront, and RDS (MySQL as a service). AWS allows us to be much more agile and reduces system complexity. It removes SLTI from many of the day-to-day demands of setting up and managing physical infrastructure. By using AWS, we are able to launch an internal or client application in minutes to hours, instead of days to weeks.
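As a concrete illustration of the "minutes instead of weeks" point, here is a minimal sketch of launching a server programmatically with boto3, the current AWS SDK for Python (which postdates this post). The AMI ID, instance type, and key pair name are hypothetical placeholders, not values from an actual SLTI deployment.

```python
# Minimal sketch: launching a single application server on EC2 with boto3.
# AMI ID, instance type, and key pair name below are placeholders.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI with the app baked in
    InstanceType="t2.micro",
    KeyName="example-keypair",
    MinCount=1,
    MaxCount=1,
)

instance = instances[0]
instance.wait_until_running()   # typically minutes, not days or weeks
instance.reload()               # refresh attributes such as the public IP
print(f"Launched {instance.id} at {instance.public_ip_address}")
```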

There are tons of great reasons to switch many use cases over to an AWS cloud-hosted infrastructure instead of a physical one.  In the future, we will work up our spin on a cost-benefit model for AWS.  There are many good ones available on the web, most notably here – http://aws.amazon.com/economics/.  Unfortunately, many of the ones available do not include any intangible benefits.  In this case, I am defining an intangible benefit as something that is “fuzzy” to put a price figure on and is subjective in nature.  Things like how it can affect focus, hiring practices, time-to-launch, required in-house skill sets, etc. would fit into this category.  These items can have a huge impact on the decision making process, especially for small-to-medium size firms or firms outside of concentrated technology areas with a large pool of qualified candidates.

We are excited about today’s announcement and look forward to using DynamoDB for a product that we have coming to market in 2012. As we begin to work with it more, we will try to document our findings and share them here on the blog.
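For readers curious what working with DynamoDB looks like, here is a minimal sketch using boto3 (the current AWS SDK for Python, which postdates this post, so treat it as a present-day illustration rather than what we ran at the time). The table name, key attribute, and throughput values are assumed placeholders.

```python
# Minimal sketch of creating and using a DynamoDB table with boto3.
# Table name, key names, and throughput numbers are placeholders.
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

table = dynamodb.create_table(
    TableName="example-sessions",
    KeySchema=[{"AttributeName": "user_id", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "user_id", "AttributeType": "S"}],
    ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
)
table.wait_until_exists()

# Writes and reads are simple key/value operations; provisioned capacity
# can be raised or lowered later without re-provisioning servers.
table.put_item(Item={"user_id": "alice", "last_login": "2012-01-18"})
item = table.get_item(Key={"user_id": "alice"})["Item"]
print(item)
```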
