Serverless and the End of Hidden Compute Costs
Learn how serverless services like AWS Lambda bring unprecedented transparency to compute costs in the AWS cloud.
AWS transformed the way the industry reasons about compute, network, and storage, driving the mental shift to utility-based, consumption-driven billing. The impact on how we architect applications has been massive, as developers use features like auto-scaling and spot instances to better align their spend with demand. This was followed by a shift toward architecting workloads by composing smaller, more focused microservices rather than building monoliths.
All of this evolution has helped businesses make smart decisions about cost-optimizing their workloads, as infrastructure spend becomes more predictable and bills more actionable. That said, as an industry, we’re still far from an ideal state in which costs can be understood at a level much more granular than a cost center or a microservice. For many workloads, compute costs are reported against infrastructure metrics like “instance hours,” and teams then go through complex cost attribution exercises to map those costs to microservices, applications, departments, or cost centers — still leaving plenty of room for inefficient application code to hide.
What if compute costs could be mapped directly to our application code? With the advent of AWS Lambda, that hypothetical can be a reality, drastically altering the granularity with which we can understand our spend. With AWS Lambda, pricing is fundamentally driven by consumption, and the question “how much did this Lambda function cost me this month” is a simple matter of arithmetic: multiply the number of invocations by the per-request price, and the compute time consumed (in GB-seconds, a function of memory allocation and execution duration) by the per-duration price.
At the end of each month, the costs associated with a serverless application can then directly expose application-level opportunities for optimization.
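As a sketch, that per-function cost arithmetic can be expressed in a few lines. The rates below are illustrative placeholders (check the current AWS Lambda pricing page), and the function name and example workload figures are assumptions for the sake of the example:

```python
# Illustrative Lambda pricing rates -- placeholders, not official figures;
# consult the AWS Lambda pricing page for current values.
PRICE_PER_REQUEST = 0.20 / 1_000_000   # USD per invocation
PRICE_PER_GB_SECOND = 0.0000166667     # USD per GB-second of duration

def monthly_lambda_cost(invocations: int, avg_duration_ms: float, memory_mb: int) -> float:
    """Estimate one function's monthly cost: request charges plus duration charges."""
    # Duration is billed in GB-seconds: seconds of execution scaled by memory in GB.
    gb_seconds = invocations * (avg_duration_ms / 1000) * (memory_mb / 1024)
    return invocations * PRICE_PER_REQUEST + gb_seconds * PRICE_PER_GB_SECOND

# A hypothetical function: 5M invocations/month, 120 ms average, 512 MB memory.
cost = monthly_lambda_cost(5_000_000, 120, 512)
print(f"Estimated monthly cost: ${cost:.2f}")
```

With numbers like these in hand per function, the month-end bill becomes a ranked list of optimization targets rather than an opaque total.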
Serverless architectures are frequently lauded for improving agility, reducing management and operational overhead, and easing scaling. But the benefits go further than that: we’re gaining unprecedented transparency into compute cost. Simon Wardley, the well-known cloud advisor, identified the big shift that serverless brings:
“The characteristic change is a shift from high to low obscurity of cost.”
What impact will this shift have on our behavior? I like Simon’s astute summation of this question as well:
“Serverless will focus the craft. Suddenly refactoring has financial value that hasn’t happened before. We will all discover that the crappy function buried in an application costs us money. The craft itself will improve.”
As an engineer, and a leader of engineering teams, I am deeply aware of the difficulty of justifying refactoring exercises. These activities are often viewed as speculative boondoggles with no clear ROI. In serverless applications, costs are transparent and granular, and the impact of refactoring a specific AWS Lambda function can be measured precisely, immediately providing financial justification for engineering teams to refactor. In addition, as Simon points out, the incentive for engineers to write efficient code from the outset increases as inefficient application code can no longer hide inside of monoliths or even within microservices. “The craft itself will improve,” indeed!
While AWS Lambda will bring this increased transparency to compute, AWS continues to introduce more “serverless” services in other areas. With Amazon Aurora Serverless, for example, AWS is bringing many of the same benefits to the database layer. Imagine if monthly spend for your application databases could be broken down to the query level. This would drive query optimization that would not only benefit users with improved performance, but would also have a direct financial benefit and ROI.
It is very exciting to be in the early days of such a fundamental shift in how we build and manage applications. Now is the time for businesses to start exploring how their applications and workloads could benefit from serverless architectures.