It’s confirmed – serverless is here to stay. Already, 40% of IT professionals use serverless architecture, while 50% of AWS users have adopted serverless functions. We know that serverless computing enables us to build apps without managing tech infrastructure (and that there’s a host of benefits from that alone), but the lesser-talked-about advantage is serverless’ sustainable impact.
Serverless models reduce on-demand compute time. They therefore help cut emissions because companies consume only the resources they actually need, so less energy is wasted on idle or surplus processes.
Being more efficient with serverless not only leads to cost savings, it's also paving the way for better environmental practices in tech operations. Here's my take on how serverless is saving the world, and what needs to happen for serverless to scale its sustainable potential.
Did you know that moving on-premise workloads to serverless platforms like AWS Lambda can lower the workload's carbon footprint by a staggering 88%? That impressive figure comes from the flexibility serverless offers in resource allocation and scalability. As companies grow or slow, they can scale resources up and down with serverless. And when serverless functions are no longer in use, resources can be reallocated elsewhere.
While traditional infrastructure doesn't always match reserved server capacity to the workload it handles, serverless avoids wasted idle Central Processing Units (CPUs). As a result, companies can report their energy consumption in a more granular way, looking at which functions take up the most execution time. Traditional servers also require more complex management and maintenance, which can be more resource-intensive, especially for businesses with larger server fleets.
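To illustrate what that granular reporting might look like, here is a minimal sketch that rolls per-invocation execution metrics up into an energy estimate per function. The power coefficient, function names, and invocation figures are all illustrative assumptions, not measured values.

```python
# Hedged sketch: estimating per-function energy use from serverless
# execution metrics. WATTS_PER_GB and the invocation log below are
# illustrative assumptions, not published figures.

WATTS_PER_GB = 2.0  # assumed average power draw per GB of allocated memory

def estimate_energy_wh(duration_ms: float, memory_gb: float) -> float:
    """Rough energy estimate (watt-hours) for one invocation."""
    hours = duration_ms / 3_600_000  # milliseconds in an hour
    return memory_gb * WATTS_PER_GB * hours

# Example invocation log: (function name, duration in ms, allocated memory in GB)
invocations = [
    ("resize-image", 850, 1.0),
    ("resize-image", 910, 1.0),
    ("send-email", 120, 0.25),
]

# Aggregate energy per function, largest consumer first
totals: dict[str, float] = {}
for name, duration_ms, memory_gb in invocations:
    totals[name] = totals.get(name, 0.0) + estimate_energy_wh(duration_ms, memory_gb)

for name, wh in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {wh:.6f} Wh")
```

Because serverless platforms already bill by duration and memory, the same metrics that drive cost can drive this kind of energy accounting.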
Serverless principles, however, not only accommodate changing workloads and demands better, they also encourage developers to write more efficient code. That efficiency ultimately makes programs more usable on older computers, so hardware waste is reduced too. Leaner code also means less data is exchanged at runtime, so even less energy is used, on top of the savings serverless already makes over traditional servers.
Then there’s the fact that, despite surges in data usage in the past decade, data center energy consumption has increased by a mere 6%. How is this possible? With big players like Amazon, Google, and Microsoft leading the serverless charge, they have the budgets and the agility to implement major improvements in the energy efficiency of servers, storage, and data center facilities. Moving forward, then, it’s not unfounded to suggest that they will make serverless approaches even more accessible and sustainable.
There’s a reason why corporations like Netflix, Coca-Cola, Zalora, and Nordstrom all use serverless – it significantly contributes to their eco-friendly efforts and their bottom lines. In fact, going serverless can cut costs by as much as 90% on a small, non-mission-critical application for some businesses. Given the scope of large corporations’ activities, serverless could bring even greater savings and, more importantly, greener practices.
With these players championing serverless, smaller organizations are more likely to come into the fold. Companies of any size with Node.js, Python, .NET/C#, or Go apps can quickly move onto any cloud’s serverless infrastructure with the support of frameworks. Plus, smaller companies have the flexibility to apply more innovative solutions to a serverless strategy, and potentially optimize their energy and cost savings further.
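To give a sense of how little code that migration can involve, here is a minimal handler in the style AWS Lambda expects for Python functions. The event shape (a "name" key) is an assumption for illustration; frameworks such as the Serverless Framework or AWS SAM wire handlers like this to cloud infrastructure from a short config file.

```python
# Hedged sketch: a minimal AWS Lambda-style Python handler.
# The "name" key in the event is an illustrative assumption.

import json

def handler(event, context):
    """Entry point the serverless runtime invokes per request."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Because the platform handles provisioning, scaling, and teardown, the business logic above is essentially all a team has to maintain.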
I believe that serverless makes sustainable sense; but to fully reap its environmental advantages, organizations have to establish clear goals. They need to define KPIs around lowering emissions and reducing their energy consumption. They then need to consider how serverless can support those goals, and how it fits into the bigger picture of sustainable operations.
To start, companies need to know what their carbon footprint is, and determine which changes in architecture and IT systems will have the most profound effect. I recommend using tools like UC Berkeley’s CoolClimate Calculator, Carbon Fund, the 2030 Calculator, and the GHG Protocol to get this initial data.
Over in the serverless provider space, companies should be taking action to ensure they constantly strive to be environmentally friendly. That could mean co-locating serverless microservices to reduce congestion and latency, running resource-intensive microservices in regions with low carbon intensity, and optimizing how data is stored to cut idle time even further. These are big steps, but they are necessary to maximize energy savings and keep advances in technical infrastructure in step with sustainability norms.
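The region-selection idea above can be sketched in a few lines: given an estimate of grid carbon intensity per region, deploy the heavy workloads to the cleanest one. The region names below are real cloud regions, but the gCO2/kWh figures are illustrative placeholders, not measured values.

```python
# Hedged sketch: picking a deployment region by grid carbon intensity.
# The intensity figures are illustrative assumptions, not real data.

CARBON_INTENSITY = {  # assumed gCO2-equivalent per kWh
    "eu-north-1": 30,   # e.g. a hydro-heavy grid
    "eu-west-1": 300,
    "us-east-1": 400,
}

def lowest_carbon_region(regions: dict[str, int]) -> str:
    """Return the candidate region with the lowest grid carbon intensity."""
    return min(regions, key=regions.get)

print(lowest_carbon_region(CARBON_INTENSITY))
```

In practice a provider or customer would feed this from live grid data rather than a static table, and weigh latency and data-residency constraints alongside carbon intensity.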
Serverless has begun transforming cloud computing for the better. We now have a responsibility to not only maintain its momentum, but to evolve it so that energy savings are inherent and permanent.