There’s a right and wrong time to use any technology. There are a few common traits across the three major providers (AWS Lambda, Azure Functions, and Google Cloud Functions) that make it easy to reason about when a serverless function is the right way to go. That said, this article will have more details on AWS Lambda and its C#/.NET Core offering because it’s what I know best.
The languages supported on serverless platforms fall into two major categories: statically typed and dynamically typed. C#, being a statically typed language, suffers the same fate as Java: slower startup times and more memory needed to accomplish the same thing than with Node.js or Python.
There are four criteria that should be reviewed before choosing serverless. A project doesn’t need to match all the criteria to be a good fit for Lambda. But one that only fulfills a couple of the criteria might be better served by another approach.
Low To Medium Memory Usage
Memory is a precious and expensive resource when it comes to Lambda. AWS doesn’t make it easy to reason about its exact cost with “Gigabyte-seconds” billing, but it is fairly clear that any function with a high memory footprint is going to be pricey under heavy load. For a fixed execution duration, the cost of a single execution roughly doubles every time you double the memory allocation. (Sidenote: Andy Warzon over at Trek10 has written an in-depth analysis on when you should consider moving your Lambda functions to a dedicated EC2 instance).
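To make the “Gigabyte-seconds” arithmetic concrete, here is a small Python sketch of the billing model. The per-GB-second rate is an illustrative assumption, not a quote from AWS pricing, and the duration rounding AWS applies is ignored:

```python
# Illustrative Lambda compute cost per invocation, billed in gigabyte-seconds.
# PRICE_PER_GB_SECOND is an assumed example rate, not current AWS pricing.
PRICE_PER_GB_SECOND = 0.0000166667

def invocation_cost(memory_mb: int, duration_ms: int) -> float:
    """Cost of one invocation: (GB allocated) * (seconds run) * rate."""
    gb_seconds = (memory_mb / 1024) * (duration_ms / 1000)
    return gb_seconds * PRICE_PER_GB_SECOND

# Holding duration constant, doubling the memory allocation doubles the cost.
cost_512 = invocation_cost(512, 200)
cost_1024 = invocation_cost(1024, 200)
```

Each execution costs a tiny fraction of a cent, but at millions of invocations per day a memory-hungry function multiplies that fraction quickly.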
Some common types of low memory usage functions are webhooks, event handlers, background order processing, and scheduled report generation. CRUD APIs are also quick to set up using Lambda, so long as you’re not too worried about the next criterion on the list.
Response Times Aren’t Critical
A Lambda function that is already in memory can handle another invocation incredibly quickly. But a function that isn’t already in memory has to do a cold start, which incurs a significant amount of overhead. A cold start is like starting your car on a cold winter morning: the battery has to work much harder to turn the engine over. But if the engine is already warm, it gets going almost instantly.
C# functions are among the costliest in terms of startup time, making them better suited to background asynchronous work than to APIs.
If you insist on using Lambda for your C# API and need lightning fast response times, there are some measures you can take:
- Increase the amount of memory your Lambda can consume; this drastically reduces cold start time. Going back to the car battery analogy, it’s like throwing a 600A battery at a car that needs to start at -30C.
- Create a second function that pings the first one every few minutes so that it never incurs a cold start.
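The second measure can be illustrated with a minimal handler that recognizes a warm-up ping and returns immediately, doing no real work. The sketch below is in Python for brevity (the same pattern applies to a C# handler), and the `warmup` marker field is an assumed convention between the two functions, not an AWS feature:

```python
import json

WARMUP_KEY = "warmup"  # assumed marker field set by the scheduled ping event

def handler(event, context=None):
    """Lambda entry point: short-circuit scheduled warm-up pings cheaply."""
    if isinstance(event, dict) and event.get(WARMUP_KEY):
        # The ping only needs to keep the container resident; do no real work.
        return {"statusCode": 200, "body": "warm"}
    # ... normal request handling would go here ...
    return {"statusCode": 200, "body": json.dumps({"handled": True})}
```

The pinging side is typically just a scheduled trigger (e.g. a CloudWatch Events rule firing every few minutes) that invokes the function with `{"warmup": true}` as its payload.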
The bottom line is that if response times are critical, a language with a low cold-start time (such as Node.js or Python) should be preferred over C#.
Low Or Unpredictable Traffic
Before serverless came around, you’d need some kind of always-on, hourly-billed infrastructure to host your service, regardless of how much traffic you actually had. That’s not cost-effective for an API that only gets a few hits an hour. Functions are great for these low-traffic situations because you only pay for what you use, which in the case of an API that is only called a few times per hour is next to nothing (and could in fact fall within the free tier).
An alternative to Lambda for a low-traffic API is to host it on a small machine like a t1.micro. The problem comes when the service sees an unexpected peak in traffic, resulting in a degradation of service because the instance can’t keep up and can’t easily be scaled. Lambda protects you from this by absorbing incredible amounts of load without skipping a beat.
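The trade-off is easy to quantify. The sketch below compares a month of an always-on instance against Lambda’s pay-per-use billing; all rates are illustrative assumptions rather than current AWS prices, and the Lambda free tier is ignored:

```python
HOURS_PER_MONTH = 730
INSTANCE_HOURLY = 0.0116             # assumed rate for a small always-on instance
LAMBDA_PER_REQUEST = 0.0000002       # assumed per-request charge
LAMBDA_PER_GB_SECOND = 0.0000166667  # assumed compute rate

def instance_monthly_cost() -> float:
    """An always-on instance bills for every hour, busy or idle."""
    return INSTANCE_HOURLY * HOURS_PER_MONTH

def lambda_monthly_cost(requests: int, memory_mb: int = 128,
                        duration_ms: int = 100) -> float:
    """Lambda bills per request plus gigabyte-seconds actually consumed."""
    gb_seconds = requests * (memory_mb / 1024) * (duration_ms / 1000)
    return requests * LAMBDA_PER_REQUEST + gb_seconds * LAMBDA_PER_GB_SECOND

# A few hits per hour (~3,000 requests/month) costs a fraction of a cent on
# Lambda, while the instance bills for every idle hour of the month.
```

Under these assumed rates the always-on instance costs a few dollars a month even at zero traffic, while the low-traffic Lambda bill rounds to nothing.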
Heavily Invested In Your Cloud Provider
It’s possible to port functions to another cloud provider without resorting to a complete rewrite, assuming that the language is supported on both clouds. However, C# is only supported on AWS and Azure, so if cross-cloud compatibility is important to you, go with Node.js: it’s the only language supported across AWS, Azure, and Google.
Functions also tend to leverage other cloud services such as databases, queues, and file storage, as well as other functions. Every cloud provider offers APIs to easily access and use these resources, so moving more infrastructure to the cloud pays off in terms of ease of integration. But beware: it locks you into your cloud provider for the long haul.