Let's Talk Serverless

By: Timothy Brantley II

Published At: Sun Sep 08 2024

Updated At: Thu Sep 11 2025


Today I want to talk about serverless: how I think about it and the place it has in my workflow. First off, let's talk about the good things about serverless.

No commitment

Legit, you can have a few serverless functions built in Node, some in Go, some in .NET, and you don't have to worry about deployments. The best thing about serverless is that you don't have to maintain a server. You get the best parts of each language with none of the downsides and none of the commitment.
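To make that concrete, here's roughly what one of those Go functions looks like using the aws-lambda-go library. It's a minimal sketch, and the event shape is made up for illustration:

```go
package main

import (
	"context"
	"fmt"

	"github.com/aws/aws-lambda-go/lambda"
)

// GreetEvent is a made-up event shape for this sketch.
type GreetEvent struct {
	Name string `json:"name"`
}

// handler is the whole "service": no server, no OS, no fleet to patch.
func handler(ctx context.Context, event GreetEvent) (string, error) {
	return fmt.Sprintf("Hello, %s!", event.Name), nil
}

func main() {
	// lambda.Start hands control to the Lambda runtime; that's the entire deployment story.
	lambda.Start(handler)
}
```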

It's cheap

It's cheap to get started, and you pay as you go. Serverless has a generous free tier, and you can easily get a couple thousand requests for absolutely free. I've been using serverless for my Go project, which uses a serverless function to access a database, and the invocations have been free.
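A detail that helps keep those invocations cheap: open the database connection outside the handler so warm invocations reuse it instead of reconnecting every time. Here's a minimal sketch, assuming a Postgres database and a DATABASE_URL environment variable (both placeholders, not specifics of my project):

```go
package main

import (
	"context"
	"database/sql"
	"log"
	"os"

	"github.com/aws/aws-lambda-go/lambda"
	_ "github.com/lib/pq" // Postgres driver; an assumption for this sketch
)

// db is initialized once per container, then reused across warm invocations.
var db *sql.DB

func init() {
	var err error
	// DATABASE_URL is an assumed env var, not anything specific to my project.
	db, err = sql.Open("postgres", os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatalf("open db: %v", err)
	}
}

func handler(ctx context.Context) (int, error) {
	var count int
	// A throwaway query just to show the round trip.
	err := db.QueryRowContext(ctx, "SELECT count(*) FROM users").Scan(&count)
	return count, err
}

func main() {
	lambda.Start(handler)
}
```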

Setup is easy

You can build and deploy so quickly that most serverless functions cost nothing to stand up. I can't tell you how easy it was to set up the Vercel functions. There were some minor inconveniences when it came to constructing routes, but once I overcame those, it was almost too easy to deploy.
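For what it's worth, the route wrinkle boiled down to Vercel mapping endpoints to file paths: as I understand it, a file like api/hello.go becomes the /api/hello route, and the file just needs to export a standard net/http handler. A rough sketch:

```go
// api/hello.go — Vercel derives the route /api/hello from this path.
package handler

import (
	"fmt"
	"net/http"
)

// Handler must be exported; Vercel's Go runtime wires it up as the endpoint.
func Handler(w http.ResponseWriter, r *http.Request) {
	// Query params come straight off the standard request object.
	name := r.URL.Query().Get("name")
	if name == "" {
		name = "world"
	}
	fmt.Fprintf(w, "Hello, %s!", name)
}
```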

But There's a Catch ...

You pay with your soul 👻…

Serverless is cheap until it isn't. If your function runs too long, scales unexpectedly, or spirals into an infinite loop (hello, recursive API calls 👋🏽), you could be hit with a bill that makes your free-tier dreams evaporate. Don't take my word for it: Google “serverless horror stories” and witness the tales of thousand-dollar mistakes.

The problem isn’t serverless itself; it’s how easy it is to underestimate it.
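One cheap way to stop underestimating it is to put the limits in your code instead of your bill: cap how deep a chain of invocations can go and bound how long each hop may run. The X-Invocation-Depth header below is entirely made up for this sketch; it's not a platform feature:

```go
package main

import (
	"context"
	"errors"
	"net/http"
	"strconv"
	"time"
)

const maxDepth = 3 // arbitrary cap for this sketch

// callDownstream forwards a request while tracking how many hops it has taken.
// X-Invocation-Depth is a hypothetical header, not a platform feature.
func callDownstream(ctx context.Context, url string, depth int) (*http.Response, error) {
	if depth >= maxDepth {
		return nil, errors.New("invocation depth limit reached; refusing to recurse further")
	}

	// Also bound how long this hop is allowed to run.
	ctx, cancel := context.WithTimeout(ctx, 5*time.Second)
	defer cancel()

	req, err := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("X-Invocation-Depth", strconv.Itoa(depth+1))
	return http.DefaultClient.Do(req)
}
```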

Observability Can Be Difficult

When things go wrong, debugging serverless functions can feel like screaming into the void. Logs get scattered across multiple services (CloudWatch, Vercel logs, Datadog, etc.), cold starts can mask performance issues, and a simple misconfiguration can cost you hours, or worse, dollars. Without proper tracing, logging, and metrics, you’re flying blind.

And to be honest, most people don’t invest in observability until something breaks.
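The cheapest investment I know of is tagging every log line with the request ID so you can follow one invocation across services. A minimal sketch using Go's log/slog and the Lambda request ID (assuming AWS; other platforms expose an equivalent ID):

```go
package main

import (
	"context"
	"log/slog"
	"os"

	"github.com/aws/aws-lambda-go/lambda"
	"github.com/aws/aws-lambda-go/lambdacontext"
)

// logger emits JSON so CloudWatch (or whatever sink you use) can filter on fields.
var logger = slog.New(slog.NewJSONHandler(os.Stdout, nil))

func handler(ctx context.Context) error {
	requestID := "unknown"
	if lc, ok := lambdacontext.FromContext(ctx); ok {
		requestID = lc.AwsRequestID
	}

	// Every log line carries the request ID, so one invocation is one query away.
	log := logger.With("request_id", requestID)
	log.Info("invocation started")

	// ... actual work would go here ...

	log.Info("invocation finished")
	return nil
}

func main() {
	lambda.Start(handler)
}
```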

My Caution with the Serverless Framework

I love serverless, but I’ve had mixed feelings about the Serverless Framework.

While it does make things easier at first, especially when deploying to AWS, it can become a tight coupling you didn’t ask for. Plugin dependency issues, config bloat, hidden magic, and deployment inconsistencies across environments can stack up fast.

If you're not careful, you go from "this YAML config is amazing" to "why does removing one function break all the others?"

In some cases, using the Serverless Framework actually made debugging harder compared to going direct with AWS SAM or just wiring my own CDK stack. Again, it’s not bad, but you need to understand what it’s doing behind the scenes, especially if you plan to scale or go multi-cloud.
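To make "wiring my own CDK stack" concrete, here's roughly what the minimal version looks like in Go. It's a sketch rather than my actual stack, and the function name and asset path are placeholders:

```go
package main

import (
	"github.com/aws/aws-cdk-go/awscdk/v2"
	"github.com/aws/aws-cdk-go/awscdk/v2/awslambda"
	"github.com/aws/jsii-runtime-go"
)

func main() {
	app := awscdk.NewApp(nil)
	stack := awscdk.NewStack(app, jsii.String("ApiStack"), nil)

	// One explicit function definition: no plugins, no hidden magic,
	// just the CloudFormation this synthesizes to.
	awslambda.NewFunction(stack, jsii.String("HelloFn"), &awslambda.FunctionProps{
		Runtime: awslambda.Runtime_PROVIDED_AL2(),
		Handler: jsii.String("bootstrap"),
		Code:    awslambda.Code_FromAsset(jsii.String("./build"), nil), // placeholder path
	})

	app.Synth(nil)
}
```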

What’s the Verdict?

I'm pro serverless. I think it’s the future of compute for a lot of use cases. It frees you from DevOps overhead, lets you ship faster, and gives you incredible flexibility.

But don’t let that freedom lull you into a false sense of security.

Treat serverless like you would treat a chainsaw: powerful, efficient, but dangerous if used without understanding.

TL;DR: Respect the Power

  • ✅ Serverless is great for speed, cost, and experimentation.
  • ⚠️ But costs can spiral if you're not paying attention.
  • 🧠 Logging, timeouts, and safety guards are essential.
  • ❗️ Use frameworks (like the Serverless Framework) with caution; understand what they're doing for you.