The Object Oriented Guide to Microservices and Serverless Architecture
From Microservices to Serverless Functions
Like each generation of the microprocessor, each
new generation of server-side architecture seems to
challenge physics itself.
From compiled monoliths of yore to three-tiered server constructs, from
service-oriented architectures to microservices, we keep shrinking the
scope of a single server’s responsibility. Our goals at each iteration
are the same: simplify the task of writing maintainable, high-quality,
high-performance software while growing the complexity of the overall
systems in which our applications reside.
Infrastructure as Code: An Object Oriented Approach
As software infrastructure has grown in complexity,
it’s started to make sense to apply the thinking used
inside of individual software projects to larger distributed
software systems.
With service-oriented architectures, however, the problem space shifts to
delineating boundaries between services and finding efficient, scalable
ways to compose them in order to achieve larger business goals.
[Figure: Infrastructure as code workflow. Infrastructure definitions are written and kept in version control; automation pushes or pulls them through an API or server to manage infrastructure in the cloud or on premises.]
OOD Principles
Let’s look at a few principles of object-oriented design (OOD) and how they
can be applied as easily to infrastructure as to code:
[Table: OOP principles and their infrastructure equivalents.]
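To make the mapping concrete, here’s a minimal sketch in Python (the classes and resource names below are purely illustrative and don’t correspond to any real infrastructure-as-code tool) of how encapsulation, single responsibility, and composition describe infrastructure just as naturally as they describe objects:

```python
# Hypothetical sketch: modelling infrastructure with object-oriented principles.
# None of these classes belong to a real IaC tool; they only illustrate how
# encapsulation, single responsibility, and composition map onto services.
from abc import ABC, abstractmethod


class Component(ABC):
    """Narrow interface: every piece of infrastructure can be deployed and checked."""

    @abstractmethod
    def deploy(self) -> None: ...

    @abstractmethod
    def is_healthy(self) -> bool: ...


class Database(Component):
    """Encapsulation: connection details stay inside the component that owns them."""

    def __init__(self, uri: str) -> None:
        self._uri = uri  # hidden state, exposed only through the interface above

    def deploy(self) -> None:
        print(f"provisioning database at {self._uri}")

    def is_healthy(self) -> bool:
        return True  # stand-in for a real health probe


class WebTier(Component):
    """Single responsibility: this component only knows how to serve HTTP."""

    def deploy(self) -> None:
        print("rolling out web tier")

    def is_healthy(self) -> bool:
        return True


class Stack:
    """Composition: a whole environment is just smaller components wired together."""

    def __init__(self, components: list[Component]) -> None:
        self._components = components

    def deploy(self) -> None:
        for component in self._components:
            component.deploy()


if __name__ == "__main__":
    Stack([Database("mongodb://example.internal"), WebTier()]).deploy()
```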
The Ever-Shrinking Service
It’s no coincidence that microservice architectures and
container-based infrastructures have evolved hand in
hand since their beginnings in 2012.
They’re symbiotic in a sense: microservices have the smallest application
footprint, and containers provide a stripped-down, reproducible, and atomic
deployment. Together they form a single deployable unit of the application.
At their core, the ideas behind microservices promise to make life easier for us,
providing simple solutions to sophisticated application deployment challenges.
Let’s look at how Single-Responsibility and Interface Segregation
impact the design of microservices:
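As a hedged sketch (the Flask apps, endpoints, and data below are invented for illustration), a single-responsibility service owns exactly one business capability, and interface segregation keeps each service’s API limited to the endpoints its consumers actually need:

```python
# Hypothetical sketch of two single-responsibility microservices using Flask.
# Each service owns one capability and exposes only the endpoints its
# consumers need (interface segregation); neither reaches into the other's data.
from flask import Flask, jsonify

# --- orders service: knows about orders and nothing else -------------------
orders_service = Flask("orders")

@orders_service.route("/orders/<order_id>", methods=["GET"])
def get_order(order_id):
    # In a real system this would read from the orders database only.
    return jsonify({"order_id": order_id, "status": "shipped"})

# --- inventory service: a separate deployable with its own narrow API ------
inventory_service = Flask("inventory")

@inventory_service.route("/stock/<sku>", methods=["GET"])
def get_stock(sku):
    # Consumers that only care about stock levels never see order endpoints.
    return jsonify({"sku": sku, "available": 42})

if __name__ == "__main__":
    # In practice each service runs in its own process or container.
    orders_service.run(port=5000)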
We now find ourselves battling new complexities:
fleets of containers, sprawling codebases, and
divergent platforms and frameworks.
For the most part, we understand that we don’t install
things locally anymore.
However, we’re still building things from the Internet: this includes downloading
large filesystems for our Docker containers, along with the same libraries inside
Docker that we used to download to our PCs. Docker Compose is used for
local development, Terraform modules are used for deploying to the cloud,
Kubernetes runs our Docker containers when we’re not local, and CircleCI Orbs are
triggered from GitHub to deploy it all... wasn’t this supposed to be easier?
At their core, containers provide their users with clean environments in which
they can install anything they want — regardless of operating system,
libraries, or programming environments. The additional flexibility gained
from this model is profound: creating, modifying, and deploying any single
service is delightfully simple. From the point of view of data centers,
everyone appears to have agreed that we’ll communicate via HTTP from
now on and rely on years of traffic shaping wisdom to manage capacity and
elastic scaling. It’s a model that undeniably works, but it has its drawbacks.
In practice, this tends to mean that every team must be able to provide
meaningful answers to a set of complex questions. It seems unreasonable,
for instance, that a machine learning (ML) algorithm specialist should have
to understand why it’s worth putting Nginx’s buffering in front of a Gunicorn
Python service.
It’s reasonable for someone who’s never built a multimedia API before to have
no idea that there’s an important difference between throttling threads and
throttling coroutines. They’ll have no choice but to learn these lessons the hard
way as their coroutines compound, downloading large files and snowballing into
a mess that overflows RAM and grinds to a halt.
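For anyone hitting exactly that wall, here’s a minimal sketch of the lesson (the URLs, limits, and choice of aiohttp are placeholder assumptions): a semaphore caps how many coroutines download at once, and responses are streamed in chunks rather than buffered whole in memory.

```python
# Minimal sketch: throttle coroutines with a semaphore and stream downloads,
# so concurrent fetches can't pile up and exhaust RAM. URLs and limits are
# placeholders; aiohttp is one of several clients that support streaming.
import asyncio
import aiohttp

MAX_CONCURRENT_DOWNLOADS = 5  # hypothetical cap; tune for your service
CHUNK_SIZE = 64 * 1024        # read 64 KiB at a time instead of whole bodies

async def download(session, semaphore, url, dest_path):
    async with semaphore:  # only N coroutines pass this point at once
        async with session.get(url) as response:
            with open(dest_path, "wb") as out:
                async for chunk in response.content.iter_chunked(CHUNK_SIZE):
                    out.write(chunk)  # constant memory, regardless of file size

async def main(urls):
    semaphore = asyncio.Semaphore(MAX_CONCURRENT_DOWNLOADS)
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(
            *(download(session, semaphore, url, f"/tmp/file-{i}")
              for i, url in enumerate(urls))
        )

if __name__ == "__main__":
    asyncio.run(main(["https://example.com/large-file"] * 20))
```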
If every team is truly a DevOps team, we need easier options with less to think
about. This allows us to focus on the parts of computing that drive each of us.
We know we could learn it all if we put in the time — but we also know we won’t.
And, ultimately, we shouldn’t have to!
Functions: Atomic Units of Service Architecture
Building infrastructure around functions (assumed
to be request handlers for HTTP or similar) is
novel, and frankly unintuitive at first glance.
In reality, it follows the familiar pattern of the Unix Philosophy,
which has been around for 40+ years. While it feels like the
logical conclusion of all these years of simplifying infrastructure,
it removes so much from the list of things we’re used to thinking
about that it’s hard to take it seriously. One function? Is it even
worth having a server for that at all? And how many containers
are we talking about here?
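To make the unit concrete, here’s a minimal sketch of such a function, written in the AWS Lambda handler style (the event shape and field names are assumptions; other platforms use slightly different signatures). The entire “service” is one function that receives an HTTP-shaped event and returns a response.

```python
# Minimal sketch of a serverless HTTP handler in the AWS Lambda style.
# The event fields below assume an API Gateway-style proxy payload; other
# platforms (Cloud Functions, Azure Functions) use different signatures.
import json

def handler(event, context):
    # Pull a path parameter out of the incoming HTTP event, if present.
    name = (event.get("pathParameters") or {}).get("name", "world")

    # No server, router, or framework to configure: this function *is* the service.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"greeting": f"hello, {name}"}),
    }
```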
Databases in the Serverless World
Yes, microservices make your organization
nimble, thanks to loosely coupled,
independently deployable applications.
But their siloed nature can make using self-managed
databases cumbersome at best and impossible at worst.
Each function can’t have its own database, and as your
architecture becomes more complex, your database must
evolve along with it.
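One common pattern, sketched below with PyMongo (the connection string variable and collection names are placeholders), is to create the database client once, outside the handler, so warm invocations of the same function instance reuse the connection pool instead of opening a new connection on every request:

```python
# Sketch: reuse a MongoDB connection across serverless invocations.
# The client is created at import time (outside the handler), so a warm
# function instance keeps its connection pool between requests.
# MONGODB_URI and the database/collection names are placeholders.
import json
import os

from pymongo import MongoClient

client = MongoClient(os.environ["MONGODB_URI"])  # created once per instance
orders = client["appdb"]["orders"]

def handler(event, context):
    order_id = (event.get("pathParameters") or {}).get("order_id")
    order = orders.find_one({"_id": order_id}, {"_id": 0})
    return {
        "statusCode": 200 if order else 404,
        "body": json.dumps(order or {"error": "not found"}),
    }
```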
Reduce Complexity, Increase Productivity
Looking at serverless architecture as the natural evolution of
microservices, it becomes obvious why this shift is compelling:
We get to delete thousands of lines of boilerplate code, reduce the cognitive
load on developers throughout the system, and drastically simplify the unit of
software we’re shipping.
The good news is that you don’t have to jump into the deep end right away.
It’s easy to take a single service endpoint and try running it on a serverless
platform. There are a few “gotchas”, mainly around understanding how things like
database connections and state work.
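The state gotcha, for instance, comes down to remembering that a function instance can be created or discarded at any moment, so anything kept in process memory is neither durable nor shared. A hedged sketch of the fix (the names below are placeholders) is to keep state in the database rather than in a module-level variable:

```python
# Sketch of the "state" gotcha: module-level variables live only as long as
# one warm function instance, so counters kept in memory silently reset or
# diverge across instances. Durable state belongs in an external store.
import os

from pymongo import MongoClient, ReturnDocument

client = MongoClient(os.environ["MONGODB_URI"])  # placeholder URI variable
counters = client["appdb"]["counters"]

request_count = 0  # looks convenient, but resets whenever this instance is recycled

def handler(event, context):
    global request_count
    request_count += 1  # per-instance only: other instances never see this value

    # Better approach: an atomic increment in the database, shared by every instance.
    doc = counters.find_one_and_update(
        {"_id": "requests"},
        {"$inc": {"total": 1}},
        upsert=True,
        return_document=ReturnDocument.AFTER,
    )
    return {"statusCode": 200, "body": str(doc["total"])}
```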
That said, the learning curve is reasonably short, and the payoff is tremendous.
Serverless systems respond much more quickly to changes in demand, they’re
generally much more efficient when it comes to resource usage, and they
take away heaps of complexity and pain from the software development and
deployment process.
It’s definitely worth a try.