
Migrating from Microservices to Serverless: An IoT Platform Case Study
Mohak Chadha, Victor Pacyna, Anshul Jindal, Jianfeng Gu, Michael Gerndt
mohak.chadha@tum.de,victor.pacyna@tum.de,anshul.jindal@tum.de,jianfeng.gu@tum.de,gerndt@in.tum.de
Chair of Computer Architecture and Parallel Systems
Technische Universität München
Garching (near Munich), Germany
ABSTRACT
Microservice architecture is a common choice for developing cloud applications these days, since each individual microservice can be independently modified, replaced, and scaled. As a result, application development and the operation of cloud infrastructure were bundled together into what is now commonly called DevOps. However, with the increasing popularity of the serverless computing paradigm and its several advantages, such as no infrastructure management, a pay-per-use billing policy, and on-demand fine-grained autoscaling, there is a growing interest in utilizing FaaS and serverless CaaS technologies for refactoring microservices-based applications. Towards this, we migrate a complex IoT platform application onto OpenWhisk (OW) and Google Cloud Run (GCR). We comprehensively evaluate the performance of the different deployment strategies, i.e., Google Kubernetes Engine (GKE)-Standard, OW, and GCR, for the IoT platform using different load testing scenarios. Results from our experiments show that while GKE-Standard performs best for most scenarios, GCR always incurs lower costs.
CCS CONCEPTS
• Computer systems organization → Cloud computing.
KEYWORDS
Microservices, Serverless, Function-as-a-Service, FaaS, Container-
as-a-Service, CaaS, Performance Analysis
ACM Reference Format:
Mohak Chadha, Victor Pacyna, Anshul Jindal, Jianfeng Gu, Michael Gerndt.
2022. Migrating from Microservices to Serverless: An IoT Platform Case
Study. In Eighth International Workshop on Serverless Computing (WoSC
’22), November 7, 2022, Quebec, QC, Canada. ACM, New York, NY, USA,
7 pages. https://doi.org/10.1145/3565382.3565881
1 INTRODUCTION
Cloud computing is the cornerstone of modern-day corporate and consumer IT. For companies, it provides a multitude of advantages such as the outsourcing of costly infrastructure, scalability, elasticity, and fault tolerance. To this end, the architectures of cloud native
applications have adjusted to the ever-increasing demand for migration to the cloud. Most prominent among them is the microservices architecture, which displaced the previously prevalent monolithic architecture [7]. This architecture enables the development of applications as a suite of small independent services that communicate with each other via well-defined interfaces. A microservices-based architecture provides several advantages such as better modularization, enhanced isolation, and easier maintainability [22]. Moreover, each service can be scaled up or down on-demand, or be deployed in multiple availability zones to eliminate single points of failure. However, this has led to an increase in the complexity of deploying and provisioning services, resulting in developers having to develop their application as well as take care of its operation, i.e., DevOps [15]. To this end, an emerging computing paradigm called serverless computing, which abstracts infrastructure management away from the user, has gained popularity and widespread adoption in various application domains such as edge computing [16, 24, 27] and machine learning [4, 13].
Function-as-a-Service (FaaS) and serverless Container-as-a-Service
(CaaS) are key enablers of serverless computing that allow develop-
ers to focus on the application logic, while responsibilities such as
infrastructure management, resource provisioning, and scaling are
handled by the cloud service providers. In FaaS or serverless CaaS,
the developer implements fine-grained functions that are executed
in response to external triggers such as HTTP requests and deploys
them into a FaaS platform such as Apache OpenWhisk (OW) [29] or a serverless CaaS platform such as Google Cloud Run (GCR) [25]. On invocation, the platform creates an execution environment, i.e., a function instance, which provides a secure and isolated language-specific runtime environment for the function. These functions are generally launched on the platform's traditional IaaS virtual machine offerings [5, 17]. Serverless computing offers several advantages such as scale-to-zero for idle functions, a pay-per-use billing policy, and rapid fine-grained automatic scaling on a burst of function invocation requests. However, serverless functions are stateless, and any application state needs to be maintained through external transactions with a database or a data store.
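To illustrate this programming model, the following is a minimal sketch of an OpenWhisk Python action for an IoT-style ingest endpoint. It is not taken from the migrated IoT platform; the field names deviceId and value are hypothetical and serve only to show how the platform passes the trigger payload to main() and why state must be persisted externally.

    # Hypothetical OpenWhisk Python action (illustrative sketch only,
    # not part of the migrated IoT platform).
    # OpenWhisk invokes main() once per trigger with the request payload.
    def main(params):
        # 'deviceId' and 'value' are assumed field names for this sketch.
        device_id = params.get("deviceId", "unknown")
        value = params.get("value")
        # A real action would persist 'value' in an external database or
        # data store here, since the function instance itself keeps no
        # state between invocations.
        return {"deviceId": device_id, "accepted": value is not None}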
Both architectures, i.e., microservices and serverless, have their advantages and disadvantages, and the decision to adopt one
over the other depends on several factors. In this paper, we investigate
these two competing software architectures for an IoT platform
application. Towards this, our key contributions are:
• We migrate a microservices-based IoT platform application developed by us onto OW and GCR.
• We comprehensively evaluate the performance of the IoT platform across different deployment strategies, i.e., Google