

Making sense out of Serverless

Written by Aaren Quiambao  |  March 5, 2021

Adnan Rahic of Sematext.com gives us a quick low-down on Serverless computing and how leveraging it correctly can help organisations get the best out of it.

Editor’s note: This interview with Adnan Rahic was recorded for Coding Over Cocktails - a podcast by Lonti, previously known as Toro Cloud.

In 2019, O’Reilly conducted their inaugural survey on Serverless adoption among IT professionals from different organisations and regions, drawing 1,500 respondents who described how Serverless has affected their respective industries.

Forty percent of respondents say they have implemented Serverless in some form or another, adding that it has reduced their operational costs and given them the benefit of automatic scalability. Serverless also frees DevOps teams from the burden of managing and maintaining backend systems, allowing them to focus on code.

As Serverless brings simplicity and cost-efficiency to the table, it’s not surprising that its adoption is growing. In fact, the market size for Serverless is estimated to grow to $7.72 billion by this year.

This is despite a number of organisations still exercising caution, according to O’Reilly, as new technologies carry concerns such as security and "fear of the unknown."

Nevertheless, Sematext.com Developer Evangelist and Serverless expert Adnan Rahic tells us that it doesn’t have to be difficult. "If you know how to use [Serverless] correctly, or if you have a very good sense of how to get the best out of it, then it makes sense," he says.

In an interview with Coding Over Cocktails, Rahic shares his experiences as a developer and how he maintains Serverless infrastructures.

"The point of Serverless is to make it simple, to make it easy for people that don't need to manage infrastructure."

Best case scenario for Serverless

Rahic describes Serverless as "anything that doesn't require a server." A popular subset of Serverless is FaaS (Functions-as-a-Service), where services like AWS Lambda and Azure Functions allow users to deploy code, run it through an event trigger, and get a return value.
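To make that model concrete, the sketch below shows what a bare-bones function handler might look like in Python. The handler name, the event shape (an API-style trigger with query parameters) and the greeting logic are illustrative assumptions, not details from the interview.

```python
import json

def handler(event, context):
    # The platform invokes this function with an event payload from the trigger
    # (here, an assumed API-style request carrying query string parameters).
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    # Whatever the function returns is handed back through the trigger.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```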

To get started, Rahic advises figuring out the best use-cases for Serverless. He shares his experience building an email service triggered via a Lambda function and other AWS services. "When somebody types in a form, I get emailed that response for that question. And then I could just email back that person through any email client. But that's not running anywhere. It's not running on a server."
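A rough sketch of that form-to-email pattern might look like the following, assuming an API-style trigger and Amazon SES for delivery. The form field names, addresses and environment variables are placeholders rather than details from Rahic's actual setup.

```python
import json
import os
import boto3

ses = boto3.client("ses")

def handler(event, context):
    # Parse the submitted form out of the assumed API-style event body.
    form = json.loads(event.get("body") or "{}")
    # Forward the question to a personal inbox via SES.
    ses.send_email(
        Source=os.environ["FROM_ADDRESS"],
        Destination={"ToAddresses": [os.environ["TO_ADDRESS"]]},
        Message={
            "Subject": {"Data": f"New question from {form.get('email', 'unknown')}"},
            "Body": {"Text": {"Data": form.get("question", "")}},
        },
    )
    return {"statusCode": 200, "body": json.dumps({"sent": True})}
```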

One good use-case for Serverless, Rahic suggests, is taking advantage of its scalability to handle large numbers of transactions. Because it scales easily, it can manage connections and functions with lower latency than a standard API running on a server.

"If you have function on AWS, if you get 1000 concurrent connections in the same millisecond to that one API, it's gonna scale horizontally to 1000 functions right away."

On the other hand, Rahic advises against using Serverless for anything that requires a persistent database connection, especially for relational databases such as Postgres and SQL. "Just don't. Just skip the FaaS altogether. You don't wanna go into that one at all."

Although there is a workaround to this through the use of a proxy API hooked into the database, Rahic warns that this adds another layer of complexity and is not always considered best practice.

"If you think about it, that one API needs a connection to the database and if you're scaling out, then you have thousands of functions. They have thousands of connections to the database, and that's just that's just like an accident waiting to happen," he explains.

One of the most common challenges associated with Serverless is the latency of "cold starts," which occur the first time a function is invoked, before a warm instance of its code exists.

"Let's say you have an event that's an API and that event will trigger your code that's in the function. The instance of this function doesn't exist anywhere, so you have to call it the initial time."

Rahic explains that there’s no tactical way of bypassing cold starts. Although periodically triggering the function can keep it "warm," it’s not really going to be of much help. "If you have 500 concurrent connections right away and you're keeping one function warm, it's not doing much, right? You're still gonna get 499 cold starts."
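The "keep warm" trick usually looks something like the sketch below: a scheduled event pings the function every few minutes so one instance stays initialised, and the handler short-circuits on those pings. The "warmer" payload is a made-up convention for this example; as Rahic notes, it only ever keeps that single instance warm.

```python
def handler(event, context):
    # A scheduled rule (e.g. one firing every five minutes) sends this
    # made-up payload so one instance stays initialised between real requests.
    if isinstance(event, dict) and event.get("warmer"):
        return {"warmed": True}

    # ...normal request handling would go here...
    return {"statusCode": 200, "body": "handled"}
```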

As for functions that have a longer runtime, Serverless may not be the best solution. "If you really need to run something for more than 15 minutes, probably using a server is gonna be cheaper and more efficient," Rahic explains.

"Not a silver bullet"

Rahic warns that the barrier to entry for Serverless can be pretty high, especially if a developer hasn’t done anything like it before.

Talking about his experience as a developer, Rahic says Serverless could have you running multiple environments for testing in order to get a good overview of how production will go. "If you're not doing test-driven development, if you're not running unit tests for the code, it's gonna be a pain."
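Because a function is ultimately just code that takes an event and returns a value, it can at least be unit-tested locally with hand-built events before it ever reaches a cloud environment. The sketch below tests the illustrative greeting handler from earlier, assumed here to live in a hypothetical app.py.

```python
import unittest

from app import handler  # hypothetical module holding the greeting handler sketched earlier

class HandlerTest(unittest.TestCase):
    def test_returns_greeting_for_named_caller(self):
        event = {"queryStringParameters": {"name": "Adnan"}}
        response = handler(event, None)
        self.assertEqual(response["statusCode"], 200)
        self.assertIn("Adnan", response["body"])

if __name__ == "__main__":
    unittest.main()
```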

While Rahic considers Serverless a "whole new concept of development" that requires thinking outside the typical development box, he says proceeding with caution is always key. He makes it clear: Serverless is not a silver bullet.

"You have to figure out the best use-case and then based on that, use it for what it's kind of intended to be used as if that makes any sense. But it’s not a silver bullet."

Listen to more interesting conversations about Serverless with Adnan Rahic on this episode of Coding Over Cocktails - a podcast by Toro Cloud.

Coding Over Cocktails is a podcast created by Lonti, previously known as Toro Cloud, a company that offers a low-code, API-centric platform for application development and integration.

This podcast series tackles issues faced by enterprises as they manage the process of digital transformation, application integration, low-code application development, data management, and business process automation. It’s available for streaming on most major podcast platforms, including Spotify, Apple, Google Podcasts, SoundCloud, and Stitcher.

