6 Best Practices To Switch To Serverless
After Amazon and then Microsoft, Google has now launched its own “serverless” cloud service.
Here’s how to take advantage of this new application environment. The promise of so-called “serverless” cloud services is tempting: they let you deploy and execute code without having to worry about the underlying servers or virtual machines. Under the hood, compute resources scale automatically with traffic, and the code is orchestrated through so-called “functions” that run in response to events.
When a web page is called, for example, functions can start the services associated with it (authentication, database query, etc.). If one of the application’s components fails, it is immediately restarted by another function. In 2015, Amazon was one of the first cloud providers to position itself in this field with AWS Lambda. Microsoft followed suit in 2016 with Azure Functions. Last August, it was Google’s turn to announce Google Cloud Functions.
These function-as-a-service (FaaS) solutions also have an economic appeal. Gone is the need to allocate virtual machines: billing is pegged to the compute resources actually consumed.
Following are 6 best practices for switching to serverless, or for starting to integrate serverless computing into your IT stack.
I. Define Eligible Applications
Despite its flexibility, serverless comes with a sizeable limitation: regardless of the provider, a function’s execution time is capped. It cannot exceed 5 minutes on AWS or 9 minutes on Google. The main consequence is that not all code is eligible. Long-running workloads such as distributed computing, machine learning, or video encoding are out. Likewise, serverless goes hand in hand with stateless (the opposite of stateful): by design, this technology cannot store the state of an application session. For these types of workload, you can rely on third-party cloud services, such as a SQL or NoSQL database.
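The stateless principle can be sketched in a few lines of Python. This is a minimal, hypothetical handler in the AWS Lambda style; the dictionary `EXTERNAL_STORE` is a stand-in for an external service such as DynamoDB or Redis, since the function instance itself keeps nothing between invocations.

```python
import json
import time

# Stand-in for an external store (e.g. DynamoDB, Redis, a SQL database).
# The function itself must never rely on local state surviving an invocation.
EXTERNAL_STORE = {}

def handler(event, context=None):
    """A stateless function: all session state lives outside the function."""
    session_id = event["session_id"]
    state = EXTERNAL_STORE.get(session_id, {"visits": 0})
    state["visits"] += 1
    state["last_seen"] = time.time()
    EXTERNAL_STORE[session_id] = state  # persist back to the external store
    return {"statusCode": 200, "body": json.dumps({"visits": state["visits"]})}
```

Because the state round-trips through the store on every call, any instance of the function, on any machine, can serve the next request for the same session.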
II. Cutting The Application Into Microservices
As you can see, serverless involves splitting the application into small stateless components: microservices that map onto functions. To adopt such an architecture, a purchasing management application, for example, will have to be broken down into a series of workloads: access management, session management, recording a purchase request, checking stock, and so on. The objective is to avoid creating functions so heavy that they exceed the processing time the provider allows, not counting the latency needed to launch the function. The lighter the software libraries and the leaner the code, the better; in this respect it is advisable to prefer Python over Java. Regarding languages, moreover, the various serverless cloud solutions are not all on the same page. While AWS supports C#, Java, Node.js and Python, Google’s functions are limited to the last two.
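The decomposition described above can be sketched as two independent micro-functions for the purchasing example. The names and the in-memory `STOCK` dictionary are hypothetical placeholders; in a real setup each handler would be deployed and scaled as its own function, talking to real inventory and order services.

```python
import json

# Hypothetical stand-in for an inventory service.
STOCK = {"sku-123": 4}

def check_stock(event, context=None):
    """Micro-function: verify availability for a single SKU."""
    sku = event["sku"]
    available = STOCK.get(sku, 0) >= event.get("quantity", 1)
    return {"statusCode": 200, "body": json.dumps({"available": available})}

def record_purchase(event, context=None):
    """Micro-function: record one purchase request, nothing more."""
    order = {"sku": event["sku"], "quantity": event.get("quantity", 1)}
    # In production this would write to a queue or database rather than
    # just echoing the order back.
    return {"statusCode": 201, "body": json.dumps(order)}
```

Keeping each function this narrow keeps its execution time, its cold-start payload, and its blast radius small.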
III. Managing The Complexity Of The Architecture
Because of the granularity of their architecture, serverless applications can very quickly become complex to evolve. How do you find your way through a galaxy of micro-components? Faced with this challenge, code development and maintenance best practices are key, from software documentation to version management and test automation. With serverless, you gain flexibility at the expense of simplicity: multiplying small application blocks also increases the risk of runtime errors and latency.
IV. Test The Application
To avoid surprises at deployment time, serverless applications call for a rigorous testing phase. Function execution should be simulated as close as possible to real production conditions. Cloud providers typically offer tools to orchestrate this process, as well as IAM (Identity & Access Management) services to manage cloud security and access rights to the environment.
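One simple way to approach this is to invoke the handler locally with an event shaped like the production trigger payload. This is a minimal sketch with a hypothetical handler; real test suites would use a framework such as pytest and events recorded from the target platform.

```python
import json

def handler(event, context=None):
    """Hypothetical function under test: greets a named user."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": json.dumps({"message": f"hello {name}"})}

def test_handler_with_simulated_event():
    """Call the function locally with a production-shaped event."""
    event = {"name": "alice"}  # simulated trigger payload
    response = handler(event)
    assert response["statusCode"] == 200
    assert json.loads(response["body"])["message"] == "hello alice"

test_handler_with_simulated_event()
```

The closer the simulated events are to real traffic, the fewer surprises at deployment.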
V. Monitor In Real Conditions
In the same spirit, it is important to monitor the application once it is in production: function availability, latency, bug detection. Application monitoring lets you quickly identify runtime errors and prevent them from affecting other functions. It also aims to track consumption of the cloud service so it can be optimized. The challenge? Finding the right balance between the small size of the functions and their number, so as to pay as little as possible while limiting the overall complexity of the architecture. Amazon, Google, and Microsoft each offer tailored solutions for monitoring serverless applications deployed on their respective clouds.
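A lightweight way to capture latency and errors per invocation is to wrap each function in a decorator. This is a hypothetical sketch using Python’s standard `logging` module; in production the log lines would typically feed the provider’s monitoring service (CloudWatch, Azure Monitor, Stackdriver).

```python
import functools
import json
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("fn-metrics")

def monitored(fn):
    """Hypothetical decorator: logs latency and errors for each invocation."""
    @functools.wraps(fn)
    def wrapper(event, context=None):
        start = time.perf_counter()
        try:
            return fn(event, context)
        except Exception:
            log.exception("invocation of %s failed", fn.__name__)
            raise
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            log.info(json.dumps({"fn": fn.__name__,
                                 "latency_ms": round(elapsed_ms, 2)}))
    return wrapper

@monitored
def handler(event, context=None):
    return {"statusCode": 200}
```

Logging one structured line per call makes it easy to aggregate latency percentiles and error rates per function, which is exactly the data needed to balance function size against function count.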
VI. Choosing The Right Cloud
Not all serverless cloud service providers offer the same possibilities. Depending on your needs, it may be advisable to favor one over another. As noted above, they do not support the same languages, and they also differ in features (read the article: Three “serverless” cloud offers put to the test). While Amazon stands out for its maturity and Microsoft for its wealth of execution environments, Google lets you deploy Docker containers via its Cloud Functions.