Setup
One or more web applications must be deployed on AWS or Azure, and the objective is to use the serverless offerings of both Amazon and Microsoft. The web application provides API services and performs server-side rendering, much like a Spring MVC or ASP.NET MVC application. This document demonstrates how to leverage the AWS and Azure serverless platforms to host a web application following vendor-recommended practices.
Concepts
The solutions presented here are based on the use of a combination of managed and serverless cloud services to run fully serverless workloads.
Managed Cloud Services
Managed cloud services provide architects, operators and developers with all the necessary tools to achieve their business goals. For example, load balancers, API managers, message queues, publish/subscribe and notification services, firewalls, authorization/authentication services and many more are managed cloud services that are used when building a cloud solution.
Cloud services can be classified according to their mission, such as storage, networking and computation.
We consider a service "managed" when the supplier takes full responsibility for both the operation and for delegating the responsibility of configuration to the consumer. At Azure and AWS there are more than a hundred managed services ranging from unlimited object storage to facial recognition.
Serverless Services
Obviously, it isn’t possible to process data without a computing infrastructure: applications cannot run without having at least one server to do the job.
The concept of "serverless" simply means that the application execution environment is extremely mutualized. This requires relying on the supplier to allocate computing power on demand. We can then rely on an infrastructure with almost unlimited computing power and focus only on the business objective we want to achieve.
The Serverless Advantage
As with managed services, we delegate the responsibilities for managing infrastructure, operating systems and application frameworks to the supplier while retaining responsibility for the application code and configuration.
The undeniable advantage lies in the extreme elasticity of the serverless model. Since the acquisition of computing resources is extremely elastic, only the computing time actually used is billed. For example, AWS Lambda charges for its serverless function service in increments of 100 milliseconds.
Unlike conventional cloud solutions on VM (IaaS), we only pay for the computation time consumed and the management of elasticity remains the responsibility of the supplier, reducing administrative tasks at the same time.
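To make the billing model concrete, here is a minimal Python sketch of how a 100-millisecond billing increment translates into cost. The per-GB-second rate used here is purely illustrative, not a quoted price, and should be checked against current AWS Lambda pricing.

```python
import math

def lambda_cost(duration_ms: float, memory_mb: int, rate_per_gb_second: float) -> float:
    """Estimate the compute cost of a single Lambda invocation.

    Billing is rounded up to the next 100 ms increment, as described above;
    the rate passed in is an assumption, not a quoted price.
    """
    billed_ms = math.ceil(duration_ms / 100) * 100        # round up to 100 ms
    gb_seconds = (memory_mb / 1024) * (billed_ms / 1000)  # GB-seconds consumed
    return gb_seconds * rate_per_gb_second

# A 130 ms invocation with 512 MB of memory is billed as 200 ms.
print(lambda_cost(duration_ms=130, memory_mb=512, rate_per_gb_second=0.0000166667))
```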
There are many use cases for serverless solutions. Examples include batch data processing, hosting of microservices, operations automation, and running containerized applications.
Solutions
The goal is to deploy web applications by leveraging cloud services and serverless solutions from AWS and Azure. The solutions, one for each supplier, must be secure, elastic and based on serverless infrastructure.
AWS Solution
In order to obtain a solution that meets the initial need to deploy containerized web applications that are accessible, secure and, above all, serverless, the solution relies on the following architecture:
High Availability
High availability is made possible using multiple availability zones which guarantee physical distribution within a region. If one zone fails, another takes over. Availability is also ensured through the use of managed cloud services with a guaranteed service level of 99.99% for most services and 100% for Amazon Route 53.
Security
Network security is provided through the use of security groups (instance firewalls), network access control lists (network firewalls) and routing tables. Tight control over network access ensures that additional sensitive items that could be added to the solution, such as an Amazon Aurora database, are not accessible from the public Internet.
Elasticity
Elasticity is achieved by using a combination of managed services (Amazon Elastic Kubernetes Service, Amazon Elastic Container Registry, NAT Gateway, and Application Load Balancer) and the Amazon Fargate serverless container execution engine. Elasticity takes the form of changing the number of container instances run by Amazon Fargate, as sketched below. All the other managed services guarantee their own availability and elasticity.
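As a minimal sketch using the official Kubernetes Python client, changing the replica count of a deployment is what drives the number of Fargate-run containers; the deployment name and namespace below are placeholders for illustration.

```python
from kubernetes import client, config

# Load credentials from the local kubeconfig
# (e.g. produced by `aws eks update-kubeconfig`).
config.load_kube_config()

apps = client.AppsV1Api()

# Scale the hypothetical "webapp" deployment to 4 replicas; on an EKS cluster
# with a matching Fargate profile, each new pod is run by Amazon Fargate.
apps.patch_namespaced_deployment_scale(
    name="webapp",
    namespace="default",
    body={"spec": {"replicas": 4}},
)
```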
Components
The entire solution is based on the use of the following main AWS services:
Amazon Fargate
The key to the solution is the use of Amazon Fargate, a serverless compute engine for containers that works with both Amazon Elastic Container Service (ECS) and Amazon Elastic Kubernetes Service (EKS). The service instantiates container images in private subnets; to expose the containers securely, each one is attached to a network interface in a private subnet.
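As a minimal sketch, a Fargate profile can be created with boto3 so that pods in a given namespace are scheduled onto Fargate; the cluster name, role ARN, subnet IDs and namespace below are placeholders.

```python
import boto3

eks = boto3.client("eks", region_name="eu-west-1")

# Pods whose namespace matches a selector of this profile are run on Fargate
# inside the private subnets listed here (all identifiers are placeholders).
eks.create_fargate_profile(
    fargateProfileName="webapp-profile",
    clusterName="webapp-cluster",
    podExecutionRoleArn="arn:aws:iam::123456789012:role/eks-fargate-pod-execution",
    subnets=["subnet-0aaa", "subnet-0bbb"],
    selectors=[{"namespace": "webapp"}],
)
```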
Amazon Elastic Kubernetes Service
Amazon EKS is a managed service that lets you easily run Kubernetes on AWS without having to install and operate your own control-plane or worker nodes. As part of the current solution, the worker nodes are provided by Amazon Fargate.
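Creating the managed control plane itself is a single API call; below is a minimal boto3 sketch under the assumption that the IAM role and subnets already exist (the identifiers are placeholders, and the cluster takes several minutes to become active).

```python
import boto3

eks = boto3.client("eks", region_name="eu-west-1")

# AWS operates the Kubernetes control plane; only the VPC wiring and the
# service role are supplied by the consumer (identifiers are placeholders).
eks.create_cluster(
    name="webapp-cluster",
    roleArn="arn:aws:iam::123456789012:role/eks-cluster-service-role",
    resourcesVpcConfig={
        "subnetIds": ["subnet-0aaa", "subnet-0bbb"],
    },
)

# Wait until the control plane is ready before adding Fargate profiles.
eks.get_waiter("cluster_active").wait(name="webapp-cluster")
```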
Amazon Elastic Container Registry
Amazon Elastic Container Registry (ECR) is a fully managed Docker container registry that allows developers to easily store, manage, and deploy Docker container images. Amazon ECR integrates with Amazon Elastic Container Service (ECS) and Amazon Elastic Kubernetes Service (EKS) to simplify your workflows, from development to production.
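For illustration, a repository can be created and a temporary Docker login credential retrieved with boto3 as sketched below; the repository name is a placeholder.

```python
import base64
import boto3

ecr = boto3.client("ecr", region_name="eu-west-1")

# Create a private repository for the web application image (name is a placeholder).
ecr.create_repository(
    repositoryName="webapp",
    imageScanningConfiguration={"scanOnPush": True},
)

# Retrieve a temporary credential that `docker login` (or a CI pipeline)
# can use to push images to the registry.
token = ecr.get_authorization_token()["authorizationData"][0]
user, password = base64.b64decode(token["authorizationToken"]).decode().split(":")
registry_url = token["proxyEndpoint"]
```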
Application Load Balancer
The Application Load Balancer distributes calls to the web application across the container instances created by the EKS orchestrator. The load balancer continuously performs health checks to ensure that each target is still able to receive requests (HTTP and HTTPS).
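The following boto3 sketch shows how the health checks described above could be configured on a target group; in practice a controller such as the AWS Load Balancer Controller typically creates these resources from Kubernetes Ingress objects, and the VPC ID, port and path here are placeholders.

```python
import boto3

elbv2 = boto3.client("elbv2", region_name="eu-west-1")

# Target type "ip" lets the load balancer address Fargate pod network
# interfaces directly; the VPC ID and health-check path are placeholders.
elbv2.create_target_group(
    Name="webapp-targets",
    Protocol="HTTP",
    Port=8080,
    VpcId="vpc-0123",
    TargetType="ip",
    HealthCheckProtocol="HTTP",
    HealthCheckPath="/health",
    HealthCheckIntervalSeconds=15,
    HealthyThresholdCount=3,
    UnhealthyThresholdCount=3,
)
```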
Amazon CloudFront
Amazon CloudFront is a fast content delivery network (CDN) that securely distributes data, videos, apps and APIs to your users, with low latency and high transfer speeds.
Azure Solution
The solution on Azure aims to maximize the use of managed services and serverless compute resources through Azure Container Instances. This service provisions computing capacity for Kubernetes pods without managing servers or reserving computing capacity in advance. The serverless solution uses managed load balancing, firewall, container registry and content delivery services:
High Availability
High availability is made possible through the use of managed container orchestration services such as Azure Kubernetes Service. Microsoft Azure guarantees a service level for each of the components and balances load automatically. In addition, each Azure region operates on a minimum of two data centers.
Security
Network security is provided through the use of network security services such as Azure Application Gateway. Additionally, using a content delivery infrastructure (Azure CDN) effectively protects resources from denial of service attacks. The integration of security and computing services is seamless, which simplifies resource configuration in Azure.
Elasticity
Elasticity is achieved by using a combination of managed services (Azure Kubernetes Service, Azure Container Registry, Azure Load Balancer) and the Azure Container Instances serverless container execution engine. Elasticity is obtained by changing the number of container instances running.
Components
The solution uses the following Azure services:
Azure Container Instances
Azure Container Instances provides a quick and easy way to run a container in Azure, without having to manage virtual machines and without having to adopt a higher-level service.
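Below is a minimal sketch using a recent azure-mgmt-containerinstance SDK; the subscription ID, resource group, region and image are placeholders, and in the AKS virtual-node setup described here the instances are actually created by the orchestrator rather than by hand.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.containerinstance import ContainerInstanceManagementClient
from azure.mgmt.containerinstance.models import (
    Container,
    ContainerGroup,
    OperatingSystemTypes,
    ResourceRequests,
    ResourceRequirements,
)

# Subscription, resource group, region and image are placeholders.
client = ContainerInstanceManagementClient(DefaultAzureCredential(), "<subscription-id>")

container = Container(
    name="webapp",
    image="myregistry.azurecr.io/webapp:latest",
    resources=ResourceRequirements(requests=ResourceRequests(cpu=1.0, memory_in_gb=1.5)),
)

group = ContainerGroup(
    location="westeurope",
    containers=[container],
    os_type=OperatingSystemTypes.LINUX,
)

# Provision the container group; billing covers only the running instances.
client.container_groups.begin_create_or_update("webapp-rg", "webapp-group", group).result()
```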
Azure Kubernetes Service
Azure Kubernetes Service (AKS) simplifies the deployment of a managed Kubernetes cluster in Azure. AKS helps reduce the complexity and operational overhead of managing a Kubernetes cluster by delegating much of this responsibility to Azure.
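As a sketch, a managed cluster can be created with the azure-mgmt-containerservice SDK; the names, DNS prefix and VM size are placeholders, and the exact model fields vary slightly between SDK versions.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.containerservice import ContainerServiceClient
from azure.mgmt.containerservice.models import (
    ManagedCluster,
    ManagedClusterAgentPoolProfile,
    ManagedClusterIdentity,
)

client = ContainerServiceClient(DefaultAzureCredential(), "<subscription-id>")

# Azure operates the control plane; only a small system node pool and a
# managed identity are declared here (all names are placeholders).
cluster = ManagedCluster(
    location="westeurope",
    dns_prefix="webapp",
    identity=ManagedClusterIdentity(type="SystemAssigned"),
    agent_pool_profiles=[
        ManagedClusterAgentPoolProfile(
            name="system",
            mode="System",
            count=2,
            vm_size="Standard_DS2_v2",
        )
    ],
)

client.managed_clusters.begin_create_or_update("webapp-rg", "webapp-aks", cluster).result()
```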
Azure Container Registry
Azure Container Registry allows you to create, store and manage container images and artifacts in a private registry for all types of container deployments.
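A hedged sketch with azure-mgmt-containerregistry follows (recent versions expose begin_create, older ones a synchronous create); the registry name, resource group and region are placeholders, and registry names must be globally unique.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.containerregistry import ContainerRegistryManagementClient
from azure.mgmt.containerregistry.models import Registry, Sku

client = ContainerRegistryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Create a private registry to hold the web application images
# (all values are placeholders).
client.registries.begin_create(
    "webapp-rg",
    "webappregistry",
    Registry(location="westeurope", sku=Sku(name="Basic")),
).result()
```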
Azure Application Gateway
Azure Application Gateway is a web traffic load balancer that allows you to manage traffic to your web applications. Application Gateway can make routing decisions based on additional attributes of an HTTP request, such as host headers or the path of a Uniform Resource Identifier (URI).
Azure CDN
A content delivery network (CDN) is a distributed network of servers capable of efficiently delivering web content to users. It stores cached content on edge servers at points of presence (POPs) close to end users to reduce latency.
Conclusion
Through this article we’ve demonstrated how both AWS and Microsoft Azure offer the possibility of deploying web applications in an entirely managed and, crucially, serverless way. These solutions will help enterprises and teams use the cloud in the most economical way when their workload is unpredictable and spiky.