From Mustafa & Samuel

Using Terraform to deploy a Barebone serverless API in Azure API Management Service (Part 1)

How is this blog organised?

1. Firstly, we will introduce the framework and how we use Infrastructure as Code to offer an API Service in the cloud.

2. Secondly, a “Technologies used” section is added as a quick reminder (you can skip this part if you are already familiar with these technologies).

2.1 Azure API Management Service

2.2 Azure Functions

2.3 Terraform – Infrastructure as Code

2.4 Python Flask

2.5 OpenAPI

3. Architecture

3.1 List of resources and manual deployment

3.2 Barebone serverless API

3.3 Deployment of business functionality

3.4 Alternative way for deploying an API Service

4. Conclusion

5. References

 

1. Introduction

In the past few years, serverless APIs have emerged as a powerful and innovative solution for building scalable and cost-effective applications that adapt to different workload requirements. Moreover, by leveraging the power of infrastructure as code (IaC), such applications become reproducible and fast to deploy.

In this blog post series, we will show and explain a solution architecture for deploying an API Service in the Azure cloud. First, we will explain the idea and the technologies used to build the service. Once we are on the same page about the core concepts, we will follow up with the implementation.

This architecture comprises a Barebone serverless API that is pre-configured to work with the Azure API Management Service. The deployment is carried out using Terraform and Azure DevOps. As a result, users can leverage the full suite of features provided by the API Management Service, including serverless API capabilities, and the application development team only needs to focus on deploying the business functionality to the Barebone serverless API. In the following sections we will explain what the Barebone serverless API is and how it can be seamlessly integrated into existing Azure environments.

The core concept here is embracing IaC to offer a service rather than manually deploying resources. For example, setting up monitoring and analytics features in Azure API Management Service can take a considerable amount of time. The main effort here lies in connecting the different infrastructure resources. A fundamental concept that guided our approach was the separation of infrastructure deployment from API deployment. Given that the API relies on Azure resources, this may initially appear contradictory. Our proposed solution is a Barebone serverless API, which consists of pre-deployed infrastructure resources that are maintained and updated through a specific pipeline responsible for deploying the business functionality of the API.

 

2. Technologies used

2.1 Azure API Management Service:

Located at the core of our architecture is the API Management Service, a platform from Microsoft Azure that allows organizations to publish, secure, and manage APIs in a scalable manner. The API Management Service provides a centralized gateway for exposing APIs to customers, partners, or internal teams. The service offers various advantages such as API governance and security, API analytics and Application Insights, as well as integration with other Azure services. One can also analyze the number of requests per API resource and their geolocation, debug with additional log information, run queries against the collected log data, and more. The most relevant advantage, however, is the robust and secure ecosystem the service enables by leveraging the scalability and high availability of serverless architectures [1].

2.2 Azure Functions:

Azure Functions is a serverless computing service that enables developers to write, deploy, and run code in response to various events or triggers, such as HTTP requests, database changes, or timer-based schedules. It automatically scales based on demand, ensuring that you pay only for the resources used during execution. Azure Functions supports multiple programming languages, including C#, Python, JavaScript, and more, making it suitable for a wide range of application scenarios, from simple automation tasks to complex microservices architectures. It provides integration with other Azure services, extensive monitoring and logging capabilities, and a rich ecosystem of extensions and bindings to connect to various data sources and services. Azure Functions empowers developers to focus on writing code to address specific business logic while abstracting away infrastructure management concerns, resulting in faster development and efficient resource utilization.

2.3 Terraform – Infrastructure as Code:

To orchestrate an efficient and automatic deployment of our service, we bring into play the power of Infrastructure as Code, specifically Terraform from HashiCorp. This cross-platform, open-source technology allows us to define and provision cloud infrastructure and services in a declarative manner. By using Terraform to build our cloud infrastructure, we take advantage of automation and efficient deployments. Moreover, infrastructure changes become more predictable, reproducible, and auditable [2].

It is important to mention the Terraform state and how it tracks infrastructure changes. The Terraform state file is the single source of truth for our Terraform-managed resources. When a resource undergoes a change that does not align with the configuration stored in this file, Terraform will automatically rectify it during the next terraform apply run, unless those specific changes have been configured to be ignored by Terraform.
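A minimal sketch of that exception, using an illustrative resource and placeholder tags (not taken from our actual configuration):

# Sketch: Terraform keeps managing the resource but ignores drift on the
# listed attributes, so changes made outside of Terraform are not reverted.
resource "azurerm_resource_group" "example" {
  name     = "rg-example"
  location = "westeurope"

  tags = {
    owner = "platform-team"
  }

  lifecycle {
    # Tags changed outside of Terraform survive the next terraform apply run.
    ignore_changes = [tags]
  }
}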

2.4 Python Flask:

Python Flask is a web application framework that allows developers to build APIs quickly and efficiently. It is based on the WSGI (Web Server Gateway Interface) standard and integrates well with the Python ecosystem, making it a common and versatile choice for API developers. Flask can also be extended with various extensions and third-party libraries, enabling customization and extension of functionality based on project requirements [3].

2.5 OpenAPI:

The OpenAPI Specification (formerly Swagger Specification) is an API description format for REST APIs. An OpenAPI file allows you to describe your entire API, including:

a. Available endpoints (e.g.: /users) and operations on each endpoint (e.g.: GET /users, POST /users).

b. Input and output parameters for each operation.

c. Authentication methods.

d. Contact information, license, terms of use and other relevant information.

API specifications can be written in YAML or JSON. The format is easy to learn and readable to both humans and machines [4].

 

3. Architecture

The architecture diagram below (see Figure 1) illustrates how to seamlessly integrate this service into your existing infrastructure repositories. If you are currently using Azure DevOps and Azure for deploying your projects, you will find this concept to be highly valuable. By bootstrapping specific use cases with an API service, the software team can directly push their business logic into a service that empowers them to harness the cloud’s capabilities, including additional services such as API Management Service. To comprehend how to deliver this service, the reader needs to understand the following:

i. Separation of infrastructure deployment and API business logic deployment gives us a service for APIs. This way we can easily provide the infrastructure of this API Service to other projects.

ii. The separate deployment of the API business functionality, which includes operations, endpoints, and the business logic, was the complex part of this setup and will be explained in this blog series.

iii. To make the second point work, we applied a code-first approach: we develop the API first and then generate from it the information needed to update the Barebone serverless API.

Afterwards, we will go through the list of resources in the order of their deployment, with additional information. This way we not only explain the items depicted in the diagram, but also hope to convince the reader that automating the deployment of these resources is a good idea.

Figure 1. Use-Case architecture for deploying APIs in Azure API-Management Service by using Terraform. You can see the desired split between the deployment of the resources and the deployment of the API.

3.1 List of resources and manual deployment:

i. Deploy an Azure Function App and add, for example, a Flask API to handle the requests. Any request handler would do the job; it is up to you. Django is another option for Python, but we chose Flask since it is lightweight, highly customizable, and ideal for RESTful APIs or microservices.

ii. Deploy the API Management Service and in addition to that:

a. Deploy Application Insights. Azure Application Insights is a comprehensive application performance monitoring and diagnostics service offered by Microsoft Azure. This service aids developers and IT professionals in obtaining valuable insights into the health, performance, and usage of their applications.

b. Deploy a Log Analytics Workspace. This is a service that allows you to collect, store, and analyze log and telemetry data from various sources, including applications, infrastructure, and services. It is often used in conjunction with Azure Monitor or Application Insights to gain deep insights into the performance, availability, and behavior of your applications and resources.

iii. Now you need to add an API to the API Management Service. Here you choose the previously deployed Azure Function App.

a. The Endpoints and Operations also need to be configured.

b. A deployment pipeline for the API development could also be set up, and API Management Service could be used for version management.

iv. You will also need to choose the service plan. The Consumption plan is a good option if you only want to pay for the time you use the resources.

While the list of steps may seem short, there are many additional configurations that need to be addressed. These configurations must be replicated for each similar setup, making it highly desirable to automate these tasks. Additionally, Terraform keeps track of the deployed resources and their state.
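As a rough illustration, a minimal Terraform sketch of the monitoring resources and an API Management instance on the Consumption tier could look as follows. All names, the location, and the SKUs are placeholder assumptions; the Function App, its Consumption (Y1) service plan, and the required storage account would be defined analogously and are omitted here for brevity.

# Sketch only: placeholder names, location, and SKUs.
resource "azurerm_resource_group" "api" {
  name     = "rg-barebone-api"
  location = "westeurope"
}

resource "azurerm_log_analytics_workspace" "api" {
  name                = "log-barebone-api"
  location            = azurerm_resource_group.api.location
  resource_group_name = azurerm_resource_group.api.name
  sku                 = "PerGB2018"
}

# Workspace-based Application Insights, linked to the Log Analytics Workspace.
resource "azurerm_application_insights" "api" {
  name                = "appi-barebone-api"
  location            = azurerm_resource_group.api.location
  resource_group_name = azurerm_resource_group.api.name
  workspace_id        = azurerm_log_analytics_workspace.api.id
  application_type    = "web"
}

# API Management on the Consumption tier, so we only pay per execution.
resource "azurerm_api_management" "api" {
  name                = "apim-barebone-api"
  location            = azurerm_resource_group.api.location
  resource_group_name = azurerm_resource_group.api.name
  publisher_name      = "Example Org"
  publisher_email     = "api@example.org"
  sku_name            = "Consumption_0"
}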

3.2 Barebone serverless API

Firstly, we will define what the Barebone serverless API is and give some additional information and explain how it enables us to realize an API Service. After that we will explain the deployment of the business logic procedure. Lastly, we will demonstrate another way to implement an API Service that does not separate infrastructure from business logic. We will also show how this approach differs from our own.

When using the API Management Service, it is necessary to register an API within it. This registered API, together with the Azure Function, collectively forms what we refer to as the “Barebone serverless API”. To make this setup functional, specific configurations are required within the API Management Service. These configurations enable the service to redirect incoming requests to the Azure Function App and specify the necessary operations and endpoints for accurate request forwarding.
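A hedged Terraform sketch of such a registration could look like this; the names and the Function App URL are illustrative assumptions and build on the sketch from section 3.1:

# Sketch: register an API in API Management and point it at the Function App.
resource "azurerm_api_management_api" "barebone" {
  name                = "barebone-api"
  resource_group_name = azurerm_resource_group.api.name
  api_management_name = azurerm_api_management.api.name
  revision            = "1"
  display_name        = "Barebone serverless API"
  path                = "barebone"
  protocols           = ["https"]

  # Requests to https://<apim-host>/barebone/... are forwarded to the Function App.
  service_url = "https://func-barebone-api.azurewebsites.net/api"
}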

In Figure 2, you can observe an example of such a registered API, highlighting the endpoints and operations that must be set for it to work seamlessly with the API Management Service. These settings are established using the OpenAPI document. Combined with the Azure Function and the Flask API contained within it, this assembly can be viewed as a cohesive unit, which we have named the "Barebone serverless API". We call it "Barebone" because, after deploying the Flask API with default business logic, we only need to update that business logic, and for this to work we must of course also update the operations and endpoints of the registered API. Most importantly, all elements related to infrastructure setup and connections remain unaffected by these updates.

A key idea in this setup is the Azure Functions 'run from package' capability, which allows us to deploy and run our Function App from a pre-built package or ZIP file. This approach offers several advantages, such as simplified deployment: we only need to upload a single package that contains the Function App code and its dependencies. Even more importantly, we need to tell Terraform to ignore changes to the package. This enables us to develop our API and deploy new versions that will not be reset during the next terraform apply run. Changes to the registered API in the API Management Service, such as its name, endpoints, or operations, also need to be ignored.
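The following sketch illustrates both ideas for a Linux Function App. It assumes a Consumption (Y1) service plan and a storage account defined elsewhere, and the exact set of ignored attributes will vary with your setup:

# Sketch: run the Function App from a package and let the deployment pipeline,
# not Terraform, own the deployed code.
resource "azurerm_linux_function_app" "barebone" {
  name                       = "func-barebone-api"
  resource_group_name        = azurerm_resource_group.api.name
  location                   = azurerm_resource_group.api.location
  service_plan_id            = azurerm_service_plan.consumption.id
  storage_account_name       = azurerm_storage_account.func.name
  storage_account_access_key = azurerm_storage_account.func.primary_access_key

  site_config {
    application_stack {
      python_version = "3.10"
    }
  }

  app_settings = {
    # Run the app from the uploaded ZIP package.
    "WEBSITE_RUN_FROM_PACKAGE" = "1"
  }

  lifecycle {
    # Packages deployed by the pipeline must not be reverted on the next apply.
    ignore_changes = [app_settings["WEBSITE_RUN_FROM_PACKAGE"]]
  }
}

An analogous lifecycle block with ignore_changes for the display name, the imported OpenAPI contract, and related settings would be added to the azurerm_api_management_api resource sketched above.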

Figure 2. An example of an API being registered in the API Management Service [5].

3.3 Deployment of Business Functionality

A key idea we need to emphasize is the code-first approach, which here means that the business functionality of the API is developed first, and from it the endpoints and operations are extracted into an OpenAPI document. This OpenAPI document is then used to update the operations and endpoints of the registered API. This has the favorable outcome of preventing both the provisioning of endpoints the API cannot serve and the absence of expected endpoints within the API Management Service.

The following essential steps are necessary for the deployment pipeline. It is important to note that Azure offers CLI commands for registering new resources or updating existing ones, such as an Azure Function or an API in the API Management Service; these commands are used in the deployment pipeline.

i. Package the Flask API.

ii. Generate the OpenAPI document.

iii. Update Azure Function.

iv. Update Registered API.

Let us elaborate further on these points and describe a DevOps procedure. Develop and build your API using the Python Flask framework. Operations, endpoints, and business logic are developed according to the requirements, applying best practices (for example, structuring the code for maintainability and scalability, and writing unit tests).

We then let the pipeline automatically generate an OpenAPI document that describes the structure and functionality of the API. It also serves as documentation and specification for our API endpoints, request/response schemas, and any additional metadata. At this point we package the API with all its dependencies into a ZIP file. We can then use this package to update the business logic of the Azure Function in the cloud. Lastly, we update the registered API with the corresponding endpoints and operations using the OpenAPI document.

3.4 Alternative way for deploying an API Service

To better understand the separation of the infrastructure and API development, we will discuss an alternative method for implementing an API service and explain the reasons for our preferred architecture solution.

We could deploy the operations and endpoints of the API using the Terraform resource block azurerm_api_management_api_operation (sketched below). Following that, one could deploy the business logic into the Azure Function responsible for request handling. We do not use this approach, because our objective is to achieve a complete separation between the development of the API and the infrastructure code.
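For reference, such an operation definition might look roughly like this, with illustrative names and URL template, reusing the resources from the earlier sketches:

# Sketch: declaring a single operation directly in Terraform. With this
# approach, every endpoint change in the business logic also requires a
# change to the infrastructure code.
resource "azurerm_api_management_api_operation" "get_users" {
  operation_id        = "get-users"
  api_name            = azurerm_api_management_api.barebone.name
  api_management_name = azurerm_api_management.api.name
  resource_group_name = azurerm_resource_group.api.name
  display_name        = "Get users"
  method              = "GET"
  url_template        = "/users"
}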

Our solution has an obvious advantage: with the alternative approach, it could technically happen that the business logic of your API exposes operations and endpoints that are not included in the registered API, or that the registered API still lists endpoints that are no longer handled by the business logic. This problem cannot occur with our setup.

 

4. Conclusion

In this post you have been presented with the idea and key points of how to set up an API Service in Azure using Terraform. Ultimately, the Barebone serverless API idea can be abstracted, and other services can be offered in the same way. As you have probably already realized, we did not go into the specifics of how to implement all of this. In the next part we will explain the necessary code and what we had to figure out to make this work.

The following is a list of the key aspects we will address in the next blog:

  1. Run the Azure Function from a package.
  2. Generate the OpenAPI document.
  3. Register an API in the API Management Service.
  4. Update the Azure Function.
  5. Update the registered API using the OpenAPI document.
  6. Use a Managed Identity to hide the Azure Function behind the API Management Service.

 

5. References

[1]: https://learn.microsoft.com/en-us/azure/api-management/

[2]: https://developer.hashicorp.com/terraform/tutorials/aws-get-started/infrastructure-as-code

[3]: https://flask-restx.readthedocs.io/en/latest/

[4]: https://swagger.io/

[5]: https://learn.microsoft.com/en-us/azure/api-management/import-and-publish


About the authors

Mustafa Akman has been working in his position as a Software and Data Engineer at Woodmark since 2019. His work strongly focuses on the areas of Big Data, Cloud, and DevOps. With great passion, he develops customized solutions in client projects, relying on technologies such as Python, Scala, SQL, and Databricks.

Samuel Fernandez has been working as an Associated Consultant at Woodmark since September 2022.  In his role, he has actively engaged with topics such as Data Engineering and Software Development. He is also enthusiastic about DevOps and Cloud technologies (AWS Certified).
