Automated Azure Resource Naming and Deployment Made Easy: A Self-Service Infrastructure (Preview) Approach
- Arlan Nugara

- Jan 20
- 6 min read
Updated: May 4
In today's rapidly evolving cloud landscape, efficient and automated management of Azure resources is essential. This blog post delves into a cutting-edge solution that streamlines both resource naming and deployment through a pipeline-driven process—all without relying on the Terraform Plus license or third-party tools like Morpheus Data. By harnessing multiple technology stacks, this approach provides a self-service infrastructure (currently in preview) that simplifies the complexities of Azure resource management while ensuring clarity, ease of maintenance, and streamlined operations.
The solution outlines a comprehensive process beginning with prerequisites such as configuring an Azure Storage Account (with a dedicated container to store Terraform state files) and establishing a robust service connection in Azure DevOps. It then walks you through setting up essential components like provider file configurations, updating pipeline YAML files, and integrating the Azure Naming Tool API for dynamic resource naming. Following this, detailed instructions guide you in creating both deploy and destroy pipelines via Azure DevOps, highlighting the necessary parameters and ensuring smooth operational transitions.
Overview of the process
The code uses multiple technology stacks to overcome the combined hurdle of automated Azure resource naming and resource deployment, and also includes properties of self-service infrastructure (preview) without using a Terraform Plus license or any third-party tool such as Morpheus Data. Despite the complexity of the process, the code is kept simple for ease of understanding and maintenance. The process is mostly pipeline driven. Let's understand how the code works.
Deploy Process

Destroy Process

Prerequisites
Azure Storage Account
Please create an Azure Storage Account and a container named tfstate to store Terraform state files (you may change the name of the container in provider.tf). Please note that the Service Principal must have access to the Storage Account. Note the Access Key for the Storage Account from the Azure Portal.
Note: The details of the Storage Account must be filled into the backend configuration in the provider.tf file.
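The prerequisites above can be provisioned from the command line as well. The resource group, account, and container names below are placeholders of my own choosing; substitute your own values.

```shell
# Hypothetical names -- replace with values that fit your environment
az group create --name rg-tfstate --location eastus

az storage account create \
  --name sttfstatedemo01 \
  --resource-group rg-tfstate \
  --sku Standard_LRS

az storage container create \
  --name tfstate \
  --account-name sttfstatedemo01

# Retrieve the Access Key referenced above
az storage account keys list \
  --account-name sttfstatedemo01 \
  --resource-group rg-tfstate \
  --query "[0].value" -o tsv
```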
Service Connection
Azure DevOps pipelines require a Service Connection to run tasks. The Service Principal must have access to Key Vault secrets (Get and List permissions) to retrieve the secret values required while running tasks. Please refer to this official article for creating the Service Connection from a Service Principal. Note the following values for the Service Principal from the Azure Portal.
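If you do not yet have a Service Principal, the following is one way to create it and grant the Key Vault permissions described above. The SP name, Key Vault name, and scope are placeholders, not values from this project.

```shell
# Hypothetical names -- adjust to your environment
az ad sp create-for-rbac \
  --name sp-selfservice-iac \
  --role Contributor \
  --scopes /subscriptions/<SUBSCRIPTION_ID>

# Grant the Service Principal Get and List on Key Vault secrets
az keyvault set-policy \
  --name kv-selfservice-demo \
  --spn <APP_ID_FROM_PREVIOUS_OUTPUT> \
  --secret-permissions get list
```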
Configure Azure Naming Tool
A full tutorial on the Azure Naming Tool is out of scope for this process; however, a few tutorials are available here and here. As part of this POC project, we have configured the Naming Tool to generate resource names.
Azure Naming Tool API Keys
The API key is used to authenticate external tools that request names from the Azure Naming Tool. To obtain an API key, follow these steps:
Go to the Azure Naming Tool web portal.
Log in by clicking the Admin option in the left menu.
Check the keys under the API Keys option.
Note the following values
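To illustrate how a pipeline step might call the Naming Tool with this key, here is a minimal Python sketch. The base URL and API key are placeholders; the endpoint path and payload fields follow the Naming Tool's ResourceNamingRequests API, but verify them against your own instance's Swagger page, as they are assumptions here.

```python
import json
import urllib.request

# Hypothetical values -- replace with your Naming Tool URL and API key
NAMING_TOOL_URL = "https://mynamingtool.azurewebsites.net"
API_KEY = "REPLACE-WITH-API-KEY"

def build_name_request(resource_type: str, components: dict) -> urllib.request.Request:
    """Build the HTTP request for a name-generation call.

    The /api/ResourceNamingRequests/RequestName path and the APIKey
    header are assumptions based on the Naming Tool's API; confirm
    them against your deployment.
    """
    payload = {"resourceType": resource_type, **components}
    return urllib.request.Request(
        url=f"{NAMING_TOOL_URL}/api/ResourceNamingRequests/RequestName",
        data=json.dumps(payload).encode("utf-8"),
        headers={"APIKey": API_KEY, "Content-Type": "application/json"},
        method="POST",
    )

req = build_name_request("st", {"environment": "dev", "location": "eastus"})
# urllib.request.urlopen(req) would return the generated name as JSON.
```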
Key Vault
An Azure Key Vault is required to store the secrets used by the pipeline to authenticate against Azure and Azure DevOps to perform its operations. Please note that the Service Principal mentioned above must have Get and List permissions on Key Vault secrets. Please create the secrets in Azure Key Vault. You may refer to the Service Connection and Azure DevOps PAT and URL sections for the values.
Secrets to be created in Azure Key Vault
ARM-CLIENT-ID
ARM-CLIENT-SECRET
ARM-TENANT-ID
ARM-ACCESS-KEY
SA-NAME
API-KEY
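The six secrets above can be created in one pass from the command line. The Key Vault name is a placeholder; the bracketed values come from the Service Connection and Storage Account steps earlier in this post.

```shell
# Hypothetical Key Vault name -- replace all placeholder values
KV_NAME="kv-selfservice-demo"

az keyvault secret set --vault-name "$KV_NAME" --name ARM-CLIENT-ID     --value "<SERVICE PRINCIPAL APP ID>"
az keyvault secret set --vault-name "$KV_NAME" --name ARM-CLIENT-SECRET --value "<SERVICE PRINCIPAL SECRET>"
az keyvault secret set --vault-name "$KV_NAME" --name ARM-TENANT-ID     --value "<TENANT ID>"
az keyvault secret set --vault-name "$KV_NAME" --name ARM-ACCESS-KEY    --value "<STORAGE ACCOUNT ACCESS KEY>"
az keyvault secret set --vault-name "$KV_NAME" --name SA-NAME           --value "<STORAGE ACCOUNT NAME>"
az keyvault secret set --vault-name "$KV_NAME" --name API-KEY           --value "<NAMING TOOL API KEY>"
```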
Variable Groups
The code needs an Azure DevOps Pipeline Variable Group linked to an existing Azure Key Vault containing the Secrets. Please refer to this official article for more details.
Updating Pipeline YAML file with values
Once done with all the above steps, update both pipeline files inside the .pipelines folder in the repository. You should also change the values of the Azure DevOps pipeline parameters as per your infrastructure. Please check here for more details.
variables:
  - name: AZURE_SERVICE_CONNECTION
    value: '< SERVICE CONNECTION NAME >'
  - group: '< VARIABLE GROUP NAME LINKED TO KEY VAULT >'
  - name: SUBSCRIPTION_ID
    ${{ if eq(parameters.Subscription, 'SUB-1') }}:
      value: xxxxx-xxxxx-xxxxx-xxxxx-xxxxx
    ${{ if eq(parameters.Subscription, 'SUB-2') }}:
      value: yyyyy-yyyyy-yyyyy-yyyyy-yyyyy
Update Provider file with values
You need to update provider.tf file with values for the Azure Storage Account which will host the Terraform State file.
backend "azurerm" {
  resource_group_name  = "< Storage Account Resource Group Name >"
  storage_account_name = "< Storage Account Name >"
  container_name       = "tfstate"
  key                  = "PLACEHOLDER"
}
Note: You can change all of the values except key, which is passed dynamically in the pipeline task command to achieve a dynamic state file name.
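For illustration, a pipeline step overriding key at init time might look like the sketch below. This is not taken from the repository: the RESOURCE_NAME variable is assumed to hold the name generated by the Naming Tool, and ARM-ACCESS-KEY comes from the linked variable group.

```yaml
# Sketch of a pipeline step with a dynamic state file name (assumed
# variable names; adapt to the actual pipeline YAML in .pipelines)
- script: |
    terraform init \
      -backend-config="key=$(RESOURCE_NAME).tfstate" \
      -backend-config="access_key=$(ARM-ACCESS-KEY)"
  displayName: 'Terraform Init with dynamic state file name'
```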
Pipelines
Once the updates to the code are complete and the latest code is pushed to the repository, please proceed to create the pipelines in Azure DevOps. Follow the instructions below to create both the Deploy and Destroy pipelines.
Creating Deploy Pipeline
Please follow these instructions to create the deploy pipeline:
Go to Pipelines in Azure DevOps
Click on New Pipeline from the top right corner
Select Azure Repos Git
Select your repository containing this code
Select Existing Azure Pipelines YAML file
Select the branch and set the path to self-service-deploy-storage-account.yaml
Click on Continue
Click on Save from Run drop down menu on top right corner
You may rename the pipeline by choosing Rename/move from top right corner Kebab menu
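As an alternative to the portal steps above, the same pipeline can be created with the Azure DevOps CLI. The repository name and branch below are placeholders.

```shell
# Requires the Azure DevOps CLI extension:
#   az extension add --name azure-devops
az pipelines create \
  --name "self-service-deploy-storage-account" \
  --repository "<YOUR REPOSITORY NAME>" \
  --repository-type tfsgit \
  --branch main \
  --yml-path ".pipelines/self-service-deploy-storage-account.yaml" \
  --skip-first-run
```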
Running the Deploy Pipeline
Please follow these instructions to run the deploy pipeline:
Creating Destroy Pipeline
Please follow these instructions to create the destroy pipeline:
Go to Pipelines in Azure DevOps
Click on New Pipeline from the top right corner
Select Azure Repos Git
Select your repository containing this code
Select Existing Azure Pipelines YAML file
Select the branch and set the path to self-service-destroy-storage-account.yaml
Click on Continue
Click on Variables button and then New Variable
Provide the Name as RESOURCE_NAME and keep the Value empty. Select Let users override this value when running this pipeline
Click on the OK button and then on the Save button
You may rename the pipeline by choosing Rename/move from top right corner Kebab menu
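The RESOURCE_NAME variable added in the steps above can also be created with the Azure DevOps CLI; the pipeline name below assumes you kept the default name.

```shell
az pipelines variable create \
  --pipeline-name "self-service-destroy-storage-account" \
  --name RESOURCE_NAME \
  --allow-override true
```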
Running the Destroy Pipeline
Please follow these instructions to run the destroy pipeline:
Go to Pipelines in Azure DevOps.
Click on All option and click on the destroy pipeline created above
Click on Run Pipeline from top right corner
Set the Apply option to No and click on the Variables option.
Click on the variable name RESOURCE_NAME and provide the resource name to be destroyed.
Click on Update button and go back.
Click on Run button
Follow the Pipeline Status
Note: It is recommended to keep the Apply option set to No the first time. Once satisfied with the Terraform plan output, rerun the pipeline with the Apply option set to Yes.
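A plan-only run can also be queued from the command line. The pipeline name is assumed from the earlier steps; note that the Apply option is a runtime parameter, so set it in the portal's run dialog unless your CLI version supports passing parameters.

```shell
# First run: plan only (leave the Apply option as No in the run dialog)
az pipelines run \
  --name "self-service-destroy-storage-account" \
  --variables RESOURCE_NAME="<NAME OF RESOURCE TO DESTROY>"
```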
Pipeline Parameters
General Parameters
Storage Account Parameters
Note: Azure DevOps does not support conditional parameters, so choose compatible values. For example, Hierarchical Namespace can only be true when the Account Tier is Standard, or when the Account Tier is Premium and the Account Kind is BlockBlobStorage.
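Since Azure DevOps cannot express the constraint above in the parameter definitions themselves, the best you can do is restrict each parameter to a fixed list of values. The following is a sketch of how such parameters might be declared; the names, defaults, and value lists are assumptions based on the constraints described, not the repository's actual YAML.

```yaml
# Sketch of runtime parameters (assumed names and values)
parameters:
  - name: Subscription
    type: string
    default: 'SUB-1'
    values: ['SUB-1', 'SUB-2']
  - name: AccountTier
    type: string
    default: 'Standard'
    values: ['Standard', 'Premium']
  - name: AccountKind
    type: string
    default: 'StorageV2'
    values: ['StorageV2', 'BlockBlobStorage', 'FileStorage']
  - name: HierarchicalNamespace
    type: boolean
    default: false
```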
Verify Deployed Resource
The resources can be verified from the Azure Portal. Additionally, you can verify the .tfvars and .tfstate files in the Storage Account used in the backend configuration.
Storage Account
Additional Files
Destroy the Infrastructure
Verify Destroyed Resource
The Storage Account should be deleted from the Azure subscription, along with the associated .tfstate and .tfvars files.