
Entra ID login to a CIS Hardened Linux Azure Virtual Machine

· 7 min read

Currently, CIS (Center for Internet Security) Azure Marketplace images do not support being Entra ID (Azure Active Directory) joined.

Although this article is about allowing Entra ID login to an Ubuntu machine, it's worth noting CIS's current position on Windows as well:

"The Windows CIS Benchmarks are written for Active Directory domain-joined systems using Group Policy, not standalone/workgroup systems. Adjustments/tailoring to some recommendations will be needed to maintain functionality if attempting to implement CIS hardening on standalone systems or a system running in the cloud. Currently, we do not support Azure Active Directory and it is not compatible with our EXISTING Hardened Images."

In fact, when you attempt to create a CIS Level 1 Ubuntu image in the Azure Portal, you get:

"This image does not support Login with Azure AD."

CIS Image does not support Login with Azure AD

However, as I go into below, we can indeed log in to the CIS hardened image using the Microsoft Azure AD based SSH Login extension.

Be aware that although this works, you may run into issues with operational support from CIS due to the hardening. This is also Entra ID LOGIN (not JOINED!), so you won't see the device under Entra ID Devices.

Entra ID login to a CIS Hardened Linux Azure Virtual Machine

There are many security benefits of using Azure AD to SSH in to Linux VMs in Azure, including:

  1. Use of your Entra ID (Azure AD) credentials to log in to Azure Linux VMs.
  2. Get SSH certificate-based authentication without needing to distribute SSH keys to users or provision SSH public keys on any Azure Linux VMs you deploy.
  3. Reduce reliance on local administrator accounts, credential theft, and weak credentials.
  4. Password complexity and password lifetime policies configured for Azure AD help secure Linux VMs as well.
  5. With Azure role-based access control, specify who can log in to a VM as a regular user or with administrator privileges. When users join or leave your team, you can update the Azure RBAC policy for the VM to grant access as appropriate. When employees leave your organization and their user account is disabled or removed from Azure AD, they no longer have access to your resources.
  6. With Conditional Access, configure policies to require multi-factor authentication and/or require that the client device you are using to SSH is a managed device (for example: compliant device or hybrid Azure AD joined) before you can SSH to Linux VMs.
  7. Use Azure deploy and audit policies to require Azure AD login for Linux VMs and to flag use of non-approved local accounts on the VMs.

So, after your CIS Hardened Image (in my case, Ubuntu 20.04) has been deployed in Azure, let's set this up.

You will need to make sure you have a few prerequisites.

Prerequisites

Network

VM network configuration must permit outbound access to the following endpoints over TCP port 443.

  • https://packages.microsoft.com: For package installation and upgrades.
  • http://169.254.169.254: Azure Instance Metadata Service endpoint.
  • https://login.microsoftonline.com: For PAM-based (pluggable authentication module) authentication flows.
  • https://pas.windows.net: For Azure RBAC (Role Based Access Control) flows.

Also make sure you have enabled outbound TCP port 80 to ubuntu.com, specifically http://archive.ubuntu.com, as the Microsoft Azure AD based SSH Login extension will need to download and install the aadsshlogin and aadsshlogin-selinux packages as needed.

Virtual Machine

The CIS hardened image will need a system-assigned managed identity. This can be easily enabled in the Identity blade of the Virtual Machine.
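
If you would rather script this than use the portal, a minimal sketch with the Az PowerShell module (using the VM and resource group names referenced later in this article) is:

# Enable a system-assigned managed identity on an existing VM
$vm = Get-AzVM -ResourceGroupName "cistest" -Name "cistest"
Update-AzVM -ResourceGroupName "cistest" -VM $vm -IdentityType SystemAssigned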

The Entra ID (Azure Active Directory) users that need to log in to the Linux Virtual Machine must be a member of one of the following Azure RBAC (Role Based Access Control) roles, as per their requirements:

Virtual Machine Administrator Login: View Virtual Machines in the portal and log in as administrator.
Virtual Machine User Login: View Virtual Machines in the portal and log in as a regular user.

Only one role is required. These roles are supported for both Windows and Linux.
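
For example, to assign the administrator role to a user scoped to just this VM, a rough sketch using the Az PowerShell module (the user principal name is a placeholder) would be:

# Assign the Virtual Machine Administrator Login role, scoped to the VM
$vm = Get-AzVM -ResourceGroupName "cistest" -Name "cistest"
New-AzRoleAssignment -SignInName "user@yourdomain.com" -RoleDefinitionName "Virtual Machine Administrator Login" -Scope $vm.Id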

Client

On the jump host or client PC that you will be connecting to the Linux virtual machine from, you need the latest Azure CLI with the Azure CLI 'ssh' extension installed.

az extension add --name ssh

The minimum version required for the extension is 0.1.4.

az extension show --name ssh

Install Extension

Make sure the Virtual Machine is on, for the extension to install.

Install using the Azure Portal

Once the prerequisites have been met, it is time to install the extension.

  1. Login to the Azure Portal
  2. Navigate to your CIS Hardened Virtual Machine
  3. Click on Extensions + Applications
  4. Click + Add
     Azure Portal - Extensions
  5. Search for: Azure AD based SSH Login
     Azure Portal - SSH Extension
  6. Select the Azure AD based SSH Login extension
  7. Click Next
  8. Click Review and Create
  9. Click Create

After a few moments, the extension and supporting components will be installed. You can check the status under the Extensions + Applications blade; make sure provisioning has succeeded before proceeding to the next step to log in.

Azure Virtual Machine Extension

Install using Terraform

You can use the following Terraform code snippet to install the extension on a Linux Virtual Machine:

Make sure you assign the extension to the virtual machine using its ID.

resource "azurerm_virtual_machine_extension" "aad_login" {
name = "AADLogin"
virtual_machine_id =
publisher = "Microsoft.Azure.ActiveDirectory"
type = "AADSSHLoginForLinux" # For Windows VMs: AADLoginForWindows
type_handler_version = "1.0" # There may be a more recent version
}
Install using PowerShell

To install the Azure AD based SSH Login extension on a Linux Virtual Machine in Microsoft Azure using PowerShell, you can follow these steps:

  1. Open PowerShell on your local machine, or Azure Cloud Shell, and run:

    Connect-AzAccount
    Select-AzSubscription -SubscriptionName "Your Subscription Name"
    $vm = Get-AzVM -ResourceGroupName "Your Resource Group Name" -Name "Your VM Name"
    Set-AzVMExtension -ResourceGroupName $vm.ResourceGroupName -VMName $vm.Name -Name "AADSSHLoginForLinux" -Publisher "Microsoft.Azure.ActiveDirectory" -ExtensionType "AADSSHLoginForLinux" -TypeHandlerVersion "1.0"

Login to Virtual Machine

Now that the extension is stood up, it's time to connect.

REMEMBER to make sure that the Entra ID (AAD) accounts you want to use to log in to the Linux VM are a member of one of the following roles, either directly assigned at the Virtual Machine/Resource Group scope, or via an Entra ID (AAD) group that has been delegated the permissions.

Virtual Machine Administrator Login: View Virtual Machines in the portal and log in as administrator.
Virtual Machine User Login: View Virtual Machines in the portal and log in as a regular user.

  1. Open a Command Prompt
  2. Log in to Azure using: az login
  3. A web browser will open and ask you to authenticate, where you will go through MFA (Multifactor Authentication) and complete the login to your Entra ID account.
  4. Once authenticated, you can run: az ssh vm -n cistest -g cistest Note: -n is the VM name and -g is the Resource Group that the VM is located in.

Entra ID Login - Linux VM

You should now have successfully authenticated to your Linux virtual machine using Entra ID credentials.

Troubleshooting

If you are unable to connect, it may be due to an issue with the AADSSHLogin extension. You can troubleshoot by reviewing the extension log.

cat /var/log/azure/Microsoft.Azure.ActiveDirectory.AADSSHLoginForLinux/CommandExecution.log

The /var/log/azure directory is protected, so you will need administrator (sudo) rights to delve into the logs.

Changing the default Management Group in Azure

· 2 min read

By default, when a Management Group gets created, it goes under the Root Management Group, the same is true for Subscriptions.

This works fine when you have a simple Microsoft Azure environment. But as soon as you start expanding into areas such as subscription vending, limiting who can see the Root Management Group, or Visual Studio Enterprise subscriptions, you may want to move new subscriptions under their own Management Group, away from any policies or RBAC controls; essentially a Management Group that acts as a shopping cart, from which subscriptions are picked up and moved later.

If we refer to a recommendation on the Microsoft Cloud Adoption Framework:

Configure a default, dedicated management group for new subscriptions. This group ensures that no subscriptions are placed under the root management group. This group is especially important if there are users eligible for Microsoft Developer Network (MSDN) or Visual Studio benefits and subscriptions. A good candidate for this type of management group is a sandbox management group.

So, how can we change the default Management Group that new Subscriptions go into?

Let's take a look at the different ways we can update the default management group that new subscriptions go into.

Configure using Azure Portal

  1. Use the search bar to search for and select 'Management groups'.
  2. On the root management group, select details next to the name of the management group.
  3. Under Settings, select Hierarchy settings.
  4. Select the Change default management group button.

Configuring using Bicep

resource symbolicname 'Microsoft.Management/managementGroups/settings@2021-04-01' = {
  name: 'default'
  parent: resourceSymbolicName
  properties: {
    defaultManagementGroup: 'string'
    requireAuthorizationForGroupCreation: bool
  }
}

Reference: Microsoft.Management managementGroups/settings

Configure using REST API using PowerShell

$root_management_group_id = "Enter the ID of root management group"
$default_management_group_id = "Enter the ID of default management group (or use the same ID of the root management group)"
$body = '{
"properties": {
"defaultManagementGroup": "/providers/Microsoft.Management/managementGroups/' + $default_management_group_id + '",
"requireAuthorizationForGroupCreation": true
}
}'
$token = (Get-AzAccessToken).Token
$headers = @{"Authorization"= "Bearer $token"; "Content-Type"= "application/json"}
$uri = "https://management.azure.com/providers/Microsoft.Management/managementGroups/$root_management_group_id/settings/default?api-version=2021-04-01"
Invoke-RestMethod -Method PUT -Uri $uri -Headers $headers -Body $body
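
To confirm the change has applied, you can call GET against the same settings endpoint, reusing the variables and headers from above:

# Retrieve the current hierarchy settings for the root management group
$uri = "https://management.azure.com/providers/Microsoft.Management/managementGroups/$root_management_group_id/settings/default?api-version=2021-04-01"
Invoke-RestMethod -Method GET -Uri $uri -Headers $headers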

Configure using Terraform

resource "azurerm_management_group_subscription_association" "example" {
management_group_id = data.azurerm_management_group.example.id
subscription_id = data.azurerm_subscription.example.id
}

Note: This is not quite the same as configuring the default setting as above - but you can specify the Management Group association for subscriptions using Terraform.

Access denied on Azure VM when using aztfexport

· 2 min read

When attempting to use aztfexport, a tool designed to export currently deployed Azure resources into HashiCorp Configuration Language (HCL) for use with Terraform, you may get: Access denied.

When using aztfexport, the first thing you need to do is make sure you have the Azure CLI installed, and run:

az login

After logging in, you need to verify you are on the right subscription, by running:

az account show

If you are on the right subscription, you don't need to do anything. If you are in the wrong subscription then run:

az account list

Find the subscription ID, then use:

az account set --subscription <name or id>

You only need Reader rights to be able to export the Terraform configuration.

If you find you are still running into access denied issues, such as below:

aztfexport - Access denied

And you are running the aztfexport program on an Azure Virtual Machine, such as Azure Virtual Desktop or a Dev Box, what is happening is that the Managed Identity permissions of your Azure Virtual Machine are overriding the permissions of the account you used to log in to Azure with the CLI.

To get around this, you either have to run aztfexport locally, on a device that's not an Azure Virtual Machine, or grant the Managed Identity of the Virtual Machine Reader rights to the subscription you wish to export from.

You can do this by navigating to your Azure Virtual Machine in the Azure Portal: click on the Virtual Machine, select Identity, select Azure role assignments, and grant it Reader rights to the Resource Group or Subscription you are targeting.
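
If you prefer to script this instead of using the portal, a minimal sketch with the Az PowerShell module (the resource group, VM name, and subscription ID are placeholders) would be:

# Grant the VM's system-assigned managed identity Reader rights on the target subscription
$vm = Get-AzVM -ResourceGroupName "<resource-group>" -Name "<vm-name>"
New-AzRoleAssignment -ObjectId $vm.Identity.PrincipalId -RoleDefinitionName "Reader" -Scope "/subscriptions/<subscription-id>"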

You could also try disabling the System Assigned Managed Identity, which appeared to work for me.

For more information about this error, please refer to the following GitHub issue: Access Denied during export on Azure Virtual Desktop.

Azure OpenAI error log summarization with completion

· 2 min read

I was assisting a user on Microsoft Q&A with an issue that involved looking over some event logs.

The issue itself was related to Nested Virtualization, with the user unable to install Hyper-V or WSL (Windows Subsystem for Linux); it turned out to be incompatibilities with the SKU size and Secure Boot.

But as part of troubleshooting this issue, I recreated the Azure compute environment this user had and started to delve into the Windows logs.

However, in this case I did something a bit different: I exported the logs as a text file, opened Azure OpenAI, navigated to Azure OpenAI Studio, clicked on Completions, and used the summarization powers of the GPT 3.5 large language model to delve into the logs for me:

Azure OpenAI - Summarize Error Log

Copying the log into the Completions pane of Azure OpenAI

And using the Prompt of:

----
Summarize all the errors and warnings from above and sort by potential cause of the issues, with the most likely cause first. Format as a table.

Azure OpenAI was able to use the reasoning ability of the GPT 3.5 LLM (Large Language Model) to sort through 115 lines of logs and work out the most probable root cause of the issue.
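
If you wanted to do the same thing programmatically rather than in the Completions pane, a rough sketch against the Azure OpenAI completions REST API might look like this (the resource name, deployment name, key, and log file path are all placeholders):

# Sketch: send the exported log plus the summarization prompt to an Azure OpenAI completions deployment
$endpoint   = "https://<your-openai-resource>.openai.azure.com"
$deployment = "<your-completions-deployment>"
$apiKey     = "<your-api-key>"
$log    = Get-Content -Raw "C:\temp\eventlog-export.txt"
$prompt = $log + "`n----`nSummarize all the errors and warnings from above and sort by potential cause of the issues, with the most likely cause first. Format as a table."
$body = @{ prompt = $prompt; max_tokens = 800; temperature = 0 } | ConvertTo-Json
$uri  = "$endpoint/openai/deployments/$deployment/completions?api-version=2023-05-15"
Invoke-RestMethod -Method POST -Uri $uri -Headers @{ "api-key" = $apiKey } -Body $body -ContentType "application/json"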

As you can see, Azure OpenAI and LLMs can be used not just as an assistant for writing and studying; they can also assist in incident and root-cause resolution.

Bring Your Data to Life with Azure OpenAI

· 13 min read

Today, we will look at using Azure OpenAI and 'Bring Your Data' to allow recent documentation to be referenced and bring life (and relevance) to your data.

Bring Your Data to Life with Azure OpenAI

The example we are going to use today is the Microsoft Learn documentation for Microsoft Azure.

Our scenario is this:

  • We would like to use ChatGPT functionality to query up-to-date information on Microsoft Azure; in this example, we will look for information on Azure Elastic SAN (which was added in late 2022).

When querying ChatGPT for Azure Elastic SAN, we get the following:

ChatGPT - Azure Elastic SAN Query

Just like the prompt states, ChatGPT only has data up to September 2021 and isn't aware of Elastic SAN (or any other changes/updates or new (or retired) services after this date).

So let us use Azure OpenAI and bring in outside data (grounding data), in this case the Azure document library, to overlay on top of the GPT models, giving the illusion that the model is aware of the data.

To do this, we will leverage native 'Bring Your Own Data' functionality, now in Azure OpenAI - this is in Public Preview as of 04/07/2023.

To be clear, I don't expect you to start downloading from GitHub; this is just the example I have used to demonstrate adding your own data. The ability to bring in updated data on Azure, specifically, will be solved by plugins, such as Bing Search.

To do this, we will need to provision a few Microsoft Azure services, such as:

  1. Azure Storage Account (this will hold the Azure document library (markdown files) in a blob container)
  2. Cognitive Search (this search functionality is the glue that will hold this solution together, breaking down and indexing the documents in the Azure blob store)
  3. Azure OpenAI (to do this, we will need GPT3.5 turbo or GPT4 models deployed)
  4. Optional - Azure Web App (this can be created by the Azure OpenAI service, to give users access to your custom data)

Make sure you have Azure OpenAI approved for your subscription

We will use the following tools to provision and configure our services:

  1. Azure Portal
  2. PowerShell (Az Modules)
  3. AzCopy

Download Azure Documents

First, we will need the Azure documents to add to our blob storage.

The Microsoft Learn documentation is open-sourced and constantly updated using a git repository hosted on GitHub. We will download and extract the document repository locally (roughly 6 GB). To do this, we will use a PowerShell script:

$gitRepoUrl = "https://github.com/MicrosoftDocs/azure-docs"
$localPath = "C:\temp\azuredocs"
$zipPath = "C:\temp\azuredocs.zip"
# Download the Git repository archive (main branch) and extract it
Invoke-WebRequest -Uri "$gitRepoUrl/archive/main.zip" -OutFile $zipPath
Expand-Archive -Path $zipPath -DestinationPath $localPath

Create Azure Storage Account

Now that we have a copy of the Azure document repository, it's time to create an Azure Storage account to copy the data into. To create this storage account, we will use PowerShell.

# Login to Azure
Connect-AzAccount
# Set variables
$resourceGroupName = "azuredocs-ai-rg"
$location = "eastus"
$uniqueId = [guid]::NewGuid().ToString().Substring(0,4)
$storageAccountName = "myaistgacc$uniqueId"
$containerName = "azuredocs"
# Create a new resource group
New-AzResourceGroup -Name $resourceGroupName -Location $location
# Create a new storage account
New-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName -Location $location -SkuName Standard_LRS -AllowBlobPublicAccess $false
# Create a new blob container
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName).Value[0]
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
New-AzStorageContainer -Name $containerName -Context $context

We have created our Resource Group and Storage account to hold our Azure documentation.

Upload Microsoft Learn documentation to an Azure blob container

Now that we have the Azure docs repo downloaded and extracted and an Azure Storage account to hold the documents, it's time to use AzCopy to copy the documentation to the Azure storage account. We will use PowerShell to create a SAS token (open for a day) and use that with AzCopy to copy the Azure repo into our newly created container.

# Set variables
$resourceGroupName = "azuredocs-ai-rg"
$storageAccountName = "myaistgacc958b"
$containerName = "azuredocs"
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName).Value[0]
$localPath = "C:\Temp\azuredocs\azure-docs-main"
$azCopyPath = "C:\tools\azcopy_windows_amd64_10.19.0\AzCopy.exe"
# Construct SAS URL for destination container
$sasToken = (New-AzStorageContainerSASToken -Name $containerName -Context (New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey) -Permission rwdl -ExpiryTime (Get-Date).AddDays(1)).TrimStart("?")
$destinationUrl = "https://$storageAccountName.blob.core.windows.net/$containerName/?$sasToken"
# Run AzCopy command as command line
$command = "& `"$azCopyPath`" copy `"$localPath`" `"$destinationUrl`" --recursive=true"
Invoke-Expression $command

Note: It took roughly 6 minutes to copy the Azure docs repo from my local computer (in New Zealand) into a blob storage account in East US, so roughly a gigabyte a minute.

Azure Storage Account - Microsoft Learn Docs

Now that we have our Azure Blob storage account, it's time to create our Cognitive Search service. We will need to create a Cognitive Search service with a SKU of Standard to support the 6 GB of Azure documents that must be indexed. Please check your costs; this is roughly NZ$377.96 a month to run. You can reduce this cost by limiting the amount of data you need to index (i.e., only certain documents, not an entire large repository of markdown files). Make sure you refer to the Pricing Calculator.

# Set variables
$resourceGroupName = "azuredocs-ai-rg"
$searchServiceName = "azuredocssearchservice" # Cognitive Search name needs to be lowercase.
# Install the Az.Search module and create a search service
Install-Module Az.Search
$searchService = New-AzSearchService -ResourceGroupName $resourceGroupName -Name $searchServiceName -Location "eastus" -Sku Standard

Now that the Cognitive Search service has been created, we need to create the index and indexer that will index our Azure documents for use by Azure OpenAI, linking the index to the azuredocs blob container we created earlier.

Note: There is no PowerShell cmdlet support for Azure Cognitive Search indexes; you can create them using the REST API - but we will do this in the Azure Portal as part of the next step.
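
For reference, here is a rough sketch of what creating an index via the REST API from PowerShell could look like; the service name matches the one created above, but the admin key, index name, and field list are illustrative placeholders (the portal wizard in the next step works these out for you):

# Sketch: create a minimal search index via the Cognitive Search REST API
$searchService = "azuredocssearchservice"
$adminKey      = "<search-admin-key>"   # from the Keys blade of the search service
$index = @{
    name   = "azuredocs-index"
    fields = @(
        @{ name = "id";      type = "Edm.String"; key = $true },
        @{ name = "content"; type = "Edm.String"; searchable = $true }
    )
} | ConvertTo-Json -Depth 5
Invoke-RestMethod -Method PUT -Uri "https://$searchService.search.windows.net/indexes/azuredocs-index?api-version=2021-04-30-Preview" -Headers @{ "api-key" = $adminKey } -Body $index -ContentType "application/json"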

Create Azure Cognitive Search Index

It's time to create the Cognitive Search index and an indexer that will index the content.

We will move away from PowerShell and into the Microsoft Azure Portal to do this.

  1. Navigate to the Microsoft Azure Portal
  2. In the top center search bar, type in Cognitive Search
  3. Click on Cognitive Search
  4. Click on your newly created Cognitive Search service
      Azure Portal - Cognitive Search
  5. Select Import Data
  6. Select Azure Blob Storage
      Azure Portal - Cognitive Search - Add Azure Blob Storage
  7. Type in your data source name (i.e., azuredocs)
  8. For the Connection string, select Choose an existing connection
  9. Select your Azure Storage account and the container containing the Azure document repository uploaded earlier.
  10. Click Select
      Azure Portal - Cognitive Search - Add Azure Blob Storage
  11. Click Next: Add cognitive skills (Optional)
  12. Here, you can enrich your data, such as enabling OCR (extracting text from images automatically), extracting people's names, or translating text from one language to another. These enrichments are billed separately, and we won't be using any, so select Skip to: Customize target index.
  13. Here is the index mapping that Cognitive Search built automatically by scanning the schema of the documents. You can bring in additional data about your documents if you want, but I am happy with the defaults, so I click Next: Create an indexer
      Azure Portal - Cognitive Search - Search Index
  14. The indexer is what will create your index, which will be referenced by Azure OpenAI later. You can schedule an indexer to run hourly if new data is being added to the Azure blob container where your source files sit; for my purposes, I am going to leave the Schedule as: Once
  15. Expand Advanced Options and scroll down a bit
  16. Here, we can choose to only index certain files. For our purposes, we are going to exclude png files; the Azure document repository contains png image files that can't be indexed (we aren't using OCR), so I am going to optimize the indexing time slightly by excluding them. You can also exclude gif/jpg image files.
      Azure Portal - Cognitive Search - Create Search Indexer
  17. Finally, hit Submit to start the indexing process. This could take a while, depending on the amount of data.
  18. Leave this running in the background and navigate to the Cognitive Search resource's Overview pane to see the status.
      Azure Portal - Cognitive Search - Indexer

Note: You can also run Import Data from Azure OpenAI Studio, which will trigger an indexing run - but you need to keep your browser open and responsive. Depending on how much data you are indexing, doing it manually through this process may be preferable to avoid a browser timeout. You also get more options around the index.

Create Azure OpenAI

Now that we have our base Azure resources, it's time to create Azure OpenAI. Make sure your region and subscription have been approved for Azure OpenAI.

To create the Azure OpenAI service, we will be using the Azure Portal.

  1. Navigate to the Microsoft Azure Portal
  2. In the top center search bar, type in Azure OpenAI
  3. In the Cognitive Services, Azure OpenAI section, click + Create
  4. Select your subscription, region, name, and pricing tier of your Azure OpenAI service (remember certain models are only available in specific regions - we need GPT 3.5+), then select Next
      Azure OpenAI - Create Resource
  5. Update your Network Configuration; for this demo, we will select 'All Networks' - but the best practice is to restrict it. Click Next
      Azure OpenAI - Create Resource
  6. If you have any Tags, enter them, then click Next
  7. The Azure platform will now validate your deployment (an example is ensuring that the Azure OpenAI has a unique name). Review the configuration, then click Create to create your resource.
      Azure OpenAI - Create Resource
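
If you prefer to script this step instead of using the portal, a rough PowerShell equivalent with the Az.CognitiveServices module (the account name and SKU are illustrative; your subscription still needs to be approved for Azure OpenAI) would be:

# Sketch: create an Azure OpenAI (Cognitive Services) account via PowerShell
New-AzCognitiveServicesAccount -ResourceGroupName "azuredocs-ai-rg" -Name "azuredocs-openai" -Type "OpenAI" -SkuName "S0" -Location "eastus"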

Now that the Azure OpenAI service has been created, you should have the following:

  • An Azure OpenAI service
  • A Storage account
  • A Cognitive Search service

Azure OpenAI - RAG Deployed Resources

Deploy Model

Now that we have our Azure OpenAI instance, it's time to deploy our Chat model.

  1. Navigate to the Microsoft Azure Portal
  2. In the top center search bar, type in Azure OpenAI
  3. Open your Azure OpenAI instance to the Overview page, and click: Go to Azure OpenAI Studio
  4. Click on Models, and verify that you have GPT models (i.e., gpt-35-turbo or gpt-4). If you don't, then make sure you have been onboarded.
  5. Once verified, click on Deployments
  6. Click on + Create new deployment
  7. Select your model (I am going to go with gpt-35-turbo), type in a deployment name, and then click Create
  8. Once deployment has been completed, you may have to wait up to 5 minutes for the Chat API to be aware of your new deployment.

Azure OpenAI - Deploy Model
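
Once the deployment shows as succeeded, a quick way to confirm the Chat API can see it is a test call against the REST endpoint. This is just a sketch; the resource name, deployment name, and key are placeholders:

# Sketch: test the deployed chat model via the Azure OpenAI REST API
$endpoint   = "https://<your-openai-resource>.openai.azure.com"
$deployment = "gpt-35-turbo"   # the deployment name you chose above
$apiKey     = "<your-api-key>"
$body = @{ messages = @(@{ role = "user"; content = "Hello" }) } | ConvertTo-Json -Depth 5
Invoke-RestMethod -Method POST -Uri "$endpoint/openai/deployments/$deployment/chat/completions?api-version=2023-05-15" -Headers @{ "api-key" = $apiKey } -Body $body -ContentType "application/json"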

Run Chat against your data

Finally, it's time to query and work with our Azure documents.

We can do this using the Chat Playground, a feature of Azure OpenAI that allows us to work with our Chat models and adjust prompts.

We will not change the System Prompt (although to get the most out of your own data, I recommend giving it a go); the System Prompt will remain as: You are an AI assistant that helps people find information.

  1. Navigate to the Microsoft Azure Portal
  2. In the top center search bar, type in Azure OpenAI
  3. Open your Azure OpenAI instance to the Overview page, and click: Go to Azure OpenAI Studio
  4. Click Chat
  5. Click Add your data (preview) - if this doesn't show, ensure you have deployed a GPT model as part of the previous steps.
  6. Click on + Add a data source
  7. Select the dropdown list in the Select data source pane and select Azure Cognitive Search
  8. Select your Cognitive Search service, created earlier
  9. Select your Index
  10. Check, I acknowledge that connecting to an Azure Cognitive Search account will incur usage to my account. View Pricing
  11. Click Next
  12. It should bring in the index metadata; for example - our content data is mapped to content - so I will leave this as is; click Next
  13. I am not utilizing Semantic search, so I click Next
  14. Finally, review my settings and click Save and Close
  15. Now we can make sure responses come from our own data by leaving Limit responses to your own data content checked
  16. Then, in the Chat session, in the User message, I type: Tell me about Azure Elastic SAN?
  17. It will now reference the Cognitive Search and bring in the data from the index, including references to the location where it found the data!

Azure OpenAI - Chat Playground
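
Outside of the playground, the same 'on your data' pattern can be called programmatically. The sketch below assumes the public preview extensions endpoint and api-version available at the time of writing, and the endpoint, key, deployment, and index names are all placeholders:

# Sketch: chat completions grounded on a Cognitive Search index (preview 'on your data' API)
$openAiEndpoint = "https://<your-openai-resource>.openai.azure.com"
$deployment     = "gpt-35-turbo"
$openAiKey      = "<your-openai-key>"
$body = @{
    dataSources = @(@{
        type       = "AzureCognitiveSearch"
        parameters = @{
            endpoint  = "https://azuredocssearchservice.search.windows.net"
            key       = "<search-admin-key>"
            indexName = "<your-index-name>"
        }
    })
    messages = @(@{ role = "user"; content = "Tell me about Azure Elastic SAN?" })
} | ConvertTo-Json -Depth 10
Invoke-RestMethod -Method POST -Uri "$openAiEndpoint/openai/deployments/$deployment/extensions/chat/completions?api-version=2023-06-01-preview" -Headers @{ "api-key" = $openAiKey } -Body $body -ContentType "application/json"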

Optional - Deploy to an Azure WebApp

Interacting with our data in the Chat playground can be an enlightening experience, but we can go a step further and use the native tooling to deploy a chat interface - straight to an Azure web app.

To do this, navigate back to the Chat playground, ensure you have added your own Cognitive Search data source, and confirm you can successfully retrieve data from your index.

  1. Click on Deploy to
  2. Select A new web app...
  3. If you have an existing WebApp, you can update it with an updated System Message or Search Index from the Chat playground settings, but we will: Create a new web app
  4. Enter a suitable name (i.e., AI-WebApp-Tst - this needs to be unique)
  5. Select the Subscription and Resource Group and location to deploy to. I had issues accessing my custom data, when deployed to Australia East (as AI services are in East US), so I will deploy in the same region as my OpenAI and Cognitive Search service - i.e., East US.
  6. Specify a plan (i.e., Standard (S1))
  7. Click Deploy

Azure OpenAI - Deploy

Note: Deployment may take up to 10 minutes; you can navigate to the Resource Group you are deploying to, select Deployments, and monitor the deployment. Once deployed, it can take another 10 minutes for authentication to be configured.

Note: By default, the Azure WebApp is restricted to only be accessible by yourself; you can expand this out by adjusting the authentication.

Once it is deployed, you can access the Chat interface, directly from an Azure WebApp!

Azure OpenAI - Run Azure WebApp

If you navigate to the WebApp resource in the Azure Portal and look at the Configuration and Application Settings of your WebApp, you can see the variables used as part of the deployment. You can adjust these, but be wary, as it could break the WebApp; for major changes, I would recommend redeploying/updating the WebApp from Azure OpenAI Studio.
Azure OpenAI - App Service - App Settings