Azure Functions developer guide
In Azure Functions, all functions share some core technical concepts and components, regardless of your preferred language or development environment.
This article assumes that you've already read the Azure Functions overview.
If you prefer to jump right in, you can complete a quickstart tutorial using Visual Studio, Visual Studio Code, or the command prompt. Java developers can also complete a quickstart using Maven (command line), Eclipse, IntelliJ IDEA, Gradle, Quarkus, or Spring Cloud.
Code project
At the core of Azure Functions is a language-specific code project that implements one or more units of code execution called functions. Functions are simply methods that run in the Azure cloud based on events, in response to HTTP requests, or on a schedule. Think of your Azure Functions code project as a mechanism for organizing, deploying, and collectively managing your individual functions in the project when they're running in Azure. For more information, see Organize your functions.
The way that you lay out your code project and how you indicate which methods in your project are functions depend on the development language of your project. For detailed language-specific guidance, see the developer guide for your language: C#, Java, Node.js, PowerShell, or Python.
All functions must have a trigger, which defines how the function starts and can provide input to the function. Your functions can optionally define input and output bindings. These bindings simplify connections to other services without you having to work with client SDKs. For more information, see Azure Functions triggers and bindings concepts.
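For languages that use the file-based model, triggers and bindings are declared in a function.json file. The following is a minimal sketch, not a definitive definition: it assumes a queue-triggered function that writes a blob, and the queue name, blob path, and the `MyStorageConnection` setting name are all illustrative.

```json
{
  "bindings": [
    {
      "name": "queueItem",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "orders",
      "connection": "MyStorageConnection"
    },
    {
      "name": "outputBlob",
      "type": "blob",
      "direction": "out",
      "path": "processed/{rand-guid}.txt",
      "connection": "MyStorageConnection"
    }
  ]
}
```

The `connection` value here is the name of an application setting, not a connection string; this pattern is explained in the Connections section later in this article.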
Azure Functions provides a set of language-specific project and function templates that make it easy to create new code projects and add functions to your project. You can use any of the tools that support Azure Functions development to generate new apps and functions using these templates.
Development tools
The following tools provide an integrated development and publishing experience for Azure Functions in your preferred language:
- Visual Studio
- Visual Studio Code
- Azure Functions Core Tools (command prompt)
These tools integrate with Azure Functions Core Tools so that you can run and debug on your local computer using the Functions runtime. For more information, see Code and test Azure Functions locally.
There's also an editor in the Azure portal that lets you update your code and your function.json definition file directly in the portal. You should only use this editor for small changes or creating proof-of-concept functions. You should always develop your functions locally, when possible. For more information, see Create your first function in the Azure portal.
Portal editing is only supported for the Node.js version 3 and Python version 1 programming models, which use the function.json file.
Deployment
When you publish your code project to Azure, you're essentially deploying your project to an existing function app resource. A function app provides an execution context in Azure in which your functions run. As such, it's the unit of deployment and management for your functions. From an Azure Resource perspective, a function app is equivalent to a site resource (`Microsoft.Web/sites`) in Azure App Service, which is equivalent to a web app.
A function app is composed of one or more individual functions that are managed, deployed, and scaled together. All of the functions in a function app share the same pricing plan, deployment method, and runtime version. For more information, see How to manage a function app.
When the function app and any other required resources don't already exist in Azure, you first need to create these resources before you can deploy your project files. You can create these resources in one of these ways:
- During Visual Studio publishing
- Using Visual Studio Code
- Programmatically, using the Azure CLI, Azure PowerShell, ARM templates, or Bicep templates
- In the Azure portal
In addition to tool-based publishing, Functions supports other technologies for deploying source code to an existing function app. For more information, see Deployment technologies in Azure Functions.
Connect to services
A major requirement of any cloud-based compute service is reading data from and writing data to other cloud services. Functions provides an extensive set of bindings that makes it easier for you to connect to services without having to work with client SDKs.
Whether you use the binding extensions provided by Functions or you work with client SDKs directly, you should store connection data securely and never include it in your code. For more information, see Connections.
Bindings
Functions provides bindings for many Azure services and a few third-party services, which are implemented as extensions. For more information, see the complete list of supported bindings.
Binding extensions can support both inputs and outputs, and many triggers also act as input bindings. Bindings let you configure the connection to services so that the Functions host can handle the data access for you. For more information, see Azure Functions triggers and bindings concepts.
If you're having issues with errors coming from bindings, see the Azure Functions Binding Error Codes documentation.
Client SDKs
While Functions provides bindings to simplify data access in your function code, you can still use a client SDK in your project to access a given service directly, if you prefer. You might need to use a client SDK directly when your functions require functionality of the underlying SDK that isn't supported by the binding extension.
When using client SDKs, you should use the same process for storing and accessing connection strings that binding extensions use.
When you create a client SDK instance in your functions, you should get the connection information required by the client from environment variables.
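As an illustration, here's a minimal Python sketch that creates a Blob Storage client from a connection string stored in an application setting. It assumes the azure-storage-blob package is installed, and the setting name `MyStorageConnection` is hypothetical.

```python
import os

# Assumes the azure-storage-blob package is installed.
from azure.storage.blob import BlobServiceClient

# Read the connection string from an application setting (environment variable)
# instead of hard-coding it. "MyStorageConnection" is an illustrative setting name.
connection_string = os.environ["MyStorageConnection"]
blob_service = BlobServiceClient.from_connection_string(connection_string)
```

The same approach works for other services: keep the secret in an application setting and read it at runtime, rather than embedding it in your code.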
Connections
As a security best practice, Azure Functions takes advantage of the application settings functionality of Azure App Service to help you more securely store strings, keys, and other tokens required to connect to other services. Application settings in Azure are stored encrypted and can be accessed at runtime by your app as environment variable name-value pairs. For triggers and bindings that require a connection property, you set the application setting name instead of the actual connection string. You can't configure a binding directly with a connection string or key.
For example, consider a trigger definition that has a `connection` property. Instead of the connection string itself, you set `connection` to the name of an environment variable that contains the connection string. This secrets access strategy both makes your apps more secure and makes it easier for you to change connections across environments. For even more security, you can use identity-based connections.
The default configuration provider uses environment variables. These variables are defined in application settings when your app runs in Azure, and in the local settings file (local.settings.json) when you develop locally.
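For example, during local development you might store a connection string in local.settings.json under the setting name that your trigger's `connection` property references. This is only a sketch with placeholder values, and `MyStorageConnection` is an illustrative setting name; in Azure, you'd create an application setting with the same name.

```json
{
  "IsEncrypted": false,
  "Values": {
    "MyStorageConnection": "DefaultEndpointsProtocol=https;AccountName=<account_name>;AccountKey=<account_key>;EndpointSuffix=core.windows.net"
  }
}
```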
Connection values
When the connection name resolves to a single exact value, the runtime identifies the value as a connection string, which typically includes a secret. The details of a connection string depend on the service to which you connect.
However, a connection name can also refer to a collection of multiple configuration items, which is useful for configuring identity-based connections. Environment variables can be treated as a collection by using a shared prefix that ends in double underscores (`__`). The group can then be referenced by setting the connection name to this prefix.
For example, the `connection` property for an Azure Blob trigger definition might be `Storage1`. As long as there's no single string value configured by an environment variable named `Storage1`, an environment variable named `Storage1__blobServiceUri` could be used to inform the `blobServiceUri` property of the connection. The connection properties are different for each service. Refer to the documentation for the component that uses the connection.
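As a sketch, the `Storage1` example could be expressed in local.settings.json like this (in Azure, you'd add the equivalent application settings). The URIs are placeholders, and the queue service URI is included because a blob trigger typically also uses queues internally:

```json
{
  "IsEncrypted": false,
  "Values": {
    "Storage1__blobServiceUri": "https://<storage_account_name>.blob.core.windows.net",
    "Storage1__queueServiceUri": "https://<storage_account_name>.queue.core.windows.net"
  }
}
```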
Note
When using Azure App Configuration or Key Vault to provide settings for managed identity connections, setting names should use a valid key separator such as `:` or `/` in place of `__` to ensure names are resolved correctly. For example, `Storage1:blobServiceUri`.
Configure an identity-based connection
Some connections in Azure Functions can be configured to use an identity instead of a secret. Support depends on the extension using the connection. In some cases, a connection string may still be required in Functions even though the service to which you're connecting supports identity-based connections. For a tutorial on configuring your function apps with managed identities, see the creating a function app with identity-based connections tutorial.
Note
When running in a Consumption or Elastic Premium plan, your app uses the `WEBSITE_AZUREFILESCONNECTIONSTRING` and `WEBSITE_CONTENTSHARE` settings when connecting to Azure Files on the storage account used by your function app. Azure Files doesn't support using managed identity when accessing the file share. For more information, see Azure Files supported authentication scenarios.
The following components support identity-based connections:
When hosted in the Azure Functions service, identity-based connections use a managed identity. The system-assigned identity is used by default, although a user-assigned identity can be specified with the `credential` and `clientId` properties. Note that configuring a user-assigned identity with a resource ID is not supported. When run in other contexts, such as local development, your developer identity is used instead, although this can be customized. See Local development with identity-based connections.
Grant permission to the identity
Whatever identity is being used must have permissions to perform the intended actions. For most Azure services, this means you need to assign a role in Azure RBAC, using either built-in or custom roles which provide those permissions.
Important
Some permissions might be exposed by the target service that are not necessary for all contexts. Where possible, adhere to the principle of least privilege, granting the identity only required privileges. For example, if the app only needs to be able to read from a data source, use a role that only has permission to read. It would be inappropriate to assign a role that also allows writing to that service, as this would be excessive permission for a read operation. Similarly, you would want to ensure the role assignment is scoped only over the resources that need to be read.
The permissions that you need to grant depend on the component that uses the connection:
- Azure Blobs extension
- Azure Queues extension
- Azure Tables extension
- Event Hubs extension
- Service Bus extension
- Event Grid extension
- Azure Cosmos DB extension
- Azure SignalR extension
- Durable Functions storage provider
- Functions host storage
You need to create a role assignment that provides access to your blob container at runtime. Management roles like Owner aren't sufficient. The following table shows built-in roles that are recommended when using the Blob Storage extension in normal operation. Your application may require further permissions based on the code you write.
Binding type | Example built-in roles |
---|---|
Trigger | Storage Blob Data Owner and Storage Queue Data Contributor¹. Extra permissions must also be granted to the AzureWebJobsStorage connection.² |
Input binding | Storage Blob Data Reader |
Output binding | Storage Blob Data Owner |
¹ The blob trigger handles failure across multiple retries by writing poison blobs to a queue on the storage account specified by the connection.
² The AzureWebJobsStorage connection is used internally for blobs and queues that enable the trigger. If it's configured to use an identity-based connection, it needs extra permissions beyond the default requirement. The required permissions are covered by the Storage Blob Data Owner, Storage Queue Data Contributor, and Storage Account Contributor roles. To learn more, see Connecting to host storage with an identity.
Common properties for identity-based connections
An identity-based connection for an Azure service accepts the following common properties, where `<CONNECTION_NAME_PREFIX>` is the value of your `connection` property in the trigger or binding definition:
Property | Environment variable template | Description |
---|---|---|
Token Credential | `<CONNECTION_NAME_PREFIX>__credential` | Defines how a token should be obtained for the connection. This setting should be set to `managedidentity` if your deployed function app intends to use managed identity authentication. This value is only valid when a managed identity is available in the hosting environment. |
Client ID | `<CONNECTION_NAME_PREFIX>__clientId` | When `credential` is set to `managedidentity`, this property can be set to specify the user-assigned identity to be used when obtaining a token. The property accepts a client ID corresponding to a user-assigned identity assigned to the application. It's invalid to specify both a resource ID and a client ID. If not specified, the system-assigned identity is used. This property is used differently in local development scenarios, when `credential` shouldn't be set. |
Resource ID | `<CONNECTION_NAME_PREFIX>__managedIdentityResourceId` | When `credential` is set to `managedidentity`, this property can be set to specify the resource identifier to be used when obtaining a token. The property accepts a resource identifier corresponding to the resource ID of the user-defined managed identity. It's invalid to specify both a resource ID and a client ID. If neither is specified, the system-assigned identity is used. This property is used differently in local development scenarios, when `credential` shouldn't be set. |
Other options may be supported for a given connection type. Refer to the documentation for the component making the connection.
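As a sketch, the settings for a connection named `Storage1` that uses a user-assigned managed identity might look like the following name-value pairs (added as application settings when your app runs in Azure; all values are placeholders):

```json
{
  "Storage1__blobServiceUri": "https://<storage_account_name>.blob.core.windows.net",
  "Storage1__credential": "managedidentity",
  "Storage1__clientId": "<user_assigned_identity_client_id>"
}
```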
Local development with identity-based connections
Note
Local development with identity-based connections requires version 4.0.3904 of Azure Functions Core Tools, or a later version.
When you're running your function project locally, the above configuration tells the runtime to use your local developer identity. The connection attempts to get a token from the following locations, in order:
- A local cache shared between Microsoft applications
- The current user context in Visual Studio
- The current user context in Visual Studio Code
- The current user context in the Azure CLI
If none of these options are successful, an error occurs.
Your identity may already have some role assignments against Azure resources used for development, but those roles may not provide the necessary data access. Management roles like Owner aren't sufficient. Double-check what permissions are required for connections for each component, and make sure that you have them assigned to yourself.
In some cases, you might want to specify the use of a different identity. You can add configuration properties for the connection that point to the alternate identity based on a client ID and client secret for a Microsoft Entra service principal. This configuration option isn't supported when hosted in the Azure Functions service. To use an ID and secret on your local machine, define the connection with the following extra properties:
Property | Environment variable template | Description |
---|---|---|
Tenant ID | `<CONNECTION_NAME_PREFIX>__tenantId` | The Microsoft Entra tenant (directory) ID. |
Client ID | `<CONNECTION_NAME_PREFIX>__clientId` | The client (application) ID of an app registration in the tenant. |
Client secret | `<CONNECTION_NAME_PREFIX>__clientSecret` | A client secret that was generated for the app registration. |
Here's an example of the `local.settings.json` properties required for an identity-based connection to Azure Blobs:
```json
{
  "IsEncrypted": false,
  "Values": {
    "<CONNECTION_NAME_PREFIX>__blobServiceUri": "<blobServiceUri>",
    "<CONNECTION_NAME_PREFIX>__queueServiceUri": "<queueServiceUri>",
    "<CONNECTION_NAME_PREFIX>__tenantId": "<tenantId>",
    "<CONNECTION_NAME_PREFIX>__clientId": "<clientId>",
    "<CONNECTION_NAME_PREFIX>__clientSecret": "<clientSecret>"
  }
}
```
Connecting to host storage with an identity
The Azure Functions host uses the storage connection set in `AzureWebJobsStorage` to enable core behaviors such as coordinating singleton execution of timer triggers and default app key storage. This connection can also be configured to use an identity.
Caution
Other components in Functions rely on `AzureWebJobsStorage` for default behaviors. You shouldn't move it to an identity-based connection if you're using older versions of extensions that don't support this type of connection, including triggers and bindings for Azure Blobs, Event Hubs, and Durable Functions. Similarly, `AzureWebJobsStorage` is used for deployment artifacts when using server-side build in Linux Consumption; if you enable this, you need to deploy via an external deployment package.
In addition, your function app might be reusing `AzureWebJobsStorage` for other storage connections in its triggers, bindings, and/or function code. Make sure that all uses of `AzureWebJobsStorage` can use the identity-based connection format before changing this connection from a connection string.
To use an identity-based connection for `AzureWebJobsStorage`, configure the following app settings:
Setting | Description | Example value |
---|---|---|
`AzureWebJobsStorage__blobServiceUri` | The data plane URI of the blob service of the storage account, using the HTTPS scheme. | `https://<storage_account_name>.blob.core.windows.net` |
`AzureWebJobsStorage__queueServiceUri` | The data plane URI of the queue service of the storage account, using the HTTPS scheme. | `https://<storage_account_name>.queue.core.windows.net` |
`AzureWebJobsStorage__tableServiceUri` | The data plane URI of the table service of the storage account, using the HTTPS scheme. | `https://<storage_account_name>.table.core.windows.net` |
You can also set the common properties for identity-based connections.
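For example, a minimal sketch of the app settings (shown as name-value pairs) for an identity-based `AzureWebJobsStorage` connection that relies on the system-assigned identity, with placeholder values:

```json
{
  "AzureWebJobsStorage__blobServiceUri": "https://<storage_account_name>.blob.core.windows.net",
  "AzureWebJobsStorage__queueServiceUri": "https://<storage_account_name>.queue.core.windows.net",
  "AzureWebJobsStorage__tableServiceUri": "https://<storage_account_name>.table.core.windows.net"
}
```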
If you're configuring `AzureWebJobsStorage` using a storage account that uses the default DNS suffix and service name for global Azure, following the `https://<accountName>.[blob|queue|file|table].core.windows.net` format, you can instead set `AzureWebJobsStorage__accountName` to the name of your storage account. The endpoints for each storage service are inferred for this account. This doesn't work when the storage account is in a sovereign cloud or has a custom DNS.
Setting | Description | Example value |
---|---|---|
`AzureWebJobsStorage__accountName` | The account name of a storage account, valid only if the account isn't in a sovereign cloud and doesn't have a custom DNS. This syntax is unique to `AzureWebJobsStorage` and can't be used for other identity-based connections. | `<storage_account_name>` |
You need to create a role assignment that provides access to the storage account for `AzureWebJobsStorage` at runtime. Management roles like Owner aren't sufficient. The Storage Blob Data Owner role covers the basic needs of Functions host storage: the runtime needs both read and write access to blobs and the ability to create containers. Several extensions use this connection as a default location for blobs, queues, and tables, and these uses may add requirements as noted in the following table. You may need additional permissions if you use `AzureWebJobsStorage` for any other purposes.
Extension | Roles required | Explanation |
---|---|---|
No extension (host only) | Storage Blob Data Owner | Used for general coordination, default key store. |
Azure Blobs (trigger only) | All of: Storage Account Contributor, Storage Blob Data Owner, Storage Queue Data Contributor | The blob trigger internally uses Azure Queues and writes blob receipts. It uses AzureWebJobsStorage for these, regardless of the connection configured for the trigger. |
Azure Event Hubs (trigger only) | (no change from default requirement) Storage Blob Data Owner | Checkpoints are persisted in blobs using the AzureWebJobsStorage connection. |
Timer trigger | (no change from default requirement) Storage Blob Data Owner | To ensure one execution per event, locks are taken with blobs using the AzureWebJobsStorage connection. |
Durable Functions | All of: Storage Blob Data Contributor, Storage Queue Data Contributor, Storage Table Data Contributor | Durable Functions uses blobs, queues, and tables to coordinate activity functions and maintain orchestration state. It uses the AzureWebJobsStorage connection for all of these by default, but you can specify a different connection in the Durable Functions extension configuration. |
Reporting Issues
Item | Description | Link |
---|---|---|
Runtime | Script Host, Triggers & Bindings, Language Support | File an Issue |
Templates | Code Issues with Creation Template | File an Issue |
Portal | User Interface or Experience Issue | File an Issue |
Open source repositories
The code for Azure Functions is open source, and you can find key components in these GitHub repositories:
Next steps
For more information, see the following resources: