Azure Functions Overview
Functions Documentation
Overview
About Azure Functions
Durable Functions
Serverless comparison
Quickstarts
Create function - Visual Studio
Create function - Visual Studio Code
Create function - Java/Maven
Create function - Python
Create function - PowerShell
Create function - Azure CLI
Create function - portal
Create function - Linux
Triggers
Azure Cosmos DB
Blob storage
Queue storage
Timer
Integrate
Azure Cosmos DB
Storage - Python
Storage - portal
Tutorials
Functions with Logic Apps
Create a serverless API
Create an OpenAPI definition
Connect to a Virtual Network
Image resize with Event Grid
Create a serverless web app
Create a custom Linux image
Functions on IoT Edge device
Samples
Code samples
Azure CLI
Concepts
Compare versions 1.x and 2.x
Premium plan
Scale and hosting
Triggers and bindings
About triggers and bindings
Binding example
Register binding extensions
Binding expression patterns
Use binding return values
Handle binding errors
Languages
Supported languages
C# (class library)
C# script (.csx)
F#
JavaScript
Java
PowerShell
Python
Diagnostics
Performance considerations
Functions Proxies
Networking options
IP addresses
Functions in Kubernetes
How-to guides
Develop
Developer guide
Testing functions
Develop and debug locally
Visual Studio development
Core Tools development
IntelliJ IDEA development
Eclipse development
Create Linux function app
Debug local PowerShell functions
Dependency injection
Manage connections
Error handling
Manually run a non HTTP-triggered function
Debug Event Grid trigger locally
Azure for Students Starter
Deploy
Continuous deployment
Build and deploy using Azure Pipelines
Zip deployment
Run from package
Automate resource deployment
On-premises functions
Deploy using the Jenkins plugin
Configure
Manage a function app
Set the runtime version
Manually register an extension
Disable a function
Monitor
Secure
Buy SSL cert
Authenticate users
Authenticate with Azure AD
Authenticate with Facebook
Authenticate with Google
Authenticate with Microsoft account
Authenticate with Twitter
Advanced auth
Restrict IPs
Use a managed identity
Reference secrets from Key Vault
Integrate
Connect to SQL Database
Connect to a virtual Network
Create an Open API 2.0 definition
Export to PowerApps and Microsoft Flow
Call a function from PowerApps
Call a function from Microsoft Flow
Use a managed identity
Troubleshoot
Troubleshoot storage
Reference
API references
Java
Python
App settings reference
Bindings
Blob storage
Azure Cosmos DB
Functions 1.x
Functions 2.x
Event Grid
Event Hubs
IoT Hub
HTTP and webhooks
Microsoft Graph
Mobile Apps
Notification Hubs
Queue storage
SendGrid
Service Bus
SignalR Service
Table storage
Timer
Twilio
host.json reference
Functions 2.x
Functions 1.x
Networking FAQ
OpenAPI reference
Resources
Build your skills with Microsoft Learn
Azure Roadmap
Pricing
Pricing calculator
Quota information
Regional availability
Videos
MSDN forum
Stack Overflow
Twitter
Provide product feedback
Azure Functions GitHub repository
Service updates
An introduction to Azure Functions
12/18/2018 • 4 minutes to read
Azure Functions is a solution for easily running small pieces of code, or "functions," in the cloud. You can write just
the code you need for the problem at hand, without worrying about a whole application or the infrastructure to
run it. Functions can make development even more productive, and you can use your development language of
choice, such as C#, F#, Node.js, Java, or PHP. Pay only for the time your code runs and trust Azure to scale as
needed. Azure Functions lets you develop serverless applications on Microsoft Azure.
This topic provides a high-level overview of Azure Functions. If you want to jump right in and get started with
Functions, start with Create your first Azure Function. If you are looking for more technical information about
Functions, see the developer reference.
Features
Here are some key features of Functions:
Choice of language - Write functions using your choice of C#, F#, or JavaScript. See Supported languages
for other options.
Pay-per-use pricing model - Pay only for the time spent running your code. See the Consumption hosting
plan option in the pricing section.
Bring your own dependencies - Functions supports NuGet and NPM, so you can use your favorite libraries.
Integrated security - Protect HTTP-triggered functions with OAuth providers such as Azure Active Directory,
Facebook, Google, Twitter, and Microsoft Account.
Simplified integration - Easily leverage Azure services and software-as-a-service (SaaS) offerings. See the
integrations section for some examples.
Flexible development - Code your functions right in the portal or set up continuous integration and deploy
your code through GitHub, Azure DevOps Services, and other supported development tools.
Open-source - The Functions runtime is open-source and available on GitHub.
Integrations
Azure Functions integrates with various Azure and 3rd-party services. These services can trigger your function
and start execution, or they can serve as input and output for your code. The following service integrations are
supported by Azure Functions:
Azure Cosmos DB
Azure Event Hubs
Azure Event Grid
Azure Notification Hubs
Azure Service Bus (queues and topics)
Azure Storage (blob, queues, and tables)
On-premises (using Service Bus)
Twilio (SMS messages)
Next Steps
Create your first Azure Function
Jump right in and create your first function using the Azure Functions quickstart.
Azure Functions developer reference
Provides more technical information about the Azure Functions runtime and a reference for coding functions
and defining triggers and bindings.
Testing Azure Functions
Describes various tools and techniques for testing your functions.
How to scale Azure Functions
Discusses service plans available with Azure Functions, including the Consumption hosting plan, and how to
choose the right plan.
Learn more about Azure App Service
Azure Functions leverages Azure App Service for core functionality like deployments, environment variables,
and diagnostics.
What are Durable Functions?
5/17/2019 • 2 minutes to read
Durable Functions is an extension of Azure Functions that lets you write stateful functions in a serverless
environment. The extension manages state, checkpoints, and restarts for you.
Benefits
The extension lets you define stateful workflows using an orchestrator function, which can provide the following
benefits:
You can define your workflows in code. No JSON schemas or designers are needed.
Other functions can be called both synchronously and asynchronously. Output from called functions can be
saved to local variables.
Progress is automatically checkpointed when the function awaits. Local state is never lost when the process
recycles or the VM reboots.
Application patterns
The primary use case for Durable Functions is simplifying complex, stateful coordination requirements in
serverless applications. The following are some typical application patterns that can benefit from Durable
Functions:
Chaining
Fan-out/fan-in
Async HTTP APIs
Monitoring
Human interaction
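As an illustration of the chaining pattern, the following is a minimal JavaScript orchestrator sketch, modeled on the public Durable Functions samples (the activity names F1–F3 are placeholders, and the durable-functions npm package is assumed):

const df = require("durable-functions");

// Orchestrator for the chaining pattern: each activity's output feeds
// the next call. Progress is checkpointed at every yield, so local
// state survives process recycles and VM reboots.
module.exports = df.orchestrator(function* (context) {
    const x = yield context.df.callActivity("F1", null);
    const y = yield context.df.callActivity("F2", x);
    return yield context.df.callActivity("F3", y);
});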
Supported languages
Durable Functions currently supports the following languages:
C#: both precompiled class libraries and C# script.
F#: precompiled class libraries and F# script. F# script is only supported for version 1.x of the Azure Functions
runtime.
JavaScript: supported only for version 2.x of the Azure Functions runtime. Requires version 1.7.0 of the
Durable Functions extension, or a later version.
Durable Functions has a goal of supporting all Azure Functions languages. See the Durable Functions issues list
for the latest status of work to support additional languages.
As with Azure Functions, there are templates to help you develop Durable Functions using Visual Studio 2019, Visual
Studio Code, and the Azure portal.
Billing
Durable Functions are billed the same as Azure Functions. For more information, see Azure Functions pricing.
Jump right in
You can get started with Durable Functions in under 10 minutes by completing one of these language-specific
quickstart tutorials:
C# using Visual Studio 2019
JavaScript using Visual Studio Code
In both quickstarts, you locally create and test a "hello world" durable function. You then publish the function code
to Azure. The function you create orchestrates and chains together calls to other functions.
Learn more
The following video highlights the benefits of Durable Functions:
Because Durable Functions is an advanced extension for Azure Functions, it isn't appropriate for all applications. To
learn more about Durable Functions, see Durable Functions patterns and technical concepts. For a comparison
with other Azure orchestration technologies, see Compare Azure Functions and Azure Logic Apps.
Next steps
Durable Functions patterns and technical concepts
What are Microsoft Flow, Logic Apps, Functions, and
WebJobs?
4/23/2019 • 6 minutes to read
| | MICROSOFT FLOW | LOGIC APPS |
| --- | --- | --- |
| Users | Office workers, business users, SharePoint administrators | Pro integrators and developers, IT pros |
| Design tool | In-browser and mobile app, UI only | In-browser and Visual Studio, code view available |
| Application lifecycle management (ALM) | Design and test in non-production environments, promote to production when ready | Azure DevOps: source control, testing, support, automation, and manageability in Azure Resource Manager |
| Admin experience | Manage Microsoft Flow environments and data loss prevention (DLP) policies, track licensing: Microsoft Flow Admin Center | Manage resource groups, connections, access management, and logging: Azure portal |
| Security | Office 365 Security and Compliance audit logs, DLP, encryption at rest for sensitive data | Security assurance of Azure: Azure security, Azure Security Center, audit logs |

| | DURABLE FUNCTIONS | LOGIC APPS |
| --- | --- | --- |
| Actions | Each activity is an Azure function; write code for activity functions | Large collection of ready-made actions |
| Management | REST API, Visual Studio | Azure portal, REST API, PowerShell, Visual Studio |
| Execution context | Can run locally or in the cloud | Runs only in the cloud |

| | FUNCTIONS | WEBJOBS WITH WEBJOBS SDK |
| --- | --- | --- |
| Pay-per-use pricing | ✔ | |
1 WebJobs (without the WebJobs SDK) supports C#, JavaScript, Bash, .cmd, .bat, PowerShell, PHP, TypeScript,
Python, and more. This is not a comprehensive list. A WebJob can run any program or script that can run in the
App Service sandbox.
2 WebJobs (without the WebJobs SDK) supports NPM and NuGet.
Summary
Azure Functions offers more developer productivity than Azure App Service WebJobs does. It also offers more
options for programming languages, development environments, Azure service integration, and pricing. For most
scenarios, it's the best choice.
Here are two scenarios for which WebJobs may be the best choice:
You need more control over the code that listens for events, the JobHost object. Functions offers a limited
number of ways to customize JobHost behavior in the host.json file. Sometimes you need to do things that
can't be specified by a string in a JSON file. For example, only the WebJobs SDK lets you configure a custom
retry policy for Azure Storage.
You have an App Service app for which you want to run code snippets, and you want to manage them together
in the same Azure DevOps environment.
For other scenarios where you want to run code snippets for integrating Azure or third-party services, choose
Azure Functions over WebJobs with the WebJobs SDK.
Microsoft Flow, Logic Apps, Functions, and WebJobs together
You don't have to choose just one of these services. They integrate with each other as well as they do with external
services.
A flow can call a logic app. A logic app can call a function, and a function can call a logic app. See, for example,
Create a function that integrates with Azure Logic Apps.
The integration between Microsoft Flow, Logic Apps, and Functions continues to improve over time. You can build
something in one service and use it in the other services.
You can get more information on integration services by using the following links:
Leveraging Azure Functions & Azure App Service for integration scenarios by Christopher Anderson
Integrations Made Simple by Charles Lamanna
Logic Apps Live webcast
Microsoft Flow frequently asked questions
Next steps
Get started by creating your first flow, logic app, or function app. Select any of the following links:
Get started with Microsoft Flow
Create a logic app
Create your first Azure function
Create your first function using Visual Studio
5/17/2019 • 5 minutes to read
Azure Functions lets you execute your code in a serverless environment without having to first create a VM or
publish a web application.
In this article, you learn how to use the Visual Studio 2019 tools for Azure Functions to locally create and test a
"hello world" function. You then publish the function code to Azure. These tools are available as part of the Azure
development workload in Visual Studio 2019.
This topic includes a video that demonstrates the same basic steps.
Prerequisites
To complete this tutorial:
Install Visual Studio 2019 and ensure that the Azure development workload is also installed.
Make sure you have the latest Azure Functions tools.
If you don't have an Azure subscription, create a free account before you begin.
NOTE
Make sure you set the Access rights to Anonymous. If you choose the default level of Function, you're required
to present the function key in requests to access your function endpoint.
When you choose Select Existing, all files in the existing function app in Azure are overwritten by files
from the local project. Only use this option when republishing updates to an existing function app.
3. If you haven't already connected Visual Studio to your Azure account, select Add an account....
4. In the Create App Service dialog, use the Hosting settings as specified in the table below the image:
| SETTING | SUGGESTED VALUE | DESCRIPTION |
| --- | --- | --- |
| App Name | Globally unique name | Name that uniquely identifies your new function app. |
| Storage Account | General purpose storage account | An Azure storage account is required by the Functions runtime. Click New to create a general purpose storage account. You can also use an existing account that meets the storage account requirements. |
5. Click Create to create a function app and related resources in Azure with these settings and deploy your
function project code.
6. After the deployment is complete, make a note of the Site URL value, which is the address of your function
app in Azure.
http://<APP_NAME>.azurewebsites.net/api/<FUNCTION_NAME>?name=<YOUR_NAME>
2. Paste this new URL for the HTTP request into your browser's address bar. The following shows the
response in the browser to the remote GET request returned by the function:
Next steps
You have used Visual Studio to create and publish a C# function app with a simple HTTP triggered function.
Learn how to add input and output bindings that integrate with other services.
Learn more about developing functions as .NET class libraries.
Create your first function using Visual Studio Code
4/9/2019 • 6 minutes to read
Azure Functions lets you execute your code in a serverless environment without having to first create a VM or
publish a web application.
In this article, you learn how to use the Azure Functions extension for Visual Studio Code to create and test a
"hello world" function on your local computer using Microsoft Visual Studio Code. You then publish the function
code to Azure from Visual Studio Code.
The extension fully supports C#, JavaScript, and Java functions, with Python support currently in preview. The
steps in this article may vary depending on your choice of language for your Azure Functions project. The
extension itself is also in preview. To learn more, see the Azure Functions extension for Visual Studio
Code extension page.
Prerequisites
To complete this quickstart:
Install Visual Studio Code on one of the supported platforms. This article was developed and tested on a
device running macOS (High Sierra).
Install version 2.x of the Azure Functions Core Tools, which is still in preview.
Install the specific requirements for your chosen language:
3. Restart Visual Studio Code and select the Azure icon on the Activity bar. You should see an Azure Functions
area in the Side Bar.
Create an Azure Functions project
The Azure Functions project template in Visual Studio Code creates a project that can be published to a function
app in Azure. A function app lets you group functions as a logical unit for management, deployment, and sharing
of resources.
1. In Visual Studio Code, select the Azure logo to display the Azure: Functions area, and then select the
Create New Project icon.
3. Select the language for your function app project. In this article, JavaScript is used.
2. Select the folder with your function app project and select the HTTP trigger function template.
3. Type HTTPTrigger for the function name and press Enter, then select Anonymous authentication.
A function is created in your chosen language using the template for an HTTP-triggered function.
You can add input and output bindings to your function by modifying the function.json file. For more information,
see Azure Functions triggers and bindings concepts.
Now that you've created your function project and an HTTP-triggered function, you can test it on your local
computer.
3. Paste the URL for the HTTP request into your browser's address bar. Append the query string
?name=<yourname> to this URL and execute the request. Execution is paused when the breakpoint is hit.
4. When you continue the execution, the following shows the response in the browser to the GET request:
Sign in to Azure
Before you can publish your app, you must sign in to Azure.
1. In the Azure: Functions area, choose Sign in to Azure.... If you don't already have one, you can Create a
free Azure account.
2. When prompted, select Copy & Open, or copy the displayed code and open https://aka.ms/devicelogin in
your browser.
3. Paste the copied code in the Device Login page, verify the sign in for Visual Studio Code, then select
Continue.
4. Complete the sign in using your Azure account credentials. After you have successfully signed in, you can
close the browser.
IMPORTANT
Publishing to an existing function app overwrites the content of that app in Azure.
1. In the Azure: Functions area, select the Deploy to Function App icon.
2. If not signed-in, you are prompted to Sign in to Azure. You can also Create a free Azure account. After
successful sign in from the browser, go back to Visual Studio Code.
3. If you have multiple subscriptions, Select a subscription for the function app, then choose + Create New
Function App in Azure.
4. Type a globally unique name that identifies your function app and press Enter. Valid characters for a
function app name are a-z, 0-9, and -.
5. Choose + Create New Resource Group, type a resource group name, like myResourceGroup, and press
Enter. You can also use an existing resource group.
6. Choose + Create New Storage Account, type a globally unique name of the new storage account used
by your function app and press Enter. Storage account names must be between 3 and 24 characters in
length and may contain numbers and lowercase letters only. You can also use an existing account.
7. Choose a location in a region near you or near other services your functions access.
When you press Enter, the following Azure resources are created in your subscription:
Resource group: Contains all of the created Azure resources. The name is based on your function app
name.
Storage account: A standard Storage account is created with a unique name that is based on your
function app name.
Hosting plan: A consumption plan is created in the West US region to host your serverless function
app.
Function app: Your project is deployed to and runs in this new function app.
A notification is displayed after your function app is created and the deployment package is applied. Select
View Output in this notification to view the creation and deployment results, including the Azure
resources that you created.
8. Back in the Azure: Functions area, expand the new function app under your subscription. Expand
Functions, right-click HttpTrigger, and then choose Copy function URL.
Test your function in Azure
1. Copy the URL of the HTTP trigger from the Output panel. As before, make sure to add the query string
?name=<yourname> to the end of this URL and execute the request.
The URL that calls your HTTP-triggered function should be in the following format:
http://<functionappname>.azurewebsites.net/api/<functionname>?name=<yourname>
2. Paste this new URL for the HTTP request into your browser's address bar. The following shows the
response in the browser to the remote GET request returned by the function:
Next steps
You have used Visual Studio Code to create a function app with a simple HTTP-triggered function. You may also
want to learn more about local testing and debugging from the Terminal or command prompt using the Azure
Functions Core Tools.
Enable Application Insights integration
Create your first function with Java and Maven
5/16/2019 • 4 minutes to read
This article guides you through using the Maven command line tool to build and publish a Java function to
Azure Functions. When you're done, your function code runs on the Consumption Plan in Azure and can be
triggered using an HTTP request.
If you don't have an Azure subscription, create a free account before you begin.
Prerequisites
To develop functions using Java, you must have the following installed:
Java Developer Kit, version 8
Apache Maven, version 3.0 or above
Azure CLI
Azure Functions Core Tools version 2.6.666 or above
IMPORTANT
The JAVA_HOME environment variable must be set to the install location of the JDK to complete this quickstart.
mvn archetype:generate \
-DarchetypeGroupId=com.microsoft.azure \
-DarchetypeArtifactId=azure-functions-archetype
NOTE
If you're experiencing issues with running the command, check which maven-archetype-plugin version is being used.
Because you are running the command in an empty directory with no pom.xml file, it might be attempting to use a
plugin of the older version from ~/.m2/repository/org/apache/maven/plugins/maven-archetype-plugin if you upgraded
your Maven from an older version. If so, try deleting the maven-archetype-plugin directory and re-running the
command.
Windows (PowerShell):

mvn archetype:generate `
"-DarchetypeGroupId=com.microsoft.azure" `
"-DarchetypeArtifactId=azure-functions-archetype"

Windows (Cmd):

mvn archetype:generate ^
-DarchetypeGroupId=com.microsoft.azure ^
-DarchetypeArtifactId=azure-functions-archetype
Maven will ask you for values needed to finish generating the project. For groupId, artifactId, and version values,
see the Maven naming conventions reference. The appName value must be unique across Azure, so Maven
generates an app name based on the previously entered artifactId as a default. The packageName value
determines the Java package for the generated function code.
The com.fabrikam.functions and fabrikam-functions identifiers below are used as an example and to make later
steps in this quickstart easier to read. You are encouraged to supply your own values to Maven in this step.
Maven creates the project files in a new folder with a name of artifactId, in this example fabrikam-functions.
The ready-to-run generated code in the project is a simple HTTP-triggered function that echoes the body of the
request:
        if (name == null) {
            return request.createResponseBuilder(HttpStatus.BAD_REQUEST)
                    .body("Please pass a name on the query string or in the request body").build();
        } else {
            return request.createResponseBuilder(HttpStatus.OK).body("Hello, " + name).build();
        }
    }
}
Reference bindings
To reference the Azure Functions 2.x default bindings, open the host.json file and update contents to match the
following code.
{
"version": "2.0",
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
}
}
cd fabrikam-functions
mvn clean package
mvn azure-functions:run
NOTE
If you're experiencing this exception: javax.xml.bind.JAXBException with Java 9, see the workaround on GitHub.
You see this output when the function is running locally on your system and ready to respond to HTTP requests:
Listening on http://localhost:7071
Hit CTRL-C to exit...
Http Functions:
hello: http://localhost:7071/api/hello
Trigger the function from the command line using curl in a new terminal window:
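The command itself isn't preserved in this extract; based on the hello endpoint shown above, it presumably resembled:

curl -w "\n" http://localhost:7071/api/hello -d LocalFunction

The response follows: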
Hello LocalFunction!
Before you can deploy, sign in to Azure with the Azure CLI:

az login
Deploy your code into a new Function app using the azure-functions:deploy Maven target.
NOTE
When you use Visual Studio Code to deploy your Function app, remember to choose a non-free subscription, or you will
get an error. You can view your subscriptions on the left side of the IDE.
mvn azure-functions:deploy
When the deploy is complete, you see the URL you can use to access your Azure function app:
Test the function app running on Azure using cURL. You'll need to change the URL from the sample below to
match the deployed URL for your own function app from the previous step.
NOTE
Make sure you set the Access rights to Anonymous. When you choose the default level of Function, you are required
to present the function key in requests to access your function endpoint.
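The cURL call isn't preserved here; given the response below, it presumably resembled the following, with your app's actual URL substituted:

curl -w "\n" https://<APP_NAME>.azurewebsites.net/api/hello -d AzureFunctions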
Hello AzureFunctions!
Next, update the function: in the generated function code, change the response text from "Hello, " + name to "Hi, " + name.
Save the changes. Run mvn clean package and redeploy by running azure-functions:deploy from the terminal
as before. The function app is updated, and a request like the following (the exact call isn't preserved in this
extract, but it was presumably the earlier cURL call with a new body value):
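curl -w "\n" https://<APP_NAME>.azurewebsites.net/api/hello -d AzureFunctionsTest

returns the updated greeting: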
Hi, AzureFunctionsTest
Next steps
You have created a Java function app with a simple HTTP trigger and deployed it to Azure Functions.
Review the Java Functions developer guide for more information on developing Java functions.
Add additional functions with different triggers to your project using the azure-functions:add Maven target.
Write and debug functions locally with Visual Studio Code, IntelliJ, and Eclipse.
Debug functions deployed in Azure with Visual Studio Code. See the Visual Studio Code serverless Java
applications documentation for instructions.
Create an HTTP triggered function in Azure
5/17/2019 • 6 minutes to read
NOTE
Python for Azure Functions is currently in preview. To receive important updates, subscribe to the Azure App Service
announcements repository on GitHub.
This article shows you how to use command-line tools to create a Python project that runs in Azure Functions.
The function you create is triggered by HTTP requests. Finally, you publish your project to run as a serverless
function in Azure.
This article is the first of two quickstarts for Azure Functions. After you complete this article, you add an Azure
Storage queue output binding to your function.
Prerequisites
Before you start, you must have the following:
Install Python 3.6.
Install Azure Functions Core Tools version 2.6.666 or later.
Install the Azure CLI version 2.x or later.
An active Azure subscription.
If you don't have an Azure subscription, create a free account before you begin.
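The project-creation command isn't preserved in this extract; based on the folder and files described next, it presumably resembled the following, run from an activated Python 3.6 virtual environment:

func init MyFunctionProj
# When prompted, choose the python worker runtime.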
A folder named MyFunctionProj is created, which contains the following three files:
local.settings.json is used to store app settings and connection strings when running locally. This file
doesn't get published to Azure.
requirements.txt contains the list of packages to be installed on publishing to Azure.
host.json contains global configuration options that affect all functions in a function app. This file does get
published to Azure.
Navigate to the new MyFunctionProj folder:
cd MyFunctionProj
Reference bindings
Extension bundles make it easier to add binding extensions down the road. They also remove the requirement of
installing the .NET Core 2.x SDK. Extension bundles require version 2.6.1071 of the Core Tools, or a later
version.
To reference the Azure Functions 2.x default bindings, open the host.json file and update contents to match the
following code.
{
"version": "2.0",
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
}
}
Create a function
To add a function to your project, run the following command:
func new
Choose the HTTP trigger template, type HttpTrigger as the name for the function, then press Enter.
A subfolder named HttpTrigger is created, which contains the following files:
function.json: configuration file that defines the function, trigger, and other bindings. Review this file and
see that the value for scriptFile points to the file containing the function, while the invocation trigger
and bindings are defined in the bindings array.
Each binding requires a direction, a type, and a unique name. The HTTP trigger has an input binding of type
httpTrigger and an output binding of type http.
__init__.py: script file that is your HTTP-triggered function. Review this script and see that it contains a
default main(). HTTP data from the trigger is passed to this function using the req named binding
parameter. Defined in function.json, req is an instance of the azure.functions.HttpRequest class.
The return object, defined as $return in function.json, is an instance of the azure.functions.HttpResponse
class. To learn more, see Azure Functions HTTP triggers and bindings.
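A sketch of what these two generated files typically look like for this template (exact contents vary by Core Tools version). First, function.json, with the httpTrigger input binding named req and the $return output described above:

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}

And a minimal __init__.py in the default template shape:

import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # HTTP trigger data arrives through the "req" binding parameter.
    logging.info('Python HTTP trigger function processed a request.')

    name = req.params.get('name')
    if name:
        # The returned HttpResponse maps to the $return output binding.
        return func.HttpResponse(f"Hello {name}!")
    return func.HttpResponse(
        "Please pass a name on the query string or in the request body",
        status_code=400)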
When the Functions host starts, it writes something like the following output, which has been truncated for
readability:
%%%%%%
%%%%%%
@ %%%%%% @
@@ %%%%%% @@
@@@ %%%%%%%%%%% @@@
@@ %%%%%%%%%% @@
@@ %%%% @@
@@ %%% @@
@@ %% @@
%%
%
...
...
Http Functions:
HttpTrigger: http://localhost:7071/api/MyHttpTrigger
Copy the URL of your HttpTrigger function from the runtime output and paste it into your browser's address
bar. Append the query string ?name=<yourname> to this URL and execute the request. The following shows the
response in the browser to the GET request returned by the local function:
Now that you have run your function locally, you can create the function app and other required resources in
Azure.
You generally create your resource group and the resources in a region near you. To see all supported locations
for App Service plans, run the az appservice list-locations command.
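The resource-group creation step isn't preserved in this extract; consistent with the names used below, it presumably resembled:

az group create --name myResourceGroup --location westeurope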
az storage account create --name <storage_name> --location westeurope --resource-group myResourceGroup --sku Standard_LRS
After the storage account has been created, the Azure CLI shows information similar to the following example:
{
"creationTime": "2017-04-15T17:14:39.320307+00:00",
"id": "/subscriptions/bbbef702-e769-477b-9f16-bc4d3aa97387/resourceGroups/myresourcegroup/...",
"kind": "Storage",
"location": "westeurope",
"name": "myfunctionappstorage",
"primaryEndpoints": {
"blob": "https://myfunctionappstorage.blob.core.windows.net/",
"file": "https://myfunctionappstorage.file.core.windows.net/",
"queue": "https://myfunctionappstorage.queue.core.windows.net/",
"table": "https://myfunctionappstorage.table.core.windows.net/"
},
....
// Remaining output has been truncated for readability.
}
NOTE
Linux and Windows apps cannot be hosted in the same resource group. If you have an existing resource group named
myResourceGroup with a Windows function app or web app, you must use a different resource group.
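The function app creation command is also not preserved here; for the Python-on-Linux Consumption (preview) scenario it presumably resembled:

az functionapp create --resource-group myResourceGroup --os-type Linux \
  --consumption-plan-location westeurope --runtime python \
  --name <APP_NAME> --storage-account <storage_name>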
You're now ready to publish your local functions project to the function app in Azure.
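The publish command isn't preserved in this extract; it was presumably Core Tools' publish target:

func azure functionapp publish <APP_NAME>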
You will see output similar to the following, which has been truncated for readability.
Preparing archive...
Uploading content...
Upload completed successfully.
Deployment completed successfully.
Syncing triggers...
Functions in myfunctionapp:
HttpTrigger - [httpTrigger]
Invoke url: https://myfunctionapp.azurewebsites.net/api/httptrigger?code=cCr8sAxfBiow548FBDLS1....
Copy the Invoke URL value for your HttpTrigger, which you can now use to test your function in Azure. The
URL contains a code query string value that is your function key. This key makes it difficult for others to call
your HTTP trigger endpoint in Azure.
curl "https://myfunctionapp.azurewebsites.net/api/httptrigger?code=cCr8sAxfBiow548FBDLS1....&name=<yourname>"
You can also paste the copied URL into the address bar of your web browser. Again, append the query string
&name=<yourname> to the URL before you execute the request.
Next steps
You've created a Python functions project with an HTTP triggered function, run it on your local machine, and
deployed it to Azure. Now, extend your function by...
Adding an Azure Storage queue output binding
Create your first PowerShell function in Azure
(preview)
5/9/2019 • 6 minutes to read
NOTE
PowerShell for Azure Functions is currently in preview. To receive important updates, subscribe to the Azure App Service
announcements repository on GitHub.
This quickstart article walks you through how to create your first serverless PowerShell function using Visual
Studio Code.
You use the Azure Functions extension for Visual Studio Code to create a PowerShell function locally and then
deploy it to a new function app in Azure. The extension is currently in preview. To learn more, see the Azure
Functions extension for Visual Studio Code extension page.
NOTE
PowerShell support for the Azure Functions extension is currently disabled by default. Enabling PowerShell support is one of
the steps in this article.
The following steps are supported on macOS, Windows, and Linux-based operating systems.
Prerequisites
To complete this quickstart:
Install PowerShell Core
Install Visual Studio Code on one of the supported platforms.
Install PowerShell extension for Visual Studio Code.
Install .NET Core SDK 2.2+ (required by Azure Functions Core Tools and available on all supported
platforms).
Install version 2.x of the Azure Functions Core Tools.
You also need an active Azure subscription.
If you don't have an Azure subscription, create a free account before you begin.
3. Restart Visual Studio Code and select the Azure icon on the Activity bar. You should see an Azure Functions
area in the Side Bar.
Create a function app project
The Azure Functions project template in Visual Studio Code creates a project that can be published to a function
app in Azure. A function app lets you group functions as a logical unit for management, deployment, and sharing
of resources.
1. In Visual Studio Code, select the Azure logo to display the Azure: Functions area, and then select the
Create New Project icon.
2. Choose a location for your Functions project workspace and choose Select.
NOTE
This article was designed to be completed outside of a workspace. In this case, do not select a project folder that is
part of a workspace.
3. Select PowerShell (preview) as the language for your function app project and then Azure Functions
v2.
4. Choose HTTP Trigger as the template for your first function, use HTTPTrigger as the function name, and
choose an authorization level of Function.
NOTE
The Function authorization level requires a function key value when calling the function endpoint in Azure. This
makes it harder for just anyone to call your function.
3. Append the query string ?name=<yourname> to this URL, and then use Invoke-RestMethod to execute the
request, as follows:
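A sketch of that request, assuming the default local port and route shown elsewhere in this guide:

Invoke-RestMethod -Uri "http://localhost:7071/api/HTTPTrigger?name=<yourname>"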
NOTE
Remember to remove any calls to Wait-Debugger before you publish your functions to Azure.
NOTE
Creating a Function App in Azure will only prompt for Function App name. Set azureFunctions.advancedCreation to true to
be prompted for all other values.
IMPORTANT
Publishing to an existing function app overwrites the content of that app in Azure.
1. In the Azure: Functions area, select the Deploy to Function App icon.
2. If not signed-in, you are prompted to Sign in to Azure. You can also Create a free Azure account. After
successful sign in from the browser, go back to Visual Studio Code.
3. If you have multiple subscriptions, Select a subscription for the function app, then choose + Create New
Function App in Azure.
4. Type a globally unique name that identifies your function app and press Enter. Valid characters for a
function app name are a-z, 0-9, and -.
5. Choose + Create New Resource Group, type a resource group name, like myResourceGroup, and press
Enter. You can also use an existing resource group.
6. Choose + Create New Storage Account, type a globally unique name of the new storage account used by
your function app and press Enter. Storage account names must be between 3 and 24 characters in length
and may contain numbers and lowercase letters only. You can also use an existing account.
7. Choose a location in a region near you or near other services your functions access.
When you press Enter, the following Azure resources are created in your subscription:
Resource group: Contains all of the created Azure resources. The name is based on your function app
name.
Storage account: A standard Storage account is created with a unique name that is based on your
function app name.
Hosting plan: A consumption plan is created in the West US region to host your serverless function
app.
Function app: Your project is deployed to and runs in this new function app.
A notification is displayed after your function app is created and the deployment package is applied. Select
View Output in this notification to view the creation and deployment results, including the Azure resources
that you created.
8. Back in the Azure: Functions area, expand the new function app under your subscription. Expand
Functions, right-click HttpTrigger, and then choose Copy function URL.
Run the function in Azure
To verify that your published function runs in Azure, execute the following PowerShell command, replacing the
Uri parameter with the URL of the HTTPTrigger function from the previous step. As before, append the query
string &name=<yourname> to the URL, as in the following example:
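The command itself isn't preserved in this extract; the response object shown below suggests a web-request call such as the following (the exact cmdlet, app name, and key are assumptions):

Invoke-WebRequest -Uri "https://<APP_NAME>.azurewebsites.net/api/HTTPTrigger?code=<FUNCTION_KEY>&name=PowerShell"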
StatusCode : 200
StatusDescription : OK
Content : Hello PowerShell
RawContent : HTTP/1.1 200 OK
Content-Length: 16
Content-Type: text/plain; charset=utf-8
Date: Thu, 25 Apr 2019 16:01:22 GMT
Hello PowerShell
Forms : {}
Headers : {[Content-Length, 16], [Content-Type, text/plain; charset=utf-8], [Date, Thu, 25 Apr 2019
16:01:22 GMT]}
Images : {}
InputFields : {}
Links : {}
ParsedHtml : mshtml.HTMLDocumentClass
RawContentLength : 16
Next steps
You have used Visual Studio Code to create a PowerShell function app with a simple HTTP-triggered function. You
may also want to learn more about debugging a PowerShell function locally using the Azure Functions Core Tools.
Check out the Azure Functions PowerShell developer guide.
Enable Application Insights integration
Create your first function from the command line
4/28/2019 • 6 minutes to read
This quickstart topic walks you through how to create your first function from the command line or terminal. You
use the Azure CLI to create a function app, which is the serverless infrastructure that hosts your function. The
function code project is generated from a template by using the Azure Functions Core Tools, which is also used to
deploy the function app project to Azure.
You can follow the steps below using a Mac, Windows, or Linux computer.
Prerequisites
Before running this sample, you must have the following:
Install Azure Functions Core Tools version 2.6.666 or later.
Install the Azure CLI. This article requires the Azure CLI version 2.0 or later. Run az --version to find the
version you have. You can also use the Azure Cloud Shell.
An active Azure subscription.
If you don't have an Azure subscription, create a free account before you begin.
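The initialization command isn't preserved in this extract; given the prompt and generated files that follow, it presumably resembled:

func init MyFunctionProj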
When prompted, select a worker runtime from the following language choices:
dotnet : creates a .NET class library project (.csproj).
node : creates a JavaScript project.
When the command executes, you see something like the following output:
Writing .gitignore
Writing host.json
Writing local.settings.json
Initialized empty Git repository in C:/functions/MyFunctionProj/.git/
Use the following command to navigate to the new MyFunctionProj project folder.
cd MyFunctionProj
Reference bindings
To reference the Azure Functions 2.x default bindings, open the host.json file and update contents to match the
following code.
{
"version": "2.0",
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
}
}
Create a function
The following command creates an HTTP-triggered function named MyHttpTrigger.
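The command isn't preserved here; it presumably resembled:

func new --name MyHttpTrigger --template "HttpTrigger"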
When the command executes, you see something like the following output:
The function "MyHttpTrigger" was created successfully from the "HttpTrigger" template.
The --build option is required to compile C# projects. You don't need this option for a JavaScript project.
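The start command itself isn't preserved in this extract; for a C# project it presumably resembled the following (drop --build for JavaScript):

func host start --build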
When the Functions host starts, it writes something like the following output, which has been truncated for
readability:
%%%%%%
%%%%%%
@ %%%%%% @
@@ %%%%%% @@
@@@ %%%%%%%%%%% @@@
@@ %%%%%%%%%% @@
@@ %%%% @@
@@ %%% @@
@@ %% @@
%%
%
...
...
Http Functions:
HttpTrigger: http://localhost:7071/api/MyHttpTrigger
Copy the URL of your HttpTrigger function from the runtime output and paste it into your browser's address bar.
Append the query string ?name=<yourname> to this URL and execute the request. The following shows the response
in the browser to the GET request returned by the local function:
Now that you have run your function locally, you can create the function app and other required resources in
Azure.
You generally create your resource group and the resources in a region near you. To see all supported locations for
App Service plans, run the az appservice list-locations command.
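The resource-group creation step isn't preserved in this extract; consistent with the names used below, it presumably resembled:

az group create --name myResourceGroup --location westeurope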
az storage account create --name <storage_name> --location westeurope --resource-group myResourceGroup --sku Standard_LRS
After the storage account has been created, the Azure CLI shows information similar to the following example:
{
"creationTime": "2017-04-15T17:14:39.320307+00:00",
"id": "/subscriptions/bbbef702-e769-477b-9f16-bc4d3aa97387/resourceGroups/myresourcegroup/...",
"kind": "Storage",
"location": "westeurope",
"name": "myfunctionappstorage",
"primaryEndpoints": {
"blob": "https://myfunctionappstorage.blob.core.windows.net/",
"file": "https://myfunctionappstorage.file.core.windows.net/",
"queue": "https://myfunctionappstorage.queue.core.windows.net/",
"table": "https://myfunctionappstorage.table.core.windows.net/"
},
....
// Remaining output has been truncated for readability.
}
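The function app creation command isn't preserved here; it presumably resembled the following (the runtime value depends on your worker choice):

az functionapp create --resource-group myResourceGroup --consumption-plan-location westeurope \
  --name <APP_NAME> --storage-account <storage_name> --runtime dotnet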
Setting the consumption-plan-location parameter means that the function app is hosted in a Consumption hosting
plan. In this serverless plan, resources are added dynamically as required by your functions and you only pay when
functions are running. For more information, see Choose the correct hosting plan.
After the function app has been created, the Azure CLI shows information similar to the following example:
{
"availabilityState": "Normal",
"clientAffinityEnabled": true,
"clientCertEnabled": false,
"containerSize": 1536,
"dailyMemoryTimeQuota": 0,
"defaultHostName": "quickstart.azurewebsites.net",
"enabled": true,
"enabledHostNames": [
"quickstart.azurewebsites.net",
"quickstart.scm.azurewebsites.net"
],
....
// Remaining output has been truncated for readability.
}
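The publish command isn't preserved in this extract; it was presumably:

func azure functionapp publish <APP_NAME>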
You will see output similar to the following, which has been truncated for readability.
Preparing archive...
Uploading content...
Upload completed successfully.
Deployment completed successfully.
Syncing triggers...
Functions in myfunctionapp:
HttpTrigger - [httpTrigger]
Invoke url: https://myfunctionapp.azurewebsites.net/api/httptrigger?code=cCr8sAxfBiow548FBDLS1....
Copy the Invoke URL value for your HttpTrigger, which you can now use to test your function in Azure. The URL
contains a code query string value that is your function key. This key makes it difficult for others to call your HTTP
trigger endpoint in Azure.
curl "https://myfunctionapp.azurewebsites.net/api/httptrigger?code=cCr8sAxfBiow548FBDLS1....&name=<yourname>"
You can also paste the copied URL into the address bar of your web browser. Again, append the query string
&name=<yourname> to the URL before you execute the request.
Clean up resources
Other quickstarts in this collection build upon this quickstart. If you plan to continue on to work with subsequent
quickstarts or with the tutorials, do not clean up the resources created in this quickstart. If you do not plan to
continue, use the following command to delete all resources created by this quickstart:
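az group delete --name myResourceGroup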
Next steps
Learn more about developing Azure Functions locally using the Azure Functions Core Tools.
Code and test Azure Functions locally
Create your first function in the Azure portal
5/17/2019 • 4 minutes to read
Azure Functions lets you execute your code in a serverless environment without having to first create a VM
or publish a web application. In this article, learn how to use Functions to create a "hello world" function in
the Azure portal.
If you don't have an Azure subscription, create a free account before you begin.
NOTE
If you're a C# developer, consider creating your first function in Visual Studio 2019 instead of in the portal.
Log in to Azure
Sign in to the Azure portal at https://portal.azure.com with your Azure account.
| SETTING | SUGGESTED VALUE | DESCRIPTION |
| --- | --- | --- |
| App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -. |
2. Paste the function URL into your browser's address bar. Add the query string value
&name=<yourname> to the end of this URL and press the Enter key on your keyboard to execute the
request. You should see the response returned by the function displayed in the browser.
The following example shows the response in the browser:
The request URL includes a key that is required, by default, to access your function over HTTP.
3. When your function runs, trace information is written to the logs. To see the trace output from the
previous execution, return to your function in the portal and click the arrow at the bottom of the
screen to expand the Logs.
Clean up resources
Other quick starts in this collection build upon this quick start. If you plan to work with subsequent quick
starts, tutorials, or with any of the services you have created in this quick start, do not clean up the
resources.
Resources in Azure refer to function apps, functions, storage accounts, and so forth. They are grouped into
resource groups, and you can delete everything in a group by deleting the group.
You created resources to complete these quickstarts. You may be billed for these resources, depending on
your account status and service pricing. If you don't need the resources anymore, here's how to delete
them:
1. In the Azure portal, go to the Resource group page.
To get to that page from the function app page, select the Overview tab and then select the link
under Resource group.
To get to that page from the dashboard, select Resource groups, and then select the resource group
that you used for this quickstart.
2. In the Resource group page, review the list of included resources, and verify that they are the ones
you want to delete.
3. Select Delete resource group, and follow the instructions.
Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You
can also select the bell icon at the top of the page to view the notification.
Next steps
You've created a function app with a simple HTTP triggered function.
Now that you have created your first function, let's add an output binding to the function that writes a
message to a Storage queue.
Add messages to an Azure Storage queue using Functions
For more information, see Azure Functions HTTP bindings.
Create your first function hosted on Linux using Core
Tools and the Azure CLI (preview)
4/28/2019 • 6 minutes to read
Azure Functions lets you execute your code in a serverless Linux environment without having to first create a VM
or publish a web application. Linux hosting requires the Functions 2.0 runtime. Support to run a function app on
Linux in the serverless Consumption plan is currently in preview. To learn more, see this preview considerations
article.
This quickstart article walks you through how to use the Azure CLI to create your first function app running on
Linux. The function code is created locally and then deployed to Azure by using the Azure Functions Core Tools.
The following steps are supported on a Mac, Windows, or Linux computer. This article shows you how to create
functions in either JavaScript or C#. To learn how to create Python functions, see Create your first Python function
using Core Tools and the Azure CLI (preview ).
Prerequisites
Before running this sample, you must have the following:
Install Azure Functions Core Tools version 2.6.666 or above.
Install the Azure CLI. This article requires the Azure CLI version 2.0 or later. Run az --version to find the
version you have. You can also use the Azure Cloud Shell.
An active Azure subscription.
If you don't have an Azure subscription, create a free account before you begin.
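The initialization command isn't preserved in this extract; given the prompt and generated files that follow, it presumably resembled:

func init MyFunctionProj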
When prompted, use the arrow keys to select a worker runtime from the following language choices:
dotnet : creates a .NET class library project (.csproj).
node : creates a JavaScript or TypeScript project. When prompted, choose JavaScript .
python : creates a Python project. For Python functions, see the Python quickstart.
When the command executes, you see something like the following output:
Writing .gitignore
Writing host.json
Writing local.settings.json
Initialized empty Git repository in C:/functions/MyFunctionProj/.git/
Use the following command to navigate to the new MyFunctionProj project folder.
cd MyFunctionProj
Reference bindings
To reference the Azure Functions 2.x default bindings, open the host.json file and update contents to match the
following code.
{
"version": "2.0",
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
}
}
Create a function
The following command creates an HTTP-triggered function named MyHttpTrigger.
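The command isn't preserved here; it presumably resembled:

func new --name MyHttpTrigger --template "HttpTrigger"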
When the command executes, you see something like the following output:
The function "MyHttpTrigger" was created successfully from the "HttpTrigger" template.
The --build option is required to compile C# projects. You don't need this option for a JavaScript project.
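The start command itself isn't preserved in this extract; for a C# project it presumably resembled the following (drop --build for JavaScript):

func host start --build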
When the Functions host starts, it writes something like the following output, which has been truncated for
readability:
%%%%%%
%%%%%%
@ %%%%%% @
@@ %%%%%% @@
@@@ %%%%%%%%%%% @@@
@@ %%%%%%%%%% @@
@@ %%%% @@
@@ %%% @@
@@ %% @@
%%
%
...
...
Http Functions:
HttpTrigger: http://localhost:7071/api/MyHttpTrigger
Copy the URL of your HttpTrigger function from the runtime output and paste it into your browser's address bar.
Append the query string ?name=<yourname> to this URL and execute the request. The following shows the response
in the browser to the GET request returned by the local function:
Now that you have run your function locally, you can create the function app and other required resources in
Azure.
You generally create your resource group and the resources in a region near you. To see all supported locations for
App Service plans, run the az appservice list-locations command.
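The resource-group creation step isn't preserved in this extract; consistent with the names used below, it presumably resembled:

az group create --name myResourceGroup --location westeurope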
az storage account create --name <storage_name> --location westeurope --resource-group myResourceGroup --sku Standard_LRS
After the storage account has been created, the Azure CLI shows information similar to the following example:
{
"creationTime": "2017-04-15T17:14:39.320307+00:00",
"id": "/subscriptions/bbbef702-e769-477b-9f16-bc4d3aa97387/resourceGroups/myresourcegroup/...",
"kind": "Storage",
"location": "westeurope",
"name": "myfunctionappstorage",
"primaryEndpoints": {
"blob": "https://myfunctionappstorage.blob.core.windows.net/",
"file": "https://myfunctionappstorage.file.core.windows.net/",
"queue": "https://myfunctionappstorage.queue.core.windows.net/",
"table": "https://myfunctionappstorage.table.core.windows.net/"
},
....
// Remaining output has been truncated for readability.
}
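The creation command isn't preserved here; for the Linux Consumption (preview) scenario it presumably resembled the following (the runtime value depends on your worker choice):

az functionapp create --resource-group myResourceGroup --os-type Linux \
  --consumption-plan-location westeurope --runtime node \
  --name <APP_NAME> --storage-account <storage_name>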
After the function app has been created, you see the following message:
Your serverless Linux function app 'myfunctionapp' has been successfully created.
To activate this function app, publish your app content using Azure Functions Core Tools or the Azure portal.
Now, you can publish your project to the new function app in Azure.
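A command like the following was presumably used:

func azure functionapp publish <APP_NAME>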
You will see output similar to the following, which has been truncated for readability.
Preparing archive...
Uploading content...
Upload completed successfully.
Deployment completed successfully.
Syncing triggers...
Functions in myfunctionapp:
HttpTrigger - [httpTrigger]
Invoke url: https://myfunctionapp.azurewebsites.net/api/httptrigger?code=cCr8sAxfBiow548FBDLS1....
Copy the Invoke URL value for your HttpTrigger, which you can now use to test your function in Azure. The URL
contains a code query string value that is your function key. This key makes it difficult for others to call your HTTP
trigger endpoint in Azure.
curl "https://myfunctionapp.azurewebsites.net/api/httptrigger?code=cCr8sAxfBiow548FBDLS1....&name=<yourname>"
You can also paste the copied URL into the address bar of your web browser. Again, append the query string
&name=<yourname> to the URL before you execute the request.
Clean up resources
Other quickstarts in this collection build upon this quickstart. If you plan to continue on to work with subsequent
quickstarts or with the tutorials, do not clean up the resources created in this quickstart. If you do not plan to
continue, use the following command to delete all resources created by this quickstart:
az group delete --name myResourceGroup
Next steps
Learn more about developing Azure Functions locally using the Azure Functions Core Tools.
Code and test Azure Functions locally
Create a function triggered by Azure Cosmos DB
3/18/2019 • 8 minutes to read
Learn how to create a function triggered when data is added to or changed in Azure Cosmos DB. To learn more
about Azure Cosmos DB, see Azure Cosmos DB: Serverless database computing using Azure Functions.
Prerequisites
To complete this tutorial:
If you don't have an Azure subscription, create a free account before you begin.
NOTE
Azure Cosmos DB bindings are only supported for use with the SQL API. For all other Azure Cosmos DB APIs, you should
access the database from your function by using the static client for your API, including Azure Cosmos DB's API for
MongoDB, Cassandra API, Gremlin API, and Table API.
| SETTING | SUGGESTED VALUE | DESCRIPTION |
| --- | --- | --- |
| Account Name | Enter a unique name | Enter a name to identify your Azure Cosmos account. Because documents.azure.com is appended to the ID that you provide to create your URI, use a unique ID. |
| Location | Select the region closest to your users | Select a geographic location to host your Azure Cosmos DB account. Use the location that is closest to your users to give them the fastest access to the data. |
4. Select Review + create. You can skip the Network and Tags sections.
5. Review the account settings, and then select Create. It takes a few minutes to create the account. Wait for
the portal page to display Your deployment is complete.
| SETTING | SUGGESTED VALUE | DESCRIPTION |
| --- | --- | --- |
| App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -. |
5. Configure the new trigger with the settings as specified in the table below the image.
| SETTING | SUGGESTED VALUE | DESCRIPTION |
| --- | --- | --- |
| Azure Cosmos DB account connection | New setting | Select New, then choose your Subscription, the Database account you created earlier, and Select. This creates an application setting for your account connection. This setting is used by the binding to connect to the database. |
| Create lease collection if it doesn't exist | Checked | The collection doesn't already exist, so create it. |
6. Click Create to create your Azure Cosmos DB triggered function. After the function is created, the template-
based function code is displayed.
This function template writes the number of documents and the first document ID to the logs.
Next, you connect to your Azure Cosmos DB account and create the Items collection in the Tasks database.
3. Choose your Azure Cosmos DB account, then select the Data Explorer.
4. In Collections, choose taskDatabase and select New Collection.
5. In Add Collection, use the settings shown in the table below the image.
| SETTING | SUGGESTED VALUE | DESCRIPTION |
| --- | --- | --- |
| Storage capacity | Fixed (10 GB) | Use the default value. This value is the storage capacity of the database. |
6. Click OK to create the Items collection. It may take a short time for the collection to get created.
After the collection specified in the function binding exists, you can test the function by adding documents to this
new collection.
2. Replace the contents of the new document with the following content, then choose Save.
{
"id": "task1",
"category": "general",
"description": "some task"
}
3. Switch to the first browser tab that contains your function in the portal. Expand the function logs and verify
that the new document has triggered the function. See that the task1 document ID value is written to the
logs.
4. (Optional) Go back to your document, make a change, and click Update. Then, go back to the function logs
and verify that the update has also triggered the function.
Clean up resources
Other quick starts in this collection build upon this quick start. If you plan to work with subsequent quick starts,
tutorials, or with any of the services you have created in this quick start, do not clean up the resources.
Resources in Azure refers to function apps, functions, storage accounts, and so forth. They are grouped into
resource groups, and you can delete everything in a group by deleting the group.
You created resources to complete these quickstarts. You may be billed for these resources, depending on your
account status and service pricing. If you don't need the resources anymore, here's how to delete them:
1. In the Azure portal, go to the Resource group page.
To get to that page from the function app page, select the Overview tab and then select the link under
Resource group.
To get to that page from the dashboard, select Resource groups, and then select the resource group that
you used for this quickstart.
2. In the Resource group page, review the list of included resources, and verify that they are the ones you
want to delete.
3. Select Delete resource group, and follow the instructions.
Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can
also select the bell icon at the top of the page to view the notification.
Next steps
You have created a function that runs when a document is added or modified in your Azure Cosmos DB database. For more information about Azure Cosmos DB triggers, see Azure Cosmos DB bindings for Azure Functions.
Now that you have created your first function, let's add an output binding to the function that writes a message to a
Storage queue.
Add messages to an Azure Storage queue using Functions
Create a function triggered by Azure Blob storage
1/24/2019 • 5 minutes to read • Edit Online
Learn how to create a function triggered when files are uploaded to or updated in Azure Blob storage.
Prerequisites
Download and install the Microsoft Azure Storage Explorer.
An Azure subscription. If you don't have one, create a free account before you begin.
SETTING | SUGGESTED VALUE | DESCRIPTION
App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
3. In the search field, type blob and then choose the Blob trigger template.
4. If prompted, select Install to install the Azure Storage extension and any dependencies in the function app. After installation succeeds, select Continue.
5. Use the settings as specified in the table below the image.
SETTING | SUGGESTED VALUE | DESCRIPTION
Name | Unique in your function app | Name of this blob triggered function.
Storage account connection | AzureWebJobsStorage | You can use the storage account connection already being used by your function app, or create a new one.
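The portal generates template code when the function is created. For C# script, it resembles the following sketch, which logs the name and size of each blob that triggers the function (an approximation, not the exact generated code):

using System.IO;

public static void Run(Stream myBlob, string name, ILogger log)
{
    // The blob's contents arrive as a Stream; the path suffix arrives as 'name'.
    log.LogInformation($"C# Blob trigger function processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
}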
2. Run the Microsoft Azure Storage Explorer tool, click the connect icon on the left, choose Use a storage
account name and key, and click Next.
3. Enter the Account name and Account key from step 1, click Next and then Connect.
4. Expand the attached storage account, right-click Blob containers, click Create blob container, type
samples-workitems , and then press enter.
Now that you have a blob container, you can test the function by uploading a file to the container.
3. In the Upload files dialog box, click the Files field. Browse to a file on your local computer, such as an
image file, select it and click Open and then Upload.
4. Go back to your function logs and verify that the blob has been read.
NOTE
When your function app runs in the default Consumption plan, there may be a delay of up to several minutes
between the blob being added or updated and the function being triggered. If you need low latency in your blob
triggered functions, consider running your function app in an App Service plan.
Clean up resources
Other quick starts in this collection build upon this quick start. If you plan to work with subsequent quick starts,
tutorials, or with any of the services you have created in this quick start, do not clean up the resources.
Resources in Azure refers to function apps, functions, storage accounts, and so forth. They are grouped into
resource groups, and you can delete everything in a group by deleting the group.
You created resources to complete these quickstarts. You may be billed for these resources, depending on your
account status and service pricing. If you don't need the resources anymore, here's how to delete them:
1. In the Azure portal, go to the Resource group page.
To get to that page from the function app page, select the Overview tab and then select the link under
Resource group.
To get to that page from the dashboard, select Resource groups, and then select the resource group that
you used for this quickstart.
2. In the Resource group page, review the list of included resources, and verify that they are the ones you
want to delete.
3. Select Delete resource group, and follow the instructions.
Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can
also select the bell icon at the top of the page to view the notification.
Next steps
You have created a function that runs when a blob is added to or updated in Blob storage. For more information
about Blob storage triggers, see Azure Functions Blob storage bindings.
Now that you have created your first function, let's add an output binding to the function that writes a message to a
Storage queue.
Add messages to an Azure Storage queue using Functions
Create a function triggered by Azure Queue storage
1/24/2019 • 5 minutes to read • Edit Online
Learn how to create a function that is triggered when messages are submitted to an Azure Storage queue.
Prerequisites
Download and install the Microsoft Azure Storage Explorer.
An Azure subscription. If you don't have one, create a free account before you begin.
SETTING | SUGGESTED VALUE | DESCRIPTION
App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
3. In the search field, type queue and then choose the Queue trigger template.
4. If prompted, select Install to install the Azure Storage extension and any dependencies in the function app. After installation succeeds, select Continue.
5. Use the settings as specified in the table below the image.
SETTING | SUGGESTED VALUE | DESCRIPTION
Storage account connection | AzureWebJobsStorage | You can use the storage account connection already being used by your function app, or create a new one.
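As with the other triggers, the portal generates template code. For C# script it resembles the following sketch, which logs each dequeued message (again an approximation of the generated code):

public static void Run(string myQueueItem, ILogger log)
{
    // The queue message payload is passed in as a string.
    log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
}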
2. Run the Microsoft Azure Storage Explorer tool, click the connect icon on the left, choose Use a storage
account name and key, and click Next.
3. Enter the Account name and Account key from step 1, click Next and then Connect.
4. Expand the attached storage account, right-click Queues, click Create Queue, type myqueue-items , and then
press enter.
Now that you have a storage queue, you can test the function by adding a message to the queue.
5. Back in Storage Explorer, click Refresh and verify that the message has been processed and is no longer in
the queue.
Clean up resources
Other quick starts in this collection build upon this quick start. If you plan to work with subsequent quick starts,
tutorials, or with any of the services you have created in this quick start, do not clean up the resources.
Resources in Azure refers to function apps, functions, storage accounts, and so forth. They are grouped into
resource groups, and you can delete everything in a group by deleting the group.
You created resources to complete these quickstarts. You may be billed for these resources, depending on your
account status and service pricing. If you don't need the resources anymore, here's how to delete them:
1. In the Azure portal, go to the Resource group page.
To get to that page from the function app page, select the Overview tab and then select the link under
Resource group.
To get to that page from the dashboard, select Resource groups, and then select the resource group that
you used for this quickstart.
2. In the Resource group page, review the list of included resources, and verify that they are the ones you
want to delete.
3. Select Delete resource group, and follow the instructions.
Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can
also select the bell icon at the top of the page to view the notification.
Next steps
You have created a function that runs when a message is added to a storage queue. For more information about
Queue storage triggers, see Azure Functions Storage queue bindings.
Now that you have created your first function, let's add an output binding to the function that writes a message back to another queue.
Add messages to an Azure Storage queue using Functions
Create a function in Azure that is triggered by a
timer
1/24/2019 • 4 minutes to read • Edit Online
Learn how to use Azure Functions to create a serverless function that runs based on a schedule that you define.
Prerequisites
To complete this tutorial:
If you don't have an Azure subscription, create a free account before you begin.
SETTING | SUGGESTED VALUE | DESCRIPTION
App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
3. In the search field, type timer and configure the new trigger with the settings as specified in the table
below the image.
4. Click Create. A function is created in your chosen language that runs every minute.
5. Verify execution by viewing trace information written to the logs.
Now, you change the function's schedule so that it runs once every hour instead of every minute.
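The Schedule value on the function's Integrate tab is a six-field CRON (NCRONTAB) expression of the form {second} {minute} {hour} {day} {month} {day-of-week}. Changing the every-minute schedule to 0 0 * * * * makes the function run at the top of every hour. In function.json, the updated binding would look like the following sketch (the binding name myTimer is assumed to be the template default):

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 * * * *"
    }
  ]
}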
Clean up resources
Other quick starts in this collection build upon this quick start. If you plan to work with subsequent quick starts,
tutorials, or with any of the services you have created in this quick start, do not clean up the resources.
Resources in Azure refers to function apps, functions, storage accounts, and so forth. They are grouped into
resource groups, and you can delete everything in a group by deleting the group.
You created resources to complete these quickstarts. You may be billed for these resources, depending on your
account status and service pricing. If you don't need the resources anymore, here's how to delete them:
1. In the Azure portal, go to the Resource group page.
To get to that page from the function app page, select the Overview tab and then select the link under
Resource group.
To get to that page from the dashboard, select Resource groups, and then select the resource group that
you used for this quickstart.
2. In the Resource group page, review the list of included resources, and verify that they are the ones you
want to delete.
3. Select Delete resource group, and follow the instructions.
Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can
also select the bell icon at the top of the page to view the notification.
Next steps
You have created a function that runs based on a schedule. For more information about timer triggers, see
Schedule code execution with Azure Functions.
Now that you have created your first function, let's add an output binding to the function that writes a message to
a Storage queue.
Add messages to an Azure Storage queue using Functions
Store unstructured data using Azure Functions and
Azure Cosmos DB
1/24/2019 • 5 minutes to read • Edit Online
Azure Cosmos DB is a great way to store unstructured and JSON data. Combined with Azure Functions, Cosmos
DB makes storing data quick and easy with much less code than required for storing data in a relational database.
NOTE
At this time, the Azure Cosmos DB trigger, input bindings, and output bindings work with SQL API and Graph API accounts
only.
In Azure Functions, input and output bindings provide a declarative way to connect to external service data from
your function. In this article, learn how to update an existing function to add an output binding that stores
unstructured data in an Azure Cosmos DB document.
Prerequisites
To complete this tutorial:
This topic uses as its starting point the resources created in Create your first function from the Azure portal. If you
haven't already done so, please complete these steps now to create your function app.
3. On the Create Azure Cosmos DB Account page, enter the basic settings for the new Azure Cosmos
account.
SETTING | SUGGESTED VALUE | DESCRIPTION
Account Name | Enter a unique name | Enter a name to identify your Azure Cosmos account. Because documents.azure.com is appended to the ID that you provide to create your URI, use a unique ID.
Location | Select the region closest to your users | Select a geographic location to host your Azure Cosmos DB account. Use the location that is closest to your users to give them the fastest access to the data.
4. Select Review + create. You can skip the Network and Tags sections.
5. Review the account settings, and then select Create. It takes a few minutes to create the account. Wait for
the portal page to display Your deployment is complete.
3. If you get an Extensions not installed message, choose Install to install the Azure Cosmos DB bindings
extension in the function app. Installation may take a minute or two.
4. Use the Azure Cosmos DB output settings as specified in the table:
SETTING | SUGGESTED VALUE | DESCRIPTION
If true, creates the Cosmos DB database and collection | Checked | The collection doesn't already exist, so create it.
Azure Cosmos DB account connection | New setting | Select New, then choose your Subscription, the Database account you created earlier, and Select. This creates an application setting for your account connection. This setting is used by the binding to connect to the database.
#r "Newtonsoft.Json"
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
public static IActionResult Run(HttpRequest req, out object taskDocument, ILogger log)
{
string name = req.Query["name"];
string task = req.Query["task"];
string duedate = req.Query["duedate"];
This code sample reads the HTTP request query strings and assigns them to fields in the taskDocument object. The taskDocument output binding sends the object data from this binding parameter to be stored in the bound document database. The database is created the first time the function runs.
4. Choose your Azure Cosmos DB account, then select the Data Explorer.
5. Expand the Collections nodes, select the new document, and confirm that the document contains your
query string values, along with some additional metadata.
You've successfully added a binding to your HTTP trigger to store unstructured data in an Azure Cosmos DB document.
Clean up resources
In the preceding steps, you created Azure resources in a resource group. If you don't expect to need these
resources in the future, you can delete them by deleting the resource group.
From the left menu in the Azure portal, select Resource groups and then select myResourceGroup.
On the resource group page, make sure that the listed resources are the ones you want to delete.
Select Delete, type myResourceGroup in the text box, and then select Delete.
Next steps
For more information about binding to a Cosmos DB database, see Azure Functions Cosmos DB bindings.
Azure Functions triggers and bindings concepts
Learn how Functions integrates with other services.
Azure Functions developer reference
Provides more technical information about the Functions runtime and a reference for coding functions and
defining triggers and bindings.
Code and test Azure Functions locally
Describes the options for developing your functions locally.
Add an Azure Storage queue binding to your
function
4/28/2019 • 6 minutes to read • Edit Online
Azure Functions lets you connect Azure services and other resources to functions without having to write your
own integration code. These bindings, which represent both input and output, are declared within the function
definition. Data from bindings is provided to the function as parameters. A trigger is a special type of input
binding. While a function has only one trigger, it can have multiple input and output bindings. To learn more, see
Azure Functions triggers and bindings concepts.
This article shows you how to integrate the function you created in the previous quickstart article with an Azure
Storage queue. The output binding that you add to this function writes data from the HTTP request to a message
in the queue.
Most bindings require a stored connection string that Functions uses to access the bound service. To make it
easier, you use the Storage account that you created with your function app. The connection to this account is
already stored in an app setting named AzureWebJobsStorage .
Prerequisites
Before you start this article, complete the steps in part 1 of the Python quickstart.
IMPORTANT
Because it contains secrets, the local.settings.json file never gets published, and it should be excluded from source control.
You need the value AzureWebJobsStorage , which is the Storage account connection string. You use this connection
to verify that the output binding works as expected.
Your function.json file should now look like the following example:
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "function",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "$return"
},
{
"type": "queue",
"direction": "out",
"name": "msg",
"queueName": "outqueue",
"connection": "AzureWebJobsStorage"
}
]
}
Update __init__.py so that the new msg parameter writes to the queue. With the imports and function signature included (reconstructed here to match the bindings in function.json), the file looks like the following:
import logging
import azure.functions as func

def main(req: func.HttpRequest, msg: func.Out[func.QueueMessage]) -> func.HttpResponse:
    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')

    if name:
        # Write the name to the bound Storage queue, then return the greeting.
        msg.set(name)
        return func.HttpResponse(f"Hello {name}!")
    else:
        return func.HttpResponse(
            "Please pass a name on the query string or in the request body",
            status_code=400
        )
By using an output binding, you don't have to use the Azure Storage SDK code for authentication, getting a queue
reference, or writing data. The Functions runtime and queue output binding do those tasks for you.
NOTE
Because the previous article had you enable extension bundles in the host.json, the Storage binding extension was
downloaded and installed for you during startup.
Copy the URL of your HttpTrigger function from the runtime output and paste it into your browser's address bar.
Append the query string ?name=<yourname> to this URL and execute the request. You should see the same
response in the browser as you did in the previous article.
This time, the output binding also creates a queue named outqueue in your Storage account and adds a message
with this same string.
Next, you use the Azure CLI to view the new queue and verify that a message was added. You can also view your
queue using the Microsoft Azure Storage Explorer or in the Azure portal.
Set the Storage account connection
Open the local.settings.json file and copy the value of AzureWebJobsStorage , which is the Storage account
connection string. Set the AZURE_STORAGE_CONNECTION_STRING environment variable to the connection string using
the following Bash command:
export AZURE_STORAGE_CONNECTION_STRING="<STORAGE_CONNECTION_STRING>"
With the connection string set in the AZURE_STORAGE_CONNECTION_STRING environment variable, you can access your
Storage account without having to provide authentication each time.
Query the Storage queue
You can use the az storage queue list command to view the Storage queues in your account, as in the following
example:
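az storage queue list --output tsv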
The output from this command includes a queue named outqueue , which is the queue that was created when the
function ran.
Next, use the az storage message peek command to view the messages in this queue, as in the following example.
echo `echo $(az storage message peek --queue-name outqueue -o tsv --query '[].{Message:content}') | base64 --decode`
The string returned should be the same as the message you sent to test the function.
NOTE
The previous example decodes the returned string from base64. This is because the Queue storage bindings write to and
read from Azure Storage as base64 strings.
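The output that follows comes from publishing the project to Azure with Functions Core Tools; the command is similar to the following, where <APP_NAME> is the name of your function app:

func azure functionapp publish <APP_NAME>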
You will see output similar to the following, which has been truncated for readability.
Preparing archive...
Uploading content...
Upload completed successfully.
Deployment completed successfully.
Syncing triggers...
Functions in myfunctionapp:
HttpTrigger - [httpTrigger]
Invoke url: https://myfunctionapp.azurewebsites.net/api/httptrigger?code=cCr8sAxfBiow548FBDLS1....
Copy the Invoke URL value for your HttpTrigger, which you can now use to test your function in Azure. The URL
contains a code query string value that is your function key. This key makes it difficult for others to call your HTTP
trigger endpoint in Azure.
Again, you can use cURL or a browser to test the deployed function. As before, append the query string &name=<yourname> to the URL, as in the following example:
curl "https://myfunctionapp.azurewebsites.net/api/httptrigger?code=cCr8sAxfBiow548FBDLS1....&name=<yourname>"
You can examine the Storage queue again to verify that the output binding generates a new message in the queue.
Clean up resources
Other quickstarts in this collection build upon this quickstart. If you plan to continue on to work with subsequent
quickstarts or with the tutorials, do not clean up the resources created in this quickstart. If you do not plan to
continue, use the following command to delete all resources created by this quickstart:
Next steps
You've updated your HTTP triggered function to write data to a Storage queue. To learn more about developing
Azure Functions using Python, see the Azure Functions Python developer guide and Azure Functions triggers and
bindings.
Next, you should enable Application Insights monitoring for your function app:
Enable Application Insights integration
Add messages to an Azure Storage queue using
Functions
3/15/2019 • 6 minutes to read • Edit Online
In Azure Functions, input and output bindings provide a declarative way to make data from external services
available to your code. In this quickstart, you use an output binding to create a message in a queue when a
function is triggered by an HTTP request. You use Azure Storage Explorer to view the queue messages that your
function creates:
Prerequisites
To complete this quickstart:
Follow the directions in Create your first function from the Azure portal and don't do the Clean up
resources step. That quickstart creates the function app and function that you use here.
Install Microsoft Azure Storage Explorer. This is a tool you'll use to examine queue messages that your
output binding creates.
5. If you get an Extensions not installed message, choose Install to install the Storage bindings extension
in the function app. This may take a minute or two.
6. Under Azure Queue Storage output, use the settings as specified in the table that follows this
screenshot:
SETTING | SUGGESTED VALUE | DESCRIPTION
Storage account connection | AzureWebJobsStorage | You can use the storage account connection already being used by your function app, or create a new one.
In the body of the function just before the return statement, add code that uses the parameter to create a
queue message.
outputQueueItem.Add("Name passed to the function: " + name);
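For context, the binding surfaces an ICollector<string> parameter whose name matches the binding's message parameter name. A minimal sketch of the full C# script function after this change might look like the following; the surrounding code is an approximation of the HTTP quickstart template, not the exact portal code:

public static IActionResult Run(HttpRequest req, ICollector<string> outputQueueItem, ILogger log)
{
    string name = req.Query["name"];

    // Write a message to the bound queue just before returning.
    outputQueueItem.Add("Name passed to the function: " + name);

    return name != null
        ? (IActionResult)new OkObjectResult("Hello " + name)
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
}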
Notice that the Request body contains the name value Azure. This value appears in the queue message
that is created when the function is invoked.
As an alternative to selecting Run here, you can call the function by entering a URL in a browser and
specifying the name value in the query string. The browser method is shown in the previous quickstart.
2. Check the logs to make sure that the function succeeded.
A new queue named outqueue is created in your Storage account by the Functions runtime when the output
binding is first used. You'll use Storage Explorer to verify that the queue and a message in it were created.
Connect Storage Explorer to your account
Skip this section if you have already installed Storage Explorer and connected it to the storage account that you're
using with this quickstart.
1. Run the Microsoft Azure Storage Explorer tool, select the connect icon on the left, choose Use a storage
account name and key, and select Next.
2. In the Azure portal, on the function app page, select your function and then select Integrate.
3. Select the Azure Queue storage output binding that you added in an earlier step.
4. Expand the Documentation section at the bottom of the page.
The portal shows credentials that you can use in Storage Explorer to connect to the storage account.
5. Copy the Account Name value from the portal and paste it in the Account name box in Storage
Explorer.
6. Click the show/hide icon next to Account Key to display the value, and then copy the Account Key value
and paste it in the Account key box in Storage Explorer.
7. Select Next > Connect.
Examine the output queue
1. In Storage Explorer, select the storage account that you're using for this quickstart.
2. Expand the Queues node, and then select the queue named outqueue.
The queue contains the message that the queue output binding created when you ran the HTTP-triggered
function. If you invoked the function with the default name value of Azure, the queue message is Name
passed to the function: Azure.
3. Run the function again, and you'll see a new message appear in the queue.
Clean up resources
Other quick starts in this collection build upon this quick start. If you plan to work with subsequent quick starts,
tutorials, or with any of the services you have created in this quick start, do not clean up the resources.
Resources in Azure refers to function apps, functions, storage accounts, and so forth. They are grouped into
resource groups, and you can delete everything in a group by deleting the group.
You created resources to complete these quickstarts. You may be billed for these resources, depending on your
account status and service pricing. If you don't need the resources anymore, here's how to delete them:
1. In the Azure portal, go to the Resource group page.
To get to that page from the function app page, select the Overview tab and then select the link under
Resource group.
To get to that page from the dashboard, select Resource groups, and then select the resource group that
you used for this quickstart.
2. In the Resource group page, review the list of included resources, and verify that they are the ones you
want to delete.
3. Select Delete resource group, and follow the instructions.
Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can
also select the bell icon at the top of the page to view the notification.
Next steps
In this quickstart, you added an output binding to an existing function. For more information about binding to
Queue storage, see Azure Functions Storage queue bindings.
Azure Functions triggers and bindings concepts
Learn how Functions integrates with other services.
Azure Functions developer reference
Provides more technical information about the Functions runtime and a reference for coding functions and
defining triggers and bindings.
Code and test Azure Functions locally
Describes the options for developing your functions locally.
Create a function that integrates with Azure Logic
Apps
3/15/2019 • 9 minutes to read • Edit Online
Azure Functions integrates with Azure Logic Apps in the Logic Apps Designer. This integration lets you use the
computing power of Functions in orchestrations with other Azure and third-party services.
This tutorial shows you how to use Functions with Logic Apps and Cognitive Services on Azure to run sentiment
analysis from Twitter posts. An HTTP triggered function categorizes tweets as green, yellow, or red based on the
sentiment score. An email is sent when poor sentiment is detected.
Prerequisites
An active Twitter account.
An Outlook.com account (for sending notifications).
This article uses as its starting point the resources created in Create your first function from the Azure portal.
If you haven't already done so, complete these steps now to create your function app.
SETTING | SUGGESTED VALUE | DESCRIPTION
Resource group | myResourceGroup | Use the same resource group for all services in this tutorial.
6. In the left navigation column, click Keys, and then copy the value of Key 1 and set it aside in a text editor.
You use the key to connect the logic app to your Cognitive Services API.
SETTING | SUGGESTED VALUE | DESCRIPTION
App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
#r "Newtonsoft.Json"
using System;
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
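The body of the function is not shown above. A minimal sketch that is consistent with the test step below (a score of 0.2 returns RED) might look like this; the exact thresholds are assumptions:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    // The request body contains a sentiment score between 0 and 1.
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    decimal score = Convert.ToDecimal(requestBody);

    // Map the score to a color category (thresholds assumed).
    string value = score > 0.8m ? "GREEN" : (score > 0.6m ? "YELLOW" : "RED");
    return new OkObjectResult(value);
}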
This function code returns a color category based on the sentiment score received in the request.
4. To test the function, click Test at the far right to expand the Test tab. Type a value of 0.2 for the Request
body, and then click Run. A value of RED is returned in the body of the response.
Now you have a function that categorizes sentiment scores. Next, you create a logic app that integrates your
function with your Twitter and Cognitive Services API.
4. Once you have entered the proper settings values, click Create to create your logic app.
5. After the app is created, click your new logic app pinned to the dashboard. Then in the Logic Apps
Designer, scroll down and click the Blank Logic App template.
You can now use the Logic Apps Designer to add services and triggers to your app.
Connect to Twitter
First, create a connection to your Twitter account. The logic app polls for tweets, which trigger the app to run.
1. In the designer, click the Twitter service, and click the When a new tweet is posted trigger. Sign in to
your Twitter account and authorize Logic Apps to use your account.
2. Use the Twitter trigger settings as specified in the table.
3. Type a connection name such as MyCognitiveServicesConnection , paste the key for your Cognitive Services
API and the Cognitive Services endpoint you set aside in a text editor, and click Create.
4. Next, enter Tweet Text in the text box and then click on New Step.
Now that sentiment detection is configured, you can add a connection to your function that consumes the
sentiment score output.
3. In IF TRUE, click Add an action, search for outlook.com , click Send an email, and sign in to your
Outlook.com account.
NOTE
If you don't have an Outlook.com account, you can choose another connector, such as Gmail or Office 365 Outlook
4. In the Send an email action, use the email settings as specified in the table.
SETTING | SUGGESTED VALUE | DESCRIPTION
To | Type your email address | The email address that receives the notification.
Subject | Negative tweet sentiment detected | The subject line of the email notification.
Body | Tweet text, Location | Click the Tweet text and Location parameters.
1. Click Save.
Now that the workflow is complete, you can enable the logic app and see the function at work.
5. When a potentially negative sentiment is detected, you receive an email. If you haven't received an email,
you can change the function code to return RED every time:
After you have verified email notifications, change back to the original code:
Now you have seen how easy it is to integrate Functions into a Logic Apps workflow.
Next steps
In this tutorial, you learned how to:
Create a Cognitive Services API Resource.
Create a function that categorizes tweet sentiment.
Create a logic app that connects to Twitter.
Add sentiment detection to the logic app.
Connect the logic app to the function.
Send an email based on the response from the function.
Advance to the next tutorial to learn how to create a serverless API for your function.
Create a serverless API using Azure Functions
To learn more about Logic Apps, see Azure Logic Apps.
Create a serverless API using Azure Functions
2/4/2019 • 6 minutes to read • Edit Online
In this tutorial, you will learn how Azure Functions allows you to build highly scalable APIs. Azure Functions comes
with a collection of built-in HTTP triggers and bindings, which make it easy to author an endpoint in a variety of
languages, including Node.js, C#, and more. In this tutorial, you will customize an HTTP trigger to handle specific
actions in your API design. You will also prepare for growing your API by integrating it with Azure Functions
Proxies and setting up mock APIs. All of this is accomplished on top of the Functions serverless compute
environment, so you don't have to worry about scaling resources - you can just focus on your API logic.
Prerequisites
This topic uses as its starting point the resources created in Create your first function from the Azure portal. If you
haven't already done so, please complete these steps now to create your function app.
The resulting function will be used for the rest of this tutorial.
Sign in to Azure
Open the Azure portal. To do this, sign in to https://portal.azure.com with your Azure account.
SETTING | SUGGESTED VALUE | DESCRIPTION
Allowed HTTP methods | Selected methods | Determines which HTTP methods may be used to invoke this function.
NOTE
Note that you did not include the /api base path prefix in the route template, as this is handled by a global setting.
3. Click Save.
You can learn more about customizing HTTP functions in Azure Functions HTTP bindings.
Test your API
Next, test your function to see it working with the new API surface.
1. Navigate back to the development page by clicking on the function's name in the left navigation.
2. Click Get function URL and copy the URL. You should see that it uses the /api/hello route now.
3. Copy the URL into a new browser tab or your preferred REST client. Browsers will use GET by default.
4. Add parameters to the query string in your URL e.g. /api/hello/?name=John
5. Hit 'Enter' to confirm that it is working. You should see the response "Hello John"
6. You can also try calling the endpoint with another HTTP method to confirm that the function is not executed.
For this, you will need to use a REST client, such as cURL, Postman, or Fiddler.
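For example, with cURL you might send a POST to the same route (the app name here is hypothetical); because the method isn't allowed, you should receive an error status instead of the greeting:

curl --request POST "https://<YOURAPP>.azurewebsites.net/api/hello/?name=John"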
Proxies overview
In the next section, you will surface your API through a proxy. Azure Functions Proxies allows you to forward requests to other resources. You define an HTTP endpoint just as you would for an HTTP trigger, but instead of writing code to execute when that endpoint is called, you provide a URL to a remote implementation. This allows you to compose multiple API sources into a single API surface, which is easy for clients to consume. This is particularly useful if you wish to build your API as microservices.
A proxy can point to any HTTP resource, such as:
Azure Functions
API apps in Azure App Service
Docker containers in App Service on Linux
Any other hosted API
To learn more about proxies, see Working with Azure Functions Proxies.
NOTE
App settings are recommended for the host configuration to prevent a hard-coded environment dependency for the
proxy. Using app settings means that you can move the proxy configuration between environments, and the
environment-specific app settings will be applied.
4. Click Save.
Creating a proxy on the frontend
1. Navigate back to your frontend function app in the portal.
2. In the left-hand navigation, click the plus sign '+' next to "Proxies".
4. Note that Proxies does not provide the /api base path prefix, and this must be included in the route
template.
5. The %HELLO_HOST% syntax will reference the app setting you created earlier. The resolved URL will point to
your original function.
6. Click Create.
7. You can try out your new proxy by copying the Proxy URL and testing it in the browser or with your favorite
HTTP client.
a. For an anonymous function use:
https://YOURPROXYAPP.azurewebsites.net/api/remotehello?name="Proxies"
b. For a function with authorization use:
https://YOURPROXYAPP.azurewebsites.net/api/remotehello?code=YOURCODE&name="Proxies"
{
"$schema": "http://json.schemastore.org/proxies",
"proxies": {
"HelloProxy": {
"matchCondition": {
"route": "/api/remotehello"
},
"backendUri": "https://%HELLO_HOST%/api/hello"
}
}
}
Next you'll add your mock API. Replace your proxies.json file with the following:
{
"$schema": "http://json.schemastore.org/proxies",
"proxies": {
"HelloProxy": {
"matchCondition": {
"route": "/api/remotehello"
},
"backendUri": "https://%HELLO_HOST%/api/hello"
},
"GetUserByName" : {
"matchCondition": {
"methods": [ "GET" ],
"route": "/api/users/{username}"
},
"responseOverrides": {
"response.statusCode": "200",
"response.headers.Content-Type" : "application/json",
"response.body": {
"name": "{username}",
"description": "Awesome developer and master of serverless APIs",
"skills": [
"Serverless",
"APIs",
"Azure",
"Cloud"
]
}
}
}
}
}
This adds a new proxy, "GetUserByName", without the backendUri property. Instead of calling another resource, it
modifies the default response from Proxies using a response override. Request and response overrides can also be
used in conjunction with a backend URL. This is particularly useful when proxying to a legacy system, where you
might need to modify headers, query parameters, etc. To learn more about request and response overrides, see
Modifying requests and responses in Proxies.
Test your mock API by calling the <YourProxyApp>.azurewebsites.net/api/users/{username} endpoint using a
browser or your favorite REST client. Be sure to replace {username} with a string value representing a username.
Next steps
In this tutorial, you learned how to build and customize an API on Azure Functions. You also learned how to bring
multiple APIs, including mocks, together as a unified API surface. You can use these techniques to build out APIs of
any complexity, all while running on the serverless compute model provided by Azure Functions.
The following references may be helpful as you develop your API further:
Azure Functions HTTP bindings
Working with Azure Functions Proxies
Documenting an Azure Functions API (preview)
Create an OpenAPI definition for a function with
Azure API Management
5/9/2019 • 6 minutes to read • Edit Online
REST APIs are often described using an OpenAPI definition. This definition contains information about what
operations are available in an API and how the request and response data for the API should be structured.
In this tutorial, you create a function that determines whether an emergency repair on a wind turbine is cost-effective. You then create an OpenAPI definition for the function app using Azure API Management so that the function can be called from other apps and services.
In this tutorial, you learn how to:
Create a function in Azure
Generate an OpenAPI definition using Azure API Management
Test the definition by calling the function
Download the OpenAPI definition
SETTING | SUGGESTED VALUE | DESCRIPTION
App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
This function code returns a message of Yes or No to indicate whether an emergency repair is cost-effective, as well as the revenue opportunity that the turbine represents, and the cost to fix the turbine.
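The full function body is not reproduced above. The following sketch produces the documented response for the sample request below; the constants are hypothetical values chosen only so the arithmetic matches (2500 kW x 6 hours x $0.48 = $7200), not the tutorial's actual figures:

#r "Newtonsoft.Json"

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    // Hypothetical constants (see lead-in); the real tutorial defines its own.
    const double revenuePerKwHour = 0.48;
    const double costToFix = 1600;

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    double hours = (double)data.hours;
    double capacity = (double)data.capacity;

    double revenueOpportunity = capacity * hours * revenuePerKwHour;
    string message = revenueOpportunity > costToFix ? "Yes" : "No";

    return new OkObjectResult(new
    {
        message,
        revenueOpportunity = $"${revenueOpportunity:0}",
        costToFix = $"${costToFix:0}"
    });
}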
5. To test the function, click Test at the far right to expand the test tab. Enter the following value for the
Request body, and then click Run.
{
"hours": "6",
"capacity": "2500"
}
The following value is returned in the body of the response.
{"message":"Yes","revenueOpportunity":"$7200","costToFix":"$1600"}
Now you have a function that determines the cost-effectiveness of emergency repairs. Next, you generate an
OpenAPI definition for the function app.
2. Use the API Management settings as specified in the table below the image.
6. In the Create from Function App page, accept the defaults and select Create
The API is now created for the function.
{
"hours": "6",
"capacity": "2500"
}
Clean up resources
In the preceding steps, you created Azure resources in a resource group. If you don't expect to need these
resources in the future, you can delete them by deleting the resource group.
From the left menu in the Azure portal, select Resource groups and then select myResourceGroup.
On the resource group page, make sure that the listed resources are the ones you want to delete.
Select Delete, type myResourceGroup in the text box, and then select Delete.
Next steps
You have used API Management integration to generate an OpenAPI definition of your functions. You can now
edit the definition in API Management in the portal. You can also learn more about API Management.
Edit the OpenAPI definition in API Management
Tutorial: integrate Functions with an Azure virtual
network
5/17/2019 • 9 minutes to read • Edit Online
This tutorial shows you how to use Azure Functions to connect to resources in an Azure virtual network. You'll create a function that has access to both the internet and to a VM running WordPress in a virtual network.
Create a function app in the Premium plan
Deploy a WordPress site to a VM in a virtual network
Connect the function app to the virtual network
Create a function proxy to access WordPress resources
Request a WordPress file from inside the virtual network
NOTE
This tutorial creates a function app in the Premium plan. This hosting plan is currently in preview. For more information, see
Premium plan.
Topology
The following diagram shows the architecture of the solution that you create:
Functions running in the Premium plan have the same hosting capabilities as web apps in Azure App Service,
which includes the VNet Integration feature. To learn more about VNet Integration, including troubleshooting and
advanced configuration, see Integrate your app with an Azure virtual network.
Prerequisites
For this tutorial, it's important that you understand IP addressing and subnetting. You can start with this article
that covers the basics of addressing and subnetting. Many more articles and videos are available online.
If you don’t have an Azure subscription, create a free account before you begin.
4. Select Create new, type an App Service plan name, choose a Location in a region near you or near other
services your functions access, and then select Pricing tier.
SETTING | SUGGESTED VALUE | DESCRIPTION
App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
9. Select Go to resource to view your new function app. You can also select Pin to dashboard. Pinning
makes it easier to return to this function app resource from your dashboard.
You can pin the function app to the dashboard by selecting the pin icon in the upper right-hand corner. Pinning
makes it easier to return to this function app after you create your VM.
Create a VM inside a virtual network
Next, create a preconfigured VM that runs WordPress inside a virtual network ( WordPress LEMP7 Max
Performance by Jetware). A WordPress VM is used because of its low cost and convenience. This same scenario
works with any resource in a virtual network, such as REST APIs, App Service Environments, and other Azure
services.
1. In the portal, choose + Create a resource on the left navigation pane, in the search field type
WordPress LEMP7 Max Performance , and press Enter.
2. Choose Wordpress LEMP Max Performance in the search results. Select a software plan of Wordpress
LEMP Max Performance for CentOS as the Software Plan and select Create.
3. In the Basics tab, use the VM settings as specified in the table below the image:
4. Choose the Networking tab and under Configure virtual networks select Create new.
5. In Create virtual network, use the settings in the table below the image:
SETTING | SUGGESTED VALUE | DESCRIPTION
Address range (subnet) | 10.10.1.0/24 | The subnet size defines how many interfaces can be added to the subnet. This subnet is used by the WordPress site. A /24 subnet provides 254 host addresses.
SETTING | SUGGESTED VALUE | DESCRIPTION
Virtual network address block | 10.10.0.0/16 | Choose the same address block used by the WordPress site. You should only have one address block defined.
5. Select OK to add the subnet. Close the VNet Integration and Network Feature Status pages to return to
your function app page.
The function app can now access the virtual network where the WordPress site is running. Next, you use Azure
Functions Proxies to return a file from the WordPress site.
Try it out
1. In your browser, try to access the URL you used as the Backend URL. As expected, the request times out. A
timeout occurs because your WordPress site is connected only to your virtual network and not the internet.
2. Copy the Proxy URL value from your new proxy and paste it into the address bar of your browser. The
returned image is from the WordPress site running inside your virtual network.
Your function app is connected to both the internet and your virtual network. The proxy is receiving a request over
the public internet, and then acting as a simple HTTP proxy to forward that request to the connected virtual
network. The proxy then relays the response back to you publicly over the internet.
Clean up resources
In the preceding steps, you created Azure resources in a resource group. If you don't expect to need these
resources in the future, you can delete them by deleting the resource group.
From the left menu in the Azure portal, select Resource groups and then select myResourceGroup.
On the resource group page, make sure that the listed resources are the ones you want to delete.
Select Delete, type myResourceGroup in the text box, and then select Delete.
Next steps
In this tutorial, the WordPress site serves as an API that is called by using a proxy in the function app. This
scenario makes a good tutorial because it's easy to set up and visualize. You could use any other API deployed
within a virtual network. You could also have created a function with code that calls APIs deployed within the
virtual network. A more realistic scenario is a function that uses data client APIs to call a SQL Server instance
deployed in the virtual network.
Functions running in a Premium plan share the same underlying App Service infrastructure as web apps on
PremiumV2 plans. All the documentation for web apps in Azure App Service applies to your Premium plan
functions.
Learn more about the networking options in Functions
Tutorial: Automate resizing uploaded images using
Event Grid
4/16/2019 • 9 minutes to read • Edit Online
Azure Event Grid is an eventing service for the cloud. Event Grid enables you to create subscriptions to events
raised by Azure services or third-party resources.
This tutorial is part two of a series of Storage tutorials. It extends the previous Storage tutorial to add serverless
automatic thumbnail generation using Azure Event Grid and Azure Functions. Event Grid enables Azure Functions
to respond to Azure Blob storage events and generate thumbnails of uploaded images. An event subscription is
created against the Blob storage create event. When a blob is added to a specific Blob storage container, a function
endpoint is called. Data passed to the function binding from Event Grid is used to access the blob and generate the
thumbnail image.
You use the Azure CLI and the Azure portal to add the resizing functionality to an existing image upload app.
.NET
Node.js V2 SDK
Node.js V10 SDK
In this tutorial, you learn how to:
Create a general Azure Storage account
Deploy serverless code using Azure Functions
Create a Blob storage event subscription in Event Grid
Prerequisites
NOTE
This article has been updated to use the new Azure PowerShell Az module. You can still use the AzureRM module, which will
continue to receive bug fixes until at least December 2020. To learn more about the new Az module and AzureRM
compatibility, see Introducing the new Azure PowerShell Az module. For Az module installation instructions, see Install Azure
PowerShell.
If you choose to install and use the CLI locally, this tutorial requires the Azure CLI version 2.0.14 or later. Run
az --version to find the version. If you need to install or upgrade, see Install Azure CLI.
If you are not using Cloud Shell, you must first sign in using az login .
resourceGroupName=myResourceGroup
2. Set a variable for the name of the new storage account that Azure Functions requires.
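For example (the variable name here is illustrative):

functionstorage=<name of the storage account to use with the function>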
Now you must configure the function app to connect to the Blob storage account you created in the previous
tutorial.
blobStorageAccount=<name of the Blob storage account you created in the previous tutorial>
storageConnectionString=$(az storage account show-connection-string --resource-group $resourceGroupName \
--name $blobStorageAccount --query connectionString --output tsv)
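The connection string and runtime version are then applied as app settings with a command along the following lines; the THUMBNAIL_* settings are assumptions based on the thumbnail scenario in this tutorial:

az functionapp config appsettings set --name <function_app> --resource-group $resourceGroupName \
  --settings AzureWebJobsStorage=$storageConnectionString THUMBNAIL_CONTAINER_NAME=thumbnails \
  THUMBNAIL_WIDTH=100 FUNCTIONS_EXTENSION_VERSION=~2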
The FUNCTIONS_EXTENSION_VERSION=~2 setting makes the function app run on version 2.x of the Azure Functions
runtime.
You can now deploy a function code project to this function app.
The image resize function is triggered by HTTP requests sent to it from the Event Grid service. You tell Event Grid
that you want to get these notifications at your function's URL by creating an event subscription. For this tutorial
you subscribe to blob-created events.
The data passed to the function from the Event Grid notification includes the URL of the blob. That URL is in turn
passed to the input binding to obtain the uploaded image from Blob storage. The function generates a thumbnail
image and writes the resulting stream to a separate container in Blob storage.
This project uses EventGridTrigger for the trigger type. Using the Event Grid trigger is recommended over
generic HTTP triggers. Event Grid automatically validates Event Grid Function triggers. With generic HTTP
triggers, you must implement the validation response.
.NET
Node.js V2 SDK
Node.js V10 SDK
To learn more about this function, see the function.json and run.csx files.
The function project code is deployed directly from the public sample repository. To learn more about deployment
options for Azure Functions, see Continuous deployment for Azure Functions.
2. Expand your function app, choose the Thumbnail function, and then select Add Event Grid subscription.
3. Use the event subscription settings as specified in the table.
SETTING | SUGGESTED VALUE | DESCRIPTION
Resource | Your Blob storage account | Choose the Blob storage account you created.
Event types | Blob created | Uncheck all types other than Blob created. Only event types of Microsoft.Storage.BlobCreated are passed to the function.
Now that the backend services are configured, you test the image resize functionality in the sample web app.
Create a function on Linux using a custom image
Azure Functions lets you host your functions on Linux in your own custom container. You can also host on a
default Azure App Service container. This functionality requires the Functions 2.x runtime.
In this tutorial, you learn how to deploy your functions to Azure as a custom Docker image. This pattern is useful
when you need to customize the built-in App Service container image. You may want to use a custom image when
your functions need a specific language version or require a specific dependency or configuration that isn't
provided within the built-in image. Supported base images for Azure Functions are found in the Azure Functions
base images repo. Python support is in preview at this time.
This tutorial walks you through how to use Azure Functions Core Tools to create a function in a custom Linux
image. You publish this image to a function app in Azure, which was created using the Azure CLI.
In this tutorial, you learn how to:
Create a function app and Dockerfile using Core Tools.
Build a custom image using Docker.
Publish a custom image to a container registry.
Create an Azure Storage account.
Create a Linux App Service plan.
Deploy a function app from Docker Hub.
Add application settings to the function app.
Enable continuous deployment
The following steps are supported on a Mac, Windows, or Linux computer.
Prerequisites
Before running this sample, you must have the following:
Install Azure Functions Core Tools version 2.x.
Install the Azure CLI. This article requires the Azure CLI version 2.0 or later. Run az --version to find the
version you have.
You can also use the Azure Cloud Shell.
An active Azure subscription.
If you don't have an Azure subscription, create a free account before you begin.
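The project is created with the Core Tools init command; a typical invocation looks like the following (MyFunctionProj is the project name used later in this tutorial):

func init MyFunctionProj --docker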
When you include the --docker option, a Dockerfile is generated for the project. This file is used to create a custom container in which to run the project. The base image used depends on the worker runtime language chosen.
When prompted, choose a worker runtime from the following languages:
dotnet : creates a .NET class library project (.csproj).
node : creates a JavaScript project.
python : creates a Python project.
NOTE
Python for Azure Functions is currently in preview. To receive important updates, subscribe to the Azure App Service
announcements repository on GitHub.
When the command executes, you see something like the following output:
Writing .gitignore
Writing host.json
Writing local.settings.json
Writing Dockerfile
Use the following command to navigate to the new MyFunctionProj project folder.
cd MyFunctionProj
Create a function
The following command creates an HTTP-triggered function named MyHttpTrigger.
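func new --name MyHttpTrigger --template "HttpTrigger"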
When the command executes, you see something like the following output:
The function "MyHttpTrigger" was created successfully from the "HttpTrigger" template.
[FunctionName("MyHttpTrigger")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
ILogger log)
JavaScript
Open the function.json file for your new function in a text editor, update the authLevel property in bindings to
anonymous , and save your changes.
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "$return"
}
]
Now you can call the function in Azure without having to supply the function key. The function key is never
required when running locally.
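To run the function locally, start the Functions host from the project folder:

func start --build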
The --build option is required to compile C# projects. You don't need this option for a JavaScript project.
When the Functions host starts, it writes something like the following output, which has been truncated for readability:
%%%%%%
%%%%%%
@ %%%%%% @
@@ %%%%%% @@
@@@ %%%%%%%%%%% @@@
@@ %%%%%%%%%% @@
@@ %%%% @@
@@ %%% @@
@@ %% @@
%%
%
...
...
Http Functions:
HttpTrigger: http://localhost:7071/api/MyHttpTrigger
Copy the URL of your HttpTrigger function from the runtime output and paste it into your browser's address bar.
Append the query string ?name=<yourname> to this URL and execute the request. The following shows the
response in the browser to the GET request returned by the local function:
Now that you have run your function locally, you can create the function app and other required resources in
Azure.
FROM mcr.microsoft.com/azure-functions/node:2.0
ENV AzureWebJobsScriptRoot=/home/site/wwwroot
COPY . /home/site/wwwroot
NOTE
The complete list of supported base images for Azure Functions can be found in the Azure Functions base image page.
Run the build command
In the root folder, run the docker build command, and provide a name, mydockerimage , and tag, v1.0.0 . Replace
<docker-id> with your Docker Hub account ID. This command builds the Docker image for the container.
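docker build --tag <docker-id>/mydockerimage:v1.0.0 .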
When the command executes, you see something like the following output, which in this case is for a JavaScript
worker runtime:
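To test the image locally, run it in a container, mapping the container's port 80 to local port 8080:

docker run -p 8080:80 -it <docker-id>/mydockerimage:v1.0.0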
With the custom image running in a local Docker container, verify the function app and container are functioning
correctly by browsing to http://localhost:8080.
You can optionally test your function again, this time in the local container using the following URL:
http://localhost:8080/api/myhttptrigger?name=<yourname>
After you have verified the function app in the container, stop the execution. Now, you can push the custom image
to your Docker Hub account.
A "login succeeded" message confirms that you are logged in. After you have signed in, you push the image to
Docker Hub by using the docker push command.
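docker push <docker-id>/mydockerimage:v1.0.0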
Now, you can use this image as the deployment source for a new function app in Azure.
You generally create your resource group and the resources in a region near you. To see all supported locations
for App Service plans, run the az appservice list-locations command.
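If you don't already have a resource group, create one first; the name and location below match those used elsewhere in this tutorial:

az group create --name myResourceGroup --location westeurope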
az storage account create --name <storage_name> --location westeurope --resource-group myResourceGroup \
  --sku Standard_LRS
After the storage account has been created, the Azure CLI shows information similar to the following example:
{
"creationTime": "2017-04-15T17:14:39.320307+00:00",
"id": "/subscriptions/bbbef702-e769-477b-9f16-bc4d3aa97387/resourceGroups/myresourcegroup/...",
"kind": "Storage",
"location": "westeurope",
"name": "myfunctionappstorage",
"primaryEndpoints": {
"blob": "https://myfunctionappstorage.blob.core.windows.net/",
"file": "https://myfunctionappstorage.file.core.windows.net/",
"queue": "https://myfunctionappstorage.queue.core.windows.net/",
"table": "https://myfunctionappstorage.table.core.windows.net/"
},
....
// Remaining output has been truncated for readability.
}
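The plan is created with a command similar to the following; --is-linux is required for custom containers, and the SKU shown here is an assumption:

az appservice plan create --name myAppServicePlan --resource-group myResourceGroup --sku S1 --is-linux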
When the App Service plan has been created, the Azure CLI shows information similar to the following example:
{
"adminSiteName": null,
"appServicePlanName": "myAppServicePlan",
"geoRegion": "West Europe",
"hostingEnvironmentProfile": null,
"id": "/subscriptions/0000-
0000/resourceGroups/myResourceGroup/providers/Microsoft.Web/serverfarms/myAppServicePlan",
"kind": "linux",
"location": "West Europe",
"maximumNumberOfWorkers": 1,
"name": "myAppServicePlan",
< JSON data removed for brevity. >
"targetWorkerSizeId": 0,
"type": "Microsoft.Web/serverfarms",
"workerTierName": null
}
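The app-creation command isn't shown in this excerpt; a sketch based on the resources and image created above:
az functionapp create --name <app_name> --storage-account <storage_name> --resource-group myResourceGroup --plan myAppServicePlan --deployment-container-image-name <docker-id>/mydockerimage:v1.0.0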
After the function app has been created, the Azure CLI shows information similar to the following example:
{
"availabilityState": "Normal",
"clientAffinityEnabled": true,
"clientCertEnabled": false,
"containerSize": 1536,
"dailyMemoryTimeQuota": 0,
"defaultHostName": "quickstart.azurewebsites.net",
"enabled": true,
"enabledHostNames": [
"quickstart.azurewebsites.net",
"quickstart.scm.azurewebsites.net"
],
....
// Remaining output has been truncated for readability.
}
The deployment-container-image-name parameter indicates the image hosted on Docker Hub to use to create the
function app. Use the az functionapp config container show command to view information about the image used
for deployment. Use the az functionapp config container set command to deploy from a different image.
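For example (a sketch; the v2.0.0 tag is hypothetical):
az functionapp config container show --name <app_name> --resource-group myResourceGroup
az functionapp config container set --name <app_name> --resource-group myResourceGroup --docker-custom-image-name <docker-id>/mydockerimage:v2.0.0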
NOTE
If your container is private, you have to set the following application settings as well:
DOCKER_REGISTRY_SERVER_USERNAME
DOCKER_REGISTRY_SERVER_PASSWORD
You have to stop and then start your function app for these values to be picked up.
curl "https://myfunctionapp.azurewebsites.net/api/httptrigger?code=cCr8sAxfBiow548FBDLS1....&name=<yourname>"
You can also paste the copied URL into the address bar of your web browser. Again, append the query string
&name=<yourname> to the URL before you execute the request.
2. Create an Application Insights resource by using the settings specified in the table below the image:
SETTING   SUGGESTED VALUE   DESCRIPTION
Name      Unique app name   It's easiest to use the same name as your function app, which must be unique in your subscription.
3. Choose OK. The Application Insights resource is created in the same resource group and subscription as
your function app. After creation completes, close the Application Insights window.
4. Back in your function app, select Application settings, and scroll down to Application settings. When
you see a setting named APPINSIGHTS_INSTRUMENTATIONKEY, it means that Application Insights integration is
enabled for your function app running in Azure.
To learn more, see Monitor Azure Functions.
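The command that enables continuous deployment isn't shown in this excerpt; a sketch, assuming the app and resource group names used earlier:
az functionapp deployment container config --enable-cd true --name <app_name> --resource-group myResourceGroup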
This command returns the deployment webhook URL after continuous deployment is enabled. You can also use
the az functionapp deployment container show-cd-url command to return this URL.
Copy the deployment URL and browse to your Docker Hub repo, choose the Webhooks tab, enter a name for the
webhook, paste your URL in Webhook URL, and then choose the plus sign (+).
With the webhook set, any updates to the linked image in Docker Hub result in the function app downloading and
installing the latest image.
Clean up resources
Other quickstarts in this collection build upon this quickstart. If you plan to continue on to work with subsequent
quickstarts or with the tutorials, do not clean up the resources created in this quickstart. If you do not plan to
continue, use the following command to delete all resources created by this quickstart:
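The delete command isn't shown in this excerpt; a sketch, assuming the resource group name used earlier:
az group delete --name myResourceGroup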
Next steps
In this tutorial, you learned how to:
Create a function app and Dockerfile using Core Tools.
Build a custom image using Docker.
Publish a custom image to a container registry.
Create an Azure Storage account.
Create a Linux App Service plan.
Deploy a function app from Docker Hub.
Add application settings to the function app.
Learn how to enable continuous integration functionality built into the core App Service platform. You can
configure your function app so that the container is redeployed when you update your image in Docker Hub.
Continuous deployment with Web App for Containers
Tutorial: Deploy Azure functions as IoT Edge modules
4/26/2019 • 9 minutes to read • Edit Online
You can use Azure Functions to deploy code that implements your business logic directly to your Azure IoT Edge
devices. This tutorial walks you through creating and deploying an Azure function that filters sensor data on the
simulated IoT Edge device. You use the simulated IoT Edge device that you created in the Deploy Azure IoT Edge
on a simulated device on Windows or Linux quickstarts. In this tutorial, you learn how to:
Use Visual Studio Code to create an Azure function.
Use VS Code and Docker to create a Docker image and publish it to a container registry.
Deploy the module from the container registry to your IoT Edge device.
View filtered data.
NOTE
Azure Function modules on Azure IoT Edge are in public preview.
The Azure function that you create in this tutorial filters the temperature data that's generated by your device. The
function only sends messages upstream to Azure IoT Hub when the temperature is above a specified threshold.
If you don't have an Azure subscription, create a free account before you begin.
Prerequisites
Before beginning this tutorial, you should have gone through the previous tutorial to set up your development
environment for Linux container development: Develop IoT Edge modules for Linux devices. By completing that
tutorial, you should have the following prerequisites in place:
A free or standard-tier IoT Hub in Azure.
A Linux device running Azure IoT Edge
A container registry, like Azure Container Registry.
Visual Studio Code configured with the Azure IoT Tools.
Docker CE configured to run Linux containers.
To develop an IoT Edge module with Azure Functions, install the following additional prerequisites on your
development machine:
C# for Visual Studio Code (powered by OmniSharp) extension.
The .NET Core 2.1 SDK.
FIELD                                            VALUE
Provide a solution name                          Enter a descriptive name for your solution, like FunctionSolution, or accept the default.
Provide Docker image repository for the module   An image repository includes the name of your container registry and the name of your container image. Your container image is prepopulated from the last step. Replace localhost:5000 with the login server value from your Azure container registry. You can retrieve the login server from the Overview page of your container registry in the Azure portal. The final string looks like <registry name>.azurecr.io/CSharpFunction.
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EdgeHub;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
namespace Functions.Samples
{
public static class CSharpFunction
{
[FunctionName("CSharpFunction")]
public static async Task FilterMessageAndSendMessage(
[EdgeHubTrigger("input1")] Message messageReceived,
[EdgeHub(OutputName = "output1")] IAsyncCollector<Message> output,
ILogger logger)
{
const int temperatureThreshold = 20;
byte[] messageBytes = messageReceived.GetBytes();
var messageString = System.Text.Encoding.UTF8.GetString(messageBytes);
if (!string.IsNullOrEmpty(messageString))
{
logger.LogInformation("Info: Received one non-empty message");
// Get the body of the message and deserialize it.
var messageBody = JsonConvert.DeserializeObject<MessageBody>(messageString);
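                // (The example is truncated here; the remainder below is a hedged reconstruction of the
                // tutorial's filter logic, including the MessageBody shape it deserializes into.)
                if (messageBody != null && messageBody.machine.temperature > temperatureThreshold)
                {
                    // Send the message to the output, because the temperature is above the threshold.
                    using (var filteredMessage = new Message(messageBytes))
                    {
                        foreach (KeyValuePair<string, string> prop in messageReceived.Properties)
                        {
                            filteredMessage.Properties.Add(prop.Key, prop.Value);
                        }
                        // Indicate that the message is an alert.
                        filteredMessage.Properties.Add("MessageType", "Alert");
                        await output.AddAsync(filteredMessage);
                        logger.LogInformation("Info: Received and transferred a message with temperature above the threshold");
                    }
                }
            }
        }
        // Minimal shapes assumed for deserialization:
        class MessageBody
        {
            public Machine machine { get; set; }
        }
        class Machine
        {
            public double temperature { get; set; }
        }
    }
}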
When you're prompted for the password, paste the password for your container registry and press Enter.
3. In the VS Code explorer, right-click the deployment.template.json file and select Build and Push IoT Edge
solution.
When you tell Visual Studio Code to build your solution, it first takes the information in the deployment template
and generates a deployment.json file in a new folder named config. Then it runs two commands in the integrated
terminal: docker build and docker push . These two commands build your code, containerize the functions, and
then push the code to the container registry that you specified when you initialized the solution.
Clean up resources
If you plan to continue to the next recommended article, you can keep the resources and configurations that you
created and reuse them. You can also keep using the same IoT Edge device as a test device.
Otherwise, you can delete the local configurations and the Azure resources that you created in this article to avoid
charges.
Delete Azure resources
Deleting Azure resources and resource groups is irreversible. Make sure that you don't accidentally delete the
wrong resource group or resources. If you created the IoT hub inside an existing resource group that has resources
that you want to keep, delete only the IoT hub resource itself, instead of deleting the resource group.
To delete the resources:
1. Sign in to the Azure portal and select Resource groups.
2. Select the name of the resource group that contains your IoT Edge test resources.
3. Review the list of resources contained in your resource group. If you want to delete all of them, you can
select Delete resource group. If you want to delete only some of them, you can click into each resource to
delete them individually.
Next steps
In this tutorial, you created an Azure function module with code to filter raw data that's generated by your IoT Edge
device. When you're ready to build your own modules, you can learn more about how to Develop with Azure IoT
Edge for Visual Studio Code.
Continue on to the next tutorials to learn other ways that Azure IoT Edge can help you turn data into business
insights at the edge.
Find averages by using a floating window in Azure Stream Analytics
Azure CLI Samples
3/12/2019 • 2 minutes to read • Edit Online
The following table includes links to bash scripts for Azure Functions that use the Azure CLI.
Create app
Create a function app for serverless execution Creates a function app in a Consumption plan.
Create a function app in an App Service plan Create a function app in a dedicated App Service plan.
Integrate
Create a function app and connect to a storage account Create a function app and connect it to a storage account.
Create a function app and connect to an Azure Cosmos DB Create a function app and connect it to an Azure Cosmos DB.
Continuous deployment
Deploy from GitHub Create a function app that deploys from a GitHub repository.
Deploy from Azure DevOps Create a function app that deploys from an Azure DevOps
repository.
Configure app
Map a custom domain to a function app Define a custom domain for your functions.
Bind an SSL certificate to a function app Upload SSL certificates for functions in a custom domain.
Azure Functions runtime versions overview
3/15/2019 • 8 minutes to read • Edit Online
There are two major versions of the Azure Functions runtime: 1.x and 2.x. The current version where new feature
work and improvements are being made is 2.x, though both are supported for production scenarios. The
following details some of the differences between the two, how you can create each version, and upgrade from
1.x to 2.x.
NOTE
This article refers to the cloud service Azure Functions. For information about the preview product that lets you run Azure
Functions on-premises, see the Azure Functions Runtime Overview.
Cross-platform development
The version 2.x runtime runs on .NET Core 2, which enables it to run on all platforms supported by .NET Core,
including macOS and Linux. Running on .NET Core enables cross-platform development and hosting scenarios.
By comparison, the version 1.x runtime only supports development and hosting in the Azure portal or on
Windows computers.
Languages
The version 2.x runtime uses a new language extensibility model. In version 2.x, all functions in a function app
must share the same language. The language of functions in a function app is chosen when creating the app.
Azure Functions 1.x experimental languages won't be updated to use the new model, so they aren't supported in
2.x. The following table indicates which programming languages are currently supported in each runtime
version.
For information about planned changes to language support, see Azure roadmap.
For more information, see Supported languages.
Version 1.x
<TargetFramework>net461</TargetFramework>
<AzureFunctionsVersion>v1</AzureFunctionsVersion>
Version 2.x
<TargetFramework>netcoreapp2.2</TargetFramework>
<AzureFunctionsVersion>v2</AzureFunctionsVersion>
When you debug or publish your project, the correct version of the runtime is used.
VS Code and Azure Functions Core Tools
Azure Functions Core Tools is used for command line development and also by the Azure Functions extension
for Visual Studio Code. To develop against version 2.x, install version 2.x of the Core Tools. Version 1.x
development requires version 1.x of the Core Tools. For more information, see Install the Azure Functions Core
Tools.
For Visual Studio Code development, you may also need to update the user setting for the
azureFunctions.projectRuntime to match the version of the tools installed. This setting also updates the
templates and languages used during function app creation.
Changing version of apps in Azure
The version of the Functions runtime used by published apps in Azure is dictated by the
FUNCTIONS_EXTENSION_VERSION application setting. A value of ~2 targets the version 2.x runtime and ~1 targets
the version 1.x runtime. Don't arbitrarily change this setting, because other app setting changes and code
changes in your functions are likely required. To learn about the recommended way to migrate your function
app to a different runtime version, see How to target Azure Functions runtime versions.
Bindings
The version 2.x runtime uses a new binding extensibility model that offers these advantages:
Support for third-party binding extensions.
Decoupling of runtime and bindings. This change allows binding extensions to be versioned and released
independently. You can, for example, opt to upgrade to a version of an extension that relies on a newer
version of an underlying SDK.
A lighter execution environment, where only the bindings in use are known and loaded by the runtime.
With the exception of HTTP and timer triggers, all bindings must be explicitly added to the function app project,
or registered in the portal. For more information, see Register binding extensions.
The following table shows the bindings that are supported in the two major versions of the Azure Functions
runtime.
TYPE                             1.X   2.X   TRIGGER   INPUT   OUTPUT
Blob Storage                     ✔     ✔     ✔         ✔       ✔
Cosmos DB                        ✔     ✔     ✔         ✔       ✔
Event Grid                       ✔     ✔     ✔
Event Hubs                       ✔     ✔     ✔                 ✔
HTTP & Webhooks                  ✔     ✔     ✔                 ✔
Microsoft Graph Excel tables           ✔               ✔       ✔
Microsoft Graph OneDrive files         ✔               ✔       ✔
Microsoft Graph Outlook email          ✔                       ✔
Microsoft Graph Events                 ✔     ✔         ✔       ✔
Microsoft Graph Auth tokens            ✔               ✔
Mobile Apps                      ✔                     ✔       ✔
Notification Hubs                ✔                             ✔
Queue storage                    ✔     ✔     ✔                 ✔
SendGrid                         ✔     ✔                       ✔
Service Bus                      ✔     ✔     ✔                 ✔
SignalR                                ✔               ✔       ✔
Table storage                    ✔     ✔               ✔       ✔
Timer                            ✔     ✔     ✔
Twilio                           ✔     ✔                       ✔
1 In 2.x, all bindings except HTTP and Timer must be registered. See Register binding extensions.
Timeout durations by plan and runtime version:
PLAN          RUNTIME   DEFAULT TIMEOUT (MINUTES)   MAXIMUM TIMEOUT (MINUTES)
Consumption   1.x       5                           10
Consumption   2.x       5                           10
NOTE
Regardless of the function app timeout setting, 230 seconds is the maximum amount of time that an HTTP triggered
function can take to respond to a request. This is because of the default idle timeout of Azure Load Balancer. For longer
processing times, consider using the Durable Functions async pattern or defer the actual work and return an immediate
response.
Next steps
For more information, see the following resources:
Code and test Azure Functions locally
How to target Azure Functions runtime versions
Release notes
Azure Functions Premium plan (preview)
5/23/2019 • 6 minutes to read • Edit Online
The Azure Functions Premium plan is a hosting option for function apps. The Premium plan provides features like
VNet connectivity, no cold start, and premium hardware. Multiple function apps can be deployed to the same
Premium plan, and the plan allows you to configure compute instance size, base plan size, and maximum plan size.
For a comparison of the Premium plan and other plan and hosting types, see function scale and hosting options.
NOTE
The Premium plan preview currently supports functions running in .NET, Node, or Java through Windows infrastructure.
4. Select Create new, type an App Service plan name, choose a Location in a region near you or near other
services your functions access, and then select Pricing tier.
5. Choose the EP1 (elastic Premium) plan, then select Apply.
6. Select OK to create the plan, then use the remaining function app settings as specified in the table below
the image.
SETTING    SUGGESTED VALUE        DESCRIPTION
App name   Globally unique name   Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
9. Select Go to resource to view your new function app. You can also select Pin to dashboard. Pinning
makes it easier to return to this function app resource from your dashboard.
You can also create a Premium plan and configure pre-warmed instances for an app from the Azure CLI.
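A sketch of both operations; the EP1 SKU matches the portal steps above, while the preWarmedInstanceCount property name is an assumption:
az functionapp plan create --resource-group myResourceGroup --name mypremiumplan --location westeurope --sku EP1
az resource update -g myResourceGroup -n <app_name>/config/web --set properties.preWarmedInstanceCount=2 --resource-type Microsoft.Web/sites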
IMPORTANT
You are charged for each instance allocated in the minimum instance count, regardless of whether functions are executing.
If your app requires instances beyond your plan size, it can continue to scale out until the number of instances hits
the maximum burst limit. You are billed for instances beyond your plan size only while they are running and
allocated to you. We make a best effort at scaling your app out to its defined maximum limit, whereas the
minimum plan instances are guaranteed for your app.
You can configure the plan size and maximums in the Azure portal by selecting the Scale Out options in the plan
or in a function app deployed to that plan (under Platform Features).
You can also increase the maximum burst limit from the Azure CLI:
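A sketch, assuming the plan name used above:
az functionapp plan update -g myResourceGroup -n mypremiumplan --max-burst 10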
Regions
Below are the currently supported regions for the public preview.
REGION
Australia East
Australia Southeast
Canada Central
Central US
East Asia
East US 2
France Central
Japan West
Korea Central
North Central US
North Europe
South Central US
South India
Southeast Asia
UK West
West Europe
West India
West US
Known Issues
You can track the status of known issues of the public preview on GitHub.
Next steps
Understand Azure Functions scale and hosting options
Azure Functions scale and hosting
5/17/2019 • 11 minutes to read • Edit Online
Azure Functions runs in two different plans: Consumption plan and Premium plan (public preview). The
Consumption plan automatically adds compute power when your code is running. Your app is scaled out when
needed to handle load, and scaled down when code stops running. You don't have to pay for idle VMs or reserve
capacity in advance. The Premium plan also automatically scales and adds additional compute power when your
code is running. The Premium plan comes with additional features like premium compute instances, the ability to
keep instances warm indefinitely, and VNet connectivity. If you have an existing App Service plan, you can also run
your function apps within it.
NOTE
Both Premium plan and Consumption plan for Linux are currently in preview.
If you aren't familiar with Azure Functions, see the Azure Functions overview.
When you create a function app, you choose the hosting plan for functions in the app. In either plan, an instance of
the Azure Functions host executes the functions. The type of plan controls:
How host instances are scaled out.
The resources that are available to each host.
Instance features like VNet connectivity.
NOTE
You can switch between Consumption and Premium plans by changing the plan property of the function app resource.
Consumption plan
When you're using the Consumption plan, instances of the Azure Functions host are dynamically added and
removed based on the number of incoming events. This serverless plan scales automatically, and you're charged for
compute resources only when your functions are running. On a Consumption plan, a function execution times out
after a configurable period of time.
Billing is based on number of executions, execution time, and memory used. Billing is aggregated across all
functions within a function app. For more information, see the Azure Functions pricing page.
The Consumption plan is the default hosting plan and offers the following benefits:
Pay only when your functions are running.
Scale out automatically, even during periods of high load.
NOTE
The premium plan preview currently supports functions running in .NET, Node, or Java through Windows infrastructure.
When running JavaScript functions on a Premium plan, you should choose an instance that has fewer vCPUs. For
more information, see Choose single-core Premium plans.
Timeout durations by plan and runtime version:
PLAN          RUNTIME   DEFAULT TIMEOUT (MINUTES)   MAXIMUM TIMEOUT (MINUTES)
Consumption   1.x       5                           10
Consumption   2.x       5                           10
NOTE
Regardless of the function app timeout setting, 230 seconds is the maximum amount of time that an HTTP triggered function
can take to respond to a request. This is because of the default idle timeout of Azure Load Balancer. For longer processing
times, consider using the Durable Functions async pattern or defer the actual work and return an immediate response.
You can also use the Azure CLI to determine the plan, as follows:
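The command isn't included in this excerpt; a sketch that looks up the plan's SKU tier for a function app (names are placeholders):
appServicePlanId=$(az functionapp show --name <app_name> --resource-group <resource_group> --query appServicePlanId --output tsv)
az appservice plan list --query "[?id=='$appServicePlanId'].sku.tier" --output tsv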
When the output from this command is dynamic, your function app is in the Consumption plan. When the output
from this command is ElasticPremium, your function app is in the Premium plan. All other values indicate tiers of
an App Service plan.
Even with Always On enabled, the execution timeout for individual functions is controlled by the functionTimeout
setting in the host.json project file.
Storage account requirements
On any plan, a function app requires a general Azure Storage account, which supports Azure Blob, Queue, Files,
and Table storage. This is because Functions rely on Azure Storage for operations such as managing triggers and
logging function executions, but some storage accounts do not support queues and tables. These accounts, which
include blob-only storage accounts (including premium storage) and general-purpose storage accounts with zone-
redundant storage replication, are filtered out from your existing Storage Account selections when you create a
function app.
To learn more about storage account types, see Introducing the Azure Storage services.
NOTE
When you're using a blob trigger on a Consumption plan, there can be up to a 10-minute delay in processing new blobs. This
delay occurs when a function app has gone idle. After the function app is running, blobs are processed immediately. To avoid
this cold-start delay, use the Premium plan, or use the Event Grid trigger. For more information, see the blob trigger binding
reference article.
Runtime scaling
Azure Functions uses a component called the scale controller to monitor the rate of events and determine whether
to scale out or scale in. The scale controller uses heuristics for each trigger type. For example, when you're using an
Azure Queue storage trigger, it scales based on the queue length and the age of the oldest queue message.
The unit of scale is the function app. When the function app is scaled out, additional resources are allocated to run
multiple instances of the Azure Functions host. Conversely, as compute demand is reduced, the scale controller
removes function host instances. The number of instances is eventually scaled down to zero when no functions are
running within a function app.
Understanding scaling behaviors
Scaling can vary based on a number of factors, and apps scale differently depending on the trigger and language
selected. However, there are a few aspects of scaling that exist in the system today:
A single function app only scales out to a maximum of 200 instances. A single instance may process more than
one message or request at a time though, so there isn't a set limit on the number of concurrent executions.
For HTTP triggers, new instances are allocated, at most, once per second.
For non-HTTP triggers, new instances are allocated, at most, once every 30 seconds.
Different triggers may also have different scaling limits, as documented below:
Event Hub
Best practices and patterns for scalable apps
There are many aspects of a function app that will impact how well it will scale, including host configuration,
runtime footprint, and resource efficiency. For more information, see the scalability section of the performance
considerations article. You should also be aware of how connections behave as your function app scales. For more
information, see How to manage connections in Azure Functions.
Billing model
Billing for the Consumption plan is described in detail on the Azure Functions pricing page. Usage is aggregated at
the function app level and counts only the time that function code is executed. The following are units for billing:
Resource consumption in gigabyte-seconds (GB-s). Computed as a combination of memory size and
execution time for all functions within a function app.
Executions. Counted each time a function is executed in response to an event trigger.
Useful queries and information on how to understand your consumption bill can be found on the billing FAQ.
Service limits
The following table indicates the limits that apply to function apps when running in the various hosting plans:
RESOURCE                    CONSUMPTION PLAN                                                                 PREMIUM PLAN                                          APP SERVICE PLAN 1
App Service plans           100 per region                                                                   100 per resource group                                100 per resource group
Custom domain SSL support   Not supported; a wildcard certificate for *.azurewebsites.net is available by default   Unbounded SNI SSL and 1 IP SSL connections included   Unbounded SNI SSL and 1 IP SSL connections included
1 For specific limits for the various App Service plan options, see the App Service plan limits.
2 By default, the timeout for the Functions 1.x runtime in an App Service plan is unbounded.
3 Requires the App Service plan be set to Always On. Pay at standard rates.
4 These limits are set in the host.
5 The actual number of function apps that you can host depends on the activity of the apps, the size of the machine instances, and the corresponding resource utilization.
6 In a Premium plan or an App Service plan, you can map a custom domain using either a CNAME or an A record; on a Consumption plan, only a CNAME is supported.
Azure Functions triggers and bindings concepts
2/22/2019 • 3 minutes to read • Edit Online
In this article you learn the high-level concepts surrounding functions triggers and bindings.
Triggers are what cause a function to run. A trigger defines how a function is invoked and a function
must have exactly one trigger. Triggers have associated data, which is often provided as the payload of
the function.
Binding to a function is a way of declaratively connecting another resource to the function; bindings may
be connected as input bindings, output bindings, or both. Data from bindings is provided to the function
as parameters.
You can mix and match different bindings to suit your needs. Bindings are optional and a function might
have one or multiple input and/or output bindings.
Triggers and bindings let you avoid hardcoding access to other services. Your function receives data (for
example, the content of a queue message) in function parameters. You send data (for example, to create a
queue message) by using the return value of the function.
Consider the following examples of how you could implement different functions.
The Event Grid is used to Event Grid Blob Storage and Cosmos SendGrid
read an image from Blob DB
Storage and a document
from Cosmos DB to send
an email.
The portal provides a UI for this configuration, but you can edit the file directly by opening the
Advanced editor available via the Integrate tab of your function.
In .NET, the parameter type defines the data type for input data. For instance, use string to bind to the
text of a queue trigger, a byte array to read as binary and a custom type to de-serialize to an object.
For languages that are dynamically typed such as JavaScript, use the dataType property in the
function.json file. For example, to read the content of an HTTP request in binary format, set dataType to
binary :
{
"dataType": "binary",
"type": "httpTrigger",
"name": "req",
"direction": "in"
}
Binding direction
All triggers and bindings have a direction property in the function.json file:
For triggers, the direction is always in
Input and output bindings use in and out
Some bindings support a special direction inout . If you use inout , only the Advanced editor is
available via the Integrate tab in the portal.
When you use attributes in a class library to configure triggers and bindings, the direction is provided in
an attribute constructor or inferred from the parameter type.
Supported bindings
The following table shows the bindings that are supported in the two major versions of the Azure
Functions runtime.
TYPE                             1.X   2.X   TRIGGER   INPUT   OUTPUT
Blob Storage                     ✔     ✔     ✔         ✔       ✔
Cosmos DB                        ✔     ✔     ✔         ✔       ✔
Event Grid                       ✔     ✔     ✔
Event Hubs                       ✔     ✔     ✔                 ✔
HTTP & Webhooks                  ✔     ✔     ✔                 ✔
Microsoft Graph Excel tables           ✔               ✔       ✔
Microsoft Graph OneDrive files         ✔               ✔       ✔
Microsoft Graph Outlook email          ✔                       ✔
Microsoft Graph Events                 ✔     ✔         ✔       ✔
Microsoft Graph Auth tokens            ✔               ✔
Mobile Apps                      ✔                     ✔       ✔
Notification Hubs                ✔                             ✔
Queue storage                    ✔     ✔     ✔                 ✔
SendGrid                         ✔     ✔                       ✔
Service Bus                      ✔     ✔     ✔                 ✔
SignalR                                ✔               ✔       ✔
Table storage                    ✔     ✔               ✔       ✔
Timer                            ✔     ✔     ✔
Twilio                           ✔     ✔                       ✔
1 In 2.x, all bindings except HTTP and Timer must be registered. See Register binding extensions.
For information about which bindings are in preview or are approved for production use, see Supported
languages.
Resources
Binding expressions and patterns
Using the Azure Function return value
How to register a binding expression
Testing:
Strategies for testing your code in Azure Functions
Manually run a non HTTP-triggered function
Handling binding errors
Next steps
Register Azure Functions binding extensions
Azure Functions trigger and binding example
2/22/2019 • 3 minutes to read • Edit Online
This article demonstrates how to configure a trigger and bindings in an Azure Function.
Suppose you want to write a new row to Azure Table storage whenever a new message appears in Azure Queue
storage. This scenario can be implemented using an Azure Queue storage trigger and an Azure Table storage
output binding.
Here's a function.json file for this scenario.
{
"bindings": [
{
"type": "queueTrigger",
"direction": "in",
"name": "order",
"queueName": "myqueue-items",
"connection": "MY_STORAGE_ACCT_APP_SETTING"
},
{
"type": "table",
"direction": "out",
"name": "$return",
"tableName": "outTable",
"connection": "MY_TABLE_STORAGE_ACCT_APP_SETTING"
}
]
}
The first element in the bindings array is the Queue storage trigger. The type and direction properties identify
the trigger. The name property identifies the function parameter that receives the queue message content. The
name of the queue to monitor is in queueName , and the connection string is in the app setting identified by
connection .
The second element in the bindings array is the Azure Table Storage output binding. The type and direction
properties identify the binding. The name property specifies how the function provides the new table row, in this
case by using the function return value. The name of the table is in tableName , and the connection string is in the
app setting identified by connection .
To view and edit the contents of function.json in the Azure portal, click the Advanced editor option on the
Integrate tab of your function.
NOTE
The value of connection is the name of an app setting that contains the connection string, not the connection string itself.
Bindings use connection strings stored in app settings to enforce the best practice that function.json does not contain service
secrets.
C# script example
Here's C# script code that works with this trigger and binding. Notice that the name of the parameter that provides
the queue message content is order; this name is required because the name property value in function.json is
order.
#r "Newtonsoft.Json"
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;
// From an incoming queue message that is a JSON object, add fields and write to Table storage
// The method return value creates a new row in Table Storage
public static Person Run(JObject order, ILogger log)
{
return new Person() {
PartitionKey = "Orders",
RowKey = Guid.NewGuid().ToString(),
Name = order["Name"].ToString(),
MobileNumber = order["MobileNumber"].ToString() };
}
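The Person type isn't defined in the snippet above; a minimal POCO that satisfies the Table storage output binding, using the property names from the code, might look like this:
public class Person
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public string Name { get; set; }
    public string MobileNumber { get; set; }
}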
JavaScript example
The same function.json file can be used with a JavaScript function:
// From an incoming queue message that is a JSON object, add fields and write to Table Storage
// The second parameter to context.done is used as the value for the new row
module.exports = function (context, order) {
order.PartitionKey = "Orders";
order.RowKey = generateRandomId();
context.done(null, order);
};
function generateRandomId() {
return Math.random().toString(36).substring(2, 15) +
Math.random().toString(36).substring(2, 15);
}
You now have a working function that is triggered by Azure Queue storage and outputs data to Azure Table storage.
Next steps
Azure Functions binding expression patterns
Register Azure Functions binding extensions
5/17/2019 • 2 minutes to read • Edit Online
In Azure Functions version 2.x, bindings are available as separate packages from the functions runtime. While
.NET functions access bindings through NuGet packages, extension bundles allow other functions access to all
bindings through a configuration setting.
Consider the following items related to binding extensions:
Binding extensions aren't explicitly registered in Functions 1.x except when creating a C# class library
using Visual Studio 2019.
HTTP and timer triggers are supported by default and don't require an extension.
The following table indicates when and how you register bindings.
DEVELOPMENT ENVIRONMENT                                    REGISTRATION IN FUNCTIONS 1.X   REGISTRATION IN FUNCTIONS 2.X
Non-.NET languages or local Azure Core Tools development   Automatic                       Use Azure Functions Core Tools and extension bundles
C# class library using Visual Studio 2019                  Use NuGet tools                 Use NuGet tools
C# class library using Visual Studio Code                  N/A                             Use .NET Core CLI
Extension bundles are registered by adding an extensionBundle section to the host.json file, as in the following example:
{
"version": "2.0",
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
}
}
The id property references the namespace for Microsoft Azure Functions extension bundles.
The version references the version of the bundle.
Bundle versions increment as packages in the bundle change. Major version changes happen only when
packages in the bundle move a major version. The version property uses the interval notation for specifying
version ranges. The Functions runtime always picks the maximum permissible version defined by the version
range or interval.
Once you reference the extension bundles in your project, then all default bindings are available to your
functions. The bindings available in the extension bundle are:
PACKAGE VERSION
Microsoft.Azure.WebJobs.Extensions.CosmosDB 3.0.3
Microsoft.Azure.WebJobs.Extensions.DurableTask 1.8.0
Microsoft.Azure.WebJobs.Extensions.EventGrid 2.0.0
Microsoft.Azure.WebJobs.Extensions.EventHubs 3.0.3
Microsoft.Azure.WebJobs.Extensions.SendGrid 3.0.0
Microsoft.Azure.WebJobs.Extensions.ServiceBus 3.0.3
Microsoft.Azure.WebJobs.Extensions.SignalRService 1.0.0
Microsoft.Azure.WebJobs.Extensions.Storage 3.0.4
Microsoft.Azure.WebJobs.Extensions.Twilio 3.0.0
The name of the package used for a given binding is provided in the reference article for that binding. For an
example, see the Packages section of the Service Bus binding reference article.
Replace <TARGET_VERSION> in the example with a specific version of the package, such as 3.0.0-beta5 . Valid
versions are listed on the individual package pages at NuGet.org. The major versions that correspond to
Functions runtime 1.x or 2.x are specified in the reference article for the binding.
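The example referred to above isn't included in this excerpt; in the Visual Studio Package Manager Console it would look something like this sketch, using the Service Bus extension:
Install-Package Microsoft.Azure.WebJobs.Extensions.ServiceBus -Version <TARGET_VERSION>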
The .NET Core CLI can only be used for Azure Functions 2.x development.
The name of the package to use for a given binding is provided in the reference article for that binding. For an
example, see the Packages section of the Service Bus binding reference article.
Replace <TARGET_VERSION> in the example with a specific version of the package, such as 3.0.0-beta5 . Valid
versions are listed on the individual package pages at NuGet.org. The major versions that correspond to
Functions runtime 1.x or 2.x are specified in the reference article for the binding.
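Again, the example isn't included in this excerpt; with the .NET Core CLI it would look something like this sketch:
dotnet add package Microsoft.Azure.WebJobs.Extensions.ServiceBus --version <TARGET_VERSION>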
Next steps
Azure Function trigger and binding example
Azure Functions binding expression patterns
2/26/2019 • 6 minutes to read • Edit Online
One of the most powerful features of triggers and bindings is binding expressions. In the function.json file and in
function parameters and code, you can use expressions that resolve to values from various sources.
Most expressions are identified by wrapping them in curly braces. For example, in a queue trigger function,
{queueTrigger} resolves to the queue message text. If the path property for a blob output binding is
container/{queueTrigger} and the function is triggered by a queue message HelloWorld , a blob named
HelloWorld is created.
Values wrapped in percent signs, such as %input-queue-name% in the following example, resolve to app settings rather than trigger metadata:
{
"bindings": [
{
"name": "order",
"type": "queueTrigger",
"direction": "in",
"queueName": "%input-queue-name%",
"connection": "MY_STORAGE_ACCT_APP_SETTING"
}
]
}
You can use the same approach in class libraries:
[FunctionName("QueueTrigger")]
public static void Run(
[QueueTrigger("%input-queue-name%")]string myQueueItem,
ILogger log)
{
log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
}
Trigger file names can also be parameterized. In the following blob trigger binding, the {filename} expression captures the name of the triggering blob:
{
"bindings": [
{
"name": "image",
"type": "blobTrigger",
"path": "sample-images/{filename}",
"direction": "in",
"connection": "MyStorageConnection"
},
...
The expression filename can then be used in an output binding to specify the name of the blob being created:
...
{
"name": "imageSmall",
"type": "blob",
"path": "sample-images-sm/{filename}",
"direction": "out",
"connection": "MyStorageConnection"
}
],
}
Function code has access to this same value by using filename as a parameter name. The same ability to use
binding expressions and patterns applies to attributes in class libraries. In the following example, the attribute
constructor parameters are the same path values as the preceding function.json examples:
[FunctionName("ResizeImage")]
public static void Run(
[BlobTrigger("sample-images/{filename}")] Stream image,
[Blob("sample-images-sm/{filename}", FileAccess.Write)] Stream imageSmall,
string filename,
ILogger log)
{
log.LogInformation($"Blob trigger processing: {filename}");
// ...
}
You can also create expressions for parts of the file name such as the extension. For more information on how to
use expressions and patterns in the Blob path string, see the Storage blob binding reference.
Trigger metadata
In addition to the data payload provided by a trigger (such as the content of the queue message that triggered a
function), many triggers provide additional metadata values. These values can be used as input parameters in C#
and F# or properties on the context.bindings object in JavaScript.
For example, an Azure Queue storage trigger supports the following properties:
QueueTrigger - triggering message content if a valid string
DequeueCount
ExpirationTime
Id
InsertionTime
NextVisibleTime
PopReceipt
These metadata values are accessible in function.json file properties. For example, suppose you use a queue
trigger and the queue message contains the name of a blob you want to read. In the function.json file, you can
use queueTrigger metadata property in the blob path property, as shown in the following example:
"bindings": [
{
"name": "myQueueItem",
"type": "queueTrigger",
"queueName": "myqueue-items",
"connection": "MyStorageConnection",
},
{
"name": "myInputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}",
"direction": "in",
"connection": "MyStorageConnection"
}
]
Details of metadata properties for each trigger are described in the corresponding reference article. For an
example, see queue trigger metadata. Documentation is also available in the Integrate tab of the portal, in the
Documentation section below the binding configuration area.
JSON payloads
When a trigger payload is JSON, you can refer to its properties in configuration for other bindings in the same
function and in function code.
The following example shows the function.json file for a webhook function that receives a blob name in JSON:
{"BlobName":"HelloWorld.txt"} . A Blob input binding reads the blob, and the HTTP output binding returns the
blob contents in the HTTP response. Notice that the Blob input binding gets the blob name by referring directly
to the BlobName property ( "path": "strings/{BlobName}" )
{
"bindings": [
{
"name": "info",
"type": "httpTrigger",
"direction": "in",
"webHookType": "genericJson"
},
{
"name": "blobContents",
"type": "blob",
"direction": "in",
"path": "strings/{BlobName}",
"connection": "AzureWebJobsStorage"
},
{
"name": "res",
"type": "http",
"direction": "out"
}
]
}
For this to work in C# and F#, you need a class that defines the fields to be deserialized, as in the following
example:
using System.Net;
using Microsoft.Extensions.Logging;
public static HttpResponseMessage Run(HttpRequestMessage req, BlobInfo info, string blobContents, ILogger
log)
{
if (blobContents == null) {
return req.CreateResponse(HttpStatusCode.NotFound);
}
log.LogInformation($"Processing: {info.BlobName}");
Dot notation
If some of the properties in your JSON payload are objects with properties, you can refer to those directly by
using dot notation. For example, suppose your JSON looks like this:
{
"BlobName": {
"FileName":"HelloWorld",
"Extension":"txt"
}
}
You can refer directly to FileName as BlobName.FileName . With this JSON format, here's what the path property
in the preceding example would look like:
"path": "strings/{BlobName.FileName}.{BlobName.Extension}",
Create GUIDs
The {rand-guid} binding expression creates a GUID. The following blob path in a function.json file creates a
blob with a name like 50710cb5-84b9-4d87-9d83-a03d6976a682.txt.
{
"type": "blob",
"name": "blobOutput",
"direction": "out",
"path": "my-output-container/{rand-guid}"
}
Current time
The binding expression DateTime resolves to DateTime.UtcNow . The following blob path in a function.json file
creates a blob with a name like 2018-02-16T17-59-55Z.txt.
{
"type": "blob",
"name": "blobOutput",
"direction": "out",
"path": "my-output-container/{DateTime}"
}
Binding at runtime
In C# and other .NET languages, you can use an imperative binding pattern, as opposed to the declarative
bindings in function.json and attributes. Imperative binding is useful when binding parameters need to be
computed at runtime rather than design time. To learn more, see the C# developer reference or the C# script
developer reference.
Next steps
Using the Azure Function return value
Using the Azure Function return value
2/22/2019 • 2 minutes to read • Edit Online
C# example
Here's C# code that uses the return value for an output binding, followed by an async example:
[FunctionName("QueueTrigger")]
[return: Blob("output-container/{id}")]
public static string Run([QueueTrigger("inputqueue")]WorkItem input, ILogger log)
{
string json = string.Format("{{ \"id\": \"{0}\" }}", input.Id);
log.LogInformation($"C# script processed queue message. Item={json}");
return json;
}
[FunctionName("QueueTrigger")]
[return: Blob("output-container/{id}")]
public static Task<string> Run([QueueTrigger("inputqueue")]WorkItem input, ILogger log)
{
string json = string.Format("{{ \"id\": \"{0}\" }}", input.Id);
log.LogInformation($"C# script processed queue message. Item={json}");
return Task.FromResult(json);
}
C# script example
Here's the output binding in the function.json file:
{
"name": "$return",
"type": "blob",
"direction": "out",
"path": "output-container/{id}"
}
F# example
Here's the output binding in the function.json file:
{
"name": "$return",
"type": "blob",
"direction": "out",
"path": "output-container/{id}"
}
JavaScript example
Here's the output binding in the function.json file:
{
"name": "$return",
"type": "blob",
"direction": "out",
"path": "output-container/{id}"
}
In JavaScript, the return value goes in the second parameter for context.done :
module.exports = function (context, input) {
var json = JSON.stringify(input);
context.log('Node.js script processed queue message', json);
context.done(null, json);
}
Python example
Here's the output binding in the function.json file:
{
"name": "$return",
"type": "blob",
"direction": "out",
"path": "output-container/{id}"
}
Next steps
Handle Azure Functions binding errors
Handle Azure Functions binding errors
2/22/2019 • 2 minutes to read • Edit Online
Azure Functions triggers and bindings communicate with various Azure services. When integrating with these
services, you may have errors raised that originate from the APIs of the underlying Azure services. Errors can also
occur when you try to communicate with other services from your function code by using REST or client libraries.
To avoid loss of data and ensure good behavior of your functions, it is important to handle errors from either
source.
The following triggers have built-in retry support:
Azure Blob storage
Azure Queue storage
Azure Service Bus (queue/topic)
By default, these triggers are retried up to five times. After the fifth retry, these triggers write a message to a
special poison queue.
For the other Functions triggers, there is no built-in retry when errors occur during function execution. To prevent
loss of trigger information should an error occur in your function, we recommend that you use try-catch blocks in
your function code to catch any errors. When an error occurs, write the information passed into the function by the
trigger to a special "poison" message queue. This approach is the same one used by the Blob storage trigger.
In this way, you can capture trigger events that could be lost due to errors and retry them at a later time using
another function to process messages from the poison queue using the stored information.
For links to all relevant error topics for the various services supported by Functions, see the Binding error codes
section of the Azure Functions error handling overview topic.
Supported languages in Azure Functions
5/6/2019 • 2 minutes to read • Edit Online
This article explains the levels of support offered for languages that you can use with Azure Functions.
Levels of support
There are three levels of support:
Generally available (GA ) - Fully supported and approved for production use.
Preview - Not yet supported but is expected to reach GA status in the future.
Experimental - Not supported and might be abandoned in the future; no guarantee of eventual preview or
GA status.
For information about planned changes to language support, see Azure roadmap.
Experimental languages
The experimental languages in version 1.x don't scale well and don't support all bindings.
Don't use experimental features for anything that you rely on, as there is no official support for them. Support
cases should not be opened for problems with experimental languages.
The version 2.x runtime doesn't support experimental languages. Support for new languages is added only when
the language can be supported in production.
Language extensibility
The 2.x runtime is designed to offer language extensibility. The JavaScript and Java languages in the 2.x runtime
are built with this extensibility.
Next steps
To learn more about how to use one of the GA or preview languages in Azure Functions, see the following
resources:
C#
F#
JavaScript
Java
Python
Azure Functions C# developer reference
5/17/2019 • 11 minutes to read • Edit Online
This article is an introduction to developing Azure Functions by using C# in .NET class libraries.
Azure Functions supports C# and C# script programming languages. If you're looking for guidance
on using C# in the Azure portal, see C# script (.csx) developer reference.
This article assumes that you've already read the following articles:
Azure Functions developers guide
Azure Functions Visual Studio 2019 Tools
<framework.version>
| - bin
| - MyFirstFunction
| | - function.json
| - MySecondFunction
| | - function.json
| - host.json
This directory is what gets deployed to your function app in Azure. The binding extensions required
in version 2.x of the Functions runtime are added to the project as NuGet packages.
IMPORTANT
The build process creates a function.json file for each function. This function.json file is not meant to be
edited directly. You can't change binding configuration or disable the function by editing this file. To learn
how to disable a function, see How to disable functions.
The FunctionName attribute marks the method as a function entry point. The name must be unique
within a project, start with a letter, and only contain letters, numbers, _, and -, up to 127
characters in length. Project templates often create a method named Run, but the method name can
be any valid C# method name.
The trigger attribute specifies the trigger type and binds input data to a method parameter. The
example function is triggered by a queue message, and the queue message is passed to the method
in the myQueueItem parameter.
The binding reference articles ( Storage queues, for example) explain which parameter types you can
use with trigger, input, or output binding attributes.
Binding expressions example
The following code gets the name of the queue to monitor from an app setting, and it gets the
queue message creation time in the insertionTime parameter.
public static class BindingExpressionsExample
{
[FunctionName("LogQueueMessage")]
public static void Run(
[QueueTrigger("%queueappsetting%")] string myQueueItem,
DateTimeOffset insertionTime,
ILogger log)
{
log.LogInformation($"Message content: {myQueueItem}");
log.LogInformation($"Created at: {insertionTime}");
}
}
Autogenerated function.json
The build process creates a function.json file in a function folder in the build folder. As noted earlier,
this file is not meant to be edited directly. You can't change binding configuration or disable the
function by editing this file.
The purpose of this file is to provide information to the scale controller to use for scaling decisions
on the consumption plan. For this reason, the file only has trigger info, not input or output bindings.
The generated function.json file includes a configurationSource property that tells the runtime to
use .NET attributes for bindings, rather than function.json configuration. Here's an example:
{
"generatedBy": "Microsoft.NET.Sdk.Functions-1.0.0.0",
"configurationSource": "attributes",
"bindings": [
{
"type": "queueTrigger",
"queueName": "%input-queue-name%",
"name": "myQueueItem"
}
],
"disabled": false,
"scriptFile": "..\\bin\\FunctionApp1.dll",
"entryPoint": "FunctionApp1.QueueTrigger.Run"
}
Microsoft.NET.Sdk.Functions
The function.json file generation is performed by the NuGet package Microsoft.NET.Sdk.Functions.
The same package is used for both version 1.x and 2.x of the Functions runtime. The target
framework is what differentiates a 1.x project from a 2.x project. Here are the relevant parts of
.csproj files, showing different target frameworks and the same Sdk package:
Functions 1.x
<PropertyGroup>
<TargetFramework>net461</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.NET.Sdk.Functions" Version="1.0.8" />
</ItemGroup>
Functions 2.x
<PropertyGroup>
<TargetFramework>netcoreapp2.1</TargetFramework>
<AzureFunctionsVersion>v2</AzureFunctionsVersion>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.NET.Sdk.Functions" Version="1.0.8" />
</ItemGroup>
Among the Sdk package dependencies are triggers and bindings. A 1.x project refers to 1.x triggers
and bindings because those triggers and bindings target the .NET Framework, while 2.x triggers and
bindings target .NET Core.
The Sdk package also depends on Newtonsoft.Json, and indirectly on WindowsAzure.Storage.
These dependencies make sure that your project uses the versions of those packages that work with
the Functions runtime version that the project targets. For example, Newtonsoft.Json has version 11
for .NET Framework 4.6.1, but the Functions runtime that targets .NET Framework 4.6.1 is only
compatible with Newtonsoft.Json 9.0.1. So your function code in that project also has to use
Newtonsoft.Json 9.0.1.
The source code for Microsoft.NET.Sdk.Functions is available in the GitHub repo azure-functions-
vs-build-sdk.
Runtime version
Visual Studio uses the Azure Functions Core Tools to run Functions projects. The Core Tools is a
command-line interface for the Functions runtime.
If you install the Core Tools by using npm, that doesn't affect the Core Tools version used by Visual
Studio. For the Functions runtime version 1.x, Visual Studio stores Core Tools versions in
%USERPROFILE%\AppData\Local\Azure.Functions.Cli and uses the latest version stored there. For
Functions 2.x, the Core Tools are included in the Azure Functions and Web Jobs Tools extension.
For both 1.x and 2.x, you can see what version is being used in the console output when you run a
Functions project:
TIP
If you plan to use the HTTP or WebHook bindings, plan to avoid port exhaustion that can be caused by
improper instantiation of HttpClient . For more information, see How to manage connections in Azure
Functions.
Logging
To log output to your streaming logs in C#, include an argument of type ILogger. We recommend
that you name it log, as in the following example:
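The example isn't included in this excerpt; a minimal sketch (the function and queue names are hypothetical):
public static class LoggingExample
{
    [FunctionName("QueueTriggerLogger")]
    public static void Run(
        [QueueTrigger("myqueue-items")] string myQueueItem,
        ILogger log)
    {
        // Write an information-level entry to the streaming logs.
        log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
    }
}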
Avoid using Console.Write in Azure Functions. For more information, see Write logs in C#
functions in the Monitor Azure Functions article.
Async
To make a function asynchronous, use the async keyword and return a Task object.
public static class AsyncExample
{
[FunctionName("BlobCopy")]
public static async Task RunAsync(
[BlobTrigger("sample-images/{blobName}")] Stream blobInput,
[Blob("sample-images-copies/{blobName}", FileAccess.Write)] Stream blobOutput,
CancellationToken token,
ILogger log)
{
log.LogInformation($"BlobCopy function processed.");
await blobInput.CopyToAsync(blobOutput, 4096, token);
}
}
You can't use out parameters in async functions. For output bindings, use the function return value
or a collector object instead.
Cancellation tokens
A function can accept a CancellationToken parameter, which enables the operating system to notify
your code when the function is about to be terminated. You can use this notification to make sure
the function doesn't terminate unexpectedly in a way that leaves data in an inconsistent state.
The following example shows how to check for impending function termination.
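The example itself isn't included in this excerpt; the following sketch is a reconstruction (the function and queue names are hypothetical) that polls for cancellation between units of work:
public static class CancellationTokenExample
{
    [FunctionName("LongRunningQueueJob")]
    public static async Task RunAsync(
        [QueueTrigger("myqueue-items")] string myQueueItem,
        CancellationToken token,
        ILogger log)
    {
        for (int i = 0; i < 100; i++)
        {
            // Stop cleanly if the host signals that the function is about to be terminated.
            if (token.IsCancellationRequested)
            {
                log.LogWarning("Function was cancelled at iteration {Iteration}.", i);
                break;
            }
            await Task.Delay(5000);
            log.LogInformation("Normal processing for message: {Message}", myQueueItem);
        }
    }
}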
Environment variables
To get an environment variable or an app setting value, use
System.Environment.GetEnvironmentVariable , as shown in the following code example:
public static class EnvironmentVariablesExample
{
[FunctionName("GetEnvironmentVariables")]
public static void Run([TimerTrigger("0 */5 * * * *")]TimerInfo myTimer, ILogger log)
{
log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
log.LogInformation(GetEnvironmentVariable("AzureWebJobsStorage"));
log.LogInformation(GetEnvironmentVariable("WEBSITE_SITE_NAME"));
}
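    // (The class is truncated here; a helper like the following, reconstructed as a sketch,
    // formats the setting name together with its value.)
    public static string GetEnvironmentVariable(string name)
    {
        return name + ": " +
            System.Environment.GetEnvironmentVariable(name, EnvironmentVariableTarget.Process);
    }
}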
App settings can be read from environment variables both when developing locally and when
running in Azure. When developing locally, app settings come from the Values collection in the
local.settings.json file. In both environments, local and Azure,
GetEnvironmentVariable("<app setting name>") retrieves the value of the named app setting. For
instance, when you're running locally, "My Site Name" would be returned if your local.settings.json
file contains { "Values": { "WEBSITE_SITE_NAME": "My Site Name" } } .
The System.Configuration.ConfigurationManager.AppSettings property is an alternative API for
getting app setting values, but we recommend that you use GetEnvironmentVariable as shown here.
Binding at runtime
In C# and other .NET languages, you can use an imperative binding pattern, as opposed to the
declarative bindings in attributes. Imperative binding is useful when binding parameters need to be
computed at runtime rather than design time. With this pattern, you can bind to supported input
and output bindings on-the-fly in your function code.
Define an imperative binding as follows:
Do not include an attribute in the function signature for your desired imperative bindings.
Pass in an input parameter Binder binder or IBinder binder .
Use the following C# pattern to perform the data binding.
BindingTypeAttribute is the .NET attribute that defines your binding, and T is an input or
output type that's supported by that binding type. T cannot be an out parameter type (such
as out JObject ). For example, the Mobile Apps table output binding supports six output
types, but you can only use ICollector or IAsyncCollector with imperative binding.
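The pattern referenced above isn't shown in this excerpt; its general shape is the following sketch, where BindingTypeAttribute and T are the placeholders described in the list:
using (var output = await binder.BindAsync<T>(new BindingTypeAttribute(...)))
{
    // Use output to read or write the bound resource.
}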
Single attribute example
The following example code creates a Storage blob output binding with blob path that's defined at
run time, then writes a string to the blob.
public static class IBinderExample
{
[FunctionName("CreateBlobUsingBinder")]
public static void Run(
[QueueTrigger("myqueue-items-source-4")] string myQueueItem,
IBinder binder,
ILogger log)
{
log.LogInformation($"CreateBlobUsingBinder function processed: {myQueueItem}");
using (var writer = binder.Bind<TextWriter>(new BlobAttribute(
$"samples-output/{myQueueItem}", FileAccess.Write)))
{
writer.Write("Hello World!");
};
}
}
BlobAttribute defines the Storage blob input or output binding, and TextWriter is a supported
output binding type.
Multiple attribute example
The preceding example gets the app setting for the function app's main Storage account connection
string (which is AzureWebJobsStorage ). You can specify a custom app setting to use for the Storage
account by adding the StorageAccountAttribute and passing the attribute array into BindAsync<T>() .
Use a Binder parameter, not IBinder . For example:
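A sketch of such a function, assuming a queue trigger; "MyStorageAccount" stands in for the name of your custom app setting:
public static class MultipleAttributesExample
{
    [FunctionName("CreateBlobInCustomAccount")]
    public static async Task RunAsync(
        [QueueTrigger("myqueue-items")] string myQueueItem,
        Binder binder,
        ILogger log)
    {
        // The StorageAccountAttribute points BindAsync at the custom account.
        var attributes = new Attribute[]
        {
            new BlobAttribute($"samples-output/{myQueueItem}", FileAccess.Write),
            new StorageAccountAttribute("MyStorageAccount")
        };
        using (var writer = await binder.BindAsync<TextWriter>(attributes))
        {
            await writer.WriteAsync("Hello World!");
        }
    }
}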
Triggers and bindings
The following table shows the bindings that are supported in the two major versions of the Azure Functions runtime:
TYPE                             1.X   2.X   TRIGGER   INPUT   OUTPUT
Blob storage                     ✔     ✔     ✔         ✔       ✔
Cosmos DB                        ✔     ✔     ✔         ✔       ✔
Event Grid                       ✔     ✔     ✔
Event Hubs                       ✔     ✔     ✔                 ✔
HTTP & webhooks                  ✔     ✔     ✔                 ✔
Microsoft Graph Excel tables           ✔               ✔       ✔
Microsoft Graph OneDrive files         ✔               ✔       ✔
Microsoft Graph Outlook email          ✔                       ✔
Microsoft Graph events                 ✔     ✔         ✔       ✔
Microsoft Graph auth tokens            ✔               ✔
Mobile Apps                      ✔                     ✔       ✔
Notification Hubs                ✔                             ✔
Queue storage                    ✔     ✔     ✔                 ✔
SendGrid                         ✔     ✔                       ✔
Service Bus                      ✔     ✔     ✔                 ✔
SignalR Service                        ✔               ✔       ✔
Table storage                    ✔     ✔               ✔       ✔
Timer                            ✔     ✔     ✔
Twilio                           ✔     ✔                       ✔
1 In 2.x, all bindings except HTTP and Timer must be registered. See Register binding extensions.
Next steps
Learn more about triggers and bindings
Learn more about best practices for Azure Functions
Azure Functions C# script (.csx) developer reference
4/25/2019 • 11 minutes to read • Edit Online
Folder structure
The folder structure for a C# script project looks like the following:
FunctionsProject
| - MyFirstFunction
| | - run.csx
| | - function.json
| | - function.proj
| - MySecondFunction
| | - run.csx
| | - function.json
| | - function.proj
| - host.json
| - extensions.csproj
| - bin
There's a shared host.json file that can be used to configure the function app. Each function has its own code file (.csx)
and binding configuration file (function.json).
The binding extensions required in version 2.x of the Functions runtime are defined in the extensions.csproj file, with
the actual library files in the bin folder. When developing locally, you must register binding extensions. When
developing functions in the Azure portal, this registration is done for you.
Binding to arguments
Input or output data is bound to a C# script function parameter via the name property in the function.json configuration
file. The following example shows a function.json file and run.csx file for a queue-triggered function. The parameter that
receives data from the queue message is named myQueueItem because that's the value of the name property.
{
"disabled": false,
"bindings": [
{
"type": "queueTrigger",
"direction": "in",
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"MyStorageConnectionAppSetting"
}
]
}
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Queue;
using System;
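// A minimal Run method consistent with the function.json above; the parameter
// name myQueueItem matches the binding's name property (illustrative sketch).
public static void Run(CloudQueueMessage myQueueItem, ILogger log)
{
    log.LogInformation($"C# Queue trigger function processed: {myQueueItem.AsString}");
}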
TIP
If you plan to use the HTTP or WebHook bindings, plan to avoid port exhaustion that can be caused by improper instantiation of
HttpClient . For more information, see How to manage connections in Azure Functions.
A POCO class must have a getter and setter defined for each property.
#load "mylogger.csx"
using Microsoft.Extensions.Logging;
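// A sketch of a run.csx that calls the shared helper loaded above:
public static void Run(TimerInfo myTimer, ILogger log)
{
    MyLogger(log, $"Log by MyLogger: {DateTime.Now}");
}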
Example mylogger.csx:
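// A minimal sketch of the shared helper:
public static void MyLogger(ILogger log, string logtext)
{
    log.LogInformation(logtext);
}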
Using a shared .csx file is a common pattern when you want to strongly type the data passed between functions by using
a POCO object. In the following simplified example, an HTTP trigger and queue trigger share a POCO object named
Order to strongly type the order data:
#load "..\shared\order.csx"
using System.Net;
using Microsoft.Extensions.Logging;
public static async Task<HttpResponseMessage> Run(Order req, IAsyncCollector<Order> outputQueueItem, ILogger log)
{
log.LogInformation("C# HTTP trigger function received an order.");
log.LogInformation(req.ToString());
log.LogInformation("Submitting to processing queue.");
if (req.orderId == null)
{
return new HttpResponseMessage(HttpStatusCode.BadRequest);
}
else
{
await outputQueueItem.AddAsync(req);
return new HttpResponseMessage(HttpStatusCode.OK);
}
}
#load "..\shared\order.csx"
using System;
using Microsoft.Extensions.Logging;
public static void Run(Order myQueueItem, out Order outputQueueItem, ILogger log)
{
log.LogInformation($"C# Queue trigger function processed order...");
log.LogInformation(myQueueItem.ToString());
outputQueueItem = myQueueItem;
}
Example order.csx:
public class Order
{
    public string orderId { get; set; }
    public string custName { get; set; }
    public string custAddress { get; set; }
    public string custEmail { get; set; }
    public string cartId { get; set; }
}
Logging
To log output to your streaming logs in C#, include an argument of type ILogger. We recommend that you name it log .
Avoid using Console.Write in Azure Functions.
Async
To make a function asynchronous, use the async keyword and return a Task object.
You can't use out parameters in async functions. For output bindings, use the function return value or a collector object
instead.
Cancellation tokens
A function can accept a CancellationToken parameter, which enables the operating system to notify your code when the
function is about to be terminated. You can use this notification to make sure the function doesn't terminate unexpectedly
in a way that leaves data in an inconsistent state.
The following example shows how to check for impending function termination.
using System;
using System.IO;
using System.Threading;
Importing namespaces
If you need to import namespaces, you can do so as usual, with the using clause.
using System.Net;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
The following namespaces are automatically imported and are therefore optional:
System
System.Collections.Generic
System.IO
System.Linq
System.Net.Http
System.Threading.Tasks
Microsoft.Azure.WebJobs
Microsoft.Azure.WebJobs.Host
#r "System.Web.Http"
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
The following assemblies are automatically added by the Azure Functions hosting environment:
mscorlib
System
System.Core
System.Xml
System.Net.Http
Microsoft.Azure.WebJobs
Microsoft.Azure.WebJobs.Host
Microsoft.Azure.WebJobs.Extensions
System.Web.Http
System.Net.Http.Formatting
For information on how to upload files to your function folder, see the section on package management.
Watched directories
The directory that contains the function script file is automatically watched for changes to assemblies. To watch for
assembly changes in other directories, add them to the watchDirectories list in host.json.
Using NuGet packages
To use NuGet packages in a 2.x C# function, upload a function.proj file to the function's folder in the function
app's file system. Here is an example function.proj file that adds a reference to Microsoft.ProjectOxford.Face
version 1.1.0:
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netstandard2.0</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.ProjectOxford.Face" Version="1.1.0" />
</ItemGroup>
</Project>
To use a custom NuGet feed, specify the feed in a Nuget.Config file in the Function App root. For more information, see
Configuring NuGet behavior.
NOTE
In 1.x C# functions, NuGet packages are referenced with a project.json file instead of a function.proj file.
For 1.x functions, use a project.json file instead. Here is an example project.json file:
{
"frameworks": {
"net46":{
"dependencies": {
"Microsoft.ProjectOxford.Face": "1.1.0"
}
}
}
}
Environment variables
To get an environment variable or an app setting value, use System.Environment.GetEnvironmentVariable , as shown in the
following code example:
public static void Run(TimerInfo myTimer, ILogger log)
{
    log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
    log.LogInformation(GetEnvironmentVariable("AzureWebJobsStorage"));
    log.LogInformation(GetEnvironmentVariable("WEBSITE_SITE_NAME"));
}

// Helper that prefixes the setting name to its value.
public static string GetEnvironmentVariable(string name)
{
    return name + ": " + System.Environment.GetEnvironmentVariable(name);
}
Binding at runtime
In C# and other .NET languages, you can use an imperative binding pattern, as opposed to the declarative bindings in
function.json. Imperative binding is useful when binding parameters need to be computed at runtime rather than design
time. With this pattern, you can bind to supported input and output bindings on-the-fly in your function code.
Define an imperative binding as follows:
Do not include an entry in function.json for your desired imperative bindings.
Pass in an input parameter Binder binder or IBinder binder .
Use the following C# pattern to perform the data binding.
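The general shape of the pattern follows; T and BindingTypeAttribute are placeholders for a concrete binding, so this is a template rather than compilable code:
using (var output = await binder.BindAsync<T>(new BindingTypeAttribute(...)))
{
    ...
}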
BindingTypeAttribute is the .NET attribute that defines your binding and T is an input or output type that's supported
by that binding type. T cannot be an out parameter type (such as out JObject ). For example, the Mobile Apps table
output binding supports six output types, but you can only use ICollector or IAsyncCollector for T .
Single attribute example
The following example code creates a Storage blob output binding with blob path that's defined at run time, then writes a
string to the blob.
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host.Bindings.Runtime;
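// An illustrative Run method for this example; the blob path is computed
// from the queue message at run time.
public static async Task Run(string myQueueItem, Binder binder, ILogger log)
{
    log.LogInformation($"CreateBlobUsingBinder function processed: {myQueueItem}");
    using (var writer = await binder.BindAsync<TextWriter>(new BlobAttribute(
        $"samples-output/{myQueueItem}", FileAccess.Write)))
    {
        await writer.WriteAsync("Hello World!");
    }
}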
BlobAttribute defines the Storage blob input or output binding, and TextWriter is a supported output binding type.
Multiple attribute example
The preceding example gets the app setting for the function app's main Storage account connection string (which is
AzureWebJobsStorage ). You can specify a custom app setting to use for the Storage account by adding the
StorageAccountAttribute and passing the attribute array into BindAsync<T>() . Use a Binder parameter, not IBinder .
For example:
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host.Bindings.Runtime;
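// An illustrative sketch; "MyStorageAccount" stands in for your custom app setting.
public static async Task Run(string myQueueItem, Binder binder, ILogger log)
{
    var attributes = new Attribute[]
    {
        new BlobAttribute($"samples-output/{myQueueItem}", FileAccess.Write),
        new StorageAccountAttribute("MyStorageAccount")
    };
    using (var writer = await binder.BindAsync<TextWriter>(attributes))
    {
        await writer.WriteAsync("Hello World!");
    }
}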
The following table lists the .NET attributes for each binding type and the packages in which they are defined.
BINDING     ATTRIBUTE                                      ADD REFERENCE
Cosmos DB   Microsoft.Azure.WebJobs.DocumentDBAttribute    #r "Microsoft.Azure.WebJobs.Extensions.CosmosDB"
Twilio      Microsoft.Azure.WebJobs.TwilioSmsAttribute     #r "Microsoft.Azure.WebJobs.Extensions.Twilio"
Next steps
Learn more about triggers and bindings
Learn more about best practices for Azure Functions
Azure Functions F# Developer Reference
4/25/2019 • 7 minutes to read • Edit Online
F# for Azure Functions is a solution for easily running small pieces of code, or "functions," in the cloud. Data
flows into your F# function via function arguments. Argument names are specified in function.json , and
there are predefined names for accessing things like the function logger and cancellation tokens.
IMPORTANT
F# script (.fsx) is only supported by version 1.x of the Azure Functions runtime. If you want to use F# with the version
2.x runtime, you must use a precompiled F# class library project (.fs). You create, manage, and publish an F# class
library project using Visual Studio as you would a C# class library project. For more information about Functions
versions, see Azure Functions runtime versions overview.
This article assumes that you've already read the Azure Functions developer reference.
Folder structure
The folder structure for an F# script project looks like the following:
FunctionsProject
| - MyFirstFunction
| | - run.fsx
| | - function.json
| | - function.proj
| - MySecondFunction
| | - run.fsx
| | - function.json
| | - function.proj
| - host.json
| - extensions.csproj
| - bin
There's a shared host.json file that can be used to configure the function app. Each function has its own code
file (.fsx) and binding configuration file (function.json).
The binding extensions required in version 2.x of the Functions runtime are defined in the extensions.csproj
file, with the actual library files in the bin folder. When developing locally, you must register binding
extensions. When developing functions in the Azure portal, this registration is done for you.
Binding to arguments
Each binding supports some set of arguments, as detailed in the Azure Functions triggers and bindings
developer reference. For example, one of the argument bindings a blob trigger supports is a POCO, which can
be expressed using an F# record. For example:
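// An illustrative sketch: a blob-triggered function whose output argument
// is passed as byref<>, as described below.
type Item = { Id: string }

let Run(blob: string, output: byref<Item>) =
    let item = { Id = blob }
    output <- item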
Your F# Azure Function will take one or more arguments. When we talk about Azure Functions arguments,
we refer to input arguments and output arguments. An input argument is exactly what it sounds like: input to
your F# Azure Function. An output argument is mutable data or a byref<> argument that serves as a way to
pass data back out of your function.
In the example above, blob is an input argument, and output is an output argument. Notice that we used
byref<> for output (there's no need to add the [<Out>] annotation). Using a byref<> type allows your
function to change which record or object the argument refers to.
When an F# record is used as an input type, the record definition must be marked with [<CLIMutable>] in
order to allow the Azure Functions framework to set the fields appropriately before passing the record to your
function. Under the hood, [<CLIMutable>] generates setters for the record properties. For example:
[<CLIMutable>]
type TestObject =
{ SenderName : string
Greeting : string }
An F# class can also be used for both in and out arguments. For a class, properties will usually need getters
and setters. For example:
type Item() =
member val Id = "" with get,set
member val Text = "" with get,set
Logging
To log output to your streaming logs in F#, your function should take an argument of type ILogger. For
consistency, we recommend this argument is named log . For example:
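// An illustrative sketch:
let Run(blob: string, log: ILogger) =
    log.LogInformation(sprintf "F# function processed a blob of length %d" blob.Length)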
Async
The async workflow can be used, but the result needs to be returned as a Task . This can be done with
Async.StartAsTask , for example:
let Run(req: HttpRequestMessage) =
async {
return new HttpResponseMessage(HttpStatusCode.OK)
} |> Async.StartAsTask
Cancellation Token
If your function needs to handle shutdown gracefully, you can give it a CancellationToken argument. This can
be combined with async , for example:
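// An illustrative sketch: the token is passed to Async.StartAsTask so that the
// workflow is cancelled when the host requests shutdown.
let Run(req: HttpRequestMessage, token: CancellationToken) =
    let work = async {
        do! Async.Sleep(10)
        return new HttpResponseMessage(HttpStatusCode.OK)
    }
    Async.StartAsTask(work, cancellationToken = token)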
Importing namespaces
Namespaces can be opened in the usual way:
open System.Net
open System.Threading.Tasks
open Microsoft.Extensions.Logging
#r "System.Web.Http"
open System.Net
open System.Net.Http
open System.Threading.Tasks
open Microsoft.Extensions.Logging
The following assemblies are automatically added by the Azure Functions hosting environment:
mscorlib
System
System.Core
System.Xml
System.Net.Http
Microsoft.Azure.WebJobs
Microsoft.Azure.WebJobs.Host
Microsoft.Azure.WebJobs.Extensions
System.Web.Http
System.Net.Http.Formatting
In addition, the following assemblies are special-cased and may be referenced by simple name (e.g.
#r "AssemblyName" ):
Newtonsoft.Json
Microsoft.WindowsAzure.Storage
Microsoft.ServiceBus
Microsoft.AspNet.WebHooks.Receivers
Microsoft.AspNet.WebHooks.Common
If you need to reference a private assembly, you can upload the assembly file into a bin folder relative to
your function and reference it by using the file name (e.g. #r "MyAssembly.dll" ). For information on how to
upload files to your function folder, see the following section on package management.
Editor Prelude
An editor that supports F# Compiler Services will not be aware of the namespaces and assemblies that Azure
Functions automatically includes. As such, it can be useful to include a prelude that helps the editor find the
assemblies you are using, and to explicitly open namespaces. For example:
#if !COMPILED
#I "../../bin/Binaries/WebJobs.Script.Host"
#r "Microsoft.Azure.WebJobs.Host.dll"
#endif
open System
open Microsoft.Azure.WebJobs.Host
open Microsoft.Extensions.Logging
When Azure Functions executes your code, it processes the source with COMPILED defined, so the editor
prelude will be ignored.
Package management
To use NuGet packages in an F# function, add a project.json file to the function's folder in the function app's
file system. Here is an example project.json file that adds a NuGet package reference to
Microsoft.ProjectOxford.Face version 1.1.0:
{
"frameworks": {
"net46":{
"dependencies": {
"Microsoft.ProjectOxford.Face": "1.1.0"
}
}
}
}
Only the .NET Framework 4.6 is supported, so make sure that your project.json file specifies net46 as
shown here.
When you upload a project.json file, the runtime gets the packages and automatically adds references to the
package assemblies. You don't need to add #r "AssemblyName" directives. Just add the required open
statements to your .fsx file.
You may wish to put automatically referenced assemblies in your editor prelude, to improve your editor's
interaction with F# Compiler Services.
How to add a project.json file to your Azure Function
1. Begin by making sure your function app is running, which you can do by opening your function in the
Azure portal. This also gives access to the streaming logs where package installation output will be
displayed.
2. To upload a project.json file, use one of the methods described in how to update function app files. If you
are using Continuous Deployment for Azure Functions, you can add a project.json file to your staging
branch in order to experiment with it before adding it to your deployment branch.
3. After the project.json file is added, you will see output similar to the following example in your function's
streaming log:
Environment variables
To get an environment variable or an app setting value, use System.Environment.GetEnvironmentVariable , for
example:
open System.Environment
open Microsoft.Extensions.Logging
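// An illustrative sketch; GetEnvironmentVariable comes from the open above.
let Run(myTimer: TimerInfo, log: ILogger) =
    log.LogInformation(sprintf "Storage: %s" (GetEnvironmentVariable "AzureWebJobsStorage"))
    log.LogInformation(sprintf "Site: %s" (GetEnvironmentVariable "WEBSITE_SITE_NAME"))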
#load "logger.fsx"
logger.fsx
Paths provides to the #load directive are relative to the location of your .fsx file.
#load "logger.fsx" loads a file located in the function folder.
#load "package\logger.fsx" loads a file located in the package folder in the function folder.
#load "..\shared\mylogger.fsx" loads a file located in the shared folder at the same level as the function
folder, that is, directly under wwwroot .
The #load directive only works with .fsx (F# script) files, and not with .fs files.
Next steps
For more information, see the following resources:
F# Guide
Best Practices for Azure Functions
Azure Functions developer reference
Azure Functions triggers and bindings
Azure Functions testing
Azure Functions scaling
Azure Functions JavaScript developer guide
5/20/2019 • 19 minutes to read • Edit Online
This guide contains information about the intricacies of writing Azure Functions with JavaScript.
A JavaScript function is an exported function that executes when triggered (triggers are configured in
function.json). The first argument passed to every function is a context object, which is used for receiving
and sending binding data, logging, and communicating with the runtime.
This article assumes that you have already read the Azure Functions developer reference. Complete the
Functions quickstart to create your first function, using Visual Studio Code or in the portal.
This article also supports TypeScript app development.
Folder structure
The required folder structure for a JavaScript project looks like the following. This default can be changed.
For more information, see the scriptFile section below.
FunctionsProject
| - MyFirstFunction
| | - index.js
| | - function.json
| - MySecondFunction
| | - index.js
| | - function.json
| - SharedCode
| | - myFirstHelperFunction.js
| | - mySecondHelperFunction.js
| - node_modules
| - host.json
| - package.json
| - extensions.csproj
At the root of the project, there's a shared host.json file that can be used to configure the function app.
Each function has a folder with its own code file (.js) and binding configuration file (function.json). The
name of function.json 's parent directory is always the name of your function.
The binding extensions required in version 2.x of the Functions runtime are defined in the
extensions.csproj file, with the actual library files in the bin folder. When developing locally, you must
register binding extensions. When developing functions in the Azure portal, this registration is done for
you.
Exporting a function
JavaScript functions must be exported via module.exports (or exports ). Your exported function should be
a JavaScript function that executes when triggered.
By default, the Functions runtime looks for your function in index.js , where index.js shares the same
parent directory as its corresponding function.json . In the default case, your exported function should be
the only export from its file or the export named run or index . To configure the file location and export
name of your function, read about configuring your function's entry point below.
Your exported function is passed a number of arguments on execution. The first argument it takes is
always a context object. If your function is synchronous (does not return a Promise), you must pass the
context object, as calling context.done is required for correct use.
When exporting an async function, you can also configure an output binding to take the return value.
This is recommended if you only have one output binding.
To assign an output using return , change the name property to $return in function.json .
{
"type": "http",
"direction": "out",
"name": "$return"
}
In this case, your function should look like the following example:
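// An illustrative sketch: the returned object is bound to the "$return" output.
module.exports = async function (context, req) {
    return { status: 200, body: "Hello, world!" };
};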
Bindings
In JavaScript, bindings are configured and defined in a function's function.json. Functions interact with
bindings a number of ways.
Inputs
Inputs are divided into two categories in Azure Functions: one is the trigger input and the other is the
additional input. Trigger and other input bindings (bindings of direction === "in" ) can be read by a
function in three ways:
[Recommended] As parameters passed to your function. They are passed to the function in the
same order that they are defined in function.json. The name property defined in function.json does
not need to match the name of your parameter, although it should.
As members of the context.bindings object. Each member is named by the name property
defined in function.json.
As inputs using the JavaScript arguments object. This is essentially the same as passing inputs
as parameters, but allows you to dynamically handle inputs.
Outputs
Outputs (bindings of direction === "out" ) can be written to by a function in a number of ways. In all
cases, the name property of the binding as defined in function.json corresponds to the name of the object
member written to in your function.
You can assign data to output bindings in one of the following ways (don't combine these methods):
[Recommended for multiple outputs] Returning an object. If you are using an async/Promise
returning function, you can return an object with assigned output data. In the example below, the
output bindings are named "httpResponse" and "queueOutput" in function.json.
If you are using a synchronous function, you can return this object using context.done (see
example).
[Recommended for single output] Returning a value directly and using the $return binding
name. This only works for async/Promise returning functions. See example in exporting an async
function.
Assigning values to context.bindings You can assign values directly to context.bindings.
module.exports = async function(context) {
let retMsg = 'Hello, world!';
context.bindings.httpResponse = {
body: retMsg
};
context.bindings.queueOutput = retMsg;
return;
};
To define the data type for an input binding, use the dataType property in the binding definition. For
example, to read the content of an HTTP request in binary format, use the type binary :
{
"type": "httpTrigger",
"name": "req",
"direction": "in",
"dataType": "binary"
}
context object
The runtime uses a context object to pass data to and from your function and to let you communicate
with the runtime. The context object can be used for reading and setting data from bindings, writing logs,
and using the context.done callback when your exported function is synchronous.
The context object is always the first parameter to a function. It should be included because it has
important methods such as context.done and context.log . You can name the object whatever you would
like (for example, ctx or c ).
context.bindings property
context.bindings
Returns a named object that is used to read or assign binding data. Input and trigger binding data can be
accessed by reading properties on context.bindings . Output binding data can be assigned by adding data
to context.bindings
For example, the following binding definitions in your function.json let you access the contents of a queue
from context.bindings.myInput and assign outputs to a queue using context.bindings.myOutput .
{
"type":"queue",
"direction":"in",
"name":"myInput"
...
},
{
"type":"queue",
"direction":"out",
"name":"myOutput"
...
}
// myInput contains the input data, which may have properties such as "name"
var author = context.bindings.myInput.name;
// Similarly, you can set your output data
context.bindings.myOutput = {
some_text: 'hello world',
a_number: 1 };
You can choose to define output binding data using the context.done method instead of the
context.binding object (see below ).
context.bindingData property
context.bindingData
Returns a named object that contains trigger metadata and function invocation data ( invocationId ,
sys.methodName , sys.utcNow , sys.randGuid ). For an example of trigger metadata, see this event hubs
example.
context.done method
context.done([err],[propertyBag])
Lets the runtime know that your code has completed. When your function uses the async function
declaration, you do not need to use context.done() . The context.done callback is implicitly called. Async
functions are available in Node 8 or a later version, which requires version 2.x of the Functions runtime.
If your function is not an async function, you must call context.done to inform the runtime that your
function is complete. The execution times out if it is missing.
The context.done method allows you to pass back both a user-defined error to the runtime and a JSON
object containing output binding data. Properties passed to context.done overwrite anything set on the
context.bindings object.
context.log method
context.log(message)
Allows you to write to the streaming function logs at the default trace level. On context.log , additional
logging methods are available that let you write function logs at other trace levels:
METHOD                        DESCRIPTION
context.log.error(message)    Writes to error level logging, or lower.
context.log.warn(message)     Writes to warning level logging, or lower.
context.log.info(message)     Writes to info level logging, or lower.
context.log.verbose(message)  Writes to verbose level logging.
You can configure the trace-level threshold for logging in the host.json file. For more information on
writing logs, see writing trace outputs below.
Read monitoring Azure Functions to learn more about viewing and querying function logs.
context.log({hello: 'world'});
context.log.info({hello: 'world'});
context.log.error("An error has occurred.");
Because error is the highest trace level, this trace is written to the output at all trace levels as long as
logging is enabled.
logging is enabled.
All context.log methods support the same parameter format that's supported by the Node.js util.format
method. Consider the following code, which writes function logs by using the default trace level:
context.log('Node.js HTTP trigger function processed a request. RequestUri=' + req.originalUrl);
context.log('Request Headers = ' + JSON.stringify(req.headers));
You can also write the same code in the following format:
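// The same logging written printf-style, using util.format placeholders:
context.log('Node.js HTTP trigger function processed a request. RequestUri=%s', req.originalUrl);
context.log('Request Headers = %s', JSON.stringify(req.headers));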
Configure the trace level for console logging
Functions lets you define the threshold trace level for writing to the console. Set it with the
tracing.consoleLevel property in the host.json file:
{
"tracing": {
"consoleLevel": "verbose"
}
}
Values of consoleLevel correspond to the names of the context.log methods. To disable all trace logging
to the console, set consoleLevel to off. For more information, see host.json reference.
HTTP triggers and bindings
HTTP and webhook triggers and HTTP output bindings use request and response objects to represent HTTP
messaging.
Request object
The context.req (request) object has the following properties:
PROPERTY      DESCRIPTION
body          An object that contains the body of the request.
headers       An object that contains the request headers.
method        The HTTP method of the request.
originalUrl   The URL of the request.
params        An object that contains the routing parameters of the request.
query         An object that contains the query parameters.
rawBody       The body of the message as a string.
Response object
The context.res (response) object has the following properties:
PROPERTY   DESCRIPTION
status     Sets the HTTP status code of the response.
body       Sets the body of the response.
headers    Sets the HTTP headers of the response.
isRaw      Indicates that formatting is skipped for the response.
// You can access your http request off the context ...
if(context.req.body.emoji === ':pizza:') context.log('Yay!');
// and also set your http response
context.res = { status: 202, body: 'You successfully ordered more coffee!' };
From the named input and output bindings. In this way, the HTTP trigger and bindings work
the same as any other binding. The following example sets the response object by using a named
response binding:
{
"type": "http",
"direction": "out",
"name": "response"
}
{
"type": "http",
"direction": "out",
"name": "$return"
}
You can see the current version that the runtime is using by checking the WEBSITE_NODE_DEFAULT_VERSION app
setting or by printing process.version from any function.
Dependency management
In order to use community libraries in your JavaScript code, as is shown in the below example, you need to
ensure that all dependencies are installed on your Function App in Azure.
var _ = require('underscore'); // the library must be installed in node_modules

module.exports = function(context) {
    // Using our imported underscore.js library
    var matched_names = _
        .where(context.bindings.myInput.names, {first: 'Carla'});
    context.done();
};
NOTE
You should define a package.json file at the root of your Function App. Defining the file lets all functions in the
app share the same cached packages, which gives the best performance. If a version conflict arises, you can resolve
it by adding a package.json file in the folder of a specific function.
When deploying Function Apps from source control, any package.json file present in your repo will trigger
an npm install in its folder during deployment. But when deploying via the Portal or CLI, you have to
manually install the packages.
There are two ways to install packages on your Function App:
Deploying with Dependencies
1. Install all requisite packages locally by running npm install .
2. Deploy your code, and ensure that the node_modules folder is included in the deployment.
Using Kudu
1. Go to https://<function_app_name>.scm.azurewebsites.net .
2. Click Debug Console > CMD.
3. Go to D:\home\site\wwwroot , and then drag your package.json file to the wwwroot folder at the top
half of the page.
You can upload files to your function app in other ways also. For more information, see How to
update function app files.
4. After the package.json file is uploaded, run the npm install command in the Kudu remote
execution console.
This action downloads the packages indicated in the package.json file and restarts the function app.
Environment variables
In Functions, app settings, such as service connection strings, are exposed as environment variables during
execution. You can access these settings using process.env , as shown here in the GetEnvironmentVariable
function:
module.exports = function (context, myTimer) {
    // App settings are exposed on process.env at execution time.
    context.log(GetEnvironmentVariable("AzureWebJobsStorage"));
    context.log(GetEnvironmentVariable("WEBSITE_SITE_NAME"));
    context.done();
};

function GetEnvironmentVariable(name)
{
    return name + ": " + process.env[name];
}
There are several ways that you can add, update, and delete function app settings:
In the Azure portal.
By using the Azure CLI.
When running locally, app settings are read from the local.settings.json project file.
Using scriptFile
By default, a JavaScript function is executed from index.js , a file that shares the same parent directory as
its corresponding function.json .
scriptFile can be used to get a folder structure that looks like the following example:
FunctionApp
| - host.json
| - myNodeFunction
| | - function.json
| - lib
| | - sayHello.js
| - node_modules
| | - ... packages ...
| - package.json
The function.json for myNodeFunction should include a scriptFile property pointing to the file with the
exported function to run.
{
"scriptFile": "../lib/sayHello.js",
"bindings": [
...
]
}
Using entryPoint
In scriptFile (or index.js ), a function must be exported using module.exports in order to be found and
run. By default, the function that executes when triggered is the only export from that file, the export
named run , or the export named index .
This can be configured using entryPoint in function.json , as in the following example:
{
"entryPoint": "logFoo",
"bindings": [
...
]
}
In Functions v2.x, which supports the this parameter in user functions, the function code could then be
as in the following example:
class MyObj {
    constructor() {
        this.foo = 1;
    };

    logFoo(context) {
        context.log("Foo is " + this.foo);
        context.done();
    }
}

module.exports = new MyObj();
In this example, it is important to note that although an object is being exported, there are no guarantees
for preserving state between executions.
Local Debugging
When started with the --inspect parameter, a Node.js process listens for a debugging client on the
specified port. In Azure Functions 2.x, you can specify arguments to pass into the Node.js process that runs
your code by adding the environment variable or App Setting languageWorkers:node:arguments = <args> .
To debug locally, add "languageWorkers:node:arguments": "--inspect=5858" under Values in your
local.settings.json file and attach a debugger to port 5858.
When debugging using VS Code, the --inspect parameter is automatically added using the port value
in the project's launch.json file.
In version 1.x, setting languageWorkers:node:arguments will not work. The debug port can be selected with
the --nodeDebugPort parameter on Azure Functions Core Tools.
TypeScript
When you target version 2.x of the Functions runtime, both Azure Functions for Visual Studio Code and
the Azure Functions Core Tools let you create function apps using a template that supports TypeScript
function app projects. The template generates package.json and tsconfig.json project files that make it
easier to transpile, run, and publish JavaScript functions from TypeScript code with these tools.
A generated .funcignore file is used to indicate which files are excluded when a project is published to
Azure.
TypeScript files (.ts) are transpiled into JavaScript files (.js) in the dist output directory. TypeScript
templates use the scriptFile parameter in function.json to indicate the location of the corresponding .js
file in the dist folder. The output location is set by the template by using outDir parameter in the
tsconfig.json file. If you change this setting or the name of the folder, the runtime is not able to find the
code to run.
NOTE
Experimental support for TypeScript exists for version 1.x of the Functions runtime. The experimental version transpiles
TypeScript files into JavaScript files when the function is invoked. In version 2.x, this experimental support has been
superseded by the tool-driven method that does transpilation before the host is initialized and during the
deployment process.
The way that you locally develop and deploy from a TypeScript project depends on your development tool.
Visual Studio Code
The Azure Functions for Visual Studio Code extension lets you develop your functions using TypeScript.
The Core Tools is a requirement of the Azure Functions extension.
To create a TypeScript function app in Visual Studio Code, you simply choose TypeScript when you create
a function app and are asked to choose the language.
When you press F5 to run the app locally, transpilation is done before the host (func.exe) is initialized.
When you deploy your function app to Azure using the Deploy to function app... button, the Azure
Functions extension first generates a production-ready build of JavaScript files from the TypeScript source
files.
Azure Functions Core Tools
To create a TypeScript function app project using Core Tools, you must specify the typescript language
option when you create your function app. You can do this in one of the following ways:
Run the func init command, select node as your language stack, and then select typescript .
Run the func init --worker-runtime typescript command.
To run your function app code locally using Core Tools, use the npm start command instead of
func host start . The npm start command is equivalent to the following commands:
npm run build
func extensions install
tsc
func start
Before you use the func azure functionapp publish command to deploy to Azure, you must first run the
npm run build:production command. This command creates a production-ready build of JavaScript files from
the TypeScript source files that can be deployed using func azure functionapp publish .
Next steps
For more information, see the following resources:
Best practices for Azure Functions
Azure Functions developer reference
Azure Functions triggers and bindings
Azure Functions Java developer guide
5/9/2019 • 9 minutes to read • Edit Online
Programming model
The concepts of triggers and bindings are fundamental to Azure Functions. Triggers start the execution of
your code. Bindings give you a way to pass data to and return data from a function, without having to write
custom data access code.
Folder structure
Here is the folder structure of an Azure Function Java project:
FunctionsProject
| - src
| | - main
| | | - java
| | | | - FunctionApp
| | | | | - MyFirstFunction.java
| | | | | - MySecondFunction.java
| - target
| | - azure-functions
| | | - FunctionApp
| | | | - FunctionApp.jar
| | | | - host.json
| | | | - MyFirstFunction
| | | | | - function.json
| | | | - MySecondFunction
| | | | | - function.json
| | | | - bin
| | | | - lib
| - pom.xml
There's a shared host.json file that can be used to configure the function app. Each function has its own code
file (.java) and binding configuration file (function.json).
You can put more than one function in a project. Avoid putting your functions into separate jars. The
FunctionApp in the target directory is what gets deployed to your function app in Azure.
IMPORTANT
You must configure an Azure Storage account in your local.settings.json to run Azure Storage Blob, Queue, or Table
triggers locally.
Example:
{
"scriptFile": "azure-functions-example.jar",
"entryPoint": "com.example.Function.echo",
"bindings": [
{
"type": "httpTrigger",
"name": "req",
"direction": "in",
"authLevel": "anonymous",
"methods": [ "post" ]
},
{
"type": "http",
"name": "$return",
"direction": "out"
}
]
}
Customize JVM
Functions lets you customize the Java virtual machine (JVM ) used to execute your Java functions. The
following JVM options are used by default:
-XX:+TieredCompilation
-XX:TieredStopAtLevel=1
-noverify
-Djava.net.preferIPv4Stack=true
-jar
You can provide additional arguments in an app setting named JAVA_OPTS . You can add app settings to your
function app deployed to Azure in one of these ways:
Azure portal
In the Azure portal, use the Application Settings tab to add the JAVA_OPTS setting.
Azure CLI
The az functionapp config appsettings set command can be used to set JAVA_OPTS , as in the following
example:
```azurecli-interactive
az functionapp config appsettings set --name <APP_NAME> \
--resource-group <RESOURCE_GROUP> \
--settings "JAVA_OPTS=-Djava.awt.headless=true"
```
This example enables headless mode. Replace <APP_NAME> with the name of your function app and
<RESOURCE_GROUP> with the resource group.
WARNING
When running in a Consumption plan, you must add the WEBSITE_USE_PLACEHOLDER setting with a value of 0 .
This setting does increase the cold start times for Java functions.
Third-party libraries
Azure Functions supports the use of third-party libraries. By default, all dependencies specified in your project
pom.xml file will be automatically bundled during the mvn package goal. For libraries not specified as
dependencies in the pom.xml file, place them in a lib directory in the function's root directory. Dependencies
placed in the lib directory will be added to the system class loader at runtime.
The com.microsoft.azure.functions:azure-functions-java-library dependency is provided on the classpath by
default, and does not need to be included in the lib directory. Also, dependencies listed here are added to
the classpath by azure-functions-java-worker.
Bindings
Input and output bindings provide a declarative way to connect to data from within your code. A function can
have multiple input and output bindings.
Example Input binding
package com.example;
import com.microsoft.azure.functions.annotation.*;
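import com.microsoft.azure.functions.*;

// An illustrative sketch: an HTTP-triggered function with a Table storage
// input binding; the table, key, and query names are placeholders.
public class Function {
    @FunctionName("echo")
    public static String echo(
        @HttpTrigger(name = "req", methods = {HttpMethod.GET},
                     authLevel = AuthorizationLevel.ANONYMOUS) String req,
        @TableInput(name = "item", tableName = "items", partitionKey = "Example",
                    rowKey = "{Query.id}", connection = "AzureWebJobsStorage") String item,
        final ExecutionContext context) {
        context.getLogger().info("Retrieved item: " + item);
        return "Hello, " + req;
    }
}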
To receive a batch of inputs, you can bind to String[] , POJO[] , List<String> , or List<POJO> .
@FunctionName("ProcessIotMessages")
public void processIotMessages(
@EventHubTrigger(name = "message", eventHubName = "%AzureWebJobsEventHubPath%", connection =
"AzureWebJobsEventHubSender", cardinality = Cardinality.MANY) List<TestEventData> messages,
final ExecutionContext context)
{
context.getLogger().info("Java Event Hub trigger received messages. Batch size: " +
messages.size());
}
This function gets triggered whenever there is new data in the configured event hub. As the cardinality is
set to MANY , the function receives a batch of messages from the event hub. EventData from the event hub gets converted
to TestEventData for the function execution.
Example Output binding
You can bind an output binding to the return value using $return
package com.example;
import com.microsoft.azure.functions.annotation.*;
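import com.microsoft.azure.functions.*;

// An illustrative sketch: the return value is written to the blob output
// binding named "$return"; the paths are placeholders.
public class Function {
    @FunctionName("copy")
    @StorageAccount("AzureWebJobsStorage")
    @BlobOutput(name = "$return", path = "samples-output-java/{name}")
    public static String copy(
        @BlobTrigger(name = "blob", path = "samples-input-java/{name}") String content) {
        return content;
    }
}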
If there are multiple output bindings, use the return value for only one of them.
To send multiple output values, use OutputBinding<T> defined in the azure-functions-java-library package.
@FunctionName("QueueOutputPOJOList")
public HttpResponseMessage QueueOutputPOJOList(@HttpTrigger(name = "req", methods = { HttpMethod.GET,
HttpMethod.POST }, authLevel = AuthorizationLevel.ANONYMOUS)
HttpRequestMessage<Optional<String>> request,
@QueueOutput(name = "itemsOut", queueName = "test-output-java-pojo", connection =
"AzureWebJobsStorage") OutputBinding<List<TestData>> itemsOut,
final ExecutionContext context) {
context.getLogger().info("Java HTTP trigger processed a request.");
itemsOut.getValue().add(testData1);
itemsOut.getValue().add(testData2);
This function is invoked on an HttpRequest and writes multiple values to the Azure Queue.
Metadata
Some triggers send trigger metadata along with input data. You can use the @BindingName annotation to bind
to trigger metadata:
package com.example;
import java.util.Optional;
import com.microsoft.azure.functions.annotation.*;
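import com.microsoft.azure.functions.*;

// An illustrative sketch: @BindingName("name") binds the "name" query string
// parameter from the trigger metadata.
public class Function {
    @FunctionName("metadata")
    public static String metadata(
        @HttpTrigger(name = "req", methods = {HttpMethod.GET, HttpMethod.POST},
                     authLevel = AuthorizationLevel.ANONYMOUS) Optional<String> body,
        @BindingName("name") String queryValue) {
        return body.orElse(queryValue);
    }
}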
In the example above, queryValue is bound to the query string parameter name in the HTTP request URL,
http://{example.host}/api/metadata?name=test . The following is another example, binding Id from queue
trigger metadata:
@FunctionName("QueueTriggerMetadata")
public void QueueTriggerMetadata(
@QueueTrigger(name = "message", queueName = "test-input-java-metadata", connection =
"AzureWebJobsStorage") String message,@BindingName("Id") String metadataId,
@QueueOutput(name = "output", queueName = "test-output-java-metadata", connection =
"AzureWebJobsStorage") OutputBinding<TestData> output,
final ExecutionContext context
) {
context.getLogger().info("Java Queue trigger function processed a message: " + message + " with
metadataId:" + metadataId );
TestData testData = new TestData();
testData.id = metadataId;
output.setValue(testData);
}
NOTE
The name provided in the annotation needs to match the metadata property name.
Execution context
ExecutionContext defined in the azure-functions-java-library contains helper methods to communicate with
the functions runtime.
Logger
Use getLogger defined in ExecutionContext to write logs from function code.
Example:
import com.microsoft.azure.functions.*;
import com.microsoft.azure.functions.annotation.*;
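// An illustrative sketch:
public class Function {
    @FunctionName("Log")
    public static void log(
        @HttpTrigger(name = "req", methods = {HttpMethod.GET},
                     authLevel = AuthorizationLevel.ANONYMOUS) String req,
        final ExecutionContext context) {
        context.getLogger().info("Java HTTP trigger processed a request.");
    }
}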
To stream logging output for your Function app using the Azure CLI, open a new command prompt, Bash, or
Terminal session and enter the following command:
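az webapp log tail --resource-group <RESOURCE_GROUP> --name <APP_NAME>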
The az webapp log tail command has options to filter output using the --provider option.
To download the log files as a single ZIP file using the Azure CLI, open a new command prompt, Bash, or
Terminal session and enter the following command:
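az webapp log download --resource-group <RESOURCE_GROUP> --name <APP_NAME>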
You must have enabled file system logging in the Azure portal or Azure CLI before running this command.
Environment variables
In Functions, app settings, such as service connection strings, are exposed as environment variables during
execution. You can access these settings using System.getenv("AzureWebJobsStorage") .
Example:
Add AppSetting with name testAppSetting and value testAppSettingValue
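An illustrative sketch that reads the setting added above (the function and class names are placeholders):
import com.microsoft.azure.functions.*;
import com.microsoft.azure.functions.annotation.*;

public class Function {
    @FunctionName("readSetting")
    public static String readSetting(
        @HttpTrigger(name = "req", methods = {HttpMethod.GET},
                     authLevel = AuthorizationLevel.ANONYMOUS) String req,
        final ExecutionContext context) {
        context.getLogger().info("testAppSetting: " + System.getenv("testAppSetting"));
        return System.getenv("testAppSetting");
    }
}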
Azure Functions PowerShell developer guide
This article provides details about how you write Azure Functions using PowerShell.
NOTE
PowerShell for Azure Functions is currently in preview. To receive important updates, subscribe to the Azure App Service
announcements repository on GitHub.
A PowerShell Azure function (function) is represented as a PowerShell script that executes when triggered. Each
function script has a related function.json file that defines how the function behaves, such as how it's triggered
and its input and output parameters. To learn more, see the Triggers and bindings article.
Like other kinds of functions, PowerShell script functions take in parameters that match the names of all the input
bindings defined in the function.json file. A TriggerMetadata parameter is also passed that contains additional
information on the trigger that started the function.
This article assumes that you have already read the Azure Functions developer reference. You should have also
completed the Functions quickstart for PowerShell to create your first PowerShell function.
Folder structure
The required folder structure for a PowerShell project looks like the following. This default can be changed. For
more information, see the scriptFile section below.
PSFunctionApp
| - MyFirstFunction
| | - run.ps1
| | - function.json
| - MySecondFunction
| | - run.ps1
| | - function.json
| - Modules
| | - myFirstHelperModule
| | | - myFirstHelperModule.psd1
| | | - myFirstHelperModule.psm1
| | - mySecondHelperModule
| | | - mySecondHelperModule.psd1
| | | - mySecondHelperModule.psm1
| - local.settings.json
| - host.json
| - requirements.psd1
| - profile.ps1
| - extensions.csproj
| - bin
At the root of the project, there's a shared host.json file that can be used to configure the function app. Each
function has a folder with its own code file (.ps1) and binding configuration file ( function.json ). The name of the
function.json file's parent directory is always the name of your function.
Certain bindings require the presence of an extensions.csproj file. Binding extensions, required in version 2.x of
the Functions runtime, are defined in the extensions.csproj file, with the actual library files in the bin folder.
When developing locally, you must register binding extensions. When developing functions in the Azure portal,
this registration is done for you.
In PowerShell Function Apps, you may optionally have a profile.ps1 , which runs when a function app starts to
run (otherwise known as a cold start). For more information, see PowerShell profile.
# $TriggerMetadata is optional here. If you don't need it, you can safely remove it from the param block
param($MyFirstInputBinding, $MySecondInputBinding, $TriggerMetadata)
TriggerMetadata parameter
The TriggerMetadata parameter is used to supply additional information about the trigger. The additional
metadata varies from binding to binding but they all contain a sys property that contains the following data:
$TriggerMetadata.sys

PROPERTY     DESCRIPTION                                        TYPE
UtcNow       When, in UTC, the trigger occurred                 DateTime
MethodName   The name of the function that was triggered        string
RandGuid     A unique GUID for this execution of the function   string
Every trigger type has a different set of metadata. For example, the $TriggerMetadata for QueueTrigger contains
the InsertionTime , Id , DequeueCount , among other things. For more information on the queue trigger's
metadata, go to the official documentation for queue triggers. Check the documentation on the triggers you're
working with to see what comes inside the trigger metadata.
Bindings
In PowerShell, bindings are configured and defined in a function's function.json. Functions interact with bindings a
number of ways.
Reading trigger and input data
Trigger and input bindings are read as parameters passed to your function. Input bindings have a direction set to
in in function.json. The name property defined in function.json is the name of the parameter, in the param
block. Since PowerShell uses named parameters for binding, the order of the parameters doesn't matter. However,
it's a best practice to follow the order of the bindings defined in the function.json .
param($MyFirstInputBinding, $MySecondInputBinding)
Writing output data
Output bindings have a direction set to out in function.json. You write to an output binding by using the
Push-OutputBinding cmdlet, where the -Name argument matches the name property defined in function.json:

param($MyFirstInputBinding, $MySecondInputBinding)
Push-OutputBinding -Name myQueue -Value $myValue

You can also pass in a value for a specific binding through the pipeline.

param($MyFirstInputBinding, $MySecondInputBinding)
$myValue | Push-OutputBinding -Name myQueue
Because the output is to HTTP, which accepts a singleton value only, an error is thrown when Push-OutputBinding
is called a second time.
For outputs that only accept singleton values, you can use the -Clobber parameter to override the old value
instead of trying to add to a collection. The following example assumes that you have already added a value. By
using -Clobber , the response from the following example overrides the existing value to return a value of "output
#3":
The output binding for a Storage queue accepts multiple output values. In this case, calling the following example
after the first writes to the queue a list with two items: "output #1" and "output #2".
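# An illustrative sketch; "myQueue" is the queue output binding's name.
Push-OutputBinding -Name myQueue -Value "output #2"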
The following example, when called after the previous two, adds two more values to the output collection:
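# An illustrative sketch: passing an array adds multiple values in one call.
Push-OutputBinding -Name myQueue -Value @("output #3", "output #4")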
When written to the queue, the message contains these four values: "output #1", "output #2", "output #3", and
"output #4".
Get-OutputBinding cmdlet
You can use the Get-OutputBinding cmdlet to retrieve the values currently set for your output bindings. This
cmdlet retrieves a hashtable that contains the names of the output bindings with their respective values.
The following is an example of using Get-OutputBinding to return current binding values:
Get-OutputBinding
Name Value
---- -----
MyQueue myData
MyOtherQueue myData
Get-OutputBinding also contains a parameter called -Name , which can be used to filter the returned binding, as in
the following example:
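# An illustrative sketch; the binding name is a placeholder.
Get-OutputBinding -Name MyQueue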
Name Value
---- -----
MyQueue myData
Logging
Logging in PowerShell functions works like regular PowerShell logging. You can use the logging cmdlets to write
to each output stream. Each cmdlet maps to a log level used by Functions.
LEVEL     CMDLET
Error     Write-Error
Warning   Write-Warning
Debug     Write-Debug
Trace     Write-Progress, Write-Verbose
In addition to these cmdlets, anything written to the pipeline is redirected to the Information log level and
displayed with the default PowerShell formatting.
IMPORTANT
Using the Write-Verbose or Write-Debug cmdlets is not enough to see verbose and debug level logging. You must also
configure the log level threshold, which declares what level of logs you actually care about. To learn more, see Configure the
function app log level.
{
"logging": {
"logLevel": {
"Function.MyFunction": "Debug",
"default": "Trace"
}
}
}
For example, an HTTP-triggered run.ps1 reads a query parameter from the request object passed as its first
binding parameter:
run.ps1
param($req, $TriggerMetadata)
$name = $req.Query.Name
A blob-triggered function receives its input the same way:
param([string] $myBlob)
PowerShell profile
In PowerShell, there's the concept of a PowerShell profile. If you're not familiar with PowerShell profiles, see
About profiles.
In PowerShell Functions, the profile script executes when the function app starts. Function apps start when first
deployed and after being idled (cold start).
When you create a function app using tools, such as Visual Studio Code and Azure Functions Core Tools, a default
profile.ps1 is created for you. The default profile is maintained on the Core Tools GitHub repository and
contains:
Automatic MSI authentication to Azure.
The ability to turn on the Azure PowerShell AzureRM PowerShell aliases if you would like.
PowerShell version
The following table shows the PowerShell version used by each major version of the Functions runtime:
FUNCTIONS VERSION    POWERSHELL VERSION
2.x                  PowerShell Core 6
You can see the current version by printing $PSVersionTable from any function.
Dependency management
PowerShell functions support managing Azure modules by the service. By modifying the host.json and setting the
managedDependency enabled property to true, the requirements.psd1 file will be processed. The latest Azure
modules will be automatically downloaded and made available to the function.
host.json
{
"managedDependency": {
"enabled": true
}
}
requirements.psd1
@{
Az = '1.*'
}
Leveraging your own custom modules or modules from the PowerShell Gallery is a little different than how you
would do it normally.
When you install the module on your local machine, it goes in one of the globally available folders in your
$env:PSModulePath . Since your function runs in Azure, you won't have access to the modules installed on your
machine. This requires that the $env:PSModulePath for a PowerShell function app differs from $env:PSModulePath
in a regular PowerShell script.
In Functions, PSModulePath contains two paths:
A Modules folder that exists at the root of your Function App.
A path to a Modules folder that lives inside the PowerShell language worker.
Function app-level Modules folder
To use custom modules or PowerShell modules from the PowerShell Gallery, you can place modules on which
your functions depend in a Modules folder. From this folder, modules are automatically available to the functions
runtime. Any function in the function app can use these modules.
To take advantage of this feature, create a Modules folder in the root of your function app. Save the modules you
want to use in your functions in this location.
mkdir ./Modules
Save-Module MyGalleryModule -Path ./Modules
Use Save-Module to save all of the modules your functions use, or copy your own custom modules to the Modules
folder. With a Modules folder, your function app should have the following folder structure:
PSFunctionApp
| - MyFunction
| | - run.ps1
| | - function.json
| - Modules
| | - MyGalleryModule
| | - MyOtherGalleryModule
| | - MyCustomModule.psm1
| - local.settings.json
| - host.json
When you start your function app, the PowerShell language worker adds this Modules folder to the
$env:PSModulePath so that you can rely on module autoloading just as you would in a regular PowerShell script.
Environment variables
In Functions, app settings, such as service connection strings, are exposed as environment variables during
execution. You can access these settings using $env:NAME_OF_ENV_VAR , as shown in the following example:
param($myTimer)
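# An illustrative body for the timer function above:
Write-Host "PowerShell timer trigger function ran at $(Get-Date)"
Write-Host $env:WEBSITE_SITE_NAME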
There are several ways that you can add, update, and delete function app settings:
In the Azure portal.
By using the Azure CLI.
When running locally, app settings are read from the local.settings.json project file.
Concurrency
By default, the Functions PowerShell runtime can only process one invocation of a function at a time. However,
this concurrency level might not be sufficient in the following situations:
When you're trying to handle a large number of invocations at the same time.
When you have functions that invoke other functions inside the same function app.
You can change this behavior by setting the following environment variable to an integer value:
PSWorkerInProcConcurrencyUpperBound
You set this environment variable in the app settings of your Function App.
Considerations for using concurrency
PowerShell is a single threaded scripting language by default. However, concurrency can be added by using
multiple PowerShell runspaces in the same process. This feature is how the Azure Functions PowerShell runtime
works.
There are some drawbacks with this approach.
Concurrency is only as good as the machine it's running on
If your function app is running on an App Service plan that only supports a single core, then concurrency won't
help much. That's because there are no additional cores to help balance the load. In this case, performance can
vary when the single core has to context-switch between runspaces.
The Consumption plan runs using only one core, so you can't leverage concurrency. If you want to take full
advantage of concurrency, instead deploy your functions to a function app running on a dedicated App Service
plan with sufficient cores.
Azure PowerShell state
Azure PowerShell uses some process-level contexts and state to help save you from excess typing. However, if you
turn on concurrency in your function app and invoke actions that change state, you could end up with race
conditions. These race conditions are difficult to debug because one invocation relies on a certain state and the
other invocation changed the state.
There's immense value in concurrency with Azure PowerShell, since some operations can take a considerable
amount of time. However, you must proceed with caution. If you suspect that you're experiencing a race condition,
set the concurrency back to 1 and try the request again.
Configure function scriptFile
By default, a PowerShell function is executed from run.ps1 , a file that shares the same parent directory as
its corresponding function.json . The scriptFile property can be used to get a folder structure that looks
like the following example:
FunctionApp
| - host.json
| - myFunction
| | - function.json
| - lib
| | - PSFunction.ps1
In this case, the function.json for myFunction includes a scriptFile property referencing the file with the
exported function to run.
{
"scriptFile": "../lib/PSFunction.ps1",
"bindings": [
// ...
]
}
A PowerShell function can also be defined as a function in a PowerShell module, by configuring scriptFile
together with entryPoint :
FunctionApp
| - host.json
| - myFunction
| | - function.json
| - lib
| | - PSFunction.psm1
function Invoke-PSTestFunc {
    param($InputBinding, $TriggerMetadata)
    # Do things
}
In this example, the configuration for myFunction includes a scriptFile property that references
PSFunction.psm1 , which is a PowerShell module in another folder. The entryPoint property references the
Invoke-PSTestFunc function, which is the entry point in the module.
{
"scriptFile": "../lib/PSFunction.psm1",
"entryPoint": "Invoke-PSTestFunc",
"bindings": [
// ...
]
}
With this configuration, the Invoke-PSTestFunc gets executed exactly as a run.ps1 would.
Your script is run on every invocation. Avoid using Install-Module in your script. Instead use Save-Module before
publishing so that your function doesn't have to waste time downloading the module. If cold starts are impacting
your functions, consider deploying your function app to an App Service plan set to always on or to a Premium
plan.
Next steps
For more information, see the following resources:
Best practices for Azure Functions
Azure Functions developer reference
Azure Functions triggers and bindings
Azure Functions Python developer guide
4/26/2019 • 8 minutes to read • Edit Online
This article is an introduction to developing Azure Functions using Python. The content below assumes that
you've already read the Azure Functions developers guide.
NOTE
Python for Azure Functions is currently in preview. To receive important updates, subscribe to the Azure App Service
announcements repository on GitHub.
Programming model
An Azure Function should be a stateless method in your Python script that processes input and produces
output. By default, the runtime expects the method to be implemented as a global method called main() in
the __init__.py file.
You can change the default configuration by specifying the scriptFile and entryPoint properties in the
function.json file. For example, the function.json below tells the runtime to use the customentry() method in
the main.py file as the entry point for your Azure Function.
{
"scriptFile": "main.py",
"entryPoint": "customentry",
...
}
Data from triggers and bindings is bound to the function via method attributes using the name property
defined in the function.json configuration file. For example, the function.json below describes a simple
function triggered by an HTTP request named req :
{
"bindings": [
{
"name": "req",
"direction": "in",
"type": "httpTrigger",
"authLevel": "anonymous"
},
{
"name": "$return",
"direction": "out",
"type": "http"
}
]
}
Optionally, you can also declare the parameter types and return type in the function using Python type
annotations. For example, the same function can be written using annotations, as follows:
import azure.functions
Use the Python annotations included in the azure.functions.* package to bind input and outputs to your
methods.
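As a minimal sketch, an annotated version of the HTTP function above might look like the following (the greeting body is illustrative, not part of the original sample):
import azure.functions as func

def main(req: func.HttpRequest) -> str:
    # Type annotations let the worker validate the trigger and binding types.
    name = req.params.get('name', 'world')
    return f'Hello, {name}!'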
Folder structure
The folder structure for a Python Functions project looks like the following:
FunctionApp
| - MyFirstFunction
| | - __init__.py
| | - function.json
| - MySecondFunction
| | - __init__.py
| | - function.json
| - SharedCode
| | - myFirstHelperFunction.py
| | - mySecondHelperFunction.py
| - host.json
| - requirements.txt
| - extensions.csproj
| - bin
There's a shared host.json file that can be used to configure the function app. Each function has its own code
file and binding configuration file (function.json).
Shared code should be kept in a separate folder. To reference modules in the SharedCode folder, you can use
the following syntax:
from . import helpers # Use more dots to navigate up the folder structure.
def main(req: func.HttpRequest):
    helpers.process_http_request(req)
Alternatively, put shared code into a standalone package, publish it to a public or a private PyPI instance, and
specify it as a regular dependency.
Binding extensions used by the Functions runtime are defined in the extensions.csproj file, with the actual
library files in the bin folder. When developing locally, you must register binding extensions using Azure
Functions Core Tools.
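For example, with Core Tools installed, the extensions declared in extensions.csproj can be installed from the project root (this command also appears in the CI examples later in this article):
func extensions install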
When deploying a Functions project to your function app in Azure, the entire content of the FunctionApp
folder should be included in the package, but not the folder itself.
Triggers and Inputs
The following example shows the local settings and code for an HTTP-triggered function with an Azure Blob storage input binding:
// local.settings.json
{
"IsEncrypted": false,
"Values": {
"FUNCTIONS_WORKER_RUNTIME": "python",
"AzureWebJobsStorage": "<azure-storage-connection-string>"
}
}
# __init__.py
import azure.functions as func
import logging

def main(req: func.HttpRequest, obj: func.InputStream):
    logging.info(f'Blob contents: {obj.read()}')  # function body is illustrative
When the function is invoked, the HTTP request is passed to the function as req . An entry will be retrieved
from the Azure Blob Storage based on the ID in the route URL and made available as obj in the function
body. Here the storage account specified is the connection string found in AzureWebJobsStorage which is the
same storage account used by the function app.
Outputs
Output can be expressed both in the return value and in output parameters. If there's only one output, we
recommend using the return value. For multiple outputs, you'll have to use output parameters.
To use the return value of a function as the value of an output binding, the name property of the binding
should be set to $return in function.json .
To produce multiple outputs, use the set() method provided by the azure.functions.Out interface to assign
a value to the binding. For example, the following function can push a message to a queue and also return an
HTTP response.
{
"scriptFile": "__init__.py",
"bindings": [
{
"name": "req",
"direction": "in",
"type": "httpTrigger",
"authLevel": "anonymous"
},
{
"name": "msg",
"direction": "out",
"type": "queue",
"queueName": "outqueue",
"connection": "AzureWebJobsStorage"
},
{
"name": "$return",
"direction": "out",
"type": "http"
}
]
}
import azure.functions as func

def main(req: func.HttpRequest, msg: func.Out[str]) -> str:
    message = req.params.get('body')
    msg.set(message)
    return message
Logging
Access to the Azure Functions runtime logger is available via a root logging handler in your function app.
This logger is tied to Application Insights and allows you to flag warnings and errors encountered during the
function execution.
The following example logs an info message when the function is invoked via an HTTP trigger.
import logging
def main(req):
logging.info('Python HTTP trigger function processed a request.')
Additional logging methods are available that let you write to the console at different trace levels:
METHOD                      DESCRIPTION
logging.critical(message)   Writes a message with level CRITICAL on the root logger.
logging.error(message)      Writes a message with level ERROR on the root logger.
logging.warning(message)    Writes a message with level WARNING on the root logger.
logging.info(message)       Writes a message with level INFO on the root logger.
logging.debug(message)      Writes a message with level DEBUG on the root logger.
Async
Since only a single Python process can exist per function app, it is recommended to implement your Azure
Function as an asynchronous coroutine using the async def statement.
If the main() function is synchronous (no async qualifier), the runtime automatically runs it in an asyncio thread pool.
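A minimal sketch of an async HTTP handler (the response body is illustrative):
import azure.functions as func

async def main(req: func.HttpRequest) -> func.HttpResponse:
    # Coroutines let the single worker process interleave invocations.
    return func.HttpResponse('Hello from an async handler')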
Context
To get the invocation context of a function during execution, include the context argument in its signature.
For example:
import azure.functions as func

def main(req: func.HttpRequest, context: func.Context) -> str:
    # The context object exposes, for example, the invocation ID (body illustrative).
    return f'Invocation ID: {context.invocation_id}'
Python version and package management
When you're ready to publish, make sure that all your dependencies are listed in the requirements.txt
file, located at the root of your project directory, for example:
requests==2.19.1
To successfully execute your Azure Functions, the
requirements file should contain a minimum of the following packages:
azure-functions
azure-functions-worker
grpcio==1.14.1
grpcio-tools==1.14.1
protobuf==3.6.1
six==1.11.0
Publishing to Azure
If you're using a package that requires a compiler and does not support the installation of manylinux-
compatible wheels from PyPI, publishing to Azure will fail with the following error:
There was an error restoring dependencies.ERROR: cannot install <package name - version> dependency:
binary dependencies without wheels are not supported.
The terminal process terminated with exit code: 1
To automatically build and configure the required binaries, install Docker on your local machine and run the
following command to publish using the Azure Functions Core Tools (func). Remember to replace
<app name> with the name of your function app in Azure.
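func azure functionapp publish <app name> --build-native-deps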
Under the covers, Core Tools uses Docker to run the mcr.microsoft.com/azure-functions/python
image as a container on your local machine. Using this environment, it then builds and installs the required
modules from source distribution before packaging them up for final deployment to Azure.
NOTE
Core Tools (func) uses the PyInstaller program to freeze the user's code and dependencies into a single stand-alone
executable to run in Azure. This functionality is currently in preview and may not extend to all types of Python
packages. If you're unable to import your modules, try publishing again using the --no-bundler option.
If you continue to experience issues, please let us know by opening an issue and including a description of the
problem.
To build your dependencies and publish using a continuous integration (CI) and continuous delivery (CD)
system, you can use an Azure Pipelines or Travis CI custom script.
Following is an example azure-pipelines.yml script for the build and publishing process.
pool:
vmImage: 'Ubuntu 16.04'
steps:
- task: NodeTool@0
inputs:
versionSpec: '8.x'
- script: |
set -e
echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ wheezy main" | sudo tee
/etc/apt/sources.list.d/azure-cli.list
curl -L https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
sudo apt-get install -y apt-transport-https
echo "install Azure CLI..."
sudo apt-get update && sudo apt-get install -y azure-cli
npm i -g azure-functions-core-tools --unsafe-perm true
echo "installing dotnet core"
curl -sSL https://dot.net/v1/dotnet-install.sh | bash /dev/stdin --channel 2.0
- script: |
set -e
az login --service-principal --username "$(APP_ID)" --password "$(PASSWORD)" --tenant "$(TENANT_ID)"
func settings add FUNCTIONS_WORKER_RUNTIME python
func extensions install
func azure functionapp publish $(APP_NAME) --build-native-deps
Following is an example .travis.yml script for the build and publishing process.
sudo: required
language: node_js
node_js:
- "8"
services:
- docker
before_install:
- echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ wheezy main" | sudo tee
/etc/apt/sources.list.d/azure-cli.list
- curl -L https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
- sudo apt-get install -y apt-transport-https
- sudo apt-get update && sudo apt-get install -y azure-cli
- npm i -g azure-functions-core-tools --unsafe-perm true
script:
- az login --service-principal --username "$APP_ID" --password "$PASSWORD" --tenant "$TENANT_ID"
- az account get-access-token --query "accessToken" | func azure functionapp publish $APP_NAME --build-native-deps
Next steps
For more information, see the following resources:
Best practices for Azure Functions
Azure Functions triggers and bindings
Blob storage bindings
HTTP and Webhook bindings
Queue storage bindings
Timer trigger
Azure App Service diagnostics overview
5/10/2019 • 4 minutes to read • Edit Online
When you’re running a web application, you want to be prepared for any issues that may arise, from 500 errors to
your users telling you that your site is down. App Service diagnostics is an intelligent and interactive experience to
help you troubleshoot your app with no configuration required. When you do run into issues with your app, App
Service diagnostics points out what’s wrong to guide you to the right information to more easily and quickly
troubleshoot and resolve the issue.
Although this experience is most helpful when you’re having issues with your app within the last 24 hours, all the
diagnostic graphs are always available for you to analyze.
App Service diagnostics works for not only your app on Windows, but also apps on Linux/containers, App Service
Environment, and Azure Functions.
When you open App Service diagnostics, problem categories are shown as tiles. After clicking one of these tiles, you can see a list of topics related to the issue described in the tile. These topics provide
snippets of notable information from the full report. You can click on any of these topics to investigate the issue
further. Also, you can click View Full Report to explore all the topics on a single page.
Diagnostic report
After you choose to investigate an issue further by clicking on a topic, you can view more details about the topic,
often supplemented with graphs and markdown. The diagnostic report can be a powerful tool for pinpointing the
problem with your app.
Health checkup
If you don't know what’s wrong with your app or don’t know where to start troubleshooting your issues, the health
checkup is a good place to start. The health checkup analyzes your applications to give you a quick, interactive
overview that points out what’s healthy and what’s wrong, telling you where to look to investigate the issue. Its
intelligent and interactive interface provides you with guidance through the troubleshooting process. Health
checkup is integrated with the Genie experience for Windows apps and web app down diagnostic report for Linux
apps.
Health checkup graphs
There are four different graphs in the health checkup.
requests and errors: A graph that shows the number of requests made over the last 24 hours along with HTTP
server errors.
app performance: A graph that shows response time over the last 24 hours for various percentile groups.
CPU usage: A graph that shows the overall percent CPU usage per instance over the last 24 hours.
memory usage: A graph that shows the overall percent physical memory usage per instance over the last 24
hours.
Investigate application code issues (only for Windows app)
Because many app issues are related to issues in your application code, App Service diagnostics integrates with
Application Insights to highlight exceptions and dependency issues to correlate with the selected downtime.
Application Insights has to be enabled separately.
To view Application Insights exceptions and dependencies, select the web app down or web app slow tile
shortcuts.
Troubleshooting steps (only for Windows app)
If an issue is detected with a specific problem category within the last 24 hours, you can view the full diagnostic
report, and App Service diagnostics may prompt you to view more troubleshooting advice and next steps for a
more guided experience.
Diagnostic tools (only for Windows app)
Diagnostics Tools include more advanced diagnostic tools that help you investigate application code issues,
slowness, connection strings, and more. and proactive tools that help you mitigate issues with CPU usage, requests,
and memory.
Proactive CPU monitoring
Proactive CPU monitoring provides you an easy, proactive way to take an action when your app or child process
for your app is consuming high CPU resources. You can set your own CPU threshold rules to temporarily mitigate
a high CPU issue until the real cause for the unexpected issue is found.
Proactive auto-healing
Like proactive CPU monitoring, proactive auto-healing offers an easy, proactive approach to mitigating unexpected
behavior of your app. You can set your own rules based on request count, slow request, memory limit, and HTTP
status code to trigger mitigation actions. This tool can be used to temporarily mitigate an unexpected behavior until
the real cause for the issue is found. For more information on proactive auto-healing, visit Announcing the new
auto healing experience in app service diagnostics.
Change analysis
In a fast-paced development environment, it can be difficult to keep track of all the changes made to
your app, let alone pinpoint the change that caused unhealthy behavior. Change analysis helps you
narrow down the changes made to your app to facilitate the troubleshooting experience. Change analysis is
embedded in diagnostic reports such as Application Crashes, so you can use it concurrently with other metrics.
Change analysis has to be enabled before you use the feature. For more information on change analysis, visit
Announcing the new change analysis experience in App Service Diagnostics.
Optimize the performance and reliability of Azure
Functions
3/13/2019 • 7 minutes to read • Edit Online
This article provides guidance to improve the performance and reliability of your serverless function apps.
#r "..\Shared\MyAssembly.dll".
Otherwise, it is easy to accidentally deploy multiple test versions of the same binary that behave differently
between functions.
Don't use verbose logging in production code. It has a negative performance impact.
Use async code but avoid blocking calls
Asynchronous programming is a recommended best practice. However, always avoid referencing the Result
property or calling Wait method on a Task instance. This approach can lead to thread exhaustion.
TIP
If you plan to use the HTTP or WebHook bindings, plan to avoid port exhaustion that can be caused by improper
instantiation of HttpClient . For more information, see How to manage connections in Azure Functions.
For example, the following host.json settings control HTTP concurrency and throttling:
{
"http": {
"routePrefix": "api",
"maxOutstandingRequests": 200,
"maxConcurrentRequests": 100,
"dynamicThrottlesEnabled": true
}
}
Other host configuration options can be found in the host configuration document.
Next steps
For more information, see the following resources:
How to manage connections in Azure Functions
Azure App Service best practices
Work with Azure Functions Proxies
3/15/2019 • 9 minutes to read • Edit Online
This article explains how to configure and work with Azure Functions Proxies. With this feature, you can specify
endpoints on your function app that are implemented by another resource. You can use these proxies to break a
large API into multiple function apps (as in a microservice architecture), while still presenting a single API surface
for clients.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
NOTE
Standard Functions billing applies to proxy executions. For more information, see Azure Functions pricing.
Create a proxy
This section shows you how to create a proxy in the Functions portal.
1. Open the Azure portal, and then go to your function app.
2. In the left pane, select New proxy.
3. Provide a name for your proxy.
4. Configure the endpoint that's exposed on this function app by specifying the route template and HTTP
methods. These parameters behave according to the rules for HTTP triggers.
5. Set the backend URL to another endpoint. This endpoint could be a function in another function app, or it
could be any other API. The value does not need to be static, and it can reference application settings and
parameters from the original client request.
6. Click Create.
Your proxy now exists as a new endpoint on your function app. From a client perspective, it is equivalent to an
HttpTrigger in Azure Functions. You can try out your new proxy by copying the Proxy URL and testing it with your
favorite HTTP client.
Use variables
The configuration for a proxy does not need to be static. You can condition it to use variables from the original
client request, the back-end response, or application settings.
Reference local functions
You can use localhost to reference a function inside the same function app directly, without a roundtrip proxy
request.
"backendurl": "https://localhost/api/httptriggerC#1" will reference a local HTTP triggered function at the route
/api/httptriggerC#1
NOTE
If your function uses function, admin or sys authorization levels, you will need to provide the code and clientId, as per the
original function URL. In this case the reference would look like:
"backendurl": "https://localhost/api/httptriggerC#1?code=<keyvalue>&clientId=<keyname>"
TIP
Use application settings for back-end hosts when you have multiple deployments or test environments. That way, you can
make sure that you are always talking to the right back-end for that environment.
Troubleshoot Proxies
You can enable debug logging by adding the flag "debug": true to any proxy in your proxies.json. Logs are
stored in D:\home\LogFiles\Application\Proxies\DetailedTrace and accessible through the advanced tools (Kudu).
Any HTTP responses will also contain a Proxy-Trace-Location header with a URL to access the log file.
You can debug a proxy from the client side by adding a Proxy-Trace-Enabled header set to true . This will also log
a trace to the file system, and return the trace URL as a header in the response.
Block proxy traces
For security reasons you may not want to allow anyone calling your service to generate a trace. They will not be
able to access the trace contents without your login credentials, but generating the trace consumes resources and
exposes that you are using Function Proxies.
Disable traces altogether by adding "debug":false to any particular proxy in your proxies.json .
Advanced configuration
The proxies that you configure are stored in a proxies.json file, which is located in the root of a function app
directory. You can manually edit this file and deploy it as part of your app when you use any of the deployment
methods that Functions supports.
TIP
If you have not set up one of the deployment methods, you can also work with the proxies.json file in the portal. Go to your
function app, select Platform features, and then select App Service Editor. By doing so, you can view the entire file
structure of your function app and then make changes.
Proxies.json is defined by a proxies object, which is composed of named proxies and their definitions. Optionally, if
your editor supports it, you can reference a JSON schema for code completion. An example file might look like the
following:
{
"$schema": "http://json.schemastore.org/proxies",
"proxies": {
"proxy1": {
"matchCondition": {
"methods": [ "GET" ],
"route": "/api/{test}"
},
"backendUri": "https://<AnotherApp>.azurewebsites.net/api/<FunctionName>"
}
}
}
Each proxy has a friendly name, such as proxy1 in the preceding example. The corresponding proxy definition
object is defined by the following properties:
matchCondition: Required--an object defining the requests that trigger the execution of this proxy. It contains
two properties that are shared with HTTP triggers:
methods: An array of the HTTP methods that the proxy responds to. If it is not specified, the proxy
responds to all HTTP methods on the route.
route: Required--defines the route template, controlling which request URLs your proxy responds to.
Unlike in HTTP triggers, there is no default value.
backendUri: The URL of the back-end resource to which the request should be proxied. This value can
reference application settings and parameters from the original client request. If this property is not included,
Azure Functions responds with an HTTP 200 OK.
requestOverrides: An object that defines transformations to the back-end request. See Define a
requestOverrides object.
responseOverrides: An object that defines transformations to the client response. See Define a
responseOverrides object.
NOTE
The route property in Azure Functions Proxies does not honor the routePrefix property of the Function App host
configuration. If you want to include a prefix such as /api , it must be included in the route property.
You can disable an individual proxy by adding "disabled": true to its definition in proxies.json, as in the following example:
{
"$schema": "http://json.schemastore.org/proxies",
"proxies": {
"Root": {
"disabled":true,
"matchCondition": {
"route": "/example"
},
"backendUri": "https://<AnotherApp>.azurewebsites.net/api/<FunctionName>"
}
}
}
Application Settings
The proxy behavior can be controlled by several app settings. They are all outlined in the Functions App Settings
reference
AZURE_FUNCTION_PROXY_DISABLE_LOCAL_CALL
AZURE_FUNCTION_PROXY_BACKEND_URL_DECODE_SLASHES
Reserved Characters (string formatting)
Proxies read all strings out of a JSON file, using \ as an escape symbol. Proxies also interpret curly braces. For
example:
CHARACTER   ESCAPED CHARACTER   EXAMPLE
\           \\                  example.com\\text.html --> example.com\text.html
The following example uses a requestOverrides object to set a header and an API key (from an app setting) on the back-end request:
{
"$schema": "http://json.schemastore.org/proxies",
"proxies": {
"proxy1": {
"matchCondition": {
"methods": [ "GET" ],
"route": "/api/{test}"
},
"backendUri": "https://<AnotherApp>.azurewebsites.net/api/<FunctionName>",
"requestOverrides": {
"backend.request.headers.Accept": "application/xml",
"backend.request.headers.x-functions-key": "%ANOTHERAPP_API_KEY%"
}
}
}
}
The following example uses a responseOverrides object to set the response body and content type directly:
{
"$schema": "http://json.schemastore.org/proxies",
"proxies": {
"proxy1": {
"matchCondition": {
"methods": [ "GET" ],
"route": "/api/{test}"
},
"responseOverrides": {
"response.body": "Hello, {test}",
"response.headers.Content-Type": "text/plain"
}
}
}
}
NOTE
In this example, the response body is set directly, so no backendUri property is needed. The example shows how you might
use Azure Functions Proxies for mocking APIs.
Azure Functions networking options
4/26/2019 • 5 minutes to read • Edit Online
This article describes the networking features available across the hosting options for Azure Functions. All of the
following networking options provide some ability to access resources without using internet routable addresses,
or restrict internet access to a function app.
The hosting models have different levels of network isolation available. Choosing the correct one will help you
meet your network isolation requirements.
You can host function apps in a couple of ways:
There's a set of plan options that run on a multitenant infrastructure, with various levels of virtual network
connectivity and scaling options:
The Consumption plan, which scales dynamically in response to load and offers minimal network
isolation options.
The Premium plan, which also scales dynamically, while offering more comprehensive network isolation.
The Azure App Service plan, which operates at a fixed scale and offers similar network isolation to the
Premium plan.
You can run functions in an App Service Environment. This method deploys your function into your virtual
network and offers full network control and isolation.
Inbound IP restrictions
You can use IP restrictions to define a priority-ordered list of IP addresses that are allowed/denied access to your
app. The list can include IPv4 and IPv6 addresses. When there's one or more entries, an implicit "deny all" exists at
the end of the list. IP restrictions work with all function-hosting options.
NOTE
To use the Azure portal editor, the portal must be able to directly access your running function app. Also, the device that
you're using to access the portal must have its IP whitelisted. With network restrictions in place, you can still access any
features on the Platform features tab.
Outbound IP restrictions
Outbound IP restrictions are only available for functions deployed to an App Service Environment. You can
configure outbound restrictions for the virtual network where your App Service Environment is deployed.
Virtual network integration
To learn more about using the preview version of virtual network integration, see Integrate a function app with an
Azure virtual network.
Hybrid Connections
Hybrid Connections is a feature of Azure Relay that you can use to access application resources in other networks.
It provides access from your app to an application endpoint. You can't use it to access your application. Hybrid
Connections is available to functions running in an App Service plan and an App Service Environment.
As used in Azure Functions, each hybrid connection correlates to a single TCP host and port combination. This
means that the hybrid connection's endpoint can be on any operating system and any application, as long as you're
accessing a TCP listening port. The Hybrid Connections feature does not know or care what the application
protocol is, or what you're accessing. It simply provides network access.
To learn more, see the App Service documentation for Hybrid Connections, which supports Functions in an App
Service plan.
There are many ways to access virtual network resources in other hosting options. But an App Service
Environment is the only way to allow triggers for a function to occur over a virtual network.
Next steps
To learn more about networking and Azure Functions:
Follow the tutorial about getting started with virtual network integration
Read the Functions networking FAQ
Learn more about virtual network integration with App Service/Functions
Learn more about virtual networks in Azure
Enable more networking features and control with App Service Environments
Connect to individual on-premises resources without firewall changes by using Hybrid Connections
IP addresses in Azure Functions
12/3/2018 • 4 minutes to read • Edit Online
This article explains the following topics related to IP addresses of function apps:
How to find the IP addresses currently in use by a function app.
What causes a function app's IP addresses to be changed.
How to restrict the IP addresses that can access a function app.
How to get dedicated IP addresses for a function app.
IP addresses are associated with function apps, not with individual functions. Incoming HTTP requests can't use the
inbound IP address to call individual functions; they must use the default domain name
(functionappname.azurewebsites.net) or a custom domain name.
To find the outbound IP addresses currently available to a function app, run the following command in the Cloud Shell:
az webapp show --resource-group <group_name> --name <app_name> --query outboundIpAddresses --output tsv
To find all possible outbound IP addresses, run:
az webapp show --resource-group <group_name> --name <app_name> --query possibleOutboundIpAddresses --output tsv
NOTE
When a function app that runs on the Consumption plan is scaled, a new range of outbound IP addresses may be assigned.
When running on the Consumption plan, you may need to whitelist the entire data center.
Data center outbound IP addresses
If you need to whitelist the outbound IP addresses used by your function apps, another option is to whitelist the
function apps' data center (Azure region). You can download a JSON file that lists IP addresses for all Azure data
centers. Then find the JSON fragment that applies to the region that your function app runs in.
For example, this is what the Western Europe JSON fragment might look like:
{
"name": "AzureCloud.westeurope",
"id": "AzureCloud.westeurope",
"properties": {
"changeNumber": 9,
"region": "westeurope",
"platform": "Azure",
"systemService": "",
"addressPrefixes": [
"13.69.0.0/17",
"13.73.128.0/18",
... Some IP addresses not shown here
"213.199.180.192/27",
"213.199.183.0/24"
]
}
}
For information about when this file is updated and when the IP addresses change, expand the Details section of
the Download Center page.
IP address restrictions
You can configure a list of IP addresses that you want to allow or deny access to a function app. For more
information, see Azure App Service Static IP Restrictions.
Dedicated IP addresses
If you need static, dedicated IP addresses, we recommend App Service Environments (the Isolated tier of App
Service plans). For more information, see App Service Environment IP addresses and How to control inbound
traffic to an App Service Environment.
To find out if your function app runs in an App Service Environment:
1. Sign in to the Azure portal.
2. Navigate to the function app.
3. Select the Overview tab.
4. The App Service plan tier appears under App Service plan/pricing tier. The App Service Environment pricing
tier is Isolated.
As an alternative, you can use the Cloud Shell:
az webapp show --resource-group <group_name> --name <app_name> --query sku --output tsv
Next steps
A common cause of IP changes is function app scale changes. Learn more about function app scaling.
Azure Functions on Kubernetes with KEDA
5/6/2019 • 2 minutes to read • Edit Online
The Azure Functions runtime provides flexibility in hosting where and how you want. KEDA (Kubernetes-based
Event Driven Autoscaling) pairs seamlessly with the Azure Functions runtime and tooling to provide event driven
scale in Kubernetes.
To build an image and deploy your functions to Kubernetes, run the following command:
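func kubernetes deploy --name <name-of-function-deployment> --registry <container-registry-username>
This Core Tools command builds your container image and applies the deployment to your cluster; replace the placeholders with your own values.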
Next Steps
For more information, see the following resources:
Create a function using a custom image
Code and test Azure Functions locally
How the Azure Function consumption plan works
Azure Functions developers guide
4/3/2019 • 5 minutes to read • Edit Online
In Azure Functions, specific functions share a few core technical concepts and components, regardless of
the language or binding you use. Before you jump into learning details specific to a given language or
binding, be sure to read through this overview that applies to all of them.
This article assumes that you've already read the Azure Functions overview.
Function code
A function is the primary concept in Azure Functions. A function contains two important pieces - your code,
which can be written in a variety of languages, and some config, the function.json file. For compiled
languages, this config file is generated automatically from annotations in your code. For scripting
languages, you must provide the config file yourself.
The function.json file defines the function's trigger, bindings, and other configuration settings. Every
function has one and only one trigger. The runtime uses this config file to determine the events to monitor
and how to pass data into and return data from a function execution. The following is an example
function.json file.
{
"disabled":false,
"bindings":[
// ... bindings here
{
"type": "bindingType",
"direction": "in",
"name": "myParamName",
// ... more depending on binding
}
]
}
The bindings property is where you configure both triggers and bindings. Each binding shares a few
common settings and some settings which are specific to a particular type of binding. Every binding
requires the following settings:
type: The binding type, for example queueTrigger.
direction: Indicates whether the binding is for receiving data into the function (in) or sending data from the function (out).
name: The name used for the bound data in the function, for example myQueueItem.
NOTE
All functions in a function app must be authored in the same language. In previous versions of the Azure Functions
runtime, this wasn't required.
Folder structure
The code for all the functions in a specific function app is located in a root project folder that contains a host
configuration file and one or more subfolders. Each subfolder contains the code for a separate function, as
in the following representation:
FunctionApp
| - host.json
| - MyFirstFunction
| | - function.json
| | - ...
| - MySecondFunction
| | - function.json
| | - ...
| - SharedCode
| - bin
In version 2.x of the Functions runtime, all functions in the function app must share the same language
worker.
The host.json file, which contains some runtime-specific configurations, is in the root folder of the function
app. A bin folder contains packages and other library files required by the function app. See the language-
specific requirements for a function app project:
C# class library (.csproj)
C# script (.csx)
F# script
Java
JavaScript
The above is the default (and recommended) folder structure for a function app. If you wish to change the
file location of a function's code, modify the scriptFile section of the function.json file. We also
recommend using package deployment to deploy your project to your function app in Azure. You can also
use existing tools like continuous integration and deployment with Azure DevOps.
NOTE
If deploying a package manually, make sure to deploy your host.json file and function folders directly to the
wwwroot folder. Do not include the wwwroot folder in your deployments. Otherwise, you end up with
wwwroot\wwwroot folders.
Parallel execution
When multiple triggering events occur faster than a single-threaded function runtime can process them, the
runtime may invoke the function multiple times in parallel. If a function app is using the Consumption
hosting plan, the function app could scale out automatically. Each instance of the function app, whether the
app runs on the Consumption hosting plan or a regular App Service hosting plan, might process concurrent
function invocations in parallel using multiple threads. The maximum number of concurrent function
invocations in each function app instance varies based on the type of trigger being used as well as the
resources used by other functions within the function app.
Repositories
The code for Azure Functions is open source and stored in GitHub repositories:
Azure Functions
Azure Functions host
Azure Functions portal
Azure Functions templates
Azure WebJobs SDK
Azure WebJobs SDK Extensions
Bindings
The following table shows the bindings that are supported in the two major versions of the Azure Functions
runtime.
TYPE                             1.X   2.X   TRIGGER   INPUT   OUTPUT
Blob Storage                     ✔     ✔     ✔         ✔       ✔
Cosmos DB                        ✔     ✔     ✔         ✔       ✔
Event Grid                       ✔     ✔     ✔
Event Hubs                       ✔     ✔     ✔                 ✔
HTTP & Webhooks                  ✔     ✔     ✔                 ✔
Microsoft Graph Excel tables           ✔               ✔       ✔
Microsoft Graph OneDrive files         ✔               ✔       ✔
Microsoft Graph Outlook email          ✔                       ✔
Microsoft Graph Events                 ✔     ✔         ✔       ✔
Microsoft Graph Auth tokens            ✔               ✔
Mobile Apps                      ✔                     ✔       ✔
Notification Hubs                ✔                             ✔
Queue storage                    ✔     ✔     ✔                 ✔
SendGrid                         ✔     ✔                       ✔
Service Bus                      ✔     ✔     ✔                 ✔
SignalR                                ✔               ✔       ✔
Table storage                    ✔     ✔               ✔       ✔
Timer                            ✔     ✔     ✔
Twilio                           ✔     ✔                       ✔
1 In 2.x, all bindings except HTTP and Timer must be registered. See Register binding extensions.
Having issues with errors coming from the bindings? Review the Azure Functions Binding Error Codes
documentation.
Reporting Issues
To report an issue with the runtime, templates, or portal, file an issue in the corresponding GitHub repository listed above.
Next steps
For more information, see the following resources:
Azure Functions triggers and bindings
Code and test Azure Functions locally
Best Practices for Azure Functions
Azure Functions C# developer reference
Azure Functions NodeJS developer reference
Strategies for testing your code in Azure Functions
5/9/2019 • 7 minutes to read • Edit Online
This article demonstrates how to create automated tests for Azure Functions.
Testing all code is recommended; however, you may get the best results by wrapping a function's logic in a separate class or module and
creating tests against that. Abstracting logic away limits a function's lines of code and allows the
function to be solely responsible for calling other classes or modules. This article, however, demonstrates how to
create automated tests against an HTTP-triggered and a timer-triggered function.
The content that follows is split into two different sections meant to target different languages and environments.
You can learn to build tests in:
C# in Visual Studio with xUnit
JavaScript in VS Code with Jest
The sample repository is available on GitHub.
C# in Visual Studio
The following example describes how to create a C# Function app in Visual Studio and run tests with xUnit.
Setup
To set up your environment, create a Function and test app. The following steps help you create the apps and
functions required to support the tests:
1. Create a new Functions app and name it Functions
2. Create an HTTP function from the template and name it HttpTrigger.
3. Create a timer function from the template and name it TimerTrigger.
4. Create an xUnit Test app in Visual Studio by clicking File > New > Project > Visual C# > .NET Core >
xUnit Test Project and name it Functions.Test.
5. Use NuGet to add a reference to Microsoft.AspNetCore.Mvc to the test app.
6. Reference the Functions app from the Functions.Test app.
Create test classes
Now that the applications are created, you can create the classes used to run the automated tests.
Each function takes an instance of ILogger to handle message logging. Some tests either don't log messages or
have no concern for how logging is implemented. Other tests need to evaluate messages logged to determine
whether a test is passing.
The ListLogger class is meant to implement the ILogger interface and hold an internal list of messages for
evaluation during a test.
Right-click on the Functions.Test application and select Add > Class, name it NullScope.cs and enter the
following code:
using System;

namespace Functions.Tests
{
    public class NullScope : IDisposable
    {
        public static NullScope Instance { get; } = new NullScope();

        private NullScope() { }

        public void Dispose() { }
    }
}
Next, right-click on the Functions.Test application and select Add > Class, name it ListLogger.cs and enter the
following code:
using Microsoft.Extensions.Logging;
using System;
using System.Collections.Generic;
using System.Text;

namespace Functions.Tests
{
    public class ListLogger : ILogger
    {
        public IList<string> Logs;

        public ListLogger()
        {
            this.Logs = new List<string>();
        }

        public IDisposable BeginScope<TState>(TState state) => NullScope.Instance;

        public bool IsEnabled(LogLevel logLevel) => false;

        public void Log<TState>(LogLevel logLevel, EventId eventId, TState state,
            Exception exception, Func<TState, Exception, string> formatter)
        {
            string message = formatter(state, exception);
            this.Logs.Add(message);
        }
    }
}
The ListLogger class implements the following members as contracted by the ILogger interface:
BeginScope: Scopes add context to your logging. In this case, the test just points to the static instance on
the NullScope class to allow the test to function.
IsEnabled: A default value of false is provided.
Log: This method uses the provided formatter function to format the message and then adds the
resulting text to the Logs collection.
The Logs collection is an instance of List<string> and is initialized in the constructor.
Next, right-click on the Functions.Test application and select Add > Class, name it LoggerTypes.cs and enter the
following code:
namespace Functions.Tests
{
public enum LoggerTypes
{
Null,
List
}
}
Next, right-click on the Functions.Test application and select Add > Class, name it TestFactory.cs and enter the
following code:
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Http.Internal;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Primitives;
using System.Collections.Generic;

namespace Functions.Tests
{
    public class TestFactory
    {
        public static IEnumerable<object[]> Data()
        {
            return new List<object[]>
            {
                new object[] { "name", "Bill" },
                new object[] { "name", "Paul" },
                new object[] { "name", "Steve" }
            };
        }

        private static Dictionary<string, StringValues> CreateDictionary(string key, string value)
        {
            var qs = new Dictionary<string, StringValues> { { key, value } };
            return qs;
        }

        public static HttpRequest CreateHttpRequest(string queryStringKey, string queryStringValue)
        {
            var context = new DefaultHttpContext();
            var request = context.Request;
            request.Query = new QueryCollection(CreateDictionary(queryStringKey, queryStringValue));
            return request;
        }

        public static ILogger CreateLogger(LoggerTypes type = LoggerTypes.Null)
        {
            ILogger logger;
            if (type == LoggerTypes.List)
            {
                logger = new ListLogger();
            }
            else
            {
                logger = NullLoggerFactory.Instance.CreateLogger("Null Logger");
            }
            return logger;
        }
    }
}
Next, right-click on the Functions.Test application and select Add > Class, name it FunctionsTests.cs and enter
the following code:
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Xunit;
namespace Functions.Tests
{
public class FunctionsTests
{
private readonly ILogger logger = TestFactory.CreateLogger();
[Fact]
public async void Http_trigger_should_return_known_string()
{
var request = TestFactory.CreateHttpRequest("name", "Bill");
var response = (OkObjectResult)await HttpFunction.Run(request, logger);
Assert.Equal("Hello, Bill", response.Value);
}
[Theory]
[MemberData(nameof(TestFactory.Data), MemberType = typeof(TestFactory))]
public async void Http_trigger_should_return_known_string_from_member_data(string queryStringKey,
string queryStringValue)
{
var request = TestFactory.CreateHttpRequest(queryStringKey, queryStringValue);
var response = (OkObjectResult)await HttpFunction.Run(request, logger);
Assert.Equal($"Hello, {queryStringValue}", response.Value);
}
[Fact]
public void Timer_should_log_message()
{
var logger = (ListLogger)TestFactory.CreateLogger(LoggerTypes.List);
TimerTrigger.Run(null, logger);
var msg = logger.Logs[0];
Assert.Contains("C# Timer trigger function executed at", msg);
}
}
}
JavaScript in VS Code
The following example describes how to create a JavaScript Function app in VS Code and run tests with Jest.
This procedure uses the VS Code Functions extension to create Azure Functions.
Setup
To set up your environment, initialize a new Node.js app in an empty folder and install Jest:
npm init -y
npm i jest
Now update package.json to replace the existing test command with the following command:
"scripts": {
"test": "jest"
}
Next, create a file to mock the default execution context, name it defaultContext.js, and add the following code:
module.exports = {
    log: jest.fn()
};
This module mocks the log function to represent the default execution context.
Next, add a new file, name it defaultTimer.js, and add the following code:
module.exports = {
IsPastDue: false
};
This module implements the IsPastDue property to stand in as a fake timer instance.
Next, use the VS Code Functions extension to create a new JavaScript HTTP Function and name it HttpTrigger.
Once the function is created, add a new file in the same folder named index.test.js, and add the following code:
const httpFunction = require('./index');
const context = require('../defaultContext'); // path assumes defaultContext.js one level up; adjust as needed

test('Http trigger should return known text', async () => {
    const request = {
        query: { name: 'Bill' }
    };
    await httpFunction(context, request);
    expect(context.log.mock.calls.length).toBe(1);
    expect(context.res.body).toEqual('Hello Bill');
});
The HTTP function from the template returns a string of "Hello" concatenated with the name provided in the
query string. This test creates a fake instance of a request and passes it to the HTTP function. The test checks that
the log method is called once and the returned text equals "Hello Bill".
Next, use the VS Code Functions extension to create a new JavaScript Timer Function and name it TimerTrigger.
Once the function is created, add a new file in the same folder named index.test.js, and add the following code:
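// index.test.js (timer): a minimal sketch; require paths assume the mock files created above
const timerFunction = require('./index');
const context = require('../defaultContext');
const timer = require('../defaultTimer');

test('Timer trigger should log message', () => {
    timerFunction(context, timer);
    expect(context.log.mock.calls.length).toBe(1);
});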
The timer function from the template logs a message at the end of the body of the function. This test ensures the
log function is called once.
Run tests
To run the tests, press CTRL + ~ to open the command window, and run npm test :
npm test
Debug tests
To debug your tests, add the following configuration to your launch.json file:
{
"type": "node",
"request": "launch",
"name": "Jest Tests",
"program": "${workspaceRoot}\\node_modules\\jest\\bin\\jest.js",
"args": [
"-i"
],
"internalConsoleOptions": "openOnSessionStart"
}
Next steps
Now that you've learned how to write automated tests for your functions, continue with these resources:
Manually run a non HTTP-triggered function
Azure Functions error handling
Azure Function Event Grid Trigger Local Debugging
Code and test Azure Functions locally
5/17/2019 • 2 minutes to read • Edit Online
While you're able to develop and test Azure Functions in the Azure portal, many developers prefer a local
development experience. Functions makes it easy to use your favorite code editor and development tools to
create and test functions on your local computer. Your local functions can connect to live Azure services, and
you can debug them on your local computer using the full Functions runtime.
ENVIRONMENT: Command prompt or terminal
LANGUAGES: C# (class library), C# script (.csx), JavaScript
DESCRIPTION: Azure Functions Core Tools provides the core runtime and templates for creating functions, which enable local development. Version 2.x supports development on Linux, MacOS, and Windows. All environments rely on Core Tools for the local Functions runtime.

ENVIRONMENT: Visual Studio Code
LANGUAGES: C# (class library), C# script (.csx), JavaScript
DESCRIPTION: The Azure Functions extension for VS Code adds Functions support to VS Code. Requires the Core Tools. Supports development on Linux, MacOS, and Windows, when using version 2.x of the Core Tools. To learn more, see Create your first function using Visual Studio Code.

ENVIRONMENT: Visual Studio 2019
LANGUAGES: C# (class library)
DESCRIPTION: The Azure Functions tools are included in the Azure development workload of Visual Studio 2019 and later versions. Lets you compile functions in a class library and publish the .dll to Azure. Includes the Core Tools for local testing. To learn more, see Develop Azure Functions using Visual Studio.
Each of these local development environments lets you create function app projects and use predefined
Functions templates to create new functions. Each uses the Core Tools so that you can test and debug your
functions against the real Functions runtime on your own machine just as you would any other app. You can
also publish your function app project from any of these environments to Azure.
Next steps
To learn more about local development of compiled C# functions using Visual Studio 2019, see Develop
Azure Functions using Visual Studio.
To learn more about local development of functions using VS Code on a Mac, Linux, or Windows computer,
see the VS Code documentation for Azure Functions.
To learn more about developing functions from the command prompt or terminal, see Work with Azure
Functions Core Tools.
Develop Azure Functions using Visual Studio
5/17/2019 • 13 minutes to read • Edit Online
Azure Functions Tools for Visual Studio 2019 is an extension for Visual Studio that lets you develop, test, and
deploy C# functions to Azure. If this experience is your first with Azure Functions, you can learn more at An
introduction to Azure Functions.
The Azure Functions Tools provides the following benefits:
Edit, build, and run functions on your local development computer.
Publish your Azure Functions project directly to Azure.
Use WebJobs attributes to declare function bindings directly in the C# code instead of maintaining a separate
function.json for binding definitions.
Develop and deploy pre-compiled C# functions. Pre-compiled functions provide better cold-start
performance than C# script-based functions.
Code your functions in C# while having all of the benefits of Visual Studio development.
This article provides details about how to use the Azure Functions Tools for Visual Studio 2019 to develop C#
functions and publish them to Azure. Before you read this article, you should complete the Functions quickstart
for Visual Studio.
IMPORTANT
Don't mix local development with portal development in the same function app. When you publish from a local project to a
function app, the deployment process overwrites any functions that you developed in the portal.
Prerequisites
Azure Functions Tools is included in the Azure development workload of Visual Studio 2017, or a later version.
Make sure you include the Azure development workload in your Visual Studio 2019 installation:
Make sure that your Visual Studio is up-to-date and that you are using the most recent version of the Azure
Functions tools.
Other requirements
To create and deploy functions, you also need:
An active Azure subscription. If you don't have an Azure subscription, free accounts are available.
An Azure Storage account. To create a storage account, see Create a storage account.
Check your tools version
1. From the Tools menu, choose Extensions and Updates. Expand Installed > Tools and choose Azure
Functions and Web Jobs Tools.
2. Note the installed Version. You can compare this version with the latest version listed in the release notes.
3. If your version is older, update your tools in Visual Studio as shown in the following section.
Update your tools
1. In the Extensions and Updates dialog, expand Updates > Visual Studio Marketplace, choose Azure
Functions and Web Jobs Tools and select Update.
2. After the tools update is downloaded, close Visual Studio to trigger the tools update using the VSIX
installer.
3. In the installer, choose OK to start and then Modify to update the tools.
4. After the update is complete, choose Close and restart Visual Studio.
3. Complete the settings for your new function, noting the following:
NOTE
Make sure you set the Access rights to Anonymous . If you choose the default level of Function , you're required
to present the function key in requests to access your function endpoint.
IMPORTANT
Because the local.settings.json file can contain secrets, you must exclude it from your project source control. The
Copy to Output Directory setting for this file should always be Copy if newer.
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
namespace FunctionApp1
{
public static class Function1
{
[FunctionName("QueueTriggerCSharp")]
public static void Run([QueueTrigger("myqueue-items", Connection = "QueueStorage")]string
myQueueItem, ILogger log)
{
log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
}
}
}
A binding-specific attribute is applied to each binding parameter supplied to the entry point method. The
attribute takes the binding information as parameters. In the previous example, the first parameter has a
QueueTrigger attribute applied, indicating a queue-triggered function. The queue name and connection
string setting name are passed as parameters to the QueueTrigger attribute. For more information, see
Azure Queue storage bindings for Azure Functions.
You can use the above procedure to add more functions to your function app project. Each function in the project
can have a different trigger, but a function must have exactly one trigger. For more information, see Azure
Functions triggers and bindings concepts.
Add bindings
As with triggers, input and output bindings are added to your function as binding attributes. Add bindings to a
function as follows:
1. Make sure you have configured the project for local development.
2. Add the appropriate NuGet extension package for the specific binding. For more information, see Local C#
development using Visual Studio in the Triggers and Bindings article. The binding-specific NuGet package
requirements are found in the reference article for the binding. For example, find package requirements for
the Event Hubs trigger in the Event Hubs binding reference article.
3. If there are app settings that the binding needs, add them to the Values collection in the local setting file.
These values are used when the function runs locally. When the function runs in the function app in Azure,
the function app settings are used.
4. Add the appropriate binding attribute to the method signature. In the following example, a queue message
triggers the function, and the output binding creates a new queue message with the same text in a different
queue.
The connection to Queue storage is obtained from the AzureWebJobsStorage setting. For more information,
see the reference article for the specific binding.
The following table shows the bindings that are supported in the two major versions of the Azure Functions
runtime.
TYPE                             1.X   2.X   TRIGGER   INPUT   OUTPUT
Blob Storage                     ✔     ✔     ✔         ✔       ✔
Cosmos DB                        ✔     ✔     ✔         ✔       ✔
Event Grid                       ✔     ✔     ✔
Event Hubs                       ✔     ✔     ✔                 ✔
HTTP & Webhooks                  ✔     ✔     ✔                 ✔
Microsoft Graph Excel tables           ✔               ✔       ✔
Microsoft Graph OneDrive files         ✔               ✔       ✔
Microsoft Graph Outlook email          ✔                       ✔
Microsoft Graph Events                 ✔     ✔         ✔       ✔
Microsoft Graph Auth tokens            ✔               ✔
Mobile Apps                      ✔                     ✔       ✔
Notification Hubs                ✔                             ✔
Queue storage                    ✔     ✔     ✔                 ✔
SendGrid                         ✔     ✔                       ✔
Service Bus                      ✔     ✔     ✔                 ✔
SignalR                                ✔               ✔       ✔
Table storage                    ✔     ✔               ✔       ✔
Timer                            ✔     ✔     ✔
Twilio                           ✔     ✔                       ✔
1 In 2.x, all bindings except HTTP and Timer must be registered. See Register binding extensions.
Testing functions
Azure Functions Core Tools lets you run an Azure Functions project on your local development computer. You are
prompted to install these tools the first time you start a function from Visual Studio.
To test your function, press F5. If prompted, accept the request from Visual Studio to download and install Azure
Functions Core (CLI) tools. You may also need to enable a firewall exception so that the tools can handle HTTP
requests.
With the project running, you can test your code as you would test deployed function. For more information, see
Strategies for testing your code in Azure Functions. When running in debug mode, breakpoints are hit in Visual
Studio as expected.
To learn more about using the Azure Functions Core Tools, see Code and test Azure functions locally.
Publish to Azure
1. In Solution Explorer, right-click the project and select Publish.
2. Select Azure Function App, choose Create New, and then select Publish.
When you enable Run from Zip, your function app in Azure goes into read-only mode and is run directly
from the deployment package. For more information, see Run your Azure Functions from a package file.
Caution
When you choose Select Existing, all files in the existing function app in Azure are overwritten by files
from the local project. Only use this option when republishing updates to an existing function app.
3. If you haven't already connected Visual Studio to your Azure account, select Add an account....
4. In the Create App Service dialog, use the Hosting settings as specified in the following table:
App Name: A globally unique name that identifies your new function app.
Storage Account: A general-purpose storage account. An Azure storage account is required by the Functions runtime. Click New to create a general-purpose storage account, or use an existing account that meets the storage account requirements.
5. Click Create to create a function app and related resources in Azure with these settings and deploy your
function project code.
6. After the deployment is complete, make a note of the Site URL value, which is the address of your
function app in Azure.
This displays the Application Settings dialog for the function app, where you can add new application settings
or modify existing ones.
Local represents a setting value in the local.settings.json file, and Remote is the current setting in the function
app in Azure. Choose Add setting to create a new app setting. Use the Insert value from Local link to copy a
setting value to the Remote field. Pending changes are written to the local settings file and the function app when
you select OK.
You can also manage application settings in one of these other ways:
Using the Azure portal.
Using the --publish-local-settings publish option in the Azure Functions Core Tools.
Using the Azure CLI.
Monitoring functions
The recommended way to monitor the execution of your functions is by integrating your function app with Azure
Application Insights. When you create a function app in the Azure portal, this integration is done for you by
default. However, when you create your function app during Visual Studio publishing, the integration isn't set up
in your function app in Azure.
To enable Application Insights for your function app:
Functions makes it simple to add Application Insights integration to a function app from the Azure portal.
1. In the portal, select All services > Function Apps, select your function app, and then choose the
Application Insights banner at the top of the window
2. Create an Application Insights resource by using the settings specified in the following table:
Name: A unique app name. It's easiest to use the same name as your function app, which must be unique in your subscription.
3. Choose OK. The Application Insights resource is created in the same resource group and subscription as
your function app. After creation completes, close the Application Insights window.
4. Back in your function app, select Application settings, and scroll down to Application settings. When
you see a setting named APPINSIGHTS_INSTRUMENTATIONKEY , it means that Application Insights integration is
enabled for your function app running in Azure.
To learn more, see Monitor Azure Functions.
Next steps
To learn more about the Azure Functions Core Tools, see Code and test Azure functions locally.
To learn more about developing functions as .NET class libraries, see Azure Functions C# developer reference.
This article also links to examples of how to use attributes to declare the various types of bindings supported by
Azure Functions.
Work with Azure Functions Core Tools
5/22/2019 • 20 minutes to read • Edit Online
Azure Functions Core Tools lets you develop and test your functions on your local computer from the
command prompt or terminal. Your local functions can connect to live Azure services, and you can debug your
functions on your local computer using the full Functions runtime. You can even deploy a function app to your
Azure subscription.
IMPORTANT
Do not mix local development with portal development in the same function app. When you create and publish
functions from a local project, you should not try to maintain or modify project code in the portal.
Developing functions on your local computer and publishing them to Azure using Core Tools follows these
basic steps:
Install the Core Tools and dependencies.
Create a function app project from a language-specific template.
Register trigger and binding extensions.
Define Storage and other connections.
Create a function from a trigger and language-specific template.
Run the function locally.
Publish the project to Azure.
Windows
The following steps use npm to install Core Tools on Windows. You can also use Chocolatey. For more
information, see the Core Tools readme.
1. Install Node.js, which includes npm. For version 2.x of the tools, only Node.js 8.5 and later versions are
supported.
2. Install the Core Tools package:
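npm install -g azure-functions-core-tools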
3. If you do not plan to use extension bundles, install the .NET Core 2.x SDK for Windows.
MacOS with Homebrew
The following steps use Homebrew to install the Core Tools on macOS.
1. Install Homebrew, if it's not already installed.
2. Install the Core Tools package:
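brew tap azure/functions
brew install azure-functions-core-tools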
3. If you do not plan to use extension bundles, install .NET Core 2.x SDK for macOS.
Linux (Ubuntu/Debian) with APT
The following steps use APT to install Core Tools on your Ubuntu/Debian Linux distribution. For other Linux
distributions, see the Core Tools readme.
1. Register the Microsoft product key as trusted:
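curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg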
2. Verify your Ubuntu server is running one of the appropriate versions from the table below. To add the
apt source, run:
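# The repository path below assumes an Ubuntu release reported by lsb_release;
# check the Core Tools readme for the entry that matches your distribution.
sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-$(lsb_release -cs)-prod $(lsb_release -cs) main" > /etc/apt/sources.list.d/dotnetdev.list'
sudo apt-get update
3. Install the Core Tools package:
sudo apt-get install azure-functions-core-tools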
4. If you do not plan to use extension bundles, install .NET Core 2.x SDK for Linux.
When you provide a project name, a new folder with that name is created and initialized. Otherwise, the
current folder is initialized.
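For example, the following command creates and initializes a new folder named MyFunctionProj (a hypothetical project name):
func init MyFunctionProj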
In version 2.x, when you run the command you must choose a runtime for your project. If you plan to develop
JavaScript functions, choose node:
Use the up/down arrow keys to choose a language, then press Enter. The output looks like the following
example for a JavaScript project:
func init supports the following options, which are version 2.x-only, unless otherwise noted:
OPTION DESCRIPTION
--force Initialize the project even when there are existing files in the
project. This setting overwrites existing files with the same
name. Other files in the project folder aren't affected.
--worker-runtime Sets the language runtime for the project. Supported values
are dotnet , node (JavaScript), java , and python .
When not set, you are prompted to choose your runtime
during initialization.
IMPORTANT
By default, version 2.x of the Core Tools creates function app projects for the .NET runtime as C# class projects (.csproj).
These C# projects, which can be used with Visual Studio or Visual Studio Code, are compiled during testing and when
publishing to Azure. If you instead want to create and work with the same C# script (.csx) files created in version 1.x and
in the portal, you must include the --csx parameter when you create and deploy functions.
Register extensions
In version 2.x of the Azure Functions runtime, you have to explicitly register the binding extensions (binding
types) that you use in your function app.
Extension bundles make all bindings published by the Azure Functions team available through a setting in the
host.json file. For local development, ensure you have the latest version of Azure Functions Core Tools.
To use extension bundles, update the host.json file to include the following entry for extensionBundle :
{
"version": "2.0",
"extensionBundle": {
"id": "Microsoft.Azure.Functions.ExtensionBundle",
"version": "[1.*, 2.0.0)"
}
}
The id property references the namespace for Microsoft Azure Functions extension bundles.
The version references the version of the bundle.
Bundle versions increment as packages in the bundle change. Major version changes happen only when
packages in the bundle move a major version. The version property uses the interval notation for specifying
version ranges. The Functions runtime always picks the maximum permissible version defined by the version
range or interval.
Once you reference the extension bundle in your project, all of its default bindings are available to your
functions. The bindings available in the extension bundle are:
PACKAGE | VERSION
Microsoft.Azure.WebJobs.Extensions.CosmosDB | 3.0.3
Microsoft.Azure.WebJobs.Extensions.DurableTask | 1.8.0
Microsoft.Azure.WebJobs.Extensions.EventGrid | 2.0.0
Microsoft.Azure.WebJobs.Extensions.EventHubs | 3.0.3
Microsoft.Azure.WebJobs.Extensions.SendGrid | 3.0.0
Microsoft.Azure.WebJobs.Extensions.ServiceBus | 3.0.3
Microsoft.Azure.WebJobs.Extensions.SignalRService | 1.0.0
Microsoft.Azure.WebJobs.Extensions.Storage | 3.0.4
Microsoft.Azure.WebJobs.Extensions.Twilio | 3.0.0
For more information, see Azure Functions triggers and bindings concepts.
Local settings file
The local.settings.json file stores app settings, connection strings, and settings used by local development tools, as in the following example:
{
"IsEncrypted": false,
"Values": {
"FUNCTIONS_WORKER_RUNTIME": "<language worker>",
"AzureWebJobsStorage": "<connection-string>",
"AzureWebJobsDashboard": "<connection-string>",
"MyBindingConnection": "<binding-connection-string>"
},
"Host": {
"LocalHttpPort": 7071,
"CORS": "*"
"CORSCredentials": true
},
"ConnectionStrings": {
"SQLConnectionString": "<sqlclient-connection-string>"
}
}
SETTING | DESCRIPTION
IsEncrypted | When set to true, all values are encrypted using a local machine key. Used with func settings commands. Default value is false.
LocalHttpPort | Sets the default port used when running the local Functions host (func host start and func run). The --port command-line option takes precedence over this value.
ConnectionStrings | Do not use this collection for the connection strings used by your function bindings. This collection is only used by frameworks that typically get connection strings from the ConnectionStrings section of a configuration file, such as Entity Framework. Connection strings in this object are added to the environment with the provider type of System.Data.SqlClient. Items in this collection are not published to Azure with other app settings. You must explicitly add these values to the Connection strings collection of your function app settings. If you are creating a SqlConnection in your function code, you should store the connection string value in Application Settings in the portal with your other connections.
The function app settings values can also be read in your code as environment variables. For more
information, see the Environment variables section of these language-specific reference topics:
C# precompiled
C# script (.csx)
F# script (.fsx)
Java
JavaScript
When no valid storage connection string is set for AzureWebJobsStorage and the emulator isn't being used, the
following error message is shown:
Missing value for AzureWebJobsStorage in local.settings.json. This is required for all triggers other than
HTTP. You can run 'func azure functionapp fetch-app-settings <functionAppName>' or specify a
connection string in local.settings.json.
Use Azure Storage Explorer to connect to your Azure account. In the Explorer, expand your
subscription, select your storage account, and copy the primary or secondary connection string.
Use Core Tools to download the connection string from Azure with one of the following commands:
Download all settings from an existing function app:
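func azure functionapp fetch-app-settings <FunctionAppName>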
When you are not already signed in to Azure, you are prompted to do so.
Create a function
To create a function, run the following command:
func new
In version 2.x, when you run func new you are prompted to choose a template in the default language of your
function app, then you are also prompted to choose a name for your function. In version 1.x, you are also
prompted to choose the language.
Select a language: Select a template:
Blob trigger
Cosmos DB trigger
Event Grid trigger
HTTP trigger
Queue trigger
SendGrid
Service Bus Queue trigger
Service Bus Topic trigger
Timer trigger
Function code is generated in a subfolder with the provided function name, as you can see in the following
queue trigger output:
You can also specify these options in the command using the following arguments:
ARGUMENT | DESCRIPTION
--language -l | The template programming language, such as C#, F#, or JavaScript.
--name -n | The function name.
--template -t | The template name.
Run functions locally
To run your functions locally, start the Functions host from the root folder of your project by using the func host start command. The command supports the following options:
OPTION | DESCRIPTION
--cert | The path to a .pfx file that contains a private key. Only used with --useHttps. Version 2.x only.
--nodeDebugPort -n | The port for the node debugger to use. Default: A value from launch.json or 5858. Version 1.x only.
--password | Either the password or a file that contains the password for a .pfx file. Only used with --cert. Version 2.x only.
--pause-on-error | Pause for additional input before exiting the process. Used only when launching Core Tools from an integrated development environment (IDE).
--script-root --prefix | Used to specify the path to the root of the function app that is to be run or deployed. This is used for compiled projects that generate project files into a subfolder. For example, when you build a C# class library project, the host.json, local.settings.json, and function.json files are generated in a root subfolder with a path like MyProject/bin/Debug/netstandard2.0. In this case, set the prefix as --script-root MyProject/bin/Debug/netstandard2.0. This is the root of the function app when running in Azure.
For a C# class library project (.csproj), you must include the --build option to generate the library .dll.
When the Functions host starts, it outputs the URL of HTTP-triggered functions:
Found the following functions:
Host.Functions.MyHttpTrigger
IMPORTANT
When running locally, authentication isn't enforced for HTTP endpoints. This means that all local HTTP requests are
handled as authLevel = "anonymous" . For more information, see the HTTP binding article.
NOTE
Examples in this topic use the cURL tool to send HTTP requests from the terminal or a command prompt. You can use a
tool of your choice to send HTTP requests to the local server. The cURL tool is available by default on Linux-based
systems. On Windows, you must first download and install the cURL tool.
For more general information on testing functions, see Strategies for testing your code in Azure Functions.
HTTP and webhook triggered functions
You call the following endpoint to locally run HTTP and webhook triggered functions:
http://localhost:{port}/api/{function_name}
Make sure to use the same server name and port that the Functions host is listening on. You see this in the
output generated when starting the Function host. You can call this URL using any HTTP method supported
by the trigger.
The following cURL command triggers the MyHttpTrigger quickstart function from a GET request with the
name parameter passed in the query string.
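curl --get http://localhost:7071/api/MyHttpTrigger?name=Azure%20Rocks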
The following example is the same function called from a POST request passing name in the request body:
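curl --request POST http://localhost:7071/api/MyHttpTrigger --data '{"name":"Azure Rocks"}'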
You can make GET requests from a browser passing data in the query string. For all other HTTP methods, you
must use cURL, Fiddler, Postman, or a similar HTTP testing tool.
Non-HTTP triggered functions
For all kinds of functions other than HTTP triggers and webhooks, you can test your functions locally by
calling an administration endpoint. Calling this endpoint with an HTTP POST request on the local server
triggers the function. You can optionally pass test data to the execution in the body of the POST request. This
functionality is similar to the Test tab in the Azure portal.
You call the following administrator endpoint to trigger non-HTTP functions:
http://localhost:{port}/admin/functions/{function_name}
To pass test data to the administrator endpoint of a function, you must supply the data in the body of a POST
request message. The message body is required to have the following JSON format:
{
"input": "<trigger_input>"
}
The <trigger_input> value contains data in a format expected by the function. The following cURL example is
a POST to a QueueTriggerJS function. In this case, the input is a string that is equivalent to the message
expected to be found in the queue.
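curl --request POST -H "Content-Type: application/json" --data '{"input":"sample queue data"}' http://localhost:7071/admin/functions/QueueTriggerJS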
IMPORTANT
The func run command is not supported in version 2.x of the tools. For more information, see the topic How to
target Azure Functions runtime versions.
You can also invoke a function directly by using func run <FunctionName> and provide input data for the
function. This command is similar to running a function using the Test tab in the Azure portal.
func run supports the following options:
OPTION | DESCRIPTION
--timeout -t | Time to wait (in seconds) until the local Functions host is ready.
--no-interactive | Does not prompt for input. Useful for automation scenarios.
For example, to call an HTTP-triggered function and pass content body, run the following command:
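func run MyHttpTrigger -c '{"name": "Azure"}'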
Publish to Azure
Core Tools supports two types of deployment: deploying function project files directly to your function app
and deploying a custom Linux container, which is supported only in version 2.x. You must have already created
a function app in your Azure subscription.
In version 2.x, you must have registered your extensions in your project before publishing. Projects that
require compilation should be built so that the binaries can be deployed.
Project file deployment
The most common deployment method involves using Core Tools to package your function app project,
binaries, and dependencies and deploy the package to your function app. You can optionally run your
functions directly from the deployment package.
To publish a Functions project to a function app in Azure, use the publish command:
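func azure functionapp publish <FunctionAppName>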
This command publishes to an existing function app in Azure. An error occurs when the <FunctionAppName>
doesn't exist in your subscription. To learn how to create a function app from the command prompt or
terminal window using the Azure CLI, see Create a Function App for serverless execution.
The publish command uploads the contents of the Functions project directory. If you delete files locally, the
publish command does not delete them from Azure. You can delete files in Azure by using the Kudu tool in
the Azure portal.
IMPORTANT
When you create a function app in the Azure portal, it uses version 2.x of the Functions runtime by default. To make the
function app use version 1.x of the runtime, follow the instructions in Run on version 1.x. You can't change the runtime
version for a function app that has existing functions.
The following project publish options apply for both versions, 1.x and 2.x:
OPTION DESCRIPTION
The following project publish options are only supported in version 2.x:
OPTION DESCRIPTION
func deploy
OPTION | DESCRIPTION
--platform | Hosting platform for the function app. Valid options are: kubernetes
Monitoring functions
The recommended way to monitor the execution of your functions is by integrating with Azure Application
Insights. When you create a function app in the Azure portal, this integration is done for you by default.
However, when you create your function app by using the Azure CLI, the integration in your function app in
Azure isn't done.
To enable Application Insights for your function app:
Functions makes it simple to add Application Insights integration to a function app from the Azure portal.
1. In the portal, select All services > Function Apps, select your function app, and then choose the
Application Insights banner at the top of the window.
2. Create an Application Insights resource by using the settings specified in the table below the image:
SETTING | SUGGESTED VALUE | DESCRIPTION
Name | Unique app name | It's easiest to use the same name as your function app, which must be unique in your subscription.
3. Choose OK. The Application Insights resource is created in the same resource group and subscription
as your function app. After creation completes, close the Application Insights window.
4. Back in your function app, select Application settings, and scroll down to Application settings.
When you see a setting named APPINSIGHTS_INSTRUMENTATIONKEY , it means that Application Insights
integration is enabled for your function app running in Azure.
To learn more, see Monitor Azure Functions.
Next steps
Azure Functions Core Tools is open source and hosted on GitHub.
To file a bug or feature request, open a GitHub issue.
Create your first Azure function with Java and IntelliJ
4/2/2019 • 3 minutes to read • Edit Online
IMPORTANT
The JAVA_HOME environment variable must be set to the install location of the JDK to complete the steps in this article.
We recommend that you install Azure Functions Core Tools, version 2. It provides a local development
environment for writing, running, and debugging Azure Functions.
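You start the function host in debug mode from your project folder by using the Maven goal that the Azure Functions plugin provides:
mvn azure-functions:run -DenableDebug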
This command causes the function host to open a debug port at 5005.
2. On the Run menu, select Edit Configurations.
3. Select (+) to add a Remote.
4. Complete the Name and Settings fields, and then select OK to save the configuration.
5. After setup, select Debug < Remote Configuration Name > or press Shift+F9 on your keyboard to start
debugging.
6. When you're finished, stop the debugger and the running process. Only one function host can be active and
running locally at a time.
1. Sign in to Azure by using the Azure CLI:
az login
2. Deploy your code into a new function by using the azure-functions:deploy Maven target. You can also
select the azure-functions:deploy option in the Maven Projects window.
mvn azure-functions:deploy
3. Find the URL for your function in the Azure CLI output after the function has been successfully deployed.
Next steps
Review the Java Functions developer guide for more information on developing Java functions.
Add additional functions with different triggers to your project by using the azure-functions:add Maven
target.
Create your first function with Java and Eclipse
4/2/2019 • 3 minutes to read • Edit Online
This article shows you how to create a serverless function project with the Eclipse IDE and Apache Maven, test
and debug it, then deploy it to Azure Functions.
If you don't have an Azure subscription, create a free account before you begin.
IMPORTANT
The JAVA_HOME environment variable must be set to the install location of the JDK to complete this quickstart.
It's highly recommended to also install Azure Functions Core Tools, version 2, which provide a local environment
for running and debugging Azure Functions.
1. Right-click on the generated project, then choose Run As and Maven build.
2. In the Edit Configuration dialog, enter package in the Goals and Name fields, then select Run. This will
build and package the function code.
3. Once the build is complete, create another Run configuration as above, using azure-functions:run as the goal
and name. Select Run to run the function in the IDE.
Terminate the runtime in the console window when you're done testing your function. Only one function host can
be active and running locally at a time.
Debug the function in Eclipse
In your Run As configuration set up in the previous step, change azure-functions:run to
mvn azure-functions:run -DenableDebug and run the updated configuration to start the function app in debug
mode.
Select the Run menu and open Debug Configurations. Choose Remote Java Application and create a new
one. Give your configuration a name and fill in the settings. The port should be consistent with the debug port
opened by the function host, which by default is 5005. After setup, click Debug to start debugging.
Set breakpoints and inspect objects in your function using the IDE. When finished, stop the debugger and the
running function host. Only one function host can be active and running locally at a time.
Sign in to Azure by using the Azure CLI:
az login
Deploy your code into a new Function app using the azure-functions:deploy Maven goal in a new Run As
configuration.
When the deployment is complete, you see the URL that you can use to access your Azure function app:
Next steps
Review the Java Functions developer guide for more information on developing Java functions.
Add additional functions with different triggers to your project using the azure-functions:add Maven target.
Create a function app on Linux in an Azure App
Service plan
5/6/2019 • 5 minutes to read • Edit Online
Azure Functions lets you host your functions on Linux in a default Azure App Service container. This article walks
you through how to use the Azure portal to create a Linux-hosted function app that runs in an App Service plan.
You can also bring your own custom container.
If you don't have an Azure subscription, create a free account before you begin.
Sign in to Azure
Sign in to the Azure portal at https://portal.azure.com with your Azure account.
SETTING | SUGGESTED VALUE | DESCRIPTION
App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
Hosting plan | App Service plan | Hosting plan that defines how resources are allocated to your function app. When you run in an App Service plan, you can control the scaling of your function app.
App Service plan/Location | Create plan | Choose Create new and supply an App Service plan name. Choose a Location in a region near you or near other services your functions access. Choose your desired Pricing tier. You can't run both Linux and Windows function apps in the same App Service plan.
NOTE
The portal development experience can be useful for trying out Azure Functions. For most scenarios, consider developing
your functions locally and publishing the project to your function app using either Visual Studio Code or the Azure Functions
Core Tools.
1. In your new function app, choose the Overview tab, and after it loads completely choose + New function.
2. Paste the function URL into your browser's address bar. Add the query string value &name=<yourname> to the
end of this URL and press the Enter key on your keyboard to execute the request. You should see the
response returned by the function displayed in the browser.
The following example shows the response in the browser:
The request URL includes a key that is required, by default, to access your function over HTTP.
3. When your function runs, trace information is written to the logs. To see the trace output from the previous
execution, return to your function in the portal and click the arrow at the bottom of the screen to expand the
Logs.
Clean up resources
Other quick starts in this collection build upon this quick start. If you plan to work with subsequent quick starts,
tutorials, or with any of the services you have created in this quick start, do not clean up the resources.
Resources in Azure refer to function apps, functions, storage accounts, and so forth. They are grouped into
resource groups, and you can delete everything in a group by deleting the group.
You created resources to complete these quickstarts. You may be billed for these resources, depending on your
account status and service pricing. If you don't need the resources anymore, here's how to delete them:
1. In the Azure portal, go to the Resource group page.
To get to that page from the function app page, select the Overview tab and then select the link under
Resource group.
To get to that page from the dashboard, select Resource groups, and then select the resource group that
you used for this quickstart.
2. In the Resource group page, review the list of included resources, and verify that they are the ones you
want to delete.
3. Select Delete resource group, and follow the instructions.
Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can
also select the bell icon at the top of the page to view the notification.
Next steps
You have created a function app with a simple HTTP triggered function.
Now that you have created your first function, let's add an output binding to the function that writes a message to a
Storage queue.
Add messages to an Azure Storage queue using Functions
For more information, see Azure Functions HTTP bindings.
Debug PowerShell Azure Functions locally
5/6/2019 • 6 minutes to read • Edit Online
NOTE
PowerShell for Azure Functions is currently in preview. To receive important updates, subscribe to the Azure App Service
announcements repository on GitHub.
You can debug your PowerShell functions locally as you would any PowerShell scripts using the following
standard development tools:
Visual Studio Code: Microsoft's free, lightweight, and open-source text editor with the PowerShell extension
that offers a full PowerShell development experience.
A PowerShell console: Debug using the same commands you would use to debug any other PowerShell
process.
Azure Functions Core Tools supports local debugging of Azure Functions, including PowerShell functions.
PSFunctionApp
| - HttpTriggerFunction
| | - run.ps1
| | - function.json
| - local.settings.json
| - host.json
| - profile.ps1
This function app is similar to the one you get when you complete the PowerShell quickstart.
The function code in run.ps1 looks like the following script:

param($Request)

$name = $Request.Query.Name

if($name) {
    $status = 200
    $body = "Hello $name"
}
else {
    $status = 400
    $body = "Please pass a name on the query string or in the request body."
}

# Return the result through the Response output binding defined in function.json
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = $status
    Body = $body
})
To stop execution so that a debugger can attach, add a call to Wait-Debugger just above the if statement:

param($Request)

$name = $Request.Query.Name

# Pause here until a debugger attaches
Wait-Debugger

if($name) {
    $status = 200
    $body = "Hello $name"
}
# ...
NOTE
Should your project not have the needed configuration files, you are prompted to add them.
Invoke-RestMethod "http://localhost:7071/api/HttpTrigger?Name=Functions"
You'll notice that a response isn't immediately returned. That's because Wait-Debugger has attached the debugger
and PowerShell execution went into break mode as soon as it could. This is because of the BreakAll concept, which
is explained later. After you press the continue button, the debugger now breaks on the line right after
Wait-Debugger .
At this point, the debugger is attached and you can do all the normal debugger operations. For more information
on using the debugger in Visual Studio Code, see the official documentation.
After you continue and fully invoke your script, you'll notice that:
The PowerShell console that did the Invoke-RestMethod has returned a result
The PowerShell Integrated Console in Visual Studio Code is waiting for a script to be executed
Subsequent times when you invoke the same function, the debugger in PowerShell extension breaks right after the
Wait-Debugger .
Open up a console, cd into the directory of your function app, and run the following command:
func host start
With the function app running and the Wait-Debugger in place, you can attach to the process. You do need two
more PowerShell consoles.
One of the consoles acts as the client. From this, you call Invoke-RestMethod to trigger the function. For example,
you can run the following command:
Invoke-RestMethod "http://localhost:7071/api/HttpTrigger?Name=Functions"
You'll notice that it doesn't return a response, which is a result of the Wait-Debugger . The PowerShell runspace is
now waiting for a debugger to be attached. Let's get that attached.
In the other PowerShell console, run the following command:
Get-PSHostProcessInfo
This cmdlet returns a table that looks like the following output:
Make note of the ProcessId for the item in the table with the ProcessName as dotnet . This process is your
function app.
Next, run the following snippet:
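The snippet looks like the following sketch; replace <ProcessId> with the value you noted, and note that the runspace Id used here (1) is an assumption that should match the runspace shown in the debugger output below:

# Enter the Functions host process
Enter-PSHostProcess -Id <ProcessId>

# Attach the debugger to the runspace that is paused in Wait-Debugger
Debug-Runspace -Id 1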
Once started, the debugger breaks and shows something like the following output:
To end the debugging session type the 'Detach' command at the debugger prompt, or type 'Ctrl+C' otherwise.
At /Path/To/PSFunctionApp/HttpTriggerFunction/run.ps1:13 char:1
+ if($name) { ...
+ ~~~~~~~~~~~
[DBG]: [Process:49988]: [Runspace1]: PS /Path/To/PSFunctionApp>>
At this point, you're stopped at a breakpoint in the PowerShell debugger. From here, you can do all of the usual
debug operations, step over, step into, continue, quit, and others. To see the complete set of debug commands
available in the console, run the h or ? commands.
You can also set breakpoints at this level with the Set-PSBreakpoint cmdlet.
Once you continue and fully invoke your script, you'll notice that:
The PowerShell console where you executed Invoke-RestMethod has now returned a result.
The PowerShell console where you executed Debug-Runspace is waiting for a script to be executed.
You can invoke the same function again (using Invoke-RestMethod for example) and the debugger breaks in right
after the Wait-Debugger command.
Should this break happen, run the continue or c command to skip over this breakpoint. You then stop at the
expected breakpoint.
Next steps
To learn more about developing Functions using PowerShell, see Azure Functions PowerShell developer guide.
Use dependency injection in .NET Azure Functions
5/23/2019 • 2 minutes to read • Edit Online
Azure Functions supports the dependency injection (DI) software design pattern, which is a technique for achieving
Inversion of Control (IoC) between classes and their dependencies.
Azure Functions builds on top of the ASP.NET Core Dependency Injection features. You should understand
services, lifetimes, and design patterns of ASP.NET Core dependency injection before using them in functions.
Prerequisites
Before you can use dependency injection, you must install the Microsoft.Azure.Functions.Extensions NuGet
package. You can install this package by running the following command from the package console:
Install-Package Microsoft.Azure.Functions.Extensions
You must also be using version 1.0.28 of the Microsoft.NET.Sdk.Functions package, or a later version.
Registering services
To register services, you can create a configure method and add components to an IFunctionsHostBuilder instance.
The Azure Functions host creates an IFunctionsHostBuilder and passes it directly into your configure method.
To register your configure method, you must add an assembly attribute that specifies the type for your configure
method using the FunctionsStartup attribute.
using System;
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;

[assembly: FunctionsStartup(typeof(MyNamespace.Startup))]

namespace MyNamespace
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.AddHttpClient();

            builder.Services.AddSingleton((s) => {
                return new CosmosClient(Environment.GetEnvironmentVariable("COSMOSDB_CONNECTIONSTRING"));
            });

            builder.Services.AddSingleton<ILoggerProvider, MyLoggerProvider>();
        }
    }
}
Service lifetimes
Azure Function apps provide the same service lifetimes as ASP.NET Core dependency injection: transient, scoped, and
singleton.
In a function app, a scoped service lifetime matches a function execution lifetime. Scoped services are created once
per execution. Later requests for that service during the execution reuse that instance. A singleton service lifetime
matches the host lifetime and is reused across function executions on that instance.
Singleton lifetime services are recommended for connections and clients, for example a SqlConnection ,
CloudBlobClient , or HttpClient .
View or download a sample of different service lifetimes.
Logging services
If you need your own logging provider, the recommended way is to register an ILoggerProvider . For Application
Insights, Functions adds Application Insights automatically for you.
WARNING
Do not add AddApplicationInsightsTelemetry() to the services collection as it will register services that will conflict with
what is provided by the environment.
SERVICE TYPE | LIFETIME | DESCRIPTION
Microsoft.Extensions.Configuration.IConfiguration | Singleton | Runtime configuration
Microsoft.Azure.WebJobs.Host.Executors.IHostIdProvider | Singleton | Responsible for providing the ID of the host instance
Next steps
For more information, see the following resources:
How to monitor your function app
Best practices for functions
Manage connections in Azure Functions
5/17/2019 • 4 minutes to read • Edit Online
Functions in a function app share resources. Among those shared resources are connections: HTTP connections,
database connections, and connections to services such as Azure Storage. When many functions are running
concurrently, it's possible to run out of available connections. This article explains how to code your functions to
avoid using more connections than they need.
Connection limit
The number of available connections is limited partly because a function app runs in a sandbox environment.
One of the restrictions that the sandbox imposes on your code is a limit on the number of outbound connections,
which is currently 600 active (1,200 total) connections per instance. When you reach this limit, the functions
runtime writes the following message to the logs: Host thresholds exceeded: Connections . For more information,
see the Functions service limits.
This limit is per instance. When the scale controller adds function app instances to handle more requests, each
instance has an independent connection limit. That means there's no global connection limit, and you can have
many more than 600 active connections across all active instances.
When troubleshooting, make sure that you have enabled Application Insights for your function app. Application
Insights lets you view metrics for your function apps like executions. For more information, see View telemetry in
Application Insights.
Static clients
To avoid holding more connections than necessary, reuse client instances rather than creating new ones with
each function invocation. We recommend reusing client connections for any language that you might write your
function in. For example, .NET clients like the HttpClient, DocumentClient, and Azure Storage clients can manage
connections if you use a single, static client.
Here are some guidelines to follow when you're using a service-specific client in an Azure Functions application:
Do not create a new client with every function invocation.
Do create a single, static client that every function invocation can use.
Consider creating a single, static client in a shared helper class if different functions use the same service.
A common question about HttpClient in .NET is "Should I dispose of my client?" In general, you dispose of
objects that implement IDisposable when you're done using them. But you don't dispose of a static client
because you aren't done using it when the function ends. You want the static client to live for the duration of your
application.
HTTP agent examples (JavaScript)
Because it provides better connection management options, you should use the native http.Agent class instead
of non-native methods, such as the node-fetch module. Connection parameters are configured through options
on the http.Agent class. For detailed options available with the HTTP agent, see new Agent([options]).
The global http.globalAgent instance used by http.request() has all of these values set to their respective defaults.
The recommended way to configure connection limits in Functions is to set a maximum number globally. The
following example sets the maximum number of sockets for the function app:
http.globalAgent.maxSockets = 200;
The following example creates a new HTTP request with a custom HTTP agent only for that request:
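A minimal sketch of that pattern follows; the module wrapper, target host, and logging are illustrative assumptions:

const http = require('http');

module.exports = async function (context) {
    // Agent created for this request only
    const httpAgent = new http.Agent({ maxSockets: 200 });

    const options = {
        agent: httpAgent,
        hostname: 'example.com', // hypothetical target host
        method: 'GET'
    };

    http.request(options, (res) => {
        context.log(`Status code: ${res.statusCode}`);
    }).end();

    // Rest of function
};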
SqlClient connections
Your function code can use the .NET Framework Data Provider for SQL Server (SqlClient) to make connections
to a SQL relational database. This is also the underlying provider for data frameworks that rely on ADO.NET,
such as Entity Framework. Unlike HttpClient and DocumentClient connections, ADO.NET implements
connection pooling by default. But because you can still run out of connections, you should optimize connections
to the database. For more information, see SQL Server Connection Pooling (ADO.NET).
TIP
Some data frameworks, such as Entity Framework, typically get connection strings from the ConnectionStrings section of
a configuration file. In this case, you must explicitly add SQL database connection strings to the Connection strings
collection of your function app settings and in the local.settings.json file in your local project. If you're creating an instance
of SqlConnection in your function code, you should store the connection string value in Application settings with your
other connections.
Next steps
For more information about why we recommend static clients, see Improper instantiation antipattern.
For more Azure Functions performance tips, see Optimize the performance and reliability of Azure Functions.
Azure Functions error handling
10/30/2018 • 2 minutes to read • Edit Online
This topic provides general guidance for handling errors that occur when your functions execute. It also provides
links to the topics that describe binding-specific errors that may occur.
Manually run a non HTTP-triggered function
This article demonstrates how to manually run a non HTTP-triggered function via a specially formatted HTTP
request.
In some contexts, you may need to run "on-demand" an Azure Function that is indirectly triggered. Examples of
indirect triggers include functions on a schedule or functions that run as the result of another resource's action.
Postman is used in the following example, but you can use cURL, Fiddler, or any similar tool to send HTTP
requests.
Host name: The function app's public location that is made up from the function app's name plus
azurewebsites.net or your custom domain.
Folder path: To access non HTTP-triggered functions via an HTTP request, you have to send the request
through the path admin/functions.
Function name: The name of the function you want to run.
You use this request location in Postman along with the function's master key in the request to Azure to run the
function.
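For example, for a hypothetical app named myfunctionapp and a function named QueueTrigger, the request location would be:

https://myfunctionapp.azurewebsites.net/admin/functions/QueueTrigger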
NOTE
When running locally, the function's master key is not required. You can directly call the function omitting the
x-functions-key header.
Due to the elevated permissions in your function app granted by the master key, you should not share this key
with third parties or distribute it in an application.
8. Click Send.
Postman then reports a status of 202 Accepted.
Next, return to your function in the Azure portal. Locate the Logs window and you'll see messages coming from
the manual call to the function.
Next steps
Strategies for testing your code in Azure Functions
Azure Function Event Grid Trigger Local Debugging
2/22/2019 • 2 minutes to read • Edit Online
This article demonstrates how to debug a local function that handles an Azure Event Grid event raised by a
storage account.
Prerequisites
Create or use an existing function app
Create or use an existing storage account
Download ngrok to allow Azure to call your local function
Once the function is created, open the code file and copy the URL commented out at the top of the file. This
location is used when configuring the Event Grid trigger.
Then, set a breakpoint on the line that begins with log.LogInformation .
As the utility is set up, the command window should look similar to the following screenshot:
Copy the HTTPS URL generated when ngrok is run. This value is used when configuring the Event Grid event
endpoint.
In the Events window, click on the Event Subscription button. In the Event Subscription window, click on the
Endpoint Type dropdown and select Web Hook.
Once the endpoint type is configured, click on Select an endpoint to configure the endpoint value.
The Subscriber Endpoint value is made up from three different values. The prefix is the HTTPS URL generated by
ngrok. The remainder of the URL comes from the URL found in the function code file, with the function name
added at the end. Starting with the URL from the function code file, the ngrok URL replaces
http://localhost:7071 and the function name replaces {functionname} .
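For example, assuming a hypothetical ngrok subdomain and a function named EventGridTrigger, the final Subscriber Endpoint looks something like this (the exact path comes from the commented URL in your function code file):

https://263db807.ngrok.io/runtime/webhooks/EventGrid?functionName=EventGridTrigger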
IMPORTANT
Every time you start ngrok, the HTTPS URL is regenerated and the value changes. Therefore you must create a new Event
Subscription each time you expose your function to Azure via ngrok.
Upload a file
Now you can upload a file to your storage account to trigger an Event Grid event for your local function to handle.
Open Storage Explorer and connect to your storage account.
Expand Blob Containers
Right-click and select Create Blob Container.
Name the container test
Select the test container
Click the Upload button
Click Upload Files
Select a file and upload it to the blob container
Clean up resources
To clean up the resources created in this article, delete the test container in your storage account.
Next steps
Automate resizing uploaded images using Event Grid
Event Grid trigger for Azure Functions
Create a Function using Azure for Students Starter
5/6/2019 • 5 minutes to read • Edit Online
In this tutorial, we will create a hello world HTTP function in an Azure for Students Starter subscription. We'll also
walk through what's available in Azure Functions in this subscription type.
Microsoft Azure for Students Starter gets you started with the Azure products you need to develop in the cloud at
no cost to you. Learn more about this offer here.
Azure Functions lets you execute your code in a serverless environment without having to first create a VM or
publish a web application. Learn more about Functions here.
Create a Function
In this topic, learn how to use Functions to create an HTTP triggered "hello world" function in the Azure portal.
Sign in to Azure
Sign in to the Azure portal at https://portal.azure.com with your Azure account.
SETTING | SUGGESTED VALUE | DESCRIPTION
App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -.
App Service Plan/Location | New | The hosting plan that controls what region your function app is deployed to and the density of your resources. Multiple Function Apps deployed to the same plan will all share the same single free instance. This is a restriction of the Student Starter plan. The full hosting options are explained here.
2. Paste the function URL into your browser's address bar. Add the query string value &name=<yourname> to the
end of this URL and press the Enter key on your keyboard to execute the request. You should see the
response returned by the function displayed in the browser.
The following example shows the response in the browser:
The request URL includes a key that is required, by default, to access your function over HTTP.
3. When your function runs, trace information is written to the logs. To see the trace output from the previous
execution, return to your function in the portal and click the arrow at the bottom of the screen to expand the
Logs.
Clean up resources
Other quick starts in this collection build upon this quick start. If you plan to work with subsequent quick starts,
tutorials, or with any of the services you have created in this quick start, do not clean up the resources.
Resources in Azure refer to function apps, functions, storage accounts, and so forth. They are grouped into
resource groups, and you can delete everything in a group by deleting the group.
You created resources to complete these quickstarts. You may be billed for these resources, depending on your
account status and service pricing. If you don't need the resources anymore, here's how to delete them:
1. In the Azure portal, go to the Resource group page.
To get to that page from the function app page, select the Overview tab and then select the link under
Resource group.
To get to that page from the dashboard, select Resource groups, and then select the resource group that
you used for this quickstart.
2. In the Resource group page, review the list of included resources, and verify that they are the ones you
want to delete.
3. Select Delete resource group, and follow the instructions.
Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can
also select the bell icon at the top of the page to view the notification.
Next steps
You have created a function app with a simple HTTP triggered function! Now you can explore local tooling, more
languages, monitoring, and integrations.
Create your first function using Visual Studio
Create your first function using Visual Studio Code
Azure Functions JavaScript developer guide
Use Azure Functions to connect to an Azure SQL Database
Learn more about Azure Functions HTTP bindings.
Monitor your Azure Functions
Continuous deployment for Azure Functions
4/30/2019 • 5 minutes to read • Edit Online
Azure Functions makes it easy to deploy your function app by using continuous integration. Functions integrates
with major code repositories and deployment sources. This integration enables a workflow where function code
updates made through one of these services trigger deployment to Azure. If you're new to Azure Functions, start
with the Azure Functions overview.
Continuous deployment is a great option for projects where you're integrating multiple and frequent
contributions. It also lets you maintain source control on your function code. Azure Functions supports the
following deployment sources:
Bitbucket
Dropbox
External repository (Git or Mercurial)
Git local repository
GitHub
OneDrive
Azure DevOps
Deployments are configured on a per-function app basis. After continuous deployment is enabled, access to
function code in the portal is set to read-only.
FunctionApp
| - host.json
| - Myfirstfunction
| | - function.json
| | - ...
| - mysecondfunction
| | - function.json
| | - ...
| - SharedCode
| - bin
In version 2.x of the Functions runtime, all functions in the function app must share the same language worker.
The host.json file, which contains some runtime-specific configurations, is in the root folder of the function app. A
bin folder contains packages and other library files required by the function app. See the language-specific
requirements for a function app project:
C# class library (.csproj)
C# script (.csx)
F# script
Java
JavaScript
To be able to deploy from Azure DevOps, you must first link your Azure DevOps organization with your Azure
subscription. For more information, see Set up billing for your Azure DevOps organization.
3. On the Deployment source blade, select Choose source. Fill in the information for your chosen
deployment source, and then select OK.
After you set up continuous deployment, all file changes in your deployment source are copied to the function
app and a full site deployment is triggered. The site is redeployed when files in the source are updated.
Deployment scenarios
Typical deployment scenarios include creating a staging deployment and moving existing functions to continuous
deployment.
Create a staging deployment
Function apps don't yet support deployment slots. But you can still manage separate staging and production
deployments by using continuous integration.
The process to configure and work with a staging deployment looks generally like this:
1. Create two function apps in your subscription: one for the production code and one for staging.
2. Create a deployment source, if you don't already have one. This example uses GitHub.
3. For your production function app, complete the preceding steps in Set up continuous deployment and set
the deployment branch to the master branch of your GitHub repository.
4. Repeat step 3 for the staging function app, but choose the staging branch instead in your GitHub repo. If
your deployment source doesn't support branching, use a different folder.
5. Make updates to your code in the staging branch or folder, and then verify that the staging deployment
reflects those changes.
6. After testing, merge changes from the staging branch into the master branch. This merge triggers
deployment to the production function app. If your deployment source doesn't support branches,
overwrite the files in the production folder with the files from the staging folder.
Move existing functions to continuous deployment
When you have existing functions that you have created and maintained in the portal, you need to download your
function code files by using FTP or the local Git repository before you can set up continuous deployment as
described earlier. You can do this in the Azure App Service settings for your function app. After you download the
files, you can upload them to your chosen continuous deployment source.
NOTE
After you configure continuous integration, you can no longer edit your source files in the Functions portal.
You can now use these credentials to access your function app from FTP or the built-in Git repo.
Download files by using FTP
1. In your function app in the Azure portal, select Platform features > Properties. Then, copy the values for
FTP/Deployment User, FTP Host Name, and FTPS Host Name.
FTP/Deployment User must be entered as displayed in the portal, including the app name, to provide
proper context for the FTP server.
2. From your FTP client, use the connection information that you gathered to connect to your app and
download the source files for your functions.
Download files by using a local Git repository
1. In your function app in the Azure portal, select Platform features > Deployment options.
5. Clone the repository on your local machine by using a Git-aware command prompt or your favorite Git
tool. The Git clone command looks like this:
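git clone https://<username>@<function_app_name>.scm.azurewebsites.net:443/<function_app_name>.git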
6. Fetch files from your function app to the clone on your local computer, as in the following example:
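git pull origin master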
Build and deploy using Azure Pipelines
You can automatically deploy your function to an Azure Function app using Azure Pipelines. To define your
pipeline, you can use:
YAML file: This file describes the pipeline. It can include a build section and a release section. The YAML
file should be in the same repo as the app.
Templates: Templates are ready-made tasks that build or deploy your app.
YAML-based pipeline
Build your app
Building your app in Azure Pipelines depends on the programming language of your app. Each language has
specific build steps to create a deployment artifact, which can be used to deploy your function app in Azure.
.NET
You can use the following sample to create your YAML file to build your .NET app.
jobs:
- job: Build
pool:
vmImage: 'VS2017-Win2016'
steps:
- script: |
dotnet restore
dotnet build --configuration Release
- task: DotNetCoreCLI@2
inputs:
command: publish
arguments: '--configuration Release --output publish_output'
projects: '*.csproj'
publishWebProjects: false
modifyOutputPath: true
zipAfterPublish: false
- task: ArchiveFiles@2
displayName: "Archive files"
inputs:
rootFolderOrFile: "$(System.DefaultWorkingDirectory)/publish_output"
includeRootFolder: false
archiveFile: "$(System.DefaultWorkingDirectory)/build$(Build.BuildId).zip"
- task: PublishBuildArtifacts@1
inputs:
PathtoPublish: '$(System.DefaultWorkingDirectory)/build$(Build.BuildId).zip'
name: 'drop'
JavaScript
You can use the following sample to create your YAML file to build your JavaScript app:
jobs:
- job: Build
pool:
vmImage: ubuntu-16.04 # Use 'VS2017-Win2016' if you have Windows native +Node modules
steps:
- bash: |
if [ -f extensions.csproj ]
then
dotnet build extensions.csproj --output ./bin
fi
npm install
npm run build --if-present
npm prune --production
- task: ArchiveFiles@2
displayName: "Archive files"
inputs:
rootFolderOrFile: "$(System.DefaultWorkingDirectory)"
includeRootFolder: false
archiveFile: "$(System.DefaultWorkingDirectory)/build$(Build.BuildId).zip"
- task: PublishBuildArtifacts@1
inputs:
PathtoPublish: '$(System.DefaultWorkingDirectory)/build$(Build.BuildId).zip'
name: 'drop'
Python
You can use the following sample to create your YAML file to build your Python app. Python is only supported for
Linux Azure Functions:
jobs:
- job: Build
pool:
vmImage: ubuntu-16.04
steps:
- task: UsePythonVersion@0
displayName: "Setting python version to 3.6 as required by functions"
inputs:
versionSpec: '3.6'
architecture: 'x64'
- bash: |
if [ -f extensions.csproj ]
then
dotnet build extensions.csproj --output ./bin
fi
python3.6 -m venv worker_venv
source worker_venv/bin/activate
pip3.6 install setuptools
pip3.6 install -r requirements.txt
- task: ArchiveFiles@2
displayName: "Archive files"
inputs:
rootFolderOrFile: "$(System.DefaultWorkingDirectory)"
includeRootFolder: false
archiveFile: "$(System.DefaultWorkingDirectory)/build$(Build.BuildId).zip"
- task: PublishBuildArtifacts@1
inputs:
PathtoPublish: '$(System.DefaultWorkingDirectory)/build$(Build.BuildId).zip'
name: 'drop'
Deploy your app
You can use the following snippet, which uses the AzureFunctionApp task, to deploy your function app:
steps:
- task: AzureFunctionApp@1
inputs:
azureSubscription: '<Azure service connection>'
appType: functionAppLinux
appName: '<Name of function app>'
Template-based pipeline
Templates in Azure DevOps are predefined groups of tasks that build or deploy an app.
Build your app
Building your app in Azure Pipelines depends on the programming language of your app. Each language has
specific build steps to create a deployment artifact that can be used to update your function app in Azure. To use
the built-in build templates, when creating a new build pipeline, choose Use the classic editor to create a pipeline
by using the designer templates.
After configuring the source of your code, search for Azure Functions build templates, and choose the template
that matches your app language.
JavaScript apps
If your JavaScript app has a dependency on Windows native modules, you will need to update:
The Agent Pool version to Hosted VS2017
Next steps
Azure Functions Overview
Azure DevOps Overview
Zip deployment for Azure Functions
3/15/2019 • 6 minutes to read • Edit Online
This article describes how to deploy your function app project files to Azure from a .zip (compressed) file. You
learn how to do a push deployment, both by using Azure CLI and by using the REST APIs. Azure Functions Core
Tools also uses these deployment APIs when publishing a local project to Azure.
Azure Functions has the full range of continuous deployment and integration options that are provided by Azure
App Service. For more information, see Continuous deployment for Azure Functions.
To speed development, you may find it easier to deploy your function app project files directly from a .zip file. The
.zip deployment API takes the contents of a .zip file and extracts the contents into the wwwroot folder of your
function app. This .zip file deployment uses the same Kudu service that powers continuous integration-based
deployments, including:
Deletion of files that were left over from earlier deployments.
Deployment customization, including running deployment scripts.
Deployment logs.
Syncing function triggers in a Consumption plan function app.
For more information, see the .zip deployment reference.
IMPORTANT
When you use .zip deployment, any files from an existing deployment that aren't found in the .zip file are deleted from your
function app.
The code for all the functions in a specific function app is located in a root project folder that contains a host
configuration file and one or more subfolders. Each subfolder contains the code for a separate function, as in the
following representation:
FunctionApp
| - host.json
| - Myfirstfunction
| | - function.json
| | - ...
| - mysecondfunction
| | - function.json
| | - ...
| - SharedCode
| - bin
In version 2.x of the Functions runtime, all functions in the function app must share the same language worker.
The host.json file, which contains some runtime-specific configurations, is in the root folder of the function app. A
bin folder contains packages and other library files required by the function app. See the language-specific
requirements for a function app project:
C# class library (.csproj)
C# script (.csx)
F# script
Java
JavaScript
A function app includes all of the files and folders in the wwwroot directory. A .zip file deployment includes the
contents of the wwwroot directory, but not the directory itself. When deploying a C# class library project, you must
include the compiled library files and dependencies in a bin subfolder in your .zip package.
The downloaded .zip file is in the correct format to be republished to your function app by using .zip
push deployment. The portal download can also add the files needed to open your function app
directly in Visual Studio.
Using REST APIs:
Use the following deployment GET API to download the files from your <function_app> project:
https://<function_app>.scm.azurewebsites.net/api/zip/site/wwwroot/
Including /site/wwwroot/ makes sure your zip file includes only the function app project files and not the
entire site. If you are not already signed in to Azure, you will be asked to do so.
You can also download a .zip file from a GitHub repository. When you download a GitHub repository as a .zip file,
GitHub adds an extra folder level for the branch. This extra folder level means that you can't deploy the .zip file
directly as you downloaded it from GitHub. If you're using a GitHub repository to maintain your function app, you
should use continuous integration to deploy your app.
Deploy by using Azure CLI
You can use Azure CLI to trigger a push deployment. Push deploy a .zip file to your function app by using the az
functionapp deployment source config-zip command. To use this command, you must use Azure CLI version
2.0.21 or later. To see what Azure CLI version you are using, use the az --version command.
In the following command, replace the <zip_file_path> placeholder with the path to the location of your .zip file.
Also, replace <app_name> with the unique name of your function app.
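az functionapp deployment source config-zip -g <resource_group> -n <app_name> --src <zip_file_path>
The command also requires a resource group: replace <resource_group> with the name of the resource group that contains your function app.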
This command deploys project files from the downloaded .zip file to your function app in Azure. It then restarts
the app. To view the list of deployments for this function app, you must use the REST APIs.
When you're using Azure CLI on your local computer, <zip_file_path> is the path to the .zip file on your
computer. You can also run Azure CLI in Azure Cloud Shell. When you use Cloud Shell, you must first upload your
deployment .zip file to the Azure Files account that's associated with your Cloud Shell. In that case,
<zip_file_path> is the storage location that your Cloud Shell account uses. For more information, see Persist files
in Azure Cloud Shell.
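Deploy by using REST APIs
You can use the deployment service REST APIs to deploy the .zip file to your app in Azure. To deploy, send a POST request to https://<app_name>.scm.azurewebsites.net/api/zipdeploy with the contents of the .zip file in the request body. As a sketch, using cURL with your deployment credentials:
curl -X POST -u <deployment_user> --data-binary @"<zip_file_path>" https://<app_name>.scm.azurewebsites.net/api/zipdeploy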
This request triggers push deployment from the uploaded .zip file. You can review the current and past
deployments by using the https://<app_name>.scm.azurewebsites.net/api/deployments endpoint, as shown in the
following cURL example. Again, replace <app_name> with the name of your app and <deployment_user> with the
username of your deployment credentials.
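curl -u <deployment_user> https://<app_name>.scm.azurewebsites.net/api/deployments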
With PowerShell
The following example uses Invoke-RestMethod to send a request that contains the .zip file. Replace the
placeholders <deployment_user> , <deployment_password> , <zip_file_path> , and <app_name> .
$username = "<deployment_user>"
$password = "<deployment_password>"
$filePath = "<zip_file_path>"
$apiUrl = "https://<app_name>.scm.azurewebsites.net/api/zipdeploy"
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username, $password)))
$userAgent = "powershell/1.0"
Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -UserAgent $userAgent -Method POST -InFile $filePath -ContentType "multipart/form-data"
This request triggers push deployment from the uploaded .zip file. To review the current and past deployments,
run the following commands. Again, replace the <app_name> placeholder.
$apiUrl = "https://<app_name>.scm.azurewebsites.net/api/deployments"
Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -UserAgent $userAgent -Method GET
Deployment customization
The deployment process assumes that the .zip file that you push contains a ready-to-run app. By default, no
customizations are run. To enable the same build processes that you get with continuous integration, add the
following to your application settings:
SCM_DO_BUILD_DURING_DEPLOYMENT=true
When you use .zip push deployment, this setting is false by default. The default is true for continuous integration
deployments. When set to true, your deployment-related settings are used during deployment. You can configure
these settings either as app settings or in a .deployment configuration file that's located in the root of your .zip file.
For more information, see Repository and deployment-related settings in the deployment reference.
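For example, a minimal Azure CLI sketch that enables builds during .zip deployment as an app setting:
az functionapp config appsettings set -g <resource_group> -n <app_name> \
--settings SCM_DO_BUILD_DURING_DEPLOYMENT=true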
Next steps
Continuous deployment for Azure Functions
Run your Azure Functions from a package file
5/16/2019 • 2 minutes to read • Edit Online
NOTE
The functionality described in this article is not available for function apps running on Linux in an App Service plan.
In Azure, you can run your functions directly from a deployment package file in your function app. The other
option is to deploy your files in the d:\home\site\wwwroot directory of your function app.
This article describes the benefits of running your functions from a package. It also shows how to enable this
functionality in your function app.
The WEBSITE_RUN_FROM_PACKAGE app setting enables this feature. Set its value to 1 to run from a package file in the d:\home\data\SitePackages folder of your function app, or to the URL of an externally hosted package file, such as one in Azure Blob storage.
Caution
When running a function app on Windows, the external URL option yields worse cold-start performance. When
deploying your function app to Windows, you should set WEBSITE_RUN_FROM_PACKAGE to 1 and publish with zip
deployment.
The following shows a function app configured to run from a .zip file hosted in Azure Blob storage:
NOTE
Currently, only .zip package files are supported.
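As a sketch, you can point the app at a package with the Azure CLI (the Blob storage URL, including its SAS token, is illustrative):
az functionapp config appsettings set -g <resource_group> -n <app_name> \
--settings WEBSITE_RUN_FROM_PACKAGE="https://<storage_account>.blob.core.windows.net/<container>/package.zip?<sas_token>"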
Troubleshooting
Run From Package makes wwwroot read-only, so you will receive an error when writing files to this directory.
Tar and gzip formats are not supported.
This feature does not compose with local cache.
For improved cold-start performance, use the local Zip option (WEBSITE_RUN_FROM_PACKAGE=1).
Next steps
Continuous deployment for Azure Functions
Automate resource deployment for your function app
in Azure Functions
4/26/2019 • 10 minutes to read • Edit Online
You can use an Azure Resource Manager template to deploy a function app. This article outlines the required
resources and parameters for doing so. You might need to deploy additional resources, depending on the triggers
and bindings in your function app.
For more information about creating templates, see Authoring Azure Resource Manager templates.
For sample templates, see:
Function app on Consumption plan
Function app on Azure App Service plan
NOTE
The Premium plan for Azure Functions hosting is currently in preview. For more information, see Azure Functions Premium
plan.
Required resources
An Azure Functions deployment typically consists of these resources:
A function app (Microsoft.Web/sites)
A storage account (Microsoft.Storage/storageAccounts)
An Application Insights component (Microsoft.Insights/components)
A hosting plan (Microsoft.Web/serverfarms)
A hosting plan is only required when you choose to run your function app on a Premium plan (in preview) or on an App Service plan.
TIP
While not required, it is strongly recommended that you configure Application Insights for your app.
Storage account
An Azure storage account is required for a function app. You need a general purpose account that supports blobs,
tables, queues, and files. For more information, see Azure Functions storage account requirements.
{
"type": "Microsoft.Storage/storageAccounts",
"name": "[variables('storageAccountName')]",
"apiVersion": "2018-07-01",
"location": "[resourceGroup().location]",
"kind": "StorageV2",
"properties": {
"accountType": "[parameters('storageAccountType')]"
}
}
In addition, the property AzureWebJobsStorage must be specified as an app setting in the site configuration. If the
function app doesn't use Application Insights for monitoring, it should also specify AzureWebJobsDashboard as an
app setting.
The Azure Functions runtime uses the AzureWebJobsStorage connection string to create internal queues. When
Application Insights is not enabled, the runtime uses the AzureWebJobsDashboard connection string to log to Azure
Table storage and power the Monitor tab in the portal.
These properties are specified in the appSettings collection in the siteConfig object:
"appSettings": [
{
"name": "AzureWebJobsStorage",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'),
';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
},
{
"name": "AzureWebJobsDashboard",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'),
';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
}
]
Application Insights
Application Insights is recommended for monitoring your function apps. The Application Insights resource is
defined with the type Microsoft.Insights/components and the kind web:
{
"apiVersion": "2015-05-01",
"name": "[variables('appInsightsName')]",
"type": "Microsoft.Insights/components",
"kind": "web",
"location": "[resourceGroup().location]",
"tags": {
"[concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/sites/',
variables('functionAppName'))]": "Resource"
},
"properties": {
"Application_Type": "web",
"ApplicationId": "[variables('functionAppName')]"
}
},
In addition, the instrumentation key needs to be provided to the function app using the
APPINSIGHTS_INSTRUMENTATIONKEY application setting. This property is specified in the appSettings collection in the
siteConfig object:
"appSettings": [
{
"name": "APPINSIGHTS_INSTRUMENTATIONKEY",
"value": "[reference(resourceId('microsoft.insights/components/', variables('appInsightsName')),
'2015-05-01').InstrumentationKey]"
}
]
Hosting plan
The definition of the hosting plan varies, and can be one of the following:
Consumption plan (default)
Premium plan (in preview)
App Service plan
Function app
The function app resource is defined by using a resource of type Microsoft.Web/sites and kind functionapp:
{
"apiVersion": "2015-08-01",
"type": "Microsoft.Web/sites",
"name": "[variables('functionAppName')]",
"location": "[resourceGroup().location]",
"kind": "functionapp",
"dependsOn": [
"[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]",
"[resourceId('Microsoft.Insights/components', variables('appInsightsName'))]"
]
IMPORTANT
If you are explicitly defining a hosting plan, an additional item is needed in the dependsOn array:
"[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]"
The FUNCTIONS_WORKER_RUNTIME app setting specifies the language stack for the functions in this app: dotnet, node, java, or python.
These properties are specified in the appSettings collection in the siteConfig property:
"properties": {
"siteConfig": {
"appSettings": [
{
"name": "AzureWebJobsStorage",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-
preview').key1)]"
},
{
"name": "FUNCTIONS_WORKER_RUNTIME",
"value": "node"
},
{
"name": "WEBSITE_NODE_DEFAULT_VERSION",
"value": "10.14.1"
},
{
"name": "FUNCTIONS_EXTENSION_VERSION",
"value": "~2"
}
]
}
}
A Consumption plan is defined as a serverfarms resource that uses the Dynamic compute mode and SKU:
{
"type": "Microsoft.Web/serverfarms",
"apiVersion": "2015-04-01",
"name": "[variables('hostingPlanName')]",
"location": "[resourceGroup().location]",
"properties": {
"name": "[variables('hostingPlanName')]",
"computeMode": "Dynamic",
"sku": "Dynamic"
}
}
NOTE
The Consumption plan cannot be explicitly defined for Linux. It will be created automatically.
If you do explicitly define your consumption plan, you will need to set the serverFarmId property on the app so
that it points to the resource ID of the plan. You should ensure that the function app has a dependsOn setting for
the plan as well.
Create a function app
Windows
On Windows, a Consumption plan requires two additional settings in the site configuration:
WEBSITE_CONTENTAZUREFILECONNECTIONSTRING and WEBSITE_CONTENTSHARE . These properties configure the storage
account and file path where the function app code and configuration are stored.
{
"apiVersion": "2016-03-01",
"type": "Microsoft.Web/sites",
"name": "[variables('functionAppName')]",
"location": "[resourceGroup().location]",
"kind": "functionapp",
"dependsOn": [
"[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]"
],
"properties": {
"siteConfig": {
"appSettings": [
{
"name": "AzureWebJobsStorage",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-
preview').key1)]"
},
{
"name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-
preview').key1)]"
},
{
"name": "WEBSITE_CONTENTSHARE",
"value": "[toLower(variables('functionAppName'))]"
},
{
"name": "FUNCTIONS_WORKER_RUNTIME",
"value": "node"
},
{
"name": "WEBSITE_NODE_DEFAULT_VERSION",
"value": "10.14.1"
},
{
"name": "FUNCTIONS_EXTENSION_VERSION",
"value": "~2"
}
]
}
}
}
Linux
On Linux, the function app must have its kind set to functionapp,linux, and it must have the reserved property set to true:
{
"apiVersion": "2016-03-01",
"type": "Microsoft.Web/sites",
"name": "[variables('functionAppName')]",
"location": "[resourceGroup().location]",
"kind": "functionapp,linux",
"dependsOn": [
"[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]"
],
"properties": {
"siteConfig": {
"appSettings": [
{
"name": "AzureWebJobsStorage",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-
preview').key1)]"
},
{
"name": "FUNCTIONS_WORKER_RUNTIME",
"value": "node"
},
{
"name": "WEBSITE_NODE_DEFAULT_VERSION",
"value": "10.14.1"
},
{
"name": "FUNCTIONS_EXTENSION_VERSION",
"value": "~2"
}
]
},
"reserved": true
}
}
The Premium plan (in preview) is defined as a serverfarms resource with an Elastic Premium SKU, such as EP1:
{
"type": "Microsoft.Web/serverfarms",
"apiVersion": "2015-04-01",
"name": "[variables('hostingPlanName')]",
"location": "[resourceGroup().location]",
"properties": {
"name": "[variables('hostingPlanName')]",
"sku": "EP1"
}
}
On a Premium plan, the function app also requires the WEBSITE_CONTENTAZUREFILECONNECTIONSTRING and WEBSITE_CONTENTSHARE settings:
{
"apiVersion": "2016-03-01",
"type": "Microsoft.Web/sites",
"name": "[variables('functionAppName')]",
"location": "[resourceGroup().location]",
"kind": "functionapp",
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]"
],
"properties": {
"serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"siteConfig": {
"appSettings": [
{
"name": "AzureWebJobsStorage",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-
preview').key1)]"
},
{
"name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-
preview').key1)]"
},
{
"name": "WEBSITE_CONTENTSHARE",
"value": "[toLower(variables('functionAppName'))]"
},
{
"name": "FUNCTIONS_WORKER_RUNTIME",
"value": "node"
},
{
"name": "WEBSITE_NODE_DEFAULT_VERSION",
"value": "10.14.1"
},
{
"name": "FUNCTIONS_EXTENSION_VERSION",
"value": "~2"
}
]
}
}
}
To run your app on Linux, you must also set the kind to Linux:
{
"type": "Microsoft.Web/serverfarms",
"apiVersion": "2015-04-01",
"name": "[variables('hostingPlanName')]",
"location": "[resourceGroup().location]",
"kind": "Linux",
"properties": {
"name": "[variables('hostingPlanName')]",
"sku": "[parameters('sku')]",
"workerSize": "[parameters('workerSize')]",
"hostingEnvironment": "",
"numberOfWorkers": 1
}
}
Linux apps should also include a linuxFxVersion property under siteConfig. If you are just deploying code, the value for this is determined by your desired runtime stack:
JavaScript: DOCKER|microsoft/azure-functions-node8:2.0
.NET: DOCKER|microsoft/azure-functions-dotnet-core2.0:2.0
{
"apiVersion": "2016-03-01",
"type": "Microsoft.Web/sites",
"name": "[variables('functionAppName')]",
"location": "[resourceGroup().location]",
"kind": "functionapp",
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]"
],
"properties": {
"serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"siteConfig": {
"appSettings": [
{
"name": "AzureWebJobsStorage",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-
preview').key1)]"
},
{
"name": "FUNCTIONS_WORKER_RUNTIME",
"value": "node"
},
{
"name": "WEBSITE_NODE_DEFAULT_VERSION",
"value": "10.14.1"
},
{
"name": "FUNCTIONS_EXTENSION_VERSION",
"value": "~2"
}
],
"linuxFxVersion": "DOCKER|microsoft/azure-functions-node8:2.0"
}
}
}
If you are deploying a custom container image, you must specify it with linuxFxVersion and include configuration
that allows your image to be pulled, as in Web App for Containers. Also, set WEBSITES_ENABLE_APP_SERVICE_STORAGE
to false , since your app content is provided in the container itself:
{
"apiVersion": "2016-03-01",
"type": "Microsoft.Web/sites",
"name": "[variables('functionAppName')]",
"location": "[resourceGroup().location]",
"kind": "functionapp",
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]"
],
"properties": {
"serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"siteConfig": {
"appSettings": [
{
"name": "AzureWebJobsStorage",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-
preview').key1)]"
},
{
"name": "FUNCTIONS_WORKER_RUNTIME",
"value": "node"
},
{
"name": "WEBSITE_NODE_DEFAULT_VERSION",
"value": "10.14.1"
},
{
"name": "FUNCTIONS_EXTENSION_VERSION",
"value": "~2"
},
{
"name": "DOCKER_REGISTRY_SERVER_URL",
"value": "[parameters('dockerRegistryUrl')]"
},
{
"name": "DOCKER_REGISTRY_SERVER_USERNAME",
"value": "[parameters('dockerRegistryUsername')]"
},
{
"name": "DOCKER_REGISTRY_SERVER_PASSWORD",
"value": "[parameters('dockerRegistryPassword')]"
},
{
"name": "WEBSITES_ENABLE_APP_SERVICE_STORAGE",
"value": "false"
}
],
"linuxFxVersion": "DOCKER|myacr.azurecr.io/myimage:mytag"
}
}
}
Customizing a deployment
A function app has many child resources that you can use in your deployment, including app settings and source
control options. You also might choose to remove the sourcecontrols child resource, and use a different
deployment option instead.
IMPORTANT
To successfully deploy your application by using Azure Resource Manager, it's important to understand how resources are
deployed in Azure. In the following example, top-level configurations are applied by using siteConfig. It's important to set
these configurations at a top level, because they convey information to the Functions runtime and deployment engine. Top-
level information is required before the child sourcecontrols/web resource is applied. Although it's possible to configure
these settings in the child-level config/appSettings resource, in some cases your function app must be deployed before
config/appSettings is applied. For example, when you are using functions with Logic Apps, your functions are a dependency
of another resource.
{
"apiVersion": "2015-08-01",
"name": "[parameters('appName')]",
"type": "Microsoft.Web/sites",
"kind": "functionapp",
"location": "[parameters('location')]",
"dependsOn": [
"[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]",
"[resourceId('Microsoft.Web/serverfarms', parameters('appName'))]"
],
"properties": {
"serverFarmId": "[variables('appServicePlanName')]",
"siteConfig": {
"alwaysOn": true,
"appSettings": [
{
"name": "FUNCTIONS_EXTENSION_VERSION",
"value": "~2"
},
{
"name": "Project",
"value": "src"
}
]
}
},
"resources": [
{
"apiVersion": "2015-08-01",
"name": "appsettings",
"type": "config",
"dependsOn": [
"[resourceId('Microsoft.Web/Sites', parameters('appName'))]",
"[resourceId('Microsoft.Web/Sites/sourcecontrols', parameters('appName'), 'web')]",
"[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]"
],
"properties": {
"AzureWebJobsStorage": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-
preview').key1)]",
"AzureWebJobsDashboard": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-
preview').key1)]",
"FUNCTIONS_EXTENSION_VERSION": "~2",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"Project": "src"
}
},
{
"apiVersion": "2015-08-01",
"name": "web",
"type": "sourcecontrols",
"dependsOn": [
"[resourceId('Microsoft.Web/sites/', parameters('appName'))]"
],
"properties": {
"RepoUrl": "[parameters('sourceCodeRepositoryURL')]",
"branch": "[parameters('sourceCodeBranch')]",
"IsManualIntegration": "[parameters('sourceCodeManualIntegration')]"
}
}
]
}
TIP
This template uses the Project app settings value, which sets the base directory in which the Functions deployment engine
(Kudu) looks for deployable code. In our repository, our functions are in a subfolder of the src folder. So, in the preceding
example, we set the app settings value to src . If your functions are in the root of your repository, or if you are not
deploying from source control, you can remove this app settings value.
You can add a Deploy to Azure button that deploys your template through the portal. Replace <url-encoded-path-to-azuredeploy-json> with the URL-encoded location of your template file. In Markdown:
[![Deploy to Azure](https://azuredeploy.net/deploybutton.png)](https://portal.azure.com/#create/Microsoft.Template/uri/<url-encoded-path-to-azuredeploy-json>)
In HTML:
<a href="https://portal.azure.com/#create/Microsoft.Template/uri/<url-encoded-path-to-azuredeploy-json>"
target="_blank"><img src="https://azuredeploy.net/deploybutton.png"></a>
# Create the parameters for the file, which for this template is the function app name.
$TemplateParams = @{"appName" = "<function-app-name>"}
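# A sketch of deploying the template with those parameters (the local file name azuredeploy.json is assumed).
New-AzResourceGroupDeployment -ResourceGroupName <resource-group> -TemplateFile azuredeploy.json -TemplateParameterObject $TemplateParams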
To test out this deployment, you can use a template like this one that creates a function app on Windows in a
Consumption plan. Replace <function-app-name> with a unique name for your function app.
Next steps
Learn more about how to develop and configure Azure Functions.
Azure Functions developer reference
How to configure Azure function app settings
Create your first Azure function
Install the Azure Functions Runtime preview 2
4/12/2019 • 4 minutes to read • Edit Online
IMPORTANT
The Azure Functions Runtime preview 2 supports only version 1.x of the Azure Functions runtime. This preview feature is not
being updated to support version 2.x of the runtime, and no future updates are planned. If you need to host the Azure
Functions runtime outside of Azure, consider using Azure Functions on Kubernetes with KEDA.
If you would like to install the Azure Functions Runtime preview 2, follow these steps:
1. Ensure your machine passes the minimum requirements.
2. Download the Azure Functions Runtime Preview Installer.
3. Uninstall the Azure Functions Runtime preview 1.
4. Install the Azure Functions Runtime preview 2.
5. Complete the configuration of the Azure Functions Runtime preview 2.
6. Create your first function in Azure Functions Runtime Preview
Prerequisites
Before you install the Azure Functions Runtime preview, you must have the following resources available:
1. A machine running Microsoft Windows Server 2016 or Microsoft Windows 10 Creators Update (Professional
or Enterprise Edition).
2. A SQL Server instance running within your network. Minimum edition required is SQL Server Express.
NOTE
You can install the Functions Worker Role on many other machines. To do so, follow these instructions, and only
select Functions Worker Role in the installer.
5. Click Next to have the Azure Functions Runtime Setup Wizard begin the installation process on your
machine.
6. Once complete, the setup wizard launches the Azure Functions Runtime configuration tool.
NOTE
If you are installing on Windows 10 and the Container feature has not been previously enabled, the Azure
Functions Runtime Setup prompts you to reboot your machine to complete the install.
2. Click the Database tab, enter the connection details for your SQL Server instance, including specifying a
Database master key, and click Apply. Connectivity to a SQL Server instance is required in order for the
Azure Functions Runtime to create a database to support the Runtime.
3. Click the Credentials tab. Here, you must create two new credentials for use with a file share for hosting all
your function apps. Specify User name and Password combinations for the file share owner and for the
file share user, then click Apply.
4. Click the File Share tab. Here you must specify the details of the file share location. The file share can be created for you, or you can use an existing file share. Click Apply. If you select a new file share location, you must specify a directory for use by the Azure Functions Runtime.
5. Click the IIS tab. This tab shows the details of the websites in IIS that the Azure Functions Runtime
configuration tool creates. You may specify a custom DNS name here for the Azure Functions Runtime
preview portal. Click Apply to complete.
6. Click the Services tab. This tab shows the status of the services in your Azure Functions Runtime
configuration tool. If the Azure Functions Host Activation Service is not running after initial
configuration, click Start Service.
7. Browse to the Azure Functions Runtime Portal at https://<machinename>.<domain>/.
2. You are prompted to log in. If deployed in a domain, use your domain account username and password; otherwise, use your local account username and password to log in to the portal.
3. To create function apps, you must create a Subscription. In the top left-hand corner of the portal, click the +
option next to the subscriptions.
4. Choose DefaultPlan, enter a name for your Subscription, and click Create.
5. All of your function apps are listed in the left-hand pane of the portal. To create a new Function App, select
the heading Function Apps and click the + option.
6. Enter a name for your function app, select the correct Subscription, choose which version of the Azure
Functions runtime you wish to program against and click Create
7. Your new function app is listed in the left-hand pane of the portal. Select Functions and then click New
Function at the top of the center pane in the portal.
8. Select the Timer Trigger function. In the right-hand flyout, name your function, change the Schedule to */5 * * * * * (this cron expression causes your timer function to execute every five seconds), and click Create.
9. Your function has now been created. You can view the execution log of your Function app by expanding the
log pane at the bottom of the portal.
How to manage a function app in the Azure portal
5/20/2019 • 3 minutes to read • Edit Online
In Azure Functions, a function app provides the execution context for your individual functions. Function app
behaviors apply to all functions hosted by a given function app. This topic describes how to configure and manage
your function apps in the Azure portal.
To begin, go to the Azure portal and sign in to your Azure account. In the search bar at the top of the portal, type
the name of your function app and select it from the list. After selecting your function app, you see the following
page:
You can navigate to everything you need to manage your function app from the overview page, in particular the
Application settings and Platform features.
Application settings
The Application Settings tab maintains settings that are used by your function app.
These settings are stored encrypted, and you must select Show values to see the values in the portal.
To add a setting, select New application setting and add the new key-value pair.
The function app settings values can also be read in your code as environment variables. For more information, see
the Environment variables section of these language-specific reference topics:
C# precompiled
C# script (.csx)
F# script (.fsx)
Java
JavaScript
PowerShell
When you develop a function app locally, these values are maintained in the local.settings.json project file.
Platform features
Function apps run in, and are maintained by, the Azure App Service platform. As such, your function apps have access to most of the features of Azure's core web hosting platform. The Platform features tab is where you access the many features of the App Service platform that you can use in your function apps.
NOTE
Not all App Service features are available when a function app runs on the Consumption hosting plan.
The rest of this topic focuses on the following App Service features in the Azure portal that are useful for
Functions:
App Service editor
Console
Advanced tools (Kudu)
Deployment options
CORS
Authentication
API definition
For more information about how to work with App Service settings, see Configure Azure App Service Settings.
App Service Editor
The App Service editor is an advanced in-portal editor that you can use to modify JSON configuration files and code files alike. Choosing this option launches a separate browser tab with a basic editor. This enables you to integrate with the Git repository, run and debug code, and modify function app settings. This editor provides an enhanced development environment for your functions compared with the default function app blade.
Console
CORS
API definition
Functions supports Swagger to allow clients to more easily consume your HTTP-triggered functions. For more information on creating API definitions with Swagger, visit Host a RESTful API with CORS in Azure App Service. You can also use Functions Proxies to define a single API surface for multiple functions. For more information, see Working with Azure Functions Proxies.
Next steps
Configure Azure App Service Settings
Continuous deployment for Azure Functions
How to target Azure Functions runtime versions
2/4/2019 • 3 minutes to read • Edit Online
A function app runs on a specific version of the Azure Functions runtime. There are two major versions: 1.x and
2.x. By default, function apps are created on version 2.x of the runtime. This article explains how to configure a
function app in Azure to run on the version you choose. For information about how to configure a local
development environment for a specific version, see Code and test Azure Functions locally.
NOTE
You can't change the runtime version for a function app that has one or more functions. You should use the Azure portal to
change the runtime version.
4. To pin your function app to the version 1.x runtime, choose ~1 under Runtime version. This switch is
disabled when you have functions in your app.
5. When you change the runtime version, go back to the Overview tab and choose Restart to restart the app.
The function app restarts running on the version 1.x runtime, and the version 1.x templates are used when
you create functions.
From the Azure CLI
You can also view and set the FUNCTIONS_EXTENSION_VERSION from the Azure CLI.
NOTE
Because other settings may be impacted by the runtime version, you should change the version in the portal. The portal
automatically makes the other needed updates, such as Node.js version and runtime stack, when you change runtime
versions.
Using the Azure CLI, view the current runtime version with the az functionapp config appsettings list command.
az functionapp config appsettings list --name <function_app> \
--resource-group <my_resource_group>
In this code, replace <function_app> with the name of your function app. Also replace <my_resource_group> with
the name of the resource group for your function app.
You see the FUNCTIONS_EXTENSION_VERSION in the following output, which has been truncated for clarity:
[
{
"name": "FUNCTIONS_EXTENSION_VERSION",
"slotSetting": false,
"value": "~2"
},
{
"name": "FUNCTIONS_WORKER_RUNTIME",
"slotSetting": false,
"value": "dotnet"
},
...
{
"name": "WEBSITE_NODE_DEFAULT_VERSION",
"slotSetting": false,
"value": "8.11.1"
}
]
You can update the FUNCTIONS_EXTENSION_VERSION setting in the function app with the az functionapp config
appsettings set command.
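az functionapp config appsettings set --name <function_app> \
--resource-group <my_resource_group> \
--settings FUNCTIONS_EXTENSION_VERSION=<version>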
Replace <function_app> with the name of your function app. Also replace <my_resource_group> with the name of
the resource group for your function app. Also, replace <version> with a valid version of the 1.x runtime or ~2
for version 2.x.
You can run this command from the Azure Cloud Shell by choosing Try it in the preceding code sample. You can
also use the Azure CLI locally to execute this command after executing az login to sign in.
Next steps
Target the 2.0 runtime in your local development environment
See Release notes for runtime versions
Manually install or update Azure Functions binding
extensions from the portal
2/22/2019 • 2 minutes to read • Edit Online
The Azure Functions version 2.x runtime uses binding extensions to implement code for triggers and bindings.
Binding extensions are provided in NuGet packages. To register an extension, you essentially install a package.
When developing functions, the way that you install binding extensions depends on the development
environment. For more information, see Register binding extensions in the triggers and bindings article.
Sometimes you need to manually install or update your binding extensions in the Azure portal. For example, you
may need to update a registered binding to a newer version. You may also need to register a supported binding
that can't be installed in the Integrate tab in the portal.
8. Back in the Overview tab in the portal, choose Start to restart the function app.
Next steps
Learn more about Azure functions triggers and bindings
How to disable functions in Azure Functions
9/7/2018 • 2 minutes to read • Edit Online
This article explains how to disable a function in Azure Functions. To disable a function means to make the runtime
ignore the automatic trigger that is defined for the function. The way you do that depends on the runtime version
and the programming language:
Functions 1.x
Scripting languages
C# class libraries
Functions 2.x
One way for all languages
Optional way for C# class libraries
{
"bindings": [
{
"type": "queueTrigger",
"direction": "in",
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"MyStorageConnectionAppSetting"
}
],
"disabled": true
}
or
"bindings": [
...
],
"disabled": "IS_DISABLED"
In the second example, the function is disabled when there is an app setting that is named IS_DISABLED and is set
to true or 1.
You can edit the file in the Azure portal or use the Function State switch on the function's Manage tab. The
portal switch works by changing the function.json file.
Functions 1.x - C# class libraries
In a Functions 1.x class library, you use a Disable attribute to prevent a function from being triggered. You can use
the attribute without a constructor parameter, as shown in the following example:
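public static class QueueFunctions
{
    [Disable]
    [FunctionName("QueueTrigger")]
    public static void QueueTrigger(
        [QueueTrigger("myqueue-items")] string myQueueItem,
        TraceWriter log)
    {
        log.Info($"C# function processed: {myQueueItem}");
    }
}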
The attribute without a constructor parameter requires that you recompile and redeploy the project to change the
function's disabled state. A more flexible way to use the attribute is to include a constructor parameter that refers
to a Boolean app setting, as shown in the following example:
public static class QueueFunctions
{
[Disable("MY_TIMER_DISABLED")]
[FunctionName("QueueTrigger")]
public static void QueueTrigger(
[QueueTrigger("myqueue-items")] string myQueueItem,
TraceWriter log)
{
log.Info($"C# function processed: {myQueueItem}");
}
}
This method lets you enable and disable the function by changing the app setting, without recompiling or
redeploying. Changing an app setting causes the function app to restart, so the disabled state change is recognized
immediately.
IMPORTANT
The Disabled attribute is the only way to disable a class library function. The generated function.json file for a class library
function is not meant to be edited directly. If you edit that file, whatever you do to the disabled property will have no
effect.
The same goes for the Function state switch on the Manage tab, since it works by changing the function.json file.
Also, note that the portal may indicate the function is disabled when it isn't.
Next steps
This article is about disabling automatic triggers. For more information about triggers, see Triggers and bindings.
Monitor Azure Functions
4/29/2019 • 18 minutes to read • Edit Online
Azure Functions offers built-in integration with Azure Application Insights to monitor functions. This article
shows you how to configure Azure Functions to send system-generated log files to Application Insights.
We recommend using Application Insights because it collects log, performance, and error data. It
automatically detects performance anomalies and includes powerful analytics tools to help you diagnose
issues and to understand how your functions are used. It's designed to help you continuously improve
performance and usability. You can even use Application Insights during local function app project
development. For more information, see What is Application Insights?.
As the required Application Insights instrumentation is built into Azure Functions, all you need is a valid
instrumentation key to connect your function app to an Application Insights resource.
Name: Unique app name. It's easiest to use the same name as your function app, which must be unique in your subscription.
3. Choose OK. The Application Insights resource is created in the same resource group and subscription
as your function app. After creation completes, close the Application Insights window.
4. Back in your function app, select Application settings, and scroll down to Application settings.
When you see a setting named APPINSIGHTS_INSTRUMENTATIONKEY , it means that Application Insights
integration is enabled for your function app running in Azure.
Early versions of Functions used built-in monitoring, which is no longer recommended. When enabling
Application Insights integration for such a function app, you must also disable built-in logging.
View telemetry in Monitor tab
With Application Insights integration enabled, you can view telemetry data in the Monitor tab.
1. In the function app page, select a function that has run at least once after Application Insights was
configured. Then select the Monitor tab.
3. To see the logs for a particular function invocation, select the Date column link for that invocation.
The logging output for that invocation appears in a new page.
You can see that both pages have a Run in Application Insights link to the Application Insights Analytics
query that retrieves the data.
The following query is displayed. You can see that the invocation list is limited to the last 30 days. The list
shows no more than 20 rows ( where timestamp > ago(30d) | take 20 ). The invocation details list is for the last
30 days with no limit.
For more information, see Query telemetry data later in this article.
View telemetry in Application Insights
To open Application Insights from a function app in the Azure portal, go to the function app's Overview
page. Under Configured features, select Application Insights.
For information about how to use Application Insights, see the Application Insights documentation. This
section shows some examples of how to view data in Application Insights. If you're already familiar with
Application Insights, you can go directly to the sections about how to configure and customize the telemetry
data.
The following areas of Application Insights can be helpful when evaluating the behavior, performance, and
errors in your functions:
Metrics: Create charts and alerts that are based on metrics. Metrics include the number of function invocations, execution time, and success rates.
Live Metrics Stream: View metrics data as it's created in real time.
For example, the following query charts the number of requests per instance over the last 30 minutes:
requests
| where timestamp > ago(30m)
| summarize count() by cloud_RoleInstance, bin(timestamp, 1m)
| render timechart
The tables that are available are shown in the Schema tab on the left. You can find data generated by
function invocations in the following tables:
traces: Logs created by the runtime and by function code.
requests: One request for each function invocation.
exceptions: Any exceptions thrown by the runtime.
customMetrics: Counts of successful and failing runs, success rate, and duration.
customEvents: Events tracked by the runtime, for example: HTTP requests that trigger a function.
The other tables are for availability tests, and client and browser telemetry. You can implement custom
telemetry to add data to them.
Within each table, some of the Functions-specific data is in a customDimensions field. For example, the
following query retrieves all traces that have log level Error .
traces
| where customDimensions.LogLevel == "Error"
The runtime provides the customDimensions.LogLevel and customDimensions.Category fields. You can provide
additional fields in logs that you write in your function code. See Structured logging later in this article.
Each log level has a numeric code:
Trace: 0
Debug: 1
Information: 2
Warning: 3
Error: 4
Critical: 5
None: 6
You can configure the log level threshold for each category in your host.json file.
Version 2.x
{
"logging": {
"fileLoggingMode": "always",
"logLevel": {
"default": "Information",
"Host.Results": "Error",
"Function": "Error",
"Host.Aggregator": "Trace"
}
}
}
Version 1.x
{
"logger": {
"categoryFilter": {
"defaultLevel": "Information",
"categoryLevels": {
"Host.Results": "Error",
"Function": "Error",
"Host.Aggregator": "Trace"
}
}
}
}
Version 2.x
{
"logging": {
"fileLoggingMode": "always",
"logLevel": {
"default": "Information",
"Host": "Error",
"Function": "Error",
"Host.Aggregator": "Information"
}
}
}
Version 1.x
{
"logger": {
"categoryFilter": {
"defaultLevel": "Information",
"categoryLevels": {
"Host": "Error",
"Function": "Error",
"Host.Aggregator": "Information"
}
}
}
}
To suppress all logs for a category, you can use log level None . No logs are written with that category and
there's no log level above it.
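For example, a version 2.x host.json sketch that suppresses the aggregator logs:
{
  "logging": {
    "logLevel": {
      "Host.Aggregator": "None"
    }
  }
}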
The following sections describe the main categories of logs that the runtime creates.
Category Host.Results
These logs show as "requests" in Application Insights. They indicate success or failure of a function.
All of these logs are written at Information level. If you filter at Warning or above, you won't see any of this
data.
Category Host.Aggregator
These logs provide counts and averages of function invocations over a configurable period of time. The
default period is 30 seconds or 1,000 results, whichever comes first.
The logs are available in the customMetrics table in Application Insights. Examples are the number of runs,
success rate, and duration.
All of these logs are written at Information level. If you filter at Warning or above, you won't see any of this
data.
Other categories
All logs for categories other than the ones already listed are available in the traces table in Application
Insights.
All logs with categories that begin with Host are written by the Functions runtime. The "Function started"
and "Function completed" logs have category Host.Executor . For successful runs, these logs are
Information level. Exceptions are logged at Error level. The runtime also creates Warning level logs, for
example: queue messages sent to the poison queue.
Logs written by your function code have category Function and can be any log level.
You can configure the aggregation period and batch size in the aggregator section of host.json:
{
"aggregator": {
"batchSize": 1000,
"flushTimeout": "00:00:30"
}
}
Configure sampling
Application Insights has a sampling feature that can protect you from producing too much telemetry data on
completed executions at times of peak load. When the rate of incoming executions exceeds a specified
threshold, Application Insights starts to randomly ignore some of the incoming executions. The default
setting for maximum number of executions per second is 20 (five in version 1.x). You can configure sampling
in host.json. Here's an example:
Version 2.x
{
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"maxTelemetryItemsPerSecond" : 20
}
}
}
}
Version 1.x
{
"applicationInsights": {
"sampling": {
"isEnabled": true,
"maxTelemetryItemsPerSecond" : 5
}
}
}
NOTE
Sampling is enabled by default. If you appear to be missing data, you might need to adjust the sampling settings to fit
your particular monitoring scenario.
With an ILogger object, you call Log<level> extension methods on ILogger to create logs. The following
code writes Information logs with category "Function."
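A minimal sketch (the queue trigger shown is illustrative):
public static void Run([QueueTrigger("myqueue-items")] string myQueueItem, ILogger log)
{
    // Logs written from function code through ILogger get the category "Function".
    log.LogInformation("C# Queue trigger function processed: {message}", myQueueItem);
}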
Structured logging
The order of placeholders, not their names, determines which parameters are used in the log message.
Suppose you have the following code:
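string partitionKey = "partitionKey";
string rowKey = "rowKey";
log.LogInformation("partitionKey={partitionKey}, rowKey={rowKey}", partitionKey, rowKey);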
If you keep the same message string and reverse the order of the parameters, the resulting message text
would have the values in the wrong places.
Placeholders are handled this way so that you can do structured logging. Application Insights stores the
parameter name-value pairs and the message string. The result is that the message arguments become fields
that you can query on.
If your logger method call looks like the previous example, you can query the field
customDimensions.prop__rowKey . The prop__ prefix is added to ensure there are no collisions between fields
the runtime adds and fields your function code adds.
You can also query on the original message string by referencing the field
customDimensions.prop__{OriginalFormat} .
{
customDimensions: {
"prop__{OriginalFormat}":"C# Queue trigger function processed: {message}",
"Category":"Function",
"LogLevel":"Information",
"prop__message":"c9519cbf-b1e6-4b9b-bf24-cb7d10b1bb89"
}
}
logger.LogMetric("TestMetric", 1234);
This code is an alternative to calling TrackMetric by using the Application Insights API for .NET.
context.log.metric("TestMetric", 1234);
This code is an alternative to calling trackMetric by using the Node.js SDK for Application Insights.
namespace functionapp0915
{
public class HttpTrigger2
{
private readonly TelemetryClient telemetryClient;
/// Using dependency injection will guarantee that you use the same configuration for telemetry collected automatically and manually.
public HttpTrigger2(TelemetryConfiguration telemetryConfiguration)
{
this.telemetryClient = new TelemetryClient(telemetryConfiguration);
}
[FunctionName("HttpTrigger2")]
public Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = null)]
HttpRequest req, ExecutionContext context, ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
DateTime start = DateTime.UtcNow;
// Assumption: 'name' (used below to tag telemetry) is read from the query string; this line is reconstructed.
string name = req.Query["name"];
// Track an Event
var evt = new EventTelemetry("Function called");
evt.Context.User.Id = name;
this.telemetryClient.TrackEvent(evt);
// Track a Metric
var metric = new MetricTelemetry("Test Metric", DateTime.Now.Millisecond);
metric.Context.User.Id = name;
this.telemetryClient.TrackMetric(metric);
// Track a Dependency
var dependency = new DependencyTelemetry
{
Name = "GET api/planets/1/",
Target = "swapi.co",
Data = "https://swapi.co/api/planets/1/",
Timestamp = start,
Duration = DateTime.UtcNow - start,
Success = true
};
dependency.Context.User.Id = name;
this.telemetryClient.TrackDependency(dependency);
Version 1.x
using System;
using System.Net;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Azure.WebJobs;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using System.Linq;
namespace functionapp0915
{
public static class HttpTrigger2
{
private static string key = TelemetryConfiguration.Active.InstrumentationKey =
System.Environment.GetEnvironmentVariable(
"APPINSIGHTS_INSTRUMENTATIONKEY", EnvironmentVariableTarget.Process);
[FunctionName("HttpTrigger2")]
public static async Task<HttpResponseMessage> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
HttpRequestMessage req, ExecutionContext context, ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
DateTime start = DateTime.UtcNow;
// Assumption: 'name' (used below to tag telemetry) is read from the query string; this line is reconstructed.
string name = req.GetQueryNameValuePairs().FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0).Value;
// Track an Event
var evt = new EventTelemetry("Function called");
UpdateTelemetryContext(evt.Context, context, name);
telemetryClient.TrackEvent(evt);
// Track a Metric
var metric = new MetricTelemetry("Test Metric", DateTime.Now.Millisecond);
UpdateTelemetryContext(metric.Context, context, name);
telemetryClient.TrackMetric(metric);
// Track a Dependency
var dependency = new DependencyTelemetry
{
Name = "GET api/planets/1/",
Target = "swapi.co",
Data = "https://swapi.co/api/planets/1/",
Timestamp = start,
Duration = DateTime.UtcNow - start,
Success = true
};
UpdateTelemetryContext(dependency.Context, context, name);
telemetryClient.TrackDependency(dependency);
}
Don't call TrackRequest or StartOperation<RequestTelemetry> because you'll see duplicate requests for a
function invocation. The Functions runtime automatically tracks requests.
Don't set telemetryClient.Context.Operation.Id . This global setting causes incorrect correlation when many
functions are running simultaneously. Instead, create a new telemetry instance ( DependencyTelemetry ,
EventTelemetry ) and modify its Context property. Then pass in the telemetry instance to the corresponding
Track method on TelemetryClient ( TrackDependency() , TrackEvent() ). This method ensures that the
telemetry has the correct correlation details for the current function invocation.
In JavaScript, you can pass a tagOverrides parameter that sets the operation_Id to the function's invocation ID. This setting enables you to correlate all of the automatically generated and custom telemetry for a given function invocation.
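A minimal sketch (assuming the applicationinsights npm package is installed and the instrumentation key is available to the SDK through app settings):
const appInsights = require("applicationinsights");
appInsights.setup();
const client = appInsights.defaultClient;

module.exports = function (context, req) {
    // tagOverrides ties this custom event to the current invocation.
    client.trackEvent({
        name: "my custom event",
        tagOverrides: { "ai.operation.id": context.invocationId },
        properties: { customProperty: "custom property value" }
    });
    context.done();
};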
Dependencies
Functions v2 automatically collects dependencies for HTTP requests, ServiceBus, and SQL.
You can write custom code to show the dependencies. For examples, see the sample code in the C# custom
telemetry section. The sample code results in an application map in Application Insights that looks like the
following image:
Report issues
To report an issue with Application Insights integration in Functions, or to make a suggestion or request,
create an issue in GitHub.
Streaming Logs
While developing an application, it is often useful to see logging information in near-real time. You can view a
stream of log files being generated by your functions either in the Azure portal or in a command-line session
on your local computer.
This is equivalent to the output seen when you debug your functions during local development. For more
information, see How to stream logs.
NOTE
Streaming logs support only a single instance of the Functions host. When your function is scaled to multiple instances, data from other instances are not shown in the log stream. The Live Metrics Stream in Application Insights does support multiple instances. While also in near real time, streaming analytics are based on sampled data.
Portal
To view streaming logs in the portal, select the Platform features tab in your function app. Then, under
Monitoring, choose Log streaming.
This connects your app to the log streaming service and application logs are displayed in the window. You
can toggle between Application logs and Web server logs.
Azure CLI
You can enable streaming logs by using the Azure CLI. Use the following commands to sign in, choose your
subscription, and stream log files:
az login
az account list
az account set --subscription <subscriptionNameOrId>
az webapp log tail --resource-group <RESOURCE_GROUP_NAME> --name <FUNCTION_APP_NAME>
Azure PowerShell
You can enable streaming logs by using Azure PowerShell. For PowerShell, use the following commands to
add your Azure account, choose your subscription, and stream log files:
Add-AzAccount
Get-AzSubscription
Get-AzSubscription -SubscriptionName "<subscription name>" | Select-AzSubscription
Get-AzWebSiteLog -Name <FUNCTION_APP_NAME> -Tail
Disable built-in logging
When you enable Application Insights, disable the built-in logging that uses Azure Storage. The built-in
logging is useful for testing with light workloads, but isn't intended for high-load production use. For
production monitoring, we recommend Application Insights. If built-in logging is used in production, the
logging record might be incomplete because of throttling on Azure Storage.
To disable built-in logging, delete the AzureWebJobsDashboard app setting. For information about how to
delete app settings in the Azure portal, see the Application settings section of How to manage a function
app. Before you delete the app setting, make sure no existing functions in the same function app use the
setting for Azure Storage triggers or bindings.
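For example, with the Azure CLI:
az functionapp config appsettings delete --name <FUNCTION_APP_NAME> \
--resource-group <RESOURCE_GROUP_NAME> --setting-names AzureWebJobsDashboard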
Next steps
For more information, see the following resources:
Application Insights
ASP.NET Core logging
Buy and configure an SSL certificate for Azure App
Service
5/20/2019 • 8 minutes to read • Edit Online
This tutorial shows you how to secure your App Service app or function app by creating (purchasing) an App Service certificate in Azure Key Vault and then binding it to an App Service app.
TIP
App Service Certificates can be used for any Azure or non-Azure service and are not limited to App Service. To do so, you need to create a local PFX copy of an App Service certificate that you can then use anywhere you want. For more information, see Creating a local PFX copy of an App Service Certificate.
Prerequisites
To follow this how-to guide:
Create an App Service app
Map a domain name to your app or buy and configure it in Azure
Custom SSL is not supported in the F1 or D1 tier. If you need to scale up, follow the steps in the next section.
Otherwise, close the Scale up page and skip the Scale up your App Service plan section.
Scale up your App Service plan
Select any of the non-free tiers (B1, B2, B3, or any tier in the Production category). For additional options, click
See additional options.
Click Apply.
When you see the following notification, the scale operation is complete.
Naked Domain Host Name: If you specify the root domain here, you get a certificate that secures both the root domain and the www subdomain. To secure any subdomain only, specify the fully qualified domain name of the subdomain here (for example, mysubdomain.contoso.com).
Resource group: The resource group that contains the certificate. You can use a new resource group or select the same resource group as your App Service app, for example.
Legal Terms: Click to confirm that you agree with the legal terms. The certificates are obtained from GoDaddy.
Pricing tier: For information, see Azure Key Vault pricing details.
Access policies: Defines the applications and the allowed access to the vault resources. You can configure it later, following the steps at Grant several applications access to a key vault.
Virtual Network Access: Restrict vault access to certain Azure virtual networks. You can configure it later, following the steps at Configure Azure Key Vault Firewalls and Virtual Networks.
Once you've selected the vault, close the Key Vault Repository page. The Store option should show a green check
mark for success. Keep the page open for the next step.
NOTE
Four types of domain verification methods are supported:
App Service - The most convenient option when the domain is already mapped to an App Service app in the same
subscription. It takes advantage of the fact that the App Service app has already verified the domain ownership.
Domain - Verify an App Service domain that you purchased from Azure. Azure automatically adds the verification TXT
record for you and completes the process.
Mail - Verify the domain by sending an email to the domain administrator. Instructions are provided when you select the
option.
Manual - Verify the domain using either an HTML page (Standard certificate only) or a DNS TXT record. Instructions are
provided when you select the option.
Use the following table to help you configure the binding in the SSL Bindings dialog, then click Add Binding.
Custom domain: The domain name to add the SSL binding for.
Private Certificate Thumbprint: The certificate to bind.
SSL Type: SNI SSL or IP Based SSL (only one IP SSL binding is allowed per app).
Rekey certificate
If you think your certificate's private key is compromised, you can rekey your certificate. Select the certificate in the
App Service Certificates page, then select Rekey and Sync from the left navigation.
Click Rekey to start the process. This process can take 1-10 minutes to complete.
Rekeying your certificate rolls the certificate with a new certificate issued from the certificate authority.
Once the rekey operation is complete, click Sync. The sync operation automatically updates the hostname bindings
for the certificate in App Service without causing any downtime to your apps.
NOTE
If you don't click Sync, App Service automatically syncs your certificate within 48 hours.
Renew certificate
To turn on automatic renewal of your certificate at any time, select the certificate in the App Service Certificates
page, then click Auto Renew Settings in the left navigation.
Select On and click Save. Certificates can start automatically renewing 60 days before expiration if you have
automatic renewal turned on.
To manually renew the certificate instead, click Manual Renew. You can request to manually renew your certificate
60 days before expiration.
Once the renew operation is complete, click Sync. The sync operation automatically updates the hostname
bindings for the certificate in App Service without causing any downtime to your apps.
NOTE
If you don't click Sync, App Service automatically syncs your certificate within 48 hours.
fqdn=<replace-with-www.{yourdomain}>
pfxPath=<replace-with-path-to-your-.PFX-file>
pfxPassword=<replace-with-your-.PFX-password>
resourceGroup=myResourceGroup
webappname=mywebapp$RANDOM
# Create an App Service plan in Basic tier (minimum required by custom domains).
az appservice plan create --name $webappname --resource-group $resourceGroup --sku B1
# Before continuing, go to your DNS configuration UI for your custom domain and follow the
# instructions at https://aka.ms/appservicecustomdns to configure a CNAME record for the
# hostname "www" and point it your web app's default domain name.
PowerShell
$fqdn="<Replace with your custom domain name>"
$pfxPath="<Replace with path to your .PFX file>"
$pfxPassword="<Replace with your .PFX password>"
$webappname="mywebapp$(Get-Random)"
$location="West Europe"
# Before continuing, go to your DNS configuration UI for your custom domain and follow the
# instructions at https://aka.ms/appservicecustomdns to configure a CNAME record for the
# hostname "www" and point it to your web app's default domain name.
# Upgrade App Service plan to Basic tier (minimum required by custom SSL certificates)
Set-AzAppServicePlan -Name $webappname -ResourceGroupName $webappname `
-Tier Basic
More resources
Enforce HTTPS
Enforce TLS 1.1/1.2
Use an SSL certificate in your application code in Azure App Service
FAQ : App Service Certificates
Configure your App Service app to use Azure Active
Directory sign-in
3/25/2019 • 5 minutes to read • Edit Online
NOTE
At this time, AAD V2 (including MSAL) is not supported for Azure App Service and Azure Functions. Please check back for
updates.
This article shows you how to configure Azure App Services to use Azure Active Directory as an authentication
provider.
NOTE
You can use the same app registration for multiple domains by adding additional Reply URLs. Make sure to model
each App Service instance with its own registration, so it has its own permissions and consent. Also consider using
separate app registrations for separate site slots. This is to avoid permissions being shared between environments,
so that a bug in new code you are testing does not affect production.
8. At this point, copy the Application ID for the app. Keep it for later use. You will need it to configure your
App Service app.
9. Close the Registered app page. On the App registrations page, click on the Endpoints button at the top,
then copy the WS-FEDERATION SIGN-ON ENDPOINT URL but remove the /wsfed ending from the
URL. The end result should look like
https://login.microsoftonline.com/00000000-0000-0000-0000-000000000000. The domain name may be
different for a sovereign cloud. This will serve as the Issuer URL for later.
Add Azure Active Directory information to your App Service app
1. Back in the Azure portal, navigate to your App Service app. Click Authentication/Authorization. If the
Authentication/Authorization feature is not enabled, turn the switch to On. Click on Azure Active
Directory, under Authentication Providers, to configure your app.
(Optional) By default, App Service provides authentication but does not restrict authorized access to your
site content and APIs. You must authorize users in your app code. Set Action to take when request is not
authenticated to Log in with Azure Active Directory. This option requires that all requests be
authenticated, and all unauthenticated requests are redirected to Azure Active Directory for authentication.
2. In the Active Directory Authentication configuration, click Advanced under Management Mode. Paste
the Application ID into the Client ID box (from step 8) and paste in the URL (from step 9) into the Issuer
URL value. Then click OK.
3. On the Active Directory Authentication configuration page, click Save.
You are now ready to use Azure Active Directory for authentication in your App Service app.
Related Content
App Service Authentication / Authorization overview
Add authentication to your Mobile App: iOS, Android, Windows Universal, Xamarin.Android, Xamarin.iOS,
Xamarin.Forms, Cordova
Learn how to add App Service authentication to your mobile app.
How to configure your App Service application to
use Facebook login
4/11/2019 • 2 minutes to read • Edit Online
This topic shows you how to configure Azure App Service to use Facebook as an authentication provider.
To complete the procedure in this topic, you must have a Facebook account that has a verified email address and a
mobile phone number. To create a new Facebook account, go to facebook.com.
NOTE
Your redirect URI is the URL of your application appended with the path, /.auth/login/facebook/callback. For
example, https://contoso.azurewebsites.net/.auth/login/facebook/callback . Make sure that you are using
the HTTPS scheme.
8. In the left-hand navigation, click Settings > Basic. On the App Secret field, click Show, provide your
password if requested, then make a note of the values of App ID and App Secret. You use these later to
configure your application in Azure.
IMPORTANT
The app secret is an important security credential. Do not share this secret with anyone or distribute it within a client
application.
9. The Facebook account which was used to register the application is an administrator of the app. At this
point, only administrators can sign into this application. To authenticate other Facebook accounts, click App
Review and enable Make <your-app-name> public to enable general public access using Facebook
authentication.
Add Facebook information to your application
1. Back in the Azure portal, navigate to your application. Click Settings > Authentication / Authorization,
and make sure that App Service Authentication is On.
2. Click Facebook, paste in the App ID and App Secret values which you obtained previously, optionally
enable any scopes needed by your application, then click OK.
By default, App Service provides authentication but does not restrict authorized access to your site content
and APIs. You must authorize users in your app code.
3. (Optional) To restrict access to your site to only users authenticated by Facebook, set Action to take when
request is not authenticated to Facebook. This requires that all requests be authenticated, and all
unauthenticated requests are redirected to Facebook for authentication.
4. When done configuring authentication, click Save.
You are now ready to use Facebook for authentication in your app.
Related Content
App Service Authentication / Authorization overview
Add authentication to your Mobile App: iOS, Android, Windows Universal, Xamarin.Android, Xamarin.iOS,
Xamarin.Forms, Cordova
Learn how to add App Service authentication to your mobile app.
How to configure your App Service application to
use Google login
12/14/2018 • 2 minutes to read • Edit Online
This topic shows you how to configure Azure App Service to use Google as an authentication provider.
To complete the procedure in this topic, you must have a Google account that has a verified email address. To
create a new Google account, go to accounts.google.com.
IMPORTANT
The client secret is an important security credential. Do not share this secret with anyone or distribute it within a
client application.
Related Content
App Service Authentication / Authorization overview
Add authentication to your Mobile App: iOS, Android, Windows Universal, Xamarin.Android, Xamarin.iOS,
Xamarin.Forms, Cordova
Learn how to add App Service authentication to your mobile app.
How to configure your App Service application to
use Microsoft Account login
12/14/2018 • 2 minutes to read • Edit Online
This topic shows you how to configure Azure App Service to use Microsoft Account as an authentication provider.
NOTE
Your redirect URI is the URL of your application appended with the path, /.auth/login/microsoftaccount/callback.
For example, https://contoso.azurewebsites.net/.auth/login/microsoftaccount/callback .
Make sure that you are using the HTTPS scheme.
7. Under "Application Secrets," click Generate New Password. Make note of the value that appears. Once
you leave the page, it will not be displayed again.
IMPORTANT
The password is an important security credential. Do not share the password with anyone or distribute it within a
client application.
8. Click Save.
Related content
App Service Authentication / Authorization overview
Add authentication to your Mobile App: iOS, Android, Windows Universal, Xamarin.Android, Xamarin.iOS,
Xamarin.Forms, Cordova
Learn how to add App Service authentication to your mobile app.
How to configure your App Service application to
use Twitter login
12/14/2018 • 2 minutes to read • Edit Online
This topic shows you how to configure Azure App Service to use Twitter as an authentication provider.
To complete the procedure in this topic, you must have a Twitter account that has a verified email address and
phone number. To create a new Twitter account, go to twitter.com.
NOTE
The consumer secret is an important security credential. Do not share this secret with anyone or distribute it with
your app.
Related Content
App Service Authentication / Authorization overview
Add authentication to your Mobile App: iOS, Android, Windows Universal, Xamarin.Android, Xamarin.iOS,
Xamarin.Forms, Cordova
Learn how to add App Service authentication to your mobile app.
Advanced usage of authentication and authorization
in Azure App Service
3/12/2019 • 8 minutes to read • Edit Online
This article shows you how to customize the built-in authentication and authorization in App Service, and to
manage identity from your application.
To get started quickly, see one of the following tutorials:
Tutorial: Authenticate and authorize users end-to-end in Azure App Service (Windows)
Tutorial: Authenticate and authorize users end-to-end in Azure App Service for Linux
How to configure your app to use Azure Active Directory login
How to configure your app to use Facebook login
How to configure your app to use Google login
How to configure your app to use Microsoft Account login
How to configure your app to use Twitter login
When the user clicks on one of the links, the respective sign-in page opens to sign in the user.
To redirect the user post-sign-in to a custom URL, use the post_login_redirect_url query string parameter (not to
be confused with the Redirect URI in your identity provider configuration). For example, to navigate the user to
/Home/Index after sign-in, use the following HTML code:
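For example, assuming the Azure Active Directory provider (the aad path segment changes per provider):
<a href="/.auth/login/aad?post_login_redirect_url=/Home/Index">Log in</a>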
{"id_token":"<token>","access_token":"<token>"}
The token format varies slightly according to the provider. See the following table for details:
aad {"access_token":"<access_token>"}
twitter {"access_token":"<access_token>",
"access_token_secret":"
<acces_token_secret>"}
If the provider token is validated successfully, the API returns with an authenticationToken in the response body,
which is your session token.
{
"authenticationToken": "...",
"user": {
"userId": "sid:..."
}
}
Once you have this session token, you can access protected app resources by adding the X-ZUMO-AUTH header to
your HTTP requests. For example:
GET https://<appname>.azurewebsites.net/api/products/1
X-ZUMO-AUTH: <authenticationToken_value>
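A minimal C# sketch of such a request (the app name and token value are placeholders):
using System;
using System.Net.Http;
// ...
var client = new HttpClient();
// Attach the App Service session token to every request to the protected API.
client.DefaultRequestHeaders.Add("X-ZUMO-AUTH", "<authenticationToken_value>");
var response = await client.GetAsync("https://<appname>.azurewebsites.net/api/products/1");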
By default, a successful sign-out redirects the client to the URL /.auth/logout/done . You can change the post-sign-
out redirect page by adding the post_logout_redirect_uri query parameter. For example:
GET /.auth/logout?post_logout_redirect_uri=/index.html
GET /.auth/logout?post_logout_redirect_uri=https%3A%2F%2Fmyexternalurl.com
Google: X-MS-TOKEN-GOOGLE-ID-TOKEN, X-MS-TOKEN-GOOGLE-ACCESS-TOKEN, X-MS-TOKEN-GOOGLE-EXPIRES-ON, X-MS-TOKEN-GOOGLE-REFRESH-TOKEN
Twitter: X-MS-TOKEN-TWITTER-ACCESS-TOKEN, X-MS-TOKEN-TWITTER-ACCESS-TOKEN-SECRET
From your client code (such as a mobile app or in-browser JavaScript), send an HTTP GET request to /.auth/me .
The returned JSON has the provider-specific tokens.
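A brief C# sketch of the same request, assuming the caller holds an X-ZUMO-AUTH session token (browser clients can rely on the auth cookie instead):
using System;
using System.Net.Http;
// ...
var client = new HttpClient();
// Non-browser clients authenticate to /.auth/me with the session token.
client.DefaultRequestHeaders.Add("X-ZUMO-AUTH", "<authenticationToken_value>");
// The response is a JSON array containing provider tokens and user claims.
string json = await client.GetStringAsync("https://<appname>.azurewebsites.net/.auth/me");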
NOTE
Access tokens are for accessing provider resources, so they are present only if you configure your provider with a client
secret. To see how to get refresh tokens, see Refresh access tokens.
5. Click Put.
Once your provider is configured, you can find the refresh token and the expiration time for the access token in the
token store.
To refresh your access token at any time, call /.auth/refresh from any language. The following snippet uses
jQuery to refresh access tokens from a JavaScript client.
function refreshTokens() {
  let refreshUrl = "/.auth/refresh";
  $.ajax(refreshUrl)
    .done(function() {
      console.log("Token refresh completed successfully.");
    })
    .fail(function() {
      console.log("Token refresh failed. See application logs for details.");
    });
}
If a user revokes the permissions granted to your app, your call to /.auth/me may fail with a 403 Forbidden
response. To diagnose errors, check your application logs for details.
NOTE
The grace period only applies to the App Service authenticated session, not the tokens from the identity providers. There is
no grace period for the expired provider tokens.
"additionalLoginParams": ["domain_hint=<domain_name>"]
Next steps
Tutorial: Authenticate and authorize users end-to-end (Windows)
Tutorial: Authenticate and authorize users end-to-end (Linux)
Azure App Service Access Restrictions
5/10/2019 • 4 minutes to read • Edit Online
Access Restrictions enable you to define a priority-ordered allow/deny list that controls network access to your
app. The list can include IP addresses or Azure Virtual Network subnets. When the list contains one or more
entries, an implicit "deny all" applies at the end of the list.
The Access Restrictions capability works with all App Service-hosted workloads, including web apps, API apps,
Linux apps, Linux container apps, and Functions.
When a request is made to your app, the FROM address is evaluated against the IP address rules in your access
restrictions list. If the FROM address is in a subnet that is configured with service endpoints to Microsoft.Web,
then the source subnet is compared against the virtual network rules in your access restrictions list. If the address
is not allowed access based on the rules in the list, the service replies with an HTTP 403 status code.
The access restrictions capability is implemented in the App Service front-end roles, which are upstream of the
worker hosts where your code runs. Therefore, access restrictions are effectively network ACLs.
The ability to restrict access to your web app from an Azure Virtual Network (VNet) is called service endpoints.
Service endpoints enable you to restrict access to a multi-tenant service from selected subnets. The feature must be
enabled both on the subnet (the networking side) and for the service it secures. Service endpoints cannot restrict
traffic to apps that are hosted in an App Service Environment. If you are in an App Service Environment, you can
control access to your app with IP address rules.
The list will show all of the current restrictions that are on your app. If you have a VNet restriction on your app, the
table will show if service endpoints are enabled for Microsoft.Web. When there are no defined restrictions on your
app, your app will be accessible from anywhere.
You can click on [+] Add to add a new access restriction rule. Once you add a rule, it will become effective
immediately. Rules are enforced in priority order starting from the lowest number and going up. There is an
implicit deny all that is in effect once you add even a single rule.
When creating a rule, you must select allow/deny and the type of rule. You must also provide the
priority value and what you are restricting access to. You can optionally add a name and description to the rule.
To set an IP address-based rule, select a type of IPv4 or IPv6. IP addresses must be specified in CIDR
notation for both IPv4 and IPv6. To specify an exact address, you can use something like 1.2.3.4/32,
where the first four octets represent your IP address and /32 is the mask. The IPv4 CIDR notation for all
addresses is 0.0.0.0/0. To learn more about CIDR notation, you can read Classless Inter-Domain Routing.
To restrict access to selected subnets, select a type of Virtual Network. Below that, you can pick the
subscription, VNet, and subnet that you wish to allow or deny access. If service endpoints are not already enabled
with Microsoft.Web for the subnet that you selected, they will automatically be enabled for you unless you check
the box asking not to do that. Whether you would want to enable service endpoints on the app but not the subnet
largely depends on whether you have the permissions to enable them on the subnet. If you need someone else to
enable service endpoints on the subnet, you can check the box and have your app configured for service
endpoints in anticipation of them being enabled later on the subnet.
Service endpoints cannot be used to restrict access to apps that run in an App Service Environment. When your
app is in an App Service Environment, you can control access to your app with IP access rules.
You can click on any row to edit an existing access restriction rule. Edits are effective immediately including
changes in priority ordering.
When you edit a rule, you cannot change the type between an IP address rule and a Virtual Network rule.
To delete a rule, click the ... on your rule and then click remove.
SCM site
In addition to being able to control access to your app, you can also restrict access to the scm site used by your
app. The scm site is the web deploy endpoint and also the Kudu console. You can separately assign access
restrictions to the scm site from the app or use the same set for both the app and the scm site. When you check
the box to have the same restrictions as your app, everything is blanked out. If you uncheck the box, whatever
settings you had earlier on the scm site are applied.
Programmatic manipulation of access restriction rules
There currently is no CLI or PowerShell for the new Access Restrictions capability but the values can be set
manually with a PUT operation on the app configuration in Resource Manager. As an example, you can use
resources.azure.com and edit the ipSecurityRestrictions block to add the required JSON.
The location for this information in Resource Manager is:
management.azure.com/subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.Web/sites/{app-name}/config/web?api-version=2018-02-01
The JSON syntax for the earlier example is:
"ipSecurityRestrictions": [
{
"ipAddress": "131.107.159.0/24",
"action": "Allow",
"tag": "Default",
"priority": 100,
"name": "allowed access"
}
],
NOTE
Managed identity support for App Service on Linux and Web App for Containers is currently in preview.
IMPORTANT
Managed identities for App Service and Azure Functions will not behave as expected if your app is migrated across
subscriptions/tenants. The app will need to obtain a new identity, which can be done by disabling and re-enabling the feature.
See Removing an identity below. Downstream resources will also need to have access policies updated to use the new identity.
This topic shows you how to create a managed identity for App Service and Azure Functions applications and how
to use it to access other resources. A managed identity from Azure Active Directory allows your app to easily access
other AAD-protected resources such as Azure Key Vault. The identity is managed by the Azure platform and does
not require you to provision or rotate any secrets. For more about managed identities in AAD, see Managed
identities for Azure resources.
Your application can be granted two types of identities:
A system-assigned identity is tied to your application and is deleted if your app is deleted. An app can only
have one system-assigned identity. System-assigned identity support is generally available for Windows apps.
A user-assigned identity is a standalone Azure resource which can be assigned to your app. An app can have
multiple user-assigned identities. User-assigned identity support is in preview for all app types.
az login
2. Create a web application using the CLI. For more examples of how to use the CLI with App Service, see App
Service CLI samples:
3. Run the identity assign command to create the identity for this application:
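For example (the app and resource group names are placeholders):
az webapp identity assign --name <app_name> --resource-group <group_name>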
NOTE
This article has been updated to use the new Azure PowerShell Az module. You can still use the AzureRM module, which will
continue to receive bug fixes until at least December 2020. To learn more about the new Az module and AzureRM
compatibility, see Introducing the new Azure PowerShell Az module. For Az module installation instructions, see Install Azure
PowerShell.
The following steps will walk you through creating a web app and assigning it an identity using Azure PowerShell:
1. If needed, install the Azure PowerShell using the instruction found in the Azure PowerShell guide, and then
run Login-AzAccount to create a connection with Azure.
2. Create a web application using Azure PowerShell. For more examples of how to use Azure PowerShell with
App Service, see App Service PowerShell samples:
3. Run the Set-AzWebApp -AssignIdentity command to create the identity for this application:
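For example (the app and resource group names are placeholders):
Set-AzWebApp -AssignIdentity $true -Name <app_name> -ResourceGroupName <group_name>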
"identity": {
"type": "SystemAssigned"
}
NOTE
An application can have both system-assigned and user-assigned identities at the same time. In this case, the type
property would be SystemAssigned,UserAssigned
Adding the system-assigned type tells Azure to create and manage the identity for your application.
For example, a web app might look like the following:
{
"apiVersion": "2016-08-01",
"type": "Microsoft.Web/sites",
"name": "[variables('appName')]",
"location": "[resourceGroup().location]",
"identity": {
"type": "SystemAssigned"
},
"properties": {
"name": "[variables('appName')]",
"serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"hostingEnvironment": "",
"clientAffinityEnabled": false,
"alwaysOn": true
},
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]"
]
}
"identity": {
"type": "SystemAssigned",
"tenantId": "<TENANTID>",
"principalId": "<PRINCIPALID>"
}
Where <TENANTID> and <PRINCIPALID> are replaced with GUIDs. The tenantId property identifies what AAD tenant
the identity belongs to. The principalId is a unique identifier for the application's new identity. Within AAD, the
service principal has the same name that you gave to your App Service or Azure Functions instance.
Creating an app with a user-assigned identity requires that you create the identity and then add its resource
identifier to your app config.
Using the Azure portal
NOTE
This portal experience is being deployed and may not yet be available in all regions.
"identity": {
"type": "UserAssigned",
"userAssignedIdentities": {
"<RESOURCEID>": {}
}
}
NOTE
An application can have both system-assigned and user-assigned identities at the same time. In this case, the type
property would be SystemAssigned,UserAssigned
Adding the user-assigned type and a reference to the identity resource tells Azure to use the specified user-assigned identity for your application.
For example, a web app might look like the following:
{
"apiVersion": "2016-08-01",
"type": "Microsoft.Web/sites",
"name": "[variables('appName')]",
"location": "[resourceGroup().location]",
"identity": {
"type": "UserAssigned",
"userAssignedIdentities": {
"[resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', variables('identityName'))]": {}
}
},
"properties": {
"name": "[variables('appName')]",
"serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"hostingEnvironment": "",
"clientAffinityEnabled": false,
"alwaysOn": true
},
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"[resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', variables('identityName'))]"
]
}
"identity": {
"type": "UserAssigned",
"userAssignedIdentities": {
"<RESOURCEID>": {
"principalId": "<PRINCIPALID>",
"clientId": "<CLIENTID>"
}
}
}
Where <PRINCIPALID> and <CLIENTID> are replaced with GUIDs. The principalId is a unique identifier for the
identity which is used for AAD administration. The clientId is a unique identifier for the application's new identity
that is used for specifying which identity to use during runtime calls.
IMPORTANT
You may need to configure the target resource to allow access from your application. For example, if you request a token to
Key Vault, you need to make sure you have added an access policy that includes your application's identity. Otherwise, your
calls to Key Vault will be rejected, even if they include the token. To learn more about which resources support Azure Active
Directory tokens, see Azure services that support Azure AD authentication.
There is a simple REST protocol for obtaining a token in App Service and Azure Functions. For .NET applications,
the Microsoft.Azure.Services.AppAuthentication library provides an abstraction over this protocol and supports a
local development experience.
Using the Microsoft.Azure.Services.AppAuthentication library for .NET
For .NET applications and functions, the simplest way to work with a managed identity is through the
Microsoft.Azure.Services.AppAuthentication package. This library will also allow you to test your code locally on
your development machine, using your user account from Visual Studio, the Azure CLI, or Active Directory
Integrated Authentication. For more on local development options with this library, see the
Microsoft.Azure.Services.AppAuthentication reference. This section shows you how to get started with the library
in your code.
1. Add references to the Microsoft.Azure.Services.AppAuthentication and any other necessary NuGet packages
to your application. The below example also uses Microsoft.Azure.KeyVault.
2. Add the following code to your application, modifying to target the correct resource. This example shows
two ways to work with Azure Key Vault:
using Microsoft.Azure.Services.AppAuthentication;
using Microsoft.Azure.KeyVault;
// ...
var azureServiceTokenProvider = new AzureServiceTokenProvider();
string accessToken = await azureServiceTokenProvider.GetAccessTokenAsync("https://vault.azure.net");
// OR
var kv = new KeyVaultClient(new
KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));
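A secret can then be read through the client's extension methods; a brief sketch, with the vault URL and secret name as placeholders:
// Read a secret using the token callback wired up above.
var secret = await kv.GetSecretAsync("https://myvault.vault.azure.net/", "mysecret");
string secretValue = secret.Value;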
To learn more about Microsoft.Azure.Services.AppAuthentication and the operations it exposes, see the
Microsoft.Azure.Services.AppAuthentication reference and the App Service and KeyVault with MSI .NET sample.
Using the REST protocol
An app with a managed identity has two environment variables defined:
MSI_ENDPOINT - the URL to the local token service.
MSI_SECRET - a header used to help mitigate server-side request forgery (SSRF) attacks. The value is rotated
by the platform.
The MSI_ENDPOINT is a local URL from which your app can request tokens. To get a token for a resource, make
an HTTP GET request to this endpoint, including the following parameters:
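The parameters, as used by the examples later in this section, are:
resource (query) - The AAD resource URI of the resource to obtain a token for, such as https://vault.azure.net.
api-version (query) - The version of the token API to use; the examples below use 2017-09-01.
Secret (header) - The value of the MSI_SECRET environment variable.
For example:
GET {MSI_ENDPOINT}?resource=https://vault.azure.net&api-version=2017-09-01 HTTP/1.1
Secret: {MSI_SECRET}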
A successful 200 OK response includes a JSON body with the following properties:
PROPERTY NAME DESCRIPTION
access_token: The requested access token. The calling web service can use this token to authenticate to the receiving web service.
expires_on: The time when the access token expires. The date is represented as the number of seconds from 1970-01-01T0:0:0Z UTC until the expiration time. This value is used to determine the lifetime of cached tokens.
token_type: Indicates the token type value. The only type that Azure AD supports is Bearer. For more information about bearer tokens, see The OAuth 2.0 Authorization Framework: Bearer Token Usage (RFC 6750).
This response is the same as the response for the AAD service-to-service access token request.
NOTE
Environment variables are set up when the process first starts, so after enabling a managed identity for your application, you
may need to restart your application, or redeploy its code, before MSI_ENDPOINT and MSI_SECRET are available to your
code.
HTTP/1.1 200 OK
Content-Type: application/json
{
"access_token": "eyJ0eXAi…",
"expires_on": "09/14/2017 00:00:00 PM +00:00",
"resource": "https://vault.azure.net",
"token_type": "Bearer"
}
Code examples
To make this request in C#:
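public static async Task<HttpResponseMessage> GetToken(string resource, string apiversion)
{
    HttpClient client = new HttpClient();
    // MSI_SECRET authenticates the request to the local token service.
    client.DefaultRequestHeaders.Add("Secret", Environment.GetEnvironmentVariable("MSI_SECRET"));
    // MSI_ENDPOINT is the local URL from which your app requests tokens.
    return await client.GetAsync(String.Format("{0}/?resource={1}&api-version={2}",
        Environment.GetEnvironmentVariable("MSI_ENDPOINT"), resource, apiversion));
}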
In Node.JS:
const rp = require('request-promise');
const getToken = function(resource, apiver, cb) {
let options = {
uri: `${process.env["MSI_ENDPOINT"]}/?resource=${resource}&api-version=${apiver}`,
headers: {
'Secret': process.env["MSI_SECRET"]
}
};
rp(options)
.then(cb);
}
In PowerShell:
$apiVersion = "2017-09-01"
$resourceURI = "https://<AAD-resource-URI-for-resource-to-obtain-token>"
$tokenAuthURI = $env:MSI_ENDPOINT + "?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Headers @{"Secret"="$env:MSI_SECRET"} -Uri $tokenAuthURI
$accessToken = $tokenResponse.access_token
Removing an identity
A system-assigned identity can be removed by disabling the feature using the portal, PowerShell, or CLI in the
same way that it was created. User-assigned identities can be removed individually. To remove all identities, in the
REST/ARM template protocol, this is done by setting the type to "None":
"identity": {
"type": "None"
}
Removing a system-assigned identity in this way will also delete it from AAD. System-assigned identities are also
automatically removed from AAD when the app resource is deleted.
NOTE
There is also an application setting that can be set, WEBSITE_DISABLE_MSI, which just disables the local token service.
However, it leaves the identity in place, and tooling will still show the managed identity as "on" or "enabled." As a result, use of
this setting is not recommended.
Next steps
Access SQL Database securely using a managed identity
Use Key Vault references for App Service and Azure
Functions (preview)
5/20/2019 • 3 minutes to read • Edit Online
NOTE
Key Vault references are currently in preview.
This topic shows you how to work with secrets from Azure Key Vault in your App Service or Azure Functions
application without requiring any code changes. Azure Key Vault is a service that provides centralized secrets
management, with full control over access policies and audit history.
NOTE
Key Vault references currently only support system-assigned managed identities. User-assigned identities cannot be
used.
3. Create an access policy in Key Vault for the application identity you created earlier. Enable the "Get" secret
permission on this policy. Do not configure the "authorized application" or applicationId settings, as this is
not compatible with a managed identity.
Reference syntax
A Key Vault reference is of the form @Microsoft.KeyVault({referenceString}), where {referenceString} is replaced
by one of the following options:
SecretUri=secretUri - where secretUri is the full data-plane URI of a secret in Key Vault, including a version (as in the first example below).
VaultName=vaultName;SecretName=secretName;SecretVersion=secretVersion - where VaultName is the name of your Key Vault resource, SecretName is the name of the target secret, and SecretVersion is the version of the secret to use.
NOTE
In the current preview, versions are required. When rotating secrets, you will need to update the version in your application
configuration.
@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/ec96f02080254f109c51a1f14cdb1931)
Alternatively:
@Microsoft.KeyVault(VaultName=myvault;SecretName=mysecret;SecretVersion=ec96f02080254f109c51a1f14cdb1931)
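Because App Service resolves the reference before your code runs, the application reads the secret like any other app setting; for example, in C# (the setting name MySecret is a placeholder):
// At runtime, the value of the app setting is the resolved secret,
// not the @Microsoft.KeyVault(...) reference string.
string secretValue = Environment.GetEnvironmentVariable("MySecret");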
TIP
Most application settings using Key Vault references should be marked as slot settings, as you should have separate vaults
for each environment.
{
//...
"resources": [
{
"type": "Microsoft.Storage/storageAccounts",
"name": "[variables('storageAccountName')]",
//...
},
{
"type": "Microsoft.Insights/components",
"name": "[variables('appInsightsName')]",
//...
},
{
"type": "Microsoft.Web/sites",
"name": "[variables('functionAppName')]",
"identity": {
"type": "SystemAssigned"
},
},
//...
"resources": [
{
"type": "config",
"name": "appsettings",
//...
"dependsOn": [
"[resourceId('Microsoft.Web/sites', variables('functionAppName'))]",
"[resourceId('Microsoft.KeyVault/vaults/', variables('keyVaultName'))]",
"[resourceId('Microsoft.KeyVault/vaults/secrets', variables('keyVaultName'),
variables('storageConnectionStringName'))]",
"[resourceId('Microsoft.KeyVault/vaults/secrets', variables('keyVaultName'),
variables('appInsightsKeyName'))]"
],
"properties": {
"AzureWebJobsStorage": "[concat('@Microsoft.KeyVault(SecretUri=',
reference(variables('storageConnectionStringResourceId')).secretUriWithVersion, ')')]",
"WEBSITE_CONTENTAZUREFILECONNECTIONSTRING": "[concat('@Microsoft.KeyVault(SecretUri=',
reference(variables('storageConnectionStringResourceId')).secretUriWithVersion, ')')]",
"APPINSIGHTS_INSTRUMENTATIONKEY": "[concat('@Microsoft.KeyVault(SecretUri=',
reference(variables('appInsightsKeyResourceId')).secretUriWithVersion, ')')]",
"WEBSITE_ENABLE_SYNC_UPDATE_SITE": "true"
//...
}
},
{
"type": "sourcecontrols",
"name": "web",
//...
"dependsOn": [
"[resourceId('Microsoft.Web/sites', variables('functionAppName'))]",
"[resourceId('Microsoft.Web/sites/config', variables('functionAppName'),
'appsettings')]"
],
}
]
},
{
"type": "Microsoft.KeyVault/vaults",
"name": "[variables('keyVaultName')]",
//...
"dependsOn": [
"[resourceId('Microsoft.Web/sites', variables('functionAppName'))]"
],
"properties": {
//...
"accessPolicies": [
{
"tenantId": "[reference(concat('Microsoft.Web/sites/', variables('functionAppName'),
'/providers/Microsoft.ManagedIdentity/Identities/default'), '2015-08-31-PREVIEW').tenantId]",
"objectId": "[reference(concat('Microsoft.Web/sites/', variables('functionAppName'),
'/providers/Microsoft.ManagedIdentity/Identities/default'), '2015-08-31-PREVIEW').principalId]",
"permissions": {
"secrets": [ "get" ]
}
}
]
},
"resources": [
{
"type": "secrets",
"name": "[variables('storageConnectionStringName')]",
//...
"dependsOn": [
"[resourceId('Microsoft.KeyVault/vaults/', variables('keyVaultName'))]",
"[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]"
],
"properties": {
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=',
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=',
variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountResourceId'),'2015-05-01-
preview').key1)]"
}
},
{
"type": "secrets",
"name": "[variables('appInsightsKeyName')]",
//...
"dependsOn": [
"[resourceId('Microsoft.KeyVault/vaults/', variables('keyVaultName'))]",
"[resourceId('Microsoft.Insights/components', variables('appInsightsName'))]"
],
"properties": {
"value": "[reference(resourceId('microsoft.insights/components/',
variables('appInsightsName')), '2015-05-01').InstrumentationKey]"
}
}
]
}
]
}
NOTE
In this example, the source control deployment depends on the application settings. This is normally unsafe behavior, as the
app setting update behaves asynchronously. However, because we have included the WEBSITE_ENABLE_SYNC_UPDATE_SITE
application setting, the update is synchronous. This means that the source control deployment will only begin once the
application settings have been fully updated.
Use Azure Functions to connect to an Azure SQL
Database
5/17/2019 • 4 minutes to read • Edit Online
This article shows you how to use Azure Functions to create a scheduled job that connects to an Azure SQL
Database instance. The function code cleans up rows in a table in the database. The new C# function is created
based on a pre-defined timer trigger template in Visual Studio 2019. To support this scenario, you must also set a
database connection string as an app setting in the function app. This scenario uses a bulk operation against the
database.
If this is your first experience working with C# Functions, you should read the Azure Functions C# developer
reference.
Prerequisites
Complete the steps in the article Create your first function using Visual Studio to create a local function app
that targets the version 2.x runtime. You must also have published your project to a function app in Azure.
This article demonstrates a Transact-SQL command that executes a bulk cleanup operation in the
SalesOrderHeader table in the AdventureWorksLT sample database. To create the AdventureWorksLT
sample database, complete the steps in the article Create an Azure SQL database in the Azure portal.
You must add a server-level firewall rule for the public IP address of the computer you use for this
quickstart. This rule is required to be able to access the SQL Database instance from your local computer.
2. In the new sqldb_connection setting, paste the connection string you copied in the previous section into
the Local field and replace {your_username} and {your_password} placeholders with real values. Select
Insert value from local to copy the updated value into the Remote field, and then select OK.
The connection strings are stored encrypted in Azure ( Remote). To prevent leaking secrets, the
local.settings.json project file (Local) should be excluded from source control, such as by using a .gitignore
file.
using System.Data.SqlClient;
using System.Threading.Tasks;
5. Replace the existing Run function with the following code:
[FunctionName("DatabaseCleanup")]
public static async Task Run([TimerTrigger("*/15 * * * * *")]TimerInfo myTimer, ILogger log)
{
// Get the connection string from app settings and use it to create a connection.
var str = Environment.GetEnvironmentVariable("sqldb_connection");
using (SqlConnection conn = new SqlConnection(str))
{
conn.Open();
var text = "UPDATE SalesLT.SalesOrderHeader " +
"SET [Status] = 5 WHERE ShipDate < GetDate();";
This function runs every 15 seconds to update the Status column based on the ship date. To learn more
about the Timer trigger, see Timer trigger for Azure Functions.
6. Press F5 to start the function app. The Azure Functions Core Tools execution window opens behind Visual
Studio.
7. At 15 seconds after startup, the function runs. Watch the output and note the number of rows updated in
the SalesOrderHeader table.
On the first execution, the function should update 32 rows of data. Subsequent runs update no rows, unless you
make changes to the SalesOrderHeader table data so that more rows are selected by the UPDATE statement.
If you plan to publish this function, remember to change the TimerTrigger attribute to a more reasonable cron
schedule than every 15 seconds.
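For example, the following attribute (a sample schedule, not part of the tutorial) runs the cleanup once a day at 1:00 AM:
[TimerTrigger("0 0 1 * * *")] // NCRONTAB fields: {second} {minute} {hour} {day} {month} {day-of-week}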
Next steps
Next, learn how to use Functions with Logic Apps to integrate with other services.
Create a function that integrates with Logic Apps
For more information about Functions, see the following articles:
Azure Functions developer reference
Programmer reference for coding functions and defining triggers and bindings.
Testing Azure Functions
Describes various tools and techniques for testing your functions.
Exporting an Azure-hosted API to PowerApps and
Microsoft Flow
1/18/2019 • 7 minutes to read • Edit Online
PowerApps is a service for building and using custom business apps that connect to your data and work across
platforms. Microsoft Flow makes it easy to automate workflows and business processes between your favorite
apps and services. Both PowerApps and Microsoft Flow come with a variety of built-in connectors to data sources
such as Office 365, Dynamics 365, Salesforce, and more. In some cases, app and flow builders also want to connect
to data sources and APIs built by their organization.
Similarly, developers that want to expose their APIs more broadly within an organization can make their APIs
available to app and flow builders. This topic shows you how to export an API built with Azure Functions or Azure
App Service. The exported API becomes a custom connector, which is used in PowerApps and Microsoft Flow just
like a built-in connector.
NOTE
You can also build custom connectors in the PowerApps and Microsoft Flow UI, without using an OpenAPI definition. For
more information, see Register and use a custom connector (PowerApps) and Register and use a custom connector
(Microsoft Flow).
2. The Export to PowerApps + Microsoft Flow button should be available (if not, you must first create an
OpenAPI definition). Click this button to begin the export process.
SETTING DESCRIPTION
Custom API Name Enter a name, which PowerApps and Microsoft Flow
builders will see in their connector list.
This example shows the API key security definition that was included in the OpenAPI definition.
Now that you've exported the API definition, you import it to create a custom connector in PowerApps and
Microsoft Flow. Custom connectors are shared between the two services, so you only need to import the definition
once.
To import the API definition into PowerApps and Microsoft Flow, follow these steps:
1. Go to powerapps.com or flow.microsoft.com.
2. In the upper right corner, click the gear icon, then click Custom connectors.
4. Enter a name for the custom connector, then navigate to the OpenAPI definition that you exported, and click
Continue.
5. On the General tab, review the information that comes from the OpenAPI definition.
6. On the Security tab, if you are prompted to provide authentication details, enter the values appropriate for
the authentication type. Click Continue.
This example shows the required fields for API key authentication. The fields differ depending on the
authentication type.
7. On the Definitions tab, all the operations defined in your OpenAPI file are auto-populated. If all your
required operations are defined, you can go to the next step. If not, you can add and modify operations here.
This example has one operation, named CalculateCosts . The metadata, like Description, all comes from
the OpenAPI file.
8. Click Create connector at the top of the page.
You can now connect to the custom connector in PowerApps and Microsoft Flow. For more information on
creating connectors in the PowerApps and Microsoft Flow portals, see Register your custom connector
(PowerApps) and Register your custom connector (Microsoft Flow).
"securityDefinitions": {
"AAD": {
"type": "oauth2",
"flow": "accessCode",
"authorizationUrl": "https://login.windows.net/common/oauth2/authorize",
"scopes": {}
}
}
During export, you provide configuration values that allow PowerApps and Microsoft Flow to authenticate users.
This section covers the authentication types that are supported in Express mode: API key, Azure Active Directory,
and Generic OAuth 2.0. PowerApps and Microsoft Flow also support Basic Authentication, and OAuth 2.0 for
specific services like Dropbox, Facebook, and SalesForce.
API key
When using an API key, the users of your connector are prompted to provide the key when they create a
connection. You specify an API key name to help them understand which key is needed. In the earlier example, we
use the name API Key (contact meganb@contoso.com) so people know where to get information about the API key.
For Azure Functions, the key is typically one of the host keys, covering several functions within the function app.
Azure Active Directory (Azure AD)
When using Azure AD, you need two Azure AD application registrations: one for the API itself, and one for the
custom connector:
To configure registration for the API, use the App Service Authentication/Authorization feature.
To configure registration for the connector, follow the steps in Adding an Azure AD application. The
registration must have delegated access to your API and a reply URL of
https://msmanaged-na.consent.azure-apim.net/redirect .
For more information, see the Azure AD registration examples for PowerApps and Microsoft Flow. These
examples use Azure Resource Manager as the API; substitute your API if you follow the steps.
The following configuration values are required:
Client ID - the client ID of your connector Azure AD registration
Client secret - the client secret of your connector Azure AD registration
Login URL - the base URL for Azure AD. In Azure, this is typically https://login.windows.net .
Tenant ID - the ID of the tenant to be used for the login. This should be "common" or the ID of the tenant in
which the connector is created.
Resource URL - the resource URL of the Azure AD registration for your API
IMPORTANT
If someone else will import the API definition into PowerApps and Microsoft Flow as part of the manual flow, you must
provide them with the client ID and client secret of the connector registration, as well as the resource URL of your API. Make
sure that these secrets are managed securely. Do not share the security credentials of the API itself.
The PowerApps platform is designed for business experts to build apps without traditional application code.
Professional developers can use Azure Functions to extend the capabilities of PowerApps, while shielding
PowerApps app builders from the technical details.
You build an app in this topic based on a maintenance scenario for wind turbines. This topic shows you how to call
the function that you defined in Create an OpenAPI definition for a function. The function determines if an
emergency repair on a wind turbine is cost-effective.
For information on calling the same function from Microsoft Flow, see Call a function from Microsoft Flow.
In this topic, you learn how to:
Prepare sample data in Excel.
Export an API definition.
Add a connection to the API.
Create an app and add data sources.
Add controls to view data in the app.
Add controls to call the function and display data.
Run the app to determine whether a repair is cost-effective.
IMPORTANT
The OpenAPI feature is currently in preview and is only available for version 1.x of the Azure Functions runtime.
Prerequisites
An active PowerApps account with the same sign in credentials as your Azure account.
Excel and the Excel sample file that you will use as a data source for your app.
Complete the tutorial Create an OpenAPI definition for a function.
IMPORTANT
Remember that you must be signed in to Azure with the same credentials that you use for your PowerApps and Microsoft
Flow tenants. This enables Azure to create the custom API and make it available for both PowerApps and Microsoft Flow.
1. In the Azure portal, click your function app name (like function-demo-energy) > Platform features >
API definition.
API Key Name Enter the name that app and flow builders should see in
the custom API UI. Note that the example includes helpful
information.
4. Click OK. The custom API is now built and added to the environment you specified.
Add a connection to the API
The custom API (also known as a custom connector) is available in PowerApps, but you must make a connection to
the API before you can use it in an app.
1. In web.powerapps.com, click Connections.
2. Click New Connection, scroll down to the Turbine Repair connector, and click it.
NOTE
If you share your app with others, each person who works on or uses the app must also enter the API key to connect to the
API. This behavior might change in the future, and we will update this topic to reflect that.
(A) Left navigation bar, in which you see a hierarchical view of all the controls on each screen
(B) Middle pane, which shows the screen that you're working on
(C) Right pane, where you set options such as layout and data sources
(D) Property drop-down list, where you select the properties that formulas apply to
(E) Formula bar, where you add formulas (as in Excel) that define app behavior
(F) Ribbon, where you add controls and customize design elements
2. Add the Excel file as a data source.
The data you will import looks like the following:
a. On the app canvas, choose connect to data.
b. On the Data panel, click Add static data to your app.
Normally you would read and write data from an external source, but you're adding the Excel data as
static data because this is a sample.
c. Navigate to the Excel file you saved, select the Turbines table, and click Connect.
3. With the gallery selected, in the right pane, under Properties, click CustomGallerySample.
6. As the last step in the Data panel, change the fields that are displayed in the gallery.
Body1 = LastServiceDate
Subtitle2 = ServiceRequired
Title2 = Title
7. With the gallery selected, set the TemplateFill property to the following formula:
If(ThisItem.IsSelected, Orange, White)
9. Click File, and name the app. Click Save on the left menu, then click Save in the bottom right corner.
There's a lot of other formatting you would typically do in a production app, but we'll move on to the important
part for this scenario - calling the function.
2. Drag the button and the label below the gallery, and resize the label.
3. Select the button text, and change it to Calculate costs . The app should look like the following image.
4. Select the button, and enter the following formula for the button's OnSelect property.
If (BrowseGallery1.Selected.ServiceRequired="Yes", ClearCollect(DetermineRepair,
TurbineRepair.CalculateCosts({hours: BrowseGallery1.Selected.EstimatedEffort, capacity:
BrowseGallery1.Selected.MaxOutput})))
This formula executes when the button is clicked, and it does the following if the selected gallery item has a
ServiceRequired value of Yes :
Clears the collection DetermineRepair to remove data from previous calls. A collection is a tabular
variable.
Assigns to the collection the data returned by calling the function TurbineRepair.CalculateCosts() .
The values passed to the function come from the EstimatedEffort and MaxOutput fields for the
item selected in the gallery. These fields aren't displayed in the gallery, but they're still available to use
in formulas.
5. Select the label, and enter the following formula for the label's Text property.
"Repair decision: " & First(DetermineRepair).message & " | Cost: " & First(DetermineRepair).costToFix &
" | Revenue: " & First(DetermineRepair).revenueOpportunity
This formula uses the First() function to access the first (and only) row of the DetermineRepair collection.
It then displays the three values that the function returns: message , costToFix , and revenueOpportunity .
These values are blank before the app runs for the first time.
The completed app should look like the following image.
Run the app
You have a complete app! Now it's time to run it and see the function calls in action.
1. In the upper right corner of PowerApps Studio, click the run button.
2. Select a turbine with a value of Yes for ServiceRequired, then click the Calculate costs button. You
should see a result like the following image.
3. Try the other turbines to see what's returned by the function each time.
Next steps
In this topic, you learned how to:
Prepare sample data in Excel.
Export an API definition.
Add a connection to the API.
Create an app and add data sources.
Add controls to view data in the app.
Add controls to call the function and display data
Run the app to determine whether a repair is cost-effective.
To learn more about PowerApps, see Introduction to PowerApps.
To learn about other interesting scenarios that use Azure Functions, see Call a function from Microsoft Flow and
Create a function that integrates with Azure Logic Apps.
Call a function from Microsoft Flow
5/14/2019 • 8 minutes to read • Edit Online
Microsoft Flow makes it easy to automate workflows and business processes between your favorite apps and
services. Professional developers can use Azure Functions to extend the capabilities of Microsoft Flow, while
shielding flow builders from the technical details.
You build a flow in this topic based on a maintenance scenario for wind turbines. This topic shows you how to call
the function that you defined in Create an OpenAPI definition for a function. The function determines if an
emergency repair on a wind turbine is cost-effective. If it is cost-effective, the flow sends an email to recommend
the repair.
For information on calling the same function from PowerApps, see Call a function from PowerApps.
In this topic, you learn how to:
Create a list in SharePoint.
Export an API definition.
Add a connection to the API.
Create a flow to send email if a repair is cost-effective.
Run the flow.
IMPORTANT
The OpenAPI feature is currently in preview and is only available for version 1.x of the Azure Functions runtime.
Prerequisites
An active Microsoft Flow account with the same sign in credentials as your Azure account.
SharePoint, which you use as a data source for this flow. Sign up for an Office 365 trial if you don't already
have SharePoint.
Complete the tutorial Create an OpenAPI definition for a function.
LastServiceDate: Date
ServiceRequired: Yes/No
5. Repeat the last two steps for the other three columns in the list:
a. Number > "MaxOutput"
b. Yes/No > "ServiceRequired"
c. Number > "EstimatedEffort"
That's it for now - you should have an empty list that looks like the following image. You add data to the list after
you create the flow.
1. In the Azure portal, click your function app name (like function-demo-energy) > Platform features >
API definition.
SETTING DESCRIPTION
API Key Name Enter the name that app and flow builders should see in
the custom API UI. Note that the example includes helpful
information.
4. Click OK. The custom API is now built and added to the environment you specified.
2. Click Create Connection, scroll down to the Turbine Repair connector, and click it.
NOTE
If you share your flow with others, each person who works on or uses the flow must also enter the API key to connect to the
API. This behavior might change in the future, and we will update this topic to reflect that.
Create a flow
Now you're ready to create a flow that uses the custom connector and the SharePoint list you created.
Add a trigger and specify a condition
You first create a flow from blank (without a template), and add a trigger that fires when an item is created in the
SharePoint list. You then add a condition to determine what happens next.
1. In flow.microsoft.com, click My Flows, then Create from blank.
Microsoft Flow adds two branches to the flow: If yes and If no. You add steps to one or both branches after
you define the condition that you want to match.
5. On the Condition card, click the first box, then select ServiceRequired from the Dynamic content dialog
box.
6. Enter a value of True for the condition.
The value is displayed as Yes or No in the SharePoint list, but it is stored as a boolean, either True or
False .
7. Click Create flow at the top of the page. Be sure to click Update Flow periodically.
For any items created in the list, the flow checks if the ServiceRequired field is set to Yes , then goes to the If yes
branch or the If no branch as appropriate. To save time, in this topic you only specify actions for the If yes branch.
Add the custom connector
You now add the custom connector that calls the function in Azure. You add the custom connector to the flow just
like a standard connector.
1. In the If yes branch, click Add an action.
2. In the Choose an action dialog box, search for Turbine Repair , then select the action Turbine Repair -
Calculates costs.
The following image shows the card that is added to the flow. The fields and descriptions come from the
OpenAPI definition for the connector.
3. On the Calculates costs card, use the Dynamic content dialog box to select inputs for the function.
Microsoft Flow shows numeric fields but not the date field, because the OpenAPI definition specifies that
Hours and Capacity are numeric.
For Hours, select EstimatedEffort, and for Capacity, select MaxOutput.
Now you add another condition based on the output of the function.
4. At the bottom of the If yes branch, click More, then Add a condition.
5. On the Condition 2 card, click the first box, then select Message from the Dynamic content dialog box.
6. Enter a value of Yes . The flow goes to the next If yes branch or If no branch based on whether the
message returned by the function is yes (make the repair) or no (don't make the repair).
2. In the Choose an action dialog box, search for email , then select a send email action based on the email
system you use (in this case Outlook).
3. On the Send an email card, compose an email. Enter a valid name in your organization for the To field. In
the image below you can see the other fields are a combination of text and tokens from the Dynamic
content dialog box.
The Title token comes from the SharePoint list, and CostToFix and RevenueOpportunity are returned by
the function.
The completed flow should look like the following image (we left out the first If no branch to save space).
4. Click Update Flow at the top of the page, then click Done.
Title: Turbine 60
LastServiceDate: 08/04/2017
MaxOutput: 2500
ServiceRequired: Yes
EstimatedEffort: 10
3. Click Done.
When you add the item, it triggers the flow, which you take a look at next.
4. In flow.microsoft.com, click My Flows, then click the flow you created.
If the run was successful, you can review the flow operations on the next page. If the run failed for any
reason, the next page provides troubleshooting information.
6. Expand the cards to see what occurred during the flow. For example, expand the Calculates costs card to
see the inputs to and outputs from the function.
7. Check the email account for the person you specified in the To field of the Send an email card. The email
sent from the flow should look like the following image.
You can see how the tokens have been replaced with the correct values from the SharePoint list and the
function.
Next steps
In this topic, you learned how to:
Create a list in SharePoint.
Export an API definition.
Add a connection to the API.
Create a flow to send email if a repair is cost-effective.
Run the flow.
To learn more about Microsoft Flow, see Get started with Microsoft Flow.
To learn about other interesting scenarios that use Azure Functions, see Call a function from PowerApps and
Create a function that integrates with Azure Logic Apps.
How to use managed identities for App Service and
Azure Functions
3/21/2019 • 11 minutes to read • Edit Online
NOTE
Managed identity support for App Service on Linux and Web App for Containers is currently in preview.
IMPORTANT
Managed identities for App Service and Azure Functions will not behave as expected if your app is migrated across
subscriptions/tenants. The app will need to obtain a new identity, which can be done by disabling and re-enabling the
feature. See Removing an identity below. Downstream resources will also need to have access policies updated to use the
new identity.
This topic shows you how to create a managed identity for App Service and Azure Functions applications and how
to use it to access other resources. A managed identity from Azure Active Directory allows your app to easily
access other AAD-protected resources such as Azure Key Vault. The identity is managed by the Azure platform and
does not require you to provision or rotate any secrets. For more about managed identities in AAD, see Managed
identities for Azure resources.
Your application can be granted two types of identities:
A system-assigned identity is tied to your application and is deleted if your app is deleted. An app can only
have one system-assigned identity. System-assigned identity support is generally available for Windows apps.
A user-assigned identity is a standalone Azure resource which can be assigned to your app. An app can have
multiple user-assigned identities. User-assigned identity support is in preview for all app types.
Adding a system-assigned identity
Creating an app with a system-assigned identity requires an additional property to be set on the application.
Using the Azure CLI
The following steps walk you through creating a web app and assigning it an identity using the Azure CLI:
1. If needed, install the Azure CLI, and then sign in to your Azure account:
az login
2. Create a web application using the CLI. For more examples of how to use the CLI with App Service, see
App Service CLI samples:
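A minimal sketch (the group, plan, and app names below are placeholders, not values from this article):
az group create --name myResourceGroup --location westus
az appservice plan create --name myPlan --resource-group myResourceGroup --sku S1
az webapp create --name myApp --resource-group myResourceGroup --plan myPlan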
3. Run the identity assign command to create the identity for this application:
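For example, assuming the placeholder names above:
az webapp identity assign --name myApp --resource-group myResourceGroup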
NOTE
This article has been updated to use the new Azure PowerShell Az module. You can still use the AzureRM module, which will
continue to receive bug fixes until at least December 2020. To learn more about the new Az module and AzureRM
compatibility, see Introducing the new Azure PowerShell Az module. For Az module installation instructions, see Install Azure
PowerShell.
The following steps will walk you through creating a web app and assigning it an identity using Azure PowerShell:
1. If needed, install the Azure PowerShell using the instruction found in the Azure PowerShell guide, and then
run Login-AzAccount to create a connection with Azure.
2. Create a web application using Azure PowerShell. For more examples of how to use Azure PowerShell with
App Service, see App Service PowerShell samples:
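A minimal sketch (the group, plan, and app names below are placeholders, not values from this article):
New-AzResourceGroup -Name myResourceGroup -Location westus
New-AzAppServicePlan -Name myPlan -ResourceGroupName myResourceGroup -Location westus
New-AzWebApp -Name myApp -ResourceGroupName myResourceGroup -Location westus -AppServicePlan myPlan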
3. Run the Set-AzWebApp -AssignIdentity command to create the identity for this application:
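For example, assuming the placeholder names above:
Set-AzWebApp -AssignIdentity $true -Name myApp -ResourceGroupName myResourceGroup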
"identity": {
"type": "SystemAssigned"
}
NOTE
An application can have both system-assigned and user-assigned identities at the same time. In this case, the type
property would be SystemAssigned,UserAssigned
Adding the system-assigned type tells Azure to create and manage the identity for your application.
For example, a web app might look like the following:
{
"apiVersion": "2016-08-01",
"type": "Microsoft.Web/sites",
"name": "[variables('appName')]",
"location": "[resourceGroup().location]",
"identity": {
"type": "SystemAssigned"
},
"properties": {
"name": "[variables('appName')]",
"serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"hostingEnvironment": "",
"clientAffinityEnabled": false,
"alwaysOn": true
},
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]"
]
}
"identity": {
"type": "SystemAssigned",
"tenantId": "<TENANTID>",
"principalId": "<PRINCIPALID>"
}
Where <TENANTID> and <PRINCIPALID> are replaced with GUIDs. The tenantId property identifies what AAD
tenant the identity belongs to. The principalId is a unique identifier for the application's new identity. Within AAD,
the service principal has the same name that you gave to your App Service or Azure Functions instance.
Adding a user-assigned identity
Creating an app with a user-assigned identity requires that you create the identity and then add its resource
identifier to your app config.
Using the Azure portal
NOTE
This portal experience is being deployed and may not yet be available in all regions.
"identity": {
"type": "UserAssigned",
"userAssignedIdentities": {
"<RESOURCEID>": {}
}
}
NOTE
An application can have both system-assigned and user-assigned identities at the same time. In this case, the type
property would be SystemAssigned,UserAssigned
Adding the user-assigned type and a reference to the identity resource tells Azure to use the user-assigned identity for your application.
For example, a web app might look like the following:
{
"apiVersion": "2016-08-01",
"type": "Microsoft.Web/sites",
"name": "[variables('appName')]",
"location": "[resourceGroup().location]",
"identity": {
"type": "UserAssigned",
"userAssignedIdentities": {
"[resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', variables('identityName'))]": {}
}
},
"properties": {
"name": "[variables('appName')]",
"serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"hostingEnvironment": "",
"clientAffinityEnabled": false,
"alwaysOn": true
},
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms', variables('hostingPlanName'))]",
"[resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', variables('identityName'))]"
]
}
"identity": {
"type": "UserAssigned",
"userAssignedIdentities": {
"<RESOURCEID>": {
"principalId": "<PRINCIPALID>",
"clientId": "<CLIENTID>"
}
}
}
Where <PRINCIPALID> and <CLIENTID> are replaced with GUIDs. The principalId is a unique identifier for the
identity which is used for AAD administration. The clientId is a unique identifier for the application's new identity
that is used for specifying which identity to use during runtime calls.
Obtaining tokens for Azure resources
IMPORTANT
You may need to configure the target resource to allow access from your application. For example, if you request a token to
Key Vault, you need to make sure you have added an access policy that includes your application's identity. Otherwise, your
calls to Key Vault will be rejected, even if they include the token. To learn more about which resources support Azure Active
Directory tokens, see Azure services that support Azure AD authentication.
There is a simple REST protocol for obtaining a token in App Service and Azure Functions. For .NET applications,
the Microsoft.Azure.Services.AppAuthentication library provides an abstraction over this protocol and supports a
local development experience.
Using the Microsoft.Azure.Services.AppAuthentication library for .NET
For .NET applications and functions, the simplest way to work with a managed identity is through the
Microsoft.Azure.Services.AppAuthentication package. This library will also allow you to test your code locally on
your development machine, using your user account from Visual Studio, the Azure CLI, or Active Directory
Integrated Authentication. For more on local development options with this library, see the
Microsoft.Azure.Services.AppAuthentication reference. This section shows you how to get started with the library
in your code.
1. Add references to the Microsoft.Azure.Services.AppAuthentication and any other necessary NuGet
packages to your application. The below example also uses Microsoft.Azure.KeyVault.
2. Add the following code to your application, modifying to target the correct resource. This example shows
two ways to work with Azure Key Vault:
using Microsoft.Azure.Services.AppAuthentication;
using Microsoft.Azure.KeyVault;
// ...
var azureServiceTokenProvider = new AzureServiceTokenProvider();
string accessToken = await azureServiceTokenProvider.GetAccessTokenAsync("https://vault.azure.net");
// OR
var kv = new KeyVaultClient(new
KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));
To learn more about Microsoft.Azure.Services.AppAuthentication and the operations it exposes, see the
Microsoft.Azure.Services.AppAuthentication reference and the App Service and KeyVault with MSI .NET sample.
Using the REST protocol
An app with a managed identity has two environment variables defined:
MSI_ENDPOINT - the URL to the local token service.
MSI_SECRET - a header used to help mitigate server-side request forgery (SSRF) attacks. The value is rotated
by the platform.
The MSI_ENDPOINT is a local URL from which your app can request tokens. To get a token for a resource, make
an HTTP GET request to this endpoint, including the following parameters:
PARAMETER NAME IN DESCRIPTION
resource Query The AAD resource URI of the resource for which a token should be obtained.
api-version Query The version of the token API to be used ("2017-09-01" is currently the only version supported).
secret Header The value of the MSI_SECRET environment variable.
A successful 200 OK response includes a JSON body with the following properties:
PROPERTY NAME DESCRIPTION
access_token The requested access token. The calling web service can use this token to authenticate to the receiving web service.
expires_on The time when the access token expires. The date is represented as the number of seconds from 1970-01-01T0:0:0Z UTC until the expiration time. This value is used to determine the lifetime of cached tokens.
token_type Indicates the token type value. The only type that Azure AD supports is Bearer. For more information about bearer tokens, see The OAuth 2.0 Authorization Framework: Bearer Token Usage (RFC 6750).
This response is the same as the response for the AAD service-to-service access token request.
NOTE
Environment variables are set up when the process first starts, so after enabling a managed identity for your application, you
may need to restart your application, or redeploy its code, before MSI_ENDPOINT and MSI_SECRET are available to your
code.
HTTP/1.1 200 OK
Content-Type: application/json
{
"access_token": "eyJ0eXAi…",
"expires_on": "09/14/2017 00:00:00 PM +00:00",
"resource": "https://vault.azure.net",
"token_type": "Bearer"
}
Code examples
To make this request in C#:
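A minimal sketch using HttpClient (assumes using System, System.Net.Http, and System.Threading.Tasks; the method name GetToken is illustrative):
public static async Task<HttpResponseMessage> GetToken(string resource, string apiversion)
{
    HttpClient client = new HttpClient();
    // Send the platform-managed secret in the "Secret" header.
    client.DefaultRequestHeaders.Add("Secret", Environment.GetEnvironmentVariable("MSI_SECRET"));
    // Request a token for the given resource from the local token service.
    return await client.GetAsync(String.Format("{0}/?resource={1}&api-version={2}",
        Environment.GetEnvironmentVariable("MSI_ENDPOINT"), resource, apiversion));
}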
In Node.JS:
const rp = require('request-promise');
// Requests a token for `resource` from the local token service and passes the
// response (a JSON string) to the callback `cb`.
const getToken = function(resource, apiver, cb) {
    let options = {
        uri: `${process.env["MSI_ENDPOINT"]}/?resource=${resource}&api-version=${apiver}`,
        headers: {
            'Secret': process.env["MSI_SECRET"]
        }
    };
    rp(options)
        .then(cb);
}
In PowerShell:
$apiVersion = "2017-09-01"
$resourceURI = "https://<AAD-resource-URI-for-resource-to-obtain-token>"
$tokenAuthURI = $env:MSI_ENDPOINT + "?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Headers @{"Secret"="$env:MSI_SECRET"} -Uri $tokenAuthURI
$accessToken = $tokenResponse.access_token
Removing an identity
A system-assigned identity can be removed by disabling the feature using the portal, PowerShell, or CLI in the
same way that it was created. User-assigned identities can be removed individually. To remove all identities
when using the REST/ARM template protocol, set the type to "None":
"identity": {
"type": "None"
}
Removing a system-assigned identity in this way will also delete it from AAD. System-assigned identities are also
automatically removed from AAD when the app resource is deleted.
NOTE
There is also an application setting that can be set, WEBSITE_DISABLE_MSI, which just disables the local token service.
However, it leaves the identity in place, and tooling will still show the managed identity as "on" or "enabled." As a result, use
of this setting is not recommended.
Next steps
Access SQL Database securely using a managed identity
How to troubleshoot "functions runtime is
unreachable"
3/27/2019 • 3 minutes to read • Edit Online
Error text
This article helps you troubleshoot the following error when it is displayed in the Functions portal:
Error: Azure Functions Runtime is unreachable. Click here for details on storage configuration
Summary
This issue occurs when the Azure Functions runtime cannot start. The most common cause is the function app
losing access to its storage account. For more information, see the storage account requirements.
Troubleshooting
We'll walk through the five most common error cases: how to identify each one, and how to resolve it.
1. Storage account deleted
2. Storage account application settings deleted
3. Storage account credentials invalid
4. Storage account inaccessible
5. Daily execution quota full
Next Steps
Now that your function app is back up and operational, take a look at our quickstarts and developer references
to get up and running again:
Create your first Azure Function
Jump right in and create your first function using the Azure Functions quickstart.
Azure Functions developer reference
Provides more technical information about the Azure Functions runtime and a reference for coding functions
and defining triggers and bindings.
Testing Azure Functions
Describes various tools and techniques for testing your functions.
How to scale Azure Functions
Discusses service plans available with Azure Functions, including the Consumption hosting plan, and how to
choose the right plan.
Learn more about Azure App Service
Azure Functions leverages Azure App Service for core functionality like deployments, environment variables,
and diagnostics.
App settings reference for Azure Functions
5/21/2019 • 5 minutes to read • Edit Online
App settings in a function app contain global configuration options that affect all functions for that function app.
When you run locally, these settings are accessed as local environment variables. This article lists the app settings
that are available in function apps.
There are several ways that you can add, update, and delete function app settings:
In the Azure portal.
By using the Azure CLI.
There are other global configuration options in the host.json file and in the local.settings.json file.
APPINSIGHTS_INSTRUMENTATIONKEY
The Application Insights instrumentation key if you're using Application Insights. See Monitor Azure Functions.
APPINSIGHTS_INSTRUMENTATIONKEY 5dbdd5e9-af77-484b-9032-64f83bb83bb
AzureWebJobsDashboard
Optional storage account connection string for storing logs and displaying them in the Monitor tab in the portal.
The storage account must be a general-purpose one that supports blobs, queues, and tables. See Storage account
and Storage account requirements.
AzureWebJobsDashboard DefaultEndpointsProtocol=https;AccountName=
[name];AccountKey=[key]
TIP
For better performance and experience, it is recommended that you use APPINSIGHTS_INSTRUMENTATIONKEY and
Application Insights for monitoring instead of AzureWebJobsDashboard.
AzureWebJobsDisableHomepage
true means disable the default landing page that is shown for the root URL of a function app. Default is false .
AzureWebJobsDisableHomepage true
When this app setting is omitted or set to false , a page similar to the following example is displayed in
response to the URL <functionappname>.azurewebsites.net .
AzureWebJobsDotNetReleaseCompilation
true means use Release mode when compiling .NET code; false means use Debug mode. Default is true .
AzureWebJobsDotNetReleaseCompilation true
AzureWebJobsFeatureFlags
A comma-delimited list of beta features to enable. Beta features enabled by these flags are not production ready,
but can be enabled for experimental use before they go live.
AzureWebJobsFeatureFlags feature1,feature2
AzureWebJobsScriptRoot
The path to the root directory where the host.json file and function folders are located. In a function app, the
default is %HOME%\site\wwwroot .
AzureWebJobsScriptRoot %HOME%\site\wwwroot
AzureWebJobsSecretStorageType
Specifies the repository or provider to use for key storage. Currently, the supported repositories are blob storage
("Blob") and the local file system ("Files"). The default is blob in version 2 and file system in version 1.
AzureWebJobsSecretStorageType Files
AzureWebJobsStorage
The Azure Functions runtime uses this storage account connection string for all functions except for HTTP
triggered functions. The storage account must be a general-purpose one that supports blobs, queues, and tables.
See Storage account and Storage account requirements.
AzureWebJobsStorage DefaultEndpointsProtocol=https;AccountName=
[name];AccountKey=[key]
AzureWebJobs_TypeScriptPath
Path to the compiler used for TypeScript. Allows you to override the default if you need to.
AzureWebJobs_TypeScriptPath %HOME%\typescript
FUNCTION_APP_EDIT_MODE
Dictates whether editing in the Azure portal is enabled. Valid values are "readwrite" and "readonly".
FUNCTION_APP_EDIT_MODE readonly
FUNCTIONS_EXTENSION_VERSION
The version of the Functions runtime to use in this function app. A tilde with major version means use the latest
version of that major version (for example, "~2"). When new versions for the same major version are available,
they are automatically installed in the function app. To pin the app to a specific version, use the full version
number (for example, "2.0.12345"). Default is "~2". A value of ~1 pins your app to version 1.x of the runtime.
FUNCTIONS_EXTENSION_VERSION ~2
FUNCTIONS_WORKER_RUNTIME
The language worker runtime to load in the function app. This will correspond to the language being used in your
application (for example, "dotnet"). For functions in multiple languages you will need to publish them to multiple
apps, each with a corresponding worker runtime value. Valid values are dotnet (C#/F#), node
(JavaScript/TypeScript), java (Java), powershell (PowerShell), and python (Python).
FUNCTIONS_WORKER_RUNTIME dotnet
WEBSITE_CONTENTAZUREFILECONNECTIONSTRING
For consumption plans only. Connection string for storage account where the function app code and
configuration are stored. See Create a function app.
WEBSITE_CONTENTAZUREFILECONNECTIONSTRING DefaultEndpointsProtocol=https;AccountName=
[name];AccountKey=[key]
WEBSITE_CONTENTSHARE
For consumption plans only. The file path to the function app code and configuration. Used with
WEBSITE_CONTENTAZUREFILECONNECTIONSTRING. Default is a unique string that begins with the
function app name. See Create a function app.
WEBSITE_CONTENTSHARE functionapp091999e2
WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT
The maximum number of instances that the function app can scale out to. Default is no limit.
NOTE
This setting is a preview feature and is only reliable when set to a value of 5 or less.
WEBSITE_MAX_DYNAMIC_APPLICATION_SCALE_OUT 5
WEBSITE_NODE_DEFAULT_VERSION
Default is "8.11.1".
WEBSITE_NODE_DEFAULT_VERSION 8.11.1
WEBSITE_RUN_FROM_PACKAGE
Enables your function app to run from a mounted package file.
WEBSITE_RUN_FROM_PACKAGE 1
Valid values are either a URL that resolves to the location of a deployment package file, or 1 . When set to 1 ,
the package must be in the d:\home\data\SitePackages folder. When using zip deployment with this setting, the
package is automatically uploaded to this location. In preview, this setting was named WEBSITE_RUN_FROM_ZIP . For
more information, see Run your functions from a package file.
AZURE_FUNCTION_PROXY_DISABLE_LOCAL_CALL
By default, Functions proxies use a shortcut to send API calls from proxies directly to functions in the same
function app, rather than creating a new HTTP request. This setting allows you to disable that behavior.
AZURE_FUNCTION_PROXY_BACKEND_URL_DECODE_SLASHES
This setting controls whether %2F is decoded as slashes in route parameters when they are inserted into the
backend URL.
Example
Here is an example proxies.json in a function app at the URL myfunction.com
{
"$schema": "http://json.schemastore.org/proxies",
"proxies": {
"root": {
"matchCondition": {
"route": "/{*all}"
},
"backendUri": "example.com/{all}"
}
}
}
Next steps
Learn how to update app settings
See global settings in the host.json file
See other app settings for App Service apps
Azure Blob storage bindings for Azure Functions
5/6/2019 • 27 minutes to read • Edit Online
This article explains how to work with Azure Blob storage bindings in Azure Functions. Azure Functions
supports trigger, input, and output bindings for blobs. The article includes a section for each binding:
Blob trigger
Blob input binding
Blob output binding
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
NOTE
Use the Event Grid trigger instead of the Blob storage trigger for blob-only storage accounts, for high scale, or to reduce
latency. For more information, see the Trigger section.
DEVELOPMENT ENVIRONMENT TO ADD SUPPORT IN FUNCTIONS 2.X
Local development - C# script, JavaScript, F#, Java and Python Register the extension
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
Trigger
The Blob storage trigger starts a function when a new or updated blob is detected. The blob contents are
provided as input to the function.
The Event Grid trigger has built-in support for blob events and can also be used to start a function when a new
or updated blob is detected. For an example, see the Image resize with Event Grid tutorial.
Use Event Grid instead of the Blob storage trigger for the following scenarios:
Blob storage accounts
High scale
Minimizing latency
Blob storage accounts
Blob storage accounts are supported for blob input and output bindings but not for blob triggers. Blob storage
triggers require a general-purpose storage account.
High scale
High scale can be loosely defined as containers that have more than 100,000 blobs in them or storage accounts
that have more than 100 blob updates per second.
Latency issues
If your function app is on the Consumption plan, there can be up to a 10-minute delay in processing new blobs
if a function app has gone idle. To avoid this latency, you can switch to an App Service plan with Always On
enabled. You can also use an Event Grid trigger with your Blob storage account. For an example, see the Event
Grid tutorial.
Queue storage trigger
Besides Event Grid, another alternative for processing blobs is the Queue storage trigger, but it has no built-in
support for blob events. You would have to create queue messages when creating or updating blobs. For an
example that assumes you've done that, see the blob input binding example later in this article.
Trigger - example
See the language-specific example:
C#
C# script (.csx)
Java
JavaScript
Python
Trigger - C# example
The following example shows a C# function that writes a log when a blob is added or updated in the
samples-workitems container.
[FunctionName("BlobTriggerCSharp")]
public static void Run([BlobTrigger("samples-workitems/{name}")] Stream myBlob, string name, ILogger log)
{
log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length}
Bytes");
}
The string {name} in the blob trigger path samples-workitems/{name} creates a binding expression that you can
use in function code to access the file name of the triggering blob. For more information, see Blob name
patterns later in this article.
For more information about the BlobTrigger attribute, see Trigger - attributes.
Trigger - C# script example
The following example shows a blob trigger binding in a function.json file and C# script code that uses the
binding. The function writes a log when a blob is added or updated in the samples-workitems container.
Here's the binding data in the function.json file:
{
"disabled": false,
"bindings": [
{
"name": "myBlob",
"type": "blobTrigger",
"direction": "in",
"path": "samples-workitems/{name}",
"connection":"MyStorageAccountAppSetting"
}
]
}
The string {name} in the blob trigger path samples-workitems/{name} creates a binding expression that you can
use in function code to access the file name of the triggering blob. For more information, see Blob name
patterns later in this article.
For more information about function.json file properties, see the Configuration section later in this article.
Here's C# script code that binds to a Stream:
using Microsoft.WindowsAzure.Storage.Blob;
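A minimal body for this example (a sketch; it assumes the binding name myBlob from the function.json above):
public static void Run(Stream myBlob, string name, ILogger log)
{
    // Log the name and size of the blob that triggered the function.
    log.LogInformation($"C# Blob trigger function processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
}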
Trigger - JavaScript example
The following example shows a blob trigger binding in a function.json file and JavaScript code that uses the
binding. The function writes a log when a blob is added or updated in the samples-workitems container.
Here's the binding data in the function.json file:
{
"disabled": false,
"bindings": [
{
"name": "myBlob",
"type": "blobTrigger",
"direction": "in",
"path": "samples-workitems/{name}",
"connection":"MyStorageAccountAppSetting"
}
]
}
The string {name} in the blob trigger path samples-workitems/{name} creates a binding expression that you can
use in function code to access the file name of the triggering blob. For more information, see Blob name
patterns later in this article.
For more information about function.json file properties, see the Configuration section later in this article.
Here's the JavaScript code:
module.exports = function(context) {
context.log('Node.js Blob trigger function processed', context.bindings.myBlob);
context.done();
};
The string {name} in the blob trigger path samples-workitems/{name} creates a binding expression that you can
use in function code to access the file name of the triggering blob. For more information, see Blob name
patterns later in this article.
For more information about function.json file properties, see the Configuration section later in this article.
Here's the Python code:
import logging
import azure.functions as func
{
"disabled": false,
"bindings": [
{
"name": "file",
"type": "blobTrigger",
"direction": "in",
"path": "myblob/{name}",
"connection":"MyStorageAccountAppSetting"
}
]
}
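A minimal Python body consistent with the function.json above (a sketch; the parameter name file matches the binding's name property):
def main(file: func.InputStream):
    # Log the name and size of the blob that triggered the function.
    logging.info('Python Blob trigger function processed blob Name: %s Size: %s bytes',
                 file.name, file.length)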
Trigger - attributes
In C# class libraries, use the following attributes to configure a blob trigger:
BlobTriggerAttribute
The attribute's constructor takes a path string that indicates the container to watch and optionally a blob
name pattern. Here's an example:
[FunctionName("ResizeImage")]
public static void Run(
[BlobTrigger("sample-images/{name}")] Stream image,
[Blob("sample-images-md/{name}", FileAccess.Write)] Stream imageSmall)
{
....
}
You can set the Connection property to specify the storage account to use, as shown in the following
example:
[FunctionName("ResizeImage")]
public static void Run(
[BlobTrigger("sample-images/{name}", Connection = "StorageConnectionAppSetting")] Stream image,
[Blob("sample-images-md/{name}", FileAccess.Write)] Stream imageSmall)
{
....
}
[StorageAccount("ClassLevelStorageAppSetting")]
public static class AzureFunctions
{
[FunctionName("BlobTrigger")]
[StorageAccount("FunctionLevelStorageAppSetting")]
public static void Run( //...
{
....
}
The storage account to use is determined in the following order:
The BlobTrigger attribute's Connection property.
The StorageAccount attribute applied to the same parameter as the BlobTrigger attribute.
The StorageAccount attribute applied to the function.
The StorageAccount attribute applied to the class.
The default storage account for the function app ("AzureWebJobsStorage" app setting).
Trigger - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
BlobTrigger attribute.
When you're developing locally, app settings go into the local.settings.json file.
Trigger - usage
In C# and C# script, you can use the following parameter types for the triggering blob:
Stream
TextReader
string
Byte[]
A POCO serializable as JSON
ICloudBlob 1
CloudBlockBlob 1
CloudPageBlob 1
CloudAppendBlob 1
"path": "input/{blobname}.{blobextension}",
If the blob is named original-Blob1.txt, the values of the blobname and blobextension variables in function
code are original-Blob1 and txt.
Filter on blob name
The following example triggers only on blobs in the input container that start with the string "original-":
"path": "input/original-{name}",
If the blob name is original-Blob1.txt, the value of the name variable in function code is Blob1 .
Filter on file type
The following example triggers only on .png files:
"path": "samples/{name}.png",
"path": "images/{{20140101}}-{name}",
If the blob is named {20140101 }-soundfile.mp3, the name variable value in the function code is soundfile.mp3.
Trigger - metadata
The blob trigger provides several metadata properties. These properties can be used as part of binding
expressions in other bindings or as parameters in your code. These values have the same semantics as the
CloudBlob type.
For example, the following C# script and JavaScript examples log the path to the triggering blob, including the
container:
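A minimal C# script sketch (the blobTrigger parameter binds to the metadata property of the same name):
public static void Run(string myBlob, string blobTrigger, ILogger log)
{
    // blobTrigger contains the full path of the triggering blob, container included.
    log.LogInformation($"Full blob path: {blobTrigger}");
}
In JavaScript, the same value is available as context.bindingData.blobTrigger.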
Trigger - polling
If the blob container being monitored contains more than 10,000 blobs (across all containers), the Functions
runtime scans log files to watch for new or changed blobs. This process can result in delays. A function might
not get triggered until several minutes or longer after the blob is created.
WARNING
In addition, storage logs are created on a "best effort" basis. There's no guarantee that all events are captured. Under
some conditions, logs may be missed.
If you require faster or more reliable blob processing, consider creating a queue message when you create the blob. Then
use a queue trigger instead of a blob trigger to process the blob. Another option is to use Event Grid; see the tutorial
Automate resizing uploaded images using Event Grid.
Input
Use a Blob storage input binding to read blobs.
Input - example
See the language-specific example:
C#
C# script (.csx)
Java
JavaScript
Python
Input - C# example
The following example is a C# function that uses a queue trigger and an input blob binding. The queue
message contains the name of the blob, and the function logs the size of the blob.
[FunctionName("BlobInput")]
public static void Run(
[QueueTrigger("myqueue-items")] string myQueueItem,
[Blob("samples-workitems/{queueTrigger}", FileAccess.Read)] Stream myBlob,
ILogger log)
{
log.LogInformation($"BlobInput processed blob\n Name:{myQueueItem} \n Size: {myBlob.Length} bytes");
}
Input - C# script example
The following example shows blob input and output bindings in a function.json file and C# script code that
uses the bindings. The function makes a copy of a blob. The function is triggered by a queue message that
contains the name of the blob to copy. The new blob is named {originalblobname}-Copy.
Here's the binding data in the function.json file:
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnectionAppSetting",
"name": "myQueueItem",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "myInputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}",
"connection": "MyStorageConnectionAppSetting",
"direction": "in"
},
{
"name": "myOutputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}-Copy",
"connection": "MyStorageConnectionAppSetting",
"direction": "out"
}
],
"disabled": false
}
public static void Run(string myQueueItem, string myInputBlob, out string myOutputBlob, ILogger log)
{
log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
myOutputBlob = myInputBlob;
}
Input - JavaScript example
The following example shows blob input and output bindings in a function.json file and JavaScript code that
uses the bindings. The function makes a copy of a blob. The function is triggered by a queue message that
contains the name of the blob to copy. The new blob is named {originalblobname}-Copy.
In the function.json file, the queueTrigger metadata property is used to specify the blob name in the path
properties:
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnectionAppSetting",
"name": "myQueueItem",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "myInputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}",
"connection": "MyStorageConnectionAppSetting",
"direction": "in"
},
{
"name": "myOutputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}-Copy",
"connection": "MyStorageConnectionAppSetting",
"direction": "out"
}
],
"disabled": false
}
module.exports = function(context) {
context.log('Node.js Queue trigger function processed', context.bindings.myQueueItem);
context.bindings.myOutputBlob = context.bindings.myInputBlob;
context.done();
};
Input - Python example
The following example copies a blob named by a queue message. Its function.json file, which defines the
binding names, is not shown here; the parameter names below are assumptions modeled on the JavaScript
example above.
import logging
import azure.functions as func
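def main(queuemsg: func.QueueMessage, inputblob: func.InputStream) -> func.InputStream:
    # Sketch: the binding names queuemsg and inputblob are assumptions (see note above).
    logging.info('Python Queue trigger function processed %s', inputblob.name)
    return inputblob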
@FunctionName("getBlobSize")
@StorageAccount("Storage_Account_Connection_String")
public void blobSize(
@QueueTrigger(
name = "filename",
queueName = "myqueue-items-sample")
String filename,
@BlobInput(
name = "file",
dataType = "binary",
path = "samples-workitems/{queueTrigger}")
byte[] content,
final ExecutionContext context) {
context.getLogger().info("The size of \"" + filename + "\" is: " + content.length + " bytes");
}
In the Java functions runtime library, use the @BlobInput annotation on parameters whose value would come
from a blob. This annotation can be used with native Java types, POJOs, or nullable values using Optional<T> .
Input - attributes
In C# class libraries, use the BlobAttribute.
The attribute's constructor takes the path to the blob and a FileAccess parameter indicating read or write, as
shown in the following example:
[FunctionName("BlobInput")]
public static void Run(
[QueueTrigger("myqueue-items")] string myQueueItem,
[Blob("samples-workitems/{queueTrigger}", FileAccess.Read)] Stream myBlob,
ILogger log)
{
log.LogInformation($"BlobInput processed blob\n Name:{myQueueItem} \n Size: {myBlob.Length} bytes");
}
You can set the Connection property to specify the storage account to use, as shown in the following example:
[FunctionName("BlobInput")]
public static void Run(
[QueueTrigger("myqueue-items")] string myQueueItem,
[Blob("samples-workitems/{queueTrigger}", FileAccess.Read, Connection = "StorageConnectionAppSetting")]
Stream myBlob,
ILogger log)
{
log.LogInformation($"BlobInput processed blob\n Name:{myQueueItem} \n Size: {myBlob.Length} bytes");
}
You can use the StorageAccount attribute to specify the storage account at class, method, or parameter level.
For more information, see Trigger - attributes.
Input - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
Blob attribute.
When you're developing locally, app settings go into the local.settings.json file.
Input - usage
In C# and C# script, you can use the following parameter types for the blob input binding:
Stream
TextReader
string
Byte[]
CloudBlobContainer
CloudBlobDirectory
ICloudBlob 1
CloudBlockBlob 1
CloudPageBlob 1
CloudAppendBlob 1
Output
Use Blob storage output bindings to write blobs.
Output - example
See the language-specific example:
C#
C# script (.csx)
Java
JavaScript
Python
Output - C# example
The following example is a C# function that uses a blob trigger and two output blob bindings. The function is
triggered by the creation of an image blob in the sample-images container. It creates small and medium size
copies of the image blob.
[FunctionName("ResizeImage")]
public static void Run(
[BlobTrigger("sample-images/{name}")] Stream image,
[Blob("sample-images-sm/{name}", FileAccess.Write)] Stream imageSmall,
[Blob("sample-images-md/{name}", FileAccess.Write)] Stream imageMedium)
{
var imageBuilder = ImageResizer.ImageBuilder.Current;
var size = imageDimensionsTable[ImageSize.Small];
imageBuilder.Build(image, imageSmall,
new ResizeSettings(size.Item1, size.Item2, FitMode.Max, null), false);
image.Position = 0;
size = imageDimensionsTable[ImageSize.Medium];
imageBuilder.Build(image, imageMedium,
new ResizeSettings(size.Item1, size.Item2, FitMode.Max, null), false);
}
Output - C# script example
The following example shows blob input and output bindings in a function.json file and C# script code that
uses the bindings. The function makes a copy of a blob, triggered by a queue message that contains the name
of the blob to copy.
Here's the binding data in the function.json file:
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnectionAppSetting",
"name": "myQueueItem",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "myInputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}",
"connection": "MyStorageConnectionAppSetting",
"direction": "in"
},
{
"name": "myOutputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}-Copy",
"connection": "MyStorageConnectionAppSetting",
"direction": "out"
}
],
"disabled": false
}
The configuration section explains these properties.
Here's the C# script code:
public static void Run(string myQueueItem, string myInputBlob, out string myOutputBlob, ILogger log)
{
log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
myOutputBlob = myInputBlob;
}
Output - JavaScript example
The following example shows blob input and output bindings in a function.json file and JavaScript code that
uses the bindings to copy a blob.
Here's the binding data in the function.json file:
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnectionAppSetting",
"name": "myQueueItem",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "myInputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}",
"connection": "MyStorageConnectionAppSetting",
"direction": "in"
},
{
"name": "myOutputBlob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}-Copy",
"connection": "MyStorageConnectionAppSetting",
"direction": "out"
}
],
"disabled": false
}
module.exports = function(context) {
context.log('Node.js Queue trigger function processed', context.bindings.myQueueItem);
context.bindings.myOutputBlob = context.bindings.myInputBlob;
context.done();
};
Output - Python example
The following example shows blob input and output bindings in a function.json file and Python code that uses
the bindings to copy a blob.
Here's the function.json file:
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnectionAppSetting",
"name": "queuemsg",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "inputblob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}",
"connection": "MyStorageConnectionAppSetting",
"direction": "in"
},
{
"name": "outputblob",
"type": "blob",
"path": "samples-workitems/{queueTrigger}-Copy",
"connection": "MyStorageConnectionAppSetting",
"direction": "out"
}
],
"disabled": false,
"scriptFile": "__init__.py"
}
import logging
import azure.functions as func
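A minimal Python body consistent with the function.json above (a sketch; queuemsg, inputblob, and outputblob match the binding names):
def main(queuemsg: func.QueueMessage, inputblob: func.InputStream, outputblob: func.Out[bytes]):
    # Copy the contents of the input blob to the output blob.
    logging.info('Python Queue trigger function processed %s', queuemsg.get_body().decode('utf-8'))
    outputblob.set(inputblob.read())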
@FunctionName("copyBlobQueueTrigger")
@StorageAccount("Storage_Account_Connection_String")
@BlobOutput(
name = "target",
path = "myblob/{queueTrigger}-Copy")
public String copyBlobQueue(
@QueueTrigger(
name = "filename",
dataType = "string",
queueName = "myqueue-items")
String filename,
@BlobInput(
name = "file",
path = "samples-workitems/{queueTrigger}")
String content,
final ExecutionContext context) {
context.getLogger().info("The content of \"" + filename + "\" is: " + content);
return content;
}
In the Java functions runtime library, use the @BlobOutput annotation on function parameters whose value
would be written to an object in blob storage. The parameter type should be OutputBinding<T> , where T is any
native Java type or a POJO.
Output - attributes
In C# class libraries, use the BlobAttribute.
The attribute's constructor takes the path to the blob and a FileAccess parameter indicating read or write, as
shown in the following example:
[FunctionName("ResizeImage")]
public static void Run(
[BlobTrigger("sample-images/{name}")] Stream image,
[Blob("sample-images-md/{name}", FileAccess.Write)] Stream imageSmall)
{
...
}
You can set the Connection property to specify the storage account to use, as shown in the following example:
[FunctionName("ResizeImage")]
public static void Run(
[BlobTrigger("sample-images/{name}")] Stream image,
[Blob("sample-images-md/{name}", FileAccess.Write, Connection = "StorageConnectionAppSetting")] Stream
imageSmall)
{
...
}
Output - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
Blob attribute.
When you're developing locally, app settings go into the local.settings.json file.
Output - usage
In C# and C# script, you can bind to the following types to write blobs:
TextWriter
out string
out Byte[]
CloudBlobStream
Stream
CloudBlobContainer 1
CloudBlobDirectory
ICloudBlob 2
CloudBlockBlob 2
CloudPageBlob 2
CloudAppendBlob 2
1 Requires "in" binding direction in function.json or FileAccess.Read in a C# class library. However, you can
use the container object that the runtime provides to do write operations, such as uploading blobs to the
container.
2 Requires "inout" binding direction in function.json or FileAccess.ReadWrite in a C# class library.
If you try to bind to one of the Storage SDK types and get an error message, make sure that you have a
reference to the correct Storage SDK version.
In async functions, use the return value or IAsyncCollector instead of an out parameter.
Binding to string or Byte[] is only recommended if the blob size is small, as the entire blob contents are
loaded into memory. Generally, it is preferable to use a Stream or CloudBlockBlob type. For more information,
see Concurrency and memory usage earlier in this article.
In JavaScript, access the blob data using context.bindings.<name from function.json> .
Next steps
Learn more about Azure functions triggers and bindings
Azure Cosmos DB bindings for Azure Functions 1.x
3/11/2019 • 28 minutes to read • Edit Online
This article explains how to work with Azure Cosmos DB bindings in Azure Functions. Azure Functions supports
trigger, input, and output bindings for Azure Cosmos DB.
NOTE
This article is for Azure Functions 1.x. For information about how to use these bindings in Functions 2.x, see Azure Cosmos
DB bindings for Azure Functions 2.x.
This binding was originally named DocumentDB. In Functions version 1.x, only the trigger was renamed Cosmos DB; the
input binding, output binding, and NuGet package retain the DocumentDB name.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
NOTE
Azure Cosmos DB bindings are only supported for use with the SQL API. For all other Azure Cosmos DB APIs, you should
access the database from your function by using the static client for your API, including Azure Cosmos DB's API for
MongoDB, Cassandra API, Gremlin API, and Table API.
Trigger
The Azure Cosmos DB Trigger uses the Azure Cosmos DB Change Feed to listen for inserts and updates across
partitions. The change feed publishes inserts and updates, not deletions.
Trigger - example
See the language-specific example:
C#
C# script (.csx)
JavaScript
Trigger - C# example
The following example shows a C# function that is invoked when there are inserts or updates in the specified
database and collection.
using Microsoft.Azure.Documents;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using System.Collections.Generic;
namespace CosmosDBSamplesV1
{
public static class CosmosTrigger
{
[FunctionName("CosmosTrigger")]
public static void Run([CosmosDBTrigger(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
LeaseCollectionName = "leases",
CreateLeaseCollectionIfNotExists = true)]IReadOnlyList<Document> documents,
TraceWriter log)
{
if (documents != null && documents.Count > 0)
{
log.Info($"Documents modified: {documents.Count}");
log.Info($"First document Id: {documents[0].Id}");
}
}
}
}
{
"type": "cosmosDBTrigger",
"name": "documents",
"direction": "in",
"leaseCollectionName": "leases",
"connectionStringSetting": "<connection-app-setting>",
"databaseName": "Tasks",
"collectionName": "Items",
"createLeaseCollectionIfNotExists": true
}
Here's the C# script code:
#r "Microsoft.Azure.Documents.Client"
using System;
using Microsoft.Azure.Documents;
using System.Collections.Generic;
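A minimal C# script body following the usings above (a sketch; it assumes the binding name documents from the function.json):
public static void Run(IReadOnlyList<Document> documents, TraceWriter log)
{
    if (documents != null && documents.Count > 0)
    {
        log.Info($"Documents modified: {documents.Count}");
        log.Info($"First document Id: {documents[0].Id}");
    }
}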
Trigger - JavaScript example
The following example shows a Cosmos DB trigger binding in a function.json file and JavaScript code that uses
the binding. The function writes log messages when Cosmos DB records are modified.
Here's the binding data in the function.json file:
{
"type": "cosmosDBTrigger",
"name": "documents",
"direction": "in",
"leaseCollectionName": "leases",
"connectionStringSetting": "<connection-app-setting>",
"databaseName": "Tasks",
"collectionName": "Items",
"createLeaseCollectionIfNotExists": true
}
Here's the JavaScript code:
module.exports = function (context, documents) {
    context.log('Documents modified: ', documents.length);
    context.log('First document Id: ', documents[0].id);
    context.done();
}
Trigger - attributes
In C# class libraries, use the CosmosDBTrigger attribute.
The attribute's constructor takes the database name and collection name. For information about those settings
and other properties that you can configure, see Trigger - configuration. Here's a CosmosDBTrigger attribute
example in a method signature:
[FunctionName("DocumentUpdates")]
public static void Run(
[CosmosDBTrigger("database", "collection", ConnectionStringSetting = "myCosmosDB")]
IReadOnlyList<Document> documents,
TraceWriter log)
{
...
}
For a complete example, see Trigger - C# example.
Trigger - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
CosmosDBTrigger attribute.
When you're developing locally, app settings go into the local.settings.json file.
Trigger - usage
The trigger requires a second collection that it uses to store leases over the partitions. Both the collection being
monitored and the collection that contains the leases must be available for the trigger to work.
IMPORTANT
If multiple functions are configured to use a Cosmos DB trigger for the same collection, each of the functions should use a
dedicated lease collection or specify a different LeaseCollectionPrefix for each function. Otherwise, only one of the
functions will be triggered. For information about the prefix, see the Configuration section.
The trigger doesn't indicate whether a document was updated or inserted; it just provides the document itself. If
you need to handle updates and inserts differently, you could do that by implementing timestamp fields for
insertion or update.
Input
The Azure Cosmos DB input binding uses the SQL API to retrieve one or more Azure Cosmos DB documents
and passes them to the input parameter of the function. The document ID or query parameters can be
determined based on the trigger that invokes the function.
Input - examples
See the language-specific examples that read a single document by specifying an ID value:
C#
C# script (.csx)
JavaScript
F#
Input - C# examples
This section contains the following examples:
Queue trigger, look up ID from JSON
HTTP trigger, look up ID from query string
HTTP trigger, look up ID from route data
HTTP trigger, look up ID from route data, using SqlQuery
HTTP trigger, get multiple docs, using SqlQuery
HTTP trigger, get multiple docs, using DocumentClient
The examples refer to a simple ToDoItem type:
namespace CosmosDBSamplesV1
{
public class ToDoItem
{
public string Id { get; set; }
public string Description { get; set; }
}
}
Queue trigger, look up ID from JSON
The following example shows a C# function that retrieves a single document. The function is triggered by a
queue message that contains a JSON object, which is parsed into the following ToDoItemLookup type to supply
the ID to look up:
namespace CosmosDBSamplesV1
{
public class ToDoItemLookup
{
public string ToDoItemId { get; set; }
}
}
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
namespace CosmosDBSamplesV1
{
public static class DocByIdFromJSON
{
[FunctionName("DocByIdFromJSON")]
public static void Run(
[QueueTrigger("todoqueueforlookup")] ToDoItemLookup toDoItemLookup,
[DocumentDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
Id = "{ToDoItemId}")]ToDoItem toDoItem,
TraceWriter log)
{
log.Info($"C# Queue trigger function processed Id={toDoItemLookup?.ToDoItemId}");
if (toDoItem == null)
{
log.Info($"ToDo item not found");
}
else
{
log.Info($"Found ToDo item, Description={toDoItem.Description}");
}
}
}
}
HTTP trigger, look up ID from query string
The following example shows a C# function that retrieves a single document. The function is triggered by an
HTTP request that uses a query string to specify the ID to look up.
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using System.Net;
using System.Net.Http;
namespace CosmosDBSamplesV1
{
public static class DocByIdFromQueryString
{
[FunctionName("DocByIdFromQueryString")]
public static HttpResponseMessage Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req,
[DocumentDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
Id = "{Query.id}")] ToDoItem toDoItem,
TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
if (toDoItem == null)
{
log.Info($"ToDo item not found");
}
else
{
log.Info($"Found ToDo item, Description={toDoItem.Description}");
}
return req.CreateResponse(HttpStatusCode.OK);
}
}
}
HTTP trigger, look up ID from route data
The following example shows a C# function that retrieves a single document. The function is triggered by an
HTTP request that uses route data to specify the ID to look up.
namespace CosmosDBSamplesV1
{
public static class DocByIdFromRouteData
{
[FunctionName("DocByIdFromRouteData")]
public static HttpResponseMessage Run(
[HttpTrigger(
AuthorizationLevel.Anonymous, "get", "post",
Route = "todoitems/{id}")]HttpRequestMessage req,
[DocumentDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
Id = "{id}")] ToDoItem toDoItem,
TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
if (toDoItem == null)
{
log.Info($"ToDo item not found");
}
else
{
log.Info($"Found ToDo item, Description={toDoItem.Description}");
}
return req.CreateResponse(HttpStatusCode.OK);
}
}
}
HTTP trigger, look up ID from route data, using SqlQuery
The following example shows a C# function that retrieves a single document. The function is triggered by an
HTTP request that uses route data to specify the ID to look up, using a SqlQuery parameter.
namespace CosmosDBSamplesV1
{
public static class DocByIdFromRouteDataUsingSqlQuery
{
[FunctionName("DocByIdFromRouteDataUsingSqlQuery")]
public static HttpResponseMessage Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post",
Route = "todoitems2/{id}")]HttpRequestMessage req,
[DocumentDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
SqlQuery = "select * from ToDoItems r where r.id = {id}")] IEnumerable<ToDoItem> toDoItems,
TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
foreach (ToDoItem toDoItem in toDoItems)
{
log.Info(toDoItem.Description);
}
return req.CreateResponse(HttpStatusCode.OK);
}
}
}
HTTP trigger, get multiple docs, using SqlQuery
The following example shows a C# function that retrieves a list of documents using a SqlQuery parameter.
namespace CosmosDBSamplesV1
{
public static class DocsBySqlQuery
{
[FunctionName("DocsBySqlQuery")]
public static HttpResponseMessage Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
HttpRequestMessage req,
[DocumentDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
SqlQuery = "SELECT top 2 * FROM c order by c._ts desc")]
IEnumerable<ToDoItem> toDoItems,
TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
foreach (ToDoItem toDoItem in toDoItems)
{
log.Info(toDoItem.Description);
}
return req.CreateResponse(HttpStatusCode.OK);
}
}
}
HTTP trigger, get multiple docs, using DocumentClient
The following example shows a C# function that retrieves a list of documents using the DocumentClient
instance provided by the binding.
namespace CosmosDBSamplesV1
{
public static class DocsByUsingDocumentClient
{
[FunctionName("DocsByUsingDocumentClient")]
public static async Task<HttpResponseMessage> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req,
[DocumentDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection")] DocumentClient client,
TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
if (searchterm == null)
{
return req.CreateResponse(HttpStatusCode.NotFound);
}
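// (Sketch) The construction of query was also lost; it likely used the bound
// DocumentClient to build a document query, for example:
Uri collectionUri = UriFactory.CreateDocumentCollectionUri("ToDoItems", "Items");
IDocumentQuery<ToDoItem> query = client.CreateDocumentQuery<ToDoItem>(collectionUri)
    .Where(p => p.Description.Contains(searchterm))
    .AsDocumentQuery();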
while (query.HasMoreResults)
{
foreach (ToDoItem result in await query.ExecuteNextAsync())
{
log.Info(result.Description);
}
}
return req.CreateResponse(HttpStatusCode.OK);
}
}
}
Input - C# script examples
This section contains C# script versions of the preceding examples. The examples refer to a simple ToDoItem
type:
namespace CosmosDBSamplesV1
{
public class ToDoItem
{
public string Id { get; set; }
public string Description { get; set; }
}
}
{
"name": "inputDocument",
"type": "documentDB",
"databaseName": "MyDatabase",
"collectionName": "MyCollection",
"id" : "{queueTrigger}",
"partitionKey": "{partition key value}",
"connection": "MyAccount_COSMOSDB",
"direction": "in"
}
using System;
using System.Net;
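// (Sketch) The function signature was lost in extraction. Given the
// req.CreateResponse call below, this C# script example was HTTP-triggered
// and likely began:
public static HttpResponseMessage Run(HttpRequestMessage req, ToDoItem toDoItem, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");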
if (toDoItem == null)
{
log.Info($"ToDo item not found");
}
else
{
log.Info($"Found ToDo item, Description={toDoItem.Description}");
}
return req.CreateResponse(HttpStatusCode.OK);
}
using System.Net;
if (toDoItem == null)
{
log.Info($"ToDo item not found");
}
else
{
log.Info($"Found ToDo item, Description={toDoItem.Description}");
}
return req.CreateResponse(HttpStatusCode.OK);
}
using System.Net;
using System.Net;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;
if (searchterm == null)
{
return req.CreateResponse(HttpStatusCode.NotFound);
}
while (query.HasMoreResults)
{
foreach (ToDoItem result in await query.ExecuteNextAsync())
{
log.Info(result.Description);
}
}
return req.CreateResponse(HttpStatusCode.OK);
}
Input - JavaScript example
The following example shows a JavaScript function that updates a document retrieved by an Azure Cosmos DB
input binding:
// Change input document contents using Azure Cosmos DB input binding, using
context.bindings.inputDocumentOut
module.exports = function (context) {
context.bindings.inputDocumentOut = context.bindings.inputDocumentIn;
context.bindings.inputDocumentOut.text = "This was updated!";
context.done();
};
{
"name": "inputDocument",
"type": "documentDB",
"databaseName": "MyDatabase",
"collectionName": "MyCollection",
"id" : "{queueTrigger}",
"connection": "MyAccount_COSMOSDB",
"direction": "in"
}
This example requires a project.json file that specifies the FSharp.Interop.Dynamic and Dynamitey NuGet
dependencies:
{
"frameworks": {
"net46": {
"dependencies": {
"Dynamitey": "1.0.2",
"FSharp.Interop.Dynamic": "3.0.0"
}
}
}
}
Input - attributes
In C# class libraries, use the DocumentDB attribute.
The attribute's constructor takes the database name and collection name. For information about those settings
and other properties that you can configure, see the following configuration section.
Input - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
DocumentDB attribute.
When you're developing locally, app settings go into the local.settings.json file.
Input - usage
In C# and F# functions, when the function exits successfully, any changes made to the input document via named
input parameters are automatically persisted.
In JavaScript functions, updates are not made automatically upon function exit. Instead, use
context.bindings.<documentName>In and context.bindings.<documentName>Out to make updates. See the JavaScript
example.
Output
The Azure Cosmos DB output binding lets you write a new document to an Azure Cosmos DB database using the
SQL API.
Output - examples
See the language-specific examples:
C#
C# script (.csx)
JavaScript
F#
See also the input example that uses DocumentClient .
Output - C# examples
This section contains the following examples:
Queue trigger, write one doc
Queue trigger, write docs using IAsyncCollector
The examples refer to a simple ToDoItem type:
namespace CosmosDBSamplesV1
{
public class ToDoItem
{
public string Id { get; set; }
public string Description { get; set; }
}
}
Queue trigger, write one doc
The following example shows a C# function that adds a document to a database, using data provided in a
queue message.
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using System;
namespace CosmosDBSamplesV1
{
public static class WriteOneDoc
{
[FunctionName("WriteOneDoc")]
public static void Run(
[QueueTrigger("todoqueueforwrite")] string queueMessage,
[DocumentDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection")]out dynamic document,
TraceWriter log)
{
document = new { Description = queueMessage, id = Guid.NewGuid() };
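        // (Sketch) The remainder of this example was lost in extraction; it likely
        // logged the queue message and closed out the method and class:
        log.Info($"C# Queue trigger function processed: {queueMessage}");
        }
    }
}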
Queue trigger, write docs using IAsyncCollector
The following example shows a C# function that adds a collection of documents to a database, using data
provided in queue messages.
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using System.Threading.Tasks;
namespace CosmosDBSamplesV1
{
public static class WriteDocsIAsyncCollector
{
[FunctionName("WriteDocsIAsyncCollector")]
public static async Task Run(
[QueueTrigger("todoqueueforwritemulti")] ToDoItem[] toDoItemsIn,
[DocumentDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection")]
IAsyncCollector<ToDoItem> toDoItemsOut,
TraceWriter log)
{
log.Info($"C# Queue trigger function processed {toDoItemsIn?.Length} items");
Output - C# script example
The following example shows a Cosmos DB output binding in a function.json file and C# script code that uses
the binding. The function is triggered by a queue message that contains JSON in the following format:
{
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
The function creates Azure Cosmos DB documents in the following format for each record:
{
"id": "John Henry-123456",
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
{
"name": "employeeDocument",
"type": "documentDB",
"databaseName": "MyDatabase",
"collectionName": "MyCollection",
"createIfNotExists": true,
"connection": "MyAccount_COSMOSDB",
"direction": "out"
}
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json.Linq;
public static void Run(string myQueueItem, out object employeeDocument, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
employeeDocument = new {
id = employee.name + "-" + employee.employeeId,
name = employee.name,
employeeId = employee.employeeId,
address = employee.address
};
}
namespace CosmosDBSamplesV1
{
public class ToDoItem
{
public string Id { get; set; }
public string Description { get; set; }
}
}
{
"bindings": [
{
"name": "toDoItemsIn",
"type": "queueTrigger",
"direction": "in",
"queueName": "todoqueueforwritemulti",
"connection": "AzureWebJobsStorage"
},
{
"type": "documentDB",
"name": "toDoItemsOut",
"databaseName": "ToDoItems",
"collectionName": "Items",
"connection": "CosmosDBConnection",
"direction": "out"
}
],
"disabled": false
}
public static async Task Run(ToDoItem[] toDoItemsIn, IAsyncCollector<ToDoItem> toDoItemsOut, TraceWriter log)
{
log.Info($"C# Queue trigger function processed {toDoItemsIn?.Length} items");
Output - JavaScript example
The following example shows a Cosmos DB output binding in a function.json file and a JavaScript function
that uses the binding. The function is triggered by a queue message that contains JSON in the following
format:
{
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
The function creates Azure Cosmos DB documents in the following format for each record:
{
"id": "John Henry-123456",
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
{
"name": "employeeDocument",
"type": "documentDB",
"databaseName": "MyDatabase",
"collectionName": "MyCollection",
"createIfNotExists": true,
"connection": "MyAccount_COSMOSDB",
"direction": "out"
}
Here's the JavaScript code:
module.exports = function (context) {
context.bindings.employeeDocument = JSON.stringify({
id: context.bindings.myQueueItem.name + "-" + context.bindings.myQueueItem.employeeId,
name: context.bindings.myQueueItem.name,
employeeId: context.bindings.myQueueItem.employeeId,
address: context.bindings.myQueueItem.address
});
context.done();
};
Output - F# example
The following example shows an Azure Cosmos DB output binding in a function.json file and an F# function
that uses the binding. The function uses a queue input binding for a queue that receives JSON in the following
format:
{
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
The function creates Azure Cosmos DB documents in the following format for each record:
{
"id": "John Henry-123456",
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
Here's the binding data in the function.json file:
{
"name": "employeeDocument",
"type": "documentDB",
"databaseName": "MyDatabase",
"collectionName": "MyCollection",
"createIfNotExists": true,
"connection": "MyAccount_COSMOSDB",
"direction": "out"
}
Here's the F# code:
open FSharp.Interop.Dynamic
open Newtonsoft.Json.Linq

type Employee = {
    id: string
    name: string
    employeeId: string
    address: string
}

let Run(myQueueItem: string, employeeDocument: byref&lt;obj&gt;, log: TraceWriter) =
    log.Info(sprintf "F# Queue trigger function processed: %s" myQueueItem)
    let employee = JObject.Parse(myQueueItem)
    employeeDocument &lt;-
        { id = sprintf "%s-%s" employee?name employee?employeeId
          name = employee?name
          employeeId = employee?employeeId
          address = employee?address }
This example requires a project.json file that specifies the FSharp.Interop.Dynamic and Dynamitey NuGet
dependencies:
{
"frameworks": {
"net46": {
"dependencies": {
"Dynamitey": "1.0.2",
"FSharp.Interop.Dynamic": "3.0.0"
}
}
}
}
Output - attributes
In C# class libraries, use the DocumentDB attribute.
The attribute's constructor takes the database name and collection name. For information about those settings
and other properties that you can configure, see Output - configuration. Here's a DocumentDB attribute example in
a method signature:
[FunctionName("QueueToDocDB")]
public static void Run(
[QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] string myQueueItem,
[DocumentDB("ToDoList", "Items", Id = "id", ConnectionStringSetting = "myCosmosDB")] out dynamic
document)
{
...
}
Output - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
DocumentDB attribute.
FUNCTION.JSON PROPERTY ATTRIBUTE PROPERTY DESCRIPTION
When you're developing locally, app settings go into the local.settings.json file.
Output - usage
By default, when you write to the output parameter in your function, a document is created in your database. This
document has an automatically generated GUID as the document ID. You can specify the document ID of the
output document by specifying the id property in the JSON object passed to the output parameter.
NOTE
When you specify the ID of an existing document, it gets overwritten by the new output document.
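A minimal sketch of that ID behavior with the 1.x DocumentDB attribute (the queue and setting names reuse
the examples above; the fixed id value is illustrative, not from the article):
using System;
using Microsoft.Azure.WebJobs;

public static class UpsertByIdSketch
{
    [FunctionName("UpsertByIdSketch")]
    public static void Run(
        [QueueTrigger("todoqueueforwrite")] string queueMessage,
        [DocumentDB("ToDoItems", "Items",
            ConnectionStringSetting = "CosmosDBConnection")] out dynamic document)
    {
        // An explicit id overwrites any existing document with the same ID;
        // omit the id property to get an automatically generated GUID instead.
        document = new { id = "todo-123", Description = queueMessage };
    }
}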
Azure Cosmos DB bindings for Azure Functions 2.x
This article explains how to work with Azure Cosmos DB bindings in Azure Functions 2.x. Azure Functions
supports trigger, input, and output bindings for Azure Cosmos DB.
NOTE
This article is for Azure Functions version 2.x. For information about how to use these bindings in Functions 1.x, see Azure
Cosmos DB bindings for Azure Functions 1.x.
This binding was originally named DocumentDB. In Functions version 2.x, the trigger, bindings, and package are all named
Cosmos DB.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
Supported APIs
Azure Cosmos DB bindings are only supported for use with the SQL API. For all other Azure Cosmos DB APIs,
you should access the database from your function by using the static client for your API, including Azure Cosmos
DB's API for MongoDB, Cassandra API, Gremlin API, and Table API.
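The "static client" pattern that paragraph describes can be sketched as follows, using the SQL API's
DocumentClient as a stand-in for whichever API client you need (the app setting names here are assumptions):
using System;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class StaticClientSketch
{
    // A single static client is shared across invocations, which avoids
    // exhausting connections when the function scales out.
    private static readonly DocumentClient client = new DocumentClient(
        new Uri(Environment.GetEnvironmentVariable("CosmosDBAccountEndpoint")),
        Environment.GetEnvironmentVariable("CosmosDBAccountKey"));

    [FunctionName("StaticClientSketch")]
    public static void Run(
        [QueueTrigger("todoqueueforlookup")] string queueMessage, ILogger log)
    {
        log.LogInformation($"Client ready for account {client.ServiceEndpoint}");
    }
}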
DEVELOPMENT ENVIRONMENT                                          TO ADD SUPPORT IN FUNCTIONS 2.X
Local development - C# script, JavaScript, F#, Java and Python   Register the extension
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
Trigger
The Azure Cosmos DB Trigger uses the Azure Cosmos DB Change Feed to listen for inserts and updates across
partitions. The change feed publishes inserts and updates, not deletions.
Trigger - example
See the language-specific example:
C#
C# script (.csx)
Java
JavaScript
Python
Trigger - C# example
The following example shows a C# function that is invoked when there are inserts or updates in the specified
database and collection.
using Microsoft.Azure.Documents;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using System.Collections.Generic;
using Microsoft.Extensions.Logging;
namespace CosmosDBSamplesV2
{
public static class CosmosTrigger
{
[FunctionName("CosmosTrigger")]
public static void Run([CosmosDBTrigger(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
LeaseCollectionName = "leases",
CreateLeaseCollectionIfNotExists = true)]IReadOnlyList<Document> documents,
ILogger log)
{
if (documents != null && documents.Count > 0)
{
log.LogInformation($"Documents modified: {documents.Count}");
log.LogInformation($"First document Id: {documents[0].Id}");
}
}
}
}
#r "Microsoft.Azure.DocumentDB.Core"
using System;
using Microsoft.Azure.Documents;
using System.Collections.Generic;
using Microsoft.Extensions.Logging;
{
"type": "cosmosDBTrigger",
"name": "documents",
"direction": "in",
"leaseCollectionName": "leases",
"connectionStringSetting": "<connection-app-setting>",
"databaseName": "Tasks",
"collectionName": "Items",
"createLeaseCollectionIfNotExists": true
}
context.done();
}
@FunctionName("cosmosDBMonitor")
public void cosmosDbProcessor(
@CosmosDBTrigger(name = "items",
databaseName = "ToDoList",
collectionName = "Items",
leaseCollectionName = "leases",
createLeaseCollectionIfNotExists = true,
connectionStringSetting = "AzureCosmosDBConnection") String[] items,
final ExecutionContext context ) {
context.getLogger().info(items.length + " item(s) is/are changed.");
}
In the Java functions runtime library, use the @CosmosDBTrigger annotation on parameters whose value would
come from Cosmos DB. This annotation can be used with native Java types, POJOs, or nullable values using
Optional.
Trigger - Python example
The following example shows a Cosmos DB trigger binding in a function.json file and a Python function that uses
the binding. The function writes log messages when Cosmos DB records are modified.
Here's the binding data in the function.json file:
{
"name": "documents",
"type": "cosmosDBTrigger",
"direction": "in",
"leaseCollectionName": "leases",
"connectionStringSetting": "<connection-app-setting>",
"databaseName": "Tasks",
"collectionName": "Items",
"createLeaseCollectionIfNotExists": true
}
Here's the Python code:
import logging
import azure.functions as func


def main(documents: func.DocumentList) -> str:
    if documents:
        logging.info('Document id: %s', documents[0]['id'])
[FunctionName("DocumentUpdates")]
public static void Run(
[CosmosDBTrigger("database", "collection", ConnectionStringSetting = "myCosmosDB")]
IReadOnlyList<Document> documents,
ILogger log)
{
...
}
Trigger - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
CosmosDBTrigger attribute.
When you're developing locally, app settings go into the local.settings.json file.
Trigger - usage
The trigger requires a second collection that it uses to store leases over the partitions. Both the collection being
monitored and the collection that contains the leases must be available for the trigger to work.
IMPORTANT
If multiple functions are configured to use a Cosmos DB trigger for the same collection, each of the functions should use a
dedicated lease collection or specify a different LeaseCollectionPrefix for each function. Otherwise, only one of the
functions will be triggered. For information about the prefix, see the Configuration section.
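A sketch of the prefix approach, assuming two class-library functions that watch the same Items collection
(the function and prefix names here are hypothetical):
using System.Collections.Generic;
using Microsoft.Azure.Documents;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class LeasePrefixSketch
{
    [FunctionName("TriggerA")]
    public static void RunA([CosmosDBTrigger(
        databaseName: "ToDoItems", collectionName: "Items",
        ConnectionStringSetting = "CosmosDBConnection",
        LeaseCollectionName = "leases",
        LeaseCollectionPrefix = "triggerA")] IReadOnlyList<Document> documents, ILogger log)
            => log.LogInformation($"TriggerA: {documents.Count} change(s)");

    [FunctionName("TriggerB")]
    public static void RunB([CosmosDBTrigger(
        databaseName: "ToDoItems", collectionName: "Items",
        ConnectionStringSetting = "CosmosDBConnection",
        LeaseCollectionName = "leases",
        // A distinct prefix lets both functions share one lease collection.
        LeaseCollectionPrefix = "triggerB")] IReadOnlyList<Document> documents, ILogger log)
            => log.LogInformation($"TriggerB: {documents.Count} change(s)");
}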
The trigger doesn't indicate whether a document was updated or inserted; it just provides the document itself. If
you need to handle updates and inserts differently, you could do that by implementing timestamp fields for
insertion or update, as sketched below.
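One way to implement that, sketched under the assumption that writers stamp a createdAt property when they
first insert a document (the property name is an assumption, not part of the binding):
using System;
using System.Collections.Generic;
using Microsoft.Azure.Documents;
using Microsoft.Extensions.Logging;

public static class InsertOrUpdateSketch
{
    public static void Classify(IReadOnlyList<Document> documents, ILogger log)
    {
        foreach (Document doc in documents)
        {
            // Writers set "createdAt" exactly once, at insert time (assumption).
            DateTime createdAt = doc.GetPropertyValue<DateTime>("createdAt");
            // doc.Timestamp surfaces the server-side _ts of the last write.
            bool isInsert = (doc.Timestamp - createdAt).Duration() < TimeSpan.FromSeconds(1);
            log.LogInformation($"{doc.Id}: {(isInsert ? "insert" : "update")}");
        }
    }
}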
Input
The Azure Cosmos DB input binding uses the SQL API to retrieve one or more Azure Cosmos DB documents and
passes them to the input parameter of the function. The document ID or query parameters can be determined
based on the trigger that invokes the function.
Input - examples
See the language-specific examples that read a single document by specifying an ID value:
C#
C# script (.csx)
F#
Java
JavaScript
Python
Input - C# examples
This section contains the following examples:
Queue trigger, look up ID from JSON
HTTP trigger, look up ID from query string
HTTP trigger, look up ID from route data
HTTP trigger, look up ID from route data, using SqlQuery
HTTP trigger, get multiple docs, using SqlQuery
HTTP trigger, get multiple docs, using DocumentClient
The examples refer to a simple ToDoItem type:
namespace CosmosDBSamplesV2
{
public class ToDoItem
{
public string Id { get; set; }
public string Description { get; set; }
}
}
namespace CosmosDBSamplesV2
{
public class ToDoItemLookup
{
public string ToDoItemId { get; set; }
}
}
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
namespace CosmosDBSamplesV2
{
public static class DocByIdFromJSON
{
[FunctionName("DocByIdFromJSON")]
public static void Run(
[QueueTrigger("todoqueueforlookup")] ToDoItemLookup toDoItemLookup,
[CosmosDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
Id = "{ToDoItemId}")]ToDoItem toDoItem,
ILogger log)
{
log.LogInformation($"C# Queue trigger function processed Id={toDoItemLookup?.ToDoItemId}");
if (toDoItem == null)
{
log.LogInformation($"ToDo item not found");
}
else
{
log.LogInformation($"Found ToDo item, Description={toDoItem.Description}");
}
}
}
}
NOTE
The HTTP query string parameter is case-sensitive.
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
namespace CosmosDBSamplesV2
{
public static class DocByIdFromQueryString
{
[FunctionName("DocByIdFromQueryString")]
public static IActionResult Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
HttpRequest req,
[CosmosDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
Id = "{Query.id}")] ToDoItem toDoItem,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
if (toDoItem == null)
{
log.LogInformation($"ToDo item not found");
}
else
{
log.LogInformation($"Found ToDo item, Description={toDoItem.Description}");
}
return new OkResult();
}
}
}
namespace CosmosDBSamplesV2
{
public static class DocByIdFromRouteData
{
[FunctionName("DocByIdFromRouteData")]
public static IActionResult Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post",
Route = "todoitems/{id}")]HttpRequest req,
[CosmosDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
Id = "{id}")] ToDoItem toDoItem,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
if (toDoItem == null)
{
log.LogInformation($"ToDo item not found");
}
else
{
log.LogInformation($"Found ToDo item, Description={toDoItem.Description}");
}
return new OkResult();
}
}
}
namespace CosmosDBSamplesV2
{
public static class DocByIdFromRouteDataUsingSqlQuery
{
[FunctionName("DocByIdFromRouteDataUsingSqlQuery")]
public static IActionResult Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post",
Route = "todoitems2/{id}")]HttpRequest req,
[CosmosDB("ToDoItems", "Items",
ConnectionStringSetting = "CosmosDBConnection",
SqlQuery = "select * from ToDoItems r where r.id = {id}")]
IEnumerable<ToDoItem> toDoItems,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
namespace CosmosDBSamplesV2
{
public static class DocsBySqlQuery
{
[FunctionName("DocsBySqlQuery")]
public static IActionResult Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
HttpRequest req,
[CosmosDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
SqlQuery = "SELECT top 2 * FROM c order by c._ts desc")]
IEnumerable<ToDoItem> toDoItems,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
foreach (ToDoItem toDoItem in toDoItems)
{
log.LogInformation(toDoItem.Description);
}
return new OkResult();
}
}
}
namespace CosmosDBSamplesV2
{
public static class DocsByUsingDocumentClient
{
[FunctionName("DocsByUsingDocumentClient")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post",
Route = null)]HttpRequest req,
[CosmosDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection")] DocumentClient client,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
while (query.HasMoreResults)
{
foreach (ToDoItem result in await query.ExecuteNextAsync())
{
log.LogInformation(result.Description);
}
}
return new OkResult();
}
}
}
Input - C# script examples
The examples refer to a simple ToDoItem type:
namespace CosmosDBSamplesV2
{
public class ToDoItem
{
public string Id { get; set; }
public string Description { get; set; }
}
}
{
"name": "inputDocument",
"type": "cosmosDB",
"databaseName": "MyDatabase",
"collectionName": "MyCollection",
"id" : "{queueTrigger}",
"partitionKey": "{partition key value}",
"connectionStringSetting": "MyAccount_COSMOSDB",
"direction": "in"
}
Here's the C# script code:
using System;
using System.Net;
using Microsoft.Extensions.Logging;

public static HttpResponseMessage Run(HttpRequestMessage req, ToDoItem toDoItem, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    if (toDoItem == null)
{
log.LogInformation($"ToDo item not found");
}
else
{
log.LogInformation($"Found ToDo item, Description={toDoItem.Description}");
}
return req.CreateResponse(HttpStatusCode.OK);
}
using System.Net;
using Microsoft.Extensions.Logging;

public static HttpResponseMessage Run(HttpRequestMessage req, ToDoItem toDoItem, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    if (toDoItem == null)
{
log.LogInformation($"ToDo item not found");
}
else
{
log.LogInformation($"Found ToDo item, Description={toDoItem.Description}");
}
return req.CreateResponse(HttpStatusCode.OK);
}
using System.Net;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;
using Microsoft.Extensions.Logging;
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, DocumentClient client, ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
if (searchterm == null)
{
return req.CreateResponse(HttpStatusCode.NotFound);
}

    IDocumentQuery<ToDoItem> query = client.CreateDocumentQuery<ToDoItem>(collectionUri)
        .Where(p => p.Description.Contains(searchterm))
        .AsDocumentQuery();

    while (query.HasMoreResults)
{
foreach (ToDoItem result in await query.ExecuteNextAsync())
{
log.LogInformation(result.Description);
}
}
return req.CreateResponse(HttpStatusCode.OK);
}
// Change input document contents using Azure Cosmos DB input binding, using context.bindings.inputDocumentOut
module.exports = function (context) {
context.bindings.inputDocumentOut = context.bindings.inputDocumentIn;
context.bindings.inputDocumentOut.text = "This was updated!";
context.done();
};
import logging
import azure.functions as func


def main(queuemsg: func.QueueMessage, todoitems: func.DocumentList) -> str:
    if todoitems:
        logging.info('Found ToDo item, Description=%s', todoitems[0]['description'])
    return 'OK'
Input - F# example
Here's the binding data in the function.json file:
{
"name": "inputDocument",
"type": "cosmosDB",
"databaseName": "MyDatabase",
"collectionName": "MyCollection",
"id" : "{queueTrigger}",
"connectionStringSetting": "MyAccount_COSMOSDB",
"direction": "in"
}
This example requires a project.json file that specifies the FSharp.Interop.Dynamic and Dynamitey NuGet
dependencies:
{
"frameworks": {
"net46": {
"dependencies": {
"Dynamitey": "1.0.2",
"FSharp.Interop.Dynamic": "3.0.0"
}
}
}
}
@Override
public String toString() {
return "ToDoItem={id=" + id + ",description=" + description + "}";
}
}
@FunctionName("DocByIdFromQueryString")
public HttpResponseMessage run(
@HttpTrigger(name = "req",
methods = {HttpMethod.GET, HttpMethod.POST},
authLevel = AuthorizationLevel.ANONYMOUS)
HttpRequestMessage<Optional<String>> request,
@CosmosDBInput(name = "database",
databaseName = "ToDoList",
collectionName = "Items",
id = "{Query.id}",
partitionKey = "{Query.id}",
connectionStringSetting = "Cosmos_DB_Connection_String")
Optional<String> item,
final ExecutionContext context) {
// Item list
context.getLogger().info("Parameters are: " + request.getQueryParameters());
context.getLogger().info("String from the database is " + (item.isPresent() ? item.get() : null));
In the Java functions runtime library, use the @CosmosDBInput annotation on function parameters whose value
would come from Cosmos DB. This annotation can be used with native Java types, POJOs, or nullable values
using Optional.
HTTP trigger, look up ID from query string - POJO parameter (Java)
The following example shows a Java function that retrieves a single document. The function is triggered by an
HTTP request that uses a query string to specify the ID to look up. That ID is used to retrieve a document from the
specified database and collection. The document is then converted to an instance of the ToDoItem POJO
previously created, and passed as an argument to the function.
public class DocByIdFromQueryStringPojo {
@FunctionName("DocByIdFromQueryStringPojo")
public HttpResponseMessage run(
@HttpTrigger(name = "req",
methods = {HttpMethod.GET, HttpMethod.POST},
authLevel = AuthorizationLevel.ANONYMOUS)
HttpRequestMessage<Optional<String>> request,
@CosmosDBInput(name = "database",
databaseName = "ToDoList",
collectionName = "Items",
id = "{Query.id}",
partitionKey = "{Query.id}",
connectionStringSetting = "Cosmos_DB_Connection_String")
ToDoItem item,
final ExecutionContext context) {
// Item list
context.getLogger().info("Parameters are: " + request.getQueryParameters());
context.getLogger().info("Item from the database is " + item);
@FunctionName("DocByIdFromRoute")
public HttpResponseMessage run(
@HttpTrigger(name = "req",
methods = {HttpMethod.GET, HttpMethod.POST},
authLevel = AuthorizationLevel.ANONYMOUS,
route = "todoitems/{id}")
HttpRequestMessage<Optional<String>> request,
@CosmosDBInput(name = "database",
databaseName = "ToDoList",
collectionName = "Items",
id = "{id}",
partitionKey = "{id}",
connectionStringSetting = "Cosmos_DB_Connection_String")
Optional<String> item,
final ExecutionContext context) {
// Item list
context.getLogger().info("Parameters are: " + request.getQueryParameters());
context.getLogger().info("String from the database is " + (item.isPresent() ? item.get() : null));
@FunctionName("DocByIdFromRouteSqlQuery")
public HttpResponseMessage run(
@HttpTrigger(name = "req",
methods = {HttpMethod.GET, HttpMethod.POST},
authLevel = AuthorizationLevel.ANONYMOUS,
route = "todoitems2/{id}")
HttpRequestMessage<Optional<String>> request,
@CosmosDBInput(name = "database",
databaseName = "ToDoList",
collectionName = "Items",
sqlQuery = "select * from Items r where r.id = {id}",
connectionStringSetting = "Cosmos_DB_Connection_String")
ToDoItem[] item,
final ExecutionContext context) {
// Item list
context.getLogger().info("Parameters are: " + request.getQueryParameters());
context.getLogger().info("Items from the database are " + item);
HTTP trigger, get multiple docs from route data, using SqlQuery (Java)
The following example shows a Java function that retrieves multiple documents. The function is triggered by an
HTTP request that uses a route parameter desc to specify the string to search for in the description field. The search
term is used to retrieve a collection of documents from the specified database and collection, converting the result
set to a ToDoItem[] and passing it as an argument to the function.
public class DocsFromRouteSqlQuery {
@FunctionName("DocsFromRouteSqlQuery")
public HttpResponseMessage run(
@HttpTrigger(name = "req",
methods = {HttpMethod.GET},
authLevel = AuthorizationLevel.ANONYMOUS,
route = "todoitems3/{desc}")
HttpRequestMessage<Optional<String>> request,
@CosmosDBInput(name = "database",
databaseName = "ToDoList",
collectionName = "Items",
sqlQuery = "select * from Items r where contains(r.description, {desc})",
connectionStringSetting = "Cosmos_DB_Connection_String")
ToDoItem[] items,
final ExecutionContext context) {
// Item list
context.getLogger().info("Parameters are: " + request.getQueryParameters());
context.getLogger().info("Number of items from the database is " + (items == null ? 0 :
items.length));
Input - attributes
In C# class libraries, use the CosmosDB attribute.
The attribute's constructor takes the database name and collection name. For information about those settings and
other properties that you can configure, see the following configuration section.
Input - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
CosmosDB attribute.
When you're developing locally, app settings go into the local.settings.json file.
Input - usage
In C# and F# functions, when the function exits successfully, any changes made to the input document via named
input parameters are automatically persisted.
In JavaScript functions, updates are not made automatically upon function exit. Instead, use
context.bindings.<documentName>In and context.bindings.<documentName>Out to make updates. See the JavaScript
example.
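A minimal sketch of the C# behavior, reusing the ToDoItem lookup binding from the examples above (the
appended text is illustrative):
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class TouchDocSketch
{
    [FunctionName("TouchDocSketch")]
    public static void Run(
        [QueueTrigger("todoqueueforlookup")] ToDoItemLookup toDoItemLookup,
        [CosmosDB(
            databaseName: "ToDoItems",
            collectionName: "Items",
            ConnectionStringSetting = "CosmosDBConnection",
            Id = "{ToDoItemId}")] ToDoItem toDoItem,
        ILogger log)
    {
        if (toDoItem != null)
        {
            // This change is written back to the database when the function
            // exits successfully; no explicit save call is needed.
            toDoItem.Description += " (reviewed)";
        }
    }
}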
Output
The Azure Cosmos DB output binding lets you write a new document to an Azure Cosmos DB database using the
SQL API.
Output - examples
See the language-specific examples:
C#
C# script (.csx)
F#
Java
JavaScript
See also the input example that uses DocumentClient.
Output - C# examples
This section contains the following examples:
Queue trigger, write one doc
Queue trigger, write docs using IAsyncCollector
The examples refer to a simple ToDoItem type:
namespace CosmosDBSamplesV2
{
public class ToDoItem
{
public string Id { get; set; }
public string Description { get; set; }
}
}
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
using System;
namespace CosmosDBSamplesV2
{
public static class WriteOneDoc
{
[FunctionName("WriteOneDoc")]
public static void Run(
[QueueTrigger("todoqueueforwrite")] string queueMessage,
[CosmosDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection")]out dynamic document,
ILogger log)
{
document = new { Description = queueMessage, id = Guid.NewGuid() };
        }
    }
}
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using System.Threading.Tasks;
namespace CosmosDBSamplesV2
{
public static class WriteDocsIAsyncCollector
{
[FunctionName("WriteDocsIAsyncCollector")]
public static async Task Run(
[QueueTrigger("todoqueueforwritemulti")] ToDoItem[] toDoItemsIn,
[CosmosDB(
databaseName: "ToDoItems",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection")]
IAsyncCollector<ToDoItem> toDoItemsOut,
ILogger log)
{
log.LogInformation($"C# Queue trigger function processed {toDoItemsIn?.Length} items");
Output - C# script examples
The following example shows an Azure Cosmos DB output binding in a function.json file and a C# script
function that uses the binding. The function uses a queue input binding for a queue that receives JSON in the
following format:
{
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
The function creates Azure Cosmos DB documents in the following format for each record:
{
"id": "John Henry-123456",
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
#r "Newtonsoft.Json"
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json.Linq;
using Microsoft.Extensions.Logging;
public static void Run(string myQueueItem, out object employeeDocument, ILogger log)
{
log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
employeeDocument = new {
id = employee.name + "-" + employee.employeeId,
name = employee.name,
employeeId = employee.employeeId,
address = employee.address
};
}
Here's a C# script example that writes multiple documents by using IAsyncCollector. It refers to a simple
ToDoItem type:
namespace CosmosDBSamplesV2
{
public class ToDoItem
{
public string Id { get; set; }
public string Description { get; set; }
}
}
using System;
using Microsoft.Extensions.Logging;
public static async Task Run(ToDoItem[] toDoItemsIn, IAsyncCollector<ToDoItem> toDoItemsOut, ILogger log)
{
log.LogInformation($"C# Queue trigger function processed {toDoItemsIn?.Length} items");
Output - JavaScript example
The following example shows an Azure Cosmos DB output binding in a function.json file and a JavaScript
function that uses the binding. The function uses a queue input binding for a queue that receives JSON in the
following format:
{
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
The function creates Azure Cosmos DB documents in the following format for each record:
{
"id": "John Henry-123456",
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
Here's the JavaScript code:
module.exports = function (context) {
    context.bindings.employeeDocument = JSON.stringify({
id: context.bindings.myQueueItem.name + "-" + context.bindings.myQueueItem.employeeId,
name: context.bindings.myQueueItem.name,
employeeId: context.bindings.myQueueItem.employeeId,
address: context.bindings.myQueueItem.address
});
context.done();
};
Output - F# example
The following example shows an Azure Cosmos DB output binding in a function.json file and an F# function
that uses the binding. The function uses a queue input binding for a queue that receives JSON in the following
format:
{
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
The function creates Azure Cosmos DB documents in the following format for each record:
{
"id": "John Henry-123456",
"name": "John Henry",
"employeeId": "123456",
"address": "A town nearby"
}
{
"name": "employeeDocument",
"type": "cosmosDB",
"databaseName": "MyDatabase",
"collectionName": "MyCollection",
"createIfNotExists": true,
"connectionStringSetting": "MyAccount_COSMOSDB",
"direction": "out"
}
The configuration section explains these properties.
Here's the F# code:
open FSharp.Interop.Dynamic
open Newtonsoft.Json
open Newtonsoft.Json.Linq
open Microsoft.Extensions.Logging

type Employee = {
    id: string
    name: string
    employeeId: string
    address: string
}

let Run(myQueueItem: string, employeeDocument: byref&lt;obj&gt;, log: ILogger) =
    log.LogInformation(sprintf "F# Queue trigger function processed: %s" myQueueItem)
    let employee = JObject.Parse(myQueueItem)
    employeeDocument &lt;-
        { id = sprintf "%s-%s" employee?name employee?employeeId
          name = employee?name
          employeeId = employee?employeeId
          address = employee?address }
This example requires a project.json file that specifies the FSharp.Interop.Dynamic and Dynamitey NuGet
dependencies:
{
"frameworks": {
"net46": {
"dependencies": {
"Dynamitey": "1.0.2",
"FSharp.Interop.Dynamic": "3.0.0"
}
}
}
}
HTTP trigger, save one document to database via return value (Java)
The following example shows a Java function whose signature is annotated with @CosmosDBOutput and has a
return value of type String. The JSON document returned by the function is automatically written to the
corresponding Cosmos DB collection.
@FunctionName("WriteOneDoc")
@CosmosDBOutput(name = "database",
databaseName = "ToDoList",
collectionName = "Items",
connectionStringSetting = "Cosmos_DB_Connection_String")
public String run(
@HttpTrigger(name = "req",
methods = {HttpMethod.GET, HttpMethod.POST},
authLevel = AuthorizationLevel.ANONYMOUS)
HttpRequestMessage<Optional<String>> request,
final ExecutionContext context) {
// Item list
context.getLogger().info("Parameters are: " + request.getQueryParameters());
// Parse the item description from the "desc" query parameter or the request body
final String query = request.getQueryParameters().get("desc");
final String name = request.getBody().orElse(query);
// Generate random ID
final int id = Math.abs(new Random().nextInt());
// Generate document
final String jsonDocument = "{\"id\":\"" + id + "\", " +
"\"description\": \"" + name + "\"}";
return jsonDocument;
}
A related example (abbreviated here) saves multiple documents by building a list of ToDoItem objects:
// Item list
context.getLogger().info("Parameters are: " + request.getQueryParameters());
// Generate documents
List<ToDoItem> items = new ArrayList<>();
// Create ToDoItem
ToDoItem item = new ToDoItem(String.valueOf(id), name);
items.add(item);
}
In the Java functions runtime library, use the @CosmosDBOutput annotation on parameters that will be written to
Cosmos DB. The annotation parameter type should be OutputBinding<T> , where T is either a native Java type or a
POJO.
Output - attributes
In C# class libraries, use the CosmosDB attribute.
The attribute's constructor takes the database name and collection name. For information about those settings and
other properties that you can configure, see Output - configuration. Here's a CosmosDB attribute example in a
method signature:
[FunctionName("QueueToDocDB")]
public static void Run(
[QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] string myQueueItem,
[CosmosDB("ToDoList", "Items", Id = "id", ConnectionStringSetting = "myCosmosDB")] out dynamic
document)
{
...
}
Output - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
CosmosDB attribute.
When you're developing locally, app settings go into the local.settings.json file.
Output - usage
By default, when you write to the output parameter in your function, a document is created in your database. This
document has an automatically generated GUID as the document ID. You can specify the document ID of the
output document by specifying the id property in the JSON object passed to the output parameter.
NOTE
When you specify the ID of an existing document, it gets overwritten by the new output document.
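A sketch of both behaviors side by side with the 2.x CosmosDB attribute (the fixed id value shown is
illustrative, not from the article):
using System;
using Microsoft.Azure.WebJobs;

public static class DocumentIdSketch
{
    [FunctionName("DocumentIdSketch")]
    public static void Run(
        [QueueTrigger("todoqueueforwrite")] string queueMessage,
        [CosmosDB("ToDoItems", "Items",
            ConnectionStringSetting = "CosmosDBConnection")] out dynamic document)
    {
        // With an explicit id, an existing document with the same ID is replaced;
        // drop the id property and the runtime generates a GUID for you.
        document = new { id = "todo-123", Description = queueMessage };
    }
}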
host.json settings
This section describes the global configuration settings available for this binding in version 2.x. For more
information about global configuration settings in version 2.x, see host.json reference for Azure Functions version
2.x.
{
"version": "2.0",
"extensions": {
"cosmosDB": {
"connectionMode": "Gateway",
"protocol": "Https",
"leaseOptions": {
"leasePrefix": "prefix1"
}
}
}
}
Next steps
Learn more about serverless database computing with Cosmos DB
Learn more about Azure functions triggers and bindings
Event Grid trigger for Azure Functions
3/15/2019 • 17 minutes to read
This article explains how to handle Event Grid events in Azure Functions.
Event Grid is an Azure service that sends HTTP requests to notify you about events that happen in publishers. A
publisher is the service or resource that originates the event. For example, an Azure blob storage account is a
publisher, and a blob upload or deletion is an event. Some Azure services have built-in support for publishing
events to Event Grid.
Event handlers receive and process events. Azure Functions is one of several Azure services that have built-in
support for handling Event Grid events. In this article, you learn how to use an Event Grid trigger to invoke a
function when an event is received from Event Grid.
If you prefer, you can use an HTTP trigger to handle Event Grid Events; see Use an HTTP trigger as an Event
Grid trigger later in this article. Currently, you can't use an Event Grid trigger for an Azure Functions app when
the event is delivered in the CloudEvents schema. Instead, use an HTTP trigger.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
DEVELOPMENT ENVIRONMENT                                          TO ADD SUPPORT IN FUNCTIONS 2.X
Local development - C# script, JavaScript, F#, Java and Python   Register the extension
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
Example
See the language-specific example for an Event Grid trigger:
C#
C# script (.csx)
Java
JavaScript
Python
For an HTTP trigger example, see How to use HTTP trigger later in this article.
C# (2.x)
The following example shows a Functions 2.x C# function that binds to EventGridEvent:
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
namespace Company.Function
{
public static class EventGridTriggerCSharp
{
[FunctionName("EventGridTest")]
public static void EventGridTest([EventGridTrigger]EventGridEvent eventGridEvent, ILogger log)
{
log.LogInformation(eventGridEvent.Data.ToString());
}
}
}
C# (1.x)
The following example shows a Functions 1.x C# function that binds to JObject:
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
namespace Company.Function
{
public static class EventGridTriggerCSharp
{
[FunctionName("EventGridTriggerCSharp")]
public static void Run([EventGridTrigger]JObject eventGridEvent, ILogger log)
{
log.LogInformation(eventGridEvent.ToString(Formatting.Indented));
}
}
}
C# script example
The following example shows a trigger binding in a function.json file and a C# script function that uses the
binding.
Here's the binding data in the function.json file:
{
"bindings": [
{
"type": "eventGridTrigger",
"name": "eventGridEvent",
"direction": "in"
}
],
"disabled": false
}
#r "Microsoft.Azure.EventGrid"
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public static void Run(EventGridEvent eventGridEvent, ILogger log)
{
    log.LogInformation(eventGridEvent.Data.ToString());
}
JavaScript example
The following example shows a trigger binding in a function.json file and a JavaScript function that uses the
binding.
Here's the binding data in the function.json file:
{
"bindings": [
{
"type": "eventGridTrigger",
"name": "eventGridEvent",
"direction": "in"
}
],
"disabled": false
}
Here's the JavaScript code:
module.exports = function (context, eventGridEvent) {
    context.log("Subject: " + eventGridEvent.subject);
    context.log("Time: " + eventGridEvent.eventTime);
    context.log("Data: " + JSON.stringify(eventGridEvent.data));
    context.done();
};
Python example
The following example shows a trigger binding in a function.json file and a Python function that uses the
binding.
Here's the binding data in the function.json file:
{
"bindings": [
{
"type": "eventGridTrigger",
"name": "event",
"direction": "in"
}
],
"disabled": false,
"scriptFile": "__init__.py"
}
Here's the Python code:
import json
import logging

import azure.functions as func


def main(event: func.EventGridEvent):
    result = json.dumps({
        'id': event.id,
        'data': event.get_json(),
        'topic': event.topic,
        'subject': event.subject,
        'event_type': event.event_type,
    })
    logging.info('Python EventGrid trigger processed an event: %s', result)
Java example
The following example shows a trigger binding in a function.json file and a Java function that uses the binding.
Here's the binding data in the function.json file:
{
"bindings": [
{
"type": "eventGridTrigger",
"name": "eventGridEvent",
"direction": "in"
}
]
}
@FunctionName("eventGridMonitorString")
public void logEvent(
@EventGridTrigger(
name = "event"
)
String content,
final ExecutionContext context) {
// log
context.getLogger().info("Event content: " + content);
}
The following example binds the event payload to an EventSchema POJO:
import java.util.Date;
import java.util.Map;

public class EventSchema {
    public String topic;
    public String subject;
    public String eventType;
    public Date eventTime;
    public String id;
    public String dataVersion;
    public String metadataVersion;
    public Map<String, Object> data;
}
Upon arrival, the event's JSON payload is de-serialized into the EventSchema POJO for use by the function. This
allows the function to access the event's properties in an object-oriented way.
@FunctionName("eventGridMonitor")
public void logEvent(
@EventGridTrigger(
name = "event"
)
EventSchema event,
final ExecutionContext context) {
// log
context.getLogger().info("Event content: ");
context.getLogger().info("Subject: " + event.subject);
context.getLogger().info("Time: " + event.eventTime); // automatically converted to Date by the
runtime
context.getLogger().info("Id: " + event.id);
context.getLogger().info("Data: " + event.data);
}
In the Java functions runtime library, use the EventGridTrigger annotation on parameters whose value would
come from EventGrid. Parameters with these annotations cause the function to run when an event arrives. This
annotation can be used with native Java types, POJOs, or nullable values using Optional<T> .
Attributes
In C# class libraries, use the EventGridTrigger attribute.
Here's an EventGridTrigger attribute in a method signature:
[FunctionName("EventGridTest")]
public static void EventGridTest([EventGridTrigger] JObject eventGridEvent, ILogger log)
{
...
}
Configuration
The following table explains the binding configuration properties that you set in the function.json file. There are
no constructor parameters or properties to set in the EventGridTrigger attribute.
name      Required - the variable name used in function code for the parameter that receives the event data.
Usage
For C# and F# functions in Azure Functions 1.x, you can use the following parameter types for the Event Grid
trigger:
JObject
string
For C# and F# functions in Azure Functions 2.x, you also have the option to use the following parameter type for
the Event Grid trigger:
Microsoft.Azure.EventGrid.Models.EventGridEvent - Defines properties for the fields common to all event
types.
NOTE
In Functions v1 if you try to bind to Microsoft.Azure.WebJobs.Extensions.EventGrid.EventGridEvent , the compiler
will display a "deprecated" message and advise you to use Microsoft.Azure.EventGrid.Models.EventGridEvent
instead. To use the newer type, reference the Microsoft.Azure.EventGrid NuGet package and fully qualify the
EventGridEvent type name by prefixing it with Microsoft.Azure.EventGrid.Models . For information about how to
reference NuGet packages in a C# script function, see Using NuGet packages.
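A sketch of the fully qualified form for Functions 1.x (assuming the Microsoft.Azure.EventGrid package is
referenced; the function name is illustrative):
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Extensions.Logging;

public static class EventGridV1Sketch
{
    [FunctionName("EventGridV1Sketch")]
    public static void Run(
        // Fully qualified to pick up the newer model type instead of the
        // deprecated Microsoft.Azure.WebJobs.Extensions.EventGrid.EventGridEvent.
        [EventGridTrigger] Microsoft.Azure.EventGrid.Models.EventGridEvent eventGridEvent,
        ILogger log)
    {
        log.LogInformation(eventGridEvent.Subject);
    }
}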
For JavaScript functions, the parameter named by the function.json name property has a reference to the event
object.
Event schema
Data for an Event Grid event is received as a JSON object in the body of an HTTP request. The JSON looks
similar to the following example:
[{
"topic":
"/subscriptions/{subscriptionid}/resourceGroups/eg0122/providers/Microsoft.Storage/storageAccounts/egblobsto
re",
"subject": "/blobServices/default/containers/{containername}/blobs/blobname.jpg",
"eventType": "Microsoft.Storage.BlobCreated",
"eventTime": "2018-01-23T17:02:19.6069787Z",
"id": "{guid}",
"data": {
"api": "PutBlockList",
"clientRequestId": "{guid}",
"requestId": "{guid}",
"eTag": "0x8D562831044DDD0",
"contentType": "application/octet-stream",
"contentLength": 2248,
"blobType": "BlockBlob",
"url": "https://egblobstore.blob.core.windows.net/{containername}/blobname.jpg",
"sequencer": "000000000000272D000000000003D60F",
"storageDiagnostics": {
"batchId": "{guid}"
}
},
"dataVersion": "",
"metadataVersion": "1"
}]
The example shown is an array of one element. Event Grid always sends an array and may send more than one
event in the array. The runtime invokes your function once for each array element.
The top-level properties in the event JSON data are the same among all event types, while the contents of the
data property are specific to each event type. The example shown is for a blob storage event.
For explanations of the common and event-specific properties, see Event properties in the Event Grid
documentation.
The EventGridEvent type defines only the top-level properties; the Data property is a JObject.
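Since Data is a JObject, one way to get typed access is to convert it to an event-specific model class; a sketch
for the blob-created event above (StorageBlobCreatedEventData comes from Microsoft.Azure.EventGrid.Models;
the function name is illustrative):
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;

public static class BlobCreatedSketch
{
    [FunctionName("BlobCreatedSketch")]
    public static void Run([EventGridTrigger] EventGridEvent eventGridEvent, ILogger log)
    {
        if (eventGridEvent.EventType == "Microsoft.Storage.BlobCreated")
        {
            // Data is a JObject; convert it to the event-specific model type.
            var data = ((JObject)eventGridEvent.Data).ToObject<StorageBlobCreatedEventData>();
            log.LogInformation($"Blob created: {data.Url}");
        }
    }
}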
Create a subscription
To start receiving Event Grid HTTP requests, create an Event Grid subscription that specifies the endpoint URL
that invokes the function.
Azure portal
For functions that you develop in the Azure portal with the Event Grid trigger, select Add Event Grid
subscription.
When you select this link, the portal opens the Create Event Subscription page with the endpoint URL
prefilled.
For more information about how to create subscriptions by using the Azure portal, see Create custom event -
Azure portal in the Event Grid documentation.
Azure CLI
To create a subscription by using the Azure CLI, use the az eventgrid event-subscription create command.
The command requires the endpoint URL that invokes the function. The following example shows the version-
specific URL pattern:
Version 2.x runtime
https://{functionappname}.azurewebsites.net/runtime/webhooks/eventgrid?functionName={functionname}&code=
{systemkey}
The system key is an authorization key that has to be included in the endpoint URL for an Event Grid trigger. The
following section explains how to get the system key.
Here's an example that subscribes to a blob storage account (with a placeholder for the system key):
Version 2.x runtime
For more information about how to create a subscription, see the blob storage quickstart or the other Event Grid
quickstarts.
Get the system key
You can get the system key by using the following API (HTTP GET):
Version 2.x runtime
http://{functionappname}.azurewebsites.net/admin/host/systemkeys/eventgrid_extension?code={masterkey}
Version 1.x runtime
http://{functionappname}.azurewebsites.net/admin/host/systemkeys/eventgridextensionconfig_extension?code={masterkey}
This is an admin API, so it requires your function app master key. Don't confuse the system key (for invoking an
Event Grid trigger function) with the master key (for performing administrative tasks on the function app). When
you subscribe to an Event Grid topic, be sure to use the system key.
Here's an example of the response that provides the system key:
{
"name": "eventgridextensionconfig_extension",
"value": "{the system key for the function}",
"links": [
{
"rel": "self",
"href": "{the URL for the function, without the system key}"
}
]
}
You can get the master key for your function app from the Function app settings tab in the portal.
IMPORTANT
The master key provides administrator access to your function app. Don't share this key with third parties or distribute it
in native client applications.
For more information, see Authorization keys in the HTTP trigger reference article.
Alternatively, you can send an HTTP PUT to specify the key value yourself.
The deployment may take a few minutes to complete. After the deployment has succeeded, view your web app
to make sure it's running. In a web browser, navigate to: https://<your-site-name>.azurewebsites.net
You see the site but no events have been posted to it yet.
Create an Event Grid subscription
Create an Event Grid subscription of the type you want to test, and give it the URL from your web app as the
endpoint for event notification. The endpoint for your web app must include the suffix /api/updates/ . So, the
full URL is https://<your-site-name>.azurewebsites.net/api/updates
For information about how to create subscriptions by using the Azure portal, see Create custom event - Azure
portal in the Event Grid documentation.
Generate a request
Trigger an event that will generate HTTP traffic to your web app endpoint. For example, if you created a blob
storage subscription, upload or delete a blob. When a request shows up in your web app, copy the request body.
The subscription validation request will be received first; ignore any validation requests, and copy the event
request.
Then post the copied request to the URL of your local Event Grid trigger function, using the following pattern.
Version 2.x runtime
http://localhost:7071/runtime/webhooks/eventgrid?functionName={FUNCTION_NAME}
Version 1.x runtime
http://localhost:7071/admin/extensions/EventGridExtensionConfig?functionName={FUNCTION_NAME}
The functionName parameter must be the name specified in the FunctionName attribute.
The following screenshots show the headers and request body in Postman:
The Event Grid trigger function executes and shows logs similar to the following example:
Local testing with ngrok
Another way to test an Event Grid trigger locally is to automate the HTTP connection between the Internet and
your development computer. You can do that with an open-source tool named ngrok:
1. Create an ngrok endpoint.
2. Run the Event Grid trigger function.
3. Create an Event Grid subscription that sends events to the ngrok endpoint.
4. Trigger an event.
When you're done testing, you can use the same subscription for production by updating the endpoint. Use the
az eventgrid event-subscription update Azure CLI command.
Create an ngrok endpoint
Download ngrok.exe from ngrok, and run it with the following command:
ngrok http -host-header=localhost 7071
The -host-header parameter is needed because the functions runtime expects requests from localhost when it
runs on localhost. 7071 is the default port number when the runtime runs locally.
The command creates output like the following:
You'll use the https://{subdomain}.ngrok.io URL for your Event Grid subscription.
Run the Event Grid trigger function
The ngrok URL doesn't get special handling by Event Grid, so your function must be running locally when the
subscription is created. If it isn't, the validation response doesn't get sent and the subscription creation fails.
Create a subscription
Create an Event Grid subscription of the type you want to test, and give it your ngrok endpoint.
Use this endpoint pattern for Functions 2.x:
https://{SUBDOMAIN}.ngrok.io/runtime/webhooks/eventgrid?functionName={FUNCTION_NAME}
Use this endpoint pattern for Functions 1.x:
https://{SUBDOMAIN}.ngrok.io/admin/extensions/EventGridExtensionConfig?functionName={FUNCTION_NAME}
The {FUNCTION_NAME} parameter must be the name specified in the FunctionName attribute.
Here's an example using the Azure CLI:
For information about how to create a subscription, see Create a subscription earlier in this article.
Trigger an event
Trigger an event that will generate HTTP traffic to your ngrok endpoint. For example, if you created a blob
storage subscription, upload or delete a blob.
The Event Grid trigger function executes and shows logs similar to the following example:
[FunctionName("HttpTrigger")]
public static async Task<HttpResponseMessage> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "post")]HttpRequestMessage req,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
// If the request is for subscription validation, send back the validation code.
if (messages.Count > 0 && string.Equals((string)messages[0]["eventType"],
"Microsoft.EventGrid.SubscriptionValidationEvent",
System.StringComparison.OrdinalIgnoreCase))
{
log.LogInformation("Validate request received");
return req.CreateResponse<object>(new
{
validationResponse = messages[0]["data"]["validationCode"]
});
}
// The request is not for subscription validation, so it's for one or more events.
foreach (JObject message in messages)
{
// Handle one event.
EventGridEvent eventGridEvent = message.ToObject<EventGridEvent>();
log.LogInformation($"Subject: {eventGridEvent.Subject}");
log.LogInformation($"Time: {eventGridEvent.EventTime}");
log.LogInformation($"Event data: {eventGridEvent.Data.ToString()}");
}
return req.CreateResponse(HttpStatusCode.OK);
}
The following sample JavaScript code for an HTTP trigger simulates Event Grid trigger behavior. Use this
example for events delivered in the Event Grid schema.
module.exports = function (context, req) {
context.log('JavaScript HTTP trigger function processed a request.');
Your event-handling code goes inside the loop through the messages array.
CloudEvents schema
The following sample C# code for an HTTP trigger simulates Event Grid trigger behavior. Use this example for
events delivered in the CloudEvents schema.
[FunctionName("HttpTrigger")]
public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post",
Route = null)]HttpRequestMessage req, ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
if (message.Type == JTokenType.Array)
{
// If the request is for subscription validation, send back the validation code.
if (string.Equals((string)message[0]["eventType"],
"Microsoft.EventGrid.SubscriptionValidationEvent",
System.StringComparison.OrdinalIgnoreCase))
{
log.LogInformation("Validate request received");
return req.CreateResponse<object>(new
{
validationResponse = message[0]["data"]["validationCode"]
});
}
}
else
{
// The request is not for subscription validation, so it's for an event.
// CloudEvents schema delivers one event at a time.
log.LogInformation($"Source: {message["source"]}");
log.LogInformation($"Time: {message["eventTime"]}");
log.LogInformation($"Event data: {message["data"].ToString()}");
}
return req.CreateResponse(HttpStatusCode.OK);
}
The following sample JavaScript code for an HTTP trigger simulates Event Grid trigger behavior. Use this
example for events delivered in the CloudEvents schema.
Next steps
Learn more about Azure functions triggers and bindings
Learn more about Event Grid
Azure Event Hubs bindings for Azure Functions
3/5/2019 • 17 minutes to read
This article explains how to work with Azure Event Hubs bindings for Azure Functions. Azure Functions
supports trigger and output bindings for Event Hubs.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
DEVELOPMENT ENVIRONMENT                                          TO ADD SUPPORT IN FUNCTIONS 2.X
Local development - C# script, JavaScript, F#, Java and Python   Register the extension
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
Trigger
Use the function trigger to respond to an event sent to an event hub event stream. You must have read access to
the underlying event hub to set up the trigger. When the function is triggered, the message passed to the
function is typed as a string.
Trigger - scaling
Each instance of an event triggered function is backed by a single EventProcessorHost instance. The trigger
(powered by Event Hubs) ensures that only one EventProcessorHost instance can get a lease on a given
partition.
For example, consider an Event Hub as follows:
10 partitions
1,000 events distributed evenly across all partitions, with 100 messages in each partition
When your function is first enabled, there is only one instance of the function. Let's call the first function instance
Function_0 . The Function_0 function has a single instance of EventProcessorHost that holds a lease on all ten
partitions. This instance is reading events from partitions 0-9. From this point forward, one of the following
happens:
New function instances are not needed: Function_0 is able to process all 1,000 events before the
Functions scaling logic take effect. In this case, all 1,000 messages are processed by Function_0 .
An additional function instance is added: If the Functions scaling logic determines that Function_0
has more messages than it can process, a new function app instance ( Function_1 ) is created. This new
function also has an associated instance of EventProcessorHost. As the underlying Event Hubs service detects
that a new host instance is trying to read messages, it load-balances the partitions across its host instances.
For example, partitions 0-4 may be assigned to Function_0 and partitions 5-9 to Function_1.
N more function instances are added: If the Functions scaling logic determines that both Function_0
and Function_1 have more messages than they can process, new Function_N function app instances are
created. Apps are created to the point where N is greater than the number of event hub partitions. In our
example, Event Hubs again load-balances the partitions, in this case across the instances Function_0 through
Function_9.
When Functions scales, N is a number greater than the number of event hub partitions. This is done
to ensure EventProcessorHost instances are available to obtain locks on partitions as they become available
from other instances. You are only charged for the resources used when the function instance executes. In other
words, you are not charged for this over-provisioning.
When all function execution completes (with or without errors), checkpoints are added to the associated storage
account. When checkpointing succeeds, the 1,000 messages are never retrieved again.
Trigger - example
See the language-specific example:
C#
C# script (.csx)
F#
Java
JavaScript
Python
Trigger - C# example
The following example shows a C# function that logs the message body of the event hub trigger.
[FunctionName("EventHubTriggerCSharp")]
public static void Run([EventHubTrigger("samples-workitems", Connection = "EventHubConnectionAppSetting")]
string myEventHubMessage, ILogger log)
{
log.LogInformation($"C# function triggered to process a message: {myEventHubMessage}");
}
To get access to event metadata in function code, bind to an EventData object (requires a using statement for
Microsoft.Azure.EventHubs ). You can also access the same properties by using binding expressions in the
method signature. The following example shows both ways to get the same data:
[FunctionName("EventHubTriggerCSharp")]
public static void Run(
[EventHubTrigger("samples-workitems", Connection = "EventHubConnectionAppSetting")] EventData
myEventHubMessage,
DateTime enqueuedTimeUtc,
Int64 sequenceNumber,
string offset,
ILogger log)
{
log.LogInformation($"Event: {Encoding.UTF8.GetString(myEventHubMessage.Body)}");
// Metadata accessed by binding to EventData
log.LogInformation($"EnqueuedTimeUtc={myEventHubMessage.SystemProperties.EnqueuedTimeUtc}");
log.LogInformation($"SequenceNumber={myEventHubMessage.SystemProperties.SequenceNumber}");
log.LogInformation($"Offset={myEventHubMessage.SystemProperties.Offset}");
// Metadata accessed by using binding expressions in method parameters
log.LogInformation($"EnqueuedTimeUtc={enqueuedTimeUtc}");
log.LogInformation($"SequenceNumber={sequenceNumber}");
log.LogInformation($"Offset={offset}");
}
NOTE
When receiving events in a batch, you cannot bind to method parameters as in the above example with
DateTime enqueuedTimeUtc; instead, you must receive these values from each EventData object.
[FunctionName("EventHubTriggerCSharp")]
public static void Run([EventHubTrigger("samples-workitems", Connection = "EventHubConnectionAppSetting")]
EventData[] eventHubMessages, ILogger log)
{
foreach (var message in eventHubMessages)
{
log.LogInformation($"C# function triggered to process a message:
{Encoding.UTF8.GetString(message.Body)}");
log.LogInformation($"EnqueuedTimeUtc={message.SystemProperties.EnqueuedTimeUtc}");
}
}
Trigger - C# script example
The following example shows an event hub trigger binding in a function.json file and a C# script function that
uses the binding. The function logs the message body of the event hub trigger.
The following examples show Event Hubs binding data in the function.json file.
Version 2.x
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"eventHubName": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Version 1.x
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"path": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Here's the C# script code:
using System;

public static void Run(string myEventHubMessage, ILogger log)
{
    log.LogInformation($"C# function triggered to process a message: {myEventHubMessage}");
}
To get access to event metadata in function code, bind to an EventData object (requires a using statement for
Microsoft.Azure.EventHubs ). You can also access the same properties by using binding expressions in the
method signature. The following example shows both ways to get the same data:
#r "Microsoft.Azure.EventHubs"
using System.Text;
using System;
using Microsoft.ServiceBus.Messaging;
using Microsoft.Azure.EventHubs;
Trigger - F# example
The following example shows an event hub trigger binding in a function.json file and an F# function that uses the
binding. The function logs the message body of the event hub trigger.
The following examples show Event Hubs binding data in the function.json file.
Version 2.x
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"eventHubName": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Version 1.x
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"path": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Here's the F# code:
let Run(myEventHubMessage: string, log: ILogger) =
    log.LogInformation(myEventHubMessage)
Trigger - JavaScript example
The following example shows an event hub trigger binding in a function.json file and a JavaScript function that
uses the binding. The function logs the message body of the event hub trigger.
The following examples show Event Hubs binding data in the function.json file.
Version 2.x
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"eventHubName": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Version 1.x
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"path": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Here's the JavaScript code that reads a single message:
module.exports = function (context, myEventHubMessage) {
    context.log(`Processed message: ${myEventHubMessage}`);
    context.done();
};
To receive events in a batch, set cardinality to many in the function.json file, as shown in the following
examples.
Version 2.x
{
"type": "eventHubTrigger",
"name": "eventHubMessages",
"direction": "in",
"eventHubName": "MyEventHub",
"cardinality": "many",
"connection": "myEventHubReadConnectionAppSetting"
}
Version 1.x
{
"type": "eventHubTrigger",
"name": "eventHubMessages",
"direction": "in",
"path": "MyEventHub",
"cardinality": "many",
"connection": "myEventHubReadConnectionAppSetting"
}
Here's the JavaScript code that receives the batch:
module.exports = function (context, eventHubMessages) {
    context.log(`JavaScript eventhub trigger function called for message array ${eventHubMessages}`);
    eventHubMessages.forEach(message => {
        context.log(`Processed message ${message}`);
    });
    context.done();
};
Trigger - Python example
The following example shows an event hub trigger binding in a function.json file and a Python function that
uses the binding. The function logs the message body of the event hub trigger.
Here's the binding data in the function.json file:
{
"type": "eventHubTrigger",
"name": "event",
"direction": "in",
"eventHubName": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Here's the Python code:
import logging
import azure.functions as func


def main(event: func.EventHubEvent):
    logging.info('Python EventHub trigger processed an event: %s',
                 event.get_body().decode('utf-8'))
Trigger - Java example
The following example shows an Event Hubs trigger binding in a function.json file and a Java function that uses
the binding. The function logs the message body of the event hub trigger.
{
"type": "eventHubTrigger",
"name": "msg",
"direction": "in",
"eventHubName": "myeventhubname",
"connection": "myEventHubReadConnectionAppSetting"
}
@FunctionName("ehprocessor")
public void eventHubProcessor(
@EventHubTrigger(name = "msg",
eventHubName = "myeventhubname",
connection = "myconnvarname") String message,
final ExecutionContext context )
{
context.getLogger().info(message);
}
In the Java functions runtime library, use the EventHubTrigger annotation on parameters whose value would
come from Event Hub. Parameters with these annotations cause the function to run when an event arrives. This
annotation can be used with native Java types, POJOs, or nullable values using Optional.
Trigger - attributes
In C# class libraries, use the EventHubTriggerAttribute attribute.
The attribute's constructor takes the name of the event hub, the name of the consumer group, and the name of
an app setting that contains the connection string. For more information about these settings, see the trigger
configuration section. Here's an EventHubTriggerAttribute attribute example:
[FunctionName("EventHubTriggerCSharp")]
public static void Run([EventHubTrigger("samples-workitems", Connection = "EventHubConnectionAppSetting")]
string myEventHubMessage, ILogger log)
{
...
}
Trigger - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
EventHubTrigger attribute.
When you're developing locally, app settings go into the local.settings.json file.
See code examples that use these properties earlier in this article.
Output
Use the Event Hubs output binding to write events to an event stream. You must have send permission to an
event hub to write events to it.
Ensure the required package references are in place: Functions 1.x or Functions 2.x
Output - example
See the language-specific example:
C#
C# script (.csx)
F#
Java
JavaScript
Python
Output - C# example
The following example shows a C# function that writes a message to an event hub, using the method return
value as the output:
[FunctionName("EventHubOutput")]
[return: EventHub("outputEventHubMessage", Connection = "EventHubConnectionAppSetting")]
public static string Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, ILogger log)
{
log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
return $"{DateTime.Now}";
}
The following sample shows how to use the IAsyncCollector interface to send a batch of messages. This
scenario is common when you are processing messages coming from one Event Hub and sending the result to
another Event Hub.
[FunctionName("EH2EH")]
public static async Task Run(
[EventHubTrigger("source", Connection = "EventHubConnectionAppSetting")] EventData[] events,
[EventHub("dest", Connection = "EventHubConnectionAppSetting")]IAsyncCollector<string> outputEvents,
ILogger log)
{
foreach (EventData eventData in events)
{
// do some processing:
var myProcessedEvent = DoSomething(eventData);
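Output - C# script example
The following examples show Event Hubs binding data in the function.json file. The first example is for Functions
2.x, and the second one is for Functions 1.x.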
{
"type": "eventHub",
"name": "outputEventHubMessage",
"eventHubName": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
{
"type": "eventHub",
"name": "outputEventHubMessage",
"path": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
using System;
using Microsoft.Extensions.Logging;
public static void Run(TimerInfo myTimer, out string outputEventHubMessage, ILogger log)
{
String msg = $"TimerTriggerCSharp1 executed at: {DateTime.Now}";
log.LogInformation(msg);
outputEventHubMessage = msg;
}
Output - F# example
The following example shows an event hub trigger binding in a function.json file and an F# function that uses the
binding. The function writes a message to an event hub.
The following examples show Event Hubs binding data in the function.json file. The first example is for Functions
2.x, and the second one is for Functions 1.x.
{
"type": "eventHub",
"name": "outputEventHubMessage",
"eventHubName": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
{
"type": "eventHub",
"name": "outputEventHubMessage",
"path": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
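Output - JavaScript example
The following examples show Event Hubs binding data in the function.json file. The first example is for Functions
2.x, and the second one is for Functions 1.x.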
{
"type": "eventHub",
"name": "outputEventHubMessage",
"eventHubName": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
{
"type": "eventHub",
"name": "outputEventHubMessage",
"path": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
module.exports = function(context) {
    var timeStamp = new Date().toISOString();
    var message = 'Message created at: ' + timeStamp;
    context.bindings.outputEventHubMessage = message;
    context.done();
};
Output - Python example
The following example shows an event hub output binding in a function.json file and a Python function that uses
the binding. The function writes a message to an event hub.
{
"type": "eventHub",
"name": "$return",
"eventHubName": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
Here's Python code that returns a single message:
import datetime
import logging
import azure.functions as func

def main(timer: func.TimerRequest) -> str:
    return 'Message created at: {}'.format(datetime.datetime.utcnow())
In the Java functions runtime library, use the @EventHubOutput annotation on parameters whose value is
published to the event hub. The parameter should be of type OutputBinding<T> , where T is a POJO or any native
Java type.
Output - attributes
For C# class libraries, use the EventHubAttribute attribute.
The attribute's constructor takes the name of the event hub and the name of an app setting that contains the
connection string. For more information about these settings, see Output - configuration. Here's an EventHub
attribute example:
[FunctionName("EventHubOutput")]
[return: EventHub("outputEventHubMessage", Connection = "EventHubConnectionAppSetting")]
public static string Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, ILogger log)
{
...
}
Output - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
EventHub attribute.
When you're developing locally, app settings go into the local.settings.json file.
Output - usage
In C# and C# script, send messages by using a method parameter such as out string paramName . In C# script,
paramName is the value specified in the name property of function.json. To write multiple messages, you can use
ICollector<string> or IAsyncCollector<string> in place of out string .
In JavaScript, access the output event by using context.bindings.<name> . <name> is the value specified in the
name property of function.json.
host.json settings
This section describes the global configuration settings available for this binding in version 2.x. The example
host.json file below contains only the version 2.x settings for this binding. For more information about global
configuration settings in version 2.x, see host.json reference for Azure Functions version 2.x.
NOTE
For a reference of host.json in Functions 1.x, see host.json reference for Azure Functions 1.x.
{
"version": "2.0",
"extensions": {
"eventHubs": {
"batchCheckpointFrequency": 5,
"eventProcessorOptions": {
"maxBatchSize": 256,
"prefetchCount": 512
}
}
}
}
Next steps
Learn more about Azure Functions triggers and bindings
Azure IoT Hub bindings for Azure Functions
4/9/2019 • 17 minutes to read • Edit Online
This article explains how to work with Azure Functions bindings for IoT Hub. The IoT Hub support is based on the
Azure Event Hubs Binding.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
DEVELOPMENT ENVIRONMENT                                          TO ADD SUPPORT IN FUNCTIONS 2.X
Local development - C# script, JavaScript, F#, Java and Python   Register the extension
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
IMPORTANT
While the following code samples use the Event Hub API, the given syntax is applicable for IoT Hub functions.
Trigger
Use the function trigger to respond to an event sent to an event hub event stream. You must have read access to
the underlying event hub to set up the trigger. When the function is triggered, the message passed to the function
is typed as a string.
Trigger - scaling
Each instance of an event triggered function is backed by a single EventProcessorHost instance. The trigger
(powered by Event Hubs) ensures that only one EventProcessorHost instance can get a lease on a given partition.
For example, consider an Event Hub as follows:
10 partitions
1,000 events distributed evenly across all partitions, with 100 messages in each partition
When your function is first enabled, there is only one instance of the function. Let's call the first function instance
Function_0 . The Function_0 function has a single instance of EventProcessorHost that holds a lease on all ten
partitions. This instance is reading events from partitions 0-9. From this point forward, one of the following
happens:
New function instances are not needed: Function_0 is able to process all 1,000 events before the
Functions scaling logic takes effect. In this case, all 1,000 messages are processed by Function_0 .
An additional function instance is added: If the Functions scaling logic determines that Function_0 has
more messages than it can process, a new function app instance ( Function_1 ) is created. This new function
also has an associated instance of EventProcessorHost. As the underlying Event Hubs service detects that a new
host instance is trying to read messages, it load balances the partitions across the host instances. For example,
partitions 0-4 may be assigned to Function_0 and partitions 5-9 to Function_1 .
N more function instances are added: If the Functions scaling logic determines that both Function_0
and Function_1 have more messages than they can process, new Function_N function app instances are
created. Instances are created to the point where N is greater than the number of event hub partitions. In our
example, Event Hubs again load balances the partitions, in this case across the instances Function_0 ...
Function_9 .
When Functions scales, N is a number greater than the number of event hub partitions. This is done to
ensure EventProcessorHost instances are available to obtain locks on partitions as they become available from
other instances. You are only charged for the resources used when the function instance executes; in other words,
you are not charged for this over-provisioning.
When all function execution completes (with or without errors), checkpoints are added to the associated storage
account. When checkpointing succeeds, the 1,000 messages are not retrieved again.
Trigger - example
See the language-specific example:
C#
C# script (.csx)
F#
Java
JavaScript
Python
Trigger - C# example
The following example shows a C# function that logs the message body of the event hub trigger.
[FunctionName("EventHubTriggerCSharp")]
public static void Run([EventHubTrigger("samples-workitems", Connection = "EventHubConnectionAppSetting")]
string myEventHubMessage, ILogger log)
{
log.LogInformation($"C# function triggered to process a message: {myEventHubMessage}");
}
To get access to event metadata in function code, bind to an EventData object (requires a using statement for
Microsoft.Azure.EventHubs ). You can also access the same properties by using binding expressions in the method
signature. The following example shows both ways to get the same data:
[FunctionName("EventHubTriggerCSharp")]
public static void Run(
[EventHubTrigger("samples-workitems", Connection = "EventHubConnectionAppSetting")] EventData
myEventHubMessage,
DateTime enqueuedTimeUtc,
Int64 sequenceNumber,
string offset,
ILogger log)
{
log.LogInformation($"Event: {Encoding.UTF8.GetString(myEventHubMessage.Body)}");
// Metadata accessed by binding to EventData
log.LogInformation($"EnqueuedTimeUtc={myEventHubMessage.SystemProperties.EnqueuedTimeUtc}");
log.LogInformation($"SequenceNumber={myEventHubMessage.SystemProperties.SequenceNumber}");
log.LogInformation($"Offset={myEventHubMessage.SystemProperties.Offset}");
// Metadata accessed by using binding expressions in method parameters
log.LogInformation($"EnqueuedTimeUtc={enqueuedTimeUtc}");
log.LogInformation($"SequenceNumber={sequenceNumber}");
log.LogInformation($"Offset={offset}");
}
NOTE
When receiving events in a batch, you cannot bind to method parameters such as DateTime enqueuedTimeUtc as in
the above example; instead, you must receive these values from each EventData object.
[FunctionName("EventHubTriggerCSharp")]
public static void Run([EventHubTrigger("samples-workitems", Connection = "EventHubConnectionAppSetting")]
EventData[] eventHubMessages, ILogger log)
{
foreach (var message in eventHubMessages)
{
log.LogInformation($"C# function triggered to process a message:
{Encoding.UTF8.GetString(message.Body)}");
log.LogInformation($"EnqueuedTimeUtc={message.SystemProperties.EnqueuedTimeUtc}");
}
}
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"eventHubName": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Version 1.x
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"path": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
using System;
To get access to event metadata in function code, bind to an EventData object (requires a using statement for
Microsoft.Azure.EventHubs ). You can also access the same properties by using binding expressions in the method
signature. The following example shows both ways to get the same data:
#r "Microsoft.Azure.EventHubs"
using System.Text;
using System;
using Microsoft.ServiceBus.Messaging;
using Microsoft.Azure.EventHubs;
Trigger - F# example
The following example shows an event hub trigger binding in a function.json file and an F# function that uses the
binding. The function logs the message body of the event hub trigger.
The following examples show Event Hubs binding data in the function.json file.
Version 2.x
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"eventHubName": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Version 1.x
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"path": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
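Trigger - JavaScript example
The following example shows an event hub trigger binding in a function.json file and a JavaScript function that
uses the binding. The function logs the message body of the event hub trigger.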
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"eventHubName": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Version 1.x
{
"type": "eventHubTrigger",
"name": "myEventHubMessage",
"direction": "in",
"path": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Here's the JavaScript code:
module.exports = function (context, myEventHubMessage) {
    context.log(`Processed message: ${myEventHubMessage}`);
    context.done();
};
To receive events in a batch, set cardinality to many in the function.json file, as shown in the following examples.
Version 2.x
{
"type": "eventHubTrigger",
"name": "eventHubMessages",
"direction": "in",
"eventHubName": "MyEventHub",
"cardinality": "many",
"connection": "myEventHubReadConnectionAppSetting"
}
Version 1.x
{
"type": "eventHubTrigger",
"name": "eventHubMessages",
"direction": "in",
"path": "MyEventHub",
"cardinality": "many",
"connection": "myEventHubReadConnectionAppSetting"
}
Here's JavaScript code that receives the message batch:
module.exports = function (context, eventHubMessages) {
    eventHubMessages.forEach(message => context.log(`Processed message: ${message}`));
    context.done();
};
Trigger - Python example
The following example shows an event hub trigger binding in a function.json file and a Python function that uses
the binding. The function logs the message body of the event hub trigger.
{
"type": "eventHubTrigger",
"name": "event",
"direction": "in",
"eventHubName": "MyEventHub",
"connection": "myEventHubReadConnectionAppSetting"
}
Here's the Python code:
import logging
import azure.functions as func

def main(event: func.EventHubEvent):
    logging.info('Python EventHub trigger processed an event: %s', event.get_body().decode('utf-8'))
{
"type": "eventHubTrigger",
"name": "msg",
"direction": "in",
"eventHubName": "myeventhubname",
"connection": "myEventHubReadConnectionAppSetting"
}
@FunctionName("ehprocessor")
public void eventHubProcessor(
@EventHubTrigger(name = "msg",
eventHubName = "myeventhubname",
connection = "myconnvarname") String message,
final ExecutionContext context )
{
context.getLogger().info(message);
}
In the Java functions runtime library, use the EventHubTrigger annotation on parameters whose value comes
from the event hub. Parameters with these annotations cause the function to run when an event arrives. This
annotation can be used with native Java types, POJOs, or nullable values using Optional.
Trigger - attributes
In C# class libraries, use the EventHubTriggerAttribute attribute.
The attribute's constructor takes the name of the event hub, the name of the consumer group, and the name of an
app setting that contains the connection string. For more information about these settings, see the trigger
configuration section. Here's an EventHubTriggerAttribute attribute example:
[FunctionName("EventHubTriggerCSharp")]
public static void Run([EventHubTrigger("samples-workitems", Connection = "EventHubConnectionAppSetting")]
string myEventHubMessage, ILogger log)
{
...
}
Trigger - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
EventHubTrigger attribute.
When you're developing locally, app settings go into the local.settings.json file.
See code examples that use these properties earlier in this article.
Output
Use the Event Hubs output binding to write events to an event stream. You must have send permission to an event
hub to write events to it.
Ensure the required package references are in place: Functions 1.x or Functions 2.x
Output - example
See the language-specific example:
C#
C# script (.csx)
F#
Java
JavaScript
Python
Output - C# example
The following example shows a C# function that writes a message to an event hub, using the method return value
as the output:
[FunctionName("EventHubOutput")]
[return: EventHub("outputEventHubMessage", Connection = "EventHubConnectionAppSetting")]
public static string Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, ILogger log)
{
log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
return $"{DateTime.Now}";
}
The following sample shows how to use the IAsyncCollector interface to send a batch of messages. This scenario
is common when you are processing messages coming from one Event Hub and sending the result to another
Event Hub.
[FunctionName("EH2EH")]
public static async Task Run(
[EventHubTrigger("source", Connection = "EventHubConnectionAppSetting")] EventData[] events,
[EventHub("dest", Connection = "EventHubConnectionAppSetting")]IAsyncCollector<string> outputEvents,
ILogger log)
{
foreach (EventData eventData in events)
{
// do some processing:
var myProcessedEvent = DoSomething(eventData);
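Output - C# script example
The following examples show Event Hubs binding data in the function.json file. The first example is for Functions
2.x, and the second one is for Functions 1.x.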
{
"type": "eventHub",
"name": "outputEventHubMessage",
"eventHubName": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
{
"type": "eventHub",
"name": "outputEventHubMessage",
"path": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
using System;
using Microsoft.Extensions.Logging;
public static void Run(TimerInfo myTimer, out string outputEventHubMessage, ILogger log)
{
String msg = $"TimerTriggerCSharp1 executed at: {DateTime.Now}";
log.LogInformation(msg);
outputEventHubMessage = msg;
}
Output - F# example
The following example shows an event hub trigger binding in a function.json file and an F# function that uses the
binding. The function writes a message to an event hub.
The following examples show Event Hubs binding data in the function.json file. The first example is for Functions
2.x, and the second one is for Functions 1.x.
{
"type": "eventHub",
"name": "outputEventHubMessage",
"eventHubName": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
{
"type": "eventHub",
"name": "outputEventHubMessage",
"path": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
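Output - JavaScript example
The following examples show Event Hubs binding data in the function.json file. The first example is for Functions
2.x, and the second one is for Functions 1.x.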
{
"type": "eventHub",
"name": "outputEventHubMessage",
"eventHubName": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
{
"type": "eventHub",
"name": "outputEventHubMessage",
"path": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
module.exports = function(context) {
    var timeStamp = new Date().toISOString();
    var message = 'Message created at: ' + timeStamp;
    context.bindings.outputEventHubMessage = message;
    context.done();
};
Output - Python example
The following example shows an event hub output binding in a function.json file and a Python function that uses
the binding. The function writes a message to an event hub.
{
"type": "eventHub",
"name": "$return",
"eventHubName": "myeventhub",
"connection": "MyEventHubSendAppSetting",
"direction": "out"
}
Here's Python code that returns a single message:
import datetime
import logging
import azure.functions as func

def main(timer: func.TimerRequest) -> str:
    return 'Message created at: {}'.format(datetime.datetime.utcnow())
In the Java functions runtime library, use the @EventHubOutput annotation on parameters whose value is
published to the event hub. The parameter should be of type OutputBinding<T> , where T is a POJO or any native
Java type.
Output - attributes
For C# class libraries, use the EventHubAttribute attribute.
The attribute's constructor takes the name of the event hub and the name of an app setting that contains the
connection string. For more information about these settings, see Output - configuration. Here's an EventHub
attribute example:
[FunctionName("EventHubOutput")]
[return: EventHub("outputEventHubMessage", Connection = "EventHubConnectionAppSetting")]
public static string Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, ILogger log)
{
...
}
Output - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
EventHub attribute.
When you're developing locally, app settings go into the local.settings.json file.
Output - usage
In C# and C# script, send messages by using a method parameter such as out string paramName . In C# script,
paramName is the value specified in the name property of function.json. To write multiple messages, you can use
ICollector<string> or IAsyncCollector<string> in place of out string .
In JavaScript, access the output event by using context.bindings.<name> . <name> is the value specified in the name
property of function.json.
host.json settings
This section describes the global configuration settings available for this binding in version 2.x. The example
host.json file below contains only the version 2.x settings for this binding. For more information about global
configuration settings in version 2.x, see host.json reference for Azure Functions version 2.x.
NOTE
For a reference of host.json in Functions 1.x, see host.json reference for Azure Functions 1.x.
{
"version": "2.0",
"extensions": {
"eventHubs": {
"batchCheckpointFrequency": 5,
"eventProcessorOptions": {
"maxBatchSize": 256,
"prefetchCount": 512
}
}
}
}
Next steps
Learn more about Azure Functions triggers and bindings
Azure Functions HTTP triggers and bindings
4/10/2019 • 21 minutes to read • Edit Online
This article explains how to work with HTTP triggers and output bindings in Azure Functions.
An HTTP trigger can be customized to respond to webhooks.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
TIP
If you plan to use the HTTP or WebHook bindings, take steps to avoid port exhaustion that can be caused by improper
instantiation of HttpClient . For more information, see How to manage connections in Azure Functions.
The code in this article defaults to Functions 2.x syntax which uses .NET Core. For information on the 1.x
syntax, see the 1.x functions templates.
Trigger
The HTTP trigger lets you invoke a function with an HTTP request. You can use an HTTP trigger to build
serverless APIs and respond to webhooks.
By default, an HTTP trigger returns HTTP 200 OK with an empty body in Functions 1.x, or HTTP 204 No
Content with an empty body in Functions 2.x. To modify the response, configure an HTTP output binding.
Trigger - example
See the language-specific example:
C#
C# script (.csx)
F#
Java
JavaScript
Python
Trigger - C# example
The following example shows a C# function that looks for a name parameter either in the query string or the
body of the HTTP request. Notice that the return value is used for the output binding, but a return value
attribute isn't required.
[FunctionName("HttpTriggerCSharp")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]
HttpRequest req, ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
{
"disabled": false,
"bindings": [
{
"authLevel": "function",
"name": "req",
"type": "httpTrigger",
"direction": "in",
"methods": [
"get",
"post"
]
},
{
"name": "$return",
"type": "http",
"direction": "out"
}
]
}
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
You can bind to a custom object instead of HttpRequest . This object is created from the body of the request
and parsed as JSON. Similarly, a type can be passed to the HTTP response output binding and returned as
the response body, along with a 200 status code.
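For illustration, here's a minimal sketch of binding to a custom type; the Person class and its Name property are
hypothetical names, not part of the runtime:
public class Person
{
    public string Name { get; set; }
}

[FunctionName("HttpTriggerCustomType")]
public static Person Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")] Person person, ILogger log)
{
    // The runtime created 'person' by parsing the request body as JSON.
    log.LogInformation($"Received person: {person?.Name}");
    // Returning the object serializes it as the JSON response body with a 200 status code.
    return person;
}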
using System.Net;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
Trigger - F# example
The following example shows a trigger binding in a function.json file and an F# function that uses the binding.
The function looks for a name parameter either in the query string or the body of the HTTP request.
Here's the function.json file:
{
"bindings": [
{
"authLevel": "function",
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"name": "res",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
open System.Net
open System.Net.Http
open FSharp.Interop.Dynamic
You need a project.json file that uses NuGet to reference the FSharp.Interop.Dynamic and Dynamitey
assemblies, as shown in the following example:
{
"frameworks": {
"net46": {
"dependencies": {
"Dynamitey": "1.0.2",
"FSharp.Interop.Dynamic": "3.0.0"
}
}
}
}
{
"scriptFile": "__init__.py",
"disabled": false,
"bindings": [
{
"authLevel": "function",
"type": "httpTrigger",
"direction": "in",
"name": "req"
},
{
"type": "http",
"direction": "out",
"name": "res"
}
]
}
The configuration section explains these properties.
Here's the Python code:
import logging
import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')

    if name:
        return func.HttpResponse(f"Hello {name}!")
    else:
        return func.HttpResponse(
            "Please pass a name on the query string or in the request body",
            status_code=400
        )
{
"disabled": false,
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req"
},
{
"type": "http",
"direction": "out",
"name": "res"
}
]
}
// Item list
context.getLogger().info("GET parameters are: " + request.getQueryParameters());
// Item list
context.getLogger().info("Request body is: " + request.getBody().orElse(""));
// Item list
context.getLogger().info("Route parameters are: " + id);
@Override
public String toString() {
return "ToDoItem={id=" + id + ",description=" + description + "}";
}
}
This example reads the body of a POST request. The request body gets automatically de-serialized into a
ToDoItem object, and is returned to the client, with content type application/json . The ToDoItem parameter
is serialized by the Functions runtime as it is assigned to the body property of the
HttpMessageResponse.Builder class.
@FunctionName("TriggerPojoPost")
public HttpResponseMessage run(
@HttpTrigger(name = "req",
methods = {HttpMethod.POST},
authLevel = AuthorizationLevel.ANONYMOUS)
HttpRequestMessage<Optional<ToDoItem>> request,
final ExecutionContext context) {
// Item list
context.getLogger().info("Request body is: " + request.getBody().orElse(null));
Trigger - attributes
In C# class libraries, use the HttpTrigger attribute.
You can set the authorization level and allowable HTTP methods in attribute constructor parameters, and
there are properties for webhook type and route template. For more information about these settings, see
Trigger - configuration. Here's an HttpTrigger attribute in a method signature:
[FunctionName("HttpTriggerCSharp")]
public static Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous)] HttpRequest req)
{
...
}
Trigger - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
HttpTrigger attribute.
Trigger - usage
For C# and F# functions, you can declare the type of your trigger input to be either HttpRequest or a custom
type. If you choose HttpRequest , you get full access to the request object. For a custom type, the runtime tries
to parse the JSON request body to set the object properties.
For JavaScript functions, the Functions runtime provides the request body instead of the request object. For
more information, see the JavaScript trigger example.
Customize the HTTP endpoint
By default when you create a function for an HTTP trigger, the function is addressable with a route of the
form:
http://<yourapp>.azurewebsites.net/api/<funcname>
You can customize this route using the optional route property on the HTTP trigger's input binding. As an
example, the following function.json file defines a route property for an HTTP trigger:
{
"bindings": [
{
"type": "httpTrigger",
"name": "req",
"direction": "in",
"methods": [ "get" ],
"route": "products/{category:alpha}/{id:int?}"
},
{
"type": "http",
"name": "res",
"direction": "out"
}
]
}
Using this configuration, the function is now addressable with the following route instead of the original
route.
http://<yourapp>.azurewebsites.net/api/products/electronics/357
This allows the function code to support two parameters in the address, category and id. You can use any
Web API Route Constraint with your parameters. The following C# function code makes use of both
parameters.
public static IActionResult Run(HttpRequest req, string category, int? id, ILogger log)
{
    log.LogInformation($"C# HTTP trigger function processed a request. Category={category}, Id={id}");
    if (id == null)
    {
        return new OkObjectResult($"All {category} items were requested.");
    }
    else
    {
        return new OkObjectResult($"{category} item with id = {id} has been requested.");
    }
}
Here is Node.js function code that uses the same route parameters.
module.exports = function (context, req) {
    var category = context.bindingData.category;
    var id = context.bindingData.id;
    if (!id) {
        context.res = {
            // status defaults to 200
            body: "All " + category + " items were requested."
        };
    }
    else {
        context.res = {
            // status defaults to 200
            body: category + " item with id = " + id + " was requested."
        };
    }
    context.done();
}
By default, all function routes are prefixed with api. You can also customize or remove the prefix using the
http.routePrefix property in your host.json file. The following example removes the api route prefix by
using an empty string for the prefix in the host.json file.
{
"http": {
"routePrefix": ""
}
}
using System.Net;
using Microsoft.AspNetCore.Mvc;
using System.Security.Claims;
Alternatively, the ClaimsPrincipal can simply be included as an additional parameter in the function signature:
#r "Newtonsoft.Json"
using System.Net;
using Microsoft.AspNetCore.Mvc;
using System.Security.Claims;
using Newtonsoft.Json.Linq;
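Here's a minimal sketch of that signature-based approach; the function name and the claim that is read are
illustrative assumptions:
[FunctionName("HttpTriggerWithPrincipal")]
public static IActionResult Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req,
    ClaimsPrincipal principal, ILogger log)
{
    // The Functions runtime injects the authenticated principal for the current request.
    string identityName = principal?.Identity?.Name ?? "(anonymous)";
    log.LogInformation($"Request made by {identityName}");
    return new OkObjectResult($"Hello, {identityName}");
}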
Authorization keys
Functions lets you use keys to make it harder to access your HTTP function endpoints during development. A
standard HTTP trigger may require such an API key to be present in the request.
IMPORTANT
While keys may help obfuscate your HTTP endpoints during development, they are not intended as a way to secure an
HTTP trigger in production. To learn more, see Secure an HTTP endpoint in production.
NOTE
In the Functions 1.x runtime, webhook providers may use keys to authorize requests in a variety of ways, depending on
what the provider supports. This is covered in Webhooks and keys. The version 2.x runtime does not include built-in
support for webhook providers.
CAUTION
Due to the elevated permissions in your function app granted by the master key, you should not share this
key with third parties or distribute it in native client applications. Use caution when choosing the admin
authorization level.
Obtaining keys
Keys are stored as part of your function app in Azure and are encrypted at rest. To view your keys, create new
ones, or roll keys to new values, navigate to one of your HTTP-triggered functions in the Azure portal and
select Manage.
There is no supported API for programmatically obtaining function keys.
API key authorization
Most HTTP trigger templates require an API key in the request. So your HTTP request normally looks like the
following URL:
https://<APP_NAME>.azurewebsites.net/api/<FUNCTION_NAME>?code=<API_KEY>
The key can be included in a query string variable named code , as above. It can also be included in an
x-functions-key HTTP header. The value of the key can be any function key defined for the function, or any
host key.
You can allow anonymous requests, which do not require keys. You can also require that the master key be
used. You change the default authorization level by using the authLevel property in the binding JSON. For
more information, see Trigger - configuration.
NOTE
When running functions locally, authorization is disabled regardless of the specified authentication level setting. After
publishing to Azure, the authLevel setting in your trigger is enforced.
NOTE
Webhook mode is only available for version 1.x of the Functions runtime. This change was made to improve the
performance of HTTP triggers in version 2.x.
In version 1.x, webhook templates provide additional validation for webhook payloads. In version 2.x, the base
HTTP trigger still works and is the recommended approach for webhooks.
GitHub webhooks
To respond to GitHub webhooks, first create your function with an HTTP Trigger, and set the webHookType
property to github . Then copy its URL and API key into the Add webhook page of your GitHub repository.
Slack webhooks
The Slack webhook generates a token for you instead of letting you specify it, so you must configure a
function-specific key with the token from Slack. See Authorization keys.
Webhooks and keys
Webhook authorization is handled by the webhook receiver component, part of the HTTP trigger, and the
mechanism varies based on the webhook type. Each mechanism relies on a key. By default, the function
key named "default" is used. To use a different key, configure the webhook provider to send the key name
with the request in one of the following ways:
Query string: The provider passes the key name in a query string parameter, such as clientid :
https://<APP_NAME>.azurewebsites.net/api/<FUNCTION_NAME>?clientid=<KEY_NAME> .
Request header: The provider passes the key name in the x-functions-clientid header.
Trigger - limits
The HTTP request length is limited to 100 MB (104,857,600 bytes), and the URL length is limited to 4 KB
(4,096 bytes). These limits are specified by the httpRuntime element of the runtime's Web.config file.
If a function that uses the HTTP trigger doesn't complete within about 2.5 minutes, the gateway will time out
and return an HTTP 502 error. The function will continue running but will be unable to return an HTTP
response. For long-running functions, we recommend that you follow async patterns and return a location
where you can ping the status of the request. For information about how long a function can run, see Scale
and hosting - Consumption plan.
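Here's a minimal sketch of that async pattern; the jobs queue and the /api/jobstatus route are hypothetical names
used only for illustration:
[FunctionName("StartLongRunningJob")]
public static IActionResult Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
    [Queue("jobs")] out string jobMessage)
{
    // Hand the long-running work off to a queue-triggered function, then
    // return 202 Accepted with a status URL the caller can poll.
    string jobId = Guid.NewGuid().ToString();
    jobMessage = jobId;
    return new AcceptedResult($"/api/jobstatus/{jobId}", jobId);
}
The host.json file can also tune HTTP concurrency and throttling, as the following example shows: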
{
"http": {
"routePrefix": "api",
"maxOutstandingRequests": 200,
"maxConcurrentRequests": 100,
"dynamicThrottlesEnabled": true
}
}
Output
Use the HTTP output binding to respond to the HTTP request sender. This binding requires an HTTP trigger
and allows you to customize the response associated with the trigger's request. If an HTTP output binding is
not provided, an HTTP trigger returns HTTP 200 OK with an empty body in Functions 1.x, or HTTP 204 No
Content with an empty body in Functions 2.x.
Output - configuration
The following table explains the binding configuration properties that you set in the function.json file. For C#
class libraries, there are no attribute properties that correspond to these function.json properties.
PROPERTY    DESCRIPTION
name        The variable name used in function code for the response, or $return to use the return value.
Output - usage
To send an HTTP response, use the language-standard response patterns. In C# or C# script, make the
function return type IActionResult or Task<IActionResult> . In C#, a return value attribute isn't required.
For example responses, see the trigger example.
Next steps
Learn more about Azure Functions triggers and bindings
Microsoft Graph bindings for Azure Functions
3/14/2019 • 30 minutes to read • Edit Online
This article explains how to configure and work with Microsoft Graph triggers and bindings in Azure
Functions. With these, you can use Azure Functions to work with data, insights, and events from the
Microsoft Graph.
The Microsoft Graph extension provides the following bindings:
An auth token input binding allows you to interact with any Microsoft Graph API.
An Excel table input binding allows you to read data from Excel.
An Excel table output binding allows you to modify Excel data.
A OneDrive file input binding allows you to read files from OneDrive.
A OneDrive file output binding allows you to write to files in OneDrive.
An Outlook message output binding allows you to send email through Outlook.
A collection of Microsoft Graph webhook triggers and bindings allows you to react to events from the
Microsoft Graph.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
NOTE
Microsoft Graph bindings are currently in preview for Azure Functions version 2.x. They are not supported in
Functions version 1.x.
Packages
The auth token input binding is provided in the Microsoft.Azure.WebJobs.Extensions.AuthTokens NuGet
package. The other Microsoft Graph bindings are provided in the
Microsoft.Azure.WebJobs.Extensions.MicrosoftGraph package. Source code for the packages is in the azure-
functions-microsoftgraph-extension GitHub repository.
The following table tells how to add support for this binding in each development environment.
DEVELOPMENT ENVIRONMENT                                          TO ADD SUPPORT IN FUNCTIONS 2.X
Local development - C# script, JavaScript, F#, Java and Python   Register the extension
To learn how to update existing binding extensions in the portal without having to republish your function
app project, see Update your extensions.
NOTE
The in-portal installation process can take up to 10 minutes on a consumption plan.
If you are using Visual Studio, you can get the extensions by installing the NuGet packages that are listed
earlier in this article.
Configuring Authentication / Authorization
The bindings outlined in this article require an identity to be used. This allows the Microsoft Graph to
enforce permissions and audit interactions. The identity can be a user accessing your application or the
application itself. To configure this identity, set up App Service Authentication / Authorization with Azure
Active Directory. You will also need to request any resource permissions your functions require.
NOTE
The Microsoft Graph extension only supports Azure AD authentication. Users need to log in with a work or school
account.
If you're using the Azure portal, you'll see a warning below the prompt to install the extension. The warning
prompts you to configure App Service Authentication / Authorization and request any permissions the
template or binding requires. Click Configure Azure AD now or Add permissions now as appropriate.
Auth token
The auth token input binding gets an Azure AD token for a given resource and provides it to your code as a
string. The resource can be any for which the application has permissions.
This section contains the following subsections:
Example
Attributes
Configuration
Usage
Auth token - example
See the language-specific example:
C# script (.csx)
JavaScript
Auth token - C# script example
The following example gets user profile information.
The function.json file defines an HTTP trigger with a token input binding:
{
"bindings": [
{
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"type": "token",
"direction": "in",
"name": "graphToken",
"resource": "https://graph.microsoft.com",
"identity": "userFromRequest"
},
{
"name": "$return",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
The C# script code uses the token to make an HTTP call to the Microsoft Graph and returns the result:
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using Microsoft.Extensions.Logging;
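Here's a minimal sketch of that C# script code, assuming the token is bound to the graphToken string configured
in the function.json above:
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, string graphToken, ILogger log)
{
    // Attach the bound token as a bearer credential and call the Microsoft Graph.
    var client = new HttpClient();
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", graphToken);
    return await client.GetAsync("https://graph.microsoft.com/v1.0/me/");
}
(A production function should reuse a static HttpClient ; see How to manage connections in Azure Functions.)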
{
"bindings": [
{
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"type": "token",
"direction": "in",
"name": "graphToken",
"resource": "https://graph.microsoft.com",
"identity": "userFromRequest"
},
{
"name": "res",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
The JavaScript code uses the token to make an HTTP call to the Microsoft Graph and returns the result.
const rp = require('request-promise');
let options = {
uri: 'https://graph.microsoft.com/v1.0/me/',
headers: {
'Authorization': token
}
};
rp(options)
.then(function(profile) {
context.res = {
body: profile
};
context.done();
})
.catch(function(err) {
context.res = {
status: 500,
body: err
};
context.done();
});
};
NOTE
When developing locally with the userFromId , userFromToken , or userFromRequest options, the required token
can be obtained manually and specified in the X-MS-TOKEN-AAD-ID-TOKEN request header from a calling client
application.
Excel input
The Excel table input binding reads the contents of an Excel table stored in OneDrive.
This section contains the following subsections:
Example
Attributes
Configuration
Usage
Excel input - example
See the language-specific example:
C# script (.csx)
JavaScript
Excel input - C# script example
The following function.json file defines an HTTP trigger with an Excel input binding:
{
"bindings": [
{
"authLevel": "anonymous",
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"type": "excel",
"direction": "in",
"name": "excelTableData",
"path": "{query.workbook}",
"identity": "UserFromRequest",
"tableName": "{query.table}"
},
{
"name": "$return",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
The following C# script code reads the contents of the specified table and returns them to the user:
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Microsoft.Extensions.Logging;
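Here's a minimal sketch of that C# script code, assuming the table binds to excelTableData as a two-dimensional
string array:
public static IActionResult Run(HttpRequest req, string[][] excelTableData, ILogger log)
{
    // excelTableData contains the rows of the requested table.
    log.LogInformation($"Read {excelTableData.Length} rows from the table");
    return new OkObjectResult(excelTableData);
}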
The following JavaScript code reads the contents of the specified table and returns them to the user.
Excel output
The Excel output binding modifies the contents of an Excel table stored in OneDrive.
This section contains the following subsections:
Example
Attributes
Configuration
Usage
Excel output - example
See the language-specific example:
C# script (.csx)
JavaScript
Excel output - C# script example
The following example adds rows to an Excel table.
The function.json file defines an HTTP trigger with an Excel output binding:
{
"bindings": [
{
"authLevel": "anonymous",
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"name": "newExcelRow",
"type": "excel",
"direction": "out",
"identity": "userFromRequest",
"updateType": "append",
"path": "{query.workbook}",
"tableName": "{query.table}"
},
{
"name": "$return",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
The C# script code adds a new row to the table (assumed to be single-column) based on input from the
query string:
using System.Net;
using System.Text;
using Microsoft.Extensions.Logging;
public static async Task Run(HttpRequest req, IAsyncCollector<object> newExcelRow, ILogger log)
{
string input = req.Query
.FirstOrDefault(q => string.Compare(q.Key, "text", true) == 0)
.Value;
await newExcelRow.AddAsync(new {
Text = input
// Add other properties for additional columns here
});
return;
}
{
"bindings": [
{
"authLevel": "anonymous",
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"name": "newExcelRow",
"type": "excel",
"direction": "out",
"identity": "userFromRequest",
"updateType": "append",
"path": "{query.workbook}",
"tableName": "{query.table}"
},
{
"name": "res",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
The following JavaScript code adds a new row to the table (assumed to be single-column) based on input
from the query string.
File input
The OneDrive File input binding reads the contents of a file stored in OneDrive.
This section contains the following subsections:
Example
Attributes
Configuration
Usage
File input - example
See the language-specific example:
C# script (.csx)
JavaScript
File input - C# script example
The following example reads a file that is stored in OneDrive.
The function.json file defines an HTTP trigger with a OneDrive file input binding:
{
"bindings": [
{
"authLevel": "anonymous",
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"name": "myOneDriveFile",
"type": "onedrive",
"direction": "in",
"path": "{query.filename}",
"identity": "userFromRequest"
},
{
"name": "$return",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
The C# script code reads the file specified in the query string and logs its length:
using System.Net;
using Microsoft.Extensions.Logging;
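Here's a minimal sketch of that C# script code, assuming the file binds to myOneDriveFile as a stream:
public static void Run(HttpRequest req, Stream myOneDriveFile, ILogger log)
{
    // The binding supplies the OneDrive file contents as a readable stream.
    log.LogInformation($"The file size is {myOneDriveFile.Length} bytes");
}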
{
"bindings": [
{
"authLevel": "anonymous",
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"name": "myOneDriveFile",
"type": "onedrive",
"direction": "in",
"path": "{query.filename}",
"identity": "userFromRequest"
},
{
"name": "res",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
The following JavaScript code reads the file specified in the query string and returns its length.
File output
The OneDrive file output binding modifies the contents of a file stored in OneDrive.
This section contains the following subsections:
Example
Attributes
Configuration
Usage
File output - example
See the language-specific example:
C# script (.csx)
JavaScript
File output - C# script example
The following example writes to a file that is stored in OneDrive.
The function.json file defines an HTTP trigger with a OneDrive output binding:
{
"bindings": [
{
"authLevel": "anonymous",
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"name": "myOneDriveFile",
"type": "onedrive",
"direction": "out",
"path": "FunctionsTest.txt",
"identity": "userFromRequest"
},
{
"name": "$return",
"type": "http",
"direction": "out"
}
],
"disabled": false
}
The C# script code gets text from the query string and writes it to a text file (FunctionsTest.txt as defined in
the preceding example) at the root of the caller's OneDrive:
using System.Net;
using System.Text;
using Microsoft.Extensions.Logging;
public static async Task Run(HttpRequest req, ILogger log, Stream myOneDriveFile)
{
string data = req.Query
.FirstOrDefault(q => string.Compare(q.Key, "text", true) == 0)
.Value;
await myOneDriveFile.WriteAsync(Encoding.UTF8.GetBytes(data), 0, data.Length);
myOneDriveFile.Close();
return;
}
The JavaScript code gets text from the query string and writes it to a text file (FunctionsTest.txt as defined in
the config above) at the root of the caller's OneDrive.
Outlook output
The Outlook message output binding sends a mail message through Outlook.
This section contains the following subsections:
Example
Attributes
Configuration
Usage
Outlook output - example
See the language-specific example:
C# script (.csx)
JavaScript
Outlook output - C# script example
The following example sends an email through Outlook.
The function.json file defines an HTTP trigger with an Outlook message output binding:
{
"bindings": [
{
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"name": "message",
"type": "outlook",
"direction": "out",
"identity": "userFromRequest"
}
],
"disabled": false
}
The C# script code sends a mail from the caller to a recipient specified in the query string:
using System.Net;
using Microsoft.Extensions.Logging;
public static void Run(HttpRequest req, out Message message, ILogger log)
{
string emailAddress = req.Query["to"];
message = new Message(){
subject = "Greetings",
body = "Sent from Azure Functions",
recipient = new Recipient() {
address = emailAddress
}
};
}
The JavaScript code sends a mail from the caller to a recipient specified in the query string:
Webhooks
Webhooks allow you to react to events in the Microsoft Graph. To support webhooks, functions are needed
to create, refresh, and react to webhook subscriptions. A complete webhook solution requires a combination
of the following bindings:
A Microsoft Graph webhook trigger allows you to react to an incoming webhook.
A Microsoft Graph webhook subscription input binding allows you to list existing subscriptions and
optionally refresh them.
A Microsoft Graph webhook subscription output binding allows you to create or delete webhook
subscriptions.
The bindings themselves do not require any Azure AD permissions, but you need to request permissions
relevant to the resource type you wish to react to. For a list of which permissions are needed for each
resource type, see subscription permissions.
For more information about webhooks, see Working with webhooks in Microsoft Graph.
Webhook trigger
The Microsoft Graph webhook trigger allows a function to react to an incoming webhook from the
Microsoft Graph. Each instance of this trigger can react to one Microsoft Graph resource type.
This section contains the following subsections:
Example
Attributes
Configuration
Usage
Webhook trigger - example
See the language-specific example:
C# script (.csx)
JavaScript
Webhook trigger - C# script example
The following example handles webhooks for incoming Outlook messages. To use a webhook trigger you
create a subscription, and you can refresh the subscription to prevent it from expiring.
The function.json file defines a webhook trigger:
{
"bindings": [
{
"name": "msg",
"type": "GraphWebhookTrigger",
"direction": "in",
"resourceType": "#Microsoft.Graph.Message"
}
],
"disabled": false
}
The C# script code reacts to incoming mail messages and logs the body of those sent by the recipient and
containing "Azure Functions" in the subject:
#r "Microsoft.Graph"
using Microsoft.Graph;
using System.Net;
using Microsoft.Extensions.Logging;
// Testable by sending oneself an email with the subject "Azure Functions" and some text body
if (msg.Subject.Contains("Azure Functions") && msg.From.Equals(msg.Sender)) {
log.LogInformation($"Processed email: {msg.BodyPreview}");
}
}
Webhook trigger - JavaScript example
The following example handles webhooks for incoming Outlook messages. To use a webhook trigger you
create a subscription, and you can refresh the subscription to prevent it from expiring.
The function.json file defines a webhook trigger:
{
"bindings": [
{
"name": "msg",
"type": "GraphWebhookTrigger",
"direction": "in",
"resourceType": "#Microsoft.Graph.Message"
}
],
"disabled": false
}
The JavaScript code reacts to incoming mail messages and logs the body of those sent by the recipient and
containing "Azure Functions" in the subject:
The trigger's resourceType value determines which Microsoft Graph resource the function reacts to:
#Microsoft.Graph.Message - changes made to Outlook messages.
#Microsoft.Graph.DriveItem - changes made to OneDrive root items.
#Microsoft.Graph.Contact - changes made to personal contacts in Outlook.
#Microsoft.Graph.Event - changes made to Outlook calendar items.
NOTE
A function app can only have one function that is registered against a given resourceType value.
Webhook input
The Microsoft Graph webhook input binding allows you to retrieve the list of subscriptions managed by this
function app. The binding reads from function app storage, so it does not reflect other subscriptions created
from outside the app.
This section contains the following subsections:
Example
Attributes
Configuration
Usage
Webhook input - example
See the language-specific example:
C# script (.csx)
JavaScript
Webhook input - C# script example
The following example gets all subscriptions for the calling user and deletes them.
The function.json file defines an HTTP trigger with a subscription input binding and a subscription output
binding that uses the delete action:
{
"bindings": [
{
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"type": "graphWebhookSubscription",
"name": "existingSubscriptions",
"direction": "in",
"filter": "userFromRequest"
},
{
"type": "graphWebhookSubscription",
"name": "subscriptionsToDelete",
"direction": "out",
"action": "delete",
"identity": "userFromRequest"
},
{
"type": "http",
"name": "res",
"direction": "out"
}
],
"disabled": false
}
using System.Net;
using Microsoft.Extensions.Logging;
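Here's a minimal sketch of that C# script code, assuming the existing subscriptions bind as an array of
subscription IDs:
public static async Task Run(HttpRequest req, string[] existingSubscriptions,
    IAsyncCollector<string> subscriptionsToDelete, ILogger log)
{
    // Queue every subscription for deletion via the output binding.
    foreach (var subscription in existingSubscriptions)
    {
        log.LogInformation($"Deleting subscription: {subscription}");
        await subscriptionsToDelete.AddAsync(subscription);
    }
}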
Webhook output
The webhook subscription output binding allows you to create, delete, and refresh webhook subscriptions in
the Microsoft Graph.
This section contains the following subsections:
Example
Attributes
Configuration
Usage
Webhook output - example
See the language-specific example:
C# script (.csx)
JavaScript
Webhook output - C# script example
The following example creates a subscription. You can refresh the subscription to prevent it from expiring.
The function.json file defines an HTTP trigger with a subscription output binding using the create action:
{
"bindings": [
{
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"type": "graphWebhookSubscription",
"name": "clientState",
"direction": "out",
"action": "create",
"subscriptionResource": "me/mailFolders('Inbox')/messages",
"changeTypes": [
"created"
],
"identity": "userFromRequest"
},
{
"type": "http",
"name": "$return",
"direction": "out"
}
],
"disabled": false
}
The C# script code registers a webhook that will notify this function app when the calling user receives an
Outlook message:
using System;
using System.Net;
using Microsoft.Extensions.Logging;
public static HttpResponseMessage run(HttpRequestMessage req, out string clientState, ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
clientState = Guid.NewGuid().ToString();
return new HttpResponseMessage(HttpStatusCode.OK);
}
The JavaScript code registers a webhook that will notify this function app when the calling user receives an
Outlook message:
{
"bindings": [
{
"name": "myTimer",
"type": "timerTrigger",
"direction": "in",
"schedule": "0 * * */2 * *"
},
{
"type": "graphWebhookSubscription",
"name": "existingSubscriptions",
"direction": "in"
},
{
"type": "graphWebhookSubscription",
"name": "subscriptionsToRefresh",
"direction": "out",
"action": "refresh",
"identity": "clientCredentials"
}
],
"disabled": false
}
// This template uses application permissions and requires consent from an Azure Active Directory admin.
// See https://go.microsoft.com/fwlink/?linkid=858780
{
"bindings": [
{
"name": "myTimer",
"type": "timerTrigger",
"direction": "in",
"schedule": "0 * * */2 * *"
},
{
"type": "graphWebhookSubscription",
"name": "existingSubscriptions",
"direction": "in"
}
],
"disabled": false
}
The C# script code refreshes the subscriptions and creates the output binding in code, using each user's
identity:
using System;
using Microsoft.Extensions.Logging;
Next steps
Learn more about Azure Functions triggers and bindings
Mobile Apps bindings for Azure Functions
2/6/2019 • 8 minutes to read • Edit Online
NOTE
Azure Mobile Apps bindings are only available to Azure Functions 1.x. They are not supported in Azure Functions 2.x.
This article explains how to work with Azure Mobile Apps bindings in Azure Functions. Azure Functions supports
input and output bindings for Mobile Apps.
The Mobile Apps bindings let you read and update data tables in mobile apps.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
Input
The Mobile Apps input binding loads a record from a mobile table endpoint and passes it into your function. In
C# and F# functions, any changes made to the record are automatically sent back to the table when the function
exits successfully.
Input - example
See the language-specific example:
C# script (.csx)
JavaScript
Input - C# script example
The following example shows a Mobile Apps input binding in a function.json file and a C# script function that uses
the binding. The function is triggered by a queue message that has a record identifier. The function reads the
specified record and modifies its Text property.
Here's the binding data in the function.json file:
{
"bindings": [
{
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "record",
"type": "mobileTable",
"tableName": "MyTable",
"id" : "{queueTrigger}",
"connection": "My_MobileApp_Url",
"apiKey": "My_MobileApp_Key",
"direction": "in"
}
]
}
#r "Newtonsoft.Json"
using Newtonsoft.Json.Linq;
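Here's a minimal sketch of that C# script code; changes to the bound record are written back to the table when
the function exits successfully:
public static void Run(string myQueueItem, JObject record)
{
    // Modify the Text property; the binding persists the change on success.
    if (record != null)
    {
        record["Text"] = "This record has changed.";
    }
}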
Input - JavaScript
The following example shows a Mobile Apps input binding in a function.json file and a JavaScript function that
uses the binding. The function is triggered by a queue message that has a record identifier. The function reads the
specified record and modifies its Text property.
Here's the binding data in the function.json file:
{
"bindings": [
{
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "record",
"type": "mobileTable",
"tableName": "MyTable",
"id" : "{queueTrigger}",
"connection": "My_MobileApp_Url",
"apiKey": "My_MobileApp_Key",
"direction": "in"
}
]
}
Input - attributes
In C# class libraries, use the MobileTable attribute.
For information about attribute properties that you can configure, see the following configuration section.
Input - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
MobileTable attribute.
When you're developing locally, app settings go into the local.settings.json file.
IMPORTANT
Don't share the API key with your mobile app clients. It should only be distributed securely to service-side clients, like Azure
Functions. Azure Functions stores your connection information and API keys as app settings so that they are not checked
into your source control repository. This safeguards your sensitive information.
Input - usage
In C# functions, when the record with the specified ID is found, it is passed into the named JObject parameter.
When the record is not found, the parameter value is null.
In JavaScript functions, the record is passed into the context.bindings.<name> object. When the record is not
found, the parameter value is null.
In C# and F# functions, any changes you make to the input record (input parameter) are automatically sent back
to the table when the function exits successfully. You can't modify a record in JavaScript functions.
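For C# class libraries, a minimal input sketch along these lines is possible. The attribute properties mirror the function.json settings shown earlier; the binding and setting names here are illustrative:
[FunctionName("MobileAppsInput")]
public static void Run(
    [QueueTrigger("myqueue-items")] string myQueueItem,
    [MobileTable(TableName = "MyTable", Id = "{queueTrigger}",
        MobileAppUriSetting = "My_MobileApp_Url", ApiKeySetting = "My_MobileApp_Key")] JObject record,
    ILogger log)
{
    // record is null when no row matches the ID from the queue message.
    if (record != null)
    {
        record["Text"] = "Updated from a C# class library function.";
    }
}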
Output
Use the Mobile Apps output binding to write a new record to a Mobile Apps table.
Output - example
See the language-specific example:
C#
C# script (.csx)
JavaScript
Output - C# example
The following example shows a C# function that is triggered by a queue message and creates a record in a mobile
app table.
[FunctionName("MobileAppsOutput")]
[return: MobileTable(ApiKeySetting = "MyMobileAppKey", TableName = "MyTable", MobileAppUriSetting =
"MyMobileAppUri")]
public static object Run(
[QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] string myQueueItem,
TraceWriter log)
{
return new { Text = $"I'm running in a C# function! {myQueueItem}" };
}
{
"bindings": [
{
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "record",
"type": "mobileTable",
"tableName": "MyTable",
"connection": "My_MobileApp_Url",
"apiKey": "My_MobileApp_Key",
"direction": "out"
}
]
}
{
"bindings": [
{
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "record",
"type": "mobileTable",
"tableName": "MyTable",
"connection": "My_MobileApp_Url",
"apiKey": "My_MobileApp_Key",
"direction": "out"
}
],
"disabled": false
}
context.bindings.record = {
text : "I'm running in a Node function! Data: '" + myQueueItem + "'"
}
context.done();
};
Output - attributes
In C# class libraries, use the MobileTable attribute.
For information about attribute properties that you can configure, see Output - configuration. Here's a
MobileTable attribute example in a method signature:
[FunctionName("MobileAppsOutput")]
[return: MobileTable(ApiKeySetting = "MyMobileAppKey", TableName = "MyTable", MobileAppUriSetting =
"MyMobileAppUri")]
public static object Run(
[QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] string myQueueItem,
TraceWriter log)
{
...
}
When you're developing locally, app settings go into the local.settings.json file.
IMPORTANT
Don't share the API key with your mobile app clients. It should only be distributed securely to service-side clients, like Azure
Functions. Azure Functions stores your connection information and API keys as app settings so that they are not checked
into your source control repository. This safeguards your sensitive information.
Output - usage
In C# script functions, use a named output parameter of type out object to access the output record. In C# class
libraries, the MobileTable attribute can be used with any of the following types:
ICollector<T> or IAsyncCollector<T> , where T is either JObject or any type with a public string Id
property.
out JObject
out T or out T[] , where T is any type with a public string Id property.
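For example, here's a minimal sketch that writes two records per queue message with IAsyncCollector<JObject>; the function and setting names are illustrative:
[FunctionName("MobileAppsCollectorOutput")]
public static async Task Run(
    [QueueTrigger("myqueue-items")] string myQueueItem,
    [MobileTable(TableName = "MyTable",
        MobileAppUriSetting = "MyMobileAppUri", ApiKeySetting = "MyMobileAppKey")]
    IAsyncCollector<JObject> records,
    ILogger log)
{
    // Each AddAsync call creates one new row in MyTable.
    await records.AddAsync(JObject.FromObject(new { Text = $"First record for {myQueueItem}" }));
    await records.AddAsync(JObject.FromObject(new { Text = $"Second record for {myQueueItem}" }));
}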
Next steps
Learn more about Azure functions triggers and bindings
Notification Hubs output binding for Azure
Functions
4/10/2019 • 7 minutes to read • Edit Online
This article explains how to send push notifications by using Azure Notification Hubs bindings in Azure Functions.
Azure Functions supports output bindings for Notification Hubs.
Azure Notification Hubs must be configured for the Platform Notification Service (PNS) you want to use. To
learn how to get push notifications in your client app from Notification Hubs, see Getting started with Notification
Hubs and select your target client platform from the drop-down list near the top of the page.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
IMPORTANT
Google has deprecated Google Cloud Messaging (GCM) in favor of Firebase Cloud Messaging (FCM). This output binding
doesn't support FCM. To send notifications using FCM, use the Firebase API directly in your function or use template
notifications.
Example - template
The notifications you send can be native notifications or template notifications. Native notifications target a
specific client platform as configured in the platform property of the output binding. A template notification can
be used to target multiple platforms.
See the language-specific example:
C# script - out parameter
C# script - asynchronous
C# script - JSON
C# script - library types
F#
JavaScript
C# script template example - out parameter
This example sends a notification for a template registration that contains a message placeholder in the template.
using System;
using System.Threading.Tasks;
using System.Collections.Generic;
public static void Run(string myQueueItem, out IDictionary<string, string> notification, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
notification = GetTemplateProperties(myQueueItem);
}
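The GetTemplateProperties helper isn't shown above; a minimal implementation consistent with the call site simply maps the queue payload to the template's message placeholder:
private static IDictionary<string, string> GetTemplateProperties(string message)
{
    var templateProperties = new Dictionary<string, string>();
    templateProperties["message"] = message;
    return templateProperties;
}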
C# script template example - JSON
This example sends a notification for a template registration by passing the template properties as a JSON string:
using System;
using System.Threading.Tasks;
using System.Collections.Generic;
public static void Run(string myQueueItem, out string notification, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
notification = "{\"message\":\"Hello from C#. Processed a queue item!\"}";
}
#r "Microsoft.Azure.NotificationHubs"
using System;
using System.Threading.Tasks;
using Microsoft.Azure.NotificationHubs;
public static void Run(string myQueueItem, out Notification notification, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
notification = GetTemplateNotification(myQueueItem);
}
F# template example
This example sends a notification for a template registration that contains location and message .
JavaScript template example
This example sends a notification for a template registration that contains location and message.
module.exports = function (context, myTimer) {
    var timeStamp = new Date().toISOString();

    if (myTimer.IsPastDue)
    {
        context.log('Node.js is running late!');
    }
    context.log('Node.js timer trigger function ran!', timeStamp);

    context.bindings.notification = {
        location: "Redmond",
        message: "Hello from Node!"
    };
    context.done();
};
C# script APNS native notification example
#r "Microsoft.Azure.NotificationHubs"
#r "Newtonsoft.Json"

using System;
using System.Threading.Tasks;
using Microsoft.Azure.NotificationHubs;
using Newtonsoft.Json;

public static async Task Run(string myQueueItem, IAsyncCollector<Notification> notification, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
    // In this example the queue item is a new user to be processed in the form of a JSON string with
    // a "name" value.
    //
    // The JSON format for a native APNS notification is ...
    // { "aps": { "alert": "notification message" }}

    log.Info($"Sending APNS notification of a new user");
    dynamic user = JsonConvert.DeserializeObject(myQueueItem);
    string apnsNotificationPayload = "{\"aps\": {\"alert\": \"A new user wants to be notified: " +
                                     user.name + "\" }}";
    log.Info($"{apnsNotificationPayload}");
    await notification.AddAsync(new AppleNotification(apnsNotificationPayload));
}
C# script WNS native notification example
#r "Microsoft.Azure.NotificationHubs"
#r "Newtonsoft.Json"

using System;
using System.Threading.Tasks;
using Microsoft.Azure.NotificationHubs;
using Newtonsoft.Json;

public static async Task Run(string myQueueItem, IAsyncCollector<Notification> notification, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
    // In this example the queue item is a new user to be processed in the form of a JSON string with
    // a "name" value.
    //
    // The XML format for a native WNS toast notification is ...
    // <?xml version="1.0" encoding="utf-8"?>
    // <toast>
    //   <visual>
    //     <binding template="ToastText01">
    //       <text id="1">notification message</text>
    //     </binding>
    //   </visual>
    // </toast>

    log.Info($"Sending WNS toast notification of a new user");
    dynamic user = JsonConvert.DeserializeObject(myQueueItem);
    string wnsNotificationPayload = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
                                    "<toast><visual><binding template=\"ToastText01\">" +
                                    "<text id=\"1\">A new user wants to be notified: " + user.name + "</text>" +
                                    "</binding></visual></toast>";

    log.Info($"{wnsNotificationPayload}");
    await notification.AddAsync(new WindowsNotification(wnsNotificationPayload));
}
Attributes
In C# class libraries, use the NotificationHub attribute.
The attribute's constructor parameters and properties are described in the configuration section.
Configuration
The following table explains the binding configuration properties that you set in the function.json file and the
NotificationHub attribute:
When you're developing locally, app settings go into the local.settings.json file.
function.json file example
Here's an example of a Notification Hubs binding in a function.json file.
{
"bindings": [
{
"type": "notificationHub",
"direction": "out",
"name": "notification",
"tagExpression": "",
"hubName": "my-notification-hub",
"connection": "MyHubConnectionString",
"platform": "apns"
}
],
"disabled": false
}
1. In the Azure portal, open your notification hub, select Access Policies, and copy the
DefaultFullSharedAccessSignature connection string.
2. Navigate to your function app in the Azure portal, choose Application settings, add a key such as
MyHubConnectionString, paste the copied DefaultFullSharedAccessSignature for your notification hub as
the value, and then click Save.
The name of this application setting is what goes in the output binding connection setting in function.json or the
.NET attribute. See the Configuration section earlier in this article.
When you're developing locally, app settings go into the local.settings.json file.
Next steps
Learn more about Azure functions triggers and bindings
Azure Queue storage bindings for Azure Functions
5/22/2019 • 15 minutes to read • Edit Online
This article explains how to work with Azure Queue storage bindings in Azure Functions. Azure Functions
supports trigger and output bindings for queues.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
To add support in Functions 2.x, local development in C# script, JavaScript, F#, Java, and Python requires
registering the extension.
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
Encoding
Functions expects a base64-encoded string. Any adjustments to the encoding type (to prepare data as a
base64-encoded string) need to be implemented in the calling service.
Trigger
Use the queue trigger to start a function when a new item is received on a queue. The queue message is
provided as input to the function.
Trigger - example
See the language-specific example:
C#
C# script (.csx)
JavaScript
Java
Trigger - C# example
The following example shows a C# function that polls the myqueue-items queue and writes a log each time a
queue item is processed.
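A minimal class-library function matching that description (the connection setting name is illustrative):
[FunctionName("QueueTrigger")]
public static void Run(
    [QueueTrigger("myqueue-items", Connection = "MyStorageConnectionAppSetting")] string myQueueItem,
    ILogger log)
{
    log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
}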
Trigger - C# script example
The following example shows a queue trigger binding in a function.json file and C# script code that uses the
binding.
Here's the function.json file:
{
"disabled": false,
"bindings": [
{
"type": "queueTrigger",
"direction": "in",
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"MyStorageConnectionAppSetting"
}
]
}
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Queue;
using System;
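// The script body itself is short; a minimal sketch that binds the queue item
// as a CloudQueueMessage (a plain string parameter also works) and logs it:
public static void Run(CloudQueueMessage myQueueItem, ILogger log)
{
    log.LogInformation($"C# Queue trigger function processed: {myQueueItem.AsString}");
}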
The usage section explains myQueueItem , which is named by the name property in function.json. The message
metadata section explains all of the other variables shown.
Trigger - JavaScript example
The following example shows a queue trigger binding in a function.json file and a JavaScript function that uses
the binding. The function polls the myqueue-items queue and writes a log each time a queue item is processed.
Here's the function.json file:
{
"disabled": false,
"bindings": [
{
"type": "queueTrigger",
"direction": "in",
"name": "myQueueItem",
"queueName": "myqueue-items",
"connection":"MyStorageConnectionAppSetting"
}
]
}
NOTE
The name parameter reflects as context.bindings.<name> in the JavaScript code, and it contains the queue item
payload. This payload is also passed as the second parameter to the function.
The usage section explains myQueueItem , which is named by the name property in function.json. The message
metadata section explains all of the other variables shown.
Trigger - Java example
The following Java example shows a storage queue trigger function that logs the message placed on the
myqueuename queue.
@FunctionName("queueprocessor")
public void run(
@QueueTrigger(name = "msg",
queueName = "myqueuename",
connection = "myconnvarname") String message,
final ExecutionContext context
) {
context.getLogger().info(message);
}
Trigger - attributes
In C# class libraries, use the following attributes to configure a queue trigger:
QueueTriggerAttribute
The attribute's constructor takes the name of the queue to monitor, as shown in the following example:
[FunctionName("QueueTrigger")]
public static void Run(
[QueueTrigger("myqueue-items")] string myQueueItem,
ILogger log)
{
...
}
You can set the Connection property to specify the storage account to use, as shown in the following
example:
[FunctionName("QueueTrigger")]
public static void Run(
[QueueTrigger("myqueue-items", Connection = "StorageConnectionAppSetting")] string myQueueItem,
ILogger log)
{
....
}
[StorageAccount("ClassLevelStorageAppSetting")]
public static class AzureFunctions
{
[FunctionName("QueueTrigger")]
[StorageAccount("FunctionLevelStorageAppSetting")]
public static void Run( //...
{
...
}
Trigger - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
QueueTrigger attribute.
When you're developing locally, app settings go into the local.settings.json file.
Trigger - usage
In C# and C# script, access the message data by using a method parameter such as string paramName . In C#
script, paramName is the value specified in the name property of function.json. You can bind to any of the
following types:
Object - The Functions runtime deserializes a JSON payload into an instance of an arbitrary class defined in
your code.
string
byte[]
CloudQueueMessage
If you try to bind to CloudQueueMessage and get an error message, make sure that you have a reference to the
correct Storage SDK version.
In JavaScript, use context.bindings.<name> to access the queue item payload. If the payload is JSON, it's
deserialized into an object.
Trigger - concurrency
When there are multiple queue messages waiting, the queue trigger retrieves a batch of messages and invokes
function instances concurrently to process them. By default, the batch size is 16. When the number being
processed gets down to 8, the runtime gets another batch and starts processing those messages. So the
maximum number of concurrent messages being processed per function on one virtual machine (VM ) is 24.
This limit applies separately to each queue-triggered function on each VM. If your function app scales out to
multiple VMs, each VM will wait for triggers and attempt to run functions. For example, if a function app scales
out to 3 VMs, the default maximum number of concurrent instances of one queue-triggered function is 72.
The batch size and the threshold for getting a new batch are configurable in the host.json file. If you want to
minimize parallel execution for queue-triggered functions in a function app, you can set the batch size to 1. This
setting eliminates concurrency only so long as your function app runs on a single virtual machine (VM ).
The queue trigger automatically prevents a function from processing a queue message multiple times;
functions do not have to be written to be idempotent.
Output
Use the Azure Queue storage output binding to write messages to a queue.
Output - example
See the language-specific example:
C#
C# script (.csx)
JavaScript
Java
Output - C# example
The following example shows a C# function that creates a queue message for each HTTP request received.
[StorageAccount("AzureWebJobsStorage")]
public static class QueueFunctions
{
[FunctionName("QueueOutput")]
[return: Queue("myqueue-items")]
public static string QueueOutput([HttpTrigger] dynamic input, ILogger log)
{
log.LogInformation($"C# function processed: {input.Text}");
return input.Text;
}
}
{
"bindings": [
{
"type": "httpTrigger",
"direction": "in",
"authLevel": "function",
"name": "input"
},
{
"type": "http",
"direction": "out",
"name": "return"
},
{
"type": "queue",
"direction": "out",
"name": "$return",
"queueName": "outqueue",
"connection": "MyStorageConnectionAppSetting"
}
]
}
You can send multiple messages at once by using an ICollector or IAsyncCollector parameter. Here's C#
script code that sends multiple messages, one with the HTTP request data and one with hard-coded values:
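// A sketch of such a script; the output binding would be declared in
// function.json with the name myQueueItems (an illustrative name).
public static void Run(string input, ICollector<string> myQueueItems, ILogger log)
{
    myQueueItems.Add("Message with the HTTP request data: " + input);
    myQueueItems.Add("Hard-coded message");
}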
Output - JavaScript example
In JavaScript, assign a string, an object, or an array of messages to the output binding. The following function
sends two messages at once:
module.exports = function(context) {
context.bindings.myQueueItem = ["message 1","message 2"];
context.done();
};
@FunctionName("httpToQueue")
@QueueOutput(name = "item", queueName = "myqueue-items", connection = "AzureWebJobsStorage")
public String pushToQueue(
@HttpTrigger(name = "request", methods = {HttpMethod.POST}, authLevel = AuthorizationLevel.ANONYMOUS)
final String message,
@HttpOutput(name = "response") final OutputBinding<String> result) {
result.setValue(message + " has been added.");
return message;
}
In the Java functions runtime library, use the @QueueOutput annotation on parameters whose value would be
written to Queue storage. The parameter type should be OutputBinding<T>, where T is any native Java type or a
POJO.
Output - attributes
In C# class libraries, use the QueueAttribute.
The attribute applies to an out parameter or the return value of the function. The attribute's constructor takes
the name of the queue, as shown in the following example:
[FunctionName("QueueOutput")]
[return: Queue("myqueue-items")]
public static string Run([HttpTrigger] dynamic input, ILogger log)
{
...
}
You can set the Connection property to specify the storage account to use, as shown in the following example:
[FunctionName("QueueOutput")]
[return: Queue("myqueue-items", Connection = "StorageConnectionAppSetting")]
public static string Run([HttpTrigger] dynamic input, ILogger log)
{
...
}
Output - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
Queue attribute.
When you're developing locally, app settings go into the local.settings.json file.
Output - usage
In C# and C# script, write a single queue message by using a method parameter such as out T paramName . In
C# script, paramName is the value specified in the name property of function.json. You can use the method
return type instead of an out parameter, and T can be any of the following types:
An object serializable as JSON
string
byte[]
CloudQueueMessage
If you try to bind to CloudQueueMessage and get an error message, make sure that you have a reference to the
correct Storage SDK version.
In C# and C# script, write multiple queue messages by using one of the following types:
ICollector<T> or IAsyncCollector<T>
CloudQueue
In JavaScript functions, use context.bindings.<name> to access the output queue message. You can use a string
or a JSON-serializable object for the queue item payload.
host.json settings
This section describes the global configuration settings available for this binding in version 2.x. The example
host.json file below contains only the version 2.x settings for this binding. For more information about global
configuration settings in version 2.x, see host.json reference for Azure Functions version 2.x.
NOTE
For a reference of host.json in Functions 1.x, see host.json reference for Azure Functions 1.x.
{
"version": "2.0",
"extensions": {
"queues": {
"maxPollingInterval": "00:00:02",
"visibilityTimeout" : "00:00:30",
"batchSize": 16,
"maxDequeueCount": 5,
"newBatchThreshold": 8
}
}
}
Next steps
Learn more about Azure functions triggers and bindings
Go to a tutorial that uses a Queue storage output binding
Azure Functions SendGrid bindings
2/25/2019 • 5 minutes to read • Edit Online
This article explains how to send email by using SendGrid bindings in Azure Functions. Azure Functions supports
an output binding for SendGrid.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
To add support in Functions 2.x, local development in C# script, JavaScript, F#, Java, and Python requires
registering the extension.
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
Example
See the language-specific example:
C#
C# script (.csx)
JavaScript
Java
C# example
The following example shows a C# function that uses a Service Bus queue trigger and a SendGrid output binding.
Synchronous C# example:
[FunctionName("SendEmail")]
public static void Run(
[ServiceBusTrigger("myqueue", Connection = "ServiceBusConnection")] Message email,
[SendGrid(ApiKey = "CustomSendGridKeyAppSettingName")] out SendGridMessage message)
{
var emailObject = JsonConvert.DeserializeObject<OutgoingEmail>(Encoding.UTF8.GetString(email.Body));
Asynchronous C# example:
[FunctionName("SendEmail")]
public static async void Run(
[ServiceBusTrigger("myqueue", Connection = "ServiceBusConnection")] Message email,
[SendGrid(ApiKey = "CustomSendGridKeyAppSettingName")] IAsyncCollector<SendGridMessage> messageCollector)
{
var emailObject = JsonConvert.DeserializeObject<OutgoingEmail>(Encoding.UTF8.GetString(email.Body));
await messageCollector.AddAsync(message);
}
You can omit setting the attribute's ApiKey property if you have your API key in an app setting named
"AzureWebJobsSendGridApiKey".
C# script example
The following example shows a SendGrid output binding in a function.json file and a C# script function that uses
the binding.
Here's the binding data in the function.json file:
{
"bindings": [
{
"type": "queueTrigger",
"name": "mymsg",
"queueName": "myqueue",
"connection": "AzureWebJobsStorage",
"direction": "in"
},
{
"type": "sendGrid",
"name": "$return",
"direction": "out",
"apiKey": "SendGridAPIKeyAsAppSetting",
"from": "{FromEmail}",
"to": "{ToEmail}"
}
]
}
#r "SendGrid"
using System;
using SendGrid.Helpers.Mail;
using Microsoft.Azure.WebJobs.Host;
message.AddContent("text/plain", $"{mymsg.Content}");
return message;
}
public class Message
{
public string ToEmail { get; set; }
public string FromEmail { get; set; }
public string Subject { get; set; }
public string Content { get; set; }
}
Java example
The following example uses the @SendGridOutput annotation from the Java functions runtime library to send an
email using the SendGrid output binding.
@FunctionName("SendEmail")
public HttpResponseMessage run(
@HttpTrigger(name = "req", methods = {HttpMethod.GET, HttpMethod.POST}, authLevel =
AuthorizationLevel.FUNCTION) HttpRequestMessage<Optional<String>> request,
@SendGridOutput(
name = "email", dataType = "String", apiKey = "SendGridConnection", to = "test@example.com",
from = "test@example.com",
subject= "Sending with SendGrid", text = "Hello from Azure Functions"
) OutputBinding<String> email
)
{
String name = request.getBody().orElse("World");
email.setValue(emailBody);
return request.createResponseBuilder(HttpStatus.OK).body("Hello, " + name).build();
}
JavaScript example
The following example shows a SendGrid output binding in a function.json file and a JavaScript function that uses
the binding.
Here's the binding data in the function.json file:
{
"bindings": [
{
"name": "$return",
"type": "sendGrid",
"direction": "out",
"apiKey" : "MySendGridKey",
"to": "{ToEmail}",
"from": "{FromEmail}",
"subject": "SendGrid output bindings"
}
]
}
module.exports = function (context, input) {
    // A minimal sketch: the message object supplies the email content, while
    // to, from, and subject come from the function.json binding above.
    var message = {
        content: [{
            type: 'text/plain',
            value: input
        }]
    };
    context.done(null, message);
};
Attributes
In C# class libraries, use the SendGrid attribute.
For information about attribute properties that you can configure, see Configuration. Here's a SendGrid attribute
example in a method signature:
[FunctionName("SendEmail")]
public static void Run(
[ServiceBusTrigger("myqueue", Connection = "ServiceBusConnection")] OutgoingEmail email,
[SendGrid(ApiKey = "CustomSendGridKeyAppSettingName")] out SendGridMessage message)
{
...
}
Configuration
The following table explains the binding configuration properties that you set in the function.json file and the
SendGrid attribute.
When you're developing locally, app settings go into the local.settings.json file.
host.json settings
This section describes the global configuration settings available for this binding in version 2.x. The example
host.json file below contains only the version 2.x settings for this binding. For more information about global
configuration settings in version 2.x, see host.json reference for Azure Functions version 2.x.
NOTE
For a reference of host.json in Functions 1.x, see host.json reference for Azure Functions 1.x.
{
"version": "2.0",
"extensions": {
"sendGrid": {
"from": "Azure Functions <samples@functions.com>"
}
}
}
Next steps
Learn more about Azure functions triggers and bindings
Azure Service Bus bindings for Azure Functions
4/11/2019 • 17 minutes to read • Edit Online
This article explains how to work with Azure Service Bus bindings in Azure Functions. Azure Functions
supports trigger and output bindings for Service Bus queues and topics.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
To add support in Functions 2.x, local development in C# script, JavaScript, F#, Java, and Python requires
registering the extension.
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
Trigger
Use the Service Bus trigger to respond to messages from a Service Bus queue or topic.
Trigger - example
See the language-specific example:
C#
C# script (.csx)
F#
Java
JavaScript
Trigger - C# example
The following example shows a C# function that reads message metadata and logs a Service Bus queue
message:
[FunctionName("ServiceBusQueueTriggerCSharp")]
public static void Run(
[ServiceBusTrigger("myqueue", AccessRights.Manage, Connection = "ServiceBusConnection")]
string myQueueItem,
Int32 deliveryCount,
DateTime enqueuedTimeUtc,
string messageId,
ILogger log)
{
log.LogInformation($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
log.LogInformation($"EnqueuedTimeUtc={enqueuedTimeUtc}");
log.LogInformation($"DeliveryCount={deliveryCount}");
log.LogInformation($"MessageId={messageId}");
}
This example is for Azure Functions version 1.x; to make it work for 2.x, omit the AccessRights.Manage
parameter from the ServiceBusTrigger attribute. (The example already uses ILogger and log.LogInformation,
which are what version 2.x requires; in version 1.x you could use TraceWriter and log.Info instead.)
Trigger - C# script example
The following example shows a Service Bus trigger binding in a function.json file and a C# script function that
uses the binding. The function reads message metadata and logs a Service Bus queue message.
Here's the binding data in the function.json file:
{
"bindings": [
{
"queueName": "testqueue",
"connection": "MyServiceBusConnection",
"name": "myQueueItem",
"type": "serviceBusTrigger",
"direction": "in"
}
],
"disabled": false
}
log.Info($"EnqueuedTimeUtc={enqueuedTimeUtc}");
log.Info($"DeliveryCount={deliveryCount}");
log.Info($"MessageId={messageId}");
}
Trigger - F# example
The following example shows a Service Bus trigger binding in a function.json file and an F# function that uses
the binding. The function logs a Service Bus queue message.
Here's the binding data in the function.json file:
{
"bindings": [
{
"queueName": "testqueue",
"connection": "MyServiceBusConnection",
"name": "myQueueItem",
"type": "serviceBusTrigger",
"direction": "in"
}
],
"disabled": false
}
@FunctionName("sbprocessor")
public void serviceBusProcess(
@ServiceBusQueueTrigger(name = "msg",
queueName = "myqueuename",
connection = "myconnvarname") String message,
final ExecutionContext context
) {
context.getLogger().info(message);
}
Java functions can also be triggered when a message is added to a Service Bus topic. The following example
uses the @ServiceBusTopicTrigger annotation to describe the trigger configuration.
@FunctionName("sbtopicprocessor")
public void run(
@ServiceBusTopicTrigger(
name = "message",
topicName = "mytopicname",
subscriptionName = "mysubscription",
connection = "ServiceBusConnection"
) String message,
final ExecutionContext context
) {
context.getLogger().info(message);
}
Trigger - JavaScript example
Here's the binding data in the function.json file:
{
"bindings": [
{
"queueName": "testqueue",
"connection": "MyServiceBusConnection",
"name": "myQueueItem",
"type": "serviceBusTrigger",
"direction": "in"
}
],
"disabled": false
}
Trigger - attributes
In C# class libraries, use the following attributes to configure a Service Bus trigger:
ServiceBusTriggerAttribute
The attribute's constructor takes the name of the queue or the topic and subscription. In Azure Functions
version 1.x, you can also specify the connection's access rights. If you don't specify access rights, the
default is Manage . For more information, see the Trigger - configuration section.
Here's an example that shows the attribute used with a string parameter:
[FunctionName("ServiceBusQueueTriggerCSharp")]
public static void Run(
[ServiceBusTrigger("myqueue")] string myQueueItem, ILogger log)
{
...
}
You can set the Connection property to specify the Service Bus account to use, as shown in the following
example:
[FunctionName("ServiceBusQueueTriggerCSharp")]
public static void Run(
[ServiceBusTrigger("myqueue", Connection = "ServiceBusConnection")]
string myQueueItem, ILogger log)
{
...
}
[ServiceBusAccount("ClassLevelServiceBusAppSetting")]
public static class AzureFunctions
{
[ServiceBusAccount("MethodLevelServiceBusAppSetting")]
[FunctionName("ServiceBusQueueTriggerCSharp")]
public static void Run(
[ServiceBusTrigger("myqueue", AccessRights.Manage)]
string myQueueItem, ILogger log)
{
...
}
Trigger - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
ServiceBusTrigger attribute.
When you're developing locally, app settings go into the local.settings.json file.
Trigger - usage
In C# and C# script, you can use the following parameter types for the queue or topic message:
string - If the message is text.
byte[] - Useful for binary data.
A custom type - If the message contains JSON, Azure Functions tries to deserialize the JSON data.
BrokeredMessage - Gives you the deserialized message with the BrokeredMessage.GetBody() method.
These parameters are for Azure Functions version 1.x; for 2.x, use Message instead of BrokeredMessage .
In JavaScript, access the queue or topic message by using context.bindings.<name from function.json> . The
Service Bus message is passed into the function as either a string or JSON object.
NOTE
Currently, the trigger works only with queues and subscriptions that don't use sessions. Track the feature
request item for updates on session support.
See code examples that use these properties earlier in this article.
Here's an example of host.json settings that control Service Bus trigger behavior:
{
"serviceBus": {
"maxConcurrentCalls": 16,
"prefetchCount": 100,
"maxAutoRenewDuration": "00:05:00"
}
}
Output
Use Azure Service Bus output binding to send queue or topic messages.
Output - example
See the language-specific example:
C#
C# script (.csx)
F#
Java
JavaScript
Output - C# example
The following example shows a C# function that sends a Service Bus queue message:
[FunctionName("ServiceBusOutput")]
[return: ServiceBus("myqueue", Connection = "ServiceBusConnection")]
public static string ServiceBusOutput([HttpTrigger] dynamic input, ILogger log)
{
log.LogInformation($"C# function processed: {input.Text}");
return input.Text;
}
Output - C# script example
Here's C# script code that uses a timer trigger to send a single queue message through an out parameter:
public static void Run(TimerInfo myTimer, ILogger log, out string outputSbQueue)
{
string message = $"Service Bus queue message created at: {DateTime.Now}";
log.LogInformation(message);
outputSbQueue = message;
}
Output - F# example
The following example shows a Service Bus output binding in a function.json file and an F# script function that
uses the binding. The function uses a timer trigger to send a queue message every 15 seconds.
Here's the binding data in the function.json file:
{
"bindings": [
{
"schedule": "0/15 * * * * *",
"name": "myTimer",
"runsOnStartup": true,
"type": "timerTrigger",
"direction": "in"
},
{
"name": "outputSbQueue",
"type": "serviceBus",
"queueName": "testqueue",
"connection": "MyServiceBusConnection",
"direction": "out"
}
],
"disabled": false
}
@FunctionName("httpToServiceBusQueue")
@ServiceBusQueueOutput(name = "message", queueName = "myqueue", connection = "AzureServiceBusConnection")
public String pushToQueue(
@HttpTrigger(name = "request", methods = {HttpMethod.POST}, authLevel = AuthorizationLevel.ANONYMOUS)
final String message,
@HttpOutput(name = "response") final OutputBinding<T> result ) {
result.setValue(message + " has been sent.");
return message;
}
In the Java functions runtime library, use the @ServiceBusQueueOutput annotation on function parameters whose
value would be written to a Service Bus queue. The parameter type should be OutputBinding<T>, where T is any
native Java type or a POJO.
Java functions can also write to a Service Bus topic. The following example uses the @ServiceBusTopicOutput
annotation to describe the configuration for the output binding.
@FunctionName("sbtopicsend")
public HttpResponseMessage run(
@HttpTrigger(name = "req", methods = {HttpMethod.GET, HttpMethod.POST}, authLevel =
AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Optional<String>> request,
@ServiceBusTopicOutput(name = "message", topicName = "mytopicname", subscriptionName =
"mysubscription", connection = "ServiceBusConnection") OutputBinding<String> message,
        final ExecutionContext context) {

    String name = request.getBody().orElse("Azure Functions");
    message.setValue(name);

    return request.createResponseBuilder(HttpStatus.OK).body("Hello, " + name).build();
}
Output - JavaScript example
Here's the binding data in the function.json file:
{
"bindings": [
{
"schedule": "0/15 * * * * *",
"name": "myTimer",
"runsOnStartup": true,
"type": "timerTrigger",
"direction": "in"
},
{
"name": "outputSbQueue",
"type": "serviceBus",
"queueName": "testqueue",
"connection": "MyServiceBusConnection",
"direction": "out"
}
],
"disabled": false
}
[FunctionName("ServiceBusOutput")]
[return: ServiceBus("myqueue")]
public static string Run([HttpTrigger] dynamic input, ILogger log)
{
...
}
You can set the Connection property to specify the Service Bus account to use, as shown in the following
example:
[FunctionName("ServiceBusOutput")]
[return: ServiceBus("myqueue", Connection = "ServiceBusConnection")]
public static string Run([HttpTrigger] dynamic input, ILogger log)
{
...
}
Output - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
ServiceBus attribute.
When you're developing locally, app settings go into the local.settings.json file.
Output - usage
In Azure Functions 1.x, the runtime creates the queue if it doesn't exist and you have set accessRights to
manage . In Functions version 2.x, the queue or topic must already exist; if you specify a queue or topic that
doesn't exist, the function will fail.
In C# and C# script, you can use the following parameter types for the output binding:
out T paramName - T can be any JSON -serializable type. If the parameter value is null when the function
exits, Functions creates the message with a null object.
out string - If the parameter value is null when the function exits, Functions does not create a message.
out byte[] - If the parameter value is null when the function exits, Functions does not create a message.
out BrokeredMessage - If the parameter value is null when the function exits, Functions does not create a
message.
ICollector<T> or IAsyncCollector<T> - For creating multiple messages. A message is created when you call
the Add method.
In async functions, use the return value or IAsyncCollector instead of an out parameter.
These parameters are for Azure Functions version 1.x; for 2.x, use Message instead of BrokeredMessage .
In JavaScript, access the queue or topic by using context.bindings.<name from function.json> . You can assign a
string, a byte array, or a JavaScript object (serialized into JSON) to context.bindings.<name> .
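For example, here's a minimal sketch that sends two queue messages per invocation with IAsyncCollector; the binding names are illustrative:
[FunctionName("ServiceBusMultipleOutput")]
public static async Task Run(
    [HttpTrigger] dynamic input,
    [ServiceBus("myqueue", Connection = "ServiceBusConnection")] IAsyncCollector<string> outputMessages,
    ILogger log)
{
    // Each AddAsync call creates one Service Bus message.
    await outputMessages.AddAsync($"Request data: {input.Text}");
    await outputMessages.AddAsync("Hard-coded message");
}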
host.json settings
This section describes the global configuration settings available for this binding in version 2.x. The example
host.json file below contains only the version 2.x settings for this binding. For more information about global
configuration settings in version 2.x, see host.json reference for Azure Functions version 2.x.
NOTE
For a reference of host.json in Functions 1.x, see host.json reference for Azure Functions 1.x.
{
"version": "2.0",
"extensions": {
"serviceBus": {
"prefetchCount": 100,
"messageHandlerOptions": {
"autoComplete": false,
"maxConcurrentCalls": 32,
"maxAutoRenewDuration": "00:55:00"
}
}
}
}
Next steps
Learn more about Azure functions triggers and bindings
SignalR Service bindings for Azure Functions
3/25/2019 • 12 minutes to read • Edit Online
This article explains how to authenticate and send real-time messages to clients connected to Azure SignalR
Service by using SignalR Service bindings in Azure Functions. Azure Functions supports input and output
bindings for SignalR Service.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
Java annotations
To use the SignalR Service annotations in Java functions, you need to add a dependency on the
azure-functions-java-library-signalr artifact (version 1.0 or higher) to your pom.xml.
<dependency>
<groupId>com.microsoft.azure.functions</groupId>
<artifactId>azure-functions-java-library-signalr</artifactId>
<version>1.0.0</version>
</dependency>
NOTE
To use the SignalR Service bindings in Java, make sure you are using version 2.4.419 or higher of the Azure Functions Core
Tools (host version 2.0.12332).
Using SignalR Service with Azure Functions
For details on how to configure and use SignalR Service and Azure Functions together, refer to Azure Functions
development and configuration with Azure SignalR Service.
[FunctionName("negotiate")]
public static SignalRConnectionInfo Negotiate(
[HttpTrigger(AuthorizationLevel.Anonymous)]HttpRequest req,
[SignalRConnectionInfo(HubName = "chat")]SignalRConnectionInfo connectionInfo)
{
return connectionInfo;
}
Authenticated tokens
If the function is triggered by an authenticated client, you can add a user ID claim to the generated token. You can
easily add authentication to a function app using App Service Authentication.
App Service Authentication sets HTTP headers named x-ms-client-principal-id and x-ms-client-principal-name
that contain the authenticated user's client principal ID and name, respectively. You can set the UserId property
of the binding to the value from either header using a binding expression: {headers.x-ms-client-principal-id} or
{headers.x-ms-client-principal-name} .
[FunctionName("negotiate")]
public static SignalRConnectionInfo Negotiate(
[HttpTrigger(AuthorizationLevel.Anonymous)]HttpRequest req,
[SignalRConnectionInfo
(HubName = "chat", UserId = "{headers.x-ms-client-principal-id}")]
SignalRConnectionInfo connectionInfo)
{
// connectionInfo contains an access key token with a name identifier claim set to the authenticated user
return connectionInfo;
}
{
"type": "signalRConnectionInfo",
"name": "connectionInfo",
"hubName": "chat",
"connectionStringSetting": "<name of setting containing SignalR Service connection string>",
"direction": "in"
}
Authenticated tokens
If the function is triggered by an authenticated client, you can add a user ID claim to the generated token. You can
easily add authentication to a function app using App Service Authentication.
App Service Authentication sets HTTP headers named x-ms-client-principal-id and x-ms-client-principal-name
that contain the authenticated user's client principal ID and name, respectively. You can set the userId property
of the binding to the value from either header using a binding expression: {headers.x-ms-client-principal-id} or
{headers.x-ms-client-principal-name} .
Example function.json:
{
"type": "signalRConnectionInfo",
"name": "connectionInfo",
"hubName": "chat",
"userId": "{headers.x-ms-client-principal-id}",
"connectionStringSetting": "<name of setting containing SignalR Service connection string>",
"direction": "in"
}
Authenticated tokens
If the function is triggered by an authenticated client, you can add a user ID claim to the generated token. You can
easily add authentication to a function app using App Service Authentication.
App Service Authentication sets HTTP headers named x-ms-client-principal-id and x-ms-client-principal-name
that contain the authenticated user's client principal ID and name, respectively. You can set the UserId property
of the binding to the value from either header using a binding expression: {headers.x-ms-client-principal-id} or
{headers.x-ms-client-principal-name} .
@FunctionName("negotiate")
public SignalRConnectionInfo negotiate(
@HttpTrigger(
name = "req",
methods = { HttpMethod.POST },
authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Optional<String>> req,
@SignalRConnectionInfoInput(
name = "connectionInfo",
hubName = "chat",
userId = "{headers.x-ms-client-principal-id}") SignalRConnectionInfo connectionInfo) {
return connectionInfo;
}
Send to a user
You can send a message only to connections that have been authenticated to a user by setting the UserId
property of the SignalR message.
[FunctionName("SendMessage")]
public static Task SendMessage(
[HttpTrigger(AuthorizationLevel.Anonymous, "post")]object message,
[SignalR(HubName = "chat")]IAsyncCollector<SignalRMessage> signalRMessages)
{
return signalRMessages.AddAsync(
new SignalRMessage
{
// the message will only be sent to this user ID
UserId = "userId1",
Target = "newMessage",
Arguments = new [] { message }
});
}
Send to a group
You can send a message only to connections that have been added to a group by setting the GroupName property
of the SignalR message.
[FunctionName("SendMessage")]
public static Task SendMessage(
[HttpTrigger(AuthorizationLevel.Anonymous, "post")]object message,
[SignalR(HubName = "chat")]IAsyncCollector<SignalRMessage> signalRMessages)
{
return signalRMessages.AddAsync(
new SignalRMessage
{
// the message will be sent to the group with this name
GroupName = "myGroup",
Target = "newMessage",
Arguments = new [] { message }
});
}
[FunctionName("removeFromGroup")]
public static Task RemoveFromGroup(
[HttpTrigger(AuthorizationLevel.Anonymous, "post")]HttpRequest req,
string userId,
[SignalR(HubName = "chat")]
IAsyncCollector<SignalRGroupAction> signalRGroupActions)
{
return signalRGroupActions.AddAsync(
new SignalRGroupAction
{
UserId = userId,
GroupName = "myGroup",
Action = GroupAction.Remove
});
}
{
"type": "signalR",
"name": "signalRMessages",
"hubName": "<hub_name>",
"connectionStringSetting": "<name of setting containing SignalR Service connection string>",
"direction": "out"
}
Send to a user
You can send a message only to connections that have been authenticated to a user by setting the userId
property of the SignalR message. function.json stays the same; only the JavaScript code in index.js changes.
Send to a group
You can send a message only to connections that have been added to a group by setting the groupName property
of the SignalR message. function.json stays the same; only the JavaScript code in index.js changes.
@FunctionName("sendMessage")
@SignalROutput(name = "$return", hubName = "chat")
public SignalRMessage sendMessage(
@HttpTrigger(
name = "req",
methods = { HttpMethod.POST },
authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Object> req) {
Send to a user
You can send a message only to connections that have been authenticated to a user by setting the userId
property of the SignalR message.
@FunctionName("sendMessage")
@SignalROutput(name = "$return", hubName = "chat")
public SignalRMessage sendMessage(
@HttpTrigger(
name = "req",
methods = { HttpMethod.POST },
authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Object> req) {
Send to a group
You can send a message only to connections that have been added to a group by setting the groupName property
of the SignalR message.
@FunctionName("sendMessage")
@SignalROutput(name = "$return", hubName = "chat")
public SignalRMessage sendMessage(
@HttpTrigger(
name = "req",
methods = { HttpMethod.POST },
authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Object> req) {
@FunctionName("addToGroup")
@SignalROutput(name = "$return", hubName = "chat")
public SignalRGroupAction addToGroup(
@HttpTrigger(
name = "req",
methods = { HttpMethod.POST },
authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Object> req,
@BindingName("userId") String userId) {
Configuration
SignalRConnectionInfo
The following table explains the binding configuration properties that you set in the function.json file and the
SignalRConnectionInfo attribute.
SignalR
The following table explains the binding configuration properties that you set in the function.json file and the
SignalR attribute.
When you're developing locally, app settings go into the local.settings.json file.
Next steps
Learn more about Azure functions triggers and bindings
Azure Functions development and configuration with Azure SignalR Service
Azure Table storage bindings for Azure Functions
3/6/2019 • 17 minutes to read • Edit Online
This article explains how to work with Azure Table storage bindings in Azure Functions. Azure Functions supports
input and output bindings for Azure Table storage.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
To add support in Functions 2.x, local development in C# script, JavaScript, F#, Java, and Python requires
registering the extension.
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
Input
Use the Azure Table storage input binding to read a table in an Azure Storage account.
Input - example
See the language-specific example:
C# read one entity
C# bind to IQueryable
C# bind to CloudTable
C# script read one entity
C# script bind to IQueryable
C# script bind to CloudTable
F#
JavaScript
Java
Input - C# example - one entity
The following example shows a C# function that reads a single table row.
The row key value "{queueTrigger}" indicates that the row key comes from the queue message string.
[FunctionName("TableInput")]
public static void TableInput(
[QueueTrigger("table-items")] string input,
[Table("MyTable", "MyPartition", "{queueTrigger}")] MyPoco poco,
ILogger log)
{
log.LogInformation($"PK={poco.PartitionKey}, RK={poco.RowKey}, Text={poco.Text}");
}
}
[FunctionName("TableInput")]
public static void TableInput(
[QueueTrigger("table-items")] string input,
[Table("MyTable", "MyPartition")] IQueryable<MyPoco> pocos,
ILogger log)
{
foreach (MyPoco poco in pocos)
{
log.LogInformation($"PK={poco.PartitionKey}, RK={poco.RowKey}, Text={poco.Text}");
}
}
}
Input - C# example - CloudTable
The following example shows a C# function that uses a CloudTable parameter to work with a table directly:
namespace FunctionAppCloudTable2
{
public class LogEntity : TableEntity
{
public string OriginalName { get; set; }
}
public static class CloudTableDemo
{
[FunctionName("CloudTableDemo")]
public static async Task Run(
[TimerTrigger("0 */1 * * * *")] TimerInfo myTimer,
[Table("AzureWebJobsHostLogscommon")] CloudTable cloudTable,
ILogger log)
{
log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
For more information about how to use CloudTable, see Get started with Azure Table storage.
If you try to bind to CloudTable and get an error message, make sure that you have a reference to the correct
Storage SDK version.
Input - C# script example - one entity
The following example shows a table input binding in a function.json file and C# script code that uses the binding.
The function uses a queue trigger to read a single table row.
The function.json file specifies a partitionKey and a rowKey . The rowKey value "{queueTrigger}" indicates that
the row key comes from the queue message string.
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnectionAppSetting",
"name": "myQueueItem",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "personEntity",
"type": "table",
"tableName": "Person",
"partitionKey": "Test",
"rowKey": "{queueTrigger}",
"connection": "MyStorageConnectionAppSetting",
"direction": "in"
}
],
"disabled": false
}
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage.Table;
using Microsoft.Extensions.Logging;
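// A minimal script body for the read-one-entity case, assuming a Person
// entity with a Text column as described above:
public class Person : TableEntity
{
    public string Text { get; set; }
}

public static void Run(string myQueueItem, Person personEntity, ILogger log)
{
    log.LogInformation($"PK={personEntity.PartitionKey}, RK={personEntity.RowKey}, Text={personEntity.Text}");
}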
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage.Table;
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
public static async Task Run(TimerInfo myTimer, CloudTable cloudTable, ILogger log)
{
log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
For more information about how to use CloudTable, see Get started with Azure Table storage.
If you try to bind to CloudTable and get an error message, make sure that you have a reference to the correct
Storage SDK version.
Input - F# example
The following example shows a table input binding in a function.json file and F# script code that uses the binding.
The function uses a queue trigger to read a single table row.
The function.json file specifies a partitionKey and a rowKey . The rowKey value "{queueTrigger}" indicates that
the row key comes from the queue message string.
{
"bindings": [
{
"queueName": "myqueue-items",
"connection": "MyStorageConnectionAppSetting",
"name": "myQueueItem",
"type": "queueTrigger",
"direction": "in"
},
{
"name": "personEntity",
"type": "table",
"tableName": "Person",
"partitionKey": "Test",
"rowKey": "{queueTrigger}",
"connection": "MyStorageConnectionAppSetting",
"direction": "in"
}
],
"disabled": false
}
[<CLIMutable>]
type Person = {
PartitionKey: string
RowKey: string
Name: string
}
@FunctionName("getallcount")
public int run(
@HttpTrigger(name = "req",
methods = {HttpMethod.GET},
authLevel = AuthorizationLevel.ANONYMOUS) Object dummyShouldNotBeUsed,
@TableInput(name = "items",
tableName = "mytablename", partitionKey = "myparkey",
connection = "myconnvarname") MyItem[] items
) {
return items.length;
}
Input - attributes
In C# class libraries, use the following attributes to configure a table input binding:
TableAttribute
The attribute's constructor takes the table name, partition key, and row key. It can be used on an out
parameter or on the return value of the function, as shown in the following example:
[FunctionName("TableInput")]
public static void Run(
[QueueTrigger("table-items")] string input,
[Table("MyTable", "Http", "{queueTrigger}")] MyPoco poco,
ILogger log)
{
...
}
You can set the Connection property to specify the storage account to use, as shown in the following
example:
[FunctionName("TableInput")]
public static void Run(
[QueueTrigger("table-items")] string input,
[Table("MyTable", "Http", "{queueTrigger}", Connection = "StorageConnectionAppSetting")] MyPoco
poco,
ILogger log)
{
...
}
[StorageAccount("ClassLevelStorageAppSetting")]
public static class AzureFunctions
{
[FunctionName("TableInput")]
[StorageAccount("FunctionLevelStorageAppSetting")]
public static void Run( //...
{
...
}
Input - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
Table attribute.
When you're developing locally, app settings go into the local.settings.json file.
Input - usage
The Table storage input binding supports the following scenarios:
Read one row in C# or C# script
Set partitionKey and rowKey . Access the table data by using a method parameter T <paramName> . In C#
script, paramName is the value specified in the name property of function.json. T is typically a type that
implements ITableEntity or derives from TableEntity . The filter and take properties are not used in
this scenario.
Read one or more rows in C# or C# script
Access the table data by using a method parameter IQueryable<T> <paramName> . In C# script, paramName is
the value specified in the name property of function.json. T must be a type that implements ITableEntity
or derives from TableEntity . You can use IQueryable methods to do any filtering required. The
partitionKey , rowKey , filter , and take properties are not used in this scenario.
NOTE
IQueryable isn't supported in the Functions v2 runtime. An alternative is to use a CloudTable paramName method
parameter to read the table by using the Azure Storage SDK. If you try to bind to CloudTable and get an error
message, make sure that you have a reference to the correct Storage SDK version.
Output
Use an Azure Table storage output binding to write entities to a table in an Azure Storage account.
NOTE
This output binding does not support updating existing entities. Use the TableOperation.Replace operation from the
Azure Storage SDK to update an existing entity.
Output - example
See the language-specific example:
C#
C# script (.csx)
F#
JavaScript
Output - C# example
The following example shows a C# function that uses an HTTP trigger to write a single table row.
public class TableStorage
{
public class MyPoco
{
public string PartitionKey { get; set; }
public string RowKey { get; set; }
public string Text { get; set; }
}
[FunctionName("TableOutput")]
[return: Table("MyTable")]
public static MyPoco TableOutput([HttpTrigger] dynamic input, ILogger log)
{
log.LogInformation($"C# http trigger function processed: {input.Text}");
return new MyPoco { PartitionKey = "Http", RowKey = Guid.NewGuid().ToString(), Text = input.Text };
}
}
Output - C# script example
The following example shows a table output binding in a function.json file; the C# script function writes
multiple table entities.
Here's the function.json file:
{
"bindings": [
{
"name": "input",
"type": "manualTrigger",
"direction": "in"
},
{
"tableName": "Person",
"connection": "MyStorageConnectionAppSetting",
"name": "tableBinding",
"type": "table",
"direction": "out"
}
],
"disabled": false
}
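The C# script code that goes with this binding writes entities through an ICollector; a minimal sketch, assuming a Person table like the one in the input examples:
public static void Run(string input, ICollector<Person> tableBinding, ILogger log)
{
    for (int i = 1; i < 10; i++)
    {
        log.LogInformation($"Adding Person entity {i}");
        tableBinding.Add(new Person()
        {
            PartitionKey = "Test",
            RowKey = i.ToString(),
            Name = "Name" + i.ToString()
        });
    }
}

public class Person
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public string Name { get; set; }
}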
Output - F# example
The following example shows a table output binding in a function.json file and F# script code that uses the
binding. The function writes multiple table entities.
Here's the function.json file:
{
"bindings": [
{
"name": "input",
"type": "manualTrigger",
"direction": "in"
},
{
"tableName": "Person",
"connection": "MyStorageConnectionAppSetting",
"name": "tableBinding",
"type": "table",
"direction": "out"
}
],
"disabled": false
}
Output - JavaScript example
Here's the function.json file:
{
"bindings": [
{
"name": "input",
"type": "manualTrigger",
"direction": "in"
},
{
"tableName": "Person",
"connection": "MyStorageConnectionAppSetting",
"name": "tableBinding",
"type": "table",
"direction": "out"
}
],
"disabled": false
}
module.exports = function (context) {
    context.bindings.tableBinding = [];

    // Push one entity per row; PartitionKey and RowKey are required.
    for (var i = 1; i < 10; i++) {
        context.bindings.tableBinding.push({
            PartitionKey: "Test",
            RowKey: i.toString(),
            Name: "Name " + i
        });
    }
    context.done();
};
Output - attributes
In C# class libraries, use the TableAttribute.
The attribute's constructor takes the table name. It can be used on an out parameter or on the return value of the
function, as shown in the following example:
[FunctionName("TableOutput")]
[return: Table("MyTable")]
public static MyPoco TableOutput(
[HttpTrigger] dynamic input,
ILogger log)
{
...
}
You can set the Connection property to specify the storage account to use, as shown in the following example:
[FunctionName("TableOutput")]
[return: Table("MyTable", Connection = "StorageConnectionAppSetting")]
public static MyPoco TableOutput(
[HttpTrigger] dynamic input,
ILogger log)
{
...
}
Output - configuration
The following table explains the binding configuration properties that you set in the function.json file and the
Table attribute.
When you're developing locally, app settings go into the local.settings.json file.
Output - usage
The Table storage output binding supports the following scenarios:
Write one row in any language
In C# and C# script, access the output table entity by using a method parameter such as out T paramName
or the function return value. In C# script, paramName is the value specified in the name property of
function.json. T can be any serializable type if the partition key and row key are provided by the
function.json file or the Table attribute. Otherwise, T must be a type that includes PartitionKey and
RowKey properties. In this scenario, T typically implements ITableEntity or derives from TableEntity ,
but it doesn't have to.
Write one or more rows in C# or C# script
In C# and C# script, access the output table entity by using a method parameter ICollector<T> paramName or IAsyncCollector<T> paramName. In C# script, paramName is the value specified in the name property of function.json. T specifies the schema of the entities you want to add. Typically, T derives from TableEntity or implements ITableEntity, but it doesn't have to. The partition key and row key values in function.json or the Table attribute constructor are not used in this scenario. A sketch of this collector pattern appears after this list.
An alternative is to use a CloudTable method parameter to write to the table by using the Azure Storage SDK. If you try to bind to CloudTable and get an error message, make sure that you have a reference to the correct Storage SDK version. For an example of code that binds to CloudTable, see the input binding examples for C# or C# script earlier in this article.
Write one or more rows in JavaScript
In JavaScript functions, access the table output using context.bindings.<name> .
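Here's a minimal sketch of the C# collector pattern mentioned above (the entity type and values are illustrative; it assumes a type derived from TableEntity, so no keys are needed in the attribute):
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Table;

public static class TableOutputMany
{
    public class LogRow : TableEntity
    {
        public string Text { get; set; }
    }

    [FunctionName("TableOutputMany")]
    public static void Run(
        [HttpTrigger] dynamic input,
        [Table("MyTable")] ICollector<LogRow> rows,
        ILogger log)
    {
        // Each Add call queues one entity to be written to the table.
        for (int i = 0; i < 3; i++)
        {
            rows.Add(new LogRow
            {
                PartitionKey = "Http",
                RowKey = Guid.NewGuid().ToString(),
                Text = $"Row {i}"
            });
        }
    }
}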
Next steps
Learn more about Azure functions triggers and bindings
Timer trigger for Azure Functions
5/8/2019 • 9 minutes to read • Edit Online
This article explains how to work with timer triggers in Azure Functions. A timer trigger lets you run a function
on a schedule.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
Example
See the language-specific example:
C#
C# script (.csx)
F#
JavaScript
Java
C# example
The following example shows a C# function that executes each time the minutes have a value divisible by five (for example, if the function starts at 18:57:00, the next execution will be at 19:00:00). The TimerInfo object is passed into the function.
[FunctionName("TimerTriggerCSharp")]
public static void Run([TimerTrigger("0 */5 * * * *")]TimerInfo myTimer, ILogger log)
{
    if (myTimer.IsPastDue)
    {
        log.LogInformation("Timer is running late!");
    }
    log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
}
C# script example
The following example shows a timer trigger binding in a function.json file and a C# script function that uses the
binding. The function writes a log indicating whether this function invocation is due to a missed schedule
occurrence. The TimerInfo object is passed into the function.
Here's the binding data in the function.json file:
{
    "schedule": "0 */5 * * * *",
    "name": "myTimer",
    "type": "timerTrigger",
    "direction": "in"
}
Here's the C# script code:
public static void Run(TimerInfo myTimer, ILogger log)
{
    if (myTimer.IsPastDue)
    {
        log.LogInformation("Timer is running late!");
    }
    log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
}
F# example
The following example shows a timer trigger binding in a function.json file and an F# script function that uses the
binding. The function writes a log indicating whether this function invocation is due to a missed schedule
occurrence. The TimerInfo object is passed into the function.
Here's the binding data in the function.json file:
{
    "schedule": "0 */5 * * * *",
    "name": "myTimer",
    "type": "timerTrigger",
    "direction": "in"
}
Here's the F# script code:
let Run(myTimer: TimerInfo, log: ILogger) =
    if (myTimer.IsPastDue) then
        log.LogInformation("F# function is running late.")
    let now = DateTime.Now.ToLongTimeString()
    log.LogInformation(sprintf "F# function executed at %s!" now)
JavaScript example
The following example shows a timer trigger binding in a function.json file and a JavaScript function that uses the binding. The function writes a log indicating whether this function invocation is due to a missed schedule occurrence.
Here's the binding data in the function.json file:
{
    "schedule": "0 */5 * * * *",
    "name": "myTimer",
    "type": "timerTrigger",
    "direction": "in"
}
Here's the JavaScript code:
module.exports = function (context, myTimer) {
    var timeStamp = new Date().toISOString();
    if (myTimer.IsPastDue)
    {
        context.log('Node is running late!');
    }
    context.log('Node timer trigger function ran!', timeStamp);
    context.done();
};
Java example
The following example function triggers and executes every five minutes. The @TimerTrigger annotation on the
function defines the schedule using the same string format as CRON expressions.
@FunctionName("keepAlive")
public void keepAlive(
@TimerTrigger(name = "keepAliveTrigger", schedule = "0 */5 * * * *") String timerInfo,
ExecutionContext context
) {
// timerInfo is a JSON string; you can deserialize it into an object using your favorite JSON library
context.getLogger().info("Timer is triggered: " + timerInfo);
}
Attributes
In C# class libraries, use the TimerTriggerAttribute.
The attribute's constructor takes a CRON expression or a TimeSpan . You can use TimeSpan only if the function
app is running on an App Service plan. The following example shows a CRON expression:
[FunctionName("TimerTriggerCSharp")]
public static void Run([TimerTrigger("0 */5 * * * *")]TimerInfo myTimer, ILogger log)
{
if (myTimer.IsPastDue)
{
log.LogInformation("Timer is running late!");
}
log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
}
Configuration
The following table explains the binding configuration properties that you set in the function.json file and the
TimerTrigger attribute.
When you're developing locally, app settings go into the local.settings.json file.
Caution
We recommend against setting runOnStartup to true in production. Using this setting makes code execute at
highly unpredictable times. In certain production settings, these extra executions can result in significantly higher
costs for apps hosted in Consumption plans. For example, with runOnStartup enabled the trigger is invoked
whenever your function app is scaled. Make sure you fully understand the production behavior of your functions
before enabling runOnStartup in production.
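For local debugging only, here's a sketch of what enabling it looks like (RunOnStartup is an optional named argument of the TimerTrigger attribute):
[FunctionName("TimerWithRunOnStartup")]
public static void Run(
    // RunOnStartup = true also fires the function whenever the host starts;
    // leave it at the default (false) in production.
    [TimerTrigger("0 */5 * * * *", RunOnStartup = true)] TimerInfo myTimer,
    ILogger log)
{
    log.LogInformation($"Executed at: {DateTime.Now}");
}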
Usage
When a timer trigger function is invoked, a timer object is passed into the function. The following JSON is an
example representation of the timer object.
{
"Schedule":{
},
"ScheduleStatus": {
"Last":"2016-10-04T10:15:00+00:00",
"LastUpdated":"2016-10-04T10:16:00+00:00",
"Next":"2016-10-04T10:20:00+00:00"
},
"IsPastDue":false
}
The IsPastDue property is true when the current function invocation is later than scheduled. For example, a
function app restart might cause an invocation to be missed.
CRON expressions
Azure Functions uses the NCronTab library to interpret CRON expressions. A CRON expression includes six
fields:
{second} {minute} {hour} {day} {month} {day-of-week}
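For example, the illustrative expression "0 30 9 * * 1-5" reads field by field as:
{second}      = 0      at second 0
{minute}      = 30     at minute 30
{hour}        = 9      at hour 9
{day}         = *      on every day of the month
{month}       = *      in every month
{day-of-week} = 1-5    Monday through Friday
That is, the function runs at 9:30:00 AM every weekday.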
An interval value (the / operator) such as "0 */5 * * * *" triggers at hh:00:00, hh:05:00, hh:10:00, and so on through hh:55:00 of every hour (12 times an hour).
To specify months or days you can use numeric values, names, or abbreviations of names:
For days, the numeric values are 0 to 6 where 0 starts with Sunday.
Names are in English. For example: Monday , January .
Names are case-insensitive.
Names can be abbreviated. Three letters is the recommended abbreviation length. For example: Mon , Jan .
CRON examples
Here are some examples of CRON expressions you can use for the timer trigger in Azure Functions.
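A few representative schedules (an illustrative sample):
"0 */5 * * * *"     once every five minutes
"0 0 * * * *"       once at the top of every hour
"0 0 9-17 * * *"    once every hour from 9 AM to 5 PM
"0 30 9 * * *"      at 9:30 AM every day
"0 30 9 * * 1-5"    at 9:30 AM every weekday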
NOTE
You can find CRON expression examples online, but many of them omit the {second} field. If you copy from one of
them, add the missing {second} field. Usually you'll want a zero in that field, not an asterisk.
Or create an app setting for your function app named WEBSITE_TIME_ZONE and set the value to Eastern Standard Time. Then use a CRON expression that states the desired time in that zone's local time, as in the sketch below.
When you use WEBSITE_TIME_ZONE, the time is adjusted for time changes in the specific time zone, such as daylight saving time.
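A sketch (the 10:00 AM time is illustrative): with WEBSITE_TIME_ZONE set to Eastern Standard Time, the following schedule fires at 10:00 AM Eastern every day:
"schedule": "0 0 10 * * *"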
TimeSpan
A TimeSpan can be used only for a function app that runs on an App Service Plan.
Unlike a CRON expression, a TimeSpan value specifies the time interval between each function invocation. When
a function completes after running longer than the specified interval, the timer immediately invokes the function
again.
Expressed as a string, the TimeSpan format is hh:mm:ss when hh is less than 24. When the first two digits are 24 or greater, the format is dd:hh:mm. Here are some examples:
"01:00:00"    every hour
"00:01:00"    every minute
"24:00:00"    every 24 days
Scale-out
If a function app scales out to multiple instances, only a single instance of a timer-triggered function is run across
all instances.
Retry behavior
Unlike the queue trigger, the timer trigger doesn't retry after a function fails. When a function fails, it isn't called
again until the next time on the schedule.
Troubleshooting
For information about what to do when the timer trigger doesn't work as expected, see Investigating and
reporting issues with timer triggered functions not firing.
Next steps
Go to a quickstart that uses a timer trigger
Learn more about Azure functions triggers and bindings
Twilio binding for Azure Functions
12/6/2018 • 9 minutes to read • Edit Online
This article explains how to send text messages by using Twilio bindings in Azure Functions. Azure Functions
supports output bindings for Twilio.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
To add support for this binding when developing locally in Functions 2.x with C# script, JavaScript, F#, Java, or Python, register the extension.
To learn how to update existing binding extensions in the portal without having to republish your function app
project, see Update your extensions.
Example - Functions 1.x
See the language-specific example:
C#
C# script (.csx)
JavaScript
C# example
The following example shows a C# function that sends a text message when triggered by a queue message.
[FunctionName("QueueTwilio")]
[return: TwilioSms(AccountSidSetting = "TwilioAccountSid", AuthTokenSetting = "TwilioAuthToken", From = "+1425XXXXXXX")]
public static SMSMessage Run(
    [QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] JObject order,
    TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {order}");

    var message = new SMSMessage()
    {
        Body = $"Hello {order["name"]}!",
        To = order["mobileNumber"].ToString()
    };

    return message;
}
This example uses the TwilioSms attribute with the method return value. An alternative is to use the attribute with
an out SMSMessage parameter or an ICollector<SMSMessage> or IAsyncCollector<SMSMessage> parameter.
C# script example
The following example shows a Twilio output binding in a function.json file and a C# script function that uses the
binding. The function uses an out parameter to send a text message.
Here's binding data in the function.json file:
{
"type": "twilioSms",
"name": "message",
"accountSid": "TwilioAccountSid",
"authToken": "TwilioAuthToken",
"to": "+1704XXXXXXX",
"from": "+1425XXXXXXX",
"direction": "out",
"body": "Azure Functions Testing"
}
using System;
using Newtonsoft.Json;
using Twilio;
public static void Run(string myQueueItem, out SMSMessage message, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
// In this example the queue item is a JSON string representing an order that contains the name of a
// customer and a mobile number to send text updates to.
dynamic order = JsonConvert.DeserializeObject(myQueueItem);
string msg = "Hello " + order.name + ", thank you for your order.";
// Even if you want to use a hard coded message and number in the binding, you must at least
// initialize the SMSMessage variable.
message = new SMSMessage();
// A dynamic message can be set instead of the body in the output binding. In this example, we use
// the order information to personalize a text message to the mobile number provided for
// order status updates.
message.Body = msg;
message.To = order.mobileNumber;
}
You can't use out parameters in asynchronous code. Here's an asynchronous C# script code example:
#r "Newtonsoft.Json"
#r "Twilio.Api"
using System;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Twilio;
public static async Task Run(string myQueueItem, IAsyncCollector<SMSMessage> message, ILogger log)
{
log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
// In this example the queue item is a JSON string representing an order that contains the name of a
// customer and a mobile number to send text updates to.
dynamic order = JsonConvert.DeserializeObject(myQueueItem);
string msg = "Hello " + order.name + ", thank you for your order.";
// Even if you want to use a hard coded message and number in the binding, you must at least
// initialize the SMSMessage variable.
SMSMessage smsText = new SMSMessage();
// A dynamic message can be set instead of the body in the output binding. In this example, we use
// the order information to personalize a text message to the mobile number provided for
// order status updates.
smsText.Body = msg;
smsText.To = order.mobileNumber;
await message.AddAsync(smsText);
}
JavaScript example
The following example shows a Twilio output binding in a function.json file and a JavaScript function that uses the
binding.
Here's binding data in the function.json file:
{
"type": "twilioSms",
"name": "message",
"accountSid": "TwilioAccountSid",
"authToken": "TwilioAuthToken",
"to": "+1704XXXXXXX",
"from": "+1425XXXXXXX",
"direction": "out",
"body": "Azure Functions Testing"
}
module.exports = function (context, myQueueItem) {
    // In this example the queue item is a JSON string representing an order that contains the name of a
    // customer and a mobile number to send text updates to.
    var msg = "Hello " + myQueueItem.name + ", thank you for your order.";
    // Even if you want to use a hard coded message and number in the binding, you must at least
    // initialize the message binding.
    context.bindings.message = {};
    // A dynamic message can be set instead of the body in the output binding. In this example, we use
    // the order information to personalize a text message to the mobile number provided for
    // order status updates.
    context.bindings.message = {
        body : msg,
        to : myQueueItem.mobileNumber
    };
    context.done();
};
Example - Functions 2.x
See the language-specific example:
C#
C# script (.csx)
JavaScript
C# example
The following example shows a C# function that sends a text message when triggered by a queue message.
[FunctionName("QueueTwilio")]
[return: TwilioSms(AccountSidSetting = "TwilioAccountSid", AuthTokenSetting = "TwilioAuthToken", From = "+1425XXXXXXX")]
public static CreateMessageOptions Run(
    [QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] JObject order,
    ILogger log)
{
    log.LogInformation($"C# Queue trigger function processed: {order}");

    var message = new CreateMessageOptions(new PhoneNumber(order["mobileNumber"].ToString()))
    {
        Body = $"Hello {order["name"]}!"
    };

    return message;
}
This example uses the TwilioSms attribute with the method return value. An alternative is to use the attribute with
an out CreateMessageOptions parameter or an ICollector<CreateMessageOptions> or
IAsyncCollector<CreateMessageOptions> parameter.
C# script example
The following example shows a Twilio output binding in a function.json file and a C# script function that uses the binding. The function uses an out parameter to send a text message.
Here's binding data in the function.json file:
{
"type": "twilioSms",
"name": "message",
"accountSidSetting": "TwilioAccountSid",
"authTokenSetting": "TwilioAuthToken",
"from": "+1425XXXXXXX",
"direction": "out",
"body": "Azure Functions Testing"
}
using System;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Microsoft.Azure.WebJobs.Extensions.Twilio;
using Twilio.Rest.Api.V2010.Account;
using Twilio.Types;
public static void Run(string myQueueItem, out CreateMessageOptions message, ILogger log)
{
log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
// In this example the queue item is a JSON string representing an order that contains the name of a
// customer and a mobile number to send text updates to.
dynamic order = JsonConvert.DeserializeObject(myQueueItem);
string msg = "Hello " + order.name + ", thank you for your order.";
// You must initialize the CreateMessageOptions variable with the "To" phone number.
message = new CreateMessageOptions(new PhoneNumber("+1704XXXXXXX"));
// A dynamic message can be set instead of the body in the output binding. In this example, we use
// the order information to personalize a text message.
message.Body = msg;
}
You can't use out parameters in asynchronous code. Here's an asynchronous C# script code example:
#r "Newtonsoft.Json"
#r "Twilio"
#r "Microsoft.Azure.WebJobs.Extensions.Twilio"
using System;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Microsoft.Azure.WebJobs.Extensions.Twilio;
using Twilio.Rest.Api.V2010.Account;
using Twilio.Types;
public static async Task Run(string myQueueItem, IAsyncCollector<CreateMessageOptions> message, ILogger log)
{
log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
// In this example the queue item is a JSON string representing an order that contains the name of a
// customer and a mobile number to send text updates to.
dynamic order = JsonConvert.DeserializeObject(myQueueItem);
string msg = "Hello " + order.name + ", thank you for your order.";
// You must initialize the CreateMessageOptions variable with the "To" phone number.
CreateMessageOptions smsText = new CreateMessageOptions(new PhoneNumber("+1704XXXXXXX"));
// A dynamic message can be set instead of the body in the output binding. In this example, we use
// the order information to personalize a text message.
smsText.Body = msg;
await message.AddAsync(smsText);
}
JavaScript example
The following example shows a Twilio output binding in a function.json file and a JavaScript function that uses the binding.
Here's binding data in the function.json file:
{
"type": "twilioSms",
"name": "message",
"accountSidSetting": "TwilioAccountSid",
"authTokenSetting": "TwilioAuthToken",
"from": "+1425XXXXXXX",
"direction": "out",
"body": "Azure Functions Testing"
}
module.exports = function (context, myQueueItem) {
    // In this example the queue item is a JSON string representing an order that contains the name of a
    // customer and a mobile number to send text updates to.
    var msg = "Hello " + myQueueItem.name + ", thank you for your order.";
    // Even if you want to use a hard coded message in the binding, you must at least
    // initialize the message binding.
    context.bindings.message = {};
    // A dynamic message can be set instead of the body in the output binding. The "To" number
    // must be specified in code.
    context.bindings.message = {
        body : msg,
        to : myQueueItem.mobileNumber
    };
    context.done();
};
Attributes
In C# class libraries, use the TwilioSms attribute.
For information about attribute properties that you can configure, see Configuration. Here's a TwilioSms
attribute example in a method signature:
[FunctionName("QueueTwilio")]
[return: TwilioSms(AccountSidSetting = "TwilioAccountSid", AuthTokenSetting = "TwilioAuthToken", From =
"+1425XXXXXXX")]
public static CreateMessageOptions Run(
[QueueTrigger("myqueue-items", Connection = "AzureWebJobsStorage")] JObject order, ILogger log)
{
...
}
Configuration
The following table explains the binding configuration properties that you set in the function.json file and the
TwilioSms attribute.
When you're developing locally, app settings go into the local.settings.json file.
Next steps
Learn more about Azure functions triggers and bindings
host.json reference for Azure Functions 2.x
5/10/2019 • 6 minutes to read • Edit Online
The host.json metadata file contains global configuration options that affect all functions for a function app. This
article lists the settings that are available for the v2 runtime.
NOTE
This article is for Azure Functions 2.x. For a reference of host.json in Functions 1.x, see host.json reference for Azure Functions
1.x.
Other function app configuration options are managed in your app settings.
Some host.json settings are only used when running locally in the local.settings.json file.
The following sections of this article explain each top-level property. All are optional unless otherwise indicated.
aggregator
Specifies how many function invocations are aggregated when calculating metrics for Application Insights.
{
"aggregator": {
"batchSize": 1000,
"flushTimeout": "00:00:30"
}
}
Function invocations are aggregated when the first of the two limits is reached.
applicationInsights
This setting is a child of logging.
Controls the sampling feature in Application Insights.
{
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"maxTelemetryItemsPerSecond" : 5
}
}
}
NOTE
Log sampling may cause some executions to not show up in the Application Insights monitor blade.
cosmosDb
Configuration setting can be found in Cosmos DB triggers and bindings.
durableTask
Configuration setting can be found in bindings for Durable Functions.
eventHub
Configuration settings can be found in Event Hub triggers and bindings.
extensions
Property that returns an object that contains all of the binding-specific settings, such as http and eventHub.
functions
A list of functions that the job host runs. An empty array means run all functions. Intended for use only when
running locally. In function apps in Azure, you should instead follow the steps in How to disable functions in Azure
Functions to disable specific functions rather than using this setting.
{
"functions": [ "QueueProcessor", "GitHubWebHook" ]
}
functionTimeout
Indicates the timeout duration for all functions. In a serverless Consumption plan, the valid range is from 1 second
to 10 minutes, and the default value is 5 minutes. In an App Service plan, there is no overall limit and the default
depends on the runtime version. In version 2.x, the default value for an App Service plan is 30 minutes. In version
1.x, it's null, which indicates no timeout.
{
"functionTimeout": "00:05:00"
}
healthMonitor
Configuration settings for Host health monitor.
{
"healthMonitor": {
"enabled": true,
"healthCheckInterval": "00:00:10",
"healthCheckWindow": "00:02:00",
"healthCheckThreshold": 6,
"counterThreshold": 0.80
}
}
http
Configuration settings can be found in http triggers and bindings.
{
"http": {
"routePrefix": "api",
"maxOutstandingRequests": 200,
"maxConcurrentRequests": 100,
"dynamicThrottlesEnabled": true
}
}
logging
Controls the logging behaviors of the function app, including Application Insights.
"logging": {
"fileLoggingMode": "debugOnly",
"logLevel": {
"Function.MyFunction": "Information",
"default": "None"
},
"console": {
...
},
"applicationInsights": {
...
}
}
console
This setting is a child of logging. It controls the console logging when not in debugging mode.
{
"logging": {
...
"console": {
"isEnabled": "false"
},
...
}
}
queues
Configuration settings can be found in Storage queue triggers and bindings.
sendGrid
Configuration setting can be found in SendGrid triggers and bindings.
serviceBus
Configuration setting can be found in Service Bus triggers and bindings.
singleton
Configuration settings for Singleton lock behavior. For more information, see GitHub issue about singleton
support.
{
"singleton": {
"lockPeriod": "00:00:15",
"listenerLockPeriod": "00:01:00",
"listenerLockRecoveryPollingInterval": "00:01:00",
"lockAcquisitionTimeout": "00:01:00",
"lockAcquisitionPollingInterval": "00:00:03"
}
}
version
The version string "version": "2.0" is required for a function app that targets the v2 runtime.
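For example, a minimal host.json for the v2 runtime:
{
    "version": "2.0"
}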
watchDirectories
A set of shared code directories that should be monitored for changes. Ensures that when code in these directories
is changed, the changes are picked up by your functions.
{
"watchDirectories": [ "Shared" ]
}
managedDependency
Managed dependency is a preview feature that is currently supported only with PowerShell-based functions. It enables dependencies to be automatically managed by the service. When the enabled property is set to true, the requirements.psd1 file is processed, and dependencies are updated when any minor versions are released.
{
"managedDependency": {
"enabled": true
}
}
Next steps
Learn how to update the host.json file
See global settings in environment variables
host.json reference for Azure Functions 1.x
3/25/2019 • 11 minutes to read • Edit Online
The host.json metadata file contains global configuration options that affect all functions for a function app. This
article lists the settings that are available for the v1 runtime. The JSON schema is at
http://json.schemastore.org/host.
NOTE
This article is for Azure Functions 1.x. For a reference of host.json in Functions 2.x, see host.json reference for Azure
Functions 2.x.
Other function app configuration options are managed in your app settings.
Some host.json settings are only used when running locally in the local.settings.json file.
{
"aggregator": {
"batchSize": 1000,
"flushTimeout": "00:00:30"
},
"applicationInsights": {
"sampling": {
"isEnabled": true,
"maxTelemetryItemsPerSecond" : 5
}
},
"eventHub": {
"maxBatchSize": 64,
"prefetchCount": 256,
"batchCheckpointFrequency": 1
},
"functions": [ "QueueProcessor", "GitHubWebHook" ],
"functionTimeout": "00:05:00",
"healthMonitor": {
"enabled": true,
"healthCheckInterval": "00:00:10",
"healthCheckWindow": "00:02:00",
"healthCheckThreshold": 6,
"counterThreshold": 0.80
},
"http": {
"routePrefix": "api",
"maxOutstandingRequests": 20,
"maxConcurrentRequests": 10,
"dynamicThrottlesEnabled": false
},
"id": "9f4ea53c5136457d883d685e57164f08",
"logger": {
"categoryFilter": {
"defaultLevel": "Information",
"categoryLevels": {
"Host": "Error",
"Function": "Error",
"Host.Aggregator": "Information"
"Host.Aggregator": "Information"
}
}
},
"queues": {
"maxPollingInterval": 2000,
"visibilityTimeout" : "00:00:30",
"batchSize": 16,
"maxDequeueCount": 5,
"newBatchThreshold": 8
},
"serviceBus": {
"maxConcurrentCalls": 16,
"prefetchCount": 100,
"autoRenewTimeout": "00:05:00"
},
"singleton": {
"lockPeriod": "00:00:15",
"listenerLockPeriod": "00:01:00",
"listenerLockRecoveryPollingInterval": "00:01:00",
"lockAcquisitionTimeout": "00:01:00",
"lockAcquisitionPollingInterval": "00:00:03"
},
"tracing": {
"consoleLevel": "verbose",
"fileLoggingMode": "debugOnly"
},
"watchDirectories": [ "Shared" ],
}
The following sections of this article explain each top-level property. All are optional unless otherwise indicated.
aggregator
Specifies how many function invocations are aggregated when calculating metrics for Application Insights.
{
"aggregator": {
"batchSize": 1000,
"flushTimeout": "00:00:30"
}
}
Function invocations are aggregated when the first of the two limits are reached.
applicationInsights
Controls the sampling feature in Application Insights.
{
"applicationInsights": {
"sampling": {
"isEnabled": true,
"maxTelemetryItemsPerSecond" : 5
}
}
}
durableTask
Configuration settings for Durable Functions.
{
"durableTask": {
"hubName": "MyTaskHub",
"controlQueueBatchSize": 32,
"partitionCount": 4,
"controlQueueVisibilityTimeout": "00:05:00",
"workItemQueueVisibilityTimeout": "00:05:00",
"maxConcurrentActivityFunctions": 10,
"maxConcurrentOrchestratorFunctions": 10,
"maxQueuePollingInterval": "00:00:30",
"azureStorageConnectionStringName": "AzureWebJobsStorage",
"trackingStoreConnectionStringName": "TrackingStorage",
"trackingStoreNamePrefix": "DurableTask",
"traceInputsAndOutputs": false,
"logReplayEvents": false,
"eventGridTopicEndpoint": "https://topic_name.westus2-1.eventgrid.azure.net/api/events",
"eventGridKeySettingName": "EventGridKey",
"eventGridPublishRetryCount": 3,
"eventGridPublishRetryInterval": "00:00:30",
"eventGridPublishEventTypes": ["Started", "Pending", "Failed", "Terminated"]
}
}
Task hub names must start with a letter and consist of only letters and numbers. If not specified, the default task
hub name for a function app is DurableFunctionsHub. For more information, see Task hubs.
maxConcurrentActivityFunctions (default: 10X the number of processors on the current machine): the maximum number of activity functions that can be processed concurrently on a single host instance.
maxConcurrentOrchestratorFunctions (default: 10X the number of processors on the current machine): the maximum number of orchestrator functions that can be processed concurrently on a single host instance.
Many of these settings are for optimizing performance. For more information, see Performance and scale.
eventHub
Configuration settings for Event Hub triggers and bindings.
{
"eventHub": {
"maxBatchSize": 64,
"prefetchCount": 256,
"batchCheckpointFrequency": 1
}
}
functions
A list of functions that the job host runs. An empty array means run all functions. Intended for use only when
running locally. In function apps in Azure, you should instead follow the steps in How to disable functions in
Azure Functions to disable specific functions rather than using this setting.
{
"functions": [ "QueueProcessor", "GitHubWebHook" ]
}
functionTimeout
Indicates the timeout duration for all functions. In a serverless Consumption plan, the valid range is from 1
second to 10 minutes, and the default value is 5 minutes. In an App Service plan, there is no overall limit and the
default depends on the runtime version.
{
"functionTimeout": "00:05:00"
}
healthMonitor
Configuration settings for Host health monitor.
{
"healthMonitor": {
"enabled": true,
"healthCheckInterval": "00:00:10",
"healthCheckWindow": "00:02:00",
"healthCheckThreshold": 6,
"counterThreshold": 0.80
}
}
http
Configuration settings for http triggers and bindings.
{
"http": {
"routePrefix": "api",
"maxOutstandingRequests": 200,
"maxConcurrentRequests": 100,
"dynamicThrottlesEnabled": true
}
}
id
Version 1.x only.
The unique ID for a job host. Can be a lowercase GUID with dashes removed. Required when running locally.
When running in Azure, we recommend that you not set an ID value. An ID is generated automatically in Azure
when id is omitted.
If you share a Storage account across multiple function apps, make sure that each function app has a different
id . You can omit the id property or manually set each function app's id to a different value. The timer
trigger uses a storage lock to ensure that there will be only one timer instance when a function app scales out to
multiple instances. If two function apps share the same id and each uses a timer trigger, only one timer will
run.
{
"id": "9f4ea53c5136457d883d685e57164f08"
}
logger
Controls filtering for logs written by an ILogger object or by context.log.
{
"logger": {
"categoryFilter": {
"defaultLevel": "Information",
"categoryLevels": {
"Host": "Error",
"Function": "Error",
"Host.Aggregator": "Information"
}
}
}
}
queues
Configuration settings for Storage queue triggers and bindings.
{
"queues": {
"maxPollingInterval": 2000,
"visibilityTimeout" : "00:00:30",
"batchSize": 16,
"maxDequeueCount": 5,
"newBatchThreshold": 8
}
}
serviceBus
Configuration setting for Service Bus triggers and bindings.
{
"serviceBus": {
"maxConcurrentCalls": 16,
"prefetchCount": 100,
"autoRenewTimeout": "00:05:00"
}
}
singleton
Configuration settings for Singleton lock behavior. For more information, see GitHub issue about singleton
support.
{
"singleton": {
"lockPeriod": "00:00:15",
"listenerLockPeriod": "00:01:00",
"listenerLockRecoveryPollingInterval": "00:01:00",
"lockAcquisitionTimeout": "00:01:00",
"lockAcquisitionPollingInterval": "00:00:03"
}
}
tracing
Version 1.x only.
Configuration settings for logs that you create by using a TraceWriter object. See C# Logging and Node.js
Logging.
{
"tracing": {
"consoleLevel": "verbose",
"fileLoggingMode": "debugOnly"
}
}
watchDirectories
A set of shared code directories that should be monitored for changes. Ensures that when code in these
directories is changed, the changes are picked up by your functions.
{
"watchDirectories": [ "Shared" ]
}
Next steps
Learn how to update the host.json file
See global settings in environment variables
Frequently asked questions about networking in
Azure Functions
4/26/2019 • 3 minutes to read • Edit Online
This article lists frequently asked questions about networking in Azure Functions. For a more comprehensive
overview, see Functions networking options.
Next steps
To learn more about networking and functions:
Follow the tutorial about getting started with virtual network integration
Learn more about the networking options in Azure Functions
Learn more about virtual network integration with App Service and Functions
Learn more about virtual networks in Azure
Enable more networking features and control with App Service Environments
OpenAPI 2.0 metadata support in Azure Functions
(preview)
3/15/2019 • 3 minutes to read • Edit Online
OpenAPI 2.0 (formerly Swagger) metadata support in Azure Functions is a preview feature that you can use to
write an OpenAPI 2.0 definition inside a function app. You can then host that file by using the function app.
IMPORTANT
The OpenAPI preview feature is only available today in the 1.x runtime. Information on how to create a 1.x function app can
be found here.
OpenAPI metadata allows a function that's hosting a REST API to be consumed by a wide variety of other
software. This software includes Microsoft offerings like PowerApps and the API Apps feature of Azure App
Service, third-party developer tools like Postman, and many more packages.
This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the
following resources:
Create your first function: C#, JavaScript, Java, or Python.
Azure Functions developer reference.
Language-specific reference: C#, C# script, F#, Java, JavaScript, or Python.
Azure Functions triggers and bindings concepts.
Code and test Azure Functions locally.
TIP
We recommend starting with the getting started tutorial and then returning to this document to learn more about specific
features.
NOTE
The function API definition feature is currently not supported for the beta runtime.
To enable the generation of a hosted OpenAPI definition and a quickstart definition, set API definition source to
Function (Preview). External URL allows your function to use an OpenAPI definition that's hosted elsewhere.
*The operation ID is required only for integrating with PowerApps and Flow.
NOTE
The x-ms-summary extension provides a display name in Logic Apps, PowerApps, and Flow.
To learn more, see Customize your Swagger definition for PowerApps.
Next steps
Getting started tutorial. Try our walkthrough to see an OpenAPI definition in action.
Azure Functions GitHub repository. Check out the Functions repository to give us feedback on the API
definition support preview. Make a GitHub issue for anything you want to see updated.
Azure Functions developer reference. Learn about coding functions and defining triggers and bindings.
Azure Functions scale and hosting
5/17/2019 • 11 minutes to read • Edit Online
Azure Functions runs in two different plans: Consumption plan and Premium plan (public preview). The
Consumption plan automatically adds compute power when your code is running. Your app is scaled out
when needed to handle load, and scaled down when code stops running. You don't have to pay for idle
VMs or reserve capacity in advance. The Premium plan will also automatically scale and add additional
compute power when your code is running. The Premium plan comes with additional features like
premium compute instances, the ability to keep instances warm indefinitely, and VNet connectivity. If you
have an existing App Service plan, you can also run your function apps within it.
NOTE
Both Premium plan and Consumption plan for Linux are currently in preview.
If you aren't familiar with Azure Functions, see the Azure Functions overview.
When you create a function app, you choose the hosting plan for functions in the app. In either plan, an
instance of the Azure Functions host executes the functions. The type of plan controls:
How host instances are scaled out.
The resources that are available to each host.
Instance features like VNet connectivity.
NOTE
You can switch between Consumption and Premium plans by changing the plan property of the function app
resource.
Consumption plan
When you're using the Consumption plan, instances of the Azure Functions host are dynamically added
and removed based on the number of incoming events. This serverless plan scales automatically, and
you're charged for compute resources only when your functions are running. On a Consumption plan, a
function execution times out after a configurable period of time.
Billing is based on number of executions, execution time, and memory used. Billing is aggregated across
all functions within a function app. For more information, see the Azure Functions pricing page.
The Consumption plan is the default hosting plan and offers the following benefits:
Pay only when your functions are running.
Scale out automatically, even during periods of high load.
NOTE
The premium plan preview currently supports functions running in .NET, Node, or Java through Windows
infrastructure.
When running JavaScript functions on a Premium plan, you should choose an instance that has fewer vCPUs. For more information, see Choose single-core Premium plans.
In the Consumption plan, for both the 1.x and 2.x runtimes, the default function timeout is 5 minutes and the maximum is 10 minutes.
NOTE
Regardless of the function app timeout setting, 230 seconds is the maximum amount of time that an HTTP
triggered function can take to respond to a request. This is because of the default idle timeout of Azure Load
Balancer. For longer processing times, consider using the Durable Functions async pattern or defer the actual
work and return an immediate response.
You can also use the Azure CLI to determine the plan, as follows:
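A sketch of such a check (the resource group and function app names are placeholders; for a Consumption plan the returned sku is dynamic):
az resource show --resource-group <RESOURCE_GROUP> --name <FUNCTION_APP> --resource-type "Microsoft.Web/sites" --query properties.sku --output tsv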
NOTE
When you're using a blob trigger on a Consumption plan, there can be up to a 10-minute delay in processing
new blobs. This delay occurs when a function app has gone idle. After the function app is running, blobs are
processed immediately. To avoid this cold-start delay, use the Premium plan, or use the Event Grid trigger. For
more information, see the blob trigger binding reference article.
Runtime scaling
Azure Functions uses a component called the scale controller to monitor the rate of events and
determine whether to scale out or scale in. The scale controller uses heuristics for each trigger type. For
example, when you're using an Azure Queue storage trigger, it scales based on the queue length and the
age of the oldest queue message.
The unit of scale is the function app. When the function app is scaled out, additional resources are
allocated to run multiple instances of the Azure Functions host. Conversely, as compute demand is
reduced, the scale controller removes function host instances. The number of instances is eventually
scaled down to zero when no functions are running within a function app.
Understanding scaling behaviors
Scaling can vary based on a number of factors, and apps scale differently depending on the trigger and language selected. However, there are a few aspects of scaling behavior that exist in the system today:
A single function app only scales up to a maximum of 200 instances. A single instance may process
more than one message or request at a time though, so there isn't a set limit on number of concurrent
executions.
For HTTP triggers, new instances will only be allocated at most once every 1 second.
For non-HTTP triggers, new instances will only be allocated at most once every 30 seconds.
Different triggers may also have different scaling limits, as documented below:
Event Hubs
Best practices and patterns for scalable apps
There are many aspects of a function app that will impact how well it will scale, including host
configuration, runtime footprint, and resource efficiency. For more information, see the scalability section
of the performance considerations article. You should also be aware of how connections behave as your
function app scales. For more information, see How to manage connections in Azure Functions.
Billing model
Billing for the Consumption plan is described in detail on the Azure Functions pricing page. Usage is
aggregated at the function app level and counts only the time that function code is executed. The
following are units for billing:
Resource consumption in gigabyte-seconds (GB-s). Computed as a combination of memory size
and execution time for all functions within a function app.
Executions. Counted each time a function is executed in response to an event trigger.
Useful queries and information on how to understand your consumption bill can be found on the billing
FAQ.
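As a worked example with illustrative numbers, suppose a function uses 512 MB of memory and runs for one second per execution:
memory:                512 MB = 0.5 GB
duration:              1 second per execution
resource consumption:  0.5 GB x 1 s = 0.5 GB-s per execution
1,000,000 executions:  1,000,000 x 0.5 GB-s = 500,000 GB-s, billed alongside the 1,000,000 executions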
Service limits
The following table indicates the limits that apply to function apps when running in the various hosting
plans:
App Service plans: 100 per region in the Consumption plan; 100 per resource group in the Premium plan; 100 per resource group in an App Service plan1.
Custom domain SSL support: not supported in the Consumption plan (a wildcard certificate for *.azurewebsites.net is available by default); unbounded SNI SSL and 1 IP SSL connection included in the Premium plan and App Service plan.
1For specific limits for the various App Service plan options, see the App Service plan limits.
2By default, the timeout for the Functions 1.x runtime in an App Service plan is unbounded.
3Requires the App Service plan be set to Always On. Pay at standard rates.
4 These limits are set in the host.
5 The actual number of function apps that you can host depends on the activity of the apps, the size of the machine instances, and the corresponding resource utilization.
When running function apps in a Premium plan or an App Service plan, you can map a custom domain using either a CNAME or an A record.