This is something I always have to look up, so I figured I’d finally write it down. When writing code for Azure Function Apps, what local storage can I use, and how do I get to it?
I absolutely love Azure Function Apps! I use them all the time for both professional and personal projects, primarily writing the function code in either PowerShell or Python.
On several of these projects, I’ve had a requirement to temporarily save local data within the code, for example, download a file, manipulate it, then upload it somewhere else.
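That download-manipulate-upload pattern can be sketched in Python like so. Everything here is illustrative: the function name and the transformation are hypothetical, and a temporary directory stands in for the Function App's writable storage.

```python
import os
import tempfile

def process_payload(data: bytes) -> bytes:
    """Write incoming bytes to local scratch storage, transform the
    file on disk, then read the result back for re-upload elsewhere."""
    scratch = tempfile.mkdtemp()                  # stand-in for the app's writable dir
    path = os.path.join(scratch, "download.txt")
    with open(path, "wb") as f:                   # the "download" step
        f.write(data)
    with open(path, "rb") as f:                   # the "manipulate" step
        transformed = f.read().upper()
    return transformed                            # ready to upload somewhere else
```

In a real function, the bytes would come from (and go back to) an external service via a binding or SDK call; the local directory is only ever a scratch area.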
With that in mind, there are a few things to consider:
Azure Function Apps can be hosted in a variety of ways: Consumption Plan, App Service Plan (also known as a Dedicated Plan), Functions Premium (also known as a Premium Plan), App Service Environment (ASE) or Kubernetes. The differences between each are best described here.
The main takeaway (in the context of this article) is the underlying storage. With any new Function App, a storage account is created by default. Within that storage account, Blob storage is used (regardless of the plan) for storing config such as bindings state and function keys. In addition to Blob, Consumption and Premium Plans use Azure Files for local storage, whereas App Service Plans and ASEs make use of local storage made available from within the plan itself (the underlying host).
As described here, different plans come with different storage limits, which is an important consideration.
It’s worth noting that Durable Functions use both Table and Queue storage, and Kubernetes can use Azure Files or Azure Disks.
The point I’m trying to make is that the type and amount of storage available to your functions will vary depending on how you host them. For further storage considerations, click here for the official guidance from Microsoft.
Having a good understanding about the underlying storage is important - but what about accessing said storage?
When I first started writing code to save data to the local storage, I’d often face errors about how the directory the code was scoped to wasn’t writable, or files I’d previously created were no longer readable (even though they were definitely present). It didn’t help that I’d frequently be jumping between PowerShell and Python, but also Windows and Linux based plans.
I also found that on Windows based App Service Plans, if I opened up the console, the home directory would differ between Function Apps: on some it would be D:\home\ and on others it would be something different.
What I needed was somewhere reliable for my code to use every time without issue.
The answer to that is D:\home\ for Windows and /home/data for Linux. This works regardless of the plan you use, and on Windows, D:\home\ always works regardless of what the console shows you.
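A small helper makes that choice explicit in code. This is a minimal sketch, assuming `os.name` reports `nt` when the Function App runs on a Windows plan and `posix` on a Linux one:

```python
import os

def app_home() -> str:
    """Return the reliably writable Function App directory:
    D:\\home on Windows plans, /home/data on Linux plans."""
    return r"D:\home" if os.name == "nt" else "/home/data"
```

Your function code can then build paths off `app_home()` and behave the same whichever OS the plan runs on.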
Before I continue, let’s have a quick terminology check:
One Function App can host multiple functions.
One App Service Plan can host multiple Function Apps.
It’s important to note that the home directory is scoped to each specific Function App, e.g. the home directory for MyFunctionApp01 is not visible or accessible to MyFunctionApp02, even if they are on the same plan.
Please also note that one Function App can host multiple functions. In this case, each function can access the files and folders you create within the Function App home directory. For example, MyHttpTrigger01 and MyHttpTrigger02 (both hosted on MyFunctionApp01) can both access the same files in /home/data (for Linux).
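To illustrate that sharing within one Function App, here is a local sketch: the trigger names are hypothetical, and a temporary directory stands in for the app's home directory (e.g. /home/data), since both functions in a real app would simply use that path.

```python
import os
import tempfile

# Stand-in for the Function App home directory shared by both functions.
SHARED_DIR = tempfile.mkdtemp()

def my_http_trigger_01() -> None:
    # The first function writes a file into the shared home directory.
    with open(os.path.join(SHARED_DIR, "state.txt"), "w") as f:
        f.write("written by MyHttpTrigger01")

def my_http_trigger_02() -> str:
    # A second function in the same Function App reads the same file back.
    with open(os.path.join(SHARED_DIR, "state.txt")) as f:
        return f.read()
```

The same code would fail across Function Apps, because each app gets its own isolated home directory.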
As a best practice, Consumption plans shouldn’t be shared between Function Apps.
For shared storage between Function Apps (be that on the same or different plans), you’d need to make use of a storage account.
You get far better control, reliability and resiliency by using an Azure storage account - making a Consumption based Function App the best candidate for this requirement, given it uses Azure Files.
My personal recommendation is to only use the local storage of a Function App for temporary purposes.
On a Windows based plan - the writable directory is: D:\home\
On a Linux based plan - the writable directory is: /home/data