How to Use Azure Blob Storage

Published: 4 November 2021 - 9 min. read


Michael Soule



You may find yourself in need of a cheap yet efficient solution to store your files at some point, but where do you find that solution? Look into Microsoft Azure’s Binary Large Object (blob) storage! Blob storage is one of the Azure storage services and lets you store large amounts of text and binary data files, streaming content, or even static content for distribution.


In this tutorial, you’ll learn how to work with blob storage in Azure by walking through some common examples.

Read on to jump in!

Prerequisites

This tutorial will be a hands-on demonstration. If you’d like to follow along, be sure you have the following installed and available:

  • PowerShell 7 with the Az PowerShell modules (Az.Accounts, Az.Resources, and Az.Storage)
  • The AzCopy command-line tool
  • An Azure subscription in which you can create resources
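If you don’t have the Az modules yet, a typical installation (a sketch, assuming PowerShell 7+ and access to the PowerShell Gallery) looks like this:

```powershell
# Install the Az module collection for the current user only
# (no admin rights needed); -Force skips the untrusted-repository prompt
Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force
```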

Building an Azure Environment

Before using blob storage to store your files, you’ll first need to import PowerShell Core modules, connect with your Azure Subscription, and build an Azure environment.

1. Launch PowerShell 7 and run the following command to import modules you’ll be using to store files in blob storage.

Importing the necessary modules
# Az.Accounts - Provides credential management cmdlets
# Az.Resources - Provides cmdlets to work with the top-level Azure resource
#                providers, like subscriptions
# Az.Storage - Provides the cmdlets that will help you work with
#              different storage resources, like blobs
Import-Module Az.Accounts, Az.Resources, Az.Storage

2. Next, log in to your Azure Active Directory (AD) tenant by running the command below, which completes an interactive authentication in your web browser, as shown below.

Although beyond the scope of this tutorial, there are other authentication methods, such as a Service Principal or using an access token.

Connect-AzAccount
Azure Portal interactive login.

Always make sure the tenant and subscription shown after logging in are the ones you intend to use. If needed, you can change your context.
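If the default context isn’t the one you want, you can switch it with Set-AzContext; the subscription name below is a placeholder for your own, not a value from this tutorial:

```powershell
# Switch the active context to a different subscription by name or ID
# ('My Subscription' is a hypothetical placeholder)
Set-AzContext -Subscription 'My Subscription'
```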

3. Now run the below command to create a new resource group named demo, appended with a random number (Get-Random -Maximum 99999). Resource groups are hierarchically below subscriptions and contain resources that allow for more granular management.

Notice the -Location of the resource group is set to Central US for this example. When the command completes, it stores the result in the $resourceGroup variable.

$resourceGroup = New-AzResourceGroup "demo$(Get-Random -Maximum 99999)" -Location 'Central US'
Creating a new resource group

4. Run the command below to perform the following tasks and create a new Azure storage account. For this example, the storage account is named storage, appended with a random number (Get-Random -Maximum 99999). The $storageAccount variable will hold the returned object after the command completes.

# Pass the -ResourceGroupName with the ResourceGroupName property
# of the $resourceGroup variable you created in step three.
# Append random numbers to the storage account -Name similar to the
# resource group; storage account names must be globally unique within Azure.
# Set the same -Location as the $resourceGroup variable's Location property.
# Placing resources in the same region as the parent resource group is a good practice.
# Specify the -SkuName as locally redundant storage (Standard_LRS).

$storageAccount = New-AzStorageAccount `
 -ResourceGroupName $resourceGroup.ResourceGroupName `
 -Name storage$(Get-Random -Maximum 99999) `
 -Location $resourceGroup.Location `
 -SkuName Standard_LRS
Creating a new storage account

5. Execute the below command to run a couple of tasks for the Azure AD Role assignment:

  • The -SignInName value uses the account you’re currently logged in with, via the UserId property returned by the Get-AzAccessToken cmdlet.
  • The value of -RoleDefinitionName is the Storage Blob Data Contributor built-in role you are assigning.
  • The -Scope value sets the scope of the role assignment for the storage account you created (storage10029 shown below) via the $storageAccount variable’s Id property.

You can always provide more granular role assignments to individual containers as necessary.

New-AzRoleAssignment `
 -SignInName (Get-AzAccessToken).UserId `
 -RoleDefinitionName "Storage Blob Data Contributor" `
 -Scope $storageAccount.Id
Creating a new role assignment
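The earlier note about granular role assignments can be sketched as scoping the role to a single container instead of the whole account; the demo container path below follows the standard Azure resource ID format:

```powershell
# Assign the role at the container level rather than the storage account level
# (a sketch; assumes the demo container from the next section exists)
New-AzRoleAssignment `
 -SignInName (Get-AzAccessToken).UserId `
 -RoleDefinitionName "Storage Blob Data Contributor" `
 -Scope "$($storageAccount.Id)/blobServices/default/containers/demo"
```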

6. Finally, run the series of commands below to create a file called temp.dat on your local system. You’ll be uploading and downloading this file from the storage account in the following sections to demonstrate how blob storage works.

# Create a new file using the .NET FileStream class
$file = New-Object System.IO.FileStream .\temp.dat,Create,ReadWrite
# Set the size of the file
$file.SetLength(10MB)
# Close the handle
$file.Close()
# Lookup the file to confirm the size
(Get-ChildItem $file.Name).Length
Creating and verifying a new file

Uploading Files via PowerShell

Now that you have built an Azure environment and created a sample file, let’s start uploading the file to blob storage. Blob storage works differently than standard filesystems: each file in blob storage is an object kept within a container.

The core functionality of blobs is similar to other filesystems, but there are use cases where either could be a better solution. Blobs can even back virtual filesystems (e.g., BlobFuse).
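You can see this object-and-container model from PowerShell by enumerating containers and the blobs inside them (a sketch, assuming the $storageAccount variable from the previous section):

```powershell
# List every container in the account, then every blob within each container
Get-AzStorageContainer -Context $storageAccount.Context |
    ForEach-Object { Get-AzStorageBlob -Container $_.Name -Context $storageAccount.Context }
```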

Microsoft offers multiple methods to upload files to your storage accounts, such as PowerShell, AzCopy, and the Azure Portal. But let’s upload the sample file (temp.dat) to blob storage via PowerShell for a start. PowerShell gives you a consistent experience for working with your Azure storage accounts.

The required actions to perform this demo will incur costs. Monitor your consumption and delete resources when you no longer intend to use them.

Run the commands below to create a new container and upload the temp.dat file ($file) as an object. The container is named demo for this example, but you can name it differently as you prefer.

# Creates a container within $storageAccount via Context property of the storage account
# The returned object is then passed to the $container variable
$container = New-AzStorageContainer -Name demo -Context $storageAccount.Context
# Uploads the temp.dat file ($file) to the demo container ($container)
# The blob name (-Blob) will use the same name of the file you're uploading (Get-ChildItem $file.Name)
Set-AzStorageBlobContent -File $file.Name -Container $container.Name -Blob (Get-ChildItem $file.Name).Name -Context $storageAccount.Context
Uploading a file to Azure Storage Account

Uploading Files via AzCopy

Perhaps you have more complex use cases, such as synchronizing content or copying content between different accounts at scale. If so, the AzCopy command-line tool is what you need.

Run the commands below to log in to your Azure tenant and copy your local file ($file) to the URL endpoint of your container. You must log in again because AzCopy is not aware of the credentials you are using with PowerShell.

# Login to the Azure tenant
& .\azcopy.exe login
# Copy the local $file to the full URI of the destination $container
& .\azcopy.exe copy $file.Name $container.CloudBlobContainer.Uri.AbsoluteUri
Uploading to Azure Storage Account using AzCopy

Instead of uploading, perhaps you want to download files via AzCopy. If so, run the command below to copy the specified file (temp.dat) from your container to the current local directory: & .\azcopy.exe copy "$($container.CloudBlobContainer.Uri.AbsoluteUri)/temp.dat" .\temp.dat
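For the synchronization use case mentioned earlier, AzCopy also has a sync subcommand. A minimal sketch, assuming a prior azcopy login and a hypothetical local folder named my-folder:

```powershell
# One-way sync from a local folder to the demo container
# (.\my-folder is a hypothetical placeholder path)
& .\azcopy.exe sync .\my-folder $container.CloudBlobContainer.Uri.AbsoluteUri --recursive
```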

Uploading Files via Azure Portal

If you prefer a GUI method of uploading your files, then Azure Storage Explorer is your friend. Azure Storage Explorer is one of the best graphical methods to manage your blob storage. You can access the storage explorer from your storage account resource in the Azure Portal.

1. Open your favorite web browser, and navigate to your Storage Explorer in Azure Portal.

2. Click on the demo container under BLOB CONTAINERS, as shown below, then click on Upload to access the Upload blob blade (right panel).

3. Now click on the folder icon at the Upload blob panel to select which files to upload (temp.dat).

4. Finally, click Upload (blue button) to upload your file.

Using the Upload blob blade in Azure Storage Explorer

Once the upload completes, you can close the Upload blob blade and see your uploaded blob, like the image below.

Viewing contents of a blob container in Azure Storage Explorer

Downloading Files via Azure Portal

Similar to uploading content to blob storage, Azure supports downloading content in many ways. But since you just uploaded a file (temp.dat) via Azure Portal, let’s download the same file using Azure Storage Explorer in Azure Portal.

Select the file (temp.dat) to download and click on the Download button in the Azure Storage Explorer, as shown below. Doing so opens a new dialog box to confirm the download you’ll see in the next step.

Selecting Files to Download

Now click on the Click here to begin download button to download the files you selected.

Downloading Selected Files from the Blob Storage

Downloading Files via PowerShell

Like uploading files, you also get an option to download files from the blob storage by running commands in PowerShell. With PowerShell, you can list the objects within a container, then download them.

Run the below commands to list all objects in your container and download temp.dat to your local directory.

# List all the objects within the $container to verify the upload
Get-AzStorageBlob -Container $container.Name -Context $storageAccount.Context
# Download the temp.dat object from the $container
Get-AzStorageBlobContent -Blob temp.dat -Container $container.Name -Context $storageAccount.Context
Downloading files from Azure Storage Account

If you prefer to use short-lived unique links to download files, you can use Shared Access Signature (SAS) tokens to create a preauthorized download link. These tokens are unique and private authentication tokens you can use to verify your access.

Run the commands below to create a new download link for the file (temp.dat) you want to download. The generated download link expires after 10 seconds, and Invoke-WebRequest $uri downloads the content using that link into the $temp variable.

# Generate a new download link valid for 10 seconds
$uri = New-AzStorageBlobSASToken -Context $storageAccount.Context -Container $container.Name -Blob temp.dat -Permission r -ExpiryTime (Get-Date).AddSeconds(10) -FullUri
# Use the link to download the file to the $temp variable
$temp = Invoke-WebRequest $uri

# Alternatively write the file to the current directory
Invoke-WebRequest $uri -OutFile .\temp.dat
Download from Azure Storage Account using a SAS token

Hosting a Web Page on Public Internet from Blob Storage

Up to this point, you’ve seen use cases of downloading files by authenticated users. But did you know that blob storage can provide an excellent option for public content too? One example is using a blob to host your web page content, which you’ll accomplish in this demo.

Even if your web page contents are encrypted both in transit and at rest, anyone can access those contents if public access is set.

Since you are setting up a different use case, you’ll take advantage of one of the public cloud’s major benefits: scale and elasticity. Provisioning a new storage account for this specific use case limits the risk of using public containers.

1. Run the command below to create a new storage account as you did in step four of the “Building an Azure Environment” section. But this time, you’ll pass the returned object to the $publicStorageAccount variable.

$publicStorageAccount = New-AzStorageAccount -ResourceGroupName $resourceGroup.ResourceGroupName -Name storage$(Get-Random -Maximum 99999) -Location $resourceGroup.Location -SkuName Standard_LRS
Creating a storage account

You now have a dedicated storage account for your public content, and you can configure it to host static web content with the following command.

2. Next, run the Enable-AzStorageStaticWebsite cmdlet to configure the storage account ($publicStorageAccount) for your new use case. The -IndexDocument sets the default web page you want to present to users. The -Context will be the new storage account you just created.

# Create the $web container and configure the storage account
Enable-AzStorageStaticWebsite -IndexDocument index.html -Context $publicStorageAccount.Context
Enable storage account for website hosting

3. Run the commands below to create a new HTML document in your current directory, and upload that document to the container specifically for hosting web content. The content type is set to HTML (ContentType="text/html"), so web browsers can properly interpret the document.

Accessing the document on a web browser prints the Hello from <storage account name> message.

# Create a simple HTML file
"<body><h1>Hello from $($publicStorageAccount.StorageAccountName)!</h1></body>"|Out-File .\index.html
# Upload the HTML file to the static web hosting container and set the ContentType to text/html
Set-AzStorageBlobContent -File .\index.html -Container "`$web" -Properties @{ContentType="text/html"} -Context $publicStorageAccount.Context
Create and upload an HTML document

4. Now run the following command to get the URL where users can access your content.

$publicStorageAccount.PrimaryEndpoints.Web
Get the URI of the endpoint

5. Finally, open the URL in your browser, and you’ll see something similar to the following screenshot.

Accessing HTML Document from Blob Storage

Cleaning up Resources

Now that you’ve gone through testing these new concepts in using blob storage, you will want to clean up your resources. Why? Doing so helps you keep your subscription clean. More importantly, you stop incurring additional charges.

Since all resources you used in this tutorial are in a single resource group, you can clean up all resources by deleting the resource group.

Resources won’t always be contained within a single resource group, which illustrates why liberal use of logical segmentation can be beneficial, especially when testing or iterating frequently.
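Before deleting anything, you can review what the resource group still contains:

```powershell
# List every resource remaining in the resource group before deleting it
Get-AzResource -ResourceGroupName $resourceGroup.ResourceGroupName |
    Select-Object Name, ResourceType
```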

Run the Remove-AzResourceGroup cmdlet below, specifying the ResourceGroupName property of the $resourceGroup variable to delete the resource group and all resources within.

Remove-AzResourceGroup -Name $resourceGroup.ResourceGroupName
Delete resource group and contents

Conclusion

In this tutorial, you’ve touched on uploading and downloading files to and from blobs in cloud storage using different tools. You’ve also learned it’s possible to host a web page from blob storage that users can publicly access.

You can do much more with blob storage and other storage types, so how would you build on these concepts? Perhaps work with file storage accounts, provide serverless file systems, or use page blobs for virtual hard disks with Azure virtual machines?
