What Is Azure Archive Storage?
Azure Archive Storage is one of the access tiers of Azure Storage. One thing you should know about Azure Storage is that you are charged for two things:
- Storage – The cost of having your data sit in an Azure Storage Account.
- Access – The cost of accessing your data.
The available tiers of Azure Storage are:

- Hot – Frequently accessed data
  - High storage cost
  - Low access cost
- Cool – Infrequently accessed data. Data should be stored a minimum of 30 days.
  - Low storage cost
  - High access cost
- Cold – Infrequently accessed data. Data should be stored a minimum of 90 days.
  - Lower storage cost than the Cool Tier
  - Higher access cost than the Cool Tier
- Archive – Rarely accessed data. In this tier, the data is stored off-line, and latency is measured in hours rather than milliseconds. Data should be stored a minimum of 180 days.
  - Lowest storage cost
  - Highest access cost
Use the Azure Pricing Calculator to get specific cost numbers for a given region and Storage Tier.
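To see how the two charge types trade off, here is a minimal sketch that estimates a monthly bill from hypothetical per-GB and per-10,000-operation prices. The prices are illustrative placeholders only, not current Azure rates; use the Pricing Calculator for real numbers.

```python
def monthly_cost(gb_stored, read_ops, price_per_gb, price_per_10k_reads):
    """Estimate one month's bill: storage charge plus access charge."""
    storage = gb_stored * price_per_gb                  # cost of data at rest
    access = (read_ops / 10_000) * price_per_10k_reads  # cost of touching it
    return round(storage + access, 2)

# Illustrative (made-up) prices: Hot = high storage / low access cost,
# Archive = low storage / high access cost.
hot = monthly_cost(1000, 500_000, price_per_gb=0.018, price_per_10k_reads=0.004)
archive = monthly_cost(1000, 500_000, price_per_gb=0.001, price_per_10k_reads=5.00)
print(f"Hot: ${hot}  Archive: ${archive}")
```

With heavy reads, the Archive bill dwarfs the Hot bill even though its storage portion is a fraction of Hot's; the same data left untouched flips the comparison.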
What Are the Benefits of Azure Archive Storage?
Azure Archive Storage’s greatest advantage is its storage cost when data will not be accessed for extended periods of time. If you do not need to access the data, don’t want to get rid of it, and are OK with not having instant access to it when you eventually do need it, the Archive Tier may be your best option.
What Are the Disadvantages of Azure Archive Storage?
The biggest drawback to Azure Archive Storage is the latency involved when you do need your data. It will take time, on the order of hours, for your data to be made available to you. The reason is that the data is stored off-line, not on-line like the Hot, Cool, and Cold Tiers. Off-line data needs to be “re-hydrated” to an on-line tier (Hot, Cool, or Cold), and that can take up to 15 hours.
Data in the Azure Archive Storage Tier that is removed before 180 days have passed will incur an early deletion charge.
You can’t set an Azure Storage Account’s default Access Tier to Archive. Individual blobs must be configured to use the Archive Tier.
Archiving with Purpose: Real-World Use Cases
The Azure Archive Storage Tier is ideal in any situation where you either won’t be accessing the data, or you don’t want to access the data but still want to keep it around. Some examples:
- Long-term backup – Any situation where you will be keeping a backup of data for longer than 180 days but might want to recover it. This data should NOT be DR (Disaster Recovery) data, unless waiting potentially 15 hours for the data to become available is acceptable.
- Having a long-term copy of the original – In a case where you will be working with some data but you want to keep the original intact and untouched for an extended period (again, 180+ days).
- Compliance – When you want/need to hold on to data for an extended period (180+ days) and keep it untouched.
What is the difference between Archive Storage and Backup?
The Archive Storage Tier is how your data will be stored. Backup is a process of copying your data to a different location/storage medium, in case something goes wrong with the original data. Data that is backed up can be placed in an Azure Storage Account in the Archive Storage Tier.
CAUTION
I would caution that if you do this and then need to access the data stored in the Archive Storage Tier less than 180 days after you put it there, you will be charged an additional fee.
If you put data in the Archive Tier, you’re telling Microsoft that you don’t intend to need access to that data in the near future. Microsoft will store that data in off-line storage, and they don’t want to have to pull that data back out so soon, so they charge you an additional fee.
What Storage Account configurations support Archive Tier?
Performance
It may go without saying, but I’ll say it anyway. Blobs in Premium Block Blob storage cannot be moved to the Archive Tier. The reason is that a Premium Azure Storage Account is built for low latency, and the Azure Archive Tier is the exact opposite, with high latency. So, the Storage Account must be a Standard performance Storage Account.
Replication
For an Azure Storage Account to support Archive, it must use LRS, GRS, or RA-GRS replication. That means Blobs in accounts with zone-redundant replication settings cannot be configured for Archive. So, ZRS, GZRS, and RA-GZRS cannot leverage the Archive Tier.
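Putting the performance and replication constraints together, a quick pre-flight check might look like the sketch below. This is just the rules from this section encoded as a helper function, not an official Azure API:

```python
def supports_archive(performance, redundancy):
    """Return True if a Storage Account configuration can hold Archive-tier
    blobs per the rules above: Standard performance only, and only the
    non-zone-redundant replication settings (LRS, GRS, RA-GRS)."""
    if performance.lower() != "standard":
        return False  # Premium Block Blob accounts cannot archive
    return redundancy.upper() in {"LRS", "GRS", "RA-GRS"}

for perf, repl in [("Standard", "LRS"), ("Standard", "ZRS"), ("Premium", "LRS")]:
    print(f"{perf}/{repl}: {supports_archive(perf, repl)}")
```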
Maximizing savings with Azure Archive Storage
The Archive Tier’s cost structure flips the usual trade-off: the per-GB storage price is the lowest of any tier, while reads, re-hydration, and early deletion (before 180 days) carry the highest charges. The savings come from data that sits untouched; the more often you touch it, the faster those access charges erode the benefit.
How Does Azure Archive Storage Work?
When a Storage Account Blob is configured for the Archive Tier, the file is moved “offline.” In other words, it is no longer in the “easy to access” data stores used for the Hot, Cool, or Cold Tiers of storage. The file must be re-hydrated back to an on-line tier before it can be read again.
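While a re-hydration is in progress, Azure reports the blob’s state through an archive-status property (values such as rehydrate-pending-to-hot). The helper below is a sketch working from those reported values rather than a live account, summarizing how accessible a blob currently is:

```python
def describe_access(blob_tier, archive_status=None):
    """Summarize a blob's accessibility from its tier and the
    archive-status value Azure reports during rehydration."""
    if blob_tier != "Archive":
        return "online"       # Hot/Cool/Cold: reads in milliseconds
    if archive_status in ("rehydrate-pending-to-hot",
                          "rehydrate-pending-to-cool"):
        return "rehydrating"  # copy to an online tier in progress (hours)
    return "offline"          # must be rehydrated before it can be read

print(describe_access("Hot"))
print(describe_access("Archive"))
print(describe_access("Archive", "rehydrate-pending-to-hot"))
```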
How do I use Azure Archive Storage?
Notice that when you create an Azure Storage Account, the only options for the Access Tier are Hot and Cool.

Also, when you go through the portal to set a Storage Account default Storage Tier to Archive, you don’t get that option.

Cool and Hot are the only options for configuring an Azure Storage Account’s default Access Tier. The Default Access Tier is the Access Tier that any Blob, or file, put in the Storage Account will be set to when the Access Tier isn’t specified at creation time. This can be changed at any time and from that point forward, any new Blobs added to the Storage Account will have the new default Access Tier.
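That resolution rule is simple enough to sketch in a couple of lines, assuming only the behavior described above: an explicit per-blob tier wins, otherwise the account default applies.

```python
def effective_tier(account_default, blob_tier=None):
    """Tier a newly uploaded blob lands in: the tier specified at upload,
    or the Storage Account's default Access Tier when none is given."""
    return blob_tier if blob_tier is not None else account_default

print(effective_tier("Hot"))             # no tier specified: account default
print(effective_tier("Hot", "Archive"))  # explicit per-blob tier wins
```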
In order to leverage the Archive Access Tier, you need to make that change on individual Blobs. There are three different ways to do this.
Set Blob to Archive at upload
- Portal
- PowerShell
- Azure CLI
- Python
- AzCopy
The following code requires the authenticated user to have the “Storage Blob Data Contributor” Role Assignment.
# Storage Account variables
$resourceGroupName = 'LearnAzureNow'
$storageAccountName = 'learnazurenowdemo'
$containerName = 'root-container'

# File variables
$fileName = 'LearnAzureNow_com.txt'
$filePath = 'c:\temp'

# Create the file if it does not already exist
if (-not (Test-Path "$filePath\$fileName" -ErrorAction 'Continue'))
{
    "LearnAzureNow.com" | Out-File "$filePath\$fileName"
}

# Get the key for the Storage Account
$key = Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName |
    Select-Object -First 1 | ForEach-Object Value

# Get the context for the Storage Account
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $key

# Upload the file directly into the Archive Tier
Set-AzStorageBlobContent -File "$filePath\$fileName" -Container $containerName -StandardBlobTier 'Archive' -Context $context
The following code requires the authenticated user to have the “Storage Blob Data Contributor” Role Assignment.
az storage blob upload --container-name "root-container" --account-name "learnazurenowdemo" --file c:\temp\LearnAzureNow_com.txt --name "LearnAzureNow_com.txt" --tier Archive --auth-mode login
The following code requires the authenticated user to have the “Storage Blob Data Contributor” Role Assignment.
# pip install azure-identity
# pip install azure-storage-blob
import os

from azure.identity import AzureCliCredential
from azure.storage.blob import BlobServiceClient, StandardBlobTier

credential = AzureCliCredential()
storage_account_name = 'learnazurenowdemo'
source_path = 'c:\\temp\\LearnAzureNow_com.txt'
file_name = os.path.basename(source_path)

# Create the url for the Storage Account
account_url = f"https://{storage_account_name}.blob.core.windows.net/"

# Obtain the blob service client.
blob_service_client = BlobServiceClient(account_url, credential=credential)

# Get a client for the destination blob in the container
container_name = 'root-container'
blob_client = blob_service_client.get_blob_client(container=container_name, blob=file_name)

# Upload the file directly into the Archive Tier
with open(file=source_path, mode="rb") as data:
    new_blob_client = blob_client.upload_blob(data=data,
                                              overwrite=True,
                                              validate_content=True,
                                              standard_blob_tier=StandardBlobTier.Archive)
The following code requires the authenticated user to have the “Storage Blob Data Contributor” Role Assignment.
azcopy copy "c:\temp\LearnAzureNow_com.txt" "https://learnazurenowdemo.blob.core.windows.net/root-container" --block-blob-tier "Archive"
Set Blob to Archive Tier Manually/Programmatically
- Portal
- PowerShell
- Azure CLI
- Python
Here, we call the SetAccessTier method on the BlobClient property of the blob object returned by Get-AzStorageBlob.
# Storage Account variables
$resourceGroupName = 'LearnAzureNow'
$storageAccountName = 'learnazurenowdemo'
$containerName = 'root-container'
$tier = 'Archive'
$blobName = 'LearnAzureNow_com.txt'

# Get the key for the Storage Account
$key = Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName |
    Select-Object -First 1 | ForEach-Object Value

# Get the context for the Storage Account
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $key

# Change the blob's access tier.
$blob = Get-AzStorageBlob -Container $containerName -Blob $blobName -Context $context
$blob.BlobClient.SetAccessTier($tier, $null, "Standard")
az storage blob set-tier --container-name "root-container" --account-name "learnazurenowdemo" --name "LearnAzureNow_com.txt" --tier "Archive"
import os

from azure.identity import AzureCliCredential
from azure.storage.blob import BlobServiceClient

credential = AzureCliCredential()
storage_account_name = 'learnazurenowdemo'
source_path = 'c:\\temp\\LearnAzureNow_com.txt'
file_name = os.path.basename(source_path)

# Create the url for the Storage Account
account_url = f"https://{storage_account_name}.blob.core.windows.net/"

# Obtain the blob service client.
blob_service_client = BlobServiceClient(account_url, credential=credential)

# Get a client for the existing blob in the container
container_name = 'root-container'
blob_client = blob_service_client.get_blob_client(container=container_name, blob=file_name)

# Change the blob's access tier to Archive
blob_client.set_standard_blob_tier(standard_blob_tier="Archive")
Set Blob to Archive using Lifecycle Management
- Portal
- PowerShell
- Azure CLI
- Python
# Storage Account variables
$resourceGroupName = 'LearnAzureNow'
$storageAccountName = 'learnazurenowdemo'
$containerName = 'root-container'

# Enable Access Time Tracking on the Storage Account
Enable-AzStorageBlobLastAccessTimeTracking -ResourceGroupName $resourceGroupName `
    -StorageAccountName $storageAccountName -PassThru

# Create a new action object.
$action = Add-AzStorageAccountManagementPolicyAction -BaseBlobAction TierToArchive `
    -DaysAfterModificationGreaterThan 30

# Create a new filter object.
$filter = New-AzStorageAccountManagementPolicyFilter -PrefixMatch $containerName `
    -BlobType blockBlob

# Create a new rule object.
$rule1 = New-AzStorageAccountManagementPolicyRule -Name "30Day_Archive-rule" `
    -Action $action `
    -Filter $filter

# Create the policy.
Set-AzStorageAccountManagementPolicy -ResourceGroupName $resourceGroupName `
    -StorageAccountName $storageAccountName `
    -Rule $rule1
Create this JSON file, place it in the same directory as your terminal session, and call it ‘policy.json’.
Be sure to update the fields as appropriate for your environment.
{
  "rules": [
    {
      "enabled": true,
      "name": "30Day_Archive-rule",
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [
            "blockBlob"
          ],
          "prefixMatch": [
            "root-container"
          ]
        },
        "actions": {
          "baseBlob": {
            "tierToArchive": {
              "daysAfterModificationGreaterThan": 30
            }
          }
        }
      }
    }
  ]
}
Then run this CLI command:
az storage account management-policy create --account-name learnazurenowdemo --resource-group LearnAzureNow --policy @policy.json
# pip install azure-identity
# pip install azure-mgmt-storage
from azure.identity import AzureCliCredential
from azure.mgmt.storage import StorageManagementClient

subscription_id = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
credential = AzureCliCredential()
resource_group_name = 'LearnAzureNow'
storage_account_name = 'learnazurenowdemo'
container_name = 'root-container'

# Create the management client for the storage account
management_client = StorageManagementClient(credential=credential,
                                            subscription_id=subscription_id)

# Create (or update) the lifecycle management policy
management_policy = management_client.management_policies.create_or_update(
    resource_group_name,
    storage_account_name,
    "default",
    {
        "policy": {
            "rules": [
                {
                    "enabled": True,
                    "name": "30Day_Archive-rule",
                    "type": "Lifecycle",
                    "definition": {
                        "filters": {
                            "blob_types": ["blockBlob"],
                            "prefix_match": [container_name]
                        },
                        "actions": {
                            "base_blob": {
                                "tier_to_archive": {
                                    "days_after_modification_greater_than": 30
                                }
                            }
                        }
                    }
                }
            ]
        }
    }
)
Case Study
To illustrate how much money can be saved using the Azure Archive Storage Tier, 1024 10MB files were uploaded on 6/20/2023 using the script “Add-1GBToArchiveStorage.ps1”.
The first file was uploaded at 6/20/2023 8:32:00 PM +00:00; the last file finished about an hour later.
Code used:
################################################################
# Purpose:
# Upload 1024 files of ~10MB each to a Storage Account,
# all of which will be in the Archive Tier.
#
################################################################
param(
    [Parameter(Mandatory=$true,
        Position=0,
        ValueFromPipelineByPropertyName=$true)]
    [ValidateNotNullOrEmpty()]
    [string[]] $storageAccountId
)

#-----------------------------
# Main
#-----------------------------
# Loop through each Storage Account Id
foreach ($id in $storageAccountId) {
    $idSplit = $id.Trim('/').Split('/')
    $subscriptionId = $idSplit[1]
    $resourceGroupName = $idSplit[3]
    $storageAccountName = $idSplit[7]

    $key = Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName `
        -Name $storageAccountName | Select-Object -First 1 | ForEach-Object Value
    $context = New-AzStorageContext -StorageAccountName $storageAccountName `
        -StorageAccountKey $key

    $containerName = "cost-analysis"
    New-AzStorageContainer -Name $containerName -Context $context

    $numberOfFiles = 1024
    foreach ($fileNumber in (1..$numberOfFiles)) {
        $fileSize = (1024 * 1024 * 10) - 2 # 10MB file
        $filename = "$($fileNumber).txt"
        $filepath = "$pwd\$filename"
        'x' * $fileSize | Out-File $filepath

        # Report the size of the file that was just created
        (Get-Item $filepath).Length

        Set-AzStorageBlobContent -File $filepath `
            -Container $containerName `
            -StandardBlobTier Archive `
            -Context $context

        Remove-Item $filepath -Force
    }
}
Cost Comparison Archive Tier vs. Cool Tier

In this image we can see that the initial upload to the Archive Storage Tier cost a bit more than the upload to the Cool Storage Tier: $0.04 vs. $0.03. Each day after that, both Storage Accounts incurred a charge of less than $0.01. Over the full 30 days, the Archive Storage Account’s total charge came to $0.06 (which includes the $0.04 upload charge).
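A rough break-even sketch follows from figures like these: Archive costs slightly more to write into, and the saving accrues through the lower storage rate. The per-GB prices below are assumed placeholders standing in for the sub-cent daily charges in the image, not real Azure rates.

```python
def breakeven_months(extra_write_cost, cool_gb_month, archive_gb_month, gb):
    """Months until Archive's lower storage rate pays back its higher
    upfront write cost. All prices are illustrative placeholders."""
    monthly_savings = gb * (cool_gb_month - archive_gb_month)
    return extra_write_cost / monthly_savings

# 10 GB stored, $0.01 extra write cost, assumed $0.01 vs $0.001 per GB-month
months = breakeven_months(0.01, 0.01, 0.001, 10)
print(f"Break-even after about {months:.2f} months")
```

At these assumed rates the extra write cost is recovered within the first month, and the mandatory 180-day retention then works entirely in Archive's favor.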