
Notes on Azure + PowerShell + Account SAS

Well, below are my notes on using account shared access signatures (SAS) in Azure with the Azure PowerShell modules.

Theory

Let's get the basics out of the way first.

A shared access signature is a way to delegate access to resources in a storage account, without sharing the storage account keys.

SAS gives us granular control over the delegated access by:
  • Specifying the start and expiry time.
  • Specifying the permissions granted, e.g. Read/Write/Delete.
  • Specifying the source IP address (or range) from which requests may originate.
  • Specifying the protocol to be used, e.g. HTTPS only, or HTTPS and HTTP.


There are two types of SAS.
  1. Service SAS: This type of SAS delegates access to resources in a single storage service. Note - Azure Storage is made up of the Blob, Queue, Table and File services.
  2. Account SAS: This type of SAS delegates access to resources in one or more storage services. In addition, it can also delegate access to the operations that apply to a given service.
So, in a nutshell, a SAS is a signed URI that delegates access to one or more storage resources; the URI itself carries all the information needed to authorize the request.
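For illustration, here is roughly what the query-string portion of an account SAS looks like (the values below are made up and the signature is a placeholder; the parameter names are the standard Azure Storage SAS query parameters):

```
?sv=2015-04-05              # signed storage service version
&ss=bf                      # services: b = Blob, f = File
&srt=s                      # resource types: s = service-level APIs
&sp=rwl                     # permissions: read, write, list
&se=2017-01-31T00:00:00Z    # expiry time (UTC)
&spr=https                  # allowed protocol
&sig=<signature>            # HMAC-SHA256 signature computed with the account key
```

Anyone holding this string can append it to a storage request URL and act within exactly these constraints, which is why the token should be treated like a credential.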

Now the SAS can take two forms.
  1. Ad-hoc SAS: This type of SAS specifies all of the constraints, e.g. start time, expiry time, and permissions, directly in the SAS URI. Both Service and Account SAS can take this form.
  2. SAS with stored access policy: A stored access policy can be used to define the above constraints e.g. start/end time and permissions on a resource container (blob container, table, queue, or file share). So when a SAS is applied on the resource container it inherits the above constraints from the stored access policy.

    Note - Currently, only a Service SAS can take this form.
One more important point: when creating Service SAS tokens, it is a best practice to associate a stored access policy with the resource container, because the SAS can then be revoked (if needed) simply by deleting the stored access policy.

Without a stored access policy, the only way to revoke the SAS is to regenerate the storage account key that was used to sign it.
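As a sketch of that best practice (assuming the classic Azure.Storage cmdlets, the container testblobcontainer1 used later in this post, a storage context in $storCont created from the account key, and a hypothetical policy name 'readpolicy'):

```powershell
# Create a stored access policy on the container with read + list permissions
New-AzureStorageContainerStoredAccessPolicy -Container 'testblobcontainer1' `
    -Policy 'readpolicy' -Permission rl `
    -ExpiryTime (Get-Date).AddDays(7) -Context $storCont

# Generate a Service SAS token that references the policy
# (the constraints come from the policy, not the token itself)
New-AzureStorageContainerSASToken -Name 'testblobcontainer1' `
    -Policy 'readpolicy' -Context $storCont

# Later, revoke every SAS issued against the policy by deleting the policy
Remove-AzureStorageContainerStoredAccessPolicy -Container 'testblobcontainer1' `
    -Policy 'readpolicy' -Context $storCont
```

Because the constraints live server-side in the policy, deleting the policy immediately invalidates all tokens issued against it, with no key regeneration required.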

Example: Create and use an account SAS

For this post, I will show how to create an account SAS that grants service-level API access to the Blob and File services, and then use it from a client to update the service properties.

I am following the .NET code samples listed in the references below, against this setup:

 +azureprep (resource group)
   \-azprepstore (storage account)
       \-testblobcontainer1 (blob container)
           \- docker.png (blob)


Create an Account SAS token


The first step is to create the account SAS token using the AzureRM and AzureSM PowerShell modules.

# variables
$resourceGroupName = 'azureprep'
$StorageAccountName = 'azprepstore'
# Log in to both the ASM & ARM modules; the cmdlets for creating SAS tokens are not yet available in the ARM modules
Add-AzureAccount
Login-AzureRmAccount
# Get all Azure subscriptions attached to your account
Get-AzureSubscription | Sort-Object -Property SubscriptionName | Select-Object -Property SubscriptionName
# Select the subscription to work against
Select-AzureSubscription -SubscriptionName "Visual Studio Dev Essentials"
# Discover the Azure.Storage module to list out cmdlets which have the verb New
Get-Command -Verb New -Module Azure.Storage
# get Storage Key
$storKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $StorageAccountName).Value[0]
# create the main Storage account Context
$storCont = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storKey
# Create an account SAS token - valid for Blob & File services.
# Gives client permission to read, write, and list permissions to access service-level APIs.
$storSAS = New-AzureStorageAccountSASToken -Service Blob,File -ResourceType Service `
-Permission "rwl" -Protocol HttpsOnly -Context $storCont
# Display the SAS token generated
$storSAS



Use Account SAS token (created above)


Open another PowerShell console; this will act as the client. The intent here is to show that, using the SAS token, another client can access the storage resources independently.


# variables
$StorageAccountName = 'azprepstore'
$storSAS = '<SAS token value generated in previous step>'
# Create the new storage context with the Account SAS token generated above.
$ClientContext = New-AzureStorageContext -SasToken $storSAS -StorageAccountName $StorageAccountName
# Trying to fetch blobs using the above context will fail,
# since the SAS token only grants access to the service-level APIs
Get-AzureStorageBlob -Container testblobcontainer1 -Context $ClientContext
# now try to fetch the Blob service properties (logging and metrics)
Get-AzureStorageServiceMetricsProperty -ServiceType Blob -MetricsType Hour -Context $ClientContext
Get-AzureStorageServiceLoggingProperty -ServiceType Blob -Context $ClientContext
# To set the service properties, follow the steps below
# create the storage credentials from the Account SAS token
$storCreds = [Microsoft.WindowsAzure.Storage.Auth.StorageCredentials]::new($storSAS)
# create cloud storage account object
$cloudStorageAccount = [Microsoft.WindowsAzure.Storage.CloudStorageAccount]::new($storCreds, $StorageAccountName, $null, $true)
# create a BlobClient from the above cloud storage account object
$BlobClient = $cloudStorageAccount.CreateCloudBlobClient()
# retrieve the service properties (logging and metrics)
$BlobClient.GetServiceProperties()
# set the service properties
# In the sample below, we will modify the logging retention days for the Blob service
# first create copy of the existing service properties
$copyofServiceProperties = $BlobClient.GetServiceProperties()
# Now modify the retention days in the copied object
$copyofServiceProperties.Logging.RetentionDays = 14
# Now use this modified copy of the service properties with the SetServiceProperties() method on BlobClient
$BlobClient.SetServiceProperties($copyofServiceProperties)
# Wait a few seconds for the changes to be reflected
Start-Sleep -Seconds 4
# Now fetch the updated service properties from the BlobClient object
# You should notice that the retention days have been modified
$BlobClient.GetServiceProperties()



Hope this is useful.

References:


Using Shared access signatures

Create and use an account SAS (.NET)
