
Path Traversal on Azure

How Path Traversal Manifests in Azure

Path traversal vulnerabilities in Azure environments typically arise when user-controlled input is used to construct file paths without proper validation. Azure developers often encounter this when working with Blob Storage, File Storage, and local file operations within Azure Functions or App Services.

A common Azure-specific pattern involves constructing Blob Storage paths using user input:

const { BlobServiceClient } = require('@azure/storage-blob');
const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
const containerClient = blobServiceClient.getContainerClient('documents');

// Vulnerable: no path validation
const blobName = req.query.filename; // User controlled
const blobClient = containerClient.getBlobClient(blobName);
const downloadResponse = await blobClient.download();

An attacker could request ?filename=../../etc/passwd or use URL encoding like %2e%2e%2f to traverse directories. Azure Blob Storage uses forward slashes for virtual directory structure, making it susceptible to similar traversal patterns.

In Azure Functions, path traversal often occurs when serving static files:

const path = require('path');
const fs = require('fs');

module.exports = async function (context, req) {
    const filePath = path.join(__dirname, 'public', req.query.file);
    context.res = {
        body: fs.readFileSync(filePath)
    };
};

This code is vulnerable because path.join() doesn't prevent ../ sequences from escaping the intended directory. An attacker could access ?file=../../host.json to read Azure Functions configuration files.

Azure App Configuration and Key Vault path traversal can occur when constructing API endpoints:

const keyVaultName = req.query.vault; // User controlled
const keyName = req.query.key; // User controlled
const url = `https://${keyVaultName}.vault.azure.net/secrets/${keyName}`;

While Azure Key Vault itself prevents traversal in key names, improper validation of the vault name parameter could allow enumeration of available vaults or access to unintended resources.
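
One mitigation is to constrain the vault name against Azure's naming rules and an explicit allowlist before using it. A sketch; validateVaultName and the ALLOWED_VAULTS entries are hypothetical:

```javascript
// Hypothetical allowlist of vaults this application may contact.
const ALLOWED_VAULTS = new Set(['contoso-prod', 'contoso-staging']);

function validateVaultName(name) {
    // Azure vault names: 3-24 chars, letters/digits/hyphens, starting with a letter.
    if (typeof name !== 'string' || !/^[a-zA-Z][a-zA-Z0-9-]{2,23}$/.test(name)) {
        throw new Error('Invalid vault name');
    }
    if (!ALLOWED_VAULTS.has(name)) {
        throw new Error('Vault not allowed');
    }
    return name;
}
```

The syntactic check alone is not enough: any well-formed name points at a real DNS host, so the allowlist is what actually prevents requests to unintended vaults.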

Azure Storage Queues and Tables can also be affected when constructing table names or queue names from user input without validation, potentially allowing access to other customers' data in multi-tenant scenarios.
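
The same applies to queue names: validate against Azure's naming rules before use. A sketch (validateQueueName is a hypothetical helper; queue names must be 3-63 lowercase alphanumeric characters with single interior hyphens):

```javascript
function validateQueueName(name) {
    // Azure queue naming rules: 3-63 chars, lowercase letters, digits and
    // non-consecutive hyphens; must start and end with a letter or digit.
    const valid = typeof name === 'string' &&
        /^[a-z0-9](?:[a-z0-9]|-(?=[a-z0-9])){1,61}[a-z0-9]$/.test(name);
    if (!valid) {
        throw new Error('Invalid queue name');
    }
    return name;
}
```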

Azure-Specific Detection

Detecting path traversal in Azure applications requires both static analysis and runtime scanning. For Azure Blob Storage, middleBrick's scanning engine tests for traversal by attempting to access parent directory references and encoded path sequences.

middleBrick's Azure-specific detection includes:

GET /api/download?file=../../host.json HTTP/1.1
Host: yourapp.azurewebsites.net
Accept: */*

The scanner attempts various traversal patterns including:

  • Dot segments: ../, ..\
  • URL encoding: %2e%2e%2f, %2e%2e%5c
  • Overlong UTF-8: %c0%ae%c0%ae%c0%af
  • Unicode full-width: %ef%bc%8e%ef%bc%8e%ef%bc%8f
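
One defense that covers the encoded variants above is to decode the input repeatedly until it stops changing, then test the canonical form; a minimal sketch (canonicalize and isTraversal are illustrative helpers, not middleBrick APIs):

```javascript
function canonicalize(input) {
    // Decode until stable to defeat double-encoding (%252e -> %2e -> .).
    let prev;
    let current = String(input);
    do {
        prev = current;
        try {
            current = decodeURIComponent(prev);
        } catch (err) {
            // Malformed percent-encoding: treat as hostile.
            throw new Error('Invalid encoding');
        }
    } while (current !== prev);
    return current;
}

function isTraversal(input) {
    // Normalize backslashes, then look for a ".." path segment.
    const canonical = canonicalize(input).replace(/\\/g, '/');
    return canonical.split('/').includes('..');
}
```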

For Azure Functions, middleBrick analyzes the function.json configuration to identify HTTP-triggered functions that might serve files, then tests each endpoint with traversal payloads.

Azure-specific indicators that middleBrick flags:

  • Functions with bindingType: httpTrigger that read files
  • Code patterns using path.join() with user input
  • Blob Storage operations with unvalidated blob names
  • Queue/Topic operations with dynamic names

middleBrick's LLM security module also detects if AI endpoints are exposed without proper input validation, which could be exploited for path traversal through prompt injection in applications using Azure OpenAI services.

For continuous monitoring, middleBrick's Pro plan can be configured to scan your Azure deployment URL on a schedule, alerting you if new traversal vulnerabilities are introduced during deployments.

Azure-Specific Remediation

Azure provides several native approaches to prevent path traversal. The most effective is using Azure's built-in validation and path normalization features.

For Blob Storage, always validate and normalize blob names:

const path = require('path');
const { BlobServiceClient } = require('@azure/storage-blob');

const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
const containerClient = blobServiceClient.getContainerClient('documents');

function validateBlobName(blobName) {
    if (!blobName || typeof blobName !== 'string') {
        throw new Error('Invalid blob name');
    }

    // Reject traversal attempts
    if (blobName.includes('..') || blobName.includes('~')) {
        throw new Error('Invalid blob name');
    }

    // Normalize the path and reject any name that changes under normalization
    const normalized = path.normalize(blobName).replace(/^[.]{2,}/, '');
    if (normalized !== blobName) {
        throw new Error('Invalid blob name');
    }

    return normalized;
}

module.exports = async function (context, req) {
    try {
        const blobName = validateBlobName(req.query.filename);
        const blobClient = containerClient.getBlobClient(blobName);
        const downloadResponse = await blobClient.download();
        // ...
    } catch (err) {
        context.res = { status: 400, body: 'Invalid request' };
    }
};

For Azure Functions serving static files, use Azure's built-in static web app capabilities or implement strict path validation:

const path = require('path');
const fs = require('fs').promises;

const ALLOWED_DIRECTORY = path.join(__dirname, 'public');

async function serveFile(context, filePath) {
    const resolvedPath = path.resolve(ALLOWED_DIRECTORY, filePath);

    // Ensure the resolved path is within the allowed directory.
    // Compare against the directory plus a separator so a sibling
    // such as "publicsecrets" cannot slip past a bare prefix check.
    if (resolvedPath !== ALLOWED_DIRECTORY &&
        !resolvedPath.startsWith(ALLOWED_DIRECTORY + path.sep)) {
        context.res = { status: 403, body: 'Forbidden' };
        return false;
    }

    try {
        const fileContent = await fs.readFile(resolvedPath);
        context.res = { body: fileContent };
        return true;
    } catch (err) {
        context.res = { status: 404, body: 'Not found' };
        return false;
    }
}

module.exports = async function (context, req) {
    const filePath = req.query.file || 'index.html';
    await serveFile(context, filePath);
};

For Azure App Configuration and Key Vault operations, use managed identities and avoid constructing URLs from user input:

// Use the Azure SDK with a managed identity instead of URL construction
const { DefaultAzureCredential } = require('@azure/identity');
const { SecretClient } = require('@azure/keyvault-secrets');

const credential = new DefaultAzureCredential();
const secretClient = new SecretClient(
    `https://${process.env.KEY_VAULT_NAME}.vault.azure.net`,
    credential
);

// Access secrets through SDK methods, not URL construction
const secret = await secretClient.getSecret(keyName);

middleBrick's CLI tool can be integrated into your Azure DevOps pipeline to automatically scan new deployments:

# Install middleBrick CLI
npm install -g middlebrick

# Scan Azure Function App
middlebrick scan https://yourapp.azurewebsites.net --output json --fail-below B

This integration ensures path traversal vulnerabilities are caught before production deployment.

Related CWEs: Input Validation

CWE ID    Name                          Severity
CWE-20    Improper Input Validation     HIGH
CWE-22    Path Traversal                HIGH
CWE-74    Injection                     CRITICAL
CWE-77    Command Injection             CRITICAL
CWE-78    OS Command Injection          CRITICAL
CWE-79    Cross-site Scripting (XSS)    HIGH
CWE-89    SQL Injection                 CRITICAL
CWE-90    LDAP Injection                HIGH
CWE-91    XML Injection                 HIGH
CWE-94    Code Injection                CRITICAL

Frequently Asked Questions

How does Azure Blob Storage handle path traversal attempts?
Azure Blob Storage uses a flat namespace with virtual directory structure. While it doesn't allow traditional filesystem traversal, it can still be vulnerable to path traversal through blob name manipulation. The storage service will create virtual directories based on forward slashes in blob names, so ../../etc/passwd becomes a valid blob name. Azure's own validation is minimal, so applications must implement their own validation to prevent accessing unintended blobs or directories.
Can middleBrick scan Azure Functions running locally during development?
Yes, middleBrick's CLI tool can scan any HTTP endpoint, including Azure Functions running locally on http://localhost. Simply start your function app locally and use middlebrick scan http://localhost:7071/api/YourFunction. This allows you to catch path traversal vulnerabilities during development before deploying to Azure, integrating security testing into your local development workflow.