Stack Overflow in FastAPI with DynamoDB
Stack Overflow in FastAPI with DynamoDB — how this specific combination creates or exposes the vulnerability
A Stack Overflow in a FastAPI application that uses DynamoDB typically occurs when recursive data structures or uncontrolled depth in serialization cause the runtime call stack to exceed its limit. In FastAPI, this often surfaces in request/response models that recursively reference themselves (for example, a parent resource that contains a list of child resources that again reference the parent), or in JSON rendering when circular references are not explicitly handled before calling json.dumps() or returning a FastAPI Response. When the application also interacts with DynamoDB, additional risk paths appear.
DynamoDB itself does not introduce stack overflow, but the way data is modeled and retrieved can amplify the issue. For example:
- Deeply nested attribute access in application code (e.g., iterating over nested dictionaries without depth checks) can lead to recursion that overflows the stack.
- Large item scans or queries that return many items, each with complex nested attributes, may trigger recursive serialization logic in FastAPI’s dependency injection or response models, increasing stack depth.
- Misconfigured Pydantic models with recursive types or validators that re-fetch related items from DynamoDB can cause repeated database calls and recursive validation, raising the risk of stack exhaustion under certain inputs.
An attacker can probe this surface by sending deeply nested JSON payloads or by requesting resources that trigger recursive lookups (e.g., repeatedly expanding references). If input validation and depth limits are absent, the service may crash or become unresponsive. Because FastAPI’s default JSON encoder relies on jsonable_encoder and json.dumps, uncontrolled recursion in user-controlled data can propagate into the serialization layer, turning a modeling or integration issue into a denial-of-service vector.
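As a minimal sketch of the failure mode (the depth value and payload shape below are illustrative, not taken from any specific finding), deeply nested input can exhaust the interpreter's call stack during ordinary JSON serialization:

import json

# Illustrative only: build a dictionary nested far beyond any legitimate payload
deep = {}
cursor = deep
for _ in range(100_000):
    cursor["child"] = {}
    cursor = cursor["child"]

try:
    json.dumps(deep)
except RecursionError:
    # The standard encoder recurses once per nesting level and aborts here;
    # without a depth limit upstream this surfaces as a 500 or a crashed worker
    print("maximum recursion depth exceeded during serialization")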
OpenAPI/Swagger analysis performed by middleBrick highlights these recursive model patterns by cross-referencing spec definitions with runtime findings. When combined with the 12 security checks, this helps identify BFLA/Privilege Escalation risks around data exposure and unsafe consumption patterns that can exacerbate stack-related issues.
DynamoDB-Specific Remediation in FastAPI — concrete code fixes
To mitigate Stack Overflow risks in FastAPI with DynamoDB, focus on controlling recursion depth, avoiding automatic recursive traversal, and validating inputs before they reach business logic or serialization. Below are concrete, realistic patterns and code examples.
1. Flatten nested models and avoid recursive references
Do not define Pydantic models that directly or indirectly reference themselves in a way that encourages deep recursion. Instead, use identifiers and lazy resolution.
from pydantic import BaseModel
from typing import List, Optional

class ItemBase(BaseModel):
    id: str
    name: str

class ItemResponse(ItemBase):
    # Use references (IDs) instead of embedding full parent objects
    parent_id: Optional[str] = None
    children_ids: List[str] = []

    class Config:
        orm_mode = True
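For contrast, the self-referencing shape to avoid looks like the sketch below (a hypothetical model shown only to illustrate the anti-pattern): serializing a long parent/child chain built this way recurses once per nesting level.

from typing import List, Optional
from pydantic import BaseModel

class ItemNode(BaseModel):
    id: str
    name: str
    # Embedding full parent/child objects instead of IDs invites unbounded recursion
    parent: Optional["ItemNode"] = None
    children: List["ItemNode"] = []

ItemNode.update_forward_refs()  # needed for self-references on Pydantic v1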
2. Control recursion in serialization
When you must include related objects, cap depth and avoid automatic recursion with jsonable_encoder by explicitly selecting fields.
from fastapi.encoders import jsonable_encoder

def safe_serialize(item: ItemResponse, max_depth: int = 3):
    # Use explicit include/exclude or a custom dict to avoid deep recursion
    return {
        "id": item.id,
        "name": item.name,
        "parent_id": item.parent_id,
        "children_ids": item.children_ids[:max_depth],  # limit list size
    }
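If you do pass models through jsonable_encoder, its include/exclude parameters achieve the same effect declaratively; the field set below is illustrative:

from fastapi.encoders import jsonable_encoder

# Reusing ItemResponse from step 1: only whitelisted top-level fields are encoded,
# so nothing nested is traversed implicitly
item = ItemResponse(id="i-1", name="example", parent_id=None, children_ids=["c-1"])
payload = jsonable_encoder(item, include={"id", "name", "parent_id"})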
3. Validate input depth before DynamoDB operations
Reject or sanitize payloads that exceed safe nesting levels before they touch DynamoDB.
def validate_depth(data, max_depth: int = 5, current: int = 0) -> bool:
    # Accepts any JSON-shaped value; recursion is bounded by the max_depth check
    if current > max_depth:
        return False
    if isinstance(data, dict):
        return all(validate_depth(v, max_depth, current + 1) for v in data.values())
    if isinstance(data, list):
        return all(validate_depth(item, max_depth, current + 1) for item in data)
    return True

payload = {"a": {"b": {"c": {"d": {"e": {"f": 1}}}}}}
if not validate_depth(payload, max_depth=4):
    raise ValueError("Payload exceeds allowed depth")
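One way to wire this check in (the route path, status code, and depth limit below are illustrative) is to run it before any table access:

from fastapi import APIRouter, Body, HTTPException

router = APIRouter()

@router.post("/items")
def create_item(payload: dict = Body(...)):
    # Reject over-nested bodies before they reach business logic or DynamoDB
    if not validate_depth(payload, max_depth=5):
        raise HTTPException(status_code=422, detail="Payload nesting too deep")
    # ...proceed to put_item / business logic only after validation...
    return {"status": "accepted"}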
4. Use safe DynamoDB retrieval patterns in FastAPI dependencies
Avoid recursive attribute access when reading from DynamoDB. Project only the fields you need and enforce limits on related item expansion.
import boto3
from botocore.exceptions import ClientError

dynamodb = boto3.resource('dynamodb', region_name='us-east-1')
table = dynamodb.Table('Items')

def get_item_safe(item_id: str, limit_related: int = 10):
    try:
        # Project only the attributes this view needs ('name' is a DynamoDB
        # reserved word, so it is aliased via ExpressionAttributeNames)
        resp = table.get_item(
            Key={'id': item_id},
            ProjectionExpression='id, #n, children_ids',
            ExpressionAttributeNames={'#n': 'name'},
        )
        item = resp.get('Item')
        if not item:
            return None
        # Avoid deeply nested traversal; fetch related IDs separately and cap expansion
        children_ids = item.get('children_ids', [])[:limit_related]
        # Do not recursively fetch children here; resolve in separate controlled steps
        return {"id": item['id'], "name": item['name'], "children_ids": children_ids}
    except ClientError as e:
        # Handle error appropriately
        raise RuntimeError(f"DynamoDB error: {e.response['Error']['Message']}")
5. Enforce limits in FastAPI route logic
Add explicit checks in endpoints that expand related data, and prefer pagination or batch reads over unbounded recursion.
from fastapi import APIRouter, HTTPException

router = APIRouter()

@router.get("/items/{item_id}")
def read_item(item_id: str, expand: int = 0):
    if expand > 3:
        raise HTTPException(status_code=400, detail="Expansion depth limit exceeded")
    item = get_item_safe(item_id, limit_related=5)
    if not item:
        raise HTTPException(status_code=404, detail="Item not found")
    # Controlled expansion example (not recursive)
    if expand > 0 and item["children_ids"]:
        children = [get_item_safe(cid) for cid in item["children_ids"][:5]]
        item["children"] = children
        item["depth"] = expand
    return item
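Where children are stored as their own items keyed by parent, pagination keeps each request bounded; the GSI name by_parent and the page size below are assumptions for illustration:

from typing import Optional
from boto3.dynamodb.conditions import Key

def list_children_page(parent_id: str, page_size: int = 25, start_key: Optional[dict] = None):
    # Return one bounded page plus the cursor for the next one; the caller passes
    # LastEvaluatedKey back as start_key instead of expanding everything at once
    kwargs = {
        "IndexName": "by_parent",  # assumed GSI on parent_id
        "KeyConditionExpression": Key("parent_id").eq(parent_id),
        "Limit": page_size,
    }
    if start_key:
        kwargs["ExclusiveStartKey"] = start_key
    resp = table.query(**kwargs)
    return resp.get("Items", []), resp.get("LastEvaluatedKey")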
These patterns reduce the likelihood of Stack Overflow by bounding recursion, avoiding automatic deep traversal, and validating structure early. middleBrick can highlight insecure recursive models and unsafe consumption patterns in your OpenAPI spec and runtime findings, helping you prioritize fixes around data exposure and unsafe consumption checks.