Server-Side Request Forgery (SSRF) in CockroachDB
How SSRF Manifests in CockroachDB
Server-Side Request Forgery (SSRF) in CockroachDB environments typically occurs when the database is configured to connect to external services based on user-controlled input. This can happen through several CockroachDB-specific mechanisms:
- External data source connections using IMPORT statements with URLs
- CDC (Change Data Capture) sink configurations pointing to user-supplied endpoints
- Backup/restore operations with remote storage URLs
- JDBC/ODBC connection strings built from user input
- Cloud storage integrations (S3, GCS, Azure) with dynamic endpoint resolution
The most common attack pattern involves an attacker manipulating a URL parameter that gets passed to a CockroachDB IMPORT statement. For example:
```sql
IMPORT TABLE users_csv (id UUID PRIMARY KEY, name TEXT, email TEXT)
    CSV DATA ('http://attacker.com/data.csv');
```

Because CockroachDB itself makes the HTTP request to fetch the CSV file, an attacker who controls the URL can target internal services. This could allow port scanning of internal networks, access to cloud metadata services (such as AWS instance metadata at 169.254.169.254), or interactions with other internal APIs.
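A simple application-side defense is to never accept a full URL from the caller at all: take only a file name, validate it, and join it onto a fixed trusted base. A minimal sketch (the base URL, the `users_csv` table, and the naming rule are illustrative assumptions):

```python
import re
from urllib.parse import quote

# Illustrative assumption: all imports must come from this trusted location.
TRUSTED_IMPORT_BASE = "https://imports.example.com/exports"

def build_import_statement(filename: str) -> str:
    """Build an IMPORT statement from a user-supplied file name only.

    The caller never controls the scheme, host, or port, so the database
    cannot be pointed at internal services.
    """
    # Allow only simple names: no slashes, no '..', no scheme, must be .csv
    if not re.fullmatch(r"[A-Za-z0-9_\-]+\.csv", filename):
        raise ValueError(f"invalid import file name: {filename!r}")
    url = f"{TRUSTED_IMPORT_BASE}/{quote(filename)}"
    return f"IMPORT INTO users_csv CSV DATA ('{url}')"
```

Since the scheme, host, and port are hard-coded, the only caller-controlled component is a strictly validated file name.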
Another CockroachDB-specific SSRF vector is through the use of external data sources. CockroachDB supports creating external connections to cloud storage:
```sql
CREATE EXTERNAL CONNECTION my_s3_conn
    AS 's3://my-bucket?AWS_ACCESS_KEY_ID=...&AWS_SECRET_ACCESS_KEY=...&AWS_REGION=us-west-2';
```

If the bucket name or region is derived from user input without validation, an attacker could manipulate these values to access unintended resources or trigger requests to internal endpoints.
CDC configurations also present SSRF risks. When setting up change data capture to external sinks, CockroachDB needs to connect to the specified endpoint:
```sql
CREATE CHANGEFEED FOR TABLE my_table INTO 'kafka://my-broker:9092';
```

If the broker address is user-controlled, this creates an SSRF vulnerability in which the database will attempt to connect to arbitrary hosts and ports.
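Where sink addresses must be configurable, validate them before they ever reach `CREATE CHANGEFEED`. A minimal sketch (the kafka-only policy and the broker allow-list are illustrative assumptions):

```python
from urllib.parse import urlparse

# Illustrative allow-list of approved Kafka brokers.
ALLOWED_BROKERS = {
    "kafka-1.internal.example.com:9092",
    "kafka-2.internal.example.com:9092",
}

def validate_changefeed_sink(sink_uri: str) -> str:
    """Accept only kafka:// URIs that point at an approved broker."""
    parsed = urlparse(sink_uri)
    if parsed.scheme != "kafka":
        raise ValueError(f"unsupported sink scheme: {parsed.scheme!r}")
    if parsed.netloc not in ALLOWED_BROKERS:
        raise ValueError(f"broker not in allow-list: {parsed.netloc!r}")
    return sink_uri
```

The returned URI can then be interpolated into the changefeed statement, knowing the host and port come from a fixed set.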
CockroachDB-Specific Detection
Detecting SSRF vulnerabilities in CockroachDB requires examining both configuration files and query patterns. Here are specific detection methods:
Configuration Analysis
Examine node startup flags, environment variables, and cluster settings for values that might enable SSRF:

```shell
# Check for connection endpoints injected via environment variables
# (illustrative example):
COCKROACH_EXTERNAL_CONNECTIONS='{"s3":{"bucket":"${BUCKET_NAME}"}}'
```

Look for dynamic configuration where environment variables or user input populate connection parameters.
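One quick audit is to scan the process environment for values that embed URLs or host:port endpoints, then trace anything found back to a trusted source. A rough sketch (the regex is a heuristic, not a complete parser):

```python
import os
import re

# Matches common URL schemes or bare host:port endpoints in config values.
ENDPOINT_RE = re.compile(r"\b(?:https?|s3|gs|azure|kafka)://|\b[\w.-]+:\d{2,5}\b")

def find_endpoint_vars(env: dict) -> list:
    """Return names of variables whose values look like network endpoints."""
    return sorted(name for name, value in env.items()
                  if ENDPOINT_RE.search(value))

if __name__ == "__main__":
    # Audit the real process environment.
    for name in find_endpoint_vars(dict(os.environ)):
        print(name)
```

Any variable this flags that is populated from user input, rather than from deployment tooling, deserves a closer look.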
Query Pattern Analysis
Search for dangerous patterns in your SQL code:
```sql
SELECT key
FROM crdb_internal.node_statement_statistics
WHERE key LIKE '%IMPORT%CSV%DATA%'
   OR key LIKE '%CREATE%EXTERNAL%CONNECTION%'
   OR key LIKE '%CREATE%CHANGEFEED%INTO%';
```

Using middleBrick's API security scanner, you can specifically target CockroachDB endpoints for SSRF vulnerabilities. The scanner tests for:
- External data import functionality with various URL schemes
- Cloud storage endpoint resolution
- CDC configuration endpoints
- Backup/restore URL handling
middleBrick's black-box scanning approach is particularly effective here because it tests the actual runtime behavior without needing access to source code. The scanner will attempt to trigger SSRF conditions and report on any successful external connections made by the CockroachDB instance.
Network Traffic Analysis
Monitor network traffic from your CockroachDB nodes to identify unexpected outbound connections. Look for:
```shell
tcpdump -i any -n 'host cockroachdb-node and not port 26257 and not port 8080'
```

This captures traffic from CockroachDB nodes on ports other than the standard SQL/RPC port (26257) and DB Console port (8080); unexpected destinations here could indicate SSRF exploitation attempts.
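Destinations extracted from such a capture can be triaged with the standard `ipaddress` module. Note that cluster peers also live on private ranges, so exclude known node addresses before applying a check like this (a minimal sketch):

```python
import ipaddress

def is_suspicious_destination(ip_str: str) -> bool:
    """Flag destinations a database node should rarely initiate traffic to:
    loopback, RFC 1918 private ranges, and link-local addresses (which
    include cloud metadata services such as 169.254.169.254)."""
    ip = ipaddress.ip_address(ip_str)
    return ip.is_loopback or ip.is_private or ip.is_link_local
```

Feeding the unique destination IPs from the tcpdump output through this filter quickly surfaces metadata-service probes and internal port scans.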
CockroachDB-Specific Remediation
Remediating SSRF vulnerabilities in CockroachDB environments requires a defense-in-depth approach. Here are specific mitigations:
1. Input Validation and Sanitization
Implement strict validation on any user-controlled parameters that might reach CockroachDB:
```python
import re

def validate_s3_bucket(bucket_name: str) -> None:
    # Only allow specific, approved buckets
    allowed_buckets = ['my-app-prod', 'my-app-staging', 'my-app-dev']
    if bucket_name not in allowed_buckets:
        raise ValueError(f'Invalid bucket: {bucket_name}')
    # Defense in depth: disallow characters that could manipulate endpoints
    if re.search(r'[^a-zA-Z0-9.-]', bucket_name):
        raise ValueError(f'Invalid characters in bucket name: {bucket_name}')
```

2. Network-Level Controls
Implement egress filtering at the network level to restrict CockroachDB's outbound connections:
```shell
# Restrict outbound connections from the cockroach user
iptables -A OUTPUT -m owner --uid-owner cockroach -d 169.254.0.0/16 -j DROP
iptables -A OUTPUT -m owner --uid-owner cockroach -d 127.0.0.0/8 -j DROP
iptables -A OUTPUT -m owner --uid-owner cockroach -d 0.0.0.0/8 -j DROP
iptables -A OUTPUT -m owner --uid-owner cockroach -p tcp --dport 22 -j DROP
```

3. CockroachDB Configuration Hardening
Configure CockroachDB to disable or restrict external I/O. CockroachDB is configured through startup flags and cluster settings rather than a conf file; the relevant hardening flags are passed when starting each node:

```shell
# Block IMPORT/BACKUP/RESTORE from arbitrary HTTP(S) endpoints
cockroach start --external-io-disable-http ...

# Prevent use of implicit (instance-profile) cloud credentials, which an
# SSRF against the metadata service could otherwise abuse
cockroach start --external-io-disable-implicit-credentials ...
```

4. Parameterized Queries and Stored Procedures
If your CockroachDB version supports PL/pgSQL stored procedures (v23.2 and later), centralize import validation in one with strict parameter checks:

```sql
CREATE OR REPLACE PROCEDURE safe_import_csv(bucket_name STRING, file_key STRING)
LANGUAGE PLpgSQL
AS $$
DECLARE
    allowed_buckets STRING[] := ARRAY['my-app-prod', 'my-app-staging'];
    url STRING;
BEGIN
    -- Validate the bucket against the allow-list
    IF NOT (bucket_name = ANY (allowed_buckets)) THEN
        RAISE EXCEPTION 'invalid bucket name: %', bucket_name;
    END IF;
    -- Construct the URL from validated components only
    url := format('https://s3.amazonaws.com/%s/%s', bucket_name, file_key);
    -- Run the import with the validated URL
    EXECUTE format('IMPORT INTO users_csv CSV DATA (%L)', url);
END;
$$;
```

5. Runtime Monitoring
Enable CockroachDB's auditing features to detect suspicious import operations:
```sql
-- Enable SQL audit logging on the import target table
ALTER TABLE users_csv EXPERIMENTAL_AUDIT SET READ WRITE;

-- Application-side ledger of import operations
CREATE TABLE audit_imports (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_name STRING,
    operation_time TIMESTAMP DEFAULT now(),
    operation_type STRING,
    target_url STRING,
    success BOOL
);
```

CockroachDB does not support triggers on IMPORT statements, so populate the ledger from the jobs system instead, which records every import job along with its statement and source URI:

```sql
INSERT INTO audit_imports (user_name, operation_type, target_url, success)
SELECT user_name, 'IMPORT', description, status = 'succeeded'
FROM [SHOW JOBS]
WHERE job_type = 'IMPORT';
```

Frequently Asked Questions
Can CockroachDB's IMPORT statement be completely disabled to prevent SSRF?
Effectively, yes: start every node with the --external-io-disable-http flag, which blocks IMPORT (as well as BACKUP and RESTORE) from arbitrary HTTP endpoints, eliminating that SSRF vector. However, this may impact legitimate data-loading workflows, so consider whether alternative methods such as COPY FROM over the SQL connection or trusted data pipelines can cover your use cases.