Prototype Pollution in ASP.NET with CockroachDB
How this specific combination creates or exposes the vulnerability
Prototype pollution in ASP.NET applications interacting with CockroachDB typically arises when user-controlled input is merged into objects that later influence database operations, query construction, or object mapping. Because CockroachDB is a distributed SQL database often accessed via parameterized queries or an ORM, the risk is not direct query injection (which is prevented by prepared statements), but rather pollution of server-side objects that affect how data is interpreted, serialized, or validated before being sent to CockroachDB.
In ASP.NET, models are often bound from JSON payloads using model binders. If an attacker can inject unexpected properties, such as __proto__, constructor, or application-specific metadata keys, into a DTO, and that DTO is later used to construct updates or filter logic, the polluted object may affect batch operations or conditional logic that ultimately reaches CockroachDB. For example, a merge function that copies request.UserData into an entity object can propagate unexpected keys into a dictionary that is passed to a CockroachDB row update routine. While CockroachDB itself does not execute JavaScript, the polluted object can lead to unsafe dynamic queries, incorrect row filtering, or unintended updates when combined with dynamic LINQ or string-based WHERE clause assembly.
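The risky merge pattern described above can be sketched as follows. The EntityMerge and Demo names, and the is_admin key, are hypothetical and purely illustrative:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical merge helper illustrating the risky pattern: every key from
// user-supplied data is copied into the dictionary that a row-update routine
// will later turn into SET clauses for CockroachDB.
public static class EntityMerge
{
    public static Dictionary<string, object> Merge(
        Dictionary<string, object> entity,
        Dictionary<string, object> userData)
    {
        foreach (var kvp in userData)
        {
            // No whitelist: attacker-controlled keys land in the entity.
            entity[kvp.Key] = kvp.Value;
        }
        return entity;
    }
}

public static class Demo
{
    public static void Main()
    {
        var entity = new Dictionary<string, object> { ["name"] = "alice" };
        var userData = new Dictionary<string, object>
        {
            ["name"] = "mallory",
            ["is_admin"] = true // unexpected key survives the merge
        };
        var merged = EntityMerge.Merge(entity, userData);
        Console.WriteLine(merged.ContainsKey("is_admin")); // True
    }
}
```

If the merged dictionary is later iterated to build an UPDATE statement, the injected is_admin key becomes a column assignment the developer never intended.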
Another vector involves logging, telemetry, or audit trails. If polluted objects are serialized into structured logs before being shipped to CockroachDB for storage, the pollution can alter record semantics or expose sensitive fields. Because CockroachDB supports JSONB columns, storing loosely validated JSON payloads without schema enforcement increases exposure. A common pattern is deserializing incoming JSON into ExpandoObject or Dictionary and later inserting into a JSONB column; if keys like __proto__ or constructor are present, they may affect client-side reconstruction even if the database stores raw JSON. The vulnerability therefore manifests not in SQL execution, but in application-layer object handling that feeds into CockroachDB operations.
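One mitigation for the JSONB pattern is to strip the dangerous keys before storage. A minimal sketch using System.Text.Json.Nodes (the JsonbSanitizer name is illustrative; call it on the document before writing it to the JSONB column):

```csharp
using System.Text.Json.Nodes;

// Recursively remove keys that are dangerous when the stored JSON is later
// reconstructed in JavaScript clients, before the document reaches a JSONB
// column. The list of keys reflects common prototype-pollution vectors.
public static class JsonbSanitizer
{
    private static readonly string[] Dangerous = { "__proto__", "constructor", "prototype" };

    public static void Strip(JsonNode? node)
    {
        if (node is JsonObject obj)
        {
            foreach (var key in Dangerous)
                obj.Remove(key);
            // Recurse into remaining children (removal above happens first,
            // so this enumeration is safe).
            foreach (var child in obj)
                Strip(child.Value);
        }
        else if (node is JsonArray arr)
        {
            foreach (var item in arr)
                Strip(item);
        }
    }
}
```

Usage: parse the incoming payload with JsonNode.Parse, call Strip on the root, then pass node.ToJsonString() as the JSONB parameter value.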
ASP.NET’s default model validation does not inherently prevent prototype-style key pollution, especially when using [FromBody] with permissive deserialization settings. If the application later uses reflection or dynamic access to populate CockroachDB entity properties, polluted keys can silently map to unexpected fields. This is particularly risky when using community libraries or custom mappers that iterate over all enumerable properties without filtering. The interaction between ASP.NET’s flexible binding and CockroachDB’s schemaless JSON capabilities creates a pathway where prototype pollution affects data integrity and access controls applied before database submission.
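A defensive version of such a mapper filters against an explicit allow list before touching any property via reflection. The SafeMapper name, the Product type, and the allowed set below are illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

public class Product
{
    public string Name { get; set; } = string.Empty;
    public decimal Price { get; set; }
    public bool IsAdmin { get; set; } // never writable through SafeMapper
}

// Allow-list mapper: only properties named in Allowed are ever assigned,
// so polluted keys such as __proto__ or IsAdmin are dropped silently
// instead of being mapped onto the entity by reflection.
public static class SafeMapper
{
    private static readonly HashSet<string> Allowed =
        new(StringComparer.OrdinalIgnoreCase) { "Name", "Price" };

    public static void Apply<T>(T entity, IReadOnlyDictionary<string, object> input)
    {
        foreach (var pair in input)
        {
            if (!Allowed.Contains(pair.Key))
                continue; // drop unexpected keys

            var prop = typeof(T).GetProperty(pair.Key,
                BindingFlags.Public | BindingFlags.Instance | BindingFlags.IgnoreCase);
            if (prop != null && prop.CanWrite)
                prop.SetValue(entity, Convert.ChangeType(pair.Value, prop.PropertyType));
        }
    }
}
```

The key design choice is that the allow list, not the incoming payload, drives what can be written; anything outside it never reaches reflection at all.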
CockroachDB-Specific Remediation in ASP.NET — concrete code fixes
Remediation focuses on strict input validation, avoiding dynamic property merging, and using typed, schema-bound models before any CockroachDB interaction. Prefer strongly typed DTOs with explicit property definitions and disable support for polymorphic deserialization of object graphs.
- Use immutable, typed models and avoid ExpandoObject or Dictionary<string, object> for incoming data that will reach CockroachDB.
- Validate and whitelist allowed fields before mapping to database entities.
- Use parameterized SQL or an ORM with built-in protection, and never concatenate user input into query strings.
Example: Safe deserialization and insertion into CockroachDB using Npgsql and typed models.
using System.Text.Json;
using Npgsql;

public class ProductDto
{
    public int Id { get; set; }
    public string Name { get; set; } = string.Empty;
    public decimal Price { get; set; }
}

public async Task InsertProductAsync(string connectionString, JsonElement payload)
{
    // Explicitly map only expected fields; ignore extra keys
    var product = new ProductDto
    {
        Id = payload.GetProperty("id").GetInt32(),
        Name = payload.GetProperty("name").GetString() ?? string.Empty,
        Price = payload.GetProperty("price").GetDecimal()
    };

    await using var conn = new NpgsqlConnection(connectionString);
    await conn.OpenAsync();

    await using var cmd = new NpgsqlCommand(
        "INSERT INTO products (id, name, price) VALUES (@id, @name, @price)", conn);
    cmd.Parameters.AddWithValue("id", product.Id);
    cmd.Parameters.AddWithValue("name", product.Name);
    cmd.Parameters.AddWithValue("price", product.Price);

    await cmd.ExecuteNonQueryAsync();
}
Example: Validating against unexpected keys with JsonDocument before processing.
using System.Text.Json;

public bool ValidateAllowedFields(string json)
{
    using var doc = JsonDocument.Parse(json);
    var root = doc.RootElement;

    if (root.ValueKind != JsonValueKind.Object)
        return false;

    var allowed = new HashSet<string> { "id", "name", "price" };
    foreach (var prop in root.EnumerateObject())
    {
        if (!allowed.Contains(prop.Name))
            return false;
    }
    return true;
}
In the ASP.NET pipeline, apply model validation attributes and configure System.Text.Json so that deserialization does not silently accept members the DTO never declared:
services.AddControllers()
    .AddJsonOptions(options =>
    {
        options.JsonSerializerOptions.IncludeFields = false;
        options.JsonSerializerOptions.PropertyNameCaseInsensitive = true;
        // .NET 8+: reject payloads containing members the DTO does not declare
        options.JsonSerializerOptions.UnmappedMemberHandling =
            JsonUnmappedMemberHandling.Disallow;
        // Do not enable support for reading arbitrary object graphs
    });
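The ProductDto from the earlier example can additionally carry validation attributes so that [ApiController] endpoints reject malformed payloads before any mapping or database work occurs. The specific ranges and lengths below are illustrative:

```csharp
using System.ComponentModel.DataAnnotations;

// Attribute-validated DTO: with [ApiController], a request failing these
// checks is rejected with a 400 response before the action body runs.
public class ProductDto
{
    [Range(1, int.MaxValue)]
    public int Id { get; set; }

    [Required, StringLength(200)]
    public string Name { get; set; } = string.Empty;

    [Range(0.01, 1_000_000)]
    public decimal Price { get; set; }
}
```

The same attributes can be checked manually via Validator.TryValidateObject when the DTO arrives through a non-MVC path.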
For JSONB columns in CockroachDB, enforce schema validation at the application layer or via database constraints rather than relying on the database to accept arbitrary keys. This prevents polluted objects from being stored and later reconstructed in client code that may traverse __proto__ chains.
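At the database layer, a CHECK constraint can reject documents that carry the dangerous keys at the top level. The table and column names below are illustrative, and note that this only inspects top-level keys; nested keys still require application-side stripping:

```sql
-- Illustrative constraint: rejects JSONB payloads whose top-level keys
-- include common prototype-pollution vectors. The '?' operator tests
-- whether a key exists in the JSONB value.
ALTER TABLE audit_events
    ADD CONSTRAINT payload_no_pollution_keys CHECK (
        NOT (payload ? '__proto__')
        AND NOT (payload ? 'constructor')
        AND NOT (payload ? 'prototype')
    );
```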