Memory Leak in FeathersJS with API Keys
Memory Leak in FeathersJS with API Keys — how this specific combination creates or exposes the vulnerability
A memory leak in a FeathersJS service that uses API keys typically arises when per-request key-validation state or key metadata is retained beyond the request lifecycle. FeathersJS services often rely on hooks and custom service logic; if API key information (for example, a decoded key payload or a cached authorization decision) is stored on the service, the connection, or a module-level cache without cleanup, each authenticated request adds data that the garbage collector cannot reclaim.
Consider a FeathersJS hook that decodes an API key and attaches user-like metadata to the context for downstream use:
// api-keys.hooks.js
const apiKeyCache = new Map();

function attachApiKeyData(context) {
  const { keyId } = context.params.query;
  if (keyId) {
    // Risk: the module-level Map retains an entry per keyId with no eviction
    if (!apiKeyCache.has(keyId)) {
      apiKeyCache.set(keyId, { keyId, scope: 'extensive-access', requests: 0 });
    }
    apiKeyCache.get(keyId).requests += 1;
    context.apiKeyData = apiKeyCache.get(keyId);
  }
  return context;
}
If apiKeyCache grows indefinitely because keys are never evicted or pruned, the Node.js process heap grows steadily — a classic memory leak. In a FeathersJS app, this is exacerbated when the service handles many short-lived connections (e.g., REST or Socket.io) and each connection attaches large objects to the context. The leak may not be obvious initially because FeathersJS abstracts much of the request handling, but the retained references prevent V8 from reclaiming memory, eventually increasing RSS and potentially causing process instability or restarts.
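The unbounded growth is easy to make concrete with a small standalone sketch; the key IDs and the shape of the cached entries below are illustrative, not taken from a real application:

```javascript
// Hypothetical demonstration: an unbounded Map grows the heap with every
// distinct key, just as apiKeyCache does under traffic with many key IDs.
const leakyCache = new Map();

function simulateRequest(keyId) {
  // Each new keyId adds an entry that is never evicted.
  if (!leakyCache.has(keyId)) {
    leakyCache.set(keyId, { keyId, scope: 'extensive-access', requests: 0 });
  }
  leakyCache.get(keyId).requests += 1;
}

const before = process.memoryUsage().heapUsed;
for (let i = 0; i < 100_000; i++) {
  simulateRequest(`key-${i}`); // 100k distinct keys => 100k retained entries
}
const after = process.memoryUsage().heapUsed;

console.log(`entries: ${leakyCache.size}`);
console.log(`heap growth: ~${Math.round((after - before) / 1024)} KiB`);
```

Under real traffic the same pattern plays out more slowly, which is why the leak often only shows up as a gradual RSS climb over hours or days.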
Another scenario involves service methods that perform per-call authorization checks using API keys and unintentionally create closures that capture large request-scoped objects. For instance:
// services/orders/orders.class.js
class OrdersService {
  find(params) {
    const { apiKey } = params.query;
    // Risk: the closure below captures params and large intermediate objects
    return getOrders().then(orders =>
      orders.filter(order => {
        // Expensive per-call authorization logic retained in closure scope
        return authorizeOrder(apiKey, order);
      })
    );
  }
}
If orders or the authorization context holds references to sizable buffers or parsed payloads, and the service does not release them promptly, the memory footprint grows with each call. Over time, this pattern can manifest as a memory leak in production, especially under sustained load. Because middleBrick scans the unauthenticated attack surface and includes checks such as Unsafe Consumption and Property Authorization, it can surface risky authorization patterns that may contribute to such retention issues, with prioritized findings, severity ratings, and remediation guidance.
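A minimal mitigation sketch for this closure scenario, using the same hypothetical helpers (getOrders and authorizeOrder): copy only the primitive apiKey string before the async work, and map results down to small objects, so that neither params nor large raw payloads outlive the call:

```javascript
// Sketch: capture only the primitive apiKey string, not the whole params
// object, so the closure cannot retain large request-scoped state.
class OrdersService {
  async find(params) {
    const apiKey = String(params.query.apiKey); // primitive copy; params is not captured below
    const orders = await getOrders();
    // Reduce each order to the fields the caller needs before returning,
    // so large raw payloads on the order objects are not retained.
    return orders
      .filter(order => authorizeOrder(apiKey, order))
      .map(({ id, total }) => ({ id, total }));
  }
}
```

With async/await, intermediate values like orders go out of scope when the method returns, rather than living on in a chain of `.then` closures.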
In summary, the combination of FeathersJS hooks, long-lived caches keyed by API key identifiers, and closure-heavy authorization logic creates conditions where memory is retained unintentionally. Monitoring RSS and using heap snapshots are practical ways to detect such leaks, while refactoring to avoid persistent attachment of key metadata is an effective mitigation.
API Key-Specific Remediation in FeathersJS — concrete code fixes
To remediate memory leaks related to API keys in FeathersJS, focus on avoiding long-lived references, limiting cache growth, and ensuring per-request data does not persist beyond the request lifecycle. Below are concrete fixes and examples that align with secure usage patterns supported by middleBrick checks.
1. Use WeakMap for key metadata instead of Map. A WeakMap entry becomes eligible for garbage collection once its key object is otherwise unreachable, which prevents unbounded growth. Note that WeakMap keys must be objects, not strings, so the store must be keyed by a parsed key object rather than a raw keyId:
// api-keys.hooks.js
const keyMetadataStore = new WeakMap();

function attachApiKeyData(context) {
  // keyObject is assumed to be the parsed key, set on params by an earlier
  // hook; WeakMap keys cannot be primitive strings from the query
  const { keyObject } = context.params;
  if (keyObject) {
    // WeakMap holds metadata only while keyObject is referenced elsewhere
    if (!keyMetadataStore.has(keyObject)) {
      keyMetadataStore.set(keyObject, { scope: 'extensive-access', requests: 0 });
    }
    const meta = keyMetadataStore.get(keyObject);
    meta.requests += 1;
    // Attach only lightweight references to the context
    context.apiKeyMeta = meta;
  }
  return context;
}
Because WeakMap entries do not prevent garbage collection of their key objects, memory pressure remains bounded even under sustained traffic.
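One caveat worth demonstrating: a raw keyId string from the query cannot be used as a WeakMap key directly. A minimal sketch (the `keyObject` shape here is illustrative):

```javascript
// Sketch: WeakMap keys must be objects; a raw keyId string from the query
// cannot be used directly. A parsed key object works.
const store = new WeakMap();

const keyObject = { keyId: 'abc123' }; // hypothetical parsed key
store.set(keyObject, { scope: 'standard', requests: 0 });

console.log(store.has(keyObject)); // true while keyObject is referenced

// Primitive keys are rejected:
try {
  store.set('abc123', {});
} catch (err) {
  console.log(err instanceof TypeError); // true
}

// Once the last reference to keyObject is dropped, the entry becomes
// eligible for garbage collection automatically — no eviction code needed.
```

This is why the hook above keys the store by the parsed key object rather than by the keyId string.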
2. Limit cache size and add TTL eviction. If you must use a Map, implement size limits and time-based cleanup:
// secure-cache.js
const MAX_ENTRIES = 1000;
const TTL_MS = 300_000; // 5 minutes

class BoundedCache {
  constructor() {
    this.store = new Map();
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() - entry.ts > TTL_MS) {
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }

  set(key, value) {
    if (this.store.size >= MAX_ENTRIES) {
      // Remove the oldest entry (Maps iterate in insertion order)
      const oldestKey = this.store.keys().next().value;
      this.store.delete(oldestKey);
    }
    this.store.set(key, { value, ts: Date.now() });
  }
}

const apiKeyCache = new BoundedCache();

function attachApiKeyData(context) {
  const { keyId } = context.params.query;
  if (keyId) {
    const cached = apiKeyCache.get(keyId);
    if (cached) {
      context.apiKeyCacheHit = true;
      context.apiKeyData = cached;
    } else {
      const fresh = { scope: 'standard' };
      apiKeyCache.set(keyId, fresh);
      context.apiKeyCacheHit = false;
      context.apiKeyData = fresh;
    }
  }
  return context;
}
This pattern ensures memory usage remains predictable and aligns with middleBrick’s findings around Property Authorization and BFLA/Privilege Escalation by encouraging bounded, short-lived caches.
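To confirm the bound actually holds under churn, here is a quick usage sketch; it restates a compact version of the class above so it runs standalone:

```javascript
// Compact restatement of BoundedCache (from above) so this snippet runs standalone.
const MAX_ENTRIES = 1000;
const TTL_MS = 300_000;

class BoundedCache {
  constructor() { this.store = new Map(); }
  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() - entry.ts > TTL_MS) { this.store.delete(key); return undefined; }
    return entry.value;
  }
  set(key, value) {
    if (this.store.size >= MAX_ENTRIES) {
      this.store.delete(this.store.keys().next().value); // evict oldest insertion
    }
    this.store.set(key, { value, ts: Date.now() });
  }
}

// Even after 10x MAX_ENTRIES inserts, the cache never exceeds its bound.
const cache = new BoundedCache();
for (let i = 0; i < MAX_ENTRIES * 10; i++) {
  cache.set(`key-${i}`, { scope: 'standard' });
}
console.log(cache.store.size); // 1000
console.log(cache.get('key-9999') !== undefined); // true: newest entry survives
console.log(cache.get('key-0')); // undefined: evicted long ago
```

Evicting by insertion order is a FIFO strategy; a production cache might prefer LRU (re-inserting on access), but the memory bound holds either way.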
3. Avoid attaching heavy objects to context. Store only lightweight identifiers and fetch data on demand:
// services/users/users.class.js
class UsersService {
  // Feathers service methods receive (id, params), not a hook context
  async get(id, params) {
    const { keyId } = params.query;
    if (!keyId) throw new Error('keyId required');
    // Fetch user data per request instead of caching large objects
    const user = await fetchUserById(keyId);
    // Return only the fields needed; do not retain the full user object
    return { userId: user.id, name: user.name };
  }
}
By fetching data per request and returning only necessary fields, you reduce the risk of inadvertently retaining large objects in closures or caches. middleBrick’s checks on Input Validation and Unsafe Consumption can highlight overly permissive key usage that encourages such patterns.
These remediation steps help ensure API key handling in FeathersJS remains efficient and leak-free while supporting compliance mappings that middleBrick can surface in reports, including OWASP API Top 10 and SOC2 considerations.