LLM STATE DEDUP
llm-state-dedup.ts
LLM-emitted ids are display labels, not primary keys. Content-hash dedup with logical-key fallback for command-string drift.
WHAT THIS PATTERN TEACHES
Why every LLM invocation is stateless from the model's perspective, so emitted IDs drift between cycles. How to key dedup on operative content (canonical command, request shape, recipient hash) instead of on the model-emitted id field.
WHEN TO USE THIS
Any VoidForge project using an LLM as a decision engine that emits actionable items: approvals, tickets, queued operations, notifications. Required if hourly/scheduled runs share context across cycles.
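For scheduled runs that must share dedup state across cycles, the seen-set has to outlive the process. A minimal persistence sketch, assuming a JSON file of content-hash keys in the OS temp dir (the file path, name, and shape are illustrative assumptions, not part of the pattern spec):

```typescript
import { readFileSync, writeFileSync, existsSync } from 'node:fs'
import { join } from 'node:path'
import { tmpdir } from 'node:os'

// Hypothetical store location; a real project would pick a durable path.
const STORE = join(tmpdir(), 'dedup-seen.json')

// Load the set of content-hash keys recorded by earlier cycles.
export function loadSeen(): Set<string> {
  return existsSync(STORE)
    ? new Set(JSON.parse(readFileSync(STORE, 'utf8')) as string[])
    : new Set<string>()
}

// Persist the set so the next scheduled run can skip already-seen proposals.
export function saveSeen(seen: Set<string>): void {
  writeFileSync(STORE, JSON.stringify([...seen]))
}
```

An hourly job would `loadSeen()` at startup, check each proposal's content hash against it, and `saveSeen()` before exit.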
AT A GLANCE
// LLM-emitted id varies across cycles for the same operation.
// Dedup on a content hash instead:
const key = shellCommandHash(proposal.command) // sha256 of canonical form
if (seen.has(key)) skipApproval()
FRAMEWORK IMPLEMENTATIONS
TypeScript
import { createHash } from 'node:crypto'
// LLM-emitted identifiers drift across cycles. Two cycles asking the model
// to propose the same fix produce DIFFERENT id strings for substantively
// identical commands. Dedup keys must be derived from OPERATIVE CONTENT,
// not from the LLM's id field.
export interface ProposalDedupKey {
  /** Content-hash of the operative payload — the actual dedup key. */
  contentHash: string
  /** Optional looser key for command-string drift collapse. */
  logicalKey?: string
  /** LLM-emitted id, retained as display label only. NEVER as primary key. */
  displayId?: string
}
// Hash the canonical command string. Normalize whitespace and quoting so
// cosmetically-different but semantically-identical commands collapse.
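A minimal sketch of that hashing, plus the looser logical key from the interface above. The specific normalization rules (collapse whitespace, unify quote style) and the logical-key shape (binary name plus sorted flags) are illustrative assumptions, not a fixed spec:

```typescript
import { createHash } from 'node:crypto'

// Canonicalize, then hash. Normalization rules here are assumptions:
// collapse whitespace runs, unify double quotes to single quotes.
export function shellCommandHash(command: string): string {
  const canonical = command
    .trim()
    .replace(/\s+/g, ' ') // collapse runs of whitespace
    .replace(/"/g, "'")   // unify quote style
  return createHash('sha256').update(canonical).digest('hex')
}

// Looser fallback key: binary name plus sorted flags, so re-ordered
// flags still collapse to the same key. Also an illustrative assumption.
export function shellLogicalKey(command: string): string {
  const [bin, ...args] = command.trim().split(/\s+/)
  const flags = args.filter(a => a.startsWith('-')).sort()
  return `${bin}:${flags.join(',')}`
}
```

Two proposals with the same content hash are certainly duplicates; matching only on the logical key signals a likely duplicate worth flagging for review rather than silently dropping.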