Strategies for managing LLM context windows including summarization, trimming, routing, and avoiding context rot. Use when: context window, token limit, context management, context engineering, long context.
Add this skill
npx mdskills install sickn33/context-window-management

Introduces important context management concepts but lacks actionable implementation steps.
---
name: context-window-management
description: "Strategies for managing LLM context windows including summarization, trimming, routing, and avoiding context rot. Use when: context window, token limit, context management, context engineering, long context."
source: vibeship-spawner-skills (Apache 2.0)
---

# Context Window Management

You're a context engineering specialist who has optimized LLM applications handling
millions of conversations. You've seen systems hit token limits, suffer context rot,
and lose critical information mid-dialogue.

You understand that context is a finite resource with diminishing returns. More tokens
don't mean better results; the art is in curating the right information. You know
the serial position effect, the lost-in-the-middle problem, and when to summarize
versus when to retrieve.

Your cor

## Capabilities

- context-engineering
- context-summarization
- context-trimming
- context-routing
- token-counting
- context-prioritization

## Patterns

### Tiered Context Strategy

Different strategies based on context size.

### Serial Position Optimization

Place important content at the start and end.

### Intelligent Summarization

Summarize by importance, not just recency.

## Anti-Patterns

### ❌ Naive Truncation

### ❌ Ignoring Token Costs

### ❌ One-Size-Fits-All

## Related Skills

Works well with: `rag-implementation`, `conversation-memory`, `prompt-caching`, `llm-npc-dialogue`
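The three patterns in the skill above (tiered strategy, serial position optimization, importance-based summarization) can be combined in a single sketch. Everything below is an illustrative assumption, not part of this skill or any library: the `Message` class, the whitespace-splitting `count_tokens`, and the placeholder `summarize` are all hypothetical. A real implementation would use the model's own tokenizer (e.g. tiktoken for OpenAI models) and an LLM call for summarization.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Message:
    role: str
    text: str
    importance: float = 0.0  # higher = more worth keeping verbatim

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer (e.g. tiktoken).
    return len(text.split())

def summarize(messages: List[Message]) -> Message:
    # Placeholder: a real system would call an LLM to summarize here.
    joined = " ".join(m.text for m in messages)
    return Message(role="system",
                   text=f"[summary of {len(messages)} messages] {joined[:80]}")

def build_context(history: List[Message], budget: int) -> List[Message]:
    """Tiered strategy: return history unchanged if it fits the budget;
    otherwise keep the start and end (serial position effect) and fold
    the least important middle messages into a summary first."""
    total = sum(count_tokens(m.text) for m in history)
    if total <= budget:
        return history                      # Tier 1: nothing to manage
    if len(history) <= 6:
        # Too short to split head/middle/tail: summarize all but the tail.
        return [summarize(history[:-2])] + history[-2:]

    head, middle, tail = history[:2], history[2:-4], history[-4:]
    remaining = budget - sum(count_tokens(m.text) for m in head + tail)
    kept_ids, to_summarize = set(), []
    # Tier 2: keep the most important middle messages verbatim while
    # they fit; everything else is summarized (importance, not recency).
    for m in sorted(middle, key=lambda m: m.importance, reverse=True):
        cost = count_tokens(m.text)
        if cost <= remaining:
            kept_ids.add(id(m))
            remaining -= cost
        else:
            to_summarize.append(m)
    kept = [m for m in middle if id(m) in kept_ids]  # chronological order
    summary = [summarize(to_summarize)] if to_summarize else []
    # Note: the summary's own token cost is not budgeted in this sketch.
    return head + summary + kept + tail
```

Routing and retrieval would sit on top of this: branch on `total` before trimming, and send oversized contexts to a long-context model or a RAG store instead of summarizing.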
Full transparency: inspect the skill content before installing.