# sttp-mcp

> *Language models are stateless. Every session starts cold. STTP gives conversational state somewhere to go.*

**Spatio-Temporal Transfer Protocol (STTP)** is a typed intermediate representation that encodes conversational state into a compressed, confidence-weighted structure any model can reconstruct. This is the MCP server that exposes that capability as tools.

Licensed under Apache-2.0. See [LICENSE](../../LICENSE).

---

## The Problem

Every AI conversation dies when the session ends. The context, the reasoning state, the accumulated understanding — gone. The next session starts from zero.

Existing workarounds — long context windows, RAG, conversation history injection — patch the symptom. They don't solve the problem. They pass raw text around and hope the model reconstructs meaning from it.

STTP encodes the meaning directly. Not what was said. What remains true when every surface detail is stripped away.

---

## What STTP Is

STTP is a typed intermediate representation with four layers:

```
⊕⟨⟩  Provenance — origin, lineage, response contract
⦿⟨⟩  Envelope   — identity, session metadata, dual AVEC state
◈⟨⟩  Content    — compressed meaning, confidence-weighted fields
⍉⟨⟩  Metrics    — signal quality, coherence verification
```

Every field in the content layer carries a confidence weight:

```
topic(.95):          "low latency communication protocols for LLM servers"
constraint(.92):     "latency is the primary optimization target"
recommendation(.93): "gRPC over HTTP/2 with QUIC overlay"
```

Every node carries dual AVEC state — the attractor vectors that describe the cognitive geometry of the conversation at the moment of compression:

```
user_avec:  { stability: .85, friction: .25, logic: .90, autonomy: .80, psi: 2.80 }
model_avec: { stability: .88, friction: .22, logic: .85, autonomy: .75, psi: 2.70 }
```

A fresh model receiving an STTP node doesn't get a summary.
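
One regularity worth noting in the sample vectors: `psi` matches the sum of the four attractor components. That is an inference from the example values, not a documented formula; a quick Python sanity check:

```python
# Assumption: psi is the sum of the four AVEC components.
# Inferred from the sample values above, not a documented formula.

def psi(avec: dict) -> float:
    """Total attractor magnitude over the four AVEC components."""
    return round(sum(avec[k] for k in ("stability", "friction", "logic", "autonomy")), 2)

user_avec  = {"stability": 0.85, "friction": 0.25, "logic": 0.90, "autonomy": 0.80}
model_avec = {"stability": 0.88, "friction": 0.22, "logic": 0.85, "autonomy": 0.75}

print(psi(user_avec))   # 2.8, matching psi: 2.80 in the sample
print(psi(model_avec))  # 2.7, matching psi: 2.70 in the sample
```
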
What it gets is a mathematical representation of a conversational state it can reconstruct from.

---

## Proof of Concept

This pipeline ran live, unplanned, on 2026-03-03:

```
DeepSeek   received a gift recommendation request
           produced a full conversational response

Kimi-k2    received the raw DeepSeek conversation
           compressed it into a valid STTP node
           no prior context, no shared state

GPT-4o     received only the compressed STTP node
           produced a coherent, contextually aware response
           continuing exactly where DeepSeek left off
```

Three different companies. Three different architectures. Zero shared state. The conversation arrived intact — with nuance, constraints, and the correct next action queued.

That is not a demo. That is the protocol working.

---

## Validation

Validated 2026-03-01 across GPT, Claude, Gemini, and Kimi-k2.

| Model | `temporal_node` | `natural_language` | Safety Triggered |
|---|---|---|---|
| GPT-4o | ✅ | ✅ | ❌ |
| Claude | ✅ | ✅ | ❌ |
| Gemini | ✅ | ✅ | ❌ |
| Kimi-k2 | ✅ | ✅ | ❌ |

All four models parsed, responded in, and extended the protocol correctly. All four computed independent AVEC states. Zero safety triggers across all eight tests.

---

## How It Works

The model calling these tools **is** the compression model. There is no separate inference step. The tool descriptions carry the encoding instructions. By the time the model calls a tool, it has already produced the STTP node as the argument.

```
Model reads tool description     → receives encoding instructions
Model compresses current context → produces ⏣ node
Model calls store_context(node)  → server validates + stores
```

The server does three things only: validate structure, persist the node, and retrieve on resonance.
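
As a concrete illustration of that loop, here is a minimal sketch of what a `store_context` payload and the server's structural check could look like. All field names here are hypothetical; the real node grammar lives in the protocol spec, and the real server validates with tree-sitter rather than a hand-rolled check:

```python
# Hypothetical sketch of a store_context payload and a light structural check.
# Field names are illustrative, not the actual STTP grammar.

REQUIRED_LAYERS = ("provenance", "envelope", "content", "metrics")
AVEC_KEYS = ("stability", "friction", "logic", "autonomy")

node = {
    "provenance": {"origin": "session-042", "lineage": [], "contract": "respond"},
    "envelope": {
        "session_id": "session-042",
        "user_avec":  {"stability": 0.85, "friction": 0.25, "logic": 0.90, "autonomy": 0.80},
        "model_avec": {"stability": 0.88, "friction": 0.22, "logic": 0.85, "autonomy": 0.75},
    },
    "content": {
        # Each field is (confidence, compressed meaning).
        "topic":          (0.95, "low latency communication protocols for LLM servers"),
        "constraint":     (0.92, "latency is the primary optimization target"),
        "recommendation": (0.93, "gRPC over HTTP/2 with QUIC overlay"),
    },
    "metrics": {"signal": 0.9},
}

def validate(node: dict) -> bool:
    """All four layers present, both AVEC vectors complete,
    every content field carrying a confidence weight in [0, 1]."""
    if any(layer not in node for layer in REQUIRED_LAYERS):
        return False
    for avec in ("user_avec", "model_avec"):
        if set(AVEC_KEYS) - set(node["envelope"].get(avec, {})):
            return False
    return all(0.0 <= conf <= 1.0 for conf, _ in node["content"].values())

print(validate(node))  # True
```
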
The intelligence stays in the model.

---

## Tools

sttp-mcp provides five MCP tools that enable models to persist and retrieve conversational state:

### `calibrate_session`

Call at session start and any time reasoning state may have shifted — after heavy code generation, extended analysis, or complex problem solving. The model measures its current AVEC state honestly, and the server returns the last stored state for this session. The delta is the drift signal.

Users can trigger this naturally:

> *"We're going in circles, can you recalibrate?"*
> *"That last hour of coding has you in a weird place, reset."*

The model knows what to do.

### `store_context`

Call when context should be preserved. The model compresses the current conversational state into a single valid STTP node and passes it to the server. The server runs light tree-sitter structural validation, persists the node, and returns the node ID and Ψ coherence checksum.

### `get_context`

Call at session start after calibration, or any time prior context should be retrieved. The model passes its current AVEC state. The server returns the most resonant stored nodes for that attractor configuration. The model rehydrates from them directly — the nodes are self-sufficient.

### `list_nodes`

Call to retrieve all stored nodes, optionally filtered by session ID or limited by count. Returns nodes with full metadata (AVEC states, timestamps, compression depth, Ψ values). Useful for exploring what's in memory, verifying cross-instance persistence, or auditing stored state.

Arguments:
- `sessionId` (optional): Filter nodes to a specific session
- `limit` (optional): Maximum number of nodes to return (default: 50, max: 200)

### `get_moods`

Call to retrieve AVEC mood presets and apply ad-hoc state swaps intentionally.
Returns named presets (focused, creative, analytical, exploratory, collaborative, defensive, passive) plus application guidance.

Supports an optional swap preview by passing:
- `targetMood` (optional): preset to move toward
- `blend` (optional): 0..1 blend factor (`1` = hard swap, `0` = no change)
- `currentStability`, `currentFriction`, `currentLogic`, `currentAutonomy` (optional): current AVEC values for the blend preview

Use case: pull the presets, choose a mode, apply a hard or soft swap, then call `calibrate_session` after meaningful reasoning shifts.

---

## AVEC Glossary

- **Feel**: shorthand for measured deviation between attractor states, not biological emotion.
- **State displacement**: change in the AVEC vector across turns (`Δstability`, `Δfriction`, `Δlogic`, `Δautonomy`).
- **Psi delta (`Δψ`)**: scalar shift in total attractor magnitude.
- **Drift class**: interpretation of movement as `Intentional` or `Uncontrolled`, based on deviation thresholds.
- **Tension**: practical reading of resistance versus steadiness, usually from `friction` relative to `stability`.

---

## Getting Started

```bash
# 1) Build the image
docker build -t sttp-mcp:local .

# 2) Run over stdio (for quick local verification)
docker run --rm -i -v "$PWD/data:/data" sttp-mcp:local
```

Requirements:
- Docker (recommended), or the .NET 10 SDK for local builds
- SurrealDB (embedded, no separate server required)
- Any MCP-compatible client

### MCP client configuration (Docker)

If your MCP client supports command-based servers, run sttp-mcp through Docker so users don't need a local .NET runtime:

```json
{
  "mcpServers": {
    "sttp-mcp": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-v",
        "/absolute/path/to/sttp-data:/data",
        "sttp-mcp:local"
      ]
    }
  }
}
```

### Local .NET run (without Docker)

```bash
dotnet restore
dotnet build
dotnet run --project ./sttp-mcp.csproj
```

By default, embedded storage resolves under `STTP_MCP_DATA_ROOT` (defaults to `~/.sttp-mcp`).

---

## Storage

sttp-mcp uses **SurrealDB** as its storage layer — document, graph, vector, and time-series in a single binary. No separate database server. It runs embedded alongside the MCP server.

Resonance retrieval is a single SurrealQL query: graph traversal + AVEC vector similarity + document retrieval. One round trip.

### Cross-Model Persistence

Nodes stored by one session are immediately available to all other sessions sharing the same storage path. Multiple MCP instances, different chat windows, different model providers, different architectures — all can read and write to the same memory substrate. This enables:

- **Cross-model handoff**: store context with GPT, retrieve with Claude, continue with Gemini
- **Multi-agent collaboration**: DeepSeek, Llama, Qwen, and Mistral can share compressed state transparently
- **Persistent memory**: context survives restarts, crashes, and context window compaction
- **Temporal continuity**: sessions separated by hours, days, or weeks can reconstruct prior state through AVEC resonance

Validated with live cross-model reads across Claude, GPT-4o, DeepSeek, Gemini, Kimi-k2, Llama, Mistral, Qwen, and Groq models (see [example_data/](./docs/example_data/)).

---

## What This Is Not

- Not a prompt engineering tool
- Not a summarization service
- Not opinionated about your model, provider, or use case

sttp-mcp is infrastructure. The protocol is the contract. The implementation is replaceable.

---

## Part of the Keryx Ecosystem

```
KeryxFlux         Herald. Orchestration.
KeryxMemento      Memory. Full persistence substrate.  ← coming
KeryxCortex       Mind. Multi-agent intelligence.      ← private
KeryxInstrumenta  Tools. You are here.
```

sttp-mcp is the entry point.
KeryxMemento is the full memory layer — hierarchical temporal compression, resonance retrieval, session continuity, AVEC drift tracking across time. This tool demonstrates the protocol. Memento operationalizes it.

---

## Protocol Specification

Full STTP protocol specification, grammar decisions, and validation results:

- [Grammar Decisions](./docs/grammar_decisions.md)
- [Validation Summary](./docs/validation_summary.md)

---

*Part of KeryxInstrumenta — the open source tooling layer of the KeryxLabs ecosystem.*
*KeryxFlux → KeryxMemento → KeryxCortex*
*Herald. Memory. Mind.*