One-shot context assembly for LLM prompts — memories + timeline + graph
The response body contains `chunks`, `sources`, `graph_context`, and `timeline` sections. Billing details for the request are returned in the response headers:

| Header | Meaning |
|---|---|
| `x-credit-balance` | Wallet balance after this charge |
| `x-credit-charged` | Amount charged for this request (e.g. `0.000500`) |
| `x-billing-tx` | UUID of the audit row recording this charge |
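As a minimal sketch of consuming these headers, the helper below parses them from a plain header dict. The function name and the placeholder values are hypothetical; only the header names come from the table above. `Decimal` is used so the credit amounts are not distorted by float rounding.

```python
from decimal import Decimal


def parse_billing_headers(headers: dict) -> dict:
    """Extract billing fields from the response headers listed above."""
    return {
        "balance": Decimal(headers["x-credit-balance"]),   # wallet balance after charge
        "charged": Decimal(headers["x-credit-charged"]),   # amount charged this request
        "tx_id": headers["x-billing-tx"],                  # audit row UUID
    }


# Example with made-up values (the tx id would be a real UUID in practice):
billing = parse_billing_headers({
    "x-credit-balance": "4.999500",
    "x-credit-charged": "0.000500",
    "x-billing-tx": "example-uuid",
})
```

Storing `tx_id` alongside your own request logs makes it easy to reconcile charges against the audit trail later.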
When no relevant context is found, `/memory/context` returns `prompt_ready: ""` instead of failing. Your application can continue without context: the SLM chat still works, just without memory-grounded responses.
Check that `prompt_ready` is non-empty before prepending it to your prompt.
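The check above can be sketched as a small helper. This is an illustrative function, not part of the API; it only assumes the documented behavior that `prompt_ready` is an empty string when no context was found.

```python
def build_prompt(context_response: dict, user_message: str) -> str:
    """Prepend memory context only when /memory/context returned one.

    `context_response` is the parsed JSON body; `prompt_ready` is ""
    when no context is available, so we fall back to the bare message.
    """
    prompt_ready = context_response.get("prompt_ready", "")
    if prompt_ready:  # non-empty check before prepending
        return f"{prompt_ready}\n\n{user_message}"
    return user_message  # degrade gracefully: chat still works without memory


with_context = build_prompt({"prompt_ready": "Known fact: user prefers tea."},
                            "What do I usually drink?")
without_context = build_prompt({"prompt_ready": ""}, "What do I usually drink?")
```

With context, the assembled prompt leads with the memory block; without it, the user message passes through unchanged.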