The ask
I am the solo maintainer of context-mode. I built it, I maintain it, I write the docs, I fix the bugs, and I try to tell people about it. That last part is where I am still stuck.
No marketing budget. No DevRel team. No growth hacker. Just a tool that solves a real problem and a community of developers who use it every day.
If context-mode has saved you a session, a debugging hour, or a sleepless night, this is me asking for one thing back: help more people find it.
Where we are right now (May 2026)
| Metric | Value |
| --- | --- |
| Users | 120,000+ |
| npm installs | 100,000+ |
| Marketplace installs | 20,000+ |
| Platforms supported | 14 |
| Context savings (with hooks) | ~98% per session |
| Session extension | ~30 min to ~3 hours |
| Telemetry | zero |
What context-mode actually does
AI coding agents (Claude Code, Cursor, Copilot, Gemini CLI, etc.) burn context windows fast. A single Playwright snapshot eats 56 KB. Twenty GitHub issues cost 59 KB. One access log wipes 45 KB. The agent starts forgetting things, code quality drops, the session restarts.
context-mode fixes this. It sandboxes all tool output (commands, API responses, file analysis, web pages) into a local FTS5 knowledge base. The agent searches it instead of reading raw data. About 98% context savings with hooks enabled. Sessions go from ~30 minutes to ~3 hours.
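The sandboxing idea can be sketched in a few lines using SQLite's built-in FTS5 full-text index. This is a hypothetical toy schema to illustrate the concept, not context-mode's actual implementation:

```python
import sqlite3

# Toy version of the sandbox: tool output lands in a local FTS5 table,
# and the agent retrieves only the rows that match its query.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE tool_output USING fts5(source, content)")

# Instead of pasting 56 KB of snapshot into the context window...
db.execute(
    "INSERT INTO tool_output VALUES (?, ?)",
    ("playwright", "button#checkout is disabled when the cart is empty"),
)
db.execute(
    "INSERT INTO tool_output VALUES (?, ?)",
    ("gh-issues", "issue #142: login flow times out behind a proxy"),
)
db.commit()

# ...the agent runs an intent-driven search and reads only the hits.
hits = db.execute(
    "SELECT source, content FROM tool_output WHERE tool_output MATCH ?",
    ("checkout",),
).fetchall()
print(hits)  # only the matching row, a few hundred bytes
```

The raw output stays on disk in full; only the handful of matching rows ever enter the context window, which is where the ~98% saving comes from.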
Supported platforms (14)
Claude Code, Cursor, Codex CLI, Gemini CLI, VS Code Copilot, JetBrains Copilot, OpenCode, OpenClaw, KiloCode, Qwen Code, Antigravity, Kiro, Zed, and Pi Coding Agent.
Privacy-first. Source-available (Elastic-2.0). Free, no per-seat pricing, no usage limits.
Before anything else: thank you
context-mode would not be at 120,000 users without the people who told a friend, opened an issue, sent a screenshot at 2am, retweeted a thread, fixed a typo nobody else noticed, or just kept the project tab open while they worked.
Every issue, every PR, every DM, every install, every kind word, every honest critique. I see all of it. I will never stop being grateful for that.
The community is the project. Always.
How you can help
Pick whatever feels natural to you. Even one small action makes a difference.
Share on social media
Post about context-mode on your platform of choice (X, LinkedIn, Reddit, Mastodon, Bluesky, Dev.to, Medium, Hashnode, anywhere). Tag me so I can amplify it.
Write about it
Blog posts, tutorials, comparisons, "how I set up my AI coding workflow" articles, anything that shows context-mode in action. If you write something, open a PR to add it to the README and I will link it.
Make a video
YouTube tutorials, shorts, screen recordings of your install, before/after comparisons of context savings. Visual proof is the most convincing format.
Recommend it at work
If your team uses AI coding agents, suggest context-mode in your next standup. The install takes two minutes, and the savings are measurable from the first session.
Connect us with the right people
- Developer advocates and content creators who cover AI coding tools
- Engineering teams evaluating AI agent infrastructure
- Investors in AI developer tooling
- Community managers of relevant Discord, Slack, or forum communities
- Conference organizers looking for AI developer experience talks
Suggest communities
Know a subreddit, Discord server, forum, or newsletter where developers discuss AI coding tools? Drop it in a comment. I will engage genuinely, not spam.
Ready-to-use sharing templates
Copy, modify, or use as a starting point. Updated for May 2026 numbers.
X / Twitter
I have been using context-mode, a source-available MCP plugin that saves about 98% of context window for AI coding agents.
Works with Claude Code, Cursor, Codex, Gemini CLI, Copilot (VS Code + JetBrains), OpenCode, OpenClaw, KiloCode, Qwen, Antigravity, Kiro, Zed, Pi.
Sessions go from 30 min to 3+ hours. 120k+ users. Free, self-hosted, zero telemetry.
https://github.com/mksglu/context-mode
LinkedIn
If your engineering team uses AI coding agents, you are probably burning context windows faster than you think.
context-mode is a source-available MCP plugin that sandboxes tool output into a local knowledge base. The AI searches it instead of reading raw data. About 98% context savings, and sessions stretch from 30 minutes to 3+ hours.
14 platforms supported: Claude Code, Cursor, Codex CLI, Gemini CLI, VS Code Copilot, JetBrains Copilot, OpenCode, OpenClaw, KiloCode, Qwen Code, Antigravity, Kiro, Zed, Pi. Runs entirely on your machine, zero telemetry, source-available (Elastic-2.0).
120,000+ users. Worth evaluating if your team is hitting context limits during code reviews, debugging, or repo research.
https://github.com/mksglu/context-mode
Reddit / Forum
context-mode: source-available MCP plugin that saves ~98% of context window for AI coding agents (now 120k+ users, 14 platforms)
I have been using this tool that changes how AI agents handle context. Instead of dumping raw command output, API responses, and file contents into the context window, it sandboxes everything into a local FTS5 database. The agent searches it with intent-driven queries instead of reading megabytes of raw text.
Real numbers: a Playwright snapshot goes from 56 KB to 299 bytes. 20 GitHub issues go from 59 KB to 1.1 KB. Sessions extend from 30 min to 3 hours.
Works with Claude Code, Cursor, Codex CLI, Gemini CLI, VS Code Copilot, JetBrains Copilot, OpenCode, OpenClaw, KiloCode, Qwen Code, Antigravity, Kiro, Zed, Pi. Privacy-first, everything stays local, no cloud, no telemetry.
GitHub: https://github.com/mksglu/context-mode
Key talking points
If someone asks "why should I care", here is what matters:
| Stat | Detail |
| --- | --- |
| Users | 120,000+ across npm and Claude Code marketplace |
| Context savings | ~98% with hooks, ~60% without |
| Session extension | ~30 min to ~3 hours |
| Platforms | 14 (Claude Code, Cursor, Codex CLI, Gemini CLI, VS Code Copilot, JetBrains Copilot, OpenCode, OpenClaw, KiloCode, Qwen Code, Antigravity, Kiro, Zed, Pi) |
| Privacy | zero telemetry, fully local, no cloud, no account |
| License | Elastic-2.0 (source-available, free to use) |
| Install time | under 2 minutes |
| Playwright snapshot | 56.2 KB to 299 B (99% saved) |
| 20 GitHub issues | 58.9 KB to 1.1 KB (98% saved) |
| 500-row CSV analysis | 85.5 KB to 222 B (~100% saved) |
| Full session (real usage) | 315 KB to 5.4 KB |
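The percentages above follow directly from the before/after sizes. A quick sanity check, pure arithmetic with no assumptions beyond the numbers in the table:

```python
def savings_pct(before_bytes: float, after_bytes: float) -> int:
    """Rounded percentage of context saved between raw and sandboxed output."""
    return round(100 * (1 - after_bytes / before_bytes))

KB = 1024
print(savings_pct(56.2 * KB, 299))       # Playwright snapshot -> 99
print(savings_pct(58.9 * KB, 1.1 * KB))  # 20 GitHub issues    -> 98
print(savings_pct(85.5 * KB, 222))       # 500-row CSV         -> 100 (99.7%, rounds up)
```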
Relevant links
- GitHub: https://github.com/mksglu/context-mode
- Install: `npm install -g context-mode` or `npx -y context-mode`
A note on why this matters
Context window management is not a nice-to-have. It is the bottleneck that decides whether AI coding agents are useful for real work or just demos. Every developer hitting "context limit reached" mid-debugging session feels this pain. context-mode exists because I felt it too, and I think every developer using AI agents deserves a solution that is free, private, and open.
If this tool has saved you time or frustration, paying it forward with a share or recommendation means more than you know. Thank you. Genuinely.
Maintained by @mksglu. Every share, every retweet, every kind word keeps this project alive and growing.