CodeGrok MCP is a Model Context Protocol (MCP) server that enables AI assistants to intelligently search and understand codebases using semantic embeddings and Tree-sitter parsing.
Updated Mar 15, 2026 - Python
⚡ Cut Claude token usage by 90%+ — free, open-source, local-first context compression for Claude Code. Hybrid RAG (BM25 + ONNX vectors), AST chunking, reranking. No API needed.
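Hybrid RAG of this kind merges a lexical (BM25) ranking with a vector ranking. A minimal sketch of reciprocal rank fusion (RRF), one common way to combine the two result lists; the document IDs and the constant `k=60` below are illustrative, not taken from this project:

```python
def rrf_fuse(rankings, k=60):
    """Reciprocal rank fusion: combine several ranked lists of doc IDs.

    Each document's score is the sum of 1/(k + rank) over every list
    in which it appears; higher is better."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical hit lists from a BM25 index and a vector index.
bm25_hits = ["chunk_a", "chunk_b", "chunk_c"]
vector_hits = ["chunk_b", "chunk_d", "chunk_a"]
fused = rrf_fuse([bm25_hits, vector_hits])
```

Documents that rank well in both lists ("chunk_b" here) float to the top without needing to normalize the two scoring scales against each other, which is why RRF is a popular default for hybrid retrieval.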
Lightweight, agent-optimized database CLI with one-shot schema introspection, column profiling, and ERD generation.
GitHub Action that analyzes codebases and generates AI agent context documentation (CLAUDE.md/AGENTS.md) to optimize AI coding assistant efficiency. Reduces token waste and improves development velocity through intelligent recommendations.
Skim: an MCP server for Claude Code that skims large outputs and returns only their schema. Save context, save tokens.
13 production microservices that prevent wasteful AI API calls through semantic search, caching, and team learning (85% cost reduction).
🎯 Optimize LLM token usage by 70-90% with smart context ranking, reducing costs while maintaining quality and performance.
Smart Context Optimization for LLMs - Reduce tokens by 66%, save 40% on API costs. Intelligent ranking and selection of relevant context using embeddings, keywords, and semantic analysis.
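Embedding-based context selection of the sort these tools describe usually boils down to: rank chunks by similarity to the query, then keep the best ones that fit a token budget. A minimal sketch under those assumptions; the toy 2-d vectors and token counts are made up for illustration:

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def select_context(query_vec, chunks, token_budget):
    """Rank chunks by similarity to the query, then greedily keep
    the highest-ranked ones that still fit in the token budget."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c["vec"]),
                    reverse=True)
    picked, used = [], 0
    for chunk in ranked:
        if used + chunk["tokens"] <= token_budget:
            picked.append(chunk["text"])
            used += chunk["tokens"]
    return picked

# Toy 2-d embeddings; a real system would use a model's vectors.
chunks = [
    {"text": "auth docs", "vec": [1.0, 0.0], "tokens": 50},
    {"text": "billing docs", "vec": [0.0, 1.0], "tokens": 50},
    {"text": "auth faq", "vec": [0.9, 0.1], "tokens": 60},
]
result = select_context([1.0, 0.05], chunks, token_budget=110)
```

For a query vector close to the "auth" direction, the two auth chunks fill the budget and the billing chunk is dropped, which is the token-saving behavior these projects advertise.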
Comprehensive cognitive infrastructure for AI-augmented development and knowledge work
ContextFusion is the context brain for LLM apps - compress, rank, and route the right evidence to chat + agent models across OpenAI, Claude, Ollama, and MCP
Graph-style library for LLM agents: plan → fetch context → synthesize → verify.
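The plan → fetch → synthesize → verify loop can be sketched as a chain of stage functions threading a shared state dict; the stages below are stubs for illustration, not this library's actual API:

```python
def run_pipeline(question, stages):
    """Pass a state dict through each stage in order; a verify stage
    can set 'failed' to stop the loop early."""
    state = {"question": question}
    for stage in stages:
        state = stage(state)
        if state.get("failed"):
            break
    return state

def plan(state):
    state["steps"] = [f"look up: {state['question']}"]
    return state

def fetch(state):
    # Stub retrieval: a real agent would query an index or tool here.
    state["evidence"] = [f"doc about {step}" for step in state["steps"]]
    return state

def synthesize(state):
    state["answer"] = "; ".join(state["evidence"])
    return state

def verify(state):
    # Trivial check: fail if synthesis produced nothing.
    state["failed"] = not state.get("answer")
    return state

result = run_pipeline("token budgets", [plan, fetch, synthesize, verify])
```

Keeping each stage a pure function over the state dict makes the graph easy to rewire, for example inserting a reranking stage between fetch and synthesize.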
Distills Python repositories into compact review bundles for LLMs and agents.
Session Intelligence MCP Server - Lean meta-tool pattern with 95% context reduction