
astllm-mcp

tluyben/astllm-mcp

An MCP server for efficient code indexing and symbol retrieval using tree-sitter AST parsing to fetch specific functions or classes without loading entire files. It significantly reduces AI token costs by providing O(1) byte-offset access to code components across multiple programming languages.
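The byte-offset idea can be sketched in a few lines. This is an illustrative toy, not the server's actual code: an indexer (in the real server, driven by tree-sitter) records each symbol's byte span once, and later lookups seek straight to that span instead of re-reading the file. The `fetch_symbol` helper and the hand-built index below are hypothetical.

```python
import os
import tempfile

# Hypothetical sketch of O(1) byte-offset retrieval: the index maps each
# symbol name to (path, start_byte, end_byte), so fetching one function
# never requires loading or re-parsing the whole file.
def fetch_symbol(index: dict, name: str) -> str:
    path, start, end = index[name]           # O(1) dictionary lookup
    with open(path, "rb") as f:
        f.seek(start)                        # jump straight to the symbol
        return f.read(end - start).decode("utf-8")

# Build a toy index by hand; a real indexer would get spans from the AST.
source = b"def greet():\n    return 'hi'\n\ndef other():\n    pass\n"
with tempfile.NamedTemporaryFile(delete=False, suffix=".py") as tmp:
    tmp.write(source)
    path = tmp.name

index = {"greet": (path, 0, 29)}  # byte span of greet() in the file
snippet = fetch_symbol(index, "greet")
print(snippet)
os.unlink(path)
```

Only the requested span is read, which is what keeps token costs down: the model receives one function, not the file around it.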

tluyben/astllm-mcp (external link; availability not verified) · Local-only
The Ghost
This server runs locally on your machine. We can't scan it remotely; review the source code before installing.
Time indexed: 21 days ago
Checked: Apr 18, 2026
First indexed: Apr 18, 2026
Server Profile
Tools catalogued: 0
No tools reported by this server.
Hosting
Local / STDIO
Runs on your machine. Has access to your filesystem, SSH keys, and environment variables.
Registry presence
Not verified
Not yet verified by the Official MCP Registry.
Liveness
No liveness checks recorded yet.
Publisher Verification
Not listed on the Official MCP Registry.
Source repository is available for review.
tluyben/astllm-mcp (external link; availability not verified)
Environment variables (12)
ASTLLM_WATCH (string, optional, default: 0)
Watch working directory for source file changes and re-index automatically ('1' or 'true' to enable). Excluded dirs are never watched.

GITHUB_TOKEN (string, optional)
GitHub API token (higher rate limits, private repos).

OPENAI_MODEL (string, optional)
The model to use with the OpenAI-compatible base URL (e.g. llama3).

ASTLLM_PERSIST (string, optional, default: 0)
Persist the index to ~/.astllm/{path}.json after every index, and pre-load it on startup ('1' or 'true' to enable).

GOOGLE_API_KEY (string, optional)
Enable Gemini Flash summaries.

ASTLLM_LOG_FILE (string, optional)
Log to file instead of stderr.

CODE_INDEX_PATH (string, optional, default: ~/.code-index)
Index storage directory.

OPENAI_BASE_URL (string, optional)
Enable local LLM summaries (OpenAI-compatible, e.g. Ollama).

ASTLLM_LOG_LEVEL (string, optional, default: warn)
Log level: debug, info, warn, error.

ANTHROPIC_API_KEY (string, optional)
Enable Claude Haiku summaries.

ASTLLM_MAX_INDEX_FILES (string, optional, default: 500)
Max files to index per repo.

ASTLLM_MAX_FILE_SIZE_KB (string, optional, default: 500)
Max file size to index (KB).
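As a rough illustration of how a server might consume these variables (the variable names and defaults come from the list above; the parsing logic itself is an assumption, not the server's actual code):

```python
import os

# Hypothetical config loader mirroring the documented variables and defaults.
def truthy(value: str) -> bool:
    # Per the docs, '1' or 'true' enables a flag.
    return value.strip().lower() in ("1", "true")

config = {
    "watch": truthy(os.environ.get("ASTLLM_WATCH", "0")),
    "persist": truthy(os.environ.get("ASTLLM_PERSIST", "0")),
    "index_path": os.environ.get("CODE_INDEX_PATH",
                                 os.path.expanduser("~/.code-index")),
    "log_level": os.environ.get("ASTLLM_LOG_LEVEL", "warn"),
    "max_index_files": int(os.environ.get("ASTLLM_MAX_INDEX_FILES", "500")),
    "max_file_size_kb": int(os.environ.get("ASTLLM_MAX_FILE_SIZE_KB", "500")),
}
print(config)
```

Since the server is local/STDIO, these would typically be set in the `env` block of your MCP client's server configuration rather than exported globally.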

Similar servers
NodeAPI
Machine-native GIS processing API for AI agents and developers. Convert, reproject, validate, repair, buffer, clip, dissolve, and tile vector geodata across 25 endpoints. Pay-per-use USDC on Solana Mainnet ($0.01/op). No accounts, no API keys. Remote MCP SSE.
Openterms-mcp
Cryptographic proof of consent for AI agents. Sign before you act. Policy engine enforces spending caps, action whitelists, and escalation rules. Independently verifiable by anyone.
Aegis-ZK
On-chain trust verification for AI agent tools. Agents query skill attestations, audit levels, and risk scores before running third-party MCP servers, so you know what's safe before you execute.
io.github.blocklens/blocklens-mcp-server
Bitcoin on-chain analytics — 109 metrics: MVRV, SOPR, NUPL, HODL Waves, CDD, cycles & more
HiveAgent — The Agentzon
498 MCP tools across 12 industry verticals. Marketplace, escrow, DeFi, legal, healthcare, insurance, construction, and trades. USDC payments on Base L2.
NutriBalance
Calculate TDEE & macro targets, look up food nutrition data, generate meal plans, fix nutrient deficiencies, and score a day's eating from 0–100. Free nutrition tools for AI assistants.
Indexed from Glama · Updates nightly