MCP Server
Connect AI assistants to your workspace memory via Model Context Protocol
The @lightfastai/mcp package connects AI assistants like Claude, Cursor, and Codex directly to your workspace memory via the Model Context Protocol.
Installation
No installation required — run directly with npx:
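The package name comes from this page and `--api-key` is covered under CLI Options below; a minimal invocation would look something like this (the placeholder key is illustrative):

```shell
# Run the Lightfast MCP server on demand, without a global install
npx -y @lightfastai/mcp --api-key sk_your_key_here
```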
Or install globally:
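A global install would be a standard npm command; note that the installed binary name below is an assumption, so check the package's `bin` entry:

```shell
npm install -g @lightfastai/mcp

# Hypothetical binary name — verify against the package itself
lightfast-mcp --api-key sk_your_key_here
```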
Configuration
Choose your AI assistant and add the configuration:
Claude Desktop
Add to your claude_desktop_config.json:
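Claude Desktop reads MCP servers from an `mcpServers` map of commands; a sketch of the entry (server name and placeholder key are illustrative):

```json
{
  "mcpServers": {
    "lightfast": {
      "command": "npx",
      "args": ["-y", "@lightfastai/mcp", "--api-key", "sk_your_key_here"]
    }
  }
}
```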
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Restart Claude Desktop after saving.
Claude Code (CLI)
Add to .mcp.json in your project root:
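Claude Code's project-level `.mcp.json` uses the same `mcpServers` shape; a sketch (placeholder key):

```json
{
  "mcpServers": {
    "lightfast": {
      "command": "npx",
      "args": ["-y", "@lightfastai/mcp", "--api-key", "sk_your_key_here"]
    }
  }
}
```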
Or use the CLI:
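With the `claude mcp add` subcommand, the server command goes after a `--` separator; roughly (server name and key are illustrative):

```shell
claude mcp add lightfast -- npx -y @lightfastai/mcp --api-key sk_your_key_here
```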
For global (user-level) configuration, add to ~/.claude.json or use --scope user.
Cursor
Add to .cursor/mcp.json in your project root:
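Cursor's `.cursor/mcp.json` follows the same `mcpServers` convention; a sketch (placeholder key):

```json
{
  "mcpServers": {
    "lightfast": {
      "command": "npx",
      "args": ["-y", "@lightfastai/mcp", "--api-key", "sk_your_key_here"]
    }
  }
}
```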
OpenAI Codex
Add to ~/.codex/config.toml:
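Codex declares MCP servers in TOML under an `mcp_servers` table; a sketch of the entry (table name per Codex convention, placeholder key illustrative):

```toml
[mcp_servers.lightfast]
command = "npx"
args = ["-y", "@lightfastai/mcp", "--api-key", "sk_your_key_here"]
```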
Available Tools
Once configured, your AI assistant has access to three tools:
lightfast_search
Search through workspace neural memory for relevant documents and observations.
Parameters:
- `query` (required): Natural language search query
- `limit`: Number of results (default: 10)
- `mode`: Search mode, one of "fast", "balanced", or "quality"
- `filters`: Filter by source, date range, or type
Example prompt:
"Search lightfast for how authentication works in our API"
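Behind a prompt like that, the assistant would issue a tool call whose arguments map onto the parameters above; the values here are purely illustrative:

```json
{
  "query": "how authentication works in our API",
  "limit": 10,
  "mode": "balanced"
}
```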
lightfast_contents
Fetch full content for documents by their IDs.
Parameters:
- `ids` (required): Array of document IDs to fetch
Example prompt:
"Get the full content of doc_abc123 from lightfast"
lightfast_find_similar
Find content semantically similar to a given document or URL.
Parameters:
- `id` or `url` (one required): Document ID or URL to find similar content for
- `limit`: Number of results (default: 10)
- `threshold`: Minimum similarity score (default: 0.5)
Example prompt:
"Find PRs similar to https://github.com/org/repo/pull/123"
Usage Examples
Once configured, you can ask your AI assistant questions like:
- "Search our codebase for how rate limiting is implemented"
- "What decisions have we made about database choices?"
- "Find documentation about our deployment process"
- "Who has worked on the payment service?"
- "Find PRs similar to this authentication refactor"
The AI assistant will automatically use the Lightfast tools to search your workspace memory and provide answers with sources.
CLI Options
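The only flag documented on this page is `--api-key`; any other options would need to be confirmed against the package itself:

```shell
# Pass the workspace API key directly on the command line
npx -y @lightfastai/mcp --api-key sk_your_key_here
```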
Environment Variables
Instead of passing --api-key, you can set the environment variable:
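The variable name `LIGHTFAST_API_KEY` below is an assumption (this page does not name it), so confirm it against the package docs:

```shell
# Assumed variable name — verify before relying on it
export LIGHTFAST_API_KEY=sk_your_key_here
npx -y @lightfastai/mcp
```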
This is especially useful in MCP configurations where you don't want to hardcode the key:
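MCP server entries accept an `env` map, so the key can be injected without appearing in `args`; the variable name `LIGHTFAST_API_KEY` is an assumption here:

```json
{
  "mcpServers": {
    "lightfast": {
      "command": "npx",
      "args": ["-y", "@lightfastai/mcp"],
      "env": {
        "LIGHTFAST_API_KEY": "sk_your_key_here"
      }
    }
  }
}
```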
Troubleshooting
Server not connecting
- Verify your API key is valid and starts with `sk_`
- Check that Node.js >= 18 is installed
- Restart your AI assistant after configuration changes
Tools not appearing
- Ensure the MCP server is running (check logs)
- Verify the configuration file is in the correct location
- For Claude Code, run `/mcp` to check server status
Permission errors
- Lightfast respects your GitHub permissions
- Ensure your workspace has the repositories you're searching
- Check that your API key has access to the workspace