MCP Server

Connect AI assistants to your workspace memory via Model Context Protocol for real-time semantic search across your tools.

The @lightfastai/mcp package connects AI assistants like Claude, Cursor, and Codex directly to your workspace memory via the Model Context Protocol.

Installation

No installation required — run directly with npx:

bash
npx @lightfastai/mcp --api-key sk-lf-...

Or install globally:

bash
npm install -g @lightfastai/mcp
lightfast-mcp --api-key sk-lf-...

Configuration

Choose your AI assistant and add the configuration:

Claude Desktop

Add to your claude_desktop_config.json:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

json
{
  "mcpServers": {
    "lightfast": {
      "command": "npx",
      "args": ["-y", "@lightfastai/mcp"],
      "env": {
        "LIGHTFAST_API_KEY": "sk-lf-..."
      }
    }
  }
}

Restart Claude Desktop after saving.

Claude Code (CLI)

Add to .mcp.json in your project root:

json
{
  "mcpServers": {
    "lightfast": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@lightfastai/mcp"],
      "env": {
        "LIGHTFAST_API_KEY": "sk-lf-..."
      }
    }
  }
}

Or use the CLI:

bash
claude mcp add lightfast --scope project -- npx -y @lightfastai/mcp

For global (user-level) configuration, add to ~/.claude.json or use --scope user.

Cursor

Add to .cursor/mcp.json in your project root:

json
{
  "mcpServers": {
    "lightfast": {
      "command": "npx",
      "args": ["-y", "@lightfastai/mcp"],
      "env": {
        "LIGHTFAST_API_KEY": "sk-lf-..."
      }
    }
  }
}

OpenAI Codex

Add to ~/.codex/config.toml:

toml
[mcp_servers.lightfast]
command = "npx"
args = ["-y", "@lightfastai/mcp"]

[mcp_servers.lightfast.env]
LIGHTFAST_API_KEY = "sk-lf-..."

Available Tools

Once configured, your AI assistant has access to the lightfast_search tool, which searches your connected tools for relevant decisions and observations.

Parameters:

  • query (required): Natural language search query
  • limit: Max results, 1–100 (default: 10)
  • offset: Pagination offset (default: 0)
  • mode: "fast" (vector scores only) or "balanced" (Cohere rerank, default)
  • sources: Filter by provider (e.g. ["github", "linear"])
  • types: Filter by entity type (e.g. ["pull_request", "issue"])
  • after: ISO 8601 datetime lower bound
  • before: ISO 8601 datetime upper bound
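For instance, a search narrowed to recent GitHub pull requests could combine several of these parameters. This argument object is illustrative (the query text and date are made up); in practice your assistant builds it for you based on your prompt:

```json
{
  "query": "rate limiting middleware",
  "mode": "balanced",
  "sources": ["github"],
  "types": ["pull_request"],
  "after": "2025-01-01T00:00:00Z",
  "limit": 5
}
```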

Example prompt:

"Search lightfast for how authentication works in our API"

Usage Examples

Once configured, you can ask your AI assistant questions like:

  • "Search our codebase for how rate limiting is implemented"
  • "What decisions have we made about database choices?"
  • "Find documentation about our deployment process"
  • "Who has worked on the payment service?"

The AI assistant will automatically use the Lightfast search tool to find answers with sources.
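When it does, the call travels over stdio as a standard MCP tools/call message. A sketch of what that might look like for the first prompt above (the envelope shape follows the Model Context Protocol specification; the id and argument values are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "lightfast_search",
    "arguments": {
      "query": "how rate limiting is implemented",
      "limit": 10
    }
  }
}
```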

CLI Options

bash
npx @lightfastai/mcp [options]

Options:
  --api-key <key>    Lightfast API key (or set LIGHTFAST_API_KEY env var)
  --base-url <url>   API base URL (default: https://lightfast.ai)
  --help, -h         Show help message
  --version, -v      Show version

Environment Variables

Instead of passing --api-key, you can set the environment variable:

bash
export LIGHTFAST_API_KEY=sk-lf-...
npx @lightfastai/mcp

This is especially useful in MCP configurations where you don't want to hardcode the key:

json
{
  "mcpServers": {
    "lightfast": {
      "command": "npx",
      "args": ["-y", "@lightfastai/mcp"],
      "env": {
        "LIGHTFAST_API_KEY": "${LIGHTFAST_API_KEY}"
      }
    }
  }
}

Troubleshooting

Server not connecting

  1. Verify your API key is valid and starts with sk-lf-
  2. Check that Node.js >= 18 is installed
  3. Restart your AI assistant after configuration changes
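For step 1, you can sanity-check the key before wiring it into your assistant. This minimal shell sketch only verifies that the variable is set and uses the documented sk-lf- prefix; it does not confirm the key is actually valid:

```shell
# Checks only the documented sk-lf- prefix, not whether the key works
case "${LIGHTFAST_API_KEY}" in
  sk-lf-*) echo "key format looks OK" ;;
  *) echo "LIGHTFAST_API_KEY is unset or has an unexpected prefix" >&2 ;;
esac
```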

Tools not appearing

  1. Ensure the MCP server is running (check logs)
  2. Verify the configuration file is in the correct location
  3. For Claude Code, run /mcp to check server status

Permission errors

  1. Remember that Lightfast respects your GitHub permissions; results only include content you can access
  2. Ensure your workspace has connected the repositories you're searching
  3. Check that your API key has access to the workspace

Next Steps