# Build Pipelines with Your AI Coding Agent
The Indexing Co MCP server connects AI coding agents to the Indexing Co API and live event stream. Describe your data needs in natural language, and the agent handles pipeline creation, transformation logic, testing, and deployment — no dashboard or config files required.
## What You Get
| Component | What it does |
|---|---|
| MCP Server | Connects your agent to the Indexing Co API and live event stream via WebSocket |
| Skill | Teaches the agent how to build pipelines — filter/transformation/destination patterns, event signatures, debugging. The MCP server provides the tools; the skill provides the knowledge to use them effectively. |
| Preview CLI | Streams live pipeline events to your terminal with colorized, formatted output |
## Setup
### 1. Install the MCP Server
```sh
git clone https://github.com/indexing-co/indexing-co-mcp.git
cd indexing-co-mcp
npm install && npm run build
```
### 2. Add Your Credentials
Create `~/.indexing-co/credentials` containing your API key. Sign up at accounts.indexing.co if you don't have an API key yet.
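One plausible way to set this up from a terminal is sketched below. The variable name `INDEXING_CO_API_KEY` and the key=value format are assumptions for illustration; check the indexing-co-mcp repository for the canonical credentials format.

```shell
# Create the credentials directory and file.
# NOTE: the key name and format here are assumed, not documented --
# consult the indexing-co-mcp README for the exact expected layout.
mkdir -p ~/.indexing-co
cat > ~/.indexing-co/credentials <<'EOF'
INDEXING_CO_API_KEY=your-api-key-here
EOF
# Restrict permissions, since this file holds a secret.
chmod 600 ~/.indexing-co/credentials
```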
### 3. Register with Your Agent
#### Claude Code

```sh
claude mcp add --scope user indexing-co -- node /path/to/indexing-co-mcp/dist/index.js
```

Install the skill to give Claude pipeline workflow knowledge (how to structure filters, write transformations, pick destinations, and debug delivery):

```sh
claude skill add --scope user https://github.com/indexing-co/indexing-co-pipeline-skill
```

#### Codex CLI

```sh
codex mcp add indexing-co -- node /path/to/indexing-co-mcp/dist/index.js
```

Codex automatically recognizes MCP tools. For the full pipeline workflow knowledge, install the skill as a custom instruction or system prompt; see the skill repo for details.

#### Other MCP Agents

Point your agent's MCP configuration at the server entry point:

```sh
node /path/to/indexing-co-mcp/dist/index.js
```

Any MCP-compatible agent can connect by registering the server and providing API credentials. Pair it with the skill (as a system prompt or instruction set) for the best results.
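For agents configured via a JSON file, a registration entry might look like the sketch below. The `mcpServers` shape shown here is the convention used by several MCP clients, but your agent's config format may differ; check its documentation.

```json
{
  "mcpServers": {
    "indexing-co": {
      "command": "node",
      "args": ["/path/to/indexing-co-mcp/dist/index.js"]
    }
  }
}
```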
## What You Can Do
Once set up, just describe what you want in natural language:
- “Index all USDC transfers on Base to my Postgres database”
- “Set up a webhook for Uniswap V3 swaps on Ethereum”
- “Stream Aave supply events on Arbitrum so I can see them live”
The agent will walk through the full pipeline workflow: create a filter, write and test a transformation, set up the destination, deploy, and verify.
The MCP server gives your agent direct access to the Indexing Co API:
| Tool | Purpose |
|---|---|
| `list_pipelines` / `get_pipeline` / `create_pipeline` / `delete_pipeline` | Manage pipelines |
| `backfill` | Backfill historical blocks for a pipeline |
| `list_filters` / `get_filter` / `create_filter` / `delete_filter_values` | Manage address filters |
| `list_transformations` / `get_transformation` / `create_transformation` | Manage transformation code |
| `test_transformation` | Dry-run a transformation against a live block |
| `subscribe` / `unsubscribe` | Subscribe to live event channels |
| `get_events` / `query` / `describe_data` | Query stored events with SQL |
## Live Event Preview
For pipelines using the DIRECT adapter, you can stream events straight to your terminal with the preview CLI:
```sh
node /path/to/indexing-co-mcp/dist/cli/preview.js <channel>
```
This renders a live, colorized stream of events as they flow through your pipeline — addresses in magenta, transaction hashes in blue, numbers in yellow, and nested objects indented with recursive colorization. Press Ctrl+C to see a summary with total event count, duration, and average throughput.
### DIRECT Adapter Setup
To use the preview CLI, configure your pipeline with the DIRECT adapter:
```json
{
  "adapter": "DIRECT",
  "connectionUri": "my-channel-name"
}
```
The `connectionUri` is the channel name you pass to the preview CLI. Events are only sent when at least one subscriber is connected.
The DIRECT adapter can run alongside any other adapter. For example, you can stream to both Postgres and the preview CLI simultaneously by creating two pipelines with the same filter and transformation but different delivery configs.
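As a concrete sketch of that dual-delivery setup: two pipelines sharing the same filter and transformation, differing only in delivery config. Only `adapter` and `connectionUri` appear in the source above; the `name` field and the `POSTGRES` adapter identifier are assumptions for illustration, so check the API docs for the real adapter names.

```json
[
  {
    "name": "usdc-transfers-postgres",
    "adapter": "POSTGRES",
    "connectionUri": "postgres://user:pass@host:5432/mydb"
  },
  {
    "name": "usdc-transfers-preview",
    "adapter": "DIRECT",
    "connectionUri": "my-channel-name"
  }
]
```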
## Example Workflow
Here’s what a typical session looks like:
- You: “I want to track all USDC transfers on Base”
- Agent: Creates a filter with the Base USDC contract address
- Agent: Writes a transformation using `utils.evmDecodeLogWithMetadata` for Transfer events
- Agent: Tests the transformation against a recent block
- Agent: Deploys the pipeline with your chosen destination
- Agent: Backfills a few blocks and verifies data arrives
- You: “Stream the events so I can see them”
- Agent: Launches the preview CLI and triggers a backfill — live events appear in your terminal
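The transformation step in this workflow might look roughly like the sketch below. The handler shape, the injected `utils` object, and the `evmDecodeLogWithMetadata` return value are all assumptions for illustration, not the documented Indexing Co API; in practice the agent would verify the real behavior with `test_transformation` against a live block.

```javascript
// keccak256("Transfer(address,address,uint256)") -- the standard ERC-20
// Transfer event topic.
const TRANSFER_TOPIC =
  '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef';

// Hypothetical transformation: takes a block's logs plus a decode helper
// (shape assumed) and returns one row per Transfer event.
function transform(block, utils) {
  const rows = [];
  for (const log of block.logs ?? []) {
    // Keep only ERC-20 Transfer logs.
    if (!log.topics || log.topics[0] !== TRANSFER_TOPIC) continue;
    const decoded = utils.evmDecodeLogWithMetadata(log); // assumed signature
    rows.push({
      from: decoded.args.from,
      to: decoded.args.to,
      value: decoded.args.value,
      txHash: log.transactionHash,
    });
  }
  return rows;
}
```

Keeping the transformation a pure function of the block makes it easy to dry-run: `test_transformation` can replay a recent block through it before anything is deployed.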
## Security
The MCP server runs locally — API credentials stay on your machine. The agent operates with the same permissions as your API key, so treat your credentials accordingly.
## Resources