MCP Web Scraping Integration — fastCRW [Firecrawl-Compatible]
fastCRW ships an official MCP server exposing scrape, search, crawl, map, and extract tools to Claude Code, Cursor, Windsurf, and any other MCP-compatible host. 6.6 MB RAM runtime, 92% coverage on the 1,000-URL benchmark.
Why MCP + fastCRW
The Model Context Protocol (MCP) is the emerging standard for connecting LLM hosts to external tools and data. Anthropic shipped MCP, the major IDEs and assistants adopted it, and the protocol now plays the role HTTP played for early SaaS — a shared substrate that lets any client call any compatible server. fastCRW publishes an official MCP server so any MCP-compatible host can scrape, search, crawl, map, and extract through a single tool surface. The fastCRW runtime is 6.6 MB of RAM, which is what makes the MCP story practical: the server is small enough to run inside a developer's editor process, and the underlying API is small enough to self-host alongside private MCP infrastructure.
Setup
- Install Node.js 20+ on the machine that will run the MCP server.
- Provision a fastCRW API key from the dashboard.
- Export FASTCRW_API_KEY in the environment of the MCP host.
- Register the fastCRW MCP server in your client's MCP config.
export FASTCRW_API_KEY="fcrw_..."
npx -y @fastcrw/mcp --help
The server uses stdio transport by default, which is what every desktop MCP host expects.
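Under the hood, the stdio transport exchanges newline-delimited JSON-RPC 2.0 messages over the server's stdin and stdout. As a minimal sketch of what crosses that pipe, here is the shape of the initialize request a host sends first; the protocolVersion and clientInfo values below are illustrative, not fastCRW-specific.

```python
import json

# MCP's stdio transport carries newline-delimited JSON-RPC 2.0 messages.
# The host's first message is an "initialize" request; the version and
# clientInfo values here are illustrative placeholders.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-host", "version": "0.1.0"},
    },
}

# One message per line, as the stdio transport expects.
wire_frame = json.dumps(initialize) + "\n"
print(wire_frame, end="")
```

The SDK clients shown below handle this framing for you; the sketch only shows why a stdio host must own the server's stdin/stdout pair.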
Code Example
Generic MCP client config (works for Claude Code, Cursor, Windsurf, Continue, Zed):
{
"mcpServers": {
"fastcrw": {
"command": "npx",
"args": ["-y", "@fastcrw/mcp"],
"env": {
"FASTCRW_API_KEY": "${FASTCRW_API_KEY}"
}
}
}
}
For a self-hosted fastCRW instance, override the base URL:
{
"mcpServers": {
"fastcrw": {
"command": "npx",
"args": ["-y", "@fastcrw/mcp"],
"env": {
"FASTCRW_API_KEY": "${FASTCRW_API_KEY}",
"FASTCRW_BASE_URL": "https://crw.internal.company.com"
}
}
}
}
For a remote MCP deployment using HTTP transport, run the fastCRW MCP server with --transport http --port 8088 behind a reverse proxy and point the MCP client at the URL.
The fastCRW MCP server exposes five tools:
- scrape — fetch a URL and return Markdown.
- search — web search returning ranked results.
- crawl — multi-page crawl from a seed URL with depth limits.
- map — discover all URLs reachable from a domain.
- extract — schema-driven structured extraction.
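As a sketch of how those five tools are invoked from an MCP client, the payloads below show plausible argument shapes. Only scrape's {"url": ...} shape appears elsewhere on this page; the parameter names for search, crawl, map, and extract are assumptions, so check the tool schemas the server advertises at session start.

```python
# Hypothetical argument payloads for each fastCRW MCP tool. Only the
# scrape payload mirrors this page's client example; parameter names
# for the other four tools are assumptions, not documented API.
tool_calls = {
    "scrape": {"url": "https://example.com"},
    "search": {"query": "site reliability engineering"},
    "crawl": {"url": "https://example.com", "max_depth": 2},
    "map": {"url": "https://example.com"},
    "extract": {
        "urls": ["https://example.com/pricing"],
        "schema": {
            "type": "object",
            "properties": {"plan": {"type": "string"}},
        },
    },
}

# Each entry is what a client would pass as the arguments dict
# to session.call_tool(name, arguments).
for name, args in tool_calls.items():
    print(name, sorted(args))
```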
A minimal custom MCP client that connects to fastCRW with the official Python SDK:
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    # Spawn the fastCRW MCP server as a stdio subprocess.
    params = StdioServerParameters(
        command="npx",
        args=["-y", "@fastcrw/mcp"],
        env={"FASTCRW_API_KEY": "fcrw_..."},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            # Negotiate capabilities and tool schemas.
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

            # Call the scrape tool and print the first 500 characters.
            result = await session.call_tool(
                "scrape",
                {"url": "https://example.com"},
            )
            print(result.content[0].text[:500])


asyncio.run(main())
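Because tool outputs flow straight into the model's context (see Limits + Gotchas), a thin truncation helper around call_tool results is often worth adding on the client side. This is a sketch, not part of any fastCRW SDK; the limit and marker are arbitrary choices.

```python
def clamp_tool_output(text: str, limit: int = 4000,
                      marker: str = "\n[truncated]") -> str:
    """Clamp an MCP tool result before it enters the model's context.

    Keeps the head of the document, which for Markdown scrapes
    usually carries the title and lead section.
    """
    if len(text) <= limit:
        return text
    return text[: limit - len(marker)] + marker


# Stand-in for a large scrape result.
page = "# Example\n" + "body " * 2000
clamped = clamp_tool_output(page, limit=200)
print(len(clamped))
```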
When to Use This
- Editor-side browsing — Claude Code, Cursor, Windsurf, and Zed users get a unified scrape and search tool through the fastCRW MCP server.
- Custom MCP hosts — internal tools built on the MCP SDK can call fastCRW without hand-rolling HTTP clients.
- Multi-tool agents — combine fastCRW MCP with other MCP servers (filesystem, database, GitHub) in a single host configuration.
- Self-hosted MCP — run the fastCRW MCP server with HTTP transport behind your own auth boundary for enterprise deployments.
Limits + Gotchas
- MCP tool schemas are negotiated at session start. If you upgrade the fastCRW MCP server, restart the host process to pick up new tool definitions.
- Stdio transport requires the host process to spawn the server. Sandboxed hosts may block
npx— pin a binary or use the HTTP transport instead. - Tool outputs flow into the LLM's context. fastCRW responses can be large — instruct the host to summarize before quoting full pages.
- The MCP spec is still evolving. Lock the
@fastcrw/mcpversion in production and watch the changelog for breaking protocol revisions.
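Pinning can be done directly in the client config by fixing the npm package spec passed to npx; the version number below is a placeholder, not a real release.

```json
{
  "mcpServers": {
    "fastcrw": {
      "command": "npx",
      "args": ["-y", "@fastcrw/mcp@1.2.3"],
      "env": {
        "FASTCRW_API_KEY": "${FASTCRW_API_KEY}"
      }
    }
  }
}
```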