Teach your AI agent
to scrape the web.
Works with every major AI agent
Claude Code
Anthropic's CLI agent
Cursor
AI-first code editor
Gemini CLI
Google's CLI agent
Codex
OpenAI's coding agent
OpenCode
Open-source CLI agent
Windsurf
Codeium's AI IDE
Three steps to web data
Install
Run the init command. It auto-detects your AI agents and deploys the CRW skill file to each one.
Auto-discover
Your agent reads the skill description (~100 tokens). When web scraping is relevant, it loads the full instructions automatically.
Use
Ask your agent to scrape a URL, crawl a docs site, or search the web. It knows how to use CRW's 4 tools — scrape, crawl, map, and search.
Four tools, full web coverage
Scrape
crw_scrape
Turn any URL into clean markdown, HTML, or structured JSON. Handles JavaScript rendering, content extraction, and noise removal.
crw_scrape({ url: "https://example.com", formats: ["markdown"] })
Crawl
crw_crawl
Async BFS crawler that follows links across a site. Set depth, page limits, and domain boundaries. Poll for results.
crw_crawl({ url: "https://docs.example.com", maxDepth: 2, limit: 50 })
Map
crw_map
Discover all URLs on a website using sitemaps, robots.txt, and link extraction. Fast site-wide URL inventory.
crw_map({ url: "https://example.com" })
Search
POST /v1/search
Web search with full page content extraction. Get search results with the actual page markdown, not just snippets.
POST /v1/search { query: "rust web scraping", limit: 5 }
Skills + MCP = complete integration
Skills
- ~100 tokens to load (description only at startup)
- Cross-platform: works across all agents without config changes
- Teaches agents when and why to use CRW
- Markdown-based, customizable per project
MCP
- Full tool protocol with typed inputs and structured outputs
- Native integration via JSON-RPC over stdio
- Provides the how — actual execution of scraping
- Runs embedded (~6MB) or proxies to a remote server
Use both together for the best experience. The skill auto-triggers when web data is needed, then MCP tools handle the actual scraping.
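Because MCP runs over JSON-RPC on stdio, wiring CRW into an agent is typically a one-entry server config. A minimal sketch of such an entry; the server key, command name, and argument here are assumptions rather than documented values:

```json
{
  "mcpServers": {
    "crw": {
      "command": "crw",
      "args": ["mcp"]
    }
  }
}
```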
Frequently asked questions
What is the Agent Skills standard?
Agent Skills is an open standard for teaching AI coding agents new capabilities via lightweight markdown files. A SKILL.md file contains instructions that agents auto-discover and load on demand — only ~100 tokens for the description, full content loaded when relevant.
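A skill file follows the standard's shape: YAML frontmatter with a short name and description (the ~100 tokens agents load at startup), followed by the full instructions loaded on demand. An illustrative sketch; the exact contents of CRW's shipped SKILL.md may differ:

```markdown
---
name: crw-web-scraping
description: Scrape, crawl, map, or search the web with CRW. Use when the user needs live web content or site-wide data.
---

# CRW web scraping

When a task needs web data, prefer the CRW tools:
- crw_scrape for a single URL (markdown, HTML, or structured JSON)
- crw_crawl to follow links across a site (set maxDepth and limit)
- crw_map to inventory a site's URLs before crawling
- search when you need full page content, not just snippets
```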
Do I need an API key?
No. In local mode, the MCP server runs a self-contained scraper in ~6MB RAM with zero configuration. For cloud mode (fastcrw.com), set CRW_API_KEY to get managed infrastructure with 500 free credits/month.
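In practice the mode switch is just the presence of one environment variable. A sketch, assuming a POSIX shell; the key value is a placeholder:

```shell
# Local mode: no configuration at all; the embedded scraper is used.
# Cloud mode: set the API key from fastcrw.com (placeholder value below).
export CRW_API_KEY="your-key-from-fastcrw.com"
```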
Does this work with self-hosted CRW?
Yes. Set CRW_API_URL to your self-hosted instance and the skill will route all requests there. The SKILL.md instructions cover both cloud and local modes.
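Routing to a self-hosted instance follows the same pattern, assuming a POSIX shell; the hostname below is a placeholder for your own deployment:

```shell
# Point the skill and MCP tools at a self-hosted CRW instance.
export CRW_API_URL="https://crw.internal.example.com"
```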
What's the difference between Skills and MCP?
Skills teach agents when and why to use CRW — lightweight instructions loaded on demand. MCP provides the actual tool protocol for structured I/O. They complement each other: the skill triggers automatically, then MCP tools execute the work.
Which agents are supported?
Claude Code, Cursor, Gemini CLI, OpenAI Codex, OpenCode, and Windsurf. The init command auto-detects installed agents and deploys the skill to all of them.
Is this compatible with Firecrawl?
Yes. CRW's REST API is a drop-in replacement for Firecrawl — same endpoints (/v1/scrape, /v1/crawl, /v1/map, /v1/search), same response format. If you're migrating from Firecrawl, it's a URL swap.
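Since the endpoint paths and response format match, a Firecrawl-style request only needs its base URL changed. A sketch, assuming a POSIX shell with curl; the API host and the Bearer auth header format are assumptions, while the /v1/scrape path and body shape mirror the Firecrawl convention:

```shell
# Hypothetical base URL; swap in your CRW endpoint (cloud or self-hosted).
BASE_URL="${CRW_API_URL:-https://api.fastcrw.com}"
# Same /v1/scrape path and JSON body a Firecrawl client would send:
curl -s -X POST "$BASE_URL/v1/scrape" \
  -H "Authorization: Bearer $CRW_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com", "formats": ["markdown"]}'
```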
Start scraping in 30 seconds.
Install the CRW skill, get 500 free credits, and let your AI agent handle web data extraction.