# CLI Commands

Complete reference for all Anakin CLI commands.
## login

Save your API key for future sessions.

```bash
anakin login --api-key "ak-your-key-here"
```

| Flag | Description |
|---|---|
| `--api-key` | Your AnakinScraper API key |
## status

Check the CLI version and whether you're authenticated.

```bash
anakin status
```

## scrape

Scrape a single URL. Returns clean markdown by default.
```bash
anakin scrape "https://example.com"
```

| Flag | Type | Description | Default |
|---|---|---|---|
| `--format` | string | Output format: `markdown`, `json`, or `raw` | `markdown` |
| `--browser` | flag | Use a headless browser for JS-heavy sites | off |
| `--country` | string | Two-letter country code for geo-located scraping | `us` |
| `--session-id` | string | Browser session ID for authenticated scraping | — |
| `--timeout` | number | Polling timeout in seconds | 120 |
| `-o, --output` | string | Save output to a file instead of stdout | stdout |
### Output formats

| Format | What you get | Best for |
|---|---|---|
| `markdown` | Clean, readable page text | Reading, LLM context |
| `json` | AI-extracted structured data | Data pipelines |
| `raw` | Full API response (HTML, metadata, everything) | Debugging |
### Examples

```bash
# Default markdown output
anakin scrape "https://example.com"

# Save to file
anakin scrape "https://example.com" -o page.md

# Extract structured JSON
anakin scrape "https://example.com/product" --format json -o product.json

# Full raw API response
anakin scrape "https://example.com" --format raw -o debug.json

# JavaScript-heavy site
anakin scrape "https://example.com/spa" --browser

# Scrape from the UK
anakin scrape "https://example.com" --country gb

# Longer timeout for slow sites
anakin scrape "https://example.com" --timeout 300

# Authenticated scraping with a saved browser session
anakin scrape "https://example.com/dashboard" --session-id "session_abc123"
```

## scrape-batch
Scrape up to 10 URLs in a single call; all URLs are processed in parallel.

```bash
anakin scrape-batch "https://a.com" "https://b.com" "https://c.com"
```

| Flag | Type | Description | Default |
|---|---|---|---|
| `-o, --output` | string | Save output to a file | stdout |
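Since each batch call accepts at most 10 URLs, a longer list has to be chunked. A minimal sketch using `xargs` (this wrapper is not part of the CLI; `anakin` is assumed to be on your PATH, and `echo` stands in for it here so the chunking itself is visible):

```shell
#!/bin/sh
# Chunk a 12-URL list into scrape-batch calls of at most 10 URLs each.
# 'echo' prefixes the command instead of running it, so this prints the
# two invocations that would be made.
urls=$(printf 'https://site%d.example\n' 1 2 3 4 5 6 7 8 9 10 11 12)
out=$(printf '%s\n' "$urls" | xargs -n 10 echo anakin scrape-batch)
printf '%s\n' "$out"
```

Dropping the `echo` runs the real command: each chunk of up to 10 URLs becomes one `scrape-batch` invocation.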
### Examples

```bash
# Scrape 3 URLs
anakin scrape-batch "https://a.com" "https://b.com" "https://c.com"

# Save batch results to file
anakin scrape-batch "https://a.com" "https://b.com" -o results.json
```

## search
AI-powered web search. Returns results instantly (synchronous).

```bash
anakin search "your search query"
```

| Flag | Type | Description | Default |
|---|---|---|---|
| `-o, --output` | string | Save output to a file | stdout |
### Examples

```bash
# Search the web
anakin search "python async best practices"

# Save search results
anakin search "best web scraping tools 2025" -o results.json

# Pipe to jq
anakin search "latest AI news" | jq '.results[0]'
```

## research
Deep agentic research. Runs a multi-stage pipeline: query refinement, web search, citation scraping, and AI synthesis. Takes 1–5 minutes.

```bash
anakin research "your research topic"
```

| Flag | Type | Description | Default |
|---|---|---|---|
| `-o, --output` | string | Save output to a file | stdout |
### Examples

```bash
# Run deep research
anakin research "comparison of web frameworks 2025"

# Save research report
anakin research "quantum computing industry trends" -o report.json
```

## Error handling
The CLI provides clear error messages:
| Error | Code | Fix |
|---|---|---|
| Authentication failed | 401 | Run `anakin login --api-key "ak-xxx"` |
| Plan upgrade required | 402 | Visit Pricing |
| Rate limit exceeded | 429 | Wait a few seconds and retry |
| Job timed out | — | Increase with `--timeout 300` |
| Job failed | — | Check that the URL is accessible |
Exit codes: 0 for success, 1 for any error.
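Because every failure exits with code 1, transient errors such as a 429 can be handled with a plain retry loop in scripts. A minimal sketch (the `retry` helper below is illustrative, not part of the CLI; `anakin` is assumed to be on your PATH):

```shell
#!/bin/sh
# retry: run a command up to 3 times, sleeping between failed attempts.
# This helper is illustrative and not part of the Anakin CLI.
retry() {
  n=0
  until "$@"; do
    n=$((n + 1))
    [ "$n" -ge 3 ] && return 1
    sleep 1
  done
}

# In practice: retry anakin scrape "https://example.com" -o page.md
# Demonstrated here with a command that succeeds immediately:
retry sh -c 'exit 0' && result="ok"
```

If all attempts fail, `retry` itself exits 1, so it composes with the CLI's own exit-code convention.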