Anakin CLI

Command-line interface for web scraping, search, and deep research

Scrape websites, search the web, and run deep research — all from your terminal.

Latest version: 0.1.0
License: MIT
Python: 3.10+
PyPI: anakin-cli
Source: GitHub
Prerequisites

Before installing, make sure you have:

  1. Python 3.10 or higher — check with python --version (on systems where python is not on your PATH, use python3 --version)
  2. pip — Python's package manager (included with Python 3.10+)
  3. An API key — create one from the Dashboard; if you don't have an account, sign up first
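
You can confirm the first two prerequisites from a terminal. A quick check, assuming the interpreter is available as python3 (some systems expose it as python instead):

```shell
# Print the interpreter version; it should report 3.10 or higher
python3 --version

# Confirm pip is available for this interpreter
python3 -m pip --version
```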

Installation

pip install anakin-cli

Verify the installation:

anakin status

To upgrade to the latest version:

pip install --upgrade anakin-cli

Authentication

Set up your API key so the CLI can make requests on your behalf.

Login command (recommended)

anakin login --api-key "ak-your-key-here"

This saves your key locally. You only need to do this once.

Environment variable

export ANAKIN_API_KEY="ak-your-key-here"
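
The variable lasts only for the current shell session. Before running commands, you can confirm it is visible to child processes (which is how the CLI will read it):

```shell
# Set the key for this shell session (replace with your real key)
export ANAKIN_API_KEY="ak-your-key-here"

# printenv shows the value exactly as child processes will see it
printenv ANAKIN_API_KEY
```

To make the key persist across sessions, add the export line to your shell profile (for example ~/.zshrc or ~/.bashrc).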

Interactive prompt

If no key is configured, the CLI will prompt you to enter one.


Quick start

# Scrape a page to markdown
anakin scrape "https://example.com"

# Extract structured JSON with AI
anakin scrape "https://example.com/product" --format json

# Scrape a JS-heavy site with headless browser
anakin scrape "https://example.com/spa" --browser

# Batch scrape multiple URLs
anakin scrape-batch "https://a.com" "https://b.com" "https://c.com"

# AI-powered web search
anakin search "python async best practices"

# Deep research (takes 1–5 min)
anakin research "comparison of web frameworks 2025" -o report.json

Commands overview

Command              Description
-------------------  ------------------------------------------------
anakin login         Save your API key locally
anakin status        Check version and authentication status
anakin scrape        Scrape a single URL to markdown, JSON, or raw
anakin scrape-batch  Scrape up to 10 URLs in parallel
anakin search        AI-powered web search (instant results)
anakin research      Deep multi-stage agentic research

See the full Commands Reference for all flags and options, or check out Examples & Recipes for real-world usage patterns.


Output modes

Every command that returns data supports the -o flag to write to a file. Without it, output goes to stdout.

# Print to terminal
anakin scrape "https://example.com"

# Save to file
anakin scrape "https://example.com" -o page.md

The scrape command also supports three output formats:

Format    Flag               What you get
--------  -----------------  -----------------------------------------
Markdown  --format markdown  Clean, readable text (default)
JSON      --format json      AI-extracted structured data
Raw       --format raw       Full API response with HTML and metadata

Tips

Always quote URLs containing ?, &, or # — shells interpret these as special characters:

# Wrong — zsh will fail
anakin scrape https://example.com/page?id=123

# Correct
anakin scrape "https://example.com/page?id=123"

Piping works cleanly because all progress messages go to stderr:

anakin scrape "https://example.com" --format json | jq '.title'
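
Because the streams are separate, progress output can be discarded or logged without touching the data. A minimal illustration using a hypothetical stand-in function (emit is not part of the CLI; it mimics a command that writes data to stdout and progress to stderr):

```shell
# Stand-in for a command that writes JSON to stdout and progress to stderr
emit() {
  echo "scraping..." >&2               # progress: stderr
  echo '{"title":"Example Domain"}'    # data: stdout
}

# Discard progress; only the JSON reaches the pipe
emit 2>/dev/null

# Or keep progress in a log file while piping the data onward
emit 2>progress.log | cat
```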

Use --browser for JavaScript-heavy sites, SPAs, and dynamically loaded content.

Use --country to route requests through a specific country's proxy. See all 207 supported countries.


Support