Run Scraper

Submit Scrape Job

POST https://api.anakin.io/v1/web-scraper

Submit a URL for scraping using a custom scraper configuration. The job is processed asynchronously — use the returned jobId to poll for results.

Tip: You can copy ready-to-use API payloads for any scraper from the Web Scrapers section in your dashboard.


Request Body

{
  "url": "https://example.com",
  "scraper_code": "your_scraper_code",
  "scraper_scope": "GLOBAL",
  "scraper_params": {
    "param1": "value1",
    "param2": "value2"
  }
}
Parameter                  Type     Description
url (required)             string   The URL to scrape. Must be a valid HTTP or HTTPS URL.
scraper_code (required)    string   Identifier of the scraper configuration to use.
scraper_scope (required)   string   Scope of the scraper. Use "GLOBAL" for global scrapers.
scraper_params (required)  object   Key-value parameters for scraper execution. Varies by scraper.
action_type (optional)     string   Type of action to perform. Defaults to "scrape_data".

Response

202 Accepted
{
  "jobId": "job_xyz789",
  "status": "pending"
}

The job is processed asynchronously. Use the jobId with GET /v1/web-scraper/{id} to check status and retrieve results.


Example Request (Instagram hashtag search)

{
  "url": "https://instagram.com",
  "scraper_code": "instagram_hashtag_search",
  "scraper_scope": "GLOBAL",
  "scraper_params": {
    "hashtags": ["webscraping", "automation"],
    "results_limit": 20
  }
}

Code Examples

curl -X POST https://api.anakin.io/v1/web-scraper \
  -H "Content-Type: application/json" \
  -H "X-API-Key: your_api_key" \
  -d '{
    "url": "https://example.com",
    "scraper_code": "your_scraper_code",
    "scraper_scope": "GLOBAL",
    "scraper_params": {}
  }'
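A Python sketch of the full submit-and-poll flow, using only the standard library. It assumes the request and response shapes documented above (the X-API-Key header, the jobId and status fields, and the GET /v1/web-scraper/{id} status endpoint); the polling interval and timeout values are illustrative choices, not API requirements.

```python
import json
import time
import urllib.request

API_BASE = "https://api.anakin.io/v1"


def build_payload(url, scraper_code, scraper_params, scraper_scope="GLOBAL"):
    """Assemble the request body documented above."""
    return {
        "url": url,
        "scraper_code": scraper_code,
        "scraper_scope": scraper_scope,
        "scraper_params": scraper_params,
    }


def _request(method, url, api_key, body=None):
    """Send an authenticated JSON request and decode the JSON response."""
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(
        url,
        data=data,
        method=method,
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def submit_and_poll(api_key, payload, interval=5, timeout=300):
    """Submit a scrape job, then poll the status endpoint until it leaves 'pending'."""
    job = _request("POST", f"{API_BASE}/web-scraper", api_key, payload)
    job_id = job["jobId"]
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = _request("GET", f"{API_BASE}/web-scraper/{job_id}", api_key)
        if status["status"] != "pending":
            return status
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} still pending after {timeout}s")
```

For example, `submit_and_poll("your_api_key", build_payload("https://example.com", "your_scraper_code", {}))` submits the same request as the curl command above and blocks until the job completes.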