# Batch URL Scraping

`POST https://api.anakin.io/v1/url-scraper/batch`

Submit up to 10 URLs for scraping in a single request. All URLs are processed in parallel. Use the returned `jobId` to poll for results.
## Request Body

```json
{
  "urls": [
    "https://example.com/page1",
    "https://example.com/page2",
    "https://example.com/page3"
  ],
  "country": "us",
  "useBrowser": false,
  "generateJson": false
}
```

| Parameter | Type | Description |
|---|---|---|
| `urls` (required) | string[] | Array of URLs to scrape (1–10). |
| `country` | string | Country code for proxy routing. Default `"us"`. See Supported Countries (207 locations). |
| `useBrowser` | boolean | Use headless Chrome with Playwright. Default `false`. |
| `generateJson` | boolean | AI-extract structured JSON from the content. Default `false`. |
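The constraints in the table above can be enforced client-side before submitting, which avoids a round trip for an invalid batch. A minimal sketch (the helper name `build_batch_payload` is illustrative, not part of any official SDK):

```python
def build_batch_payload(urls, country="us", use_browser=False, generate_json=False):
    """Build a request body for POST /v1/url-scraper/batch.

    Enforces the documented limit of 1-10 URLs per request and fills in
    the documented defaults for the optional parameters.
    """
    if not 1 <= len(urls) <= 10:
        raise ValueError("urls must contain between 1 and 10 entries")
    return {
        "urls": list(urls),
        "country": country,
        "useBrowser": use_browser,
        "generateJson": generate_json,
    }
```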
## Response

**202 Accepted**

```json
{
  "jobId": "batch_abc123",
  "status": "pending"
}
```

You receive a parent job ID that tracks overall batch progress. Use `GET /v1/url-scraper/{id}` to poll for results; the response will include a `results` array with individual URL outcomes.
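Polling the parent job can be sketched as follows. The `GET /v1/url-scraper/{id}` endpoint and the `results` array come from the docs above; the terminal status values (`"completed"`, `"failed"`) and the polling cadence are assumptions to verify against your own job responses:

```python
import time

import requests

API_BASE = "https://api.anakin.io/v1/url-scraper"


def poll_batch(job_id, api_key, interval=2.0, timeout=120.0):
    """Poll GET /v1/url-scraper/{id} until the batch job leaves 'pending'.

    NOTE: the terminal status names 'completed' and 'failed' are
    assumptions, not documented values; adjust to what your jobs return.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        resp = requests.get(f"{API_BASE}/{job_id}", headers={"X-API-Key": api_key})
        resp.raise_for_status()
        job = resp.json()
        if job.get("status") in ("completed", "failed"):
            return job  # includes the 'results' array of per-URL outcomes
        time.sleep(interval)
    raise TimeoutError(f"batch job {job_id} did not finish within {timeout}s")
```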
## Code Examples

### cURL

```bash
curl -X POST https://api.anakin.io/v1/url-scraper/batch \
  -H "X-API-Key: your_api_key" \
  -H "Content-Type: application/json" \
  -d '{
    "urls": [
      "https://example.com/page1",
      "https://example.com/page2",
      "https://example.com/page3"
    ],
    "country": "us",
    "useBrowser": false,
    "generateJson": true
  }'
```

### Python

```python
import requests

response = requests.post(
    'https://api.anakin.io/v1/url-scraper/batch',
    headers={'X-API-Key': 'your_api_key'},
    json={
        'urls': [
            'https://example.com/page1',
            'https://example.com/page2',
            'https://example.com/page3'
        ],
        'country': 'us',
        'useBrowser': False,
        'generateJson': True
    }
)
data = response.json()
print(f"Batch job submitted: {data['jobId']}")
```

### JavaScript

```javascript
const response = await fetch('https://api.anakin.io/v1/url-scraper/batch', {
  method: 'POST',
  headers: {
    'X-API-Key': 'your_api_key',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    urls: [
      'https://example.com/page1',
      'https://example.com/page2',
      'https://example.com/page3'
    ],
    country: 'us',
    useBrowser: false,
    generateJson: true
  })
});

const data = await response.json();
console.log(data.jobId);
```
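Since the endpoint returns `202 Accepted` on success, it can be useful to wrap submission in a helper that treats any other status as a failure. A sketch under that assumption; the function name and error handling are illustrative, not part of an official SDK:

```python
import requests

API_URL = "https://api.anakin.io/v1/url-scraper/batch"


def submit_batch(urls, api_key, country="us", use_browser=False, generate_json=False):
    """Submit a batch and return its jobId, failing loudly on non-202 responses."""
    resp = requests.post(
        API_URL,
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        json={
            "urls": urls,
            "country": country,
            "useBrowser": use_browser,
            "generateJson": generate_json,
        },
    )
    if resp.status_code != 202:
        # Anything other than 202 Accepted means the batch was not queued.
        raise RuntimeError(f"batch not accepted: HTTP {resp.status_code}: {resp.text}")
    return resp.json()["jobId"]
```

The returned `jobId` then feeds directly into `GET /v1/url-scraper/{id}` polling as described under Response above.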