Go

Scrape your first page from Go with no SDK and no external dependencies.

Submit a scrape, poll for the result, and handle transient errors using only net/http from the standard library.


Authentication

Set your API key as an environment variable. Get a key from the Dashboard.

export ANAKIN_API_KEY=ak-your-key-here

The base URL is https://api.anakin.io/v1. Every request authenticates via the X-API-Key header.
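
To sanity-check your key, here is the smallest authenticated request as a standalone program (a minimal sketch; the /url-scraper endpoint it calls is introduced below):

package main

import (
	"fmt"
	"net/http"
	"os"
	"strings"
)

func main() {
	body := strings.NewReader(`{"url": "https://example.com"}`)
	req, err := http.NewRequest("POST", "https://api.anakin.io/v1/url-scraper", body)
	if err != nil {
		panic(err)
	}
	// Every request authenticates with the X-API-Key header.
	req.Header.Set("X-API-Key", os.Getenv("ANAKIN_API_KEY"))
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status) // a non-2xx status usually means a missing or invalid key
}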


Install

No third-party packages needed — everything is in the standard library. Just initialize a module:

go mod init quickstart

Scrape a page

Save as main.go:

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
	"time"
)

func main() {
	apiKey := os.Getenv("ANAKIN_API_KEY")
	if apiKey == "" {
		panic("ANAKIN_API_KEY is not set")
	}
	base := "https://api.anakin.io/v1"
	client := &http.Client{Timeout: 30 * time.Second}

	// do sends a JSON request, decodes the JSON response, and returns
	// nil on transport errors so callers can retry.
	do := func(method, path string, body any) map[string]any {
		var buf bytes.Buffer
		if body != nil {
			json.NewEncoder(&buf).Encode(body)
		}
		req, _ := http.NewRequest(method, base+path, &buf) // method and URL are well-formed, so the error is safely ignored
		req.Header.Set("X-API-Key", apiKey)
		req.Header.Set("Content-Type", "application/json")
		resp, err := client.Do(req)
		if err != nil {
			return nil // caller retries on nil
		}
		defer resp.Body.Close()
		var out map[string]any
		json.NewDecoder(resp.Body).Decode(&out)
		return out
	}

	submitted := do("POST", "/url-scraper", map[string]string{"url": "https://example.com"})
	jobID, ok := submitted["jobId"].(string) // comma-ok: a failed submit returns nil or an error body, not a jobId
	if !ok {
		panic(fmt.Sprintf("submit failed: %v", submitted))
	}

	// Poll every 3 seconds, up to 60 attempts (3 minutes total).
	for i := 0; i < 60; i++ {
		job := do("GET", "/url-scraper/"+jobID, nil)
		if job == nil {
			time.Sleep(3 * time.Second) // retry transient errors
			continue
		}
		switch job["status"] {
		case "completed":
			fmt.Println(job["markdown"])
			return
		case "failed":
			panic(fmt.Sprintf("scrape failed: %v", job["error"]))
		}
		time.Sleep(3 * time.Second)
	}
	panic("timed out after 3 minutes")
}

Run it:

go run main.go

What this does

  1. Submits https://example.com to /url-scraper and gets back a jobId.
  2. Polls /url-scraper/{jobId} every 3 seconds (up to 60 attempts = 3 minutes).
  3. Retries transient network errors silently — only surfaces real failures.
  4. Prints the final markdown when the job completes.

Most jobs finish in 3–15 seconds.
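
If you are scraping pages that take longer, you can replace the fixed 3-second interval with exponential backoff. A sketch of a drop-in replacement for the polling loop in main.go (same 3-minute budget; the backoff values are arbitrary):

	// Back off between polls: 1s, 2s, 4s, ... capped at 15s.
	delay := time.Second
	deadline := time.Now().Add(3 * time.Minute)
	for time.Now().Before(deadline) {
		job := do("GET", "/url-scraper/"+jobID, nil)
		switch {
		case job == nil:
			// transient error: sleep and retry
		case job["status"] == "completed":
			fmt.Println(job["markdown"])
			return
		case job["status"] == "failed":
			panic(fmt.Sprintf("scrape failed: %v", job["error"]))
		}
		time.Sleep(delay)
		if delay < 15*time.Second {
			delay *= 2
		}
	}
	panic("timed out after 3 minutes")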


Go further

Extract structured JSON with AI

Add generateJson: true to the submit body to have AI return structured data:

submitted := do("POST", "/url-scraper", map[string]any{
    "url":          "https://news.ycombinator.com",
    "generateJson": true,
})

The completed response includes a generatedJson field with structured data inferred from the page.
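
To print it, extend the completed case of the polling loop. A sketch, assuming the field re-encodes cleanly as generic JSON:

	case "completed":
		// Re-encode the generated JSON with indentation for readability.
		pretty, err := json.MarshalIndent(job["generatedJson"], "", "  ")
		if err != nil {
			panic(err)
		}
		fmt.Println(string(pretty))
		return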

Scrape JavaScript-heavy sites

For SPAs and dynamically loaded pages, add useBrowser: true:

submitted := do("POST", "/url-scraper", map[string]any{
    "url":        "https://example.com/spa",
    "useBrowser": true,
})

Only use browser mode when needed — standard scraping is faster and cheaper.
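
One way to honor that advice is to try a standard scrape first and fall back to browser mode only when nothing comes back. This is an illustrative pattern, not an official API feature; scrapeMarkdown and target are hypothetical names:

// scrapeMarkdown is a hypothetical helper that wraps the submit-and-poll
// flow from main.go and returns the completed job's markdown.
md := scrapeMarkdown(map[string]any{"url": target})
if strings.TrimSpace(md) == "" {
	// Nothing rendered server-side; the page likely renders client-side,
	// so retry once with the headless browser.
	md = scrapeMarkdown(map[string]any{"url": target, "useBrowser": true})
}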


Next steps