// web tools for AI agents

Search. Fetch. Extract. Research.

One API to search the web, fetch any page as clean markdown, extract structured data, or run deep research.
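All four endpoints accept a JSON body over POST, so a single request builder covers the whole API. A minimal client sketch follows; the base URL and the bearer-token auth scheme are assumptions, not part of the docs here, so substitute the real endpoint and key format for your account.

```python
import json
import urllib.request

BASE_URL = "https://api.example.com"  # hypothetical base URL

def build_request(path: str, body: dict, api_key: str) -> urllib.request.Request:
    """Assemble a POST request for any of the /v1/* endpoints."""
    return urllib.request.Request(
        url=BASE_URL + path,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
        },
        method="POST",
    )

# Build (but don't send) a search request:
req = build_request("/v1/search", {"query": "best restaurants in NYC"}, "sk-...")
```

Sending it is then one `urllib.request.urlopen(req)` call, with the response body parsed by `json.loads`.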

Search

1–10 credits per request

POST /v1/search
{
  "query": "best restaurants in NYC",
  "search_depth": "basic",
  "max_results": 10
}
response
{
  "query": "best restaurants in NYC",
  "results": [
    {
      "title": "The 50 Best Restaurants in NYC Right Now",
      "url": "https://www.eater.com/nyc-best-restaurants",
      "content": "From fine dining to hole-in-the-wall gems...",
      "description": "A curated guide to the best restaurants...",
      "fetched": true,
      "published_date": "2026-03-16"
    }
  ],
  "search_depth": "basic",
  "topic": "general",
  "elapsed_ms": 4200,
  "credits_used": 3,
  "credits_remaining": 997
}
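Since each search spends 1–10 credits, it can pay to validate the request fields client-side before sending. A sketch of that check, assuming the fields shown above; "basic" is the only `search_depth` value documented here, so the other accepted value is an assumption:

```python
# "basic" appears in the docs; "advanced" is an assumed second tier.
VALID_DEPTHS = {"basic", "advanced"}

def build_search_payload(query: str, search_depth: str = "basic",
                         max_results: int = 10) -> dict:
    """Validate inputs client-side before spending credits."""
    if not query.strip():
        raise ValueError("query must be non-empty")
    if search_depth not in VALID_DEPTHS:
        raise ValueError(f"unknown search_depth: {search_depth!r}")
    if max_results < 1:
        raise ValueError("max_results must be at least 1")
    return {
        "query": query,
        "search_depth": search_depth,
        "max_results": max_results,
    }
```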

Fetch

1 credit per request

POST /v1/fetch
{
  "url": "https://example.com"
}
response
{
  "title": "Example Domain",
  "url": "https://example.com",
  "content": "# Example Domain\n\nThis domain is for use in documentation examples...",
  "published_time": null,
  "credits_used": 1,
  "credits_remaining": 999
}
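The `content` field comes back as markdown and `published_time` can be null, so a defensive reader helps when processing fetched pages in bulk. A small sketch (the fallback-ordering here is a design choice, not API behavior):

```python
def page_title(resp: dict) -> str:
    """Prefer the response's title field; fall back to the first
    '# ' heading in the markdown content, then to the URL."""
    if resp.get("title"):
        return resp["title"]
    for line in resp.get("content", "").splitlines():
        if line.startswith("# "):
            return line[2:].strip()
    return resp.get("url", "")

# The /v1/fetch response shown above, as a dict:
sample = {
    "title": "Example Domain",
    "url": "https://example.com",
    "content": "# Example Domain\n\nThis domain is for use in documentation examples...",
    "published_time": None,
}
```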

Extract

5 credits per request · paid plans only

POST /v1/extract
{
  "url": "https://example.com",
  "prompt": "Summarize this page in one sentence"
}
response
{
  "content": "This domain is reserved for documentation purposes only and should not be used in actual operations.",
  "url": "https://example.com",
  "credits_used": 5,
  "credits_remaining": 995,
  "usage": { "input_tokens": 90, "output_tokens": 24 }
}
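Extract responses include a `usage` object alongside the credit counters, so when looping over many pages it is easy to total token consumption client-side. One way to sketch that:

```python
def total_usage(responses: list[dict]) -> dict:
    """Sum the usage objects across a batch of /v1/extract responses."""
    totals = {"input_tokens": 0, "output_tokens": 0}
    for resp in responses:
        usage = resp.get("usage", {})
        for key in totals:
            totals[key] += usage.get(key, 0)
    return totals
```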

Research

25 credits per request · paid plans only

POST /v1/research
{
  "query": "How do modern LLMs handle long context?",
  "max_sources": 20
}
response
{
  "query": "How do modern LLMs handle long context?",
  "report": "## Long Context in Modern LLMs\n\nRecent advances...",
  "sources": [
    {
      "title": "Scaling Transformer Context Windows",
      "url": "https://arxiv.org/abs/...",
      "fetched": true
    }
  ],
  "sub_queries": [
    "transformer context window scaling techniques",
    "RoPE positional encoding extensions"
  ],
  "credits_used": 25,
  "credits_remaining": 975,
  "usage": { "input_tokens": 12400, "output_tokens": 1850 }
}
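With four endpoints at different prices (search 1–10, fetch 1, extract 5, research 25 credits), an agent can budget a planned sequence of calls against its `credits_remaining` before starting. A sketch that budgets search at its worst case:

```python
# Per-request costs from the sections above; search is billed 1-10
# credits, so we assume its worst case.
COSTS = {"search": 10, "fetch": 1, "extract": 5, "research": 25}

def can_afford(plan: list[str], credits_remaining: int) -> bool:
    """True if the planned calls fit within the remaining balance."""
    return sum(COSTS[op] for op in plan) <= credits_remaining

def max_calls(op: str, credits_remaining: int) -> int:
    """How many calls of one kind the balance covers."""
    return credits_remaining // COSTS[op]
```

For example, a balance of 975 credits covers at most 39 research requests.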