Robots.txt Tester API
Evaluate robots.txt rules for any path and crawler user-agent.
Last Updated: 01 April 2026
Base API URL
https://api.velohost.co.uk/robots/

Rate Limiting
This API is protected by a global rate limit to ensure fair usage and platform stability.
- Limit: 30 requests per 10 seconds per IP address
- Burst traffic is allowed
- No authentication required
Requests exceeding the limit receive an HTTP 429 (Too Many Requests) response.
This limit applies across all Velohost public APIs.
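To stay under the limit, a client can throttle itself before the server has to reject anything. A minimal sketch in Python; the 30-requests-per-10-seconds window comes from this page, while the class and method names are this sketch's own:

```python
import time
from collections import deque

class SlidingWindowLimiter:
    """Client-side throttle: allow at most `limit` calls per `window` seconds."""

    def __init__(self, limit=30, window=10.0):
        self.limit = limit
        self.window = window
        self.calls = deque()  # monotonic timestamps of recent calls

    def acquire(self):
        """Block until a call is permitted, then record it."""
        now = time.monotonic()
        # Discard timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) >= self.limit:
            # Sleep until the oldest recorded call leaves the window.
            time.sleep(self.window - (now - self.calls[0]))
        self.calls.append(time.monotonic())
```

Calling `acquire()` before each request keeps a single-process client within the per-IP budget; note the limit is per IP address, so multiple processes behind one address would need a shared counter instead.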
Example request
GET /test?url=example.com&user_agent=Googlebot&path=/admin/
{
"input_url": "https://example.com",
"robots_url": "https://example.com/robots.txt",
"user_agent": "Googlebot",
"path": "/admin/",
"allowed": false,
"matched_rule": { "type": "disallow", "path": "/admin/" },
"status": 200,
"rules_considered": 6
}

Implementation Guidance
For production use: validate input before sending requests, retry with exponential backoff on HTTP 429 or transient failures, and log normalized responses for trend monitoring.
Learn more
Explore the tool, documentation, and guides.