Astro NoCrawl
Automatically block search engine crawling on non-production Astro sites using a static robots.txt file.
Usage snapshot: 109 downloads in the last 30 days.
Why this plugin exists
Staging and preview environments are frequently crawled and indexed by search engines by accident, causing duplicate-content problems and long-term SEO damage.
astro-nocrawl prevents this by generating a blocking robots.txt file at build time for any site not explicitly allow-listed.
The result is static, cache-safe, and requires no runtime logic, middleware, or environment guessing.
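The decision logic described above amounts to an exact hostname check against an allow-list. A minimal sketch of that check, assuming a hypothetical `allowedHosts` option and a helper name that is illustrative rather than the plugin's actual API:

```javascript
// Sketch of a build-time crawl-blocking decision, in the spirit of astro-nocrawl.
// `allowedHosts` and `shouldBlockCrawling` are illustrative names, not confirmed API.
const BLOCKING_ROBOTS_TXT = "User-agent: *\nDisallow: /\n";

function shouldBlockCrawling(siteUrl, allowedHosts) {
  // A site with no configured URL is treated as non-production: block it.
  if (!siteUrl) return true;
  const { hostname } = new URL(siteUrl);
  // Exact hostname match only: no wildcards, no environment guessing.
  return !allowedHosts.includes(hostname);
}

// A preview deployment is blocked; the allow-listed production host is not.
console.log(shouldBlockCrawling("https://preview-abc.example.app", ["example.com"])); // true
console.log(shouldBlockCrawling("https://example.com", ["example.com"])); // false
```

Because the check runs once at build time and depends only on the configured site URL, the output is deterministic and safe to cache at any layer.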
What it delivers
Design principles
- Build-time only, no runtime execution
- Explicit allow-listing, no guessing
- Static, cache-safe output
- Single responsibility
- Never blocks a deployment
What this plugin does
- Generates a restrictive robots.txt file for non-production sites
- Exact hostname allow-listing
- No HTML mutation or meta tag injection
- Works behind any CDN
- Adapter-agnostic and deterministic
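Wiring it up presumably follows the standard Astro integration pattern. The sketch below is an assumption for illustration: the `allowedHosts` option name is not confirmed, so check the plugin documentation for the real configuration surface.

```javascript
// astro.config.mjs — hypothetical setup sketch; option names are assumptions.
import { defineConfig } from "astro/config";
import nocrawl from "astro-nocrawl";

export default defineConfig({
  site: "https://example.com",
  integrations: [
    nocrawl({
      // Only these exact hostnames may be crawled; every other
      // build gets a blocking robots.txt written at build time.
      allowedHosts: ["example.com"],
    }),
  ],
});
```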
Installation
npm install astro-nocrawl
The plugin runs during astro build and writes a robots.txt file only when crawling should be blocked.
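A blocking robots.txt conventionally disallows all paths for every user agent, so the generated file is presumably equivalent to:

```
User-agent: *
Disallow: /
```

On an allow-listed production host, no file is written, and any robots.txt you ship yourself is left untouched.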
Project links
Source code, releases, documentation, and contribution guidelines.
Want the deep dive?
Read the FAQs for implementation details, design rationale, and integration guidance.
View plugin FAQs