robots.txt parser
HTTP · /api/v1/http/robots
Fetch and parse robots.txt — User-agent groups, Disallow/Allow, Sitemap, Crawl-delay.
https://hostinfo.com/robots.txt
200
1248 bytes
0 User-agent groups
Raw robots.txt
# As a condition of accessing this website, you agree to abide by the following
# content signals:
# (a) If a content-signal = yes, you may collect content for the corresponding
# use.
# (b) If a content-signal = no, you may not collect content for the
# corresponding use.
# (c) If the website operator does not include a content signal for a
# corresponding use, the website operator neither grants nor restricts
# permission via content signal with respect to the corresponding use.
# The content signals and their meanings are:
# search: building a search index and providing search results (e.g., returning
# hyperlinks and short excerpts from your website's contents). Search does not
# include providing AI-generated search summaries.
# ai-input: inputting content into one or more AI models (e.g., retrieval
# augmented generation, grounding, or other real-time taking of content for
# generative AI search answers).
# ai-train: training or fine-tuning AI models.
# ANY RESTRICTIONS EXPRESSED VIA CONTENT SIGNALS ARE EXPRESS RESERVATIONS OF
# RIGHTS UNDER ARTICLE 4 OF THE EUROPEAN UNION DIRECTIVE 2019/790 ON COPYRIGHT
# AND RELATED RIGHTS IN THE DIGITAL SINGLE MARKET.
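The group counting in the summary above can be sketched as a small parser. This is an illustration of the general robots.txt grammar (consecutive User-agent lines share a group; Sitemap is group-independent), not host.tools' actual implementation, and the `Group` field names are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Group:
    agents: List[str] = field(default_factory=list)    # User-agent values
    allow: List[str] = field(default_factory=list)     # Allow paths
    disallow: List[str] = field(default_factory=list)  # Disallow paths
    crawl_delay: Optional[float] = None

def parse_robots(text: str) -> Tuple[List[Group], List[str]]:
    """Parse robots.txt into User-agent groups plus top-level Sitemap URLs."""
    groups, sitemaps = [], []
    current = None
    expecting_agents = False  # consecutive User-agent lines share one group
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key == "sitemap":
            sitemaps.append(value)  # valid anywhere in the file
        elif key == "user-agent":
            if not expecting_agents:
                current = Group()
                groups.append(current)
                expecting_agents = True
            current.agents.append(value)
        elif current is not None:
            expecting_agents = False
            if key == "allow" and value:
                current.allow.append(value)
            elif key == "disallow" and value:
                current.disallow.append(value)
            elif key == "crawl-delay":
                current.crawl_delay = float(value)
    return groups, sitemaps
```

Run against the comment-only file above, this yields zero groups — consistent with the "0 User-agent groups" summary, since content-signal comments are not robots.txt directives.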
1. Paste your input
Enter the value at the top — domain, IP, URL, email, ASN, hash, whatever fits this tool. The smart input auto-detects type.
2. Click "Inspect"
host.tools issues real probes (DNS, HTTP, TCP, TLS, WHOIS where applicable) and renders the result in milliseconds.
3. Open the API tab
Every web tool has a sibling /api/v1/http/robots JSON endpoint with the same payload. One copy-as-curl click and you're scripting it.
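The smart-input auto-detection in step 1 can be sketched with simple heuristics. These rules are an illustration only — host.tools' real classifier is not public — and the order of checks is a hypothetical design choice (most specific formats first):

```python
import ipaddress
import re

def detect_input_type(value: str) -> str:
    """Best-guess classification of a raw input string (heuristic sketch)."""
    v = value.strip()
    try:
        ipaddress.ip_address(v)  # handles both IPv4 and IPv6
        return "ip"
    except ValueError:
        pass
    if re.fullmatch(r"AS\d+", v, re.IGNORECASE):
        return "asn"
    if re.fullmatch(r"[0-9a-fA-F]{32}|[0-9a-fA-F]{40}|[0-9a-fA-F]{64}", v):
        return "hash"  # MD5 / SHA-1 / SHA-256 lengths
    if "@" in v and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v):
        return "email"
    if re.match(r"https?://", v):
        return "url"
    if re.fullmatch(r"([a-zA-Z0-9-]+\.)+[a-zA-Z]{2,}", v):
        return "domain"
    return "unknown"
```

Checking IPs and hashes before domains matters: a bare hex digest or `8.8.8.8` would otherwise be misread by a looser pattern.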
robots.txt is how the modern web declares its crawl policy. Auditing it is one of the highest-ROI checks you can run this week.
/api/v1/http/robots?q=https%3A%2F%2Fhostinfo.com
curl -s 'https://host.tools/api/v1/http/robots?q=https%3A%2F%2Fhostinfo.com'
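If you would rather script the endpoint in Python than shell, the same request can be built like this. The `https://host.tools` base URL is assumed from context; adjust it for your deployment:

```python
from urllib.parse import urlencode

BASE = "https://host.tools"  # assumed host

def robots_endpoint(target: str) -> str:
    """Build the /api/v1/http/robots URL for a target site,
    URL-encoding the target so :// survives as a query value."""
    return f"{BASE}/api/v1/http/robots?" + urlencode({"q": target})

url = robots_endpoint("https://hostinfo.com")
# Fetch with any HTTP client, e.g.:
#   import json, urllib.request
#   data = json.load(urllib.request.urlopen(url))
```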
<iframe src="/http/robots?q={INPUT}&embed=1"
width="100%" height="600" frameborder="0"></iframe>
Drop into any HTML page. The embed=1 flag hides nav and footer.
Upgrade to Pro for $19/mo. Cancel anytime. Works with the same API you already use.
Common questions
Is robots.txt parser free?
Where does the data come from?
How fresh are the results?
Append ?nocache=1 for a forced refresh.
Can I run this from the command line?
Yes: host.tools http robots YOUR_INPUT.
Can I monitor results over time?
Run robots.txt parser on a schedule. Get pinged when it changes.
Pro gets you bulk lookups, monitors, webhook alerts, history, exports and 10,000 API calls/day. $19/mo.
- ✓ Schedule any tool — every 1, 5, 15, or 60 min
- ✓ Diff against last run, alert on change
- ✓ Webhook + email + Slack + PagerDuty + OpsGenie
- ✓ Bulk CSV upload, 1,000 inputs per job
- ✓ Export results as CSV / NDJSON / Excel
- ✓ 90-day history, comparison view
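The diff-against-last-run alerting above can be sketched in a few lines. This is a minimal illustration of the idea (hash to detect change cheaply, diff to explain it), not the monitor's actual logic:

```python
import difflib
import hashlib
from typing import List

def fingerprint(body: str) -> str:
    """Stable hash of a robots.txt body, stored between runs."""
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

def diff_runs(previous: str, current: str) -> List[str]:
    """Unified diff between the last stored run and the fresh fetch.
    An empty result means nothing changed and no alert should fire."""
    if fingerprint(previous) == fingerprint(current):
        return []
    return list(difflib.unified_diff(
        previous.splitlines(), current.splitlines(),
        fromfile="last-run", tofile="current", lineterm=""))
```

A scheduler would fetch robots.txt on each tick, call `diff_runs` against the stored copy, and push any non-empty diff to a webhook.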