robots.txt parser
HTTP · /api/v1/http/robots — fetch and parse robots.txt: User-agent groups, Disallow/Allow rules, Sitemap and Crawl-delay directives.
https://tourloveai.ai/robots.txt
HTTP status: 404
Body size: 1571 bytes
User-agent groups: 0
Raw robots.txt
<!DOCTYPE html>
<html lang=en>
<meta charset=utf-8>
<meta name=viewport content="initial-scale=1, minimum-scale=1, width=device-width">
<title>Error 404 (Not Found)!!1</title>
<style>
*{margin:0;padding:0}html,code{font:15px/22px arial,sans-serif}html{background:#fff;color:#222;padding:15px}body{margin:7% auto 0;max-width:390px;min-height:180px;padding:30px 0 15px}* > body{background:url(//www.google.com/images/errors/robot.png) 100% 5px no-repeat;padding-right:205px}p{margin:11px 0 22px;overflow:hidden}ins{color:#777;text-decoration:none}a img{border:0}@media screen and (max-width:772px){body{background:none;margin-top:0;max-width:none;padding-right:0}}#logo{background:url(//www.google.com/images/branding/googlelogo/1x/googlelogo_color_150x54dp.png) no-repeat;margin-left:-5px}@media only screen and (min-resolution:192dpi){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat 0% 0%/100% 100%;-moz-border-image:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) 0}}@media only screen and (-webkit-min-device-pixel-ratio:2){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat;-webkit-background-size:100% 100%}}#logo{display:inline-block;height:54px;width:150px}
</style>
<a href=//www.google.com/><span id=logo aria-label=Google></span></a>
<p><b>404.</b> <ins>That’s an error.</ins>
<p>The requested URL <code>/robots.txt</code> was not found on this server. <ins>That’s all we know.</ins>
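The 404 above means there was no robots.txt to parse, hence the 0 User-agent groups. As a rough local sketch of what such a parser counts (the sample file and the grep-based counting below are our own illustration, not the tool's implementation):

```shell
# Local sketch of robots.txt group counting; not host.tools' actual parser.
# The sample file is invented for illustration.
cat > /tmp/robots.sample <<'EOF'
User-agent: *
Disallow: /private/
Allow: /private/public.html
Crawl-delay: 10

User-agent: Googlebot
Disallow:

Sitemap: https://example.com/sitemap.xml
EOF

# Directive names are case-insensitive, so match case-insensitively.
groups=$(grep -ci '^user-agent:' /tmp/robots.sample)
sitemap=$(grep -i '^sitemap:' /tmp/robots.sample | sed 's/^[Ss]itemap:[[:space:]]*//')
echo "User-agent groups: $groups"
echo "Sitemap: $sitemap"
```

A real fetch of https://tourloveai.ai/robots.txt returns the 404 page shown above, so the same counting yields zero groups.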
1. Paste your input
Enter the value at the top — domain, IP, URL, email, ASN, hash, whatever fits this tool. The smart input auto-detects the type.
2. Click "Inspect"
host.tools issues real probes (DNS, HTTP, TCP, TLS, WHOIS where applicable) and renders the result in milliseconds.
3. Open the API tab
Every web tool has a sibling /api/v1/http/robots JSON endpoint with the same payload. One copy-as-curl click and you're scripting it.
/api/v1/http/robots?q=https%3A%2F%2Ftourloveai.ai
curl -s 'https://host.tools/api/v1/http/robots?q=https%3A%2F%2Ftourloveai.ai'
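The endpoint's JSON field names aren't documented on this page, so the payload below is hypothetical (status, bytes and groups are assumed names mirroring the result summary above); a jq pipeline over the real response would look similar:

```shell
# Hypothetical payload shape; the real /api/v1/http/robots fields may differ.
payload='{"url":"https://tourloveai.ai/robots.txt","status":404,"bytes":1571,"groups":[]}'
echo "$payload" | jq -r '"\(.status), \(.bytes) bytes, \(.groups|length) groups"'
# → 404, 1571 bytes, 0 groups
```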
<iframe src="/http/robots?q={INPUT}&embed=1"
width="100%" height="600" frameborder="0"></iframe>
Drop into any HTML page. The embed=1 flag hides nav and footer.
Upgrade to Pro for $19/mo. Cancel anytime. Works with the same API you already use.
Common questions
Is robots.txt parser free?
Where does the data come from?
How fresh are the results?
?nocache=1 for a forced refresh.Can I run this from the command line?
host.tools http robots YOUR_INPUT.Can I monitor results over time?
Run robots.txt parser on a schedule. Get pinged when it changes.
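If you'd rather self-host the check, the same idea can be approximated with cron and diff (the paths and 15-minute interval below are arbitrary choices for this sketch; the hosted monitor adds alerting, diffs and history on top):

```
# Illustrative crontab entry: refetch every 15 minutes, keep the last copy,
# and only overwrite it when the body actually changed (hook alerting there).
*/15 * * * * curl -s https://tourloveai.ai/robots.txt -o /tmp/robots.new && { diff -q /tmp/robots.new /tmp/robots.last >/dev/null 2>&1 || mv /tmp/robots.new /tmp/robots.last; }
```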
Pro gets you bulk lookups, monitors, webhook alerts, history, exports and 10,000 API calls/day. $19/mo.
- ✓ Schedule any tool — every 1, 5, 15, 60 min
- ✓ Diff against last run, alert on change
- ✓ Webhook + email + Slack + PagerDuty + OpsGenie
- ✓ Bulk CSV upload, 1,000 inputs per job
- ✓ Export results as CSV / NDJSON / Excel
- ✓ 90-day history, comparison view