Internet-Draft                  AGENTS.TXT                  October 2025
Dutta                     Expires 10 April 2026
This document specifies the AGENTS.TXT protocol, a strict plaintext policy file for automated clients, bots, and crawlers. It defines directives, top-line hash verification, optional parameters, and mandatory failure behavior for malformed files. Malformed files are treated as fully restrictive to prevent unintended access.

This Internet-Draft is submitted in full conformance with the provisions of BCP 78 and BCP 79.

Internet-Drafts are working documents of the Internet Engineering Task Force (IETF). Note that other groups may also distribute working documents as Internet-Drafts. The list of current Internet-Drafts is at https://datatracker.ietf.org/drafts/current/.

Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use Internet-Drafts as reference material or to cite them other than as "work in progress."

This Internet-Draft will expire on 10 April 2026.

Copyright (c) 2025 IETF Trust and the persons identified as the document authors. All rights reserved.

This document is subject to BCP 78 and the IETF Trust's Legal Provisions Relating to IETF Documents (https://trustee.ietf.org/license-info) in effect on the date of publication of this document. Please review these documents carefully, as they describe your rights and restrictions with respect to this document. Code Components extracted from this document must include Revised BSD License text as described in Section 4.e of the Trust Legal Provisions and are provided without warranty as described in the Revised BSD License.
AGENTS.TXT is a strict policy file format for automated clients, similar in purpose to robots.txt but providing finer control over client behavior. Malformed files are treated as completely restrictive.

Integrity of an AGENTS.TXT file is verified with a SHA-256 hash (FIPS 180-4) of its canonical directive content.
The canonical path for the file is /agents.txt. Files MUST be served as UTF-8 with Content-Type text/plain (HTTP/1.1 Semantics).
The first non-comment, non-empty line MUST be the hash line: a '*' followed by the lowercase SHA-256 hex digest of the file's canonical content, i.e. the file excluding the hash line, comments, and blank lines. Subsequent lines are directives:
/status ALLOW
/dashboard ALLOW limit=50
/admin DISALLOW
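A client consulting these directives might resolve a request path as in the following non-normative Python sketch. Exact-path matching and a DISALLOW default for unlisted paths are assumptions of this sketch, not requirements stated by this draft:

```python
# Parsed directives from the example above, keyed by path.
DIRECTIVES = {
    "/status": ("ALLOW", {}),
    "/dashboard": ("ALLOW", {"limit": "50"}),
    "/admin": ("DISALLOW", {}),
}

def is_allowed(path: str) -> bool:
    # Assumption: unlisted paths default to DISALLOW (fail closed).
    action, _params = DIRECTIVES.get(path, ("DISALLOW", {}))
    return action == "ALLOW"
```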
Lines starting with '#' are comments; they are ignored for both hash computation and parsing. Comments may carry metadata such as version, generated-by, or grace-period.
A missing hash, a hash mismatch, or any directive syntax error MUST result in treating the entire site as restricted (per the RFC 2119 key words). Cached copies MUST be invalidated.
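The fail-closed rule can be sketched end-to-end in Python. This is a non-normative sketch: names are illustrative, any line beginning with '*' is treated as the hash line for simplicity, and a real client would also enforce the "first non-comment, non-empty line" placement rule.

```python
import hashlib

def load_policy(text: str) -> dict:
    """Return {path: action}, or an empty dict -- i.e. the whole site
    restricted -- on a missing hash line, digest mismatch, or any
    directive syntax error."""
    lines = text.splitlines()
    # Canonical content: everything except the hash line, comments,
    # and blank lines.
    content = [l for l in lines
               if l.strip() and not l.startswith(("#", "*"))]
    hash_lines = [l for l in lines if l.startswith("*")]
    if not hash_lines:
        return {}                        # hash missing: restrict all
    claimed = hash_lines[0][1:].strip()
    actual = hashlib.sha256("\n".join(content).encode("utf-8")).hexdigest()
    if actual != claimed:
        return {}                        # hash mismatch: restrict all
    policy = {}
    for line in content:
        parts = line.split()
        if len(parts) < 2 or not parts[0].startswith("/") \
                or parts[1] not in ("ALLOW", "DISALLOW"):
            return {}                    # syntax error: restrict all
        policy[parts[0]] = parts[1]
    return policy
```

An empty return value stands in for "everything DISALLOWed"; a caller holding a cached copy would discard it at the same point.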
Each directive line has the format: <path> <action> [params...]

<path> starts with '/', <action> is ALLOW or DISALLOW, and optional params are key=value pairs (as in URI query syntax).
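As a non-normative sketch, a single directive line can be parsed as follows (error messages are illustrative):

```python
def parse_directive(line: str):
    """Parse one '<path> <action> [params...]' line into
    (path, action, params); raise ValueError on any syntax error,
    which per this draft restricts the entire site."""
    parts = line.split()
    if len(parts) < 2:
        raise ValueError("expected at least <path> <action>")
    path, action, raw_params = parts[0], parts[1], parts[2:]
    if not path.startswith("/"):
        raise ValueError("path must start with '/'")
    if action not in ("ALLOW", "DISALLOW"):
        raise ValueError("action must be ALLOW or DISALLOW")
    params = {}
    for item in raw_params:
        key, sep, value = item.partition("=")
        if not sep or not key:
            raise ValueError("params must be key=value pairs")
        params[key] = value
    return path, action, params
```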
Compute the SHA-256 over the UTF-8 bytes of the file after removing the hash line, comments, and blank lines; join the remaining lines with '\n' before hashing.
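The canonicalization step above can be sketched in Python. This is non-normative; for simplicity it drops every line beginning with '*', whereas strictly only the leading hash line is excluded:

```python
import hashlib

def canonical_sha256(text: str) -> str:
    """SHA-256 of the canonical directive content: the file minus the
    hash line, comments, and blank lines, joined with newlines."""
    kept = [line for line in text.splitlines()
            if line.strip() and not line.startswith(("#", "*"))]
    return hashlib.sha256("\n".join(kept).encode("utf-8")).hexdigest()
```

Note that a file with no directives hashes to the well-known SHA-256 digest of the empty string.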
Strict malformed-file behavior ensures that a broken file cannot accidentally expose content. Site operators must keep the file valid so that clients do not treat the site as fully restricted.
# version: 1.0
*e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
# sample. may not be accurate.
/status ALLOW
/dashboard ALLOW limit=50
/admin DISALLOW
Clients SHOULD follow HTTP client best practices and API crawler guidelines when interpreting AGENTS.TXT directives. Use of AGENTS.TXT aims to reduce accidental site disruption from bot traffic.
Srijal Dutta
Email: srijaldutta.official+agentstxt@gmail.com