File: robots.txt.example

package: linkchecker 5.2-2 (Debian squeeze, main)
# Simple robots.txt example; place it in your web root.
# See the complete reference at
# http://www.robotstxt.org/wc/norobots-rfc.html

# disallow cgi-bin access for all robots
User-agent: *
Disallow: /cgi-bin/

# Block all LinkChecker versions from checking anything
User-agent: LinkChecker
Disallow: /
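
How a compliant crawler interprets these rules can be checked with Python's standard `urllib.robotparser` module. This is a minimal sketch; the `SomeBot` user agent and the `example.com` URLs are hypothetical placeholders, not part of the example above.

```python
from urllib.robotparser import RobotFileParser

# The rules from the example robots.txt above.
rules = """\
User-agent: *
Disallow: /cgi-bin/

User-agent: LinkChecker
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Generic crawlers may fetch normal pages but not anything under /cgi-bin/.
print(rp.can_fetch("SomeBot", "http://example.com/index.html"))   # True
print(rp.can_fetch("SomeBot", "http://example.com/cgi-bin/run"))  # False

# LinkChecker matches its own User-agent group and is blocked entirely.
print(rp.can_fetch("LinkChecker", "http://example.com/index.html"))  # False
```

Note that the more specific `User-agent: LinkChecker` group takes precedence over the `*` group for that crawler, which is why the blanket `Disallow: /` applies to it alone.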