File: robots.txt.example (from the linkchecker 9.3-1 package)

# Simple robots.txt example; place it in your web root.
# See the complete reference at
# http://www.robotstxt.org/wc/norobots-rfc.html

# disallow cgi-bin access for all robots
User-agent: *
Disallow: /cgi-bin/

# Disallow all LinkChecker versions from checking anything
User-agent: LinkChecker
Disallow: /
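
# Illustrative variation (an assumption, not part of the shipped example):
# to keep LinkChecker out of a single directory instead of the whole site,
# scope the Disallow rule to that path. The "/private/" path below is a
# placeholder; adjust it to your own layout. Shown commented out so this
# file keeps exactly one record per user agent.
#
# User-agent: LinkChecker
# Disallow: /private/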