File: control

package info: parsero 0.0+git20140929.e5b585a-4
  • area: main
  • in suites: bullseye
  • size: 132 kB
  • sloc: python: 216; sh: 8; makefile: 6
Source: parsero
Section: net
Priority: optional
Maintainer: Debian Security Tools <team+pkg-security@tracker.debian.org>
Uploaders: Thiago Andrade Marques <andrade@debian.org>
Build-Depends: debhelper-compat (= 13),
               dh-python,
               python3-all,
               python3-setuptools
Standards-Version: 4.5.0
Rules-Requires-Root: no
Homepage: https://github.com/behindthefirewalls/Parsero
Vcs-Git: https://salsa.debian.org/pkg-security-team/parsero.git
Vcs-Browser: https://salsa.debian.org/pkg-security-team/parsero

Package: parsero
Architecture: all
Depends: ${misc:Depends},
         ${python3:Depends},
         python3-bs4,
         python3-pip,
         python3-pkg-resources
Description: Audit tool for robots.txt of a site
 Parsero is a free script written in Python which reads the robots.txt file
 of a web server over the network and examines its Disallow entries. The
 Disallow entries tell search engines which directories or files hosted on
 the web server must not be indexed. For example, "Disallow: /portal/login"
 means that the content at www.example.com/portal/login is not to be indexed
 by crawlers such as Google, Bing, or Yahoo. This is how an administrator
 keeps sensitive or private information out of the search engines.
 .
 Parsero is useful for pentesters, ethical hackers and forensics experts.
 It can also be used for security tests.
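
The audit the description outlines is easy to see in a few lines of
standard-library Python. The snippet below is a minimal sketch of the idea,
not Parsero's actual source; the BASE URL and the helper name
disallow_entries are illustrative placeholders.

# A minimal sketch of the audit the package description outlines; this is
# NOT Parsero's actual source. BASE and disallow_entries() are illustrative
# placeholders. Standard library only.
import urllib.error
import urllib.request
from urllib.parse import urljoin

BASE = "http://www.example.com"  # hypothetical target site

def disallow_entries(base):
    """Yield the paths named by 'Disallow:' lines in base/robots.txt."""
    with urllib.request.urlopen(urljoin(base, "/robots.txt")) as resp:
        for line in resp.read().decode("utf-8", "replace").splitlines():
            line = line.strip()
            if line.lower().startswith("disallow:"):
                path = line.split(":", 1)[1].strip()
                if path:  # a bare 'Disallow:' disallows nothing
                    yield path

if __name__ == "__main__":
    for path in disallow_entries(BASE):
        url = urljoin(BASE, path)
        try:
            status = urllib.request.urlopen(url).status
        except urllib.error.HTTPError as err:
            status = err.code
        # HTTP 200 on a disallowed path means the "hidden" content is live.
        print(status, url)

A 200 response for a Disallow path is exactly the signal a pentester is
after: the path the administrator asked crawlers to skip is reachable and
may expose a login page or other sensitive content.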