File: control

Source: libwww-mechanize-perl
Section: perl
Priority: optional
Maintainer: Debian Perl Group <pkg-perl-maintainers@lists.alioth.debian.org>
Uploaders: Jay Bonci <jaybonci@debian.org>, 
 Jaldhar H. Vyas <jaldhar@debian.org>, Rene Mayorga <rmayorga@debian.org.sv>,
 Kees Cook <kees@debian.org>, gregor herrmann <gregoa@debian.org>,
 Ryan Niebur <ryan@debian.org>, Ansgar Burchardt <ansgar@debian.org>,
 Nicholas Bamber <nicholas@periapt.co.uk>
Build-Depends: debhelper (>= 8)
Build-Depends-Indep: perl,
 libhtml-form-perl | libwww-perl (<< 6),
 libhtml-parser-perl,
 libhtml-tree-perl,
 libhttp-daemon-perl | libwww-perl (<< 6),
 libhttp-server-simple-perl (>= 0.35),
 libio-socket-ssl-perl,
 libtest-exception-perl,
 libtest-memory-cycle-perl,
 libtest-nowarnings-perl,
 libtest-pod-coverage-perl,
 libtest-pod-perl,
 libtest-taint-perl,
 libtest-warn-perl,
 liburi-perl (>= 1.36),
 libwww-perl (>= 5.829),
 netbase
Standards-Version: 3.9.2
Homepage: http://search.cpan.org/dist/WWW-Mechanize/
Vcs-Git: git://git.debian.org/pkg-perl/packages/libwww-mechanize-perl.git
Vcs-Browser: http://anonscm.debian.org/gitweb/?p=pkg-perl/packages/libwww-mechanize-perl.git

Package: libwww-mechanize-perl
Architecture: all
Recommends: libio-socket-ssl-perl
Depends: ${perl:Depends}, ${misc:Depends},
 libhtml-form-perl | libwww-perl (<< 6),
 libhtml-parser-perl,
 libhtml-tree-perl,
 libhttp-daemon-perl | libwww-perl (<< 6),
 libhttp-server-simple-perl (>= 0.35),
 liburi-perl (>= 1.36),
 libwww-perl (>= 5.829)
Description: module to automate interaction with websites
 WWW::Mechanize, or Mech for short, helps you automate interaction with
 a website. It supports performing a sequence of page fetches including
 following links and submitting forms. Each fetched page is parsed and
 its links and forms are extracted. A link or a form can be selected, form
 fields can be filled in, and the next page can be fetched. Mech also stores
 a history of the URLs you've visited, which can be queried and revisited.
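
For reference, a minimal WWW::Mechanize script might look like the sketch
below; the URL, link text, and form field names are illustrative
placeholders, not anything shipped by this package:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use WWW::Mechanize;

  # Create a Mech object; autocheck => 1 makes failed fetches die early.
  my $mech = WWW::Mechanize->new( autocheck => 1 );
  $mech->get('https://example.org/');

  # Follow a link by its text, then fill in and submit a form on the
  # resulting page (link text and field names here are hypothetical).
  $mech->follow_link( text => 'Log in' );
  $mech->submit_form(
      form_number => 1,
      fields      => { user => 'alice', password => 'secret' },
  );

  # Links on each fetched page are parsed out, and Mech keeps a history
  # of visited URLs that can be walked back.
  my @links = $mech->links;
  print scalar(@links), " links on the current page\n";
  $mech->back();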