libwww-mechanize-perl 1.83-1 source package in Ubuntu
Changelog
libwww-mechanize-perl (1.83-1) unstable; urgency=medium

  [ gregor herrmann ]
  * Rename autopkgtest configuration file(s) as per new pkg-perl-autopkgtest
    schema.

  [ Salvatore Bonaccorso ]
  * debian/control: Use HTTPS transport protocol for Vcs-Git URI

  [ gregor herrmann ]
  * debian/copyright: change Copyright-Format 1.0 URL to HTTPS.

  [ Nick Morrott ]
  * Import upstream version 1.78
  * Update copyright years and Upstream-Contact
  * Update metadata
  * Declare compliance with Debian Policy 3.9.8
  * Refresh (build-) dependencies
  * Drop document_proxy.patch (applied upstream)
  * Add debian/libwww-mechanize-perl.lintian-overrides

  [ gregor herrmann ]
  * autopkgtest: enable more smoke tests.
  * Remove Jaldhar H. Vyas, Jay Bonci, Kees Cook, Rene Mayorga, and Ryan
    Niebur from Uploaders. Thanks for your work!
  * Import upstream versions 1.79, 1.83.
  * Add new build dependencies.
  * Update years of packaging copyright.
  * debian/rules: disable DNS queries in tests.

 -- gregor herrmann <email address hidden>  Sat, 22 Oct 2016 22:33:07 +0200
Upload details
- Uploaded by:
- Debian Perl Group
- Uploaded to:
- Sid
- Original maintainer:
- Debian Perl Group
- Architectures:
- all
- Section:
- perl
- Urgency:
- Medium
Downloads
File | Size | SHA-256 Checksum
---|---|---
libwww-mechanize-perl_1.83-1.dsc | 2.8 KiB | f3a16032b896e7b15d7dbf8a6c35dcc6521587914a981757d36d72c12d0a4d60
libwww-mechanize-perl_1.83.orig.tar.gz | 155.7 KiB | 4b5fe979e0d00418ba92efe40725f2ad22333ef311ca0ec98363409c870e8ce5
libwww-mechanize-perl_1.83-1.debian.tar.xz | 6.3 KiB | c511164daf4c2b9964030022b7a6145d481fb162444046f3a1777ef7c6bebdf3
Available diffs
- diff from 1.75-1 to 1.83-1 (77.4 KiB)
No changes file available.
Binary packages built by this source
- libwww-mechanize-perl: module to automate interaction with websites
WWW::Mechanize, or Mech for short, helps you automate interaction with
a website. It supports performing a sequence of page fetches including
following links and submitting forms. Each fetched page is parsed and
its links and forms are extracted. A link or a form can be selected, form
fields can be filled and the next page can be fetched. Mech also stores
a history of the URLs you've visited, which can be queried and revisited.
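The workflow described above (fetch a page, follow a link, fill and submit a form, revisit history) might look like the following sketch. The URL, link text, and form field names are hypothetical placeholders, not taken from any real site:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use WWW::Mechanize;

# autocheck => 1 makes Mech die on any failed fetch.
my $mech = WWW::Mechanize->new( autocheck => 1 );

# Fetch the starting page; its links and forms are parsed automatically.
$mech->get('https://example.org/');

# Follow a link selected by its visible text (hypothetical link).
$mech->follow_link( text_regex => qr/login/i );

# Fill fields in the first form on the page and submit it
# (field names "username" and "password" are assumptions).
$mech->submit_form(
    form_number => 1,
    fields      => { username => 'alice', password => 'secret' },
);

# The visit history can be queried and revisited.
$mech->back;
print $mech->uri, "\n";
```

The `get`, `follow_link`, `submit_form`, and `back` methods shown are part of the WWW::Mechanize API; running the script requires network access and a site with a matching link and form.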