This script is for Slackware 14.2 only and may be outdated.

SlackBuilds Repository

perl-www-robotrules (6.02)

This module parses /robots.txt files as specified in "A Standard for
Robot Exclusion" (<http://www.robotstxt.org/wc/norobots.html>).
Webmasters can use the /robots.txt file to forbid conforming robots
from accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, which
provides methods to check whether access to a given URL is
prohibited. The same WWW::RobotRules object can be used for one
or more parsed /robots.txt files on any number of hosts.
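The description above can be sketched in a short script. This is a minimal illustration, not part of the package; the host, paths, and the agent name "ExampleBot/1.0" are made-up examples, and in practice the robots.txt content would be fetched over HTTP (e.g. with LWP) rather than inlined:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::RobotRules;

# Create a rules object for a particular robot's User-Agent name
# (the name "ExampleBot/1.0" is only an illustration).
my $rules = WWW::RobotRules->new('ExampleBot/1.0');

# Inlined robots.txt content for demonstration purposes.
my $robots_txt = <<'EOF';
User-agent: *
Disallow: /private/
EOF

# Rules are stored per host, keyed by the robots.txt URL.
$rules->parse('http://example.com/robots.txt', $robots_txt);

# Check URLs against the parsed rules.
for my $url ('http://example.com/index.html',
             'http://example.com/private/data.html') {
    print $url, ': ',
          ($rules->allowed($url) ? 'allowed' : 'forbidden'), "\n";
}
```

The same object can then parse further /robots.txt files from other hosts, and allowed() dispatches on the host of the URL being checked.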

Maintained by: LukenShiro
ChangeLog: perl-www-robotrules

Homepage:
https://metacpan.org/pod/WWW::RobotRules

Source Downloads:
WWW-RobotRules-6.02.tar.gz (MD5: b7186e8b8b3701e70c22abf430742403)

Download SlackBuild:
perl-www-robotrules.tar.gz
perl-www-robotrules.tar.gz.asc

(the SlackBuild does not include the source)

Validated for Slackware 14.2

See our HOWTO for instructions on how to use the contents of this repository.

Access to the repository is available via:
ftp, git, cgit, http, rsync

© 2006-2024 SlackBuilds.org Project. All rights reserved.
Slackware® is a registered trademark of Patrick Volkerding
Linux® is a registered trademark of Linus Torvalds