This queue is for tickets about the WWW-RobotRules CPAN distribution.

Report information
The Basics
  Id:        23895
  Status:    open
  Priority:  Low/Low

People
  Owner:       Nobody in particular
  Requestors:  keyoshid [...] yahoo-corp.jp
  Cc:
  AdminCc:

BugTracker
  Severity:   (no value)
  Broken in:  (no value)
  Fixed in:   (no value)

Subject: Bug in WWW::RobotRules
MIME-Version: 1.0
X-Mailer: MIME-tools 5.418 (Entity 5.418)
Content-Type: text/plain; charset="utf8"
Content-Disposition: inline
Content-Transfer-Encoding: binary
X-RT-Original-Encoding: utf-8
Content-Length: 1120

Hello,

I found a bug in WWW::RobotRules, so I am reporting it.

Distribution: WWW::RobotRules 1.33
Perl: v5.8.5

<sampleCode>
use strict;
use WWW::RobotRules;

my $robotsContent = <<"EOT";
User-agent: Test/1.0
Disallow: /
EOT

my $useUserAgent = "Test/1.0";

my $robotRules = WWW::RobotRules->new($useUserAgent);
$robotRules->parse("http://test/robots.txt", $robotsContent);

if ($robotRules->allowed("http://test/")) {
    print "[ALLOW]\n";
} else {
    print "[DENY]\n";
}
</sampleCode>

I think printing "[DENY]" is the correct result, but WWW::RobotRules 1.33 prints "[ALLOW]".

<Patch>
diff: conflicting specifications of output style
--- RobotRules.pm~	Fri Dec 8 20:26:59 2006
+++ RobotRules.pm	Fri Dec 8 20:27:18 2006
@@ -132,7 +132,8 @@
     # See whether my short-name is a substring of the
     # "User-Agent: ..." line that we were passed:
 
-    if(index(lc($me), lc($ua_line)) >= 0) {
+#   if(index(lc($me), lc($ua_line)) >= 0) {
+    if(index(lc($ua_line), lc($me)) >= 0) {
 	LWP::Debug::debug("\"$ua_line\" applies to \"$me\"")
 	    if defined &LWP::Debug::debug;
 	return 1;
</Patch>
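
[Editorial note, not part of the original report: a minimal sketch of what the two index() orderings in the patch do with the strings from the sample code above. It assumes the module reduces the name passed to new() to the short name "Test" (dropping the "/1.0" version), so the lower-cased values being compared are roughly these.]

use strict;
use warnings;

# Hypothetical values for illustration: $me is the robot's short name,
# $ua_line is the robots.txt "User-agent:" value from the sample code.
my $me      = 'test';
my $ua_line = 'test/1.0';

# Unpatched 1.33: looks for the robots.txt line inside the short name.
print 'unpatched: ', (index($me, $ua_line) >= 0 ? 'match' : 'no match'), "\n";  # no match => [ALLOW]

# Patched: looks for the short name inside the robots.txt line.
print 'patched:   ', (index($ua_line, $me) >= 0 ? 'match' : 'no match'), "\n";  # match => [DENY]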
MIME-Version: 1.0
X-Mailer: MIME-tools 5.418 (Entity 5.418)
Content-Disposition: inline
Message-Id: <rt-3.6.HEAD-9420-1184697479-1278.23895-0-0@rt.cpan.org>
Content-Type: text/plain; charset="utf8"
Content-Transfer-Encoding: binary
X-RT-Original-Encoding: utf-8
Content-Length: 1488
On Mon Dec 11 21:26:40 2006, keyoshid wrote:
> Hello,
>
> I found a bug in WWW::RobotRules, so I am reporting it.
>
> Distribution: WWW::RobotRules 1.33
> Perl: v5.8.5
>
> <sampleCode>
> use strict;
> use WWW::RobotRules;
>
> my $robotsContent = <<"EOT";
> User-agent: Test/1.0
> Disallow: /
> EOT
>
> my $useUserAgent = "Test/1.0";
>
> my $robotRules = WWW::RobotRules->new($useUserAgent);
> $robotRules->parse("http://test/robots.txt", $robotsContent);
>
> if ($robotRules->allowed("http://test/")) {
>     print "[ALLOW]\n";
> } else {
>     print "[DENY]\n";
> }
> </sampleCode>
>
> I think printing "[DENY]" is the correct result, but WWW::RobotRules 1.33
> prints "[ALLOW]".
>
> <Patch>
> diff: conflicting specifications of output style
> --- RobotRules.pm~	Fri Dec 8 20:26:59 2006
> +++ RobotRules.pm	Fri Dec 8 20:27:18 2006
> @@ -132,7 +132,8 @@
>      # See whether my short-name is a substring of the
>      # "User-Agent: ..." line that we were passed:
>  
> -    if(index(lc($me), lc($ua_line)) >= 0) {
> +#   if(index(lc($me), lc($ua_line)) >= 0) {
> +    if(index(lc($ua_line), lc($me)) >= 0) {
>  	LWP::Debug::debug("\"$ua_line\" applies to \"$me\"")
>  	    if defined &LWP::Debug::debug;
>  	return 1;
> </Patch>
This patch should not be applied, as it causes test failures. The problem is in the test code above: it includes a version number in the robots.txt User-agent line, and that mistake is what causes the behaviour being reported.
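
[Editorial note: a minimal sketch of the same test with the version number removed from the robots.txt User-agent line; the hostname "test" and the agent name "Test/1.0" are simply the values from the report. With this change the unpatched module is expected to print [DENY].]

use strict;
use warnings;
use WWW::RobotRules;

# Same test as in the report, but the robots.txt User-agent line carries
# only the robot's name, without the "/1.0" version suffix.
my $robotsContent = <<"EOT";
User-agent: Test
Disallow: /
EOT

my $robotRules = WWW::RobotRules->new("Test/1.0");
$robotRules->parse("http://test/robots.txt", $robotsContent);

# The Disallow rule now applies to this robot, so this prints [DENY].
print $robotRules->allowed("http://test/") ? "[ALLOW]\n" : "[DENY]\n";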

