
This queue is for tickets about the Net-Async-HTTP CPAN distribution.

Report information
The Basics
Id: 94304
Status: resolved
Priority: 0/
Queue: Net-Async-HTTP

People
Owner: Nobody in particular
Requestors: HKCLARK [...] cpan.org
Cc:
AdminCc:

Bug Information
Severity: (no value)
Broken in: (no value)
Fixed in: 0.35



Subject: Hangs on multiple requests
If I use the attached script to try various combinations of URLs, it seems to hang after the first URL. For example, this works:

    perl net-async-http_test.pl http://192.168.1.2/page1.html

But these hang:

    perl net-async-http_test.pl http://192.168.1.2/page1.html http://192.168.1.2/page2.html
    perl net-async-http_test.pl http://192.168.1.2/page1.html http://192.168.1.2/page1.html

Note that 192.168.1.2 is a local, pretty much vanilla Apache server (which does, apparently, close the connection after each request). If I use a public HTTP/1.1 server that doesn't close after each request, it works.

Oh, and as discussed with paule on IRC (but documenting here) :-) if you change the constructor call to Net::Async::HTTP to this, it works:

    my $http = Net::Async::HTTP->new( max_connections_per_host => 0 );

Thanks!
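The workaround above can be dropped straight into the attached test script; a minimal sketch of just the constructor change (assuming a Net::Async::HTTP version from before the 0.35 fix, where the hang occurs):

```perl
use strict;
use warnings;

use IO::Async::Loop;
use Net::Async::HTTP;

my $loop = IO::Async::Loop->new;

# Workaround from this ticket: a value of 0 disables the per-host
# connection limit, so each queued request may open its own connection
# rather than waiting on one the server has already closed.
my $http = Net::Async::HTTP->new(
   max_connections_per_host => 0,
);
$loop->add( $http );
```

With the 0.35 fix applied, the default constructor works again and this setting is no longer needed as a workaround.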
Subject: net-async-http_test.pl
#!/usr/bin/env perl

use strict;
use warnings;

use Future;
use IO::Async::Loop;
use Net::Async::HTTP;

die "At least one URL on cmd line required\n" unless scalar(@ARGV);

my $loop = IO::Async::Loop->new;
my $http = Net::Async::HTTP->new;
$loop->add( $http );

sub GET {
   my ( $url ) = @_;
   print "Getting: $url\n";
   return $http->GET( $url );
}

my $f = Future->needs_all( map { GET($_) } @ARGV )->on_done( sub {
   print "\nDone!\n\n";
});

$f->get;
At first glance here, it appears that the fact the server closes the connection after the first request is the key element. None of the on_closed code seems to handle the case that there may be more pending requests sitting in the ready queue. I'll write up a unit test and see where that comes to.

-- Paul Evans
On Sat Mar 29 14:51:35 2014, PEVANS wrote:
> None of the on_closed code seems to handle the case that there may be
> more pending requests sitting in the ready queue. I'll write up a unit
> test and see where that comes to.
Indeed, this seems to be it. Find the patch attached.

-- Paul Evans
Subject: rt94304.patch
=== modified file 'lib/Net/Async/HTTP.pm'
--- lib/Net/Async/HTTP.pm	2014-03-29 18:38:27 +0000
+++ lib/Net/Async/HTTP.pm	2014-03-29 19:25:57 +0000
@@ -387,9 +387,15 @@
       on_closed => sub {
          my $conn = shift;
+         my $http = $conn->parent;

          $conn->remove_from_parent;
          @$conns = grep { $_ != $conn } @$conns;
+
+         if( my $next = first { !$_->connecting } @$ready_queue ) {
+            # Requeue another connection attempt as there's still more to do
+            $http->get_connection( %args, ready => $next );
+         }
       },
   );

=== modified file 't/06close.t'
--- t/06close.t	2014-01-22 20:17:07 +0000
+++ t/06close.t	2014-03-29 19:25:57 +0000
@@ -36,6 +36,7 @@
    return Future->new->done( $self );
 };

+# HTTP/1.1 pipelining - if server closes after first request, others should fail
 {
    my @f = map { $http->do_request(
       request => HTTP::Request->new( GET => "/$_", [ Host => $host ] ),
@@ -67,4 +68,42 @@
    ok( $f[2]->failure );
 }

+# HTTP/1.0 connection: close behaviour. second request should get written
+{
+   my @f = map { $http->do_request(
+      request => HTTP::Request->new( GET => "/$_", [ Host => $host ] ),
+      host => $host,
+   ) } 1 .. 2;
+
+   my $request_stream = "";
+   wait_for_stream { $request_stream =~ m/$CRLF$CRLF/ } $peersock => $request_stream;
+
+   $request_stream = "";
+
+   $peersock->print( "HTTP/1.0 200 OK$CRLF" .
+                     "Content-Type: text/plain$CRLF" .
+                     $CRLF .
+                     "Hello " );
+   $peersock->close;
+   undef $peersock;
+
+   wait_for { $f[0]->is_ready };
+   ok( !$f[0]->failure, 'First request succeeds after HTTP/1.0 EOF' );
+
+   wait_for { defined $peersock };
+   ok( defined $peersock, 'A second connection is made' );
+
+   wait_for_stream { $request_stream =~ m/$CRLF$CRLF/ } $peersock => $request_stream;
+
+   $peersock->print( "HTTP/1.0 200 OK$CRLF" .
+                     "Content-Type: text/plain$CRLF" .
+                     $CRLF .
+                     "World!" );
+   $peersock->close;
+   undef $peersock;
+
+   wait_for { $f[1]->is_ready };
+   ok( !$f[1]->failure, 'Second request succeeds after second HTTP/1.0 EOF' );
+}
+
 done_testing;
This was released some time ago in 0.35.

-- Paul Evans

