

This queue is for tickets about the CGI CPAN distribution.

Report information
The Basics
Id: 17830
Status: resolved
Priority: 0
Queue: CGI

Owner: Nobody in particular
Requestors: webmaster [...]

Bug Information
Severity: Critical
Broken in: (no value)
Fixed in: (no value)

Subject: User cancel of file upload causes runaway process
I have been using CGI.pm to upload photos in one of my scripts with no problems for a number of years (since 2001). However, a problem has recently started to appear since I upgraded the service I have with my ISP. It might be a bug in the current version of CGI.pm.

If a web user starts the transfer of a file of between about 100K and 500K and then cancels it with the browser's Stop button after about 5 seconds, the process starts to run away and uses up all available CPU. The process keeps running until stopped manually. I get no errors in the logs, nor in the browser.

The 500K upper limit is probably because I use the following:

use CGI;
use CGI::Carp qw(fatalsToBrowser);
$CGI::POST_MAX = 1024 * 500;  # max 500K posts

I don't know if it is a clue, but I also get no error in the browser if the user exceeds the 500K limit (even though that does not trigger the runaway problem). Don't know if that is normal.

This happens consistently and is easily replicable on my site. I have CGI.pm 3.16, perl 5.8.7, RedHat 7.3 i686, Apache/1.3.34, Linux 2.4.20-021stab022.5.777-enterprise #1 SMP Fri Sep 3 12:45:02 MSD 2004 i686 unknown. I have tested this using both IE 6 and Firefox with exactly the same results.

I hope someone can help with this one... :)

Regards,
William
From: [...]
I reproduce this bug in this way:

#!/usr/bin/perl
use CGI;

$CGI::POST_MAX = 1;
my $q = new CGI;
print $q->header(), $q->start_html();
print <<HTML;
<form method="post" enctype="multipart/form-data">
<input type="file" name="uploaded_file">
<input type="submit" name="Action" value=" OK " />
</form>
HTML
1;

I view the script in a browser and it shows the upload form. I upload a file that is big enough to take several seconds, and cancel the upload before it is fully transferred, e.g. by pressing Escape. The process then runs forever without producing output.

The bug is caused by this code in CGI.pm:

while ($tmplength > 0) {
    my $maxbuffer = ($tmplength < 10000) ? $tmplength : 10000;
    my $bytesread = $MOD_PERL
        ? $self->r->read($buffer, $maxbuffer)
        : read(STDIN, $buffer, $maxbuffer);
    $tmplength -= $bytesread;
}

This becomes an endless loop because $bytesread is 0 at the premature end of the file, so $tmplength never decreases. The bug appears in versions 3.12 through 3.20. It is fixed as of 3.21; the changelog entry is: "Don't try to read data at all when POST > $POST_MAX."
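The diagnosis above comes down to a generic drain-loop mistake: the loop trusts the declared content length and never checks whether the read actually returned data. A minimal sketch of the terminating version of that loop, in Python for illustration (CGI.pm itself is Perl; the function name, stream, and byte counts here are hypothetical, not from the distribution):

```python
import io

def drain_remaining(stream, tmplength, chunk=10000):
    """Read and discard up to tmplength bytes, mirroring the CGI.pm loop,
    but bailing out when the stream ends early (read returns b'')."""
    while tmplength > 0:
        maxbuffer = min(tmplength, chunk)
        buf = stream.read(maxbuffer)
        if not buf:
            # Premature EOF: the client cancelled the upload. Without this
            # check the loop would subtract 0 forever and spin at 100% CPU.
            break
        tmplength -= len(buf)
    return tmplength  # bytes still unaccounted for (0 if fully drained)

# A client that declared 500 bytes but sent only 120 before cancelling:
truncated = io.BytesIO(b"x" * 120)
leftover = drain_remaining(truncated, 500)  # returns 380, loop terminates
```

The key design point is the same in any language: a short or zero-length read is a normal outcome on a network stream, so the loop must treat it as a stop condition rather than assume the byte count will reach zero.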

This service is sponsored and maintained by Best Practical Solutions and runs on infrastructure.
