
This queue is for tickets about the Archive-Tar CPAN distribution.

Report information
The Basics
Id: 68322
Status: new
Priority: 0/
Queue: Archive-Tar

Owner: Nobody in particular
Requestors: miz_cpan [...]

Bug Information
Severity: (no value)
Broken in: (no value)
Fixed in: (no value)

Subject: Archive::Tar memory usage during extraction
Date: Thu, 19 May 2011 20:02:53 +0200 (CEST)
To: bug-archive-tar [...]
From: "Misi CPAN Mladoniczky" <miz_cpan [...]>
Hi,

I am having problems getting Archive::Tar to extract without consuming too much memory. The starting file is called orig.tar.gz and is 847M in size. The first entry of the tar is another file, called firstfile.tar.gz, which is 585M in size.

I have used both of these syntaxes:
- Archive::Tar->extract_archive($file, 1)
- Archive::Tar->iter($file, 1)

In both cases, the memory consumption goes up to 4G during the extraction of the first large file (585M), and then drops back to around 2.5G. This just seems to be too much. I would think that it should never really need to exceed 585M by that much, if you refrain from doing any random access.

What do you think about this? I can live with the fact that Archive::Tar is a little bit slow, but there must be a way to improve memory consumption. On my 64-bit system, the perl process actually dies immediately after the extraction. On a 32-bit system I guess I will have even more of a problem.

Best Regards
- Misi
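For reference, the iterator style mentioned above can be exercised as follows. This is a minimal self-contained sketch: the tiny archive it builds is a stand-in for the reporter's orig.tar.gz, and the file names and sizes here are invented for illustration. Archive::Tar's iter() returns a closure that reads one member per call rather than loading the whole archive, though each member's data is still held in memory while it is processed.

```perl
use strict;
use warnings;
use Archive::Tar;                 # core module; exports COMPRESS_GZIP
use File::Temp qw(tempdir);

# Build a small gzip-compressed test archive (stand-in for orig.tar.gz).
my $dir  = tempdir(CLEANUP => 1);
my $path = "$dir/test.tar.gz";
my $tar  = Archive::Tar->new;
$tar->add_data('firstfile.dat', 'x' x 1024);   # hypothetical member
$tar->add_data('second.txt',    "hello\n");    # hypothetical member
$tar->write($path, COMPRESS_GZIP);

# Stream the archive one member at a time instead of slurping it whole.
# iter($file, $compressed) returns a closure; each call yields the next
# Archive::Tar::File object, or undef when the archive is exhausted.
my $next  = Archive::Tar->iter($path, 1);
my $count = 0;
while (my $file = $next->()) {
    $count++;
    printf "%s (%d bytes)\n", $file->name, $file->size;
    # $file->extract;   # would write this member to disk
}
```

Even with this streaming approach, a single large member (such as the 585M firstfile.tar.gz) is decompressed and buffered in full before extraction, which may account for part of the spike the reporter observed.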
Subject: [ #68322]
Date: Fri, 20 May 2011 07:52:38 +0200 (CEST)
To: bug-Archive-Tar [...]
From: "Misi CPAN Mladoniczky" <miz_cpan [...]>
Hi,

The memory problem is still there, but it seems my script did not break after all: I got an FTP timeout that I did not handle.

/Misi

This service is sponsored and maintained by Best Practical Solutions.
