This queue is for tickets about the Text-CSV_XS CPAN distribution.

Report information

The Basics
Id: 74216
Status: resolved
Priority: Low/Low
Queue: Text-CSV_XS

People
Owner: Nobody in particular
Requestors: danboo [...] cpan.org
Cc:
AdminCc:

BugTracker
Severity: Normal
Broken in: 0.73
Fixed in: 0.86



Subject: setting 'eol' affects global input record separator
The following demonstrates an issue where setting the 'eol' value to CRLF causes a subsequent, unrelated file read to slurp the entire contents at once. The expected behavior can be restored by setting $/ back to "\n". Note that this is running on a Linux system, where a CRLF 'eol' is a non-default value.

This script outputs:

100
1
100

while the expected output is:

100
100
100

Perhaps this is a deeper issue in IO::Handle or PerlIO, but I wanted to raise it where I first observed the problem. Thanks!

- danboo

use strict;
use warnings;

use Text::CSV_XS;

## prints 100 (lines) as expected
slurp_check();

my $csv_data = "a,b,c" . "\015\012" . "1,2,3" . "\015\012";

open my $csv_fh, '<', \$csv_data or die $!;

my $csv = Text::CSV_XS->new( { eol => "\015\012" } );

my $csv_parse = $csv->getline($csv_fh);

## now prints 1
slurp_check();

## restore $/ to get 100 again
{ local $/ = "\n"; slurp_check() }

sub slurp_check {
    my $data = join "\n", 1 .. 100;
    open my $fh, '<', \$data or die $!;
    print scalar @{[ <$fh> ]};
    close $fh;
    print "\n";
}
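For code pinned to an affected release (0.73 up to, but not including, 0.86), a minimal workaround sketch, assuming all you need is to keep the parse from changing the caller's $/: localize the variable around the getline() call so any change the module makes is undone when the block exits. Upgrading to 0.86 or later remains the real fix.

use strict;
use warnings;
use Text::CSV_XS;

my $csv_data = "a,b,c" . "\015\012" . "1,2,3" . "\015\012";
open my $csv_fh, '<', \$csv_data or die $!;

my $csv = Text::CSV_XS->new( { eol => "\015\012" } );

my $row;
{
    local $/ = $/;    ## confine any change to $/ to this block
    $row = $csv->getline($csv_fh);
}

print join( '|', @$row ), "\n";    ## prints "a|b|c"; $/ outside the block is unchanged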
Uploaded 0.86 with fix
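A quick sanity check, assuming you simply want to confirm that the installed release carries the fix before dropping any workaround: the class method VERSION() dies if the installed version is older than the one requested.

use strict;
use warnings;
use Text::CSV_XS;

## dies unless the installed Text::CSV_XS is at least 0.86
Text::CSV_XS->VERSION(0.86);

print "Text::CSV_XS $Text::CSV_XS::VERSION includes the fix\n";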

