Subject: Segmentation fault at ~2 GB of memory
Parsing a large file (about 317 lines, roughly 19670 bytes) causes a segmentation fault as soon as Perl's memory usage reaches ~2 GB. It's hard to tell for certain, but memory allocation seemed to accelerate the further the parser got into the file. Is there any way to force Regexp::Grammars to treat a list-like subrule as non-backtracking (atomic), e.g. "<[statements=block_stmt]>*+" instead of "<[statements=block_stmt]>*"? Wouldn't that save memory, since Regexp::Grammars could fail outright instead of saving backtracking positions?

Tested with Perl 5.26.1, macOS 10.13.4, Regexp::Grammars 1.048.
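For what it's worth, here is a minimal sketch of the kind of grammar I mean. The grammar itself (the rule names `block` and `block_stmt`, and the toy statement syntax) is made up for illustration. One possible workaround I have been experimenting with, but have NOT verified against Regexp::Grammars' internals, is wrapping the repeated subrule in a plain-Perl atomic group `(?> ... )` instead of the unsupported `*+` quantifier:

```perl
use strict;
use warnings;
use Regexp::Grammars;

# Hypothetical minimal grammar with a list-like subrule. The (?> ... )
# atomic group is a standard Perl regex construct; whether it actually
# prevents Regexp::Grammars from retaining backtracking state for the
# <[statements=block_stmt]>* repetition is an open question.
my $parser = qr{
    <block>

    <rule: block>
        \{ (?> <[statements=block_stmt]>* ) \}

    <rule: block_stmt>
        [^{};]+ ;
}xms;

if ('{a;b;c;}' =~ $parser) {
    # %/ holds the result tree; statements is an array ref
    print scalar @{ $/{block}{statements} }, "\n";
}
```

Even if the atomic group is semantically safe here, I don't know whether it helps with the memory growth, which is why a first-class `*+` on list subrules would be nice.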