Fix stream processing #35
base: master
Conversation
I can confirm that I also had issues while trying to merge several (30+) GeoJSON files. Furthermore, I got a memory-leak warning:
I tried the fixed code from this pull request and it solved the issue.
Just for the sake of well-written code, you could remove the variable declaration, because the variable is actually unused:

```js
function mergeFeatureCollectionStream (inputs) {
  const out = geojsonStream.stringify();
  const streams = inputs.map(file => fs.createReadStream(file));
  new StreamConcat(streams)
    .pipe(geojsonStream.parse())
    .pipe(out);
  return out;
}
```
Thanks for the suggestion. I have updated the code.
Same issue here with 1.2 GB in 72 files. This fix cleared it right up!
While trying to merge a large number of GeoJSON files in streaming mode, I encountered two bugs.
The merged file did not contain all of the content of the separate files.
This seems to be related to parallel stream processing; at least, serializing the processing of the separate file streams fixes it.
The contents of the first file argument were missing from the merged file.
This was caused by the way command-line arguments are parsed: `-s` was parsed as `key=value`, so the first file was interpreted as the value for the `s` option.
I've tested the changes on macOS with Node v13.5.0.