Hi,
When using large .geojson files (in my case >1.6 GB), the provider throws an error:
{"message":"Error parsing file ....geojson: Cannot create a string longer than 0x3fffffe7 characters","level":"error"}
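For context on the error itself: 0x3fffffe7 (~1.07 billion) characters is V8's hard cap on string length, so decoding a file this large into a single string fails before JSON.parse ever runs. A minimal repro sketch of that failure mode (the file name is illustrative, and this is not necessarily the provider's exact code):

```js
const fs = require('fs')

// Decoding the whole file into one string throws ERR_STRING_TOO_LONG
// ("Cannot create a string longer than 0x3fffffe7 characters") for
// files past V8's string-length cap, before JSON.parse is reached.
const text = fs.readFileSync('large.geojson', 'utf8')
const geojson = JSON.parse(text)
```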
Going through the source code, it seems the provider loads the entire file content into memory and then parses it. Is that correct? Would it make sense to read one entry at a time (or a buffered batch), process it, and add each processed item/batch to the result (unless there is a limit on the result as well)? Something like the sketch below.
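A streaming parser can pull features out one at a time without ever materializing the whole file as a string. A minimal sketch, assuming the JSONStream package (other streaming JSON parsers such as stream-json work similarly); the file name is illustrative, not the provider's actual design:

```js
const fs = require('fs')
const JSONStream = require('JSONStream')

const features = []
fs.createReadStream('large.geojson')
  // Emit each element of the top-level "features" array as it is parsed,
  // so only one feature (plus stream buffers) is decoded at a time.
  .pipe(JSONStream.parse('features.*'))
  .on('data', (feature) => {
    features.push(feature) // or process/flush in batches instead of accumulating
  })
  .on('end', () => {
    console.log(`parsed ${features.length} features`)
  })
```

Note that accumulating every feature still builds a large in-memory array; the gain here is only avoiding the single >1 GiB string that triggers the error.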
Thanks!
PS: I'm new to Koop, so this may be a silly idea :)
I think you could try, though at some point increasing file sizes will produce noticeable latency. Once your data is processed, it gets passed on to other Koop dependencies that handle any filtering, sorting, reprojection, etc., defined by the request.
Also, if you are using the feature server output, it caps the number of features returned at a maximum record count (it expects large datasets to be paged).
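For example, the standard Geoservices paging parameters let a client walk a large dataset in chunks rather than requesting everything at once (the route below is illustrative; the exact path depends on how the provider is registered):

```
GET /file-geojson/rest/services/large/FeatureServer/0/query?where=1%3D1&resultOffset=0&resultRecordCount=2000&f=json
```

Subsequent pages bump resultOffset by resultRecordCount until fewer than resultRecordCount features come back.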