Algorithm taking way too long when precision is disproportionately small #101
Comments
Can you make a more isolated example to debug this, e.g. a small JS snippet including the data that produces an OOM in Node?
Of course! Let me just prepare the example for you...
The link below has the video demonstrating the OOM error, a README.md, the .js file used, and the .geojson file used as the data in the video. https://mega.nz/file/QYETkSCQ#COvB7fj5XzjyQFB9MQIk4YlvjgvLWrYej4kEcoTqFxw
Should you be curious as to why I'm calculating the POI of OpenStreetMap elements: I believe the POI calculation would be a safe way to figure out which elements contain which, as I've noticed there are many elements in OpenStreetMap which lack a "@relation" tag. One of my goals would be to mend this data inconsistency by using the POI list to build a tree-like list of hierarchical relationships, and then show the "relationship tree" I've built to the relevant community.
Can you elaborate on this? Why can't the data be split into isolated polygons which are used as the input for polylabel? I don't have much bandwidth to debug complex apps, so it would be nice to have a minimal test case (e.g. just a single polygon).
I was messing something up on my side; I can isolate the polygon after all. For debugging you can try this: the polygon is here, since it's way too much text to have here on the page.
@NMC92 great, thanks for the clear test case! I'll investigate. What's the coordinate system you're using in the input data? 0.0005 seems way too small for this kind of input scale.
@mourner Sure! I'm not sure this answers your question, but I'm using the "Web Mercator" projection, so I believe the coordinate system should be WGS 84/Pseudo-Mercator.
@NMC92 the projection is in Mercator meters, so at that latitude (~54 degrees), 0.0005 is a precision of 3 millimeters. 😅 Keeping this open because we probably need to put some safeguards in place so that when a bad precision value is provided, it doesn't run out of memory.
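For context, a minimal sketch of the general conversion, assuming the input is in Web Mercator (EPSG:3857) units: a projected distance corresponds to roughly cos(latitude) times that distance on the ground. The function name below is illustrative, not part of polylabel.

```js
// Rough sketch: ground distance represented by a Web Mercator (EPSG:3857)
// coordinate delta at a given latitude. Web Mercator exaggerates distances
// by 1 / cos(latitude), so a projected delta shrinks by cos(latitude) on
// the ground. The function name is illustrative, not part of polylabel.
function mercatorToGroundMeters(mercatorDistance, latitudeDeg) {
  const latitudeRad = (latitudeDeg * Math.PI) / 180;
  return mercatorDistance * Math.cos(latitudeRad);
}

console.log(mercatorToGroundMeters(0.0005, 54)); // ≈ 0.0003 ground metres
```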
I see. It is indeed extremely high precision, and seemingly unnecessarily so, except that some places are so close to each other that it ends up being justified. It would be great to scale the precision according to the size of the actual polygon, in order to avoid complications; then again, it's merely a suggestion for the future. Thanks a lot @mourner!
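A minimal sketch of that suggestion, assuming a GeoJSON-style array of rings as input: derive the precision from the polygon's own bounding box instead of using a fixed value. The helper name and the 1/1000 fraction are arbitrary illustrations, not anything the library provides.

```js
const polylabel = require('polylabel');

// Sketch: scale precision to the polygon's size by taking a fixed fraction
// of its bounding-box diagonal, so the number of subdivisions stays bounded
// regardless of how large or small the input coordinates are.
function adaptivePolylabel(rings, fraction = 1e-3) {
  let minX = Infinity, minY = Infinity, maxX = -Infinity, maxY = -Infinity;
  for (const [x, y] of rings[0]) { // the outer ring is enough for the bbox
    if (x < minX) minX = x;
    if (y < minY) minY = y;
    if (x > maxX) maxX = x;
    if (y > maxY) maxY = y;
  }
  const diagonal = Math.hypot(maxX - minX, maxY - minY);
  return polylabel(rings, diagonal * fraction);
}
```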
As pointed out (#82 (comment)), the size of the queue can be reduced by only pushing cells whose max possible distance is greater than the current best distance. For a lot of these degenerate cases, if this check is applied to the initial tiling cells as well, you won't get this error. If the centroid isn't inside the polygon you can fall back to this degenerate case too, but with a trivial check/change to the starting best_cell this is no longer an issue (#97 (comment)).
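A minimal sketch of that pruning check, written against polylabel-style internals (cells carrying `d`, the distance at the cell centre, and `max`, an upper bound for the whole cell, in a priority queue ordered by `max`). The helper name is hypothetical and this is not the library's actual code, just the idea of filtering cells before pushing rather than only when popping.

```js
// Sketch of the pruning idea: skip child cells that cannot possibly beat
// the current best distance by more than the requested precision, so the
// queue stays small even when precision is tiny relative to the polygon.
// `cell`, `bestCell` and `cellQueue` are assumed to follow polylabel's
// internal shape (a max-priority queue of cells ordered by `cell.max`).
function pushIfPromising(cellQueue, cell, bestCell, precision) {
  if (cell.max - bestCell.d > precision) {
    cellQueue.push(cell);
  }
}
```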
Hmmm, interesting stuff. Just adding my 2 cents: it appears that the problem may lie in the desired scale precision, say one meter or so, being dependent on the parallel you are at. Because the Web Mercator projection is "stretched at the poles", you might need less precision up north than at the equator. What is the relationship between latitude in degrees and the numerical precision needed to achieve one-meter accuracy? Does the required precision increase or decrease as we move towards higher latitudes?
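To hazard an answer (a sketch, assuming the coordinates are in Web Mercator meters as discussed above): for a fixed ground accuracy, the required precision value scales with 1 / cos(latitude), so it can be made numerically larger, i.e. coarser, at higher latitudes. In that sense, less numerical precision is needed up north. The function name is illustrative.

```js
// Sketch: Web Mercator precision value needed for a desired ground accuracy
// at a given latitude. It grows as 1 / cos(latitude), so a coarser value
// suffices at higher latitudes for the same accuracy on the ground.
function precisionForGroundMeters(groundMeters, latitudeDeg) {
  const latitudeRad = (latitudeDeg * Math.PI) / 180;
  return groundMeters / Math.cos(latitudeRad);
}

console.log(precisionForGroundMeters(1, 0));  // 1.00 at the equator
console.log(precisionForGroundMeters(1, 54)); // ≈ 1.70 at 54° latitude
console.log(precisionForGroundMeters(1, 80)); // ≈ 5.76 at 80° latitude
```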
The algorithm either takes way too long to calculate the label of rectangular shapes or it runs out of memory while doing so.
I am using Polylabel to calculate the POI of GeoJSON features from OpenStreetMap.
Examples of such shapes include:
https://www.openstreetmap.org/relation/2730170
https://www.openstreetmap.org/relation/153548
https://www.openstreetmap.org/relation/12863380
https://www.openstreetmap.org/relation/7491995
https://www.openstreetmap.org/relation/7585771
https://www.openstreetmap.org/relation/7717839
The precision I'm using is 0.000001, since the shape is often minuscule, which requires high precision.
I also changed the code to adjust the precision according to the size of the shape (`parseInt(inputFeature.properties.admin_level) <= 10 ? 0.0000475 : precision`), but the fact that there is an OOM error indicates a memory leak, which should be fixed.
I understand it taking longer due to the high precision, but it doesn't make sense for it to run out of memory while calculating the center of a square shape.
The command I'm using is:
geojson-polygon-labels --precision=0.000001 --label=polylabel --style=largest example.geojson > labels.geojson