I have a use case that consists of bootstrapping a database with a subset of the planet. This subset is sparse enough that it would not cause memory problems if it were eagerly allocated. I imagine other people have similar requirements: essentially being able to run a `filter_map` on the data. One can do this serially with the supplied functions, but not in parallel, which is a requirement for anyone doing frequent runs on big dumps.

I generalized my approach in this PR. Although it is more granular than what #42 offers, it is slightly less elegant, as it assumes that you want your blobs decoded and that you want to go through `OsmData`, which might not be true.

I could not get `flatten()` to propagate errors, and the workaround is not a lazy iterator. This approach can easily fill someone's RAM if the filters happen to be too broad. Nonetheless, I've been using it reliably for a while now and only just had the time to upstream it. Hope it serves someone else.

Cheers
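For readers wondering what a parallel `filter_map` over decoded blobs looks like, here is a minimal std-only sketch of the idea. The `Element` type, the `par_filter_map` helper, and the error type are all hypothetical stand-ins, not this crate's API; each worker filters its own chunk and reports errors per chunk, which is the error-propagation behavior `flatten()` did not give me:

```rust
use std::thread;

// Hypothetical stand-in for a decoded OSM element.
struct Element {
    id: i64,
    keep: bool,
}

// Parallel filter_map sketch: one thread per blob/chunk, each applying the
// filter independently, with errors surfaced instead of swallowed.
fn par_filter_map<F>(blobs: Vec<Vec<Element>>, f: F) -> Result<Vec<Element>, String>
where
    F: Fn(Element) -> Option<Element> + Send + Copy + 'static,
{
    // Spawn one worker per chunk; each returns its filtered subset.
    let handles: Vec<_> = blobs
        .into_iter()
        .map(|chunk| {
            thread::spawn(move || -> Result<Vec<Element>, String> {
                Ok(chunk.into_iter().filter_map(f).collect())
            })
        })
        .collect();

    // Join in spawn order, so the output order is deterministic.
    let mut out = Vec::new();
    for h in handles {
        out.extend(h.join().map_err(|_| "worker panicked".to_string())??);
    }
    Ok(out)
}

fn main() {
    let blobs = vec![
        vec![Element { id: 1, keep: true }, Element { id: 2, keep: false }],
        vec![Element { id: 3, keep: true }],
    ];
    let kept = par_filter_map(blobs, |e| if e.keep { Some(e) } else { None }).unwrap();
    let ids: Vec<i64> = kept.iter().map(|e| e.id).collect();
    println!("{:?}", ids); // [1, 3]
}
```

Note that this eagerly collects every surviving element, which is exactly the RAM caveat above: a lazy parallel iterator (e.g. via rayon) would avoid that, at the cost of the error-handling workaround described in the PR.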