Add caching of address information for Nominatim export #850
This completely reworks how the import from Nominatim works: places are now imported by country. For each country, the name information for all places that can function as an address part is loaded into memory. After that, the places of the country are read in a single query and the address information is added from the cache. This saves thousands of SQL queries for address lookups. Import time for the planet goes down from around 20h to 10h.
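For illustration, here is a minimal sketch of the idea (hypothetical class name and column selection, not the actual code of this PR): the address parts of one country are loaded into a plain in-memory map once, and every subsequent address lookup is answered from that map instead of a separate SQL query.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.HashMap;
import java.util.Map;

/** Minimal sketch of a per-country address cache (hypothetical names). */
class AddressCache {
    // place_id -> name information of a place usable as an address part
    private final Map<Long, Map<String, String>> names = new HashMap<>();

    /** Load all potential address parts of one country into memory. */
    void loadForCountry(Connection conn, String countryCode) throws Exception {
        // rank filter is illustrative only; the real query selects address-relevant rows
        String sql = "SELECT place_id, name FROM placex"
                   + " WHERE country_code = ? AND rank_address BETWEEN 5 AND 25";
        try (PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setString(1, countryCode);
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    names.put(rs.getLong("place_id"), parseHstore(rs.getString("name")));
                }
            }
        }
    }

    /** Resolve an address part from the cache instead of issuing an SQL lookup. */
    Map<String, String> get(long placeId) {
        return names.get(placeId);
    }

    private Map<String, String> parseHstore(String raw) {
        // placeholder: real code would parse the hstore 'name' column properly
        return Map.of("name", raw == null ? "" : raw);
    }
}
```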
Exporting by country requires an additional per-country index over placex, which means you need write access to the database. Alternatively, pre-create the index with

    CREATE INDEX ON placex(country_code)
As a side-effect of the country split, it is now also possible to read the database in parallel threads. Use the new `-j` option for that. The usefulness of this option is somewhat limited: writing to the ES/OS database is still single-threaded and thus a limiting factor. Also, mixing data from different countries while writing results in quite a bit of index bloat (180GB vs 200GB for a planet).
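As a rough illustration of why the writer remains the bottleneck, the sketch below (hypothetical names, not this PR's implementation) runs one reader task per country on a small thread pool, analogous to `-j`, while a single writer thread drains the shared queue.

```java
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

/** Sketch: parallel per-country readers feeding a single writer. */
public class ParallelImportSketch {
    public static void main(String[] args) throws InterruptedException {
        List<String> countries = List.of("de", "fr", "it", "es");
        int readerThreads = 2; // analogous to the new -j option

        BlockingQueue<String> queue = new ArrayBlockingQueue<>(10_000);
        ExecutorService readers = Executors.newFixedThreadPool(readerThreads);

        // Each reader task exports one country and puts its documents on the queue.
        for (String cc : countries) {
            readers.submit(() -> {
                try {
                    queue.put("document for " + cc); // placeholder for real documents
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }
        readers.shutdown();

        // A single writer drains the queue and would send documents to ES/OS;
        // no matter how many readers run, this thread limits throughput.
        while (true) {
            String doc = queue.poll(1, TimeUnit.SECONDS);
            if (doc == null && readers.isTerminated()) break; // simplistic shutdown for the sketch
            if (doc != null) System.out.println("indexing " + doc);
        }
    }
}
```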