I have tried to import the entire planet several times on very powerful VMs.
Each time there was a different problem.
Last time I got very close, and this is the log I ended up with:
2024-12-15 03:22:56: Starting indexing postcodes using 96 threads
2024-12-15 03:22:56: Starting postcodes (location_postcode) (using batch size 20)
2024-12-15 03:22:56: Done 0/0 in 0 @ 0.000 per second - FINISHED postcodes (location_postcode)
+ sudo -E -u nominatim nominatim admin --check-database
2024-12-15 03:22:57: Using project directory: /nominatim
2024-12-15 03:22:57: Checking database
Checking database connection ... OK
Checking database_version matches Nominatim software version ... OK
Checking for placex table ... OK
Checking for placex content ... OK
Checking that tokenizer works ... OK
Checking for wikipedia/wikidata data ... OK
Checking indexing status ... OK
Checking that database indexes are complete ... OK
Checking that all database indexes are valid ... OK
Checking TIGER external data table. ... OK
Freezing database
+ '[' '' '!=' '' ']'
+ '[' true = true ']'
+ echo 'Freezing database'
+ sudo -E -u nominatim nominatim freeze
2024-12-15 03:24:00: Using project directory: /nominatim
Traceback (most recent call last):
File "/usr/local/bin/nominatim", line 5, in <module>
exit(cli.nominatim(module_dir=None, osm2pgsql_path=None))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/dist-packages/nominatim_db/cli.py", line 260, in nominatim
return get_set_parser().run(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/dist-packages/nominatim_db/cli.py", line 122, in run
ret = args.command.run(args)
^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/dist-packages/nominatim_db/clicmd/freeze.py", line 40, in run
freeze.drop_update_tables(conn)
File "/usr/local/lib/python3.12/dist-packages/nominatim_db/tools/freeze.py", line 42, in drop_update_tables
drop_tables(conn, *tables, cascade=True)
File "/usr/local/lib/python3.12/dist-packages/nominatim_db/db/connection.py", line 94, in drop_tables
cur.execute(sql.format(pysql.Identifier(name)))
File "/usr/local/lib/python3.12/dist-packages/psycopg/cursor.py", line 97, in execute
raise ex.with_traceback(None)
psycopg.errors.OutOfMemory: out of shared memory
HINT: You might need to increase max_locks_per_transaction.
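From the HINT I assume PostgreSQL simply ran out of lock slots while freeze was dropping all the update tables in one transaction, and that the fix is to raise max_locks_per_transaction on the database server before retrying. Something like this is what I have in mind (the value 512 is just my guess, and the restart step depends on how PostgreSQL runs inside the container):

sudo -u postgres psql -c "ALTER SYSTEM SET max_locks_per_transaction = 512;"
# the setting only takes effect after PostgreSQL is restarted:
sudo systemctl restart postgresql    # or restart the PostgreSQL/Nominatim container

Is that the right interpretation, or is there more to it?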
The problem is that the next time I start Docker it downloads the entire .pbf again, and then the import fails to complete with yet another error...
I made a snapshot of my disk right after the first error.
Can you recommend what I should do to continue the process after the "out of shared memory" error?
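What I was hoping is possible, assuming the database on the snapshot is still intact and the lock limit has been raised, is to skip the download and the initial import entirely and only re-run the step that failed, roughly:

# after restoring the disk snapshot and raising max_locks_per_transaction:
sudo -E -u nominatim nominatim admin --check-database   # verify the imported data is still OK
sudo -E -u nominatim nominatim freeze                    # retry just the step that ran out of locks

But I don't know whether the interrupted freeze left the database in a state where this is safe, or whether the Docker image will even reuse the existing database instead of downloading the .pbf again.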