I have to make a hypertable from a table with ca. 30*10^6 rows. In addition, the Postgres table is receiving ca. 100 inserted rows/min, and this data shouldn't be lost.
What's your advice: creating a new hypertable and importing the data into it from the old table, or can the create_hypertable function be safely used directly on the existing table under the described conditions?
Hi @nando, you have a few options.
If you want full control of the migration, create a background action that inserts a limited amount per run and paginates the data move. You can order by time and always fetch records where records_from_old_table.time > max(last_record_new_table.time).
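As a sketch of that pagination, each run could copy one time-ordered batch (here `old_table`, `new_hypertable`, and the batch size of 10000 are assumptions, not names from your schema):

```sql
-- Assumes new_hypertable was already created with create_hypertable()
-- and has the same columns as old_table. Each run copies the next
-- batch of rows newer than anything already migrated.
INSERT INTO new_hypertable
SELECT o.*
FROM old_table o
WHERE o.time > COALESCE((SELECT max(time) FROM new_hypertable), '-infinity')
ORDER BY o.time
LIMIT 10000;
```

Because the source table keeps receiving inserts, you'd repeat this until the batch comes back empty, then switch writes over to the hypertable.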
Have you tried just using the migrate_data => true argument in the create_hypertable call?
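For reference, the call would look something like this (the table and time column names are placeholders for your actual schema):

```sql
-- Converts the existing table in place and moves its rows into chunks.
-- Note: this locks the table and can take a while on ~30M rows, so
-- plan a maintenance window if you go this route.
SELECT create_hypertable('my_table', 'time', migrate_data => true);
```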
I'd strongly recommend running it in a test environment before performing such a migration in production.