Hello! We are struggling to find the best way to initialize and load a new database.
I have a server completely dedicated to TimescaleDB. Initially, I need to ingest 3 years of 1 Hz data for approximately 9000 items. Things are structured so that each item has its own hypertable, and each table contains only two columns: the timestamp and the value.
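For reference, here is a minimal sketch of the per-item layout described above; the table and column names are just placeholders for illustration:

```sql
-- Hypothetical example of the current layout: one hypertable per item,
-- each with only a timestamp column and a value column.
CREATE TABLE item_0001 (
    ts    TIMESTAMPTZ      NOT NULL,
    value DOUBLE PRECISION
);
SELECT create_hypertable('item_0001', 'ts');
```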
The system:
128 GB RAM, 5 TB storage.
Within the first day of ingesting, the disk filled up because the data was not being compressed. We stopped ingesting and waited for compression to catch up; it took a very long time, but eventually everything was compressed. Today I have been experimenting with different memory settings to see whether there is a way to ingest the data and compress it at the same time.
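In case it helps, this is roughly how compression is set up on each item table; the orderby choice and the one-day policy interval here are assumptions for illustration, not necessarily what we are running:

```sql
-- Hypothetical compression setup on one of the item hypertables.
ALTER TABLE item_0001 SET (
    timescaledb.compress,
    timescaledb.compress_orderby = 'ts'
);
-- Compress chunks once they are a day old (the interval is an assumption).
SELECT add_compression_policy('item_0001', INTERVAL '1 day');
```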
My question is: what is the best strategy to get all 3 years of data into the database and compressed?
Thanks!