I am wondering how VACUUM and ANALYZE are executed on a compressed hypertable that holds north of 100B rows.
When I run `VACUUM ANALYZE hypertable_name`, is VACUUM (or ANALYZE) executed for every uncompressed and compressed chunk? Do compressed chunks need to be decompressed first and then re-compressed?
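For reference, here is the kind of per-chunk variant I am curious about — just a rough sketch, assuming TimescaleDB's `show_chunks()` and psql's `\gexec`, with `hypertable_name` standing in for my actual table:

```sql
-- Sketch: generate one VACUUM ANALYZE statement per chunk and run
-- them with psql's \gexec (assumes TimescaleDB's show_chunks()).
SELECT format('VACUUM ANALYZE %s', chunk)
FROM show_chunks('hypertable_name') AS chunk
\gexec
```

Would running it chunk by chunk like this behave any differently from vacuuming the hypertable directly?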
Is there any way to predict how long I should expect the VACUUM and ANALYZE commands to run?
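For instance, is per-chunk size a reasonable proxy for runtime? Something like the query below is what I have in mind (assuming the `chunks_detailed_size()` function from TimescaleDB 2.x; `hypertable_name` is a placeholder):

```sql
-- Sketch: list the largest chunks as a rough proxy for how long
-- per-chunk VACUUM/ANALYZE might take (chunks_detailed_size() is
-- a TimescaleDB 2.x function).
SELECT chunk_schema, chunk_name,
       pg_size_pretty(total_bytes) AS total_size
FROM chunks_detailed_size('hypertable_name')
ORDER BY total_bytes DESC
LIMIT 10;
```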
I would love to better understand how these typical maintenance routines are executed on hypertables (assuming we already know how they work on normal tables).