Performance impact after 2 years of data

Hi,

we’re experiencing performance issues with the following query against a hypertable.

select time from data where foreign_id = :uuid order by time desc limit 1;

The query used to complete in under 2 s, but after the table passed 2 years of data we noticed a significant slowdown.
The hypertable currently has 12 696 958 rows and the query now takes around 24 s. Query execution plan: baseline_1 | explain.dalibo.com.

After removing the 151 758 oldest records (reducing the total count to 12 545 200), so that the data spans just under 2 years, the same query completes in under 1 s. Query execution plan for the same query: test_1 | explain.dalibo.com.
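For reference, if the old records were deleted with a plain DELETE, TimescaleDB’s drop_chunks is usually a much cheaper way to retire old data, since it drops whole chunk tables instead of deleting rows. A sketch using the TimescaleDB 1.x signature, assuming the hypertable is named data (as in the query above):

```sql
-- Drop all chunks of "data" whose data is entirely older than 2 years.
-- TimescaleDB 1.x signature: drop_chunks(older_than, table_name).
-- (In TimescaleDB 2.x this becomes: SELECT drop_chunks('data', INTERVAL '2 years');)
SELECT drop_chunks(INTERVAL '2 years', 'data');
```

Note that drop_chunks only removes chunks that fall completely outside the retention window, so the cutoff is approximate to the chunk boundary.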

Any ideas on what could be causing the slow down or ways to fix this would be helpful.

PostgreSQL: 11.16
TimescaleDB: 1.7.4

Hello,

you are using a very old, unsupported version of TimescaleDB. I would suggest upgrading to PG14 and the latest TimescaleDB version. You also seem to have far too many chunks for the amount of data you have; I recommend increasing your chunk_time_interval. This query should return in milliseconds, not seconds.
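Increasing the interval can be done in place; a sketch, assuming the hypertable is named data and an interval of 1 month is appropriate for your ingest rate:

```sql
-- Inspect current chunking: count chunks and check their time ranges.
SELECT count(*) FROM timescaledb_information.chunks
WHERE hypertable_name = 'data';  -- view name per TimescaleDB 2.x

-- Widen the interval used for chunks created from now on.
SELECT set_chunk_time_interval('data', INTERVAL '1 month');
```

Note that set_chunk_time_interval only affects newly created chunks; existing chunks keep their original boundaries, so the benefit appears as new data arrives (or after old chunks are dropped). A common rule of thumb is to size chunks so that recent chunks and their indexes fit in memory.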