The compression job is always running

I am running PostgreSQL 16 with TimescaleDB 2.14.1.
I use this database for Zabbix.
The PostgreSQL log shows:
2024-12-25 17:34:18 CST [3213967]: [5171-1] user=,db=,app=,client= LOG: 00000: finished compressing 1654 rows from "_hyper_10_2510_chunk"
2024-12-25 17:34:18 CST [3213967]: [5172-1] user=,db=,app=,client= CONTEXT: SQL statement "SELECT _timescaledb_functions.recompress_chunk_segmentwise(chunk_rec.oid)"
PL/pgSQL function _timescaledb_functions.policy_compression_execute(integer,integer,anyelement,integer,boolean,boolean,boolean) line 88 at PERFORM
SQL statement "CALL _timescaledb_functions.policy_compression_execute(
job_id, htid, lag_value::INTEGER,
maxchunks, verbose_log, recompress_enabled, use_creation_time
)"
PL/pgSQL function _timescaledb_functions.policy_compression(integer,jsonb) line 70 at CALL
2024-12-25 17:34:18 CST [3213967]: [5173-1] user=,db=,app=,client= LOCATION: row_compressor_append_sorted_rows, compression.c:1036
This process never finishes; it has been running for five days now.
How can I skip this chunk "_hyper_10_2510_chunk"?
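What I have in mind is something like the following, based on the standard TimescaleDB job and compression APIs; this is only a sketch, and the job id (1000) is a placeholder that would have to be looked up first:

-- Find the compression policy job for the hypertable.
SELECT job_id, hypertable_name
FROM timescaledb_information.jobs
WHERE proc_name = 'policy_compression';

-- Pause the policy so it stops retrying the problem chunk
-- (replace 1000 with the job_id returned above).
SELECT alter_job(1000, scheduled => false);

-- Optionally, try compressing the chunk manually to see any error directly.
SELECT compress_chunk('_timescaledb_internal._hyper_10_2510_chunk');

-- Re-enable the policy afterwards.
SELECT alter_job(1000, scheduled => true);

Is this the right approach, or is there a way to make the policy skip a single chunk?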
Thank you very much