Pg_dump for hypertable not working

Hi,

Trying to get a data export for a 30 GB hypertable using pg_dump:

postgres version 14.13
timescale version 2.14.2

Currently failing with the error below:

pg_dump -Fc -v -t mytab_v4 -f mytab_v4.dmp

pg_dump: NOTICE:  hypertable data are in the chunks, no data will be copied
DETAIL:  Data for hypertables are stored in the chunks of a hypertable so COPY TO of a hypertable will not copy any data.
HINT:  Use "COPY (SELECT * FROM <hypertable>) TO ..." to copy all data in hypertable, or copy each chunk individually.

I know I could use \COPY, but given the size of the table that's suboptimal. Any clues/help on fixing the above error?
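
For reference, the psql form of what the hint suggests would be something like this; the output file name and CSV options here are just my own choices, not anything from the hint:

  \copy (SELECT * FROM mytab_v4) TO 'mytab_v4.csv' WITH (FORMAT csv, HEADER)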

Consider the second option :point_up:

You can use show_chunks and then COPY each chunk one by one.
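
For example, something like this (the chunk name below is just an illustration of what show_chunks returns; yours will differ):

  -- list the chunks backing the hypertable
  SELECT show_chunks('mytab_v4');

  -- then export each returned chunk, e.g.:
  \copy (SELECT * FROM _timescaledb_internal._hyper_1_1_chunk) TO 'mytab_v4_chunk_1.csv' WITH (FORMAT csv, HEADER)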

You can also use the chunk ranges from the timescaledb information views to copy them by range, so you just open and export one range at a time.
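
Roughly like this, assuming your time dimension column is called time (the dates are placeholders; timescaledb_information.chunks shows the range covered by each chunk):

  SELECT chunk_name, range_start, range_end
  FROM timescaledb_information.chunks
  WHERE hypertable_name = 'mytab_v4'
  ORDER BY range_start;

  -- then export one range at a time:
  \copy (SELECT * FROM mytab_v4 WHERE time >= '2024-01-01' AND time < '2024-02-01') TO 'mytab_v4_2024_01.csv' WITH (FORMAT csv, HEADER)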

Am I right in thinking pg_dump and hypertables don't mix well?
Currently, my rather lengthy COPY command is still running; if it fails, I'll give your suggested method a go.
Thanks for the response.

You can use dump and restore, but you'd need to follow these instructions as there are some extra steps.
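
As a rough sketch, the flow in those instructions dumps the whole database rather than a single hypertable and wraps the restore in the pre/post restore functions (mydb is a placeholder; the linked docs are the source of truth):

  pg_dump -Fc -f mydb.dump mydb

  # on the target, with the timescaledb extension already installed at the same version:
  psql -d mydb -c "SELECT timescaledb_pre_restore();"
  pg_restore -Fc -d mydb mydb.dump
  psql -d mydb -c "SELECT timescaledb_post_restore();"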

For loading the data back, the timescaledb-parallel-copy tool can also help load the CSV faster than single-threaded methods.
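
Roughly something like this (database name, worker count and file name are just placeholders; check timescaledb-parallel-copy --help for the exact flags in your version):

  timescaledb-parallel-copy \
    --db-name mydb \
    --table mytab_v4 \
    --file mytab_v4.csv \
    --workers 4 \
    --copy-options "CSV"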

In case anybody else gets stuck: I think my issue was running pg_dump from my local machine, which runs macOS rather than Linux.

In the end, what I did was:

  1. Boot up a Linux VM
  2. Used the Timescale backfill utility - I only needed to grab 2 hypertables.
    Timescale documentation

The above worked. Although I never tested pg_dump on the Linux VM, I'm fairly confident it would have worked; when I get a chance to try it, I'll update the thread.

Thanks for the suggestions, Jonatasdp.

Welcome, thanks for your proactivity @parminder.garcha :nerd_face: