Hi Team,
What is the JSON data size limit I can store in TimescaleDB?
I have files in the 15–20 MB range. What is the best way to store them?
The maximum JSONB document size in PostgreSQL is about 255 MB per document ([1]). However, I don’t think I’d store that much data in a hypertable. While it probably compresses pretty well, it feels wrong. What is the document, what does it look like, and is there some “main information” and some “metadata” that could be separated into a time-series table and a metadata (additional information) table? The split would be something like “data that is commonly queried” versus “data that is only queried when a special occasion happens”. Remember, TimescaleDB is built on PostgreSQL, which means you can do JOINs as you like.
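As a rough sketch of that split (all table and column names below are made up for illustration): the commonly queried values go into a hypertable, and the large, rarely needed JSON lives in a plain metadata table you JOIN against when needed.

```sql
-- Hypothetical example: frequently queried readings in a hypertable
CREATE TABLE sensor_readings (
    time      TIMESTAMPTZ NOT NULL,
    device_id INTEGER     NOT NULL,
    value     DOUBLE PRECISION
);
SELECT create_hypertable('sensor_readings', 'time');

-- The big, rarely queried JSON payload lives in a regular table
CREATE TABLE device_metadata (
    device_id INTEGER PRIMARY KEY,
    details   JSONB   -- the large "additional information" document
);

-- JOIN the two only when the "special occasion" query needs both
SELECT r.time, r.value, m.details->'calibration' AS calibration
FROM sensor_readings r
JOIN device_metadata m USING (device_id)
WHERE r.time > now() - INTERVAL '1 day';
```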
[1] Eresh Gorantla, “Postgres JSONB Usage and Performance Analysis”, Geek Culture, Medium.
I have 100–200 JSON files, and each file is about 15 MB. When I tried to store one in the DB, I couldn’t see the file’s data using a SELECT query. That’s why I’m asking what the better way to store it in the DB is. Can we store it as a BLOB? The files contain data-analysis values as key:value pairs.
100-200 entries isn’t a use case for TimescaleDB, which, as a time-series database, is designed for billions and trillions of time-related entries.
Tbh, I’ve never tried to store such big JSON documents in PostgreSQL, but you may be better off with a document database such as MongoDB.
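That said, if it really is just a few hundred big documents, a plain PostgreSQL table with a JSONB column can hold 15 MB files; PostgreSQL TOASTs (compresses and stores out-of-line) large values automatically. A minimal sketch, with made-up table and file names:

```sql
-- Hypothetical example: one row per JSON file
CREATE TABLE json_documents (
    id          SERIAL PRIMARY KEY,
    file_name   TEXT,
    uploaded_at TIMESTAMPTZ DEFAULT now(),
    payload     JSONB   -- the whole 15 MB document
);

-- Loading a file from psql (client side):
--   \set content `cat /path/to/file.json`
--   INSERT INTO json_documents (file_name, payload)
--   VALUES ('file.json', :'content'::jsonb);

-- Then pull out individual keys instead of selecting the whole payload:
SELECT payload->>'some_key'
FROM json_documents
WHERE file_name = 'file.json';
```

Selecting single keys with `->` / `->>` is also why a plain `SELECT *` on a 15 MB document can look unusable in a client: it’s usually better to query the specific values you need.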