
Any best practices for frequent bulk import?

Open bbigras opened this issue 3 years ago • 2 comments

I would like to import data frequently. About every 5 minutes.

I would like to track when the data changes, but without filling up transaction_log and table_event_log.

Any ideas?

bbigras avatar May 09 '22 19:05 bbigras

I don't get why you want changes logged in row_log, but not the other two log tables (which contain only transaction metadata). Maybe you can explain your use case? What problems do you face? Is pgmemento slowing down these imports too much?

FxKu avatar May 19 '22 08:05 FxKu

I want to track changes from another database where I can't make any changes.

My idea is to import that data every 5 minutes into a PostgreSQL database with pgMemento enabled to be able to track the changes.
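One way to keep the log volume down with that kind of import is to upsert only rows that actually changed, so unchanged rows never fire pgMemento's triggers and never produce log entries. A rough sketch (the table and column names here are made up for illustration):

```sql
-- Hypothetical target table my_schema.readings(id, value), audited by
-- pgMemento, refreshed every 5 minutes from a staging copy of the
-- external data. The IS DISTINCT FROM guard skips the UPDATE entirely
-- when nothing changed, so no audit triggers fire for those rows.
INSERT INTO my_schema.readings (id, value)
SELECT id, value
FROM staging.readings
ON CONFLICT (id) DO UPDATE
  SET value = EXCLUDED.value
  WHERE readings.value IS DISTINCT FROM EXCLUDED.value;
```

With this pattern, each 5-minute run only adds transaction/event metadata for runs where something actually differed.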

> Is pgmemento slowing down these imports too much?

Probably not.

I'm only concerned that all those inserts/updates would fill up transaction_log and table_event_log, and I'm assuming that would not be ideal, mainly because of the disk space they'd take.

I might be wrong. Also, maybe I could just prune those logs from time to time.
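A rough pruning sketch, in case it helps (the column names are assumptions based on recent pgMemento schemas, so verify them against your version before running anything like this):

```sql
-- Assumed schema: pgmemento.table_event_log.transaction_id references
-- pgmemento.transaction_log.id, and transaction_log has a timestamp
-- column (txid_time here). Drops transaction metadata older than 30
-- days while leaving row_log untouched.
DELETE FROM pgmemento.table_event_log e
USING pgmemento.transaction_log t
WHERE e.transaction_id = t.id
  AND t.txid_time < now() - interval '30 days';

DELETE FROM pgmemento.transaction_log
WHERE txid_time < now() - interval '30 days';
```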

bbigras avatar May 27 '22 14:05 bbigras