pyznap
Retention policy on backup server
Hi, I use pyznap only on the remote site, to pull snapshots from the production server to the backup server. Everything is OK except one thing: deleting snapshots from the dest/backup server. On the source server everything is fine.
My config:

```
# You can also take snapshots on a remote and pull snapshots from there
[ssh:22:[email protected]:default/data]
frequent = 4
hourly = 24
daily = 7
snap = yes
clean = yes
dest = default/data
compress = lz4
```
How do I keep the same retention on the backup server as on the source?
You should create a second policy for your backup server, something like:
[default/data]
frequent = 4
hourly = 24
daily = 7
snap = no
clean = yes
You can also retain snapshots for longer, e.g. have daily = 30
on your backup server.
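For example, a backup-server policy with longer retention could look like this (same dataset name as above; the `daily = 30` is just the illustration mentioned, adjust to taste):

```
[default/data]
frequent = 4
hourly = 24
daily = 30
snap = no
clean = yes
```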
Works like a charm, that's quite simple and logical :) Thanks. Another question: is there a way to change the daily snap time from 00:00 to 05:00 AM?
There is no easy way, no. You could have two different config files, one for frequent+hourly snaps and one for the daily snaps. Then you run the daily one only once a day at 5am, e.g. with a cron job like
0 5 * * * root pyznap --config /path/to/daily.conf snap
But that is a bit complicated...
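If you did go that route, the crontab could look something like this (both config paths are placeholders, and the 15-minute schedule for the frequent/hourly config is just an example):

```
# frequent + hourly snapshots, taken every 15 minutes
*/15 * * * * root pyznap --config /path/to/frequent.conf snap

# daily snapshots, taken at 5am only
0 5 * * * root pyznap --config /path/to/daily.conf snap
```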
Okay, I'll let it stay at 00:00.

One last question: in my case I have two datasets to snapshot, default/data and default/data_new.

I see this in the log:
```
Nov 24 20:45:01 INFO: Starting pyznap...
Nov 24 20:45:01 INFO: Taking snapshots...
Nov 24 20:45:01 INFO: Starting pyznap...
Nov 24 20:45:01 INFO: Sending snapshots...
Nov 24 20:45:02 INFO: Taking snapshot [email protected]:default/data@pyznap_2022-11-24_20:45:02_frequent...
Nov 24 20:45:02 ERROR: Error while opening source [email protected]:default/data: ''...
Nov 24 20:45:03 INFO: Taking snapshot [email protected]:default/data_new@pyznap_2022-11-24_20:45:03_frequent...
Nov 24 20:45:03 ERROR: Error while opening source [email protected]:default/data_new: ''...
Nov 24 20:45:03 INFO: Cleaning snapshots...
Nov 24 20:45:03 INFO: Finished successfully...
```
How do I snap two different datasets? Use two different configs and cron jobs? Or maybe snap & send all of default/? But there are some datasets I don't want to back up; should I then use exclude?
In that case it's best to have two different policies. Since the settings in the config are recursive, you could do something like this:
[ssh:22:[email protected]:default]
frequent = 4
hourly = 24
daily = 7
snap = no <-- important! It will not take snapshots for this dataset
clean = yes
[ssh:22:[email protected]:default/data]
snap = yes <-- overwrite the "no" from above, so we take snapshots for this dataset with the periods given above
dest = default/data
[ssh:22:[email protected]:default/data_new]
snap = yes
dest = default/data_new
[default]
frequent = 4
hourly = 24
daily = 7
snap = no
clean = yes
This should set the snapshot frequency for all child datasets of `default`, while having `snap = no` for the root dataset and then overriding it for the child datasets. I think this should work, though you should test it to be sure.
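To make the inheritance logic concrete, here is a small illustration (this is NOT pyznap's actual code, just a sketch of the "settings are recursive, children override parents" behavior described above, using Python's stdlib `configparser`):

```python
import configparser

# A simplified version of the config above: the root section sets the
# periods and snap = no, a child section flips snap back to yes.
CONF = """
[default]
frequent = 4
hourly = 24
daily = 7
snap = no
clean = yes

[default/data]
snap = yes
"""

def resolve(cfg, dataset, key):
    """Return the value for `key`, taken from the most specific
    config section whose name is a prefix of `dataset`."""
    parts = dataset.split("/")
    for i in range(len(parts), 0, -1):
        section = "/".join(parts[:i])
        if cfg.has_section(section) and cfg.has_option(section, key):
            return cfg.get(section, key)
    return None

cfg = configparser.ConfigParser()
cfg.read_string(CONF)

print(resolve(cfg, "default/data", "snap"))    # overridden by the child section
print(resolve(cfg, "default/data", "daily"))   # inherited from [default]
print(resolve(cfg, "default/other", "snap"))   # inherited from [default]
```

So `default/data` gets snapshots with the periods from the root section, while any other child of `default` keeps `snap = no`.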
Also, btw, it might actually be better to have pyznap running on the prod server as well, simply to take snapshots, and then on the remote only pull the snapshots. The way you have it set up now, you have to run pyznap remotely over SSH, which can be a bit slow, though everything should work as expected.
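A sketch of that split, using the `snap` and `send` subcommands already seen in this thread (the schedules and log path are placeholders, and each machine would have its own config):

```
# crontab on the prod server: take snapshots locally
*/15 * * * * root pyznap snap >> /var/log/pyznap.log 2>&1

# crontab on the backup server: only pull/clean snapshots
0 * * * * root pyznap send >> /var/log/pyznap.log 2>&1
```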