I have an application which makes automatic daily backups at 11pm into a directory dirA. Each backup is a single .tar file with an apparently random name that I have no control over (e.g. 129d3139.tar, 4a98bb6b.tar, etc.).
Since dirA resides on a storage medium with very limited space, I have a cron job five minutes later, at 11:05pm, which deletes all backups except the most recent one. The bash script "clear_old_backups.sh" looks like this and works fine (thanks to this answer):
#!/usr/bin/env bash
# List each file with its modification time, sort oldest first,
# drop the last (newest) entry, and delete everything else.
find /path/to/dirA/ -type f -printf '%T@\t%p\n' |
sort -t $'\t' -g |
head -n -1 |
cut -d $'\t' -f 2- |
xargs -r rm
Now, in addition, I would like to keep some older archived backups on a second, bigger storage medium, in a directory dirB. Ideally, inside dirB I would like to have four files: one 1 day old (just a direct duplicate of the single backup stored in dirA), and additionally one each 1 week, 1 month, and 6 months old.
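From what I can tell, find can already select files by age on its own, which seems relevant here; for instance

find /path/to/dirB -maxdepth 1 -name '*.tar' -mtime +6

should list anything in dirB last modified more than 6 full 24-hour periods ago, i.e. at least a week old.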
Can anyone help with a bash script which will copy and keep the relevant backup files?
I am not sure where to begin: do I need to check much more regularly whether a file's timestamp has reached "now minus 7 days", etc., and only then act accordingly? My rough idea so far is sketched below.
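What I have in mind is a single daily cron job, say at 11:10pm, which copies the surviving backup into dirB with a date-stamped name, then thins dirB by age bucket: keep the oldest file in each of the under-a-week, under-a-month and under-six-months buckets (so copies gradually "graduate" into the next bucket), keep the youngest file older than six months, and delete the rest. A completely untested sketch, with placeholder paths:

#!/usr/bin/env bash
set -euo pipefail

src=/path/to/dirA          # placeholder
dst=/path/to/dirB          # placeholder
now=$(date +%s)

# Copy tonight's surviving backup into dirB, date-prefixed so the
# random names at least sort chronologically.
newest=$(find "$src" -type f -name '*.tar' -printf '%T@\t%p\n' |
         sort -g | tail -n 1 | cut -f 2-)
[[ -n "$newest" ]] || exit 0    # nothing to copy tonight
cp -p -- "$newest" "$dst/$(date +%F)_$(basename -- "$newest")"

# Thin dirB, walking its files oldest first and skipping the newest
# one (today's "1 day old" duplicate, which always stays).
declare -A kept
while IFS=$'\t' read -r mtime path; do
    age=$(( (now - ${mtime%.*}) / 86400 ))   # age in whole days
    if   (( age >= 180 )); then bucket=halfyear
    elif (( age >= 30 ));  then bucket=month
    elif (( age >= 7 ));   then bucket=week
    else                        bucket=day
    fi
    if [[ $bucket == halfyear ]]; then
        # Keep only the youngest file over six months old, so the
        # archive does not grow without bound.
        if [[ -n ${kept[$bucket]:-} ]]; then
            rm -- "${kept[$bucket]}"
        fi
        kept[$bucket]=$path
    elif [[ -z ${kept[$bucket]:-} ]]; then
        kept[$bucket]=$path                  # oldest in bucket: keep
    else
        rm -- "$path"                        # younger duplicate: drop
    fi
done < <(find "$dst" -maxdepth 1 -type f -name '*.tar' -printf '%T@\t%p\n' |
         sort -g | head -n -1)

Does something along these lines hold up, or is there a more standard way to rotate backups like this?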
OS is Debian 12.