Forum Discussion
ArchPrime
Nov 16, 2016 - Guide
Culling old archive files automatically
Hi, I have set up my RN102 as an FTP server to receive a rolling daily backup of my websites, with backups generated and sent from the host webserver using Xcloner. The problem is that there is n...
ArchPrime
Nov 17, 2016 - Guide
Hi, thanks JennC. The Xcloner part is really only background info - basically, it is a source of recurring daily backup archives that get sent to the NAS via FTP.
What I am looking for is an app or scripting method that the NAS can run to ensure old archives (beyond a certain number of days) get deleted, to make way for more recent incoming archives.
StephenB
Nov 18, 2016 - Guru - Experienced User
I don't know of any add-on that will do what you want.
You could set up a pair of backup jobs. One would copy the xcloner destination to a second share (deleting the contents of the second share first). The second backup job would clear the xcloner destination (and would run the day after the first one, but before xcloner runs). If you run those jobs every two weeks, your retention would vary from 2 to 4 weeks.
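The two-job rotation described above can be sketched in plain shell for illustration (the paths here are hypothetical placeholders, and on a real ReadyNAS the built-in backup jobs would do this without any scripting):

```shell
#!/bin/sh
# Illustrative sketch only: the same rotation StephenB describes via the
# admin UI, expressed as two shell "jobs". Paths are made-up examples.
CURRENT=/tmp/xcloner_demo/current    # share where Xcloner uploads arrive
PREVIOUS=/tmp/xcloner_demo/previous  # second share holding the older set

mkdir -p "$CURRENT" "$PREVIOUS"
touch "$CURRENT/site-backup-1.zip"   # stand-in for a real archive

# Job 1: replace the second share with a copy of the current set
rm -rf "$PREVIOUS"
cp -a "$CURRENT" "$PREVIOUS"

# Job 2 (scheduled the following day, before Xcloner runs): clear the
# current share so new uploads start a fresh cycle
rm -rf "$CURRENT"
mkdir -p "$CURRENT"

ls "$PREVIOUS"
```

Run fortnightly, each archive survives in the second share until the next rotation overwrites it, which is where the 2-to-4-week retention window comes from.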
- ArchPrime - Nov 18, 2016 - Guide
Thanks Stephen
Sounds interesting. If you can forgive my ignorance, by 'backup jobs', do you mean cron jobs? (i.e. of the kind that Xcloner creates and the webhost runs?)
I hate to ask, but how does one get the RN102 to run cron jobs? And is there a cron script I could copy that would do what I am trying to achieve?
Or does RN offer some other type of automatic 'job' process?
Cheers
Paul
** EDIT - ok, I think I have found info on the system you are talking about. Something to try!
Even better would be a way that is less reliant on so many automated processes running to a schedule. For example, I can foresee a likely situation in which the Xcloner backup files stop coming through while the ReadyNAS backup jobs continue to process those two locations, eventually destroying all my backups unless I notice and intervene in time (I might be away on holiday, for example). I would love to find a 'smart' procedural deletion method that only deletes what it detects has been superseded.
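One hedged sketch of the "superseded-only" idea raised here: cull old archives only when a fresh archive has actually arrived recently, so a stalled upload pipeline halts the deletions rather than emptying the share. The directory and ages below are made-up examples, not anything from the thread:

```shell
#!/bin/sh
# Sketch: delete archives older than KEEP_DAYS, but ONLY if at least one
# file modified within the last day exists (i.e. backups are still arriving).
DIR=/tmp/archive_demo
KEEP_DAYS=7

mkdir -p "$DIR"
# Demo data: one fresh archive and one ten-day-old archive (by mtime)
touch "$DIR/new-backup.zip"
touch -d "10 days ago" "$DIR/old-backup.zip" 2>/dev/null \
    || touch -t 202001010000 "$DIR/old-backup.zip"

# Guard: is anything newer than one day present?
if find "$DIR" -maxdepth 1 -type f -mtime -1 | grep -q .; then
    # Yes - safe to cull anything older than the retention window
    find "$DIR" -maxdepth 1 -type f -mtime +"$KEEP_DAYS" -delete
fi

ls "$DIR"
```

If uploads stop, the guard fails and nothing is deleted, which addresses the holiday scenario above.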
- StephenB - Nov 18, 2016 - Guru - Experienced User
ArchPrime wrote:
Sounds interesting. If you can forgive my ignorance, by 'backup jobs', do you mean cron jobs? (i.e. of the kind that Xcloner creates and the webhost runs?)
The NAS has a built-in backup facility (on the admin UI there is a "backup" tab). I was suggesting that you could use that - it is perhaps close enough to what you need. It has the benefit of not requiring SSH access or modification to the normal NAS installation.
- ArchPrime - Nov 18, 2016 - Guide
Thanks Stephen. Looks like you were replying just as I was editing my previous reply.
What worried me was the potential for the setup as proposed to continue to destroy old backups even when new ones have stopped coming through...
Is there a scripting language that allows for a procedural approach on ReadyNAS - for example, to delete the oldest files only when there are more than 10 files in a directory (or similar)?
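The count-based rule asked about here can be sketched in shell (the directory is a made-up example, and archive filenames are assumed to contain no spaces, since this parses `ls` output):

```shell
#!/bin/sh
# Sketch: keep the KEEP newest files in DIR, delete the rest; if KEEP or
# fewer files exist, nothing is deleted. Filenames assumed space-free.
DIR=/tmp/count_demo
KEEP=10

mkdir -p "$DIR"
# Demo data: 12 archives with distinct, increasing timestamps
for i in $(seq 1 12); do
    touch -t "2024010100$(printf '%02d' "$i")" "$DIR/backup-$i.zip"
done

# ls -t lists newest first; tail skips the KEEP newest, leaving only
# the surplus oldest files, which are then removed
ls -t "$DIR" | tail -n +"$((KEEP + 1))" | while read -r f; do
    rm -- "$DIR/$f"
done

ls "$DIR" | wc -l
```

Dropped into a daily cron job pointed at the FTP share, this would hold the directory at the ten newest archives, and a stalled Xcloner feed would simply leave those ten untouched.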