For my servers, I use the weekly system backup (not the admin/reseller backup) – because I like the way it works: clean and complete.
Well, in the past I had some problems with users having a HUGE amount of files in their directories, and I mean HUGE (2+ million files in their whole homedir) – that made the backup cron run very, very slowly, sometimes taking more than 6 hours for a total of 300 GB of data to be backed up.
And … I thought about something. What if I could set up the sysbk utility to check the number of files before the backup (find $dir -type f | grep -c .) and skip that user entirely if he has more than 100,000 files in his whole dir? I also thought about some capacity limit (for example: if the user has more than 10 GB stored, then do not back him up).
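The kind of check I have in mind would look something like this – just a sketch of the threshold logic itself, not an actual sysbk or DirectAdmin hook (the script name, variables, and limits are all my own invention):

```shell
#!/bin/sh
# Hypothetical pre-backup filter: decide whether a homedir should be
# backed up, based on file count and total size. The thresholds below
# match the limits described above (100,000 files / 10 GB).

MAX_FILES=100000     # skip users with more than 100,000 files
MAX_KB=10485760      # skip users storing more than 10 GB (in KB)

should_backup() {
    dir="$1"
    # Count regular files in the whole homedir
    files=$(find "$dir" -type f | wc -l)
    # Total disk usage in kilobytes
    kb=$(du -sk "$dir" | awk '{print $1}')
    if [ "$files" -gt "$MAX_FILES" ] || [ "$kb" -gt "$MAX_KB" ]; then
        echo "skipping $dir ($files files, ${kb} KB)"
        return 1
    fi
    return 0
}
```

The open question is whether DA's cron-driven system backup exposes any hook point where a check like this could be wired in.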
What do you think? Does DA give me any possibility to do this?
I’ve read the tutorials, but I still have no clue how to do this.
I know how to do it for a user-initiated backup; my question is about the automated (cron) system backup.
Thank you in advance.