OK, We're Back!

Story Text:

It never rains but it pours, eh? We've been down all morning - not sure what caused it yet, but there were several badly corrupted tables in the database.

It looks like we have everything restored now, and we weren't forced to roll back a couple of days as I'd feared.

Many thanks to Brian for spending all his time on the phone with the sys admin this morning.

Normal service will now be resumed :)


darn technology

I might just go back to pen and paper :)

Those darn MySQL tables are

Those darn MySQL tables are always messing my sites up too. Although Nick, your little updates on the blocking page were very well done.

well done gents

well done gents

Yes it's a pain when that

Yes, it's a pain when that happens - mending tables on the command line was one of the best things I ever learnt. Good to have you back, Nick!

Nick, if you have ...

Nick, if you have your own server (or virtual private server) and can run cron, I've written a cron script for automatic mysql backups. Let me know if you're interested.

Thanks Diane, I've got the

Thanks Diane, I've got the script, and can write a cron entry - what I really need to do is this:

  • Run the cron hourly (can do)
  • Have that call a script to back up the db to a directory (can do)
  • Delete backups older than, say, 5 hrs so they don't pile up and eat megabytes (can't do)
  • scp once a day to my home machine (can do)

At present the cron line looks like this:

10 * * * * ~/bin/backup

and the ~/bin/backup

/usr/bin/mysqldump -u user_name -pPassWord --opt db_name | bzip2 -f > /home/wilson/backups/`date +%Y%m%d%H%M%S`.sql.bz2

which outputs files named by timestamp (YYYYMMDDHHMMSS.sql.bz2).
Would anyone know how to work out which files are older than 5 hrs, or do something similar to that? Or even improve on the system?


Same question here

My backups have dates in the file name; I manually remove older ones, but am thinking there's got to be a more automated method.

(Glad you're okay, Nick.)

Yup I can help - I recommend

Yup I can help - I recommend Reo Back

It's bloody old and doesn't offer SCP out of the box, but with a little juggling using Net::SCP it isn't a hard job to enable SCP copying, and the age is no problem IMHO.

it works and works well

Thanks Jason, I'll have a

Thanks Jason, I'll have a look - but I don't have root...

Makes life harder without

Makes life harder without root, but it should still run as a user cron job - or I think it will.

As a side question and possible answer: have you thought about DB replication rather than physical DB dumps?

I'm not sure what it is, but

I'm not sure what it is, but I'd like the backups to be OFF of the remote server, you know? Better able to survive a real disaster...

I am talking about copying

I am talking about copying in real time / replicating the TW MySQL server with a remote MySQL server (in your personal network maybe)

Sincere apologies to all as

Sincere apologies to all as well - Threadwatch was down longer than it should have been, partly due to trying to ensure the most recent non-corrupted backup was used, but also because a communications error held things up longer than it should have.

Anyway, there were already remote weekly backups, but now there are also full daily system backups on a SCSI hard-drive configuration. This should mean that in the event of anything similar happening again, the entire restore process will be much, much faster.

Also thanks to Nick for not losing his cool - having a site down can be a distressing event at the best of times, but we learned from this, and everything should run smoother and better for it.

My apologies again for the disruption caused.

I am talking about copying

I am talking about copying in real time / replicating the TW MySQL server with a remote MySQL server (in your personal network maybe)

That might work well, I could certainly set that up - is there a command for this? Does mysqldump do it?
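For what it's worth, mysqldump alone won't do it - it only takes point-in-time snapshots. Replication is set up via my.cnf plus a couple of SQL statements; a rough sketch follows, where the host name and credentials are placeholders and exact steps vary by MySQL version (you also need an initial data snapshot and the master's binlog coordinates):

```
# On the master, in my.cnf - enable the binary log and give the
# server a unique id, then restart mysqld:
[mysqld]
server-id = 1
log-bin   = mysql-bin

# On the slave, in my.cnf:
[mysqld]
server-id = 2

-- On the master (SQL): create a user the slave can replicate as
GRANT REPLICATION SLAVE ON *.* TO 'repl_user'@'%' IDENTIFIED BY 'repl_pass';

-- On the slave (SQL): point it at the master and start replicating
CHANGE MASTER TO
  MASTER_HOST='master.example.com',
  MASTER_USER='repl_user',
  MASTER_PASSWORD='repl_pass';
START SLAVE;
```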

Thanks Brian!

Thanks Brian!

Guess who wrote this

Guess who wrote this

It ain't hard and this ain't a perfect presentation, but it gives the principles.

hehe! Good ole Jeremy - it's

hehe! Good ole Jeremy - it's this evening's task, Jason, cheers!

Deleting files that are older than X days/mins...

...look into the "find" command using the "-ctime" parameter with the "-exec" option tacked onto the end. There are examples on the web.

An example to delete files after a day:

find "/home/backups/" -ctime +1 -name "*.bak" -exec rm '{}' \;

Just add a similar command to your cron or your shell script/perl script file that is called within your cron.

Note: The "\;" is required at the end.

Also Note: Some flavors of Unix support "-cmin" to allow finer time control.
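With that minute granularity, the five-hour case from earlier in the thread would look roughly like this sketch (GNU find's "-mmin" tests modification time in minutes; the temp directory and file names here are just for demonstration - substitute your real backup directory):

```shell
# Demo: prune backups more than 5 hours (300 minutes) old.
# -mmin +300 matches files whose modification time is over 300
# minutes ago; use -cmin instead to test inode-change time.
BACKUPDIR=$(mktemp -d)
touch -d "6 hours ago" "$BACKUPDIR/20050310040001.sql.bz2"   # stale
touch "$BACKUPDIR/20050310100001.sql.bz2"                    # fresh
find "$BACKUPDIR" -name "*.sql.bz2" -mmin +300 -exec rm '{}' \;
ls "$BACKUPDIR"   # only the fresh backup remains
```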

The "find" command's "-exec" option can be used with other Unix commands as well.

Trivia (in a way): To perform a command (say an "rm") on a list of files numbering in the high hundreds or more, you may well have to use "find" with the "-exec" option, as a plain "rm" with a wildcard can exceed the shell's argument-length limit (this pertains to Unix commands in general, not just "rm"). When you use "find" with "-exec", the arguments are handled differently and you get around that limitation.

Added: In retrospect, the above trivia may pertain to using the "find" command without the "-exec" option and instead piping the output into "xargs" with an "rm". I haven't had to deal with this 'limit' issue for some time. It wouldn't come into play, though, if you run the command from a periodic cron job and the files it manipulates never become too numerous within each period.
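The "xargs" route would look roughly like this sketch ("-print0"/"-0" are GNU extensions that keep filenames with spaces safe; "-mtime" is used rather than "-ctime" only so the demo can backdate files with "touch", since touch can't backdate inode-change time):

```shell
# Same pruning job, routed through xargs instead of one rm per file.
DIR=$(mktemp -d)
touch -d "3 days ago" "$DIR/stale.bak"   # pretend this is an old backup
touch "$DIR/fresh.bak"
# -mtime +1 matches files modified more than a day ago; xargs batches
# the names onto as few rm invocations as fit the argument limit.
find "$DIR" -name "*.bak" -mtime +1 -print0 | xargs -0 rm -f
ls "$DIR"   # only fresh.bak remains
```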

Thanks Stephen, that's

Thanks Stephen, that's fantastic - I pictured doing all kinds of weird calculations based on the timestamp in the filename :)

I didn't expect anyone to be able to answer here, so I asked on the Gentoo forum as well and got similar, but slightly different, answers.

I'll play around with it and check the man pages etc. - it'll get sorted once I'm caught up with the news :)

thanks again


Got it all working just great now! It keeps 10 hrs' worth of hourly backups on the host machine. Now I need a little help with the final step...

When a backup is made, I need to scp that backup to my home machine. The problem is I can't work out how to name the backup file - see here..

/usr/bin/mysqldump -u user_name -pPassWord --opt data_base | bzip2 -f >/home/wilson/backups/`date +%Y%m%d%H%M%S`.sql.bz2

I want to add "&& scp FILE_NAME :~/backups" but how do I work out the FILE_NAME?

Can I use xargs for that somehow?


Well, that is a tricky one for a one-liner. I don't think "xargs" would work in this context (but with the power of Unix commands, who knows). I am not a Unix guru, but I'm above average, so maybe it's doable with some funky syntax.

A few things you could do:

- You could wildcard the FILE_NAME in the "scp" command at the hour or minute level, but then you run the risk of hour or day changes if the "bzip2" step takes too long. I wouldn't go this route.

- You could use a wildcard at the hour level in the "scp" command and, after the "scp", "mv" the file to a sub-directory to remove it from your 'working' directory (i.e. move it after the "scp" from "/backups/working" to "/backups/working/moved" or whatnot). With this mechanism you could actually wildcard even sooner in the FILE_NAME. The "mv" only changes directory inode entries, so it wouldn't eat much CPU or I/O (unless you moved the files to a different partition/drive).

- (This is probably the easiest and most logical.) Just place your commands in a small shell script file and then run that script from your cron. I am not at a Unix prompt right now, but something like these lines (in a file named "backup_and_scp.sh"):


#!/bin/sh

FILEEXT=`date +%Y%m%d%H%M%S`

/usr/bin/mysqldump -u user_name -pPassWord --opt data_base | /usr/bin/bzip2 -f >${FILEEXT}.sql.bz2

scp ${FILEEXT}.sql.bz2


#end of script - note there must be no space after "-p" (with a space, mysqldump reads "PassWord" as the database name). May need adjustments for "shell" type; make sure the "#!/bin/sh" line is the first(!) line of the shell script file.

Then add this to your cron:
0 11,23 * * * /usr/bin/backup_and_scp.sh 1>/tmp/backup.log 2>/tmp/backup_error.log

Of course adjust the time settings in the cron entry to what you want them to be.

Make sure you have a "/tmp" directory too (or change those to where you want them).

The last one works

The last one works beautifully, thanks Stephen!

That's how I would have thought to do it, but I know so little about bash that I didn't even know (until now) how to assign a variable and then call it...

I've looked at a few docs about bash, but I don't want to learn every little nuance. What I need is a 'basics' - for people who understand the concepts of programming but just need to know how to create loops, assign vars, etc. in bash.

I'll search when I have time; it's been a while since I tried to find something like that.

thanks again...
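For reference, the 'basics' in question really do boil down to a handful of forms - assignment (no spaces around the "="), "$var" expansion, "for" loops, and "[ ]" tests. A quick sketch:

```shell
#!/bin/sh
# Assignment: no spaces around '='; read the value back with $name
greeting="hello"
count=3

# A for loop over a simple word list
for name in alice bob; do
    echo "$greeting, $name"
done

# An if test: [ ... ] is the test command; -gt compares integers
if [ "$count" -gt 2 ]; then
    echo "count is bigger than 2"
fi
```

Run it and it prints "hello, alice", "hello, bob", then "count is bigger than 2".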

Glad to hear it...

...if you are more of a perl guy, or even PHP (I believe you can execute a PHP script from the command line... not much of a PHP guy myself), you could use that language's variables, flow control (i.e. loops, conditionals, etc.) and just embed all your OS commands in calls to "exec()" (or its equivalent) in a script of whatever command-line language you choose. The best of both worlds. I usually write my "shell" scripts inside a perl wrapper, since I know perl and always forget the idiosyncrasies of each shell's scripting capabilities.
