Make Sure To Back Up Your Web Server

I know quite a few people, myself included, who have lost their entire web site because they did not back up their web server.

Sometimes, if you get a spam complaint, the hosting company will immediately delete your entire web server and won't let you copy the files off it first. Some hosting companies only keep backups for 24 hours (I found that out the hard way).

I also tend to keep all my previous backups. I had one site that was hacked three months before I knew about it. Because I had all my prior backups, I was able to restore my server to a clean point and fix everything forward from there.

There are multiple ways to back up your server. I primarily deal with Linux servers, so I'll cover a couple of ways to do it on a Linux/Unix-based server.

Back in the beginning, I used to just FTP into my server with my favorite FTP program (FileZilla), select all my data directories, download them to my home computer, and burn the data to a CD-ROM or DVD.

I now have several gigs of data and several hundred thousand files spread across a multitude of servers, and FileZilla just doesn't cut it anymore. It misses files and sometimes even complete directories. Does anyone know of an FTP application that doesn't miss files the way FileZilla does?

My preferred method now is to zip up the entire server and then download that one zip file via FTP. To do the zipping, I use SSH to log into the server. There are many free SSH clients; one that seems popular is PuTTY. Ask your web host how to use it if you aren't familiar with it.
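If your home machine runs Linux or a Mac, you don't even need PuTTY; an SSH client is built right into the terminal. A minimal sketch — the username myuser and hostname example.com are placeholders, so substitute the login details your host gave you:

```shell
# "myuser" and "example.com" are placeholders for your own hosting account.
# The leading echo just prints the command instead of connecting, so you
# can see its shape; drop the echo to actually log in.
echo ssh myuser@example.com

# If your host runs SSH on a nonstandard port, add -p and the port number:
echo ssh -p 2222 myuser@example.com
```

PuTTY asks for the same pieces of information (hostname, port, username) in its connection dialog, so the ideas carry over.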

Then I will use one of these commands:

tar -zcvf 02-01-2009-backup.tgz .
tar -zcvf 02-01-2009-backup.tgz *

Try them both; different servers respond a little bit differently, so check your manual or your host's FAQs if one doesn't work. Either command will zip up your entire web server and put the archive in your home directory. Then you can log in via FTP and download it.
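Putting it together, here's a sketch of the whole thing as I'd run it from my home directory on the server. The directory name public_html is an assumption — use whatever your host calls your web root. Letting `date` build the filename saves typing the date by hand, and naming the directory explicitly (instead of `.`) keeps tar from trying to add the growing archive file to itself:

```shell
# "public_html" is an assumption -- use whatever your host calls your web root.
# The mkdir is only here so this sketch runs anywhere; on a real server the
# directory already exists.
mkdir -p public_html

# Date-stamped filename, e.g. 02-01-2009-backup.tgz
BACKUP="$(date +%m-%d-%Y)-backup.tgz"

# Archiving the directory by name (instead of ".") keeps tar from trying
# to stuff the growing archive into itself.
tar -zcvf "$BACKUP" public_html

# Sanity check: list what went into the archive before you download it.
tar -tzf "$BACKUP"
```

The `tar -tzf` listing at the end is a cheap way to confirm the archive is readable and complete before you spend time downloading it.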
