Server backup strategies: effectively protect your data
Backups are your last resort in case of data loss. Without good backups, a hack, hardware failure, or human error can be catastrophic. At Theory7, we unfortunately often see clients who do not have a recent backup when they need it most. In this guide, we help you set up a solid backup strategy.
Why backups are essential
Data can be lost in many ways:
- Hacks and malware - Ransomware encrypts your files, making you unable to access crucial information. This can lead to significant financial losses and reputational damage.
- Hardware failures - Disks can fail, often without any warning. This can be especially problematic if you do not have redundant systems.
- Human errors - Accidentally deleting files or applying incorrect configurations can have disastrous consequences. It is astonishing how often this occurs in practice.
- Software bugs - Updates that corrupt data or incompatibility with existing systems can lead to data loss.
- Natural disasters - Fire, flooding, or other disasters can devastate your data center, resulting in the loss of all your data.
The question is not if you will lose data, but when. Good backups make the difference between a minor disruption and a business disaster. It is crucial to adopt a proactive approach to data security.
What should you back up?
Databases
Databases often contain your most important data:
- WordPress posts and users
- Webshop orders and customers
- Application configuration
Website files
- Uploaded images and documents
- Custom code and themes
- Configuration files
Server configuration
- Web server configs (Apache/Nginx/LiteSpeed)
- PHP configuration
- SSL certificates
- Cron jobs
Email (if applicable)
- Mailbox data
- Email configuration
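Most of these categories can be captured with standard tools. A minimal sketch that stages the server-configuration items above into a single archive — the directory paths are common Debian/Ubuntu defaults and the /tmp output location is only for illustration; adjust both for your setup:

```shell
#!/bin/sh
# Stage server configuration into one archive.
# Paths are typical Debian/Ubuntu defaults; adjust for your distro.
STAGE=$(mktemp -d)
crontab -l > "$STAGE/root-crontab" 2>/dev/null || true     # cron jobs
for d in /etc/nginx /etc/apache2 /etc/php /etc/letsencrypt; do
  [ -d "$d" ] && cp -a "$d" "$STAGE/" 2>/dev/null || true  # web/PHP/SSL config
done
tar -czf /tmp/config-backup.tar.gz -C "$STAGE" .
rm -rf "$STAGE"
echo "Wrote /tmp/config-backup.tar.gz"
```

Databases and website files need their own steps; the full script later in this article covers those.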
The 3-2-1 backup rule
A proven strategy:
- 3 copies of your data
- 2 different storage media
- 1 copy offsite
Example implementation:
- Live data on your server
- Backup on external drive or second server
- Backup in cloud storage (S3, Backblaze, Google Cloud)
Automated backup script
Here is a complete backup script for a typical web server:
#!/bin/bash
set -euo pipefail

DATE=$(date +%Y%m%d_%H%M)
BACKUP_DIR=/backups
SITE_DIR=/var/www/html
DB_USER=root
DB_PASS=your_password   # better: keep credentials in ~/.my.cnf with chmod 600
DB_NAME=wordpress

# Create backup directory
mkdir -p "$BACKUP_DIR"

# Database backup
mysqldump -u"$DB_USER" -p"$DB_PASS" "$DB_NAME" | gzip > "$BACKUP_DIR/db_$DATE.sql.gz"

# Files backup
tar -czf "$BACKUP_DIR/files_$DATE.tar.gz" "$SITE_DIR"

# Configuration backup (directories that do not exist are skipped)
tar -czf "$BACKUP_DIR/config_$DATE.tar.gz" /etc/apache2 /etc/php /etc/nginx 2>/dev/null || true

# Delete old backups (older than 7 days)
find "$BACKUP_DIR" -name "*.gz" -type f -mtime +7 -delete

echo "Backup completed: $DATE"
Make the script executable and schedule it:
chmod +x /root/backup.sh
crontab -e
Add this line to run the script daily at 03:00:
0 3 * * * /root/backup.sh >> /var/log/backup.log 2>&1
Backups to external location
Local backups alone are not enough. Copy to another server:
Via rsync
rsync -avz /backups/ backup@other-server:/backups/
Via S3-compatible storage
With rclone:
rclone copy /backups remote:bucket/backups
Via FTP/SFTP
lftp -u user,pass sftp://backup.server.com -e "mirror -R /backups /remote/backups; quit"
Backup frequency
Determine frequency based on:
- How often data changes
- How much data you can afford to lose
- Available storage space
Typical schedules:
- Databases: Daily (hourly for very active sites)
- Files: Daily or weekly
- Full system: Weekly
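Such a schedule maps naturally onto cron. A sketch, assuming you split the work into separate scripts — the script names and times here are purely illustrative:

```shell
# Illustrative crontab entries; backup-db.sh, backup-files.sh and
# backup-full.sh are hypothetical scripts you would write yourself.
0 * * * *   /root/backup-db.sh      # hourly database dump (busy sites)
30 3 * * *  /root/backup-files.sh   # daily file backup
0 4 * * 0   /root/backup-full.sh    # weekly full backup, Sunday 04:00
```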
Backup verification
A backup that does not work is not a backup. Test regularly:
Database restore test
gunzip -c backup.sql.gz | mysql -u root -p test_database
File extraction test
tar -tzf backup.tar.gz | head -20
Full restore test
Periodically schedule a full restore on a test server. This verifies not only the backup files themselves but also the recovery procedure you would rely on in an emergency.
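The spot checks above are easy to automate. A self-contained sketch that builds sample archives under /tmp and runs the same integrity checks; in production you would point the checks at the newest files in /backups instead:

```shell
#!/bin/sh
# Demo: integrity checks to run nightly on fresh backups.
# Sample archives are created under /tmp for illustration only.
DIR=/tmp/backup-verify-demo
mkdir -p "$DIR"
echo "SELECT 1;" | gzip > "$DIR/db_demo.sql.gz"
tar -czf "$DIR/files_demo.tar.gz" -C "$DIR" db_demo.sql.gz
# gunzip -t checks the compressed stream without extracting;
# tar -tzf lists the archive, failing on corruption.
gunzip -t "$DIR/db_demo.sql.gz" && echo "database dump: OK"
tar -tzf "$DIR/files_demo.tar.gz" > /dev/null && echo "file archive: OK"
```

Mail the output to yourself (or log it) so a silently failing backup is noticed before you need it.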
Retention policy
Do not keep all backups forever. Example schedule:
- Last 7 days: daily backups
- Last 4 weeks: weekly backups
- Last 12 months: monthly backups
Script for retention:
# Keep daily backups for 7 days
find /backups/daily -type f -mtime +7 -delete
# Keep weekly backups for 4 weeks
find /backups/weekly -type f -mtime +28 -delete
# Keep monthly backups for 1 year
find /backups/monthly -type f -mtime +365 -delete
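The find commands above prune each tier, but something still has to promote backups between tiers. A sketch of the idea on a throwaway directory tree — cron would run the weekly promotion on Sundays and a similar monthly one on the 1st; point BASE at /backups in production:

```shell
#!/bin/sh
# Demo of promoting the newest daily backup into the weekly tier.
# BASE is a throwaway tree for illustration; use /backups in production.
BASE=/tmp/retention-demo
mkdir -p "$BASE/daily" "$BASE/weekly" "$BASE/monthly"
touch "$BASE/daily/db_20240101_0300.sql.gz" "$BASE/daily/db_20240102_0300.sql.gz"
# The newest file in daily/ becomes this week's weekly backup
LATEST=$(ls -t "$BASE/daily" | head -n 1)
cp "$BASE/daily/$LATEST" "$BASE/weekly/"
# -type f keeps find from ever matching the tier directories themselves
find "$BASE/weekly" -type f -mtime +28 -delete
ls "$BASE/weekly"
```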
DirectAdmin backups
If you use DirectAdmin:
- Go to Admin > Admin Backup/Transfer
- Configure automatic backups
- Set up external FTP/SFTP for offsite storage
DirectAdmin can back up complete user accounts including email, databases, and files. This makes managing backups easier and more efficient.
Related articles
- Basic Linux commands for hosting
- Installing DirectAdmin on VPS
- SSH connection from Mac/Linux
- Cron jobs at server level
More information about VPS servers at Theory7
Need help?
We are here for you! Are you facing any issues or do you have questions? Our support team is happy to assist you personally. Send us a message via the ticket system - we usually respond within a few hours and are happy to help.