{"id":196,"date":"2024-03-06T06:50:19","date_gmt":"2024-03-06T06:50:19","guid":{"rendered":"https:\/\/blog.devops955.com\/swain\/?p=196"},"modified":"2024-03-20T08:08:58","modified_gmt":"2024-03-20T08:08:58","slug":"wordpress-server-maintenance","status":"publish","type":"post","link":"https:\/\/blog.devops955.com\/swain\/2024\/03\/06\/wordpress-server-maintenance\/","title":{"rendered":"Routine WordPress Server Maintenance"},"content":{"rendered":"<blockquote>\n<p>Estimated reading time: 13 minutes<\/p>\n<\/blockquote>\n<h2>Introduction<\/h2>\n<p>In our previous articles, we have set up a <strong><a href=\"https:\/\/blog.devops955.com\/swain\/2024\/02\/28\/step-by-step-guide-to-deploying-wordpress-with-lemp-stack\/\" title=\"LEMP stack WordPress site\">LEMP stack WordPress site<\/a><\/strong> and extended our website's capabilities with the <strong><a href=\"https:\/\/blog.devops955.com\/swain\/2024\/03\/05\/how-to-setup-wordpress-multisite-with-lemp-stack\/\" title=\"multisite feature\">multisite feature<\/a><\/strong>. As the website's content and user data grow, it becomes crucial to regularly back up and maintain site data. Backing up data not only prevents the loss of important information but also ensures quick recovery from hardware failures, software bugs, security breaches, or other unforeseen events. Moreover, it is an indispensable step during website migration, upgrades, or significant changes. 
Regularly backing up website data ensures long-term stability and secure storage of all vital information for any emergency.<\/p>\n<p>The main components for backup and maintenance of a WordPress site include:<\/p>\n<ol>\n<li><strong>MySQL Database<\/strong>: Stores all website data, including posts, comments, and user information, and is the core component of the site.<\/li>\n<li><strong>WordPress Files<\/strong>: Include all core code, themes, plugins, and uploaded files, forming the foundation of the website's operation.<\/li>\n<li><strong>SSL Certificates<\/strong>: Ensure the security of website data transmission. When using HTTPS, these certificates are crucial for maintaining user trust and establishing website credibility.<\/li>\n<li><strong>iptables Rules<\/strong>: Protect the server from unauthorized access and various network attacks, constituting an essential part of server security.<\/li>\n<li><strong>System and Application Configuration Files<\/strong>: This mainly includes web server (like Nginx or Apache) and PHP configuration files, which determine the operational behavior and performance of the website and have a direct impact on its stable operation and security.<\/li>\n<\/ol>\n<p>By regularly backing up and maintaining these key components, we can ensure the data security, stability, and performance of our WordPress website. Next, we will delve into each section, explaining how to perform specific operations and maintenance steps.<\/p>\n<h2>MySQL Database Backup<\/h2>\n<p>In maintaining any website or large system, database backup is one of the most critical steps. Especially for content management systems like WordPress, all content data is stored in the MySQL database. Without a database backup, any data corruption or loss can lead to significant setbacks, disastrously affecting website operation and user experience. 
Therefore, regularly backing up the MySQL database is a necessary measure to ensure data safety and business stability.<\/p>\n<h3>Using the mysqldump Tool for Backup<\/h3>\n<ol>\n<li>\n<p>SSH into your server.<\/p>\n<\/li>\n<li>\n<p>Back up using the following command:<\/p>\n<pre><code class=\"language-bash\">sudo mysqldump -u [username] -p [database_name] &gt; [backup_file_name].sql<\/code><\/pre>\n<p>After execution, enter your MySQL user password and replace the relevant parts with actual usernames, database names, and desired backup file names.<\/p>\n<\/li>\n<li>\n<p>After completing the backup, make sure to check the .sql file to confirm the data has been successfully backed up.<\/p>\n<\/li>\n<\/ol>\n<h3>Using crontab for Scheduled Backup<\/h3>\n<p>We don't have to spend time manually backing up every day, as setting up a scheduled task (cron job) makes it easier. For <strong>cron<\/strong>, you need to understand some basic concepts to arrange scheduled tasks properly. In brief, its structure is as follows:<\/p>\n<pre><code class=\"language-bash\">* * * * * command to be executed<\/code><\/pre>\n<p>The parts of this structure represent the scheduled time for execution:<\/p>\n<ul>\n<li>The first * represents minutes (0 - 59).<\/li>\n<li>The second * represents hours (0 - 23).<\/li>\n<li>The third * represents the day of the month (1 - 31).<\/li>\n<li>The fourth * represents the month (1 - 12).<\/li>\n<li>The fifth * represents the day of the week (0 - 7), where both 0 and 7 denote Sunday.<\/li>\n<\/ul>\n<p>Now, the steps for setting up:<\/p>\n<ol>\n<li>Open the terminal and enter <code>crontab -e<\/code> to edit the crontab as the current user or use <code>sudo crontab -e<\/code> for root to edit.<\/li>\n<li>At the end of the file, add a similar command for automatic backup. 
For example, to perform a backup every day at 2 AM (usually set during off-peak hours):\n<pre><code class=\"language-bash\">0 2 * * * \/usr\/bin\/mysqldump -u [username] -p[password] [database_name] &gt; \/home\/user\/db_backup\/[database_name]_$(date +\\%Y\\%m\\%d).sql<\/code><\/pre>\n<p>With this command, we can back up the corresponding database to the <code>\/home\/user\/db_backup\/<\/code> directory. Note that the password follows directly after <strong>-p<\/strong> (no space), that the <code>%<\/code> characters must be escaped with backslashes because crontab treats an unescaped <code>%<\/code> as a newline, and that the backup file is named <strong>name_current date<\/strong>. Since mysqldump authenticates with the MySQL credentials themselves, <code>sudo<\/code> is not needed here.<\/p>\n<\/li>\n<li>Save and close (similar to <strong>vim<\/strong> operations), and the system will automatically apply the new cron job. You can set a time close to the current time to test. After confirming it works, set it to the time you want it to execute. If changes are infrequent and demands are not high, backing up once a day is sufficient.<\/li>\n<li>Restoring the database is relatively simple. After creating the target database and granting the user the necessary privileges (the tables themselves are recreated from the dump), use <code>mysql -u [username] -p[password] [database_name] &lt; [backup_file_name].sql<\/code> to restore.<\/li>\n<\/ol>\n<blockquote>\n<p>Note:<br \/>\nThis approach stores your database password in plaintext in the crontab. If you need to enhance security, you can use a script to hide the password. I will write an article about using <strong>bash scripts<\/strong> to complete these operations later.<\/p>\n<\/blockquote>\n<h3>Offsite Storage and Backup File Management<\/h3>\n<p>Storing backups solely on the local server is still risky, as data could be lost if the server itself encounters problems. 
Therefore, we also need to consider offsite storage and management of backup files, such as using cloud storage services or uploading to another backup server, or even downloading to your PC (if information security is not a concern).<\/p>\n<h3>Introduction to MySQL Real-Time Incremental Backup<\/h3>\n<p>The full backup method mentioned above can serve as a simple backup strategy. MySQL also supports real-time incremental backups, which record all changes made to the database since the last backup. This means you can achieve more accurate data recovery, restoring to the latest moment, whereas scheduled tasks only capture data up to the time of backup. This article does not delve into the specifics (as I am not a database expert :P), because implementing incremental backups usually requires specialized tools or enabling MySQL's binary logs, which record database changes. By using binlog, you can restore the database to the latest data. Here, I will mention how to enable binlog (as for how to restore data, I will study and update later):<\/p>\n<ol>\n<li>\n<p>Add the following configuration in MySQL's config file (usually <code>\/etc\/mysql\/my.cnf<\/code> or <code>\/etc\/my.cnf<\/code>):<\/p>\n<pre><code class=\"language-ini\">[mysqld]\nlog_bin=mysql-bin\nserver_id=1\nmax_binlog_size=100M # Each file is 100M\nexpire_logs_days=7 # Binlog retention time is 7 days<\/code><\/pre>\n<\/li>\n<li>\n<p>Restart MySQL service:<\/p>\n<pre><code class=\"language-bash\">sudo systemctl restart mysql<\/code><\/pre>\n<\/li>\n<li>\n<p>Log in to MySQL and verify if binary logging is enabled:<\/p>\n<pre><code class=\"language-sql\">SHOW BINARY LOGS;<\/code><\/pre>\n<p>If binary logging is enabled, you will see a list of log files.<\/p>\n<\/li>\n<\/ol>\n<p>With these steps, your MySQL database backup work should be proceeding smoothly. 
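<\/p>\n<p>As a minimal offsite-copy sketch (the <code>backupuser<\/code> account, the <code>backup-host<\/code> machine, and the destination path are illustrative assumptions, not part of this setup), you could add one more cron entry that pushes the dump directory to a second server over SSH shortly after the nightly dump finishes:<\/p>\n<pre><code class=\"language-bash\">30 2 * * * rsync -az \/home\/user\/db_backup\/ backupuser@backup-host:\/srv\/db_backup\/<\/code><\/pre>\n<p>This assumes key-based SSH authentication is already configured, since cron jobs cannot answer password prompts.<\/p>\n<p>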
Regularly performing full and incremental backups is key to ensuring data security.<\/p>\n<h2>WordPress Files Backup<\/h2>\n<p>When maintaining a WordPress site, regularly backing up website files is very important. This not only protects your site from data loss, hacker attacks, update anomalies, and other unexpected events but also enables quick restoration of site operation when needed, such as during site migration or upgrades. WordPress files include core code, themes, plugins, and uploaded media files, which together keep the entire site running.<\/p>\n<h3>Backing Up WordPress Files with tar<\/h3>\n<ol>\n<li>\n<p>Log into your server.<\/p>\n<\/li>\n<li>\n<p>Switch to the root directory of WordPress, for example:<\/p>\n<pre><code class=\"language-bash\">cd \/var\/www\/blog.yourdomain.com<\/code><\/pre>\n<\/li>\n<li>\n<p>Use the <code>tar<\/code> command (usually pre-installed) to create a compressed package of the WordPress directory:<\/p>\n<pre><code class=\"language-bash\">sudo tar -czvf \/home\/user\/wordpress_backup_$(date +%Y%m%d).tar.gz .<\/code><\/pre>\n<\/li>\n<\/ol>\n<p>This command compresses the entire WordPress directory into a file named <code>wordpress_backup_current date.tar.gz<\/code> under <code>\/home\/user\/<\/code>. Write the archive outside the web root so that it is neither publicly downloadable nor swept into subsequent backups.<\/p>\n<h3>Scheduled WordPress Files Backup Using crontab<\/h3>\n<p>Scheduled backups can be implemented by setting the following cron job:<\/p>\n<ol>\n<li>As before, edit the crontab.<\/li>\n<li>At the end of the file, add a line to set your automatic backup task. For example, to back up WordPress files every day at 3 AM (set your backup time wisely, so several backup tasks don't run simultaneously, which might affect performance or cause backup failure):\n<pre><code class=\"language-bash\">0 3 * * * sudo tar -czvf \/home\/user\/wp_backup\/wordpress_backup_$(date +\\%Y\\%m\\%d).tar.gz -C \/var\/www\/blog.yourdomain.com .<\/code><\/pre>\n<\/li>\n<\/ol>\n<p>One thing to note, here and for the earlier MySQL backup alike: you need to ensure the specified backup directory exists first. 
If it does not, you should create it with <code>mkdir \/home\/user\/xx_backup<\/code>.<\/p>\n<ol start=\"3\">\n<li>If you need to restore WordPress files, simply unzip the backup file to the WordPress root directory:\n<pre><code class=\"language-bash\">sudo tar -xzvf wordpress_backup_YYYYMMDD.tar.gz -C \/var\/www\/blog.yourdomain.com\/<\/code><\/pre>\n<\/li>\n<\/ol>\n<h3>Backup File Storage and Management<\/h3>\n<p>Similarly, don't put all your eggs in one basket. The best practice is to store backup files in locations other than your local server to ensure backup files remain accessible in case of server anomalies.<\/p>\n<p>With the steps mentioned above, you should have effectively backed up your WordPress files and ensured data security and disaster recovery capabilities through scheduled tasks and offsite storage management.<\/p>\n<h2>SSL Certificate Maintenance with Certbot<\/h2>\n<p>SSL certificates are crucial for securing your website. They encrypt communications between the website and users, ensuring the safety of data transmission. However, SSL certificates typically have a validity period, and they will no longer provide protection once expired. Hence, regularly maintaining and updating SSL certificates is an important part of keeping your site secure. Installing Certbot and obtaining certificates were already covered in <strong><a href=\"https:\/\/blog.devops955.com\/swain\/2024\/02\/28\/step-by-step-guide-to-deploying-wordpress-with-lemp-stack\/\" title=\"the first article\">the first article<\/a><\/strong>; here we focus on making sure renewal keeps running automatically.<\/p>\n<h3>Setting Up systemd Timer for Automatic Renewal<\/h3>\n<p>Certbot installation typically sets up a <strong>systemd timer<\/strong> automatically to manage certificate renewal. You can check the status of this timer using the following commands:<\/p>\n<pre><code class=\"language-bash\">sudo systemctl list-timers | grep certbot\nsudo systemctl status certbot.timer<\/code><\/pre>\n<p>If this timer is active, it will perform the renewal checks periodically, typically twice a day, and no further setup is needed; just verify that automatic renewal can actually run.<\/p>\n<blockquote>\n<p>Although it checks twice a day, if your certificate isn't close to its expiry date, it won't change your certificate, so no need to worry.<\/p>\n<\/blockquote>\n<p>If this timer is not active, you can activate it with the following commands:<\/p>\n<pre><code class=\"language-bash\">sudo systemctl enable certbot.timer\nsudo systemctl start certbot.timer<\/code><\/pre>\n<h3>Manually Setting Up a cron Job for Automatic Renewal<\/h3>\n<p>If the timer can't be set or if you're more accustomed to using <strong>cron jobs<\/strong>, you can also set up a cron job for automatic renewal by following these steps:<\/p>\n<ol>\n<li>Edit the crontab.<\/li>\n<li>At the end of the file, add a line to set the automatic renewal task. 
For example, to check and renew any certificates that are about to expire at 1 AM every day:\n<pre><code class=\"language-bash\">0 1 * * * sudo certbot renew --quiet<\/code><\/pre>\n<\/li>\n<\/ol>\n<p>The <code>--quiet<\/code> option suppresses Certbot's normal output, so it only prints messages when something goes wrong, such as a failed renewal attempt.<\/p>\n<h3>Verifying Automatic Renewal<\/h3>\n<p>You can test the automatic renewal process by executing the following command:<\/p>\n<pre><code class=\"language-bash\">sudo certbot renew --dry-run<\/code><\/pre>\n<p>Before running Certbot, ensure that your server's port 80 is open and that the server firewall (ufw\/iptables) allows access to it, as Let's Encrypt usually needs to verify server ownership through this port.<\/p>\n<p>With these steps, you should be able to ensure your site continues to use valid SSL certificates, thereby maintaining the site's security and trustworthiness. Even after setting up automatic renewal, it's important to regularly check your renewal task logs or look out for related notifications in your email to ensure renewals continue to proceed normally.<\/p>\n<h2>iptables Backup and Boot-Time Loading<\/h2>\n<p>Since I previously used iptables as an access control tool, regularly backing up iptables rules and automatically loading these rules at system startup are important steps to ensure system security.<\/p>\n<h3>How to Back Up iptables Rules<\/h3>\n<ol>\n<li>Log into the server.<\/li>\n<li>Use the following command to save the current iptables rules to a file:\n<pre><code class=\"language-bash\">sudo iptables-save &gt; \/home\/user\/iptables.backup<\/code><\/pre>\n<\/li>\n<\/ol>\n<p>This command outputs all currently active iptables rules to the specified file, ensuring you can quickly restore these rules when needed.<\/p>\n<ol start=\"3\">\n<li>You can set up a cron job to regularly execute this command, ensuring your iptables rule backup files remain 
up-to-date:\n<pre><code class=\"language-bash\">0 0 * * * sudo iptables-save &gt; \/home\/user\/iptables_$(date +\\%Y\\%m\\%d).backup<\/code><\/pre>\n<\/li>\n<\/ol>\n<h3>How to Restore iptables Rules<\/h3>\n<ol>\n<li>Ensure you have a backup file of iptables rules.<\/li>\n<li>Use the following command to restore these rules to iptables:\n<pre><code class=\"language-bash\">sudo iptables-restore &lt; \/home\/user\/iptables.backup<\/code><\/pre>\n<\/li>\n<\/ol>\n<p>If using a specific day's backup, modify the file name to <code>iptables_YYYYMMDD.backup<\/code>.<\/p>\n<h3>Automatically Loading iptables Using systemd at Boot Time<\/h3>\n<p>To have iptables rules automatically loaded on Ubuntu 20.04 system startup, you can add the <code>iptables-restore<\/code> command to the system's startup scripts:<\/p>\n<ol>\n<li>\n<p>Create a systemd service file:<\/p>\n<pre><code class=\"language-bash\">sudo vim \/etc\/systemd\/system\/iptables-load.service<\/code><\/pre>\n<\/li>\n<li>\n<p>Add the following content to the file:<\/p>\n<pre><code class=\"language-ini\">[Unit]\nDescription=Restore iptables rules\nDefaultDependencies=no\nBefore=network-pre.target\nWants=network-pre.target\n\n[Service]\nType=oneshot\nExecStart=\/sbin\/iptables-restore \/home\/user\/iptables.backup\n\n[Install]\nWantedBy=multi-user.target<\/code><\/pre>\n<\/li>\n<\/ol>\n<p>Reload the systemd configuration, then enable and start the service:<\/p>\n<pre><code class=\"language-bash\">sudo systemctl daemon-reload\nsudo systemctl enable iptables-load\nsudo systemctl start iptables-load<\/code><\/pre>\n<ol start=\"3\">\n<li>If you need to restore the most recent iptables backup instead, since systemd unit files don't support shell commands and variables in <code>ExecStart<\/code>, you'll need to complete this through a script:\n<pre><code class=\"language-bash\">sudo vim \/home\/user\/iptables-load.sh<\/code><\/pre>\n<\/li>\n<\/ol>\n<p>Add the following content, which picks the newest dated backup file rather than assuming one exists for today:<\/p>\n<pre><code class=\"language-bash\">#!\/bin\/bash\n# Restore the newest dated iptables backup\nlatest=$(ls -t \/home\/user\/iptables_*.backup 2&gt;\/dev\/null | head -n 1)\n[ -n \"$latest\" ] &amp;&amp; \/sbin\/iptables-restore &lt; \"$latest\"<\/code><\/pre>\n<p>Save and exit, then add execution permissions to the shell script:<\/p>\n<pre><code class=\"language-bash\">sudo chmod +x \/home\/user\/iptables-load.sh<\/code><\/pre>\n<p>Modify the <code>\/etc\/systemd\/system\/iptables-load.service<\/code> file, changing the <code>ExecStart<\/code> value to <code>\/home\/user\/iptables-load.sh<\/code>:<\/p>\n<pre><code class=\"language-ini\">[Unit]\nDescription=Load iptables rules\nDefaultDependencies=no\nBefore=network-pre.target\nWants=network-pre.target\n\n[Service]\nType=oneshot\nExecStart=\/home\/user\/iptables-load.sh\n\n[Install]\nWantedBy=multi-user.target<\/code><\/pre>\n<p>Again, you'll need to reload the systemd configuration and restart the service:<\/p>\n<pre><code class=\"language-bash\">sudo systemctl daemon-reload\nsudo systemctl enable iptables-load\nsudo systemctl start iptables-load<\/code><\/pre>\n<h2>Conclusion<\/h2>\n<p>Regular backups and maintenance are indispensable measures to keep your WordPress site running smoothly and securely. In today's era, data is an extremely valuable resource. Even though cloud services provide high reliability, the online environment is still full of unpredictability. Hence, by regularly performing backup and maintenance tasks, not only can you ensure data security, but you can also quickly restore services in case of problems, minimizing your losses and impact on user experience.<\/p>\n<p>This should conclude our articles about WordPress, though there may be more to come sharing experiences and tips from usage. In this article, we have introduced how to back up MySQL databases and WordPress files, how to maintain SSL certificates with Certbot automatically, and how to back up and auto-load iptables rules. 
I hope this content helps you build a more stable and secure website environment.<\/p>\n<blockquote>\n<p>References:<br \/>\n<a href=\"https:\/\/dev.mysql.com\/doc\/refman\/8.0\/en\/mysqldump.html\">https:\/\/dev.mysql.com\/doc\/refman\/8.0\/en\/mysqldump.html<\/a><br \/>\n<a href=\"https:\/\/dev.mysql.com\/doc\/refman\/8.0\/en\/mysqlbinlog.html\">https:\/\/dev.mysql.com\/doc\/refman\/8.0\/en\/mysqlbinlog.html<\/a><br \/>\n<a href=\"https:\/\/www.digitalocean.com\/community\/tutorials\/how-to-use-systemctl-to-manage-systemd-services-and-units\">https:\/\/www.digitalocean.com\/community\/tutorials\/how-to-use-systemctl-to-manage-systemd-services-and-units<\/a><\/p>\n<\/blockquote>\n","protected":false},"excerpt":{"rendered":"<p>Estimated reading time: 13 minutes Introduction In our previous articles, we have set up a LEMP stack WordPress site and extended our website&#8217;s capabilities with multisite feature. As the website&#8217;s content and user data grow, it becomes crucial to regularly back up and maintain site data. 
Backing up data not only prevents the loss of&#8230;<\/p>\n","protected":false},"author":3,"featured_media":169,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":"","_jetpack_memberships_contains_paid_content":false},"categories":[4],"tags":[],"jetpack_sharing_enabled":true,"jetpack_featured_media_url":"https:\/\/blog.devops955.com\/swain\/wp-content\/uploads\/sites\/2\/2024\/03\/Ubuntu-1404-.jpg","_links":{"self":[{"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/posts\/196"}],"collection":[{"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/comments?post=196"}],"version-history":[{"count":10,"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/posts\/196\/revisions"}],"predecessor-version":[{"id":205,"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/posts\/196\/revisions\/205"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/media\/169"}],"wp:attachment":[{"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/media?parent=196"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/categories?post=196"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.devops955.com\/swain\/wp-json\/wp\/v2\/tags?post=196"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}