
Backup and Restore Using Amazon S3 API


When it comes to data security, it is good practice to back up your server's files to an external location. At a rate of about $3 per 100 GB of storage, Amazon S3 is both a cost-effective and a reliable choice. To fully automate the backup process, you will have to integrate Amazon Web Services with your Web application. Let's see how to do that.

AWS SDK for PHP

Amazon has created a software development kit that enables developers to easily access Amazon Web Services and integrate them with their applications. It is built on Guzzle and contains classes and methods that can communicate with many different Amazon services. It requires at least PHP 5.5 compiled with the cURL and OpenSSL extensions. The AWS SDK for PHP can be downloaded here: https://github.com/aws/aws-sdk-php.

Installation

AWS SDK for PHP can be installed either through Composer or by downloading a zip file.

Installing via Composer

To install AWS SDK library through Composer, run the following command:

composer require aws/aws-sdk-php

After that, include Composer's autoloader (usually not necessary in PHP frameworks):

require 'vendor/autoload.php'; 

Installing via zip file

Download the zip file and include the AWS autoloader in your code:

require '/path/to/aws-autoloader.php'; 

Creating a Backup

First, instantiate the SDK library:

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-west-2',
    'credentials' => [
        'key'    => 'my-access-key-id',
        'secret' => 'my-secret-access-key',
    ]
]);

Replace the values above with your desired Amazon region and your access key and secret. Please note that hardcoded key values are used here for educational purposes only and are not a safe way to store credentials in production. You can read more about the available options in the AWS Credentials Guide.
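As a sketch of one of those safer options: if you omit the 'credentials' entry entirely, the SDK falls back to its default credential provider chain, which looks in environment variables, the shared ~/.aws/credentials file, and (on EC2) the instance profile. The region value below is just an example:

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// No 'credentials' key: the SDK discovers credentials on its own, e.g. from
// the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables,
// the ~/.aws/credentials file, or an IAM instance role.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-west-2',
]);
```

This keeps secrets out of your source code and lets you rotate them without redeploying the application.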

Use the putObject method to push files to Amazon S3:

try {
    $s3->putObject([
        'Bucket' => 'my-bucket-name',
        'Key'    => 'file.zip',
        'Body'   => fopen('/path/to/file.zip', 'r'),
        'ACL'    => 'private',
    ]);
} catch (Aws\S3\Exception\S3Exception $e) {
    echo "There was an error uploading the file.\n";
}

Bucket is the name of the S3 bucket where the file will be stored, Key is the name under which the file will be saved, Body is the file's contents, while the ACL allows you to control who can access the file.

To create a backup of your Web site, you can zip up all of the site's files and then send that zip file to your S3 bucket. To create the zip archive containing all of the site's files, a function that recursively archives files and folders into a zip file will be used:

ini_set('max_execution_time', 600);
ini_set('memory_limit', '1024M');

// The zipData function is created by Marvin Menzerath
// https://gist.github.com/MarvinMenzerath/4185113
function zipData($source, $destination) {
    if (extension_loaded('zip')) {
        if (file_exists($source)) {
            $zip = new ZipArchive();
            if ($zip->open($destination, ZipArchive::CREATE)) {
                $source = realpath($source);
                if (is_dir($source)) {
                    $iterator = new RecursiveDirectoryIterator($source);
                    // skip dot files while iterating
                    $iterator->setFlags(RecursiveDirectoryIterator::SKIP_DOTS);
                    $files = new RecursiveIteratorIterator($iterator, RecursiveIteratorIterator::SELF_FIRST);
                    foreach ($files as $file) {
                        $file = realpath($file);
                        if (is_dir($file)) {
                            $zip->addEmptyDir(str_replace($source . '/', '', $file . '/'));
                        } else if (is_file($file)) {
                            $zip->addFromString(str_replace($source . '/', '', $file), file_get_contents($file));
                        }
                    }
                } else if (is_file($source)) {
                    $zip->addFromString(basename($source), file_get_contents($source));
                }
            }
            return $zip->close();
        }
    }
    return false;
}

$timestamp = date('Y-m-d');
$filename = $timestamp . '-mywebsite.com.zip';
$is_zipped = zipData('/var/www/html/mywebsite.com', '/home/myusername/backups/' . $filename);

if ($is_zipped) {
    try {
        $s3->putObject([
            'Bucket'      => 'my-bucket-name',
            'Key'         => $filename,
            'Body'        => fopen('/home/myusername/backups/' . $filename, 'r'),
            'ACL'         => 'private',
            'ContentType' => 'application/zip',
        ]);
    } catch (Aws\S3\Exception\S3Exception $e) {
        echo "There was an error uploading the file.\n";
    }
}

Restoring a Backup

Fetching files from S3 is done using the getObject method:

try {
    $s3file = $s3->getObject([
        'Bucket' => 'my-bucket-name',
        'Key'    => $filename,
    ]);
} catch (Exception $e) {
    echo "There was an error downloading the file.\n";
}

if ($s3file['ContentLength'] > 0 && !empty($s3file['ContentType'])) {
    $body = $s3file->get('Body');
    $is_restored = file_put_contents('/home/myusername/restore/' . $filename, $body);
}

After downloading the zipped backup file from S3 to your server, you can now unzip the file and replace the contents of the current folder.
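That unzip step can be sketched with PHP's built-in ZipArchive class. The restoreBackup helper name and the paths in the usage comment are illustrative, not part of the article's script:

```php
<?php
// Hypothetical helper: extract a downloaded backup archive into a target
// directory. extractTo() creates the directory if needed and overwrites
// existing files.
function restoreBackup(string $zipPath, string $targetDir): bool {
    $zip = new ZipArchive();
    if ($zip->open($zipPath) !== true) {
        return false; // archive is missing or corrupt
    }
    $extracted = $zip->extractTo($targetDir);
    $zip->close();
    return $extracted;
}

// Example usage (placeholder paths):
// restoreBackup('/home/myusername/restore/' . $filename, '/var/www/html/mywebsite.com');
```

Extracting to a staging directory first and swapping it into place afterwards avoids serving a half-restored site.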

Automatic Backup Creation

It is a good idea to create a daily backup of your files. This can be done automatically by creating a cron job. To add a new cron job, SSH into your server and run the following command:

sudo crontab -e

Append the following line to the end of the file:

0 3 * * * /usr/bin/php /var/www/html/backup.php > /home/myusername/logs/backup_log.log

This cron job will run the backup script every day at 3 a.m.

Conclusion

The AWS SDK for PHP is also available as a service provider for Laravel and as a bundle for Symfony, so the same backup approach can be dropped into either framework.
