Dropbox is my favourite cloud storage provider as it’s the easiest to install & administrate remotely on a Linux server. Together with a 30-day version history of all files even on its free plan (180 days on pro), it functions as a great cloud backup solution. Here’s how I set it up, and this should work regardless of whether your server is at home on your internal network or out there on the big bad internet running WordPress 😉
1. Dropbox headless install via command line
Go to your home folder (cd ~) and run the below in your shell to download the Dropbox Linux daemon (correct at the time of writing; if you have any issues, go to Dropbox’s Linux install page and skip to the headless instructions):
cd ~ && wget -O - "https://www.dropbox.com/download?plat=lnx.x86_64" | tar xzf -
This will drop the Dropbox daemon into a folder .dropbox-dist in your home directory. Bear in mind that since the directory name begins with a period, you’re not going to see it with a regular ls, but it’ll show up with the -a option.
Start the daemon like this, with & at the end so it returns you to the shell and lets the daemon run in the background:
~/.dropbox-dist/dropboxd &
There’s a chance dropboxd will quit with an error on first run, complaining about dependencies/shared objects. In this case try the following, which should install the libraries it needs, before starting the daemon again:
sudo apt install libglapi-mesa libxdamage1 libxfixes3 libxcb-glx0 libxcb-dri2-0 libxcb-dri3-0 libxcb-present0 libxcb-sync1 libxshmfence1 libxxf86vm1
When it’s first run it will give you a URL containing a unique code to visit. This can be done from any browser anywhere (you probably won’t have a windowing system and a GUI browser on your Linux server, and you don’t need one). Once visited, you’ll be prompted to allow your server to sync with your Dropbox account. The daemon will then create a folder named Dropbox in your home directory where it’ll sync everything to.
The daemon is designed to run once per user – not as a system-wide service like other Linux daemons you may be used to – so don’t try to run it as root, or it’ll simply create a Dropbox folder in root’s home directory instead!
It’s also helpful (but not necessary) to download this Python script which can interrogate the Dropbox daemon for things like sync status (as well as start and stop it for you).
2. Use rsync to synchronise your web server root with your Dropbox folder
First create a subfolder within your Dropbox folder for your website files (now that you have the Dropbox daemon running, you can do this from the Dropbox site and it should sync to your Linux server within a few seconds!). In the below command examples I’ve used site as the folder name.
I’m going to copy the whole web root across as a directory instead of tarballing it, and will use rsync instead of the plain cp command. rsync will copy everything across on the first run, but on subsequent runs will only replace the files that have changed and add new ones. This makes it a lot easier to see which individual files have been added/removed/changed from the Dropbox web interface – which you can’t do if it’s all in a tarball (a zipped tarball is only useful if space is at a premium or your bandwidth is limited, and you’re willing to lose finer change tracking).
The basic rsync syntax goes like this: rsync -optionletters --optionwords /source/path /destination/path
I would suggest using it like this:
rsync -cogrtuv --delete --chown=user:group /var/www/html /home/user/Dropbox/site
To break this down:
- Option letters
  - v – verbose; tell us a bit more about what’s going on and each file being copied across (useful to make sure it’s worked as intended)
  - u – update; don’t copy files that haven’t changed
  - t – preserve times; keep the date & time stamps of the original files (so instead of a backup having all file dates set to today, you will see the actual date when each file was last modified)
  - r – recurse into directories; go into all sub-directories
  - g – preserve the owning group on each file
  - o – preserve the owner username on each file
  - Note: you could replace the above 4 options (trgo) with the archive option (a) instead. a includes everything trgo does, and also preserves a bunch of other things including permissions and symbolic links – which may or may not be convenient depending on how you restore your backup. Read the manual page for rsync (at the shell, run man rsync) for more information.
  - c – use checksums; this is a more reliable way of checking whether files have actually changed, as u on its own will only check the timestamp and file size
- Option words
  - --delete – if any files have been removed from the source, delete them from the destination too (don’t worry, Dropbox keeps deleted files for 30 days!)
  - --chown=user:group – when copying files to the destination, set the ownership to user in group. Use your current username and primary group here; this is critical to ensure the Dropbox daemon can actually read the files – it can’t sync them if not!
- Source: I’ve used /var/www/html as an example here. Depending on your setup you may have something like /home/user/public_html as your web root, so replace as appropriate.
- Destination: I’ve used /home/user/Dropbox/site as an example here. This should be the path to the Dropbox folder created by the daemon, so it should look similar, appended with the site subfolder we created at the beginning of this section.
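If you’re not sure what to put in --chown, your current username and primary group can be read straight from the shell with the standard id command (nothing here is specific to Dropbox):

```shell
# Print your username and primary group, joined with a colon -
# exactly the value to use in rsync's --chown=user:group option.
echo "$(id -un):$(id -gn)"
```

On a typical single-user server this prints something like user:user.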
In case some of the files in your web root are not readable as your current username, you may need to run the rsync command as root using sudo.
3. Back up your MariaDB database (works for MySQL too, of course)
Now create a separate subfolder in your Dropbox for your database backups – this is important as we don’t want rsync to delete them because they’re not part of the web root! In the below examples I’m using sql.
Here I’m going to use mysqldump to dump the database to a file, and copy that file across to the Dropbox folder for syncing to the cloud. Depending on your Linux distro and how it’s configured, the > operator may behave differently when a file already exists. To avoid trouble I would suggest creating a backup staging folder in your home directory (e.g. mkdir backup-staging) and then using the below shell script, which you should amend as appropriate with your database-name, user & group.
#!/bin/sh
mysqldump --single-transaction --quick --lock-tables=false database > /home/user/backup-staging/database-name.sql
chown user:group /home/user/backup-staging/*
chmod 700 /home/user/backup-staging/*
mv -f /home/user/backup-staging/database-name* /home/user/Dropbox/sql
To explain what’s going on:
- mysqldump
  - The first 3 options are recommended for running websites; a dump that uses a single transaction without locking tables is preferable as it’ll have the least impact on your site’s performance, but it’ll be slower for you (alternatively you can remove these options, which might produce a dump faster)
  - I am assuming here that your current user (or you are running the script under sudo) has local super-user permission on MariaDB, so gets logged in automagically. If it doesn’t, you will need to add the -u<username> and -p<password> options to specify a username & password to authenticate with.
  - database is of course the name of the database you want to back up
  - The bit after the > tells it to write the output to a file at the given path
- chown will set the user & group to be the same as the ones the Dropbox daemon is running under, as otherwise it may not have permission to read the file (needless to say, files the daemon can’t read won’t be synced).
- chmod – conservatively setting this to 700 (only the owning user has permissions) in case the database has private info in it and you have other users on your server.
- mv -f – moves the database dump file to the sql subfolder in Dropbox that we created earlier. The -f option tells it to overwrite the previous file without asking – but don’t worry, Dropbox maintains 30 days of version history!
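If you want to sanity-check what mode 700 actually grants, a quick experiment with a scratch file (the /tmp path is just for demonstration):

```shell
# Create a scratch file and restrict it to the owner only.
touch /tmp/perm-demo.sql
chmod 700 /tmp/perm-demo.sql

# Show the symbolic permissions: rwx for the owner, nothing for group or others.
stat -c '%A' /tmp/perm-demo.sql   # -rwx------
```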
If you’re saving the above script to a file, don’t forget to make it executable (e.g. chmod 700 backup-script.sh).
You’ll note I haven’t zipped the SQL dump file here. Again, provided space isn’t an issue, the advantage is that you can view the SQL file with nice formatting in the Dropbox web interface, which is handy if you ever need to figure out which version to roll back to.
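If space or bandwidth does become a concern, the dump compresses very well piped through gzip, at the cost of that in-browser preview. A sketch using a stand-in file so it runs anywhere; in the real script you’d pipe mysqldump’s output straight into gzip instead:

```shell
# Stand-in for a real dump (in the script, replace this echo with the
# mysqldump command and pipe its output directly into gzip).
echo 'CREATE TABLE demo (id INT);' > /tmp/database-name.sql

# Compress, then verify the round trip by decompressing to stdout.
gzip -c /tmp/database-name.sql > /tmp/database-name.sql.gz
gunzip -c /tmp/database-name.sql.gz
```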
4. Setting up a schedule to save to Dropbox automatically
Now you’ve done all the hard work, you just need to set up a schedule for this to run periodically using cron. I’m assuming you’ll need to run this as root to ensure all files are readable by rsync, so do the following to open root’s crontab for editing:
sudo crontab -e
Now enter two rows at the end – which will run the command we used in section 2 and the script in section 3:
0 5 * * * rsync -qcogrtuv --delete --chown=user:group /var/www/html /home/user/Dropbox/site
10 5 * * * /home/user/backup-script.sh
Once you save & exit the editor, your server will now run these commands at the specified schedule. A cron entry has two parts, the first is 5 entries (separated from each other by a space) denoting the date & time at which the command should run, and the second part (also separated by a space from the first part) is the command itself.
Breaking this down we have:
- Time part – a * can go in any entry, and it means ‘every’
  - First entry: minutes past the hour (0-59)
  - Second entry: hour of the day (0-23)
  - Third entry: day of the month (1-31) – e.g. a * here would cause the command to run every day
  - Fourth entry: month of the year (1-12)
  - Fifth entry: day of the week (0-6, where 0 is Sunday)
  - In the above example, the first command runs at 5.00am and the second at 5.10am.
- Command part
  - The one at the end of the first row is the rsync command we used in section 2, but you may notice that I’ve added the q option, which stands for ‘quiet’. This will suppress all messages except errors. Since the output of a cron job is emailed to us, this prevents us getting constant emails unless something has gone wrong.
  - The one at the end of the second row is the path to the shell script we saved in section 3.
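To make the time part concrete, here are a couple of hypothetical variations on the schedule (the command is the same script from section 3):

```
# Every day at 2.30am:
30 2 * * * /home/user/backup-script.sh

# Only on Sundays at 2.30am:
30 2 * * 0 /home/user/backup-script.sh

# Twice a day, at 5am and 5pm (comma-separated lists are valid in any entry):
0 5,17 * * * /home/user/backup-script.sh
```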
You can try out different cron schedules at crontab guru, which provides a helpful explanation of each schedule entered – handy for verifying that you’re giving cron the right instructions!
Ignoring files or folders in Dropbox
If there are folders or files that you don’t want synchronised to Dropbox – e.g. the WordPress cache folder which changes frequently (and will clutter your Dropbox ‘recently deleted’ list unnecessarily) – you can request the Dropbox daemon to ignore them by setting an attribute on the file or folders you want to exclude. This should be done within the Dropbox folder (not the source folder) on your server after doing the first rsync.
To exclude a file or folder, do this:
attr -s com.dropbox.ignored -V 1 path-to-folder-or-file
This will add the com.dropbox.ignored attribute to the folder or file and assign it a value of 1. The Dropbox daemon looks for this attribute, and when found it will delete the file or folder from the Dropbox cloud but won’t do anything on your server itself (the file or folder and its contents will stay untouched locally).
You may first need to install the attr package if this is your first time setting extended attributes and you get a “command not found” error:
sudo apt install attr
To stop excluding a file or folder and start syncing it with Dropbox again, do this:
attr -r com.dropbox.ignored path-to-folder-or-file
This removes the com.dropbox.ignored attribute. The Dropbox daemon will notice the change straight away and will start the sync within seconds.
If you downloaded the Dropbox Python script mentioned in the first section, you can see what files are excluded by navigating to a folder and running: ~/dropbox.py filestatus
You can do file exclusions on Windows and Mac too; Dropbox provides instructions for this here.
Final thoughts
Dropbox is cool huh? The possibilities are endless. One thing worth trying is setting your Dropbox folder as your web root – you can then edit your website wherever you have access to Dropbox and have edits sync to your web server in seconds!
You may also want to set the Dropbox daemon to run automatically on startup (to ensure you don’t lose sync if the server is rebooted and you forget to start Dropbox) – LinuxBabe has some great instructions on how to do that here.
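If you’d rather see the gist here, it boils down to a small systemd unit. A minimal sketch, assuming the default headless install location from section 1 – the file path and unit name below are my own choices, not anything Dropbox ships:

```
# Save as /etc/systemd/system/dropbox@.service, then enable it for your user
# with: sudo systemctl enable --now dropbox@user
[Unit]
Description=Dropbox daemon for %i
After=network-online.target

[Service]
User=%i
ExecStart=/home/%i/.dropbox-dist/dropboxd
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Using a templated unit (the @ and %i) means the same file works for any user on the box.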
Was this useful? Let me know your thoughts in the comments!
Cloud photo by engin akyurt on Unsplash
Comment: Hi there. Is there a way to only upload folders to Dropbox, instead of syncing? Right now my Dropbox has lots of files that I don’t want added to my server. I would like to set a cron job to just upload/update everything from /media/external-HD to Dropbox on Saturdays at 2AM, but not to download everything from Dropbox to the external hard drive. Is that possible?

Reply: That’s possible – instead of using the Dropbox daemon, you can use rclone (https://rclone.org/dropbox/) to do a one-way sync of folders to Dropbox.