How to back up a DigitalOcean droplet to Dropbox or an offline disk?


(Lal Salaam) #1

How do we take a full backup of a DO droplet to Dropbox or any other offline disk?
I want to back up my DB every night to Dropbox or another location.

@anon71964425


#2

Won’t that be overly stupid and useless?

You can set up a cron job to rsync all data to your PC though.
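As a rough sketch (the paths, user, and host below are placeholders, and this assumes key-based SSH access from the droplet to your PC), the crontab entry could look like:

```
# Run at 02:00 every night; -a preserves permissions/timestamps, -z compresses in transit.
# Note: this mirrors the latest state only -- it is a sync, not a versioned backup.
0 2 * * * rsync -az /var/www/ user@my-pc.example.com:/backups/droplet/
```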

My recommendation would be to try this:
Set a cron job to create a dump of your database into a specific folder every x hours.

Use a tool like S3CMD and configure it so that whenever the local folder changes, it syncs to a remote bucket: either on DO (Spaces), AWS (S3), or MinIO.
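A minimal sketch of those two steps, assuming mysqldump is available and S3CMD is already configured (`s3cmd --configure`); the database name, folder, and bucket below are placeholders:

```shell
#!/bin/sh
# Dump the database into a dated file, then sync the folder to a bucket.
BACKUP_DIR="/var/custom/backup"
DB_NAME="mydb"                # placeholder database name
STAMP=$(date +%Y%m%d-%H%M)    # e.g. 20240131-0200

mkdir -p "$BACKUP_DIR"
mysqldump "$DB_NAME" | gzip > "$BACKUP_DIR/$DB_NAME-backup-$STAMP.sql.gz"

# Push only new/changed files to the remote bucket (DO Spaces, AWS S3, or MinIO).
s3cmd sync "$BACKUP_DIR/" "s3://my-backup-bucket/db/"
```

Because each dump gets a unique timestamped name, `s3cmd sync` uploads each one rather than overwriting a single file.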


(Mr. Potter) #4

I would also love to export the full droplet to my drive.

That’s good for syncing, not for backups.
I’m also looking for a good backup script (folders only) that exports them to Drive.


#5

How is storing database dumps only good for syncing and not for backup?


(Mr. Potter) #6

What I meant was, you are telling us to do an rsync using a cron job, which is good.
But what if I want to keep the last 30 days of the database? rsync is only going to keep the latest version.
So we actually need a script to zip the dump and send it via rsync or whatever.

sync !== backup


#7

Not really. If your dump script creates a unique filename for every dump by appending the backup date/time, then it will sync all of those backups.

You will, however, have to add a separate cron job to delete files older than 30 days.
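That cleanup step can be a one-liner with find, run from a second daily cron entry (the folder and the 30-day window are placeholders; `-mtime +30` matches files last modified more than 30 days ago):

```shell
#!/bin/sh
# Delete dump files older than 30 days from the backup folder.
BACKUP_DIR="/var/custom/backup"
find "$BACKUP_DIR" -type f -name '*.dump' -mtime +30 -delete
```

Note that this only prunes the local copies; whether the remote bucket keeps or drops the deleted files depends on how the sync is configured.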


(Mr. Potter) #8

Yup. So the bottom line is: we need a good, reliable script. I searched GitHub but could not find a reliable one. AFAIR I also discussed the same in orng.

If you get some free time, would you mind writing one for us?


#9

I can definitely share a nuked version of the Python script I use. We have basically written a script that is triggered at midnight and at noon, dumps the databases (MariaDB and PostgreSQL) into files named [dbname-backup-date-time.dump], and stores them in the /var/custom/backup folder. S3CMD is then configured so that a Python script scans the directory for changes every 5 minutes and uses S3CMD to send all the changed files to our DigitalOcean Spaces bucket.
This has been working just fine for us so far, but due to the experimental nature of the script and all the tokens being hard-coded, I can’t share it as is. I’ll make it available on my GitHub whenever I get time.
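A shell-only equivalent of that five-minute watcher (the original is Python; the bucket below is a placeholder) is simply a cron entry around `s3cmd sync`, since sync already skips unchanged files:

```
# Every 5 minutes, push new/changed dumps to the Spaces bucket.
*/5 * * * * s3cmd sync /var/custom/backup/ s3://my-space/backups/
```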


(Mr. Potter) #10

Thanks in advance. :slight_smile:

I don’t have any knowledge of Python or Bash scripting, otherwise I would have built that myself. I’m also planning to create my own backup script using PHP.


#11

All the best!


(Mr. Potter) #12

Hello, everyone. I was googling for a backup script today and found this. It looks reliable. I have not tried it yet; I will let you know when I do. :slight_smile:


(Lal Salaam) #14

No, on Postgres


#15

The process for dumping a database is the same for Postgres as well; you just need to change the command.
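For example, assuming the PostgreSQL client tools are installed and the cron user can connect, the mysqldump line becomes a pg_dump call (the database name and folder are placeholders; `-Fc` writes PostgreSQL's compressed custom format, restorable with pg_restore):

```shell
#!/bin/sh
# PostgreSQL variant of the nightly dump step.
BACKUP_DIR="/var/custom/backup"
DB_NAME="mydb"                # placeholder database name
STAMP=$(date +%Y%m%d-%H%M)

pg_dump -Fc "$DB_NAME" > "$BACKUP_DIR/$DB_NAME-backup-$STAMP.dump"
```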


#17

There is no reason for this command not to work.


#19

Probably!
Give it a shot