I don’t know much about bash scripting yet, so I would also appreciate it if someone could explain the script as they build it (if you have the time, of course).
Here is what I want to do:
I have a website and a subscription to Mega.nz, Mediafire, Uptobox, and 1File. They are all premium accounts, but I prefer Mega.nz.
I’d like a bash script, run twice a day by cron, that backs up my databases as `.sql.tar.gz` files and the whole site (about 3 GB) as a `tar.gz`. These backups would go from my host to my Mega.nz account.
In your opinion, is there a way to achieve this?
Personally, my bash scripting skills are too limited for me to do it alone.
If one of you could volunteer, I’ll buy you a beer.
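To give you a starting point, here is a rough sketch of what such a script could look like. Everything specific in it is an assumption on my part: the database name, the web root, where credentials live, and the MEGAcmd `mega-put` upload step (shown only in comments). The demo at the bottom exercises just the archiving step on a throwaway directory, so it is safe to run anywhere:

```shell
#!/usr/bin/env bash
# Sketch only: database name, paths, and the MEGAcmd upload are assumptions.
set -euo pipefail

# Archive a directory as site-<timestamp>.tar.gz and print the archive path.
backup_site() {
  local site_dir="$1" dest="$2" stamp
  stamp="$(date +%Y%m%d-%H%M%S)"
  mkdir -p "$dest"
  tar -czf "$dest/site-$stamp.tar.gz" -C "$(dirname "$site_dir")" "$(basename "$site_dir")"
  echo "$dest/site-$stamp.tar.gz"
}

# Dump a MySQL database (credentials assumed to be in ~/.my.cnf) and pack it
# as <db>-<timestamp>.sql.tar.gz. Requires mysqldump on the host.
backup_db() {
  local db="$1" dest="$2" stamp
  stamp="$(date +%Y%m%d-%H%M%S)"
  mkdir -p "$dest"
  mysqldump --single-transaction "$db" > "$dest/$db-$stamp.sql"
  tar -czf "$dest/$db-$stamp.sql.tar.gz" -C "$dest" "$db-$stamp.sql"
  rm "$dest/$db-$stamp.sql"
  echo "$dest/$db-$stamp.sql.tar.gz"
}

# Demo of the archiving step on a throwaway directory.
# In real use you would call something like:
#   backup_db mydb "$HOME/backups"
#   backup_site /var/www/mysite "$HOME/backups"
# and then upload with MEGAcmd:  mega-put "$HOME/backups"/*.tar.gz /Backups/
demo_src="$(mktemp -d)"; demo_dest="$(mktemp -d)"
echo "hello" > "$demo_src/index.html"
archive="$(backup_site "$demo_src" "$demo_dest")"
echo "created: $archive"
```

The `--single-transaction` flag keeps the dump consistent for InnoDB tables without locking the site while the backup runs.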
Can you actually access the content of those websites from the shell? When it comes to scripting, you essentially enter the same commands you would type into a terminal to do those things; the cron job later only automates running that script.
If you can SSH into that website, rsync ought to be a good option for backing up that database.
Sorry I can’t be more helpful, but I’m not familiar with the sites you mentioned.
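For illustration, the rsync-over-SSH idea would look roughly like this. The remote form (host and paths are hypothetical) is shown in a comment; the runnable demo below just mirrors one local directory into another so it works anywhere:

```shell
#!/usr/bin/env bash
set -eu

# Remote form (hypothetical host and path):
#   rsync -avz user@myhost:/var/www/mysite/ ./site-backup/

# Local demo so this runs anywhere:
src="$(mktemp -d)"; dst="$(mktemp -d)"
echo "content" > "$src/file.txt"
if command -v rsync >/dev/null 2>&1; then
  rsync -a "$src/" "$dst/"   # -a preserves permissions and times; add -z over a network
else
  cp -a "$src/." "$dst/"     # fallback in case rsync is not installed
fi
echo "synced to: $dst"
```

The trailing slashes matter to rsync: `"$src/"` copies the directory’s contents, while `"$src"` would copy the directory itself into the destination.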
In fact, there is only one site, which belongs to me and which I can obviously SSH into.
The destination of the backup is simply an online file host called mega.nz.
To put the backup file on Mega, I can only do it via FTP.
Isn’t there a Mega client?
Building on what @cscs said, you can install the Mega client, synchronize a directory structure, then just dump your SQL backups in there; they will transfer to Mega automatically. The only gotcha is the transfer limit imposed by Mega (with a free account), which I think is 5 GB in a 6-hour period. That may not affect you, though.
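For the twice-a-day schedule you mentioned, the crontab entry could look something like this (the script path and log location are hypothetical; add it with `crontab -e`):

```
# min hour dom mon dow  command — run at 06:00 and 18:00 every day
0 6,18 * * * /home/user/bin/site-backup.sh >> /home/user/backup.log 2>&1
```

Redirecting both stdout and stderr to a log file makes it much easier to see why a backup failed silently overnight.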
This topic was automatically closed 15 days after the last reply. New replies are no longer allowed.