Project: Automated offsite backups for an NSLU2 -- part 13
Previously in this series: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7, Part 8, Part 9, Part 10, Part 11, Part 12.
I'm setting up automated offsite backups from my NSLU2 to Amazon S3. With surprisingly little effort, I've managed to get a tool called s3sync running on the "slug" (as it's known). s3sync is a Ruby script, so in order to run it, I had to install Ruby, which in turn meant that I had to replace the slug's firmware with a different version of Linux, called Unslung. Once all of this was done, I just had to set up the appropriate directory structures and certificates so that the sync tool could use SSL, and write a simple upload/download script. All of this worked pretty much as advertised in the tools' respective documentation -- for the details, see the previous posts in this series.
My final step had been to set up a cron job to run the upload script, but it had failed, not logging anything. In order to debug, I ran the upload script directly from the command line, and left it to run overnight, copying a large set of directories to S3.
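For reference, the kind of crontab entry involved is nothing fancier than the sketch below (the script path, schedule and log location are illustrative rather than my actual setup); the obvious improvement for next time is to redirect the output somewhere, so that a failure at least leaves some evidence behind:

    # Run the upload script at 2am each day; send stdout and stderr to a log
    # file so that a silent failure still leaves a trace (paths illustrative).
    0 2 * * * /opt/backup/upload.sh >> /opt/backup/upload.log 2>&1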
Project: Automated offsite backups for an NSLU2 -- part 12
Previously in this series: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7, Part 8, Part 9, Part 10, Part 11.
I'm setting up automated offsite backups from my NSLU2 to Amazon S3. With surprisingly little effort, I've managed to get a tool called s3sync running on the "slug" (as it's known). s3sync is a Ruby script, so in order to run it, I had to install Ruby, which in turn meant that I had to replace the slug's firmware with a different version of Linux, called Unslung. Once all of this was done, I just had to set up the appropriate directory structures and certificates so that the sync tool could use SSL, and write a simple upload/download script. All of this worked pretty much as advertised in the tools' respective documentation -- for the details, see the previous posts in this series.
My final step in my last post was to set up a cron job to synchronise quite a lot of data up to S3 overnight. This post covers what I found the next day.
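To give an idea of what that job runs: the upload script is just a thin shell wrapper around s3sync.rb, along these lines (the bucket name, paths and certificate directory are illustrative rather than my real setup):

    #!/bin/sh
    # Illustrative upload script: push one local directory up to S3 over SSL.
    export AWS_ACCESS_KEY_ID="...access key here..."
    export AWS_SECRET_ACCESS_KEY="...secret key here..."
    export SSL_CERT_DIR=/opt/s3sync/certs

    cd /opt/s3sync
    ./s3sync.rb -r --ssl /share/hdd/data/photos/ mybucket:photos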
Project: Automated offsite backups for an NSLU2 -- part 11
Previously in this series: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7, Part 8, Part 9, Part 10.
I'm setting up automated offsite backups from my NSLU2 to Amazon S3. With surprisingly little effort, I've managed to get a tool called s3sync running on the "slug" (as it's known). s3sync is a Ruby script, so in order to run it, I had to install Ruby, which in turn meant that I had to replace the slug's firmware with a different version of Linux, called Unslung. All of this worked pretty much as advertised in the tools' respective documentation -- for the details, see the previous posts in this series.
Having confirmed that s3sync worked as I'd expect it to, I needed to install it in a sensible place -- I'd previously just put it in /tmp -- set it up so that I could use SSL to encrypt the data while it was on its way to Amazon, and then write a script to synchronise at least one of the directories I want backed up. I'd then be able to test the script, schedule it, test the scheduling, and then I'd be done!
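Concretely, the "sensible place" part should look something like the sketch below (locations and the certificate filename are illustrative; the linking step follows OpenSSL's convention of naming certificates in SSL_CERT_DIR by their hash):

    # Move s3sync out of /tmp into somewhere permanent, with a directory for
    # the CA certificate it needs for SSL (locations are illustrative).
    mkdir -p /opt/s3sync/certs
    cp /tmp/s3sync/*.rb /opt/s3sync/

    # OpenSSL looks certificates up in SSL_CERT_DIR by their hash, so link the
    # CA certificate under its hashed name (certificate filename illustrative).
    cd /opt/s3sync/certs
    ln -s amazon-ca.pem `openssl x509 -hash -noout -in amazon-ca.pem`.0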
Project: Automated offsite backups for an NSLU2 -- part 10
Previously in this series: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7, Part 8, Part 9.
I'm setting up automated offsite backups from my NSLU2 to Amazon S3. With surprisingly little effort, I've managed to get a tool called s3sync running on the "slug" (as it's known). s3sync is a Ruby script, so in order to run it, I had to install Ruby, which in turn meant that I had to replace the slug's firmware with a different version of Linux, called Unslung. All of this worked pretty much as advertised in the tools' respective documentation -- for the details, see the previous posts in this series.
With all of the pieces in place, I next needed to do some simple tests to make sure it could handle the kind of files I wanted it to back up. In particular, I wanted it to be able to handle deep directory hierarchies, and to remember user and group ownership and file permissions.
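The sort of test I have in mind is a simple round trip, sketched below (the bucket name, paths, and user and group names are all illustrative):

    # Build a small but deep tree with distinctive ownership and permissions.
    mkdir -p /tmp/s3test/a/b/c/d/e
    echo "hello" > /tmp/s3test/a/b/c/d/e/file.txt
    chown guest:everyone /tmp/s3test/a/b/c/d/e/file.txt
    chmod 640 /tmp/s3test/a/b/c/d/e/file.txt

    # Push it up to S3, pull it back down somewhere else...
    ./s3sync.rb -r --ssl /tmp/s3test/ mybucket:s3test
    ./s3sync.rb -r --ssl mybucket:s3test /tmp/s3test-restored

    # ...and check that the hierarchy, ownership and permissions survived.
    ls -lR /tmp/s3test /tmp/s3test-restored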
Project: Automated offsite backups for an NSLU2 -- part 9
Previously in this series: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7, Part 8.
I'm setting up automated offsite backups from my NSLU2 to Amazon S3. The tool I need to use to make this happen is called s3sync; it's a Ruby script, so in order to run it, I had to work out some way of installing Ruby. In order to do that, I had to replace the slug's firmware with a different version of Linux, called Unslung; once that was done, getting Ruby up and running wasn't too tricky. The next step was to get s3sync itself to work.
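As a first smoke test, the s3cmd.rb script that ships alongside s3sync can be used to check that the slug can talk to S3 at all -- something along these lines (the bucket name is illustrative and the keys are elided):

    # Quick connectivity check using s3cmd.rb from the s3sync package.
    export AWS_ACCESS_KEY_ID="...access key here..."
    export AWS_SECRET_ACCESS_KEY="...secret key here..."
    ./s3cmd.rb listbuckets
    ./s3cmd.rb createbucket mybucket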
Project: Automated offsite backups for an NSLU2 -- part 8
Previously in this series: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7.
I've discovered that in order to get automated offsite backups from my NSLU2 to Amazon S3, I have to get it to run Ruby so that it can run s3sync. Installing Ruby required the slug's firmware to be upgraded to a new version of Linux, called Unslung, so I did that -- and I also installed s3sync on an Ubuntu machine as a dry run. Both of these worked out OK, so the next step was to get the Ruby language itself running under the Unslung firmware -- and importantly, to make sure that Ruby had the OpenSSL package installed; the latter had proven non-obvious under regular Linux, so I was expecting problems on Unslung, which is, after all, a cut-down version of the operating system.
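The plan, roughly, assuming the Unslung package feed has a Ruby package under that name (the package name is my guess; the OpenSSL check is the important part):

    # Install Ruby from the package feed, then confirm the OpenSSL bindings
    # are really there (the package name is an assumption).
    ipkg update
    ipkg install ruby
    ruby -e "require 'openssl'; puts OpenSSL::OPENSSL_VERSION"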
Project: Automated offsite backups for an NSLU2 -- part 7
Previously in this series: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6.
I've discovered that in order to get automated offsite backups from my NSLU2 to Amazon S3, I have to get it to run Ruby so that it can run s3sync. I've installed the latter on an Ubuntu machine as a dry run, and it all looks good -- so now I need to get Ruby onto the slug, and the first step in that direction is to install Unslung.
Project: Automated offsite backups for an NSLU2 -- part 6
Previously in this series: Part 1, Part 2, Part 3, Part 4, Part 5.
I now know that in order to get automated offsite backups from my NSLU2 to Amazon S3, I have to get it to run Ruby so that it can run s3sync. I want to back everything up first, which is going to take some time -- so while that's happening, I'll get both Ruby and s3sync installed on an Ubuntu Linux machine as a dry run.
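The dry run on Ubuntu should amount to something like this (the package names are what I expect them to be; the s3sync tarball comes from the location given in its own documentation, so I won't repeat it here):

    # Install Ruby and its OpenSSL bindings on the Ubuntu machine.
    sudo apt-get install ruby libopenssl-ruby

    # Unpack s3sync and check that Ruby can load OpenSSL.
    tar xzf s3sync.tar.gz
    ruby -e "require 'openssl'; puts 'OpenSSL OK'"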
Project: Automated offsite backups for an NSLU2 -- part 5
Previously in this series: Part 1, Part 2, Part 3, Part 4.
In order to get automated offsite backups from my NSLU2 to Amazon S3, I've determined I need to use s3sync, a Ruby script. Obviously, this means that I need to get Ruby running on the "slug".
As I noted earlier, the standard firmware will not support Ruby, so the first step is going to have to be to install new firmware. The matrix of possibilities on the NSLU2-Linux site lists a bunch. My gut instinct is to stay as close to the original firmware -- to the left of the matrix -- as possible. I've been using Linux for a long time now -- on and off since 1992 -- but I've never really got to a serious kernel-recompiling porting-it-to-a-ZX81 level with it. So let's keep things as simple as possible.
Project: Automated offsite backups for an NSLU2 -- part 4
Previously in this series: Part 1, Part 2, Part 3.
I am trying to get my NSLU2 to back itself up automatically to Amazon S3. I currently know that in order to get this to work, the "slug" (as it's affectionately known) will need to be upgraded with new firmware -- basically, a new version of Linux. Just which of the many competing firmwares is appropriate will depend on the software I use to do the sync, and so it's time to work through the various options presented in Jeremy Zawodny's post and the comments people have left there.