Project: Automated offsite backups for an NSLU2 -- part 12
Previously in this series: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7, Part 8, Part 9, Part 10, Part 11.
I'm setting up automated offsite backups from my NSLU2 to Amazon S3. With surprisingly little effort, I've managed to get a tool called s3sync running on the "slug" (as it's known). s3sync is a Ruby script, so in order to run it, I had to install Ruby, which in turn meant that I had to replace the slug's firmware with a different version of Linux, called Unslung. Once all of that was done, I just had to set up the appropriate directory structures and certificates so that the sync tool could use SSL, and write a simple upload/download script. All of this worked pretty much as advertised in the tools' respective documentation -- for the details, see the previous posts in this series.
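For reference, the upload script itself is only a few lines long. Here's a minimal sketch of the sort of thing it does -- the bucket name and paths are placeholders rather than my real ones:

#!/bin/sh
# Credentials and certificate directory that s3sync reads from the environment.
export AWS_ACCESS_KEY_ID=yourkeyid
export AWS_SECRET_ACCESS_KEY=yoursecretkey
export SSL_CERT_DIR=/home/s3sync/certs
cd /home/s3sync
# Recursively sync the share up to the bucket over SSL, deleting anything
# on S3 that has disappeared locally.
ruby s3sync.rb -r --ssl --delete /share/hdd/data/ mybackupbucket:backup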
My final step in my last post was to set up a cron job to synchronise quite a lot of data up to S3 overnight. This post covers what I found the next day.
Here is the line from the crontab file -- it kicks off the backup script as root at 22:42 every night:
42 22 * * * root /home/s3sync/upload.sh &> /tmp/s3sync.log
Checking this morning brought some bad news. Nothing had been written to the log file, and the bucket I'd set up to receive the backup on S3 had only 6MB of data -- as compared to a total of 4GB+ that was there to be backed up.
Clearly something had gone wrong.
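Incidentally, the easiest way I've found to see what's actually in the bucket is the s3cmd.rb script that ships with s3sync; using the same placeholder bucket name as before:

cd /home/s3sync
# List the keys that actually made it up to S3.
ruby s3cmd.rb list mybackupbucket:backup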
I figured it was best to try again, this time eliminating the cron job as a variable by simply running the backup script from a command prompt. After all, I had run the script from the command line before and seen some useful logging output.
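If running it by hand had failed in the same way, my fallback plan was to mimic cron's stripped-down environment -- something along these lines (hypothetical, as it turned out I didn't need it):

# Run the script under a near-empty environment, much as cron would.
env -i HOME=/root PATH=/usr/sbin:/usr/bin:/sbin:/bin /bin/sh -c /home/s3sync/upload.sh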
This time it at least seemed to be logging something:
-bash-3.1# /home/s3sync/upload.sh
Create node Giles
...
I left it for an hour or so, by which point it had uploaded 141.25MB, logging all the while. So clearly (a) something was wrong with the way I had set up logging from the crontab, and (b) something had interrupted the job when it ran the previous night.
After a little thought, I came to the conclusion that it might not be a great idea to have something in the crontab that could take multiple hours to run; there could well be a limit on how long a cron job is allowed to run, at least in the version of the cron daemon that lives on the NSLU2, and the sync process might have been killed before it could flush its output to the log file. That said, I could find no mention of any such limit on the obvious page on the NSLU2-Linux site. I decided to ask the site's mailing list, to see if anyone knew for sure whether this was the answer; in the meantime, I watched the sync from the command line as it reached 683 items and 270MB.
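One more thing worth ruling out when I investigate further: the &> redirection in my crontab line is a bash extension. If the slug's cron daemon hands jobs to a plain POSIX /bin/sh, then "command &> file" is parsed as "command &" followed by "> file", which backgrounds the job and merely truncates the file -- and that would at least explain the empty log. Assuming that turns out to be the problem, the portable spelling would be:

42 22 * * * root /home/s3sync/upload.sh > /tmp/s3sync.log 2>&1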
Next: Further investigations.