OK, so I discovered a number of shortcomings in my recent attempt to sync a folder one-way to Amazon S3 using encryption, the most important of which was that it wouldn't resume a failed transfer efficiently. For large transfers that wasn't at all ideal, as I learned to my own cost (damn my 256k upload speed). So, this is attempt number 2. I decided to completely rewrite the script in Python to give me some more flexibility, helped along by the availability of Boto, a nice Python library for accessing all the Amazon Web Services.
Edit: this script is deprecated in favour of a rewritten version 2. I use Amazon S3 to host large media files that I want cheap, scalable bandwidth for, and for expandable offsite storage of important backups. I used to have some simple incremental tar scripts for my offsite backups, but since I moved to Bacula, I've just established an alternative schedule and file set definition for my offsite backups: the critical subset of data I couldn't possibly stand to lose (like company documents).
Thanks John for the reminder to investigate S3 as a business media hosting service; it works like a charm! Now that I have far fewer bandwidth worries (max $0.17 per GB), the Torus Knot site includes a nifty dynamic selector so you can pick low, medium or high quality. The latter is at a higher resolution too, clocking in at about 100Mb. I may well use S3 for public commercial downloads in the future too.