Duplicity - Bandwidth Efficient Encrypted Backup

Error an hour into backup, relating to duplicity/diffdir.py

Asked by Kevin Smith on 2010-03-12

About an hour through the backup process to S3, I get the following error. I can see from my S3 bucket's contents that the backup has definitely stopped. This error doesn't really make sense to me. I'm using version 0.6.08b. The script I'm using to back up my server is the bash script by Damon Timm, here: http://blog.damontimm.com/bash-script-incremental-encrypted-backups-duplicity-amazon-s3/

# /usr/local/sbin/dt-s3-backup.sh --backup
Traceback (most recent call last):
  File "/usr/bin/duplicity", line 1239, in ?
    with_tempdir(main)
  File "/usr/bin/duplicity", line 1232, in with_tempdir
    fn()
  File "/usr/bin/duplicity", line 1205, in main
    full_backup(col_stats)
  File "/usr/bin/duplicity", line 416, in full_backup
    globals.backend)
  File "/usr/bin/duplicity", line 294, in write_multivol
    globals.gpg_profile, globals.volsize)
  File "/usr/lib/python2.4/site-packages/duplicity/gpg.py", line 279, in GPGWriteFile
    data = block_iter.next(min(block_size, bytes_to_go)).data
  File "/usr/lib/python2.4/site-packages/duplicity/diffdir.py", line 505, in next
    result = self.process(self.input_iter.next(), size)
  File "/usr/lib/python2.4/site-packages/duplicity/diffdir.py", line 631, in process
    data, last_block = self.get_data_block(fp, size - 512)
  File "/usr/lib/python2.4/site-packages/duplicity/diffdir.py", line 658, in get_data_block
    buf = fp.read(read_size)
  File "/usr/lib/python2.4/site-packages/duplicity/diffdir.py", line 415, in read
    buf = self.infile.read(length)
  File "/usr/lib/python2.4/site-packages/duplicity/diffdir.py", line 384, in read
    buf = self.infile.read(length)
IOError: [Errno 22] Invalid argument

Question information

Language:
English
Status:
Solved
For:
Duplicity
Assignee:
No assignee
Solved by:
Kevin Smith
Solved:
2010-03-13
Last query:
2010-03-13

As an FYI, here's the command that I was running which caused the error:

/usr/bin/duplicity -v3 --full-if-older-than 14D --encrypt-key=XXXXXXXXX --sign-key=XXXXXXXXX --exclude /home/*/Trash --exclude /home/*/Projects/Completed --exclude /**.DS_Store --exclude /**Icon? --exclude /**.AppleDouble --exclude /root/.jungledisk/cache --exclude /root/.cpan --exclude /var/tmp --include=/boot --include=/etc --include=/home --include=/lib --include=/root --include=/usr --include=/var --exclude=** / s3+http://my_bucket_name/

When I removed the options to include /home and /usr, the backup was of course much smaller, and Duplicity completed it without a problem. I've now added /usr back and am running the backup again so it will be added as an incremental backup. It's chugging along right now without a problem.

Has anyone noticed a limit to how large a full backup can be? Given that the broken-up chunks are working here, I'm wondering if size might be the issue.

It looks like the offending directory is /home/virtfs. If you have this directory, exclude it from your backups.
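For reference, excluding that directory is one extra option added to the command above. A sketch, assuming the same command as the original (key IDs and bucket name are the same placeholders used there, not real values); duplicity evaluates file-selection options in the order given, first match wins, so the new exclude must appear before the --include /home option:

```shell
# Same backup command as above, with /home/virtfs excluded.
# --encrypt-key/--sign-key values and the bucket name are placeholders.
# Glob patterns are quoted so the shell does not expand them itself.
/usr/bin/duplicity -v3 --full-if-older-than 14D \
  --encrypt-key=XXXXXXXXX --sign-key=XXXXXXXXX \
  --exclude /home/virtfs \
  --exclude '/home/*/Trash' \
  --exclude '/home/*/Projects/Completed' \
  --exclude '/**.DS_Store' \
  --exclude '/**Icon?' \
  --exclude '/**.AppleDouble' \
  --exclude /root/.jungledisk/cache \
  --exclude /root/.cpan \
  --exclude /var/tmp \
  --include /boot --include /etc --include /home --include /lib \
  --include /root --include /usr --include /var \
  --exclude '**' \
  / s3+http://my_bucket_name/
```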

Guilherme Salgado (salgado) said: #3

Sounds like bug 543180.