403 Forbidden when using --s3-use-multiprocessing

Asked by Andy Wallace on 2016-06-29

Duplicity 0.7.05 on CentOS 6, backing up to Amazon S3.
Boto version is 2.38.0.

This works perfectly with the default behaviour of uploading each file in one chunk, but if I try to speed up the backup by using --s3-use-multiprocessing, it fails with this error:

Uploading s3+http://ipt-test/duplicity-full.20160629T090332Z.vol1.difftar.gpg to STANDARD Storage
Uploading 104923735 bytes in 5 chunks
Waiting for the pool to finish processing 5 tasks
PoolWorker-1: Uploading chunk 1
PoolWorker-2: Uploading chunk 2
PoolWorker-3: Uploading chunk 3
PoolWorker-4: Uploading chunk 4
PoolWorker-5: Uploading chunk 5
AsyncScheduler: scheduling task for asynchronous execution
Traceback (most recent call last):
  File "/usr/lib64/python2.6/site-packages/duplicity/backends/_boto_multi.py", line 199, in _upload
    for mp in bucket.list_multipart_uploads():
  File "/usr/lib/python2.6/site-packages/boto/s3/bucketlistresultset.py", line 127, in multipart_upload_lister
  File "/usr/lib/python2.6/site-packages/boto/s3/bucket.py", line 609, in get_all_multipart_uploads
    'uploads', headers, **params)
  File "/usr/lib/python2.6/site-packages/boto/s3/bucket.py", line 410, in _get_all
    response.status, response.reason, body)
S3ResponseError: S3ResponseError: 403 Forbidden
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>55C09CF6C315CB36</RequestId><HostId>EhQphl44j+I5eaIpDNm3NHHYptXxsKdol9cIFR9pi0r/EAUU/Tx8ZvgwiggPkqvJWAC3psbby9M=</HostId></Error>
PoolWorker-4: Upload of chunk 4 failed. Retrying 4 more times...

It's not a one-off: it happens every time, and the backup never completes. Any idea what might be going wrong?

Question information

Solved by: Andy Wallace
Andy Wallace (andy-wallace-g) said: #1

Found the problem: some missing privileges on my IAM user.
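
For anyone hitting the same error: the traceback fails inside bucket.list_multipart_uploads(), which maps to the bucket-level s3:ListBucketMultipartUploads permission, so a policy that only grants object-level access will 403 at exactly this point. A minimal policy sketch that should cover duplicity's multipart path is below; the exact action list is an assumption based on the failing call plus the standard S3 multipart-upload actions (the original answer doesn't say which privileges were missing), and the bucket name ipt-test is taken from the log above.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DuplicityMultipartSketch",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject",
        "s3:AbortMultipartUpload"
      ],
      "Resource": [
        "arn:aws:s3:::ipt-test",
        "arn:aws:s3:::ipt-test/*"
      ]
    }
  ]
}
```

Note that the two list actions apply to the bucket ARN while the object actions apply to the /* ARN; combining them in one statement is valid, since each action simply matches only the resource type it applies to.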