Bug
Resolution: Duplicate
https://github.com/ansible-collections/amazon.aws/issues/1866
- Summary

When I try to put an object into an S3 bucket on a Snowball Edge device, the upload fails.
- Issue Type

Bug Report
- Component Name

s3_object
- Ansible Version
```
ansible [core 2.14.9]
config file = /home/ec2-user/openshift4-snowball/playbooks/ansible.cfg
configured module search path = ['/home/ec2-user/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.9/site-packages/ansible
ansible collection location = /home/ec2-user/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.9.18 (main, Sep 7 2023, 00:00:00) [GCC 11.4.1 20230605 (Red Hat 11.4.1-2)] (/usr/bin/python3)
jinja version = 3.1.2
libyaml = True
```
- Collection Versions
```
Collection Version
----------------- -------
amazon.aws 7.0.0
ansible.netcommon 5.3.0
ansible.posix 1.5.4
ansible.utils 2.11.0
community.aws 7.0.0
community.crypto 2.16.0
community.general 8.0.2
kubernetes.core 2.4.0
```
- AWS SDK versions
```
Name: boto
Version: 2.49.0
Summary: Amazon Web Services Library
Home-page: https://github.com/boto/boto/
Author: Mitch Garnaat
Author-email: mitch@garnaat.com
License: MIT
Location: /home/ec2-user/.local/lib/python3.9/site-packages
Requires:
Required-by:
---
Name: boto3
Version: 1.29.1
Summary: The AWS SDK for Python
Home-page: https://github.com/boto/boto3
Author: Amazon Web Services
Author-email:
License: Apache License 2.0
Location: /home/ec2-user/.local/lib/python3.9/site-packages
Requires: botocore, jmespath, s3transfer
Required-by:
---
Name: botocore
Version: 1.32.1
Summary: Low-level, data-driven core of boto 3.
Home-page: https://github.com/boto/botocore
Author: Amazon Web Services
Author-email:
License: Apache License 2.0
Location: /home/ec2-user/.local/lib/python3.9/site-packages
Requires: jmespath, python-dateutil, urllib3
Required-by: awscli, boto3, s3transfer
```
- Configuration
```
CACHE_PLUGIN(/home/ec2-user/openshift4-snowball/playbooks/ansible.cfg) = jsonfile
CACHE_PLUGIN_CONNECTION(/home/ec2-user/openshift4-snowball/playbooks/ansible.cfg) = /tmp/ansiblecachedir
CONFIG_FILE() = /home/ec2-user/openshift4-snowball/playbooks/ansible.cfg
DEFAULT_FORKS(/home/ec2-user/openshift4-snowball/playbooks/ansible.cfg) = 5
DEFAULT_HOST_LIST(/home/ec2-user/openshift4-snowball/playbooks/ansible.cfg) = ['/home/ec2-user/openshift4-snowball/>
DEFAULT_LOG_PATH(/home/ec2-user/openshift4-snowball/playbooks/ansible.cfg) = /home/ec2-user/openshift4-snowball/pla>
DEFAULT_STDOUT_CALLBACK(/home/ec2-user/openshift4-snowball/playbooks/ansible.cfg) = debug
HOST_KEY_CHECKING(/home/ec2-user/openshift4-snowball/playbooks/ansible.cfg) = False
RETRY_FILES_ENABLED(/home/ec2-user/openshift4-snowball/playbooks/ansible.cfg) = False
```
- OS / Environment
RHEL 9.3
- Steps to Reproduce
```yaml
- name: Upload file to S3 Bucket
amazon.aws.s3_object:
bucket: "mybucket"
region: snow
profile: snowballEdge
endpoint_url: 'https://192.168.1.40'
aws_ca_bundle: /etc/pki/ca-trust/source/anchors/snow_cert.pem
src: "my_file.txt"
object: my_file.txt
mode: put
overwrite: always
```
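For reference, a hedged sketch of how the task's connection parameters plausibly map onto a plain boto3 client (the keyword names are real boto3 arguments; the mapping function itself is illustrative, not the module's actual code):

```python
# Illustrative mapping from the Ansible task parameters above to boto3
# client keyword arguments. The helper is hypothetical; the values are
# taken from the task in "Steps to Reproduce".
def client_kwargs(params):
    return {
        "service_name": "s3",
        "region_name": params["region"],          # "snow"
        "endpoint_url": params["endpoint_url"],   # "https://192.168.1.40"
        "verify": params["aws_ca_bundle"],        # custom CA for the device cert
    }

kwargs = client_kwargs({
    "region": "snow",
    "endpoint_url": "https://192.168.1.40",
    "aws_ca_bundle": "/etc/pki/ca-trust/source/anchors/snow_cert.pem",
})
```

Reproducing the upload with `boto3.client(**kwargs).upload_file(...)` directly (outside Ansible) fails the same way, which suggests the problem is in the boto3 transfer path rather than the module.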
- Expected Results
I expect the file to be uploaded to the S3 bucket on the Snowball Edge device.
- Actual Results
```
The full traceback is:
Traceback (most recent call last):
File "/home/ec2-user/.local/lib/python3.9/site-packages/boto3/s3/transfer.py", line 292, in upload_file
future.result()
File "/home/ec2-user/.local/lib/python3.9/site-packages/s3transfer/futures.py", line 103, in result
return self._coordinator.result()
File "/home/ec2-user/.local/lib/python3.9/site-packages/s3transfer/futures.py", line 266, in result
raise self._exception
File "/home/ec2-user/.local/lib/python3.9/site-packages/s3transfer/tasks.py", line 139, in __call__
return self._execute_main(kwargs)
File "/home/ec2-user/.local/lib/python3.9/site-packages/s3transfer/tasks.py", line 162, in _execute_main
return_value = self._main(**kwargs)
File "/home/ec2-user/.local/lib/python3.9/site-packages/s3transfer/tasks.py", line 348, in _main
response = client.create_multipart_upload(
File "/home/ec2-user/.local/lib/python3.9/site-packages/botocore/client.py", line 535, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/ec2-user/.local/lib/python3.9/site-packages/botocore/client.py", line 983, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (NotImplemented) when calling the CreateMultipartUpload operation: A header you provided implies functionality that is not implemented
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/tmp/ansible_amazon.aws.s3_object_payload_jij9mc6e/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/modules/s3_object.py", line 753, in upload_s3file
File "/tmp/ansible_amazon.aws.s3_object_payload_jij9mc6e/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/module_utils/retries.py", line 105, in deciding_wrapper
return retrying_wrapper(*args, **kwargs)
File "/tmp/ansible_amazon.aws.s3_object_payload_jij9mc6e/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/module_utils/cloud.py", line 119, in _retry_wrapper
return _retry_func(
File "/tmp/ansible_amazon.aws.s3_object_payload_jij9mc6e/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/module_utils/cloud.py", line 68, in _retry_func
return func()
File "/home/ec2-user/.local/lib/python3.9/site-packages/boto3/s3/inject.py", line 143, in upload_file
return transfer.upload_file(
File "/home/ec2-user/.local/lib/python3.9/site-packages/boto3/s3/transfer.py", line 298, in upload_file
raise S3UploadFailedError(
boto3.exceptions.S3UploadFailedError: Failed to upload /opt/openshift//rhcos/rhcos.img to awsclubb/rhcos_test.img: An error occurred (NotImplemented) when calling the CreateMultipartUpload operation: A header you provided implies functionality that is not implemented
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/tmp/ansible_amazon.aws.s3_object_payload_jij9mc6e/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/modules/s3_object.py", line 1576, in main
File "/tmp/ansible_amazon.aws.s3_object_payload_jij9mc6e/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/modules/s3_object.py", line 1127, in s3_object_do_put
File "/tmp/ansible_amazon.aws.s3_object_payload_jij9mc6e/ansible_amazon.aws.s3_object_payload.zip/ansible_collections/amazon/aws/plugins/modules/s3_object.py", line 762, in upload_s3file
S3ObjectFailure: Unable to complete PUT operation.
fatal: [localhost]: FAILED! => {
"boto3_version": "1.29.1",
"botocore_version": "1.32.1",
"changed": false,
"invocation": {
"module_args":
}
}
MSG:
Unable to complete PUT operation.: Failed to upload /opt/openshift//rhcos/rhcos.img to awsclubb/rhcos_test.img: An error occurred (NotImplemented) when calling the CreateMultipartUpload operation: A header you provided implies functionality that is not implemented
```
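The traceback shows the failure comes from `create_multipart_upload`: boto3's managed transfer switches from a single `PutObject` to a multipart upload once the source file reaches `TransferConfig.multipart_threshold` (8 MiB by default), which is why a multi-GiB `rhcos.img` hits the code path the Snowball endpoint rejects while small objects may not. A minimal sketch of that decision (the threshold value is boto3's documented default; the helper itself is illustrative, not s3transfer's code):

```python
# boto3's TransferConfig default multipart threshold is 8 MiB.
DEFAULT_MULTIPART_THRESHOLD = 8 * 1024 * 1024

def uses_multipart(file_size_bytes, threshold=DEFAULT_MULTIPART_THRESHOLD):
    """Mirror s3transfer's choice: files at or above the threshold go
    through CreateMultipartUpload instead of a single PutObject."""
    return file_size_bytes >= threshold

print(uses_multipart(4 * 1024**3))  # multi-GiB image -> True
print(uses_multipart(1024))         # small object -> False
```

If the device is rejecting a header on `CreateMultipartUpload`, raising `multipart_threshold` so the transfer stays on the single-request `PutObject` path may be a viable workaround, assuming the module exposed a way to pass a `TransferConfig`.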
- Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct