Description of problem: RGW incorrectly processes ACLs for objects whose names contain a double underscore ('__').

Version-Release number of selected component (if applicable): 1.3.x, 2.0

How reproducible: 100%

This bug corresponds to the following upstream issue: http://tracker.ceph.com/issues/16856

A fix from Orit Wasserman has been submitted: https://github.com/ceph/ceph/pull/10939
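For convenience, the failure mode can be condensed into a stand-alone repro sketch based on the boto calls in the comments below. The endpoint is the test host from those comments; the credentials and the default bucket name are placeholders, and the function is a hypothetical helper, not part of any test suite. boto is imported lazily so the list of affected key names is usable even where boto is not installed.

```python
# Key names reported in this bug to trigger the ACL failure:
# names containing '__' or wrapped in underscores at both ends.
AFFECTED_KEYS = ['__', '_temp_', '_temp2_']

def reproduce(host, access_key, secret_key, bucket_name='for_1372346'):
    """Upload each affected key and try to set a canned ACL on it.

    On an unpatched RGW, key.set_canned_acl() fails with
    404 NoSuchKey for these object names.
    """
    # Lazy import: boto is only needed when actually running the repro.
    import boto
    import boto.s3.connection

    conn = boto.connect_s3(
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
        host=host,
        port=8080,
        is_secure=False,
        calling_format=boto.s3.connection.OrdinaryCallingFormat(),
    )
    bucket = conn.create_bucket(bucket_name)
    for name in AFFECTED_KEYS:
        key = bucket.new_key(name)
        key.set_contents_from_string('Hello World ! WARHW')
        key.set_canned_acl('private')  # 404 NoSuchKey on affected builds

# Example invocation (placeholder credentials -- replace with real values):
#   reproduce('magna111.ceph.redhat.com', '<ACCESS_KEY>', '<SECRET_KEY>')
```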
Hi,

I'm still facing this issue when there are underscores at both ends of the object name.

>>> import boto
>>> import boto.s3.connection
>>> access_key = 'GD62VDOK3D9XFHCI5REZ'
>>> secret_key = 'MNzDrMVE12iGOiX5uzcOq52ZzhRCqh6YNTf22LKd'
>>> conn = boto.connect_s3(
...     aws_access_key_id = access_key,
...     aws_secret_access_key = secret_key,
...     host = 'magna111.ceph.redhat.com',
...     port = 8080,
...     is_secure=False,
...     calling_format = boto.s3.connection.OrdinaryCallingFormat(),
...     )
>>> bucket = conn.create_bucket('for_1372346')
>>> key = bucket.new_key('_temp2_')
>>> key.set_contents_from_filename('tempfile')
8388608
>>> key.set_canned_acl('private')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/site-packages/boto/s3/key.py", line 584, in set_canned_acl
    return self.bucket.set_canned_acl(acl_str, self.name, headers)
  File "/usr/lib/python2.7/site-packages/boto/s3/bucket.py", line 908, in set_canned_acl
    response.status, response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 404 Not Found
<?xml version="1.0" encoding="UTF-8"?><Error><Code>NoSuchKey</Code><BucketName>for_1372346</BucketName><RequestId>tx00000000000000000db82-0058078450-20746-default</RequestId><HostId>20746-default-default</HostId></Error>

I'm moving this back to the ASSIGNED state. Please let me know if there are any concerns or issues.

Regards,
Vasishta
Fix in master: https://github.com/ceph/ceph/pull/11566
Hi,

Working fine. (Pasting observations for two cases, object names '__' and '_temp_'.)

>>> import boto
>>> import boto.s3.connection
>>>
>>> access_key = 'server'
>>> secret_key = 'server'
>>> secret_key1 = 'client'
>>> access_key1 = 'client'
>>>
>>> client_conn = boto.connect_s3(
...     aws_access_key_id = access_key1,
...     aws_secret_access_key = secret_key1,
...     host = 'magna111.ceph.redhat.com',
...     port = 8080,
...     is_secure=False,
...     calling_format = boto.s3.connection.OrdinaryCallingFormat(),
...     )
>>> server_conn = boto.connect_s3(
...     aws_access_key_id = access_key,
...     aws_secret_access_key = secret_key,
...     host = 'magna111.ceph.redhat.com',
...     port = 8080,
...     is_secure=False,
...     calling_format = boto.s3.connection.OrdinaryCallingFormat(),
...     )
>>> server = server_conn.create_bucket('new_server_bucket_1')
>>> server.set_acl('public-read')
>>> client = client_conn.get_bucket('new_server_bucket_1')
>>> key1 = server.new_key('__')
>>> key1.set_contents_from_string('Hello World ! WARHW')
19
>>> key1.set_canned_acl('private')
>>> key2 = client.get_key('__')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/site-packages/boto/s3/bucket.py", line 193, in get_key
    key, resp = self._get_key_internal(key_name, headers, query_args_l)
  File "/usr/lib/python2.7/site-packages/boto/s3/bucket.py", line 231, in _get_key_internal
    response.status, response.reason, '')
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
>>>
>>> key1 = server.new_key('_temp_')
>>> key1.set_contents_from_string('Hello World ! WARHW')
19
>>> key1.set_canned_acl('public-read')
>>>
>>> key2 = client.get_key('_temp_')
>>>

Both results are as expected: the second user gets 403 Forbidden on the object '__' with a private ACL, and can successfully fetch the object '_temp_' with a public-read ACL.
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://rhn.redhat.com/errata/RHSA-2016-2815.html