Bug 1505400 - [ rgw ] s3: DeleteBucketWebsite fails with 403
Summary: [ rgw ] s3: DeleteBucketWebsite fails with 403
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Ceph Storage
Classification: Red Hat
Component: RGW
Version: 3.0
Hardware: Unspecified
OS: Unspecified
Priority: low
Severity: medium
Target Milestone: rc
Target Release: 3.1
Assignee: Adam C. Emerson
QA Contact: Shreekar
Docs Contact: Erin Donnelly
URL:
Whiteboard:
Keywords: TestOnly
Depends On:
Blocks: 1494421
 
Reported: 2017-10-23 13:55 UTC by Shreekar
Modified: 2018-09-26 18:18 UTC
CC: 10 users

Doc Text:
.`delete_website_configuration` fails even when a bucket policy grants `s3:DeleteBucketWebsite`

In the Ceph Object Gateway, a user cannot delete a bucket's website configuration with `delete_website_configuration` even when a bucket policy has been written granting them the `s3:DeleteBucketWebsite` permission.

To work around this issue, grant the permission another way, for example through admin operations, as the bucket owner, or with an ACL.
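As a sketch of the ACL route: the grant the owner would apply (for example via a `PUT /<bucket>?acl` request) can be expressed as an AccessControlPolicy document like the one below. The tenanted canonical-user IDs ("testx$tester1") are an assumption modeled on the user names in this report, not values taken from the test setup.

```python
import xml.etree.ElementTree as ET

# Minimal S3 AccessControlPolicy granting FULL_CONTROL to the second
# user, roughly the body a `PUT /<bucket>?acl` request would carry.
# The tenanted user IDs are illustrative.
acl_xml = """\
<AccessControlPolicy>
  <Owner><ID>testx$tester</ID></Owner>
  <AccessControlList>
    <Grant>
      <Grantee xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xsi:type="CanonicalUser">
        <ID>testx$tester1</ID>
      </Grantee>
      <Permission>FULL_CONTROL</Permission>
    </Grant>
  </AccessControlList>
</AccessControlPolicy>
"""

root = ET.fromstring(acl_xml)  # sanity check: well-formed XML
```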
Clone Of:
Last Closed: 2018-09-26 18:16:43 UTC


Attachments
rgw log (2.07 MB, application/x-gzip)
2018-08-23 09:47 UTC, Shreekar


External Trackers
Tracker ID Priority Status Summary Last Updated
Red Hat Product Errata RHBA-2018:2819 None None None 2018-09-26 18:18 UTC
Github ceph ceph pull 18606 None None None 2017-10-28 03:04 UTC
Ceph Project Bug Tracker 21896 None None None 2017-10-23 20:31 UTC

Description Shreekar 2017-10-23 13:55:08 UTC
Description of problem:
After setting a bucket policy that grants s3:DeleteBucketWebsite, delete_website_configuration() fails with 403.

Version-Release number of selected component (if applicable):
ceph version 12.2.1-23.el7cp 

How reproducible:
Always

Steps to Reproduce:
1. Create a bucket with tenant user testx$tester
2. Write a bucket policy granting s3:DeleteBucketWebsite to another user in the same or a different tenant
3. After setting the policy, try deleting static website configuration on the bucket

Actual results:

S3ResponseError: 403 Forbidden

Expected results:
Should be able to delete website configuration

Additional info:

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",  # try Deny also
        "Principal": {"AWS": ["arn:aws:iam::testx:user/tester1",
                              "arn:aws:iam::testx:user/tester2",
                              "arn:aws:iam::testy:user/tester1"]},
        "Action": ["s3:ListBucket", "s3:GetBucketWebsite",
                   "s3:PutBucketWebsite", "s3:DeleteBucketWebsite"],
        "Resource": ["arn:aws:s3::*:s3website",
                     "arn:aws:s3::*:s3website/*"],
    }]
}
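Serializing and applying a policy like the one above with boto 2 looks roughly like this. The `set_policy` call needs a live RGW endpoint, so the network steps are shown as comments; this is a sketch, not the exact script used in the report.

```python
import json

# Minimal policy carrying only the action under test; mirrors the
# policy document above (abbreviated to one principal).
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": ["arn:aws:iam::testx:user/tester1"]},
        "Action": ["s3:DeleteBucketWebsite"],
        "Resource": ["arn:aws:s3::*:s3website",
                     "arn:aws:s3::*:s3website/*"],
    }],
}

policy_json = json.dumps(bucket_policy)

# With a live endpoint and boto 2 (Bucket.set_policy and
# Bucket.delete_website_configuration are standard boto methods):
#   bucket = owner_conn.get_bucket('s3website')
#   bucket.set_policy(policy_json)
#   tester1_conn.get_bucket('s3website').delete_website_configuration()
```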


>>> m.delete_website_configuration()
send: 'DELETE /testx:s3website/?website HTTP/1.1\r\nHost: magna118:8080\r\nAccept-Encoding: identity\r\nDate: Mon, 23 Oct 2017 13:44:41 GMT\r\nContent-Length: 0\r\nAuthorization: AWS TESTER1:wwteSaGUC6ya7GsMNdSJMc8/xU0=\r\nUser-Agent: Boto/2.48.0 Python/2.7.5 Linux/3.10.0-693.el7.x86_64\r\n\r\n'
reply: 'HTTP/1.1 403 Forbidden\r\n'
header: Content-Length: 219
header: x-amz-request-id: tx000000000000000000039-0059edf249-5ea7-default
header: Accept-Ranges: bytes
header: Content-Type: application/xml
header: Date: Mon, 23 Oct 2017 13:44:41 GMT
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/site-packages/boto/s3/bucket.py", line 1551, in delete_website_configuration
    response.status, response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
<?xml version="1.0" encoding="UTF-8"?><Error><Code>AccessDenied</Code><BucketName>s3website</BucketName><RequestId>tx000000000000000000039-0059edf249-5ea7-default</RequestId><HostId>5ea7-default-default</HostId></Error>

Comment 5 Adam C. Emerson 2017-10-23 20:30:45 UTC
Fixed in https://github.com/ceph/ceph/pull/18492

Comment 10 Adam C. Emerson 2017-11-07 21:10:41 UTC
I hereby verify the doc text!

(Also we have a fix for this. I don't know if people have not noticed or if they just aren't going to worry about things for 3.1 until 3.0 gets cut.)

Comment 13 Shreekar 2018-08-23 09:47 UTC
Created attachment 1478118 [details]
rgw log

Comment 14 Shreekar 2018-08-23 09:48:55 UTC
(In reply to Shreekar from comment #13)
> Created attachment 1478118 [details]
> rgw log

Steps followed to verify the bug:

ceph.conf:
[client.rgw.magna064]
host = magna064
keyring = /var/lib/ceph/radosgw/ceph-rgw.magna064/keyring
log file = /var/log/ceph/ceph-rgw-magna064.log
rgw frontends = civetweb port=10.8.128.64:8080 num_threads=100
rgw_enable_static_website = true
rgw_enable_apis = s3, swift, s3website
rgw dns s3website name = s3website.magna064.ceph.redhat.com
rgw_dns_name = magna064.ceph.redhat.com
rgw_resolve_cname = true
debug_rgw = 20
----------------------
/etc/hosts
10.8.128.64 magna064.ceph.redhat.com magna064
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
10.8.128.64 website.magna064.ceph.redhat.com
----------------------

boto script with aws2:
---
import boto
import boto.s3.connection

access_key = 'JA681ZXDB6HPKD9FEH3X'
secret_key = 'PUw1vVjuDKZgkMcYGJ8ziNvVuhEaVmf8TuewKTix'
#boto.config.add_section('s3')
#boto.config.set('s3', 'use-sigv4', 'True')

conn = boto.connect_s3(
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key,
    host='magna064',
    port=8080,
    is_secure=False,  # this endpoint does not use ssl
    calling_format=boto.s3.connection.OrdinaryCallingFormat(),
)
#conn.auth_region_name = 'default'

website_bucket = conn.create_bucket("website")
----
[root@magna064 ~]# python s3website.py 
Traceback (most recent call last):
  File "s3website.py", line 22, in <module>
    website_bucket = conn.create_bucket("website")
  File "/usr/lib/python2.7/site-packages/boto/s3/connection.py", line 628, in create_bucket
    response.status, response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
<?xml version="1.0" encoding="UTF-8"?><Error><Code>SignatureDoesNotMatch</Code><RequestId>tx000000000000000000002-005b7e7f64-1087-default</RequestId><HostId>1087-default-default</HostId></Error>
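For context on the SignatureDoesNotMatch above: with AWS v2 auth, boto signs each request with HMAC-SHA1 over a canonical string and sends the result in the Authorization header (as in the DELETE trace in the description). A stdlib sketch of that computation, with an illustrative key and values; this is not the real computation boto performed here:

```python
import base64
import hmac
from hashlib import sha1

def aws_v2_signature(secret_key, method, date, resource,
                     content_md5='', content_type=''):
    # Canonical string per the S3 v2 signing scheme (no x-amz-*
    # headers in this simple case); `resource` keeps signed
    # subresources such as `?website`.
    string_to_sign = '\n'.join([method, content_md5, content_type,
                                date, resource])
    mac = hmac.new(secret_key.encode(), string_to_sign.encode(), sha1)
    return base64.b64encode(mac.digest()).decode()

# Illustrative values modeled on the DELETE trace in the description;
# any mismatch in string_to_sign between client and server (e.g. the
# resource path of a tenanted bucket) produces SignatureDoesNotMatch.
sig = aws_v2_signature('SECRETKEY', 'DELETE',
                       'Mon, 23 Oct 2017 13:44:41 GMT',
                       '/testx:s3website/?website')
auth_header = 'AWS TESTER1:' + sig
```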

-----------------------------
boto script with aws4:
---
import boto
import boto.s3.connection

access_key = 'JA681ZXDB6HPKD9FEH3X'
secret_key = 'PUw1vVjuDKZgkMcYGJ8ziNvVuhEaVmf8TuewKTix'
boto.config.add_section('s3')
boto.config.set('s3', 'use-sigv4', 'True')

conn = boto.connect_s3(
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key,
    host='magna064',
    port=8080,
    is_secure=False,  # this endpoint does not use ssl
    calling_format=boto.s3.connection.OrdinaryCallingFormat(),
)
conn.auth_region_name = 'default'

website_bucket = conn.create_bucket("website")
---
[root@magna064 ~]# python s3website.py 
Traceback (most recent call last):
  File "s3website.py", line 22, in <module>
    website_bucket = conn.create_bucket("website")
  File "/usr/lib/python2.7/site-packages/boto/s3/connection.py", line 628, in create_bucket
    response.status, response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 404 Not Found
<?xml version="1.0" encoding="UTF-8"?><Error><Code>NoSuchBucket</Code><BucketName>magna064</BucketName><RequestId>tx000000000000000000001-005b7e7f3a-1087-default</RequestId><HostId>1087-default-default</HostId></Error>

Comment 15 Shreekar 2018-08-23 13:01:05 UTC
(In reply to Shreekar from comment #14)

Comments 13 and 14 refer to defect https://bugzilla.redhat.com/show_bug.cgi?id=1620734, which is blocking verification of this defect.

Comment 21 errata-xmlrpc 2018-09-26 18:16:43 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2018:2819

