Bug 1929387

Summary: RGW ops log is not logging bucket listing operations
Product: [Red Hat Storage] Red Hat Ceph Storage
Reporter: Bob Emerson <roemerso>
Component: RGW
Assignee: Matt Benjamin (redhat) <mbenjamin>
Status: CLOSED ERRATA
QA Contact: Tejas <tchandra>
Severity: medium
Priority: medium
Version: 4.2
CC: bniver, cbodley, ceph-eng-bugs, gsitlani, kbader, mbenjamin, mhackett, roemerso, sweil, tserlin, vereddy
Target Release: 5.0   
Fixed In Version: ceph-16.2.0-20.el8cp
Clones: 1954789 (view as bug list)
Last Closed: 2021-08-30 08:28:20 UTC
Type: Bug
Bug Blocks: 1954789    

Description Bob Emerson 2021-02-16 18:52:05 UTC
Description of problem:

The customer has implemented RGW ops logging per https://access.redhat.com/solutions/3613291 and noticed that bucket listing ("ls") operations are not written to the ops log.

Version-Release number of selected component (if applicable):

Tested on 4.2 and 3.3.6


Steps to Reproduce:

Follow KCS https://access.redhat.com/solutions/3613291 to implement the RGW ops log
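
In short, the KCS enables the RGW ops log and points it at a unix domain socket. A minimal ceph.conf sketch, assuming a socket-based setup (the <name> placeholder is illustrative; the authoritative steps are in the linked article):

[client.rgw.<name>]
# enable the S3 operations log (off by default)
rgw enable ops log = true
# write ops log entries to a unix domain socket
rgw ops log socket path = /var/run/ceph/opslog

After restarting the RGW daemon, entries can be read from the socket with nc -U --recv-only /var/run/ceph/opslog, as done throughout below.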


Actual results:

No operation entry is logged for the ls or list-buckets operation.


Expected results:

An operation entry is expected to be created for the listing operation.
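
For illustration only, something shaped like the list_bucket entries captured under "Additional info" below would be expected, e.g. (hypothetical sample; the operation name used for an account-level listing is an assumption):

{"bucket":"","time":"...","remote_addr":"10.0.0.83","user":"aaa","operation":"list_buckets","uri":"GET / HTTP/1.1","http_status":"200", ...}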


Additional info:


Tested with 3 different clients: AWS CLI/S3Browser/S3CMD

Tested on versions 4.2 and 3.3.6:

[root@node5 ~]# ceph -v
ceph version 14.2.11-95.el7cp (1d6087ae858e7c8e72fe7390c3522c7e0d951240) nautilus (stable)

[root@rgw1 ~]# ceph -v
ceph version 12.2.12-127.el7cp (149c9c8a16ac33a42231ce4145067d3ceec16ac7) luminous (stable)





Requests with an empty or "/" URI (account-level listings that name no bucket resource) are not logged to the RGW ops log.


Test

S3CMD - No logging

[root@mynode ~]# ~/tools/s3cmd/s3cmd-2.0.2/s3cmd -c ~/.s3cfg.aaa ls
2021-02-11 23:07  s3://new-bucket-e098aa4a
2021-02-11 23:48  s3://new-bucket-f1b18510


RGW LOG:

2021-02-15 18:41:54.895 7f5e87c31700  1 ====== starting new request req=0x7f5fc1ac2670 =====
2021-02-15 18:41:54.896 7f5e87c31700  1 ====== req done req=0x7f5fc1ac2670 op status=0 http_status=200 latency=0.001s ======
2021-02-15 18:41:54.896 7f5e87c31700  1 beast: 0x7f5fc1ac2670: 10.0.0.83 - - [2021-02-15 18:41:54.0.896677s] "GET / HTTP/1.1" 200 241 - - -


NC:

[root@node5 ~]# nc -U --recv-only /var/run/ceph/opslog



----------------



AWS CLI - No logging

[root@node5 ~]# ceph -v
ceph version 14.2.11-95.el7cp (1d6087ae858e7c8e72fe7390c3522c7e0d951240) nautilus (stable)

[root@emmitt ~]# aws s3 ls --endpoint-url=http://10.0.0.163:8080
2021-02-11 18:07:26 new-bucket-e098aa4a
2021-02-11 18:48:23 new-bucket-f1b18510

[root@emmitt ~]# aws s3api list-buckets --query "Buckets[].Name" --endpoint-url=http://10.0.0.163:8080
[
    "new-bucket-e098aa4a",
    "new-bucket-f1b18510"
]

RGW LOG:

2021-02-15 18:42:36.837 7f5ea2466700  1 ====== starting new request req=0x7f5fc1ac2670 =====
2021-02-15 18:42:36.839 7f5ea2466700  1 ====== req done req=0x7f5fc1ac2670 op status=0 http_status=200 latency=0.00200001s ======
2021-02-15 18:42:36.839 7f5ea2466700  1 beast: 0x7f5fc1ac2670: 10.0.0.197 - - [2021-02-15 18:42:36.0.839883s] "GET / HTTP/1.1" 200 241 - "aws-cli/2.1.25 Python/3.7.3 Linux/4.18.0-240.8.1.el8_3.x86_64 exe/x86_64.rhel.8 prompt/off command/s3.ls" -


NC:

[root@node5 ~]# nc -U --recv-only /var/run/ceph/opslog



----------------


S3BROWSER - No logging:

Client - Windows GUI

[root@node5 ~]# nc -U --recv-only /var/run/ceph/opslog



rgw log:

2021-02-15 18:39:57.171 7f5f5add7700  1 ====== starting new request req=0x7f5fc1ac2670 =====
2021-02-15 18:39:57.174 7f5f5add7700  1 ====== req done req=0x7f5fc1ac2670 op status=0 http_status=200 latency=0.00300001s ======
2021-02-15 18:39:57.174 7f5f5add7700  1 beast: 0x7f5fc1ac2670: 10.0.0.181 - - [2021-02-15 18:39:57.0.174113s] "GET / HTTP/1.1" 200 241 - "S3 Browser 9.2.1 https://s3browser.com" -


-----------------


By contrast, this does log to the RGW ops log, because each request names a bucket resource (s3cmd la):

S3CMD


[root@mynode ~]# ~/tools/s3cmd/s3cmd-2.0.2/s3cmd -c ~/.s3cfg.aaa la
2021-02-11 23:47      1024   s3://new-bucket-e098aa4a/F1.tmp
[root@mynode ~]#



[root@node5 ~]# nc -U --recv-only /var/run/ceph/opslog

[{"bucket":"new-bucket-e098aa4a","time":"2021-02-15 23:32:11.874004Z","time_local":"2021-02-15 18:32:11.874004","remote_addr":"10.0.0.83","user":"aaa","operation":"get_bucket_location","uri":"GET /new-bucket-e098aa4a/?location HTTP/1.1","http_status":"200","error_code":"","bytes_sent":134,"bytes_received":0,"object_size":0,"total_time":683,"user_agent":"","referrer":""},
{"bucket":"new-bucket-e098aa4a","time":"2021-02-15 23:32:12.600007Z","time_local":"2021-02-15 18:32:12.600007","remote_addr":"10.0.0.83","user":"aaa","operation":"list_bucket","uri":"GET /new-bucket-e098aa4a/?delimiter=%2F HTTP/1.1","http_status":"200","error_code":"","bytes_sent":571,"bytes_received":0,"object_size":0,"total_time":2,"user_agent":"","referrer":""},
{"bucket":"new-bucket-f1b18510","time":"2021-02-15 23:32:12.645007Z","time_local":"2021-02-15 18:32:12.645007","remote_addr":"10.0.0.83","user":"aaa","operation":"get_bucket_location","uri":"GET /new-bucket-f1b18510/?location HTTP/1.1","http_status":"200","error_code":"","bytes_sent":134,"bytes_received":0,"object_size":0,"total_time":1215,"user_agent":"","referrer":""},
{"bucket":"new-bucket-f1b18510","time":"2021-02-15 23:32:13.902013Z","time_local":"2021-02-15 18:32:13.902013","remote_addr":"10.0.0.83","user":"aaa","operation":"list_bucket","uri":"GET /new-bucket-f1b18510/?delimiter=%2F HTTP/1.1","http_status":"200","error_code":"","bytes_sent":280,"bytes_received":0,"object_size":0,"total_time":2,"user_agent":"","referrer":""},

Comment 1 RHEL Program Management 2021-02-16 18:52:07 UTC
Please specify the severity of this bug. Severity is defined here:
https://bugzilla.redhat.com/page.cgi?id=fields.html#bug_severity.

Comment 2 Matt Benjamin (redhat) 2021-03-09 14:24:16 UTC
Upstream PR created: https://github.com/ceph/ceph/pull/39933 (approved, waiting on merge)

Comment 30 Veera Raghava Reddy 2021-07-13 05:16:18 UTC
Moving to Verified based on https://bugzilla.redhat.com/show_bug.cgi?id=1929387#c28

Comment 33 errata-xmlrpc 2021-08-30 08:28:20 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Red Hat Ceph Storage 5.0 bug fix and enhancement), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2021:3294

Comment 35 Red Hat Bugzilla 2023-09-15 01:01:21 UTC
The needinfo request[s] on this closed bug have been removed as they have been unresolved for 500 days