Bug 1707037 - [OSP 13] Generating volume as member user in tenant A from glance image in tenant B with source cinder image in tenant B
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: Red Hat OpenStack
Classification: Red Hat
Component: openstack-cinder
Version: 13.0 (Queens)
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: medium
Target Milestone: ---
Assignee: Cinder Bugs List
QA Contact: Tzach Shefi
Docs Contact: Tana
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2019-05-06 16:48 UTC by Andreas Karis
Modified: 2019-09-17 22:10 UTC
CC: 3 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-09-17 22:10:25 UTC
Target Upstream Version:
Embargoed:


Attachments

Description Andreas Karis 2019-05-06 16:48:36 UTC
Description of problem:
It seems that the behavior in OSP changed between versions. According to a customer who upgraded from OSP 10 to 13 (via 11 and 12), creating a volume as a _member_ user in tenant A from a glance image in tenant B, whose backing cinder volume is also in tenant B, is no longer possible.


Version-Release number of selected component (if applicable):


How reproducible:


Steps to Reproduce:
1.
2.
3.

Actual results:


Expected results:


Additional info:

Comment 2 Andreas Karis 2019-05-06 16:53:34 UTC
The source volume is in a different tenant than the failing volume.

The failed volume:
| os-vol-tenant-attr:tenant_id | 6fb0a45d023d4f049f4d5781ae629167     |

The volume from which you are cloning:
| os-vol-tenant-attr:tenant_id   | 72fefc72b09e425fb151478146da823b                       |

The glance image is in the same tenant:
| owner            | 72fefc72b09e425fb151478146da823b              |


---------------------------------------------------------------------------

cinder show e48dd1d4-4f57-4a46-8c60-a7c23e276f20
+------------------------------+--------------------------------------+
| Property                     | Value                                |
+------------------------------+--------------------------------------+
| attached_servers             | []                                   |
| attachment_ids               | []                                   |
| availability_zone            | nova                                 |
| bootable                     | false                                |
| consistencygroup_id          | None                                 |
| created_at                   | 2019-04-29T19:31:26.000000           |
| description                  | None                                 |
| encrypted                    | False                                |
| id                           | e48dd1d4-4f57-4a46-8c60-a7c23e276f20 |
| metadata                     |                                      |
| multiattach                  | False                                |
| name                         | test_vol1                            |
| os-vol-tenant-attr:tenant_id | 6fb0a45d023d4f049f4d5781ae629167     |
| replication_status           | None                                 |
| size                         | 20                                   |
| snapshot_id                  | None                                 |
| source_volid                 | None                                 |
| status                       | error                                |
| updated_at                   | 2019-04-29T19:31:40.000000           |
| user_id                      | f9848a40a1524629b5e662e857bc5654     |
| volume_type                  | volumes_dellsc                       |
+------------------------------+--------------------------------------+
Glance Show:
glance image-show bb129698-c6f8-44be-8d3d-427d02c2f41f
+------------------+-----------------------------------------------+
| Property         | Value                                         |
+------------------+-----------------------------------------------+
| checksum         | 90956b2310c742b42e80c5eee9e6efb4              |
| container_format | bare                                          |
| created_at       | 2019-03-12T00:18:20Z                          |
| direct_url       | cinder://921f0d08-6801-453a-85d4-194752ec97c9 |
| disk_format      | qcow2                                         |
| id               | bb129698-c6f8-44be-8d3d-427d02c2f41f          |
| min_disk         | 0                                             |
| min_ram          | 0                                             |
| name             | centos-7                                      |
| owner            | 72fefc72b09e425fb151478146da823b              |
| protected        | False                                         |
| size             | 854851584                                     |
| status           | active                                        |
| tags             | []                                            |
| updated_at       | 2019-03-12T00:19:05Z                          |
| virtual_size     | None                                          |
| visibility       | public                                        |
+------------------+-----------------------------------------------+

cinder show 921f0d08-6801-453a-85d4-194752ec97c9
+--------------------------------+--------------------------------------------------------+
| Property                       | Value                                                  |
+--------------------------------+--------------------------------------------------------+
| attached_servers               | []                                                     |
| attachment_ids                 | []                                                     |
| availability_zone              | nova                                                   |
| bootable                       | false                                                  |
| consistencygroup_id            | None                                                   |
| created_at                     | 2019-03-12T00:18:22.000000                             |
| description                    | None                                                   |
| encrypted                      | False                                                  |
| id                             | 921f0d08-6801-453a-85d4-194752ec97c9                   |
| metadata                       | glance_image_id : bb129698-c6f8-44be-8d3d-427d02c2f41f |
|                                | image_owner : 72fefc72b09e425fb151478146da823b         |
|                                | image_size : 854851584                                 |
|                                | readonly : True                                        |
| migration_status               | None                                                   |
| multiattach                    | False                                                  |
| name                           | image-bb129698-c6f8-44be-8d3d-427d02c2f41f             |
| os-vol-host-attr:host          | hostgroup@tripleo_dellsc#tripleo_dellsc                |
| os-vol-mig-status-attr:migstat | None                                                   |
| os-vol-mig-status-attr:name_id | None                                                   |
| os-vol-tenant-attr:tenant_id   | 72fefc72b09e425fb151478146da823b                       |
| readonly                       | True                                                   |
| replication_status             | None                                                   |
| size                           | 1                                                      |
| snapshot_id                    | None                                                   |
| source_volid                   | None                                                   |
| status                         | available                                              |
| updated_at                     | 2019-04-29T19:50:10.000000                             |
| user_id                        | 0a1c543eec7c47faaab1f5d7717b2e38                       |
| volume_type                    | None                                                   |
+--------------------------------+--------------------------------------------------------+

-------------------------------------------

There's no problem creating volumes from this glance image as the admin user.

The customer has a _member_ user called apiuser that has access to all projects. They logged in as that user to the admin project and were able to create the volume from the image. They then looked into their cinder policy configuration for the _member_ user and found the permissions set as:

oslopolicy-policy-generator --namespace cinder
...
"volume:create": ""
"volume:create_from_image": ""
...

This means that any user should be able to create volumes from images, regardless of whether the image is owned by admin in the admin project. According to the customer, this worked in OSP 10 but no longer works in OSP 13.

Comment 4 Brian Rosmaita 2019-05-08 02:59:19 UTC
(In reply to Andreas Karis from comment #2)
> "volume:create": ""
> "volume:create_from_image": ""
> ...
> 
> Which means that any user should be able to create volumes from the images,
> regardless of whether the image is owned by admin in the admin project.

Just want to be clear about this.  What the unrestricted policy targets mean is that any authenticated user can make either of the API calls for these actions.  Whether the call will succeed or not depends on other factors, for example, whether the caller has permission to use the image.
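That distinction can be sketched with a toy evaluator. This is a deliberate simplification of oslo.policy semantics for illustration, not the real library:

```python
# Toy illustration of policy-target semantics (NOT oslo.policy itself):
# an empty target "" places no restriction, so any authenticated caller
# may make the API call; a role-based rule such as "rule:admin_api"
# only admits callers holding the admin role.

def check_policy(target_rule, credentials):
    if target_rule == "":
        return True  # unrestricted: any authenticated user may call
    if target_rule == "rule:admin_api":
        return "admin" in credentials.get("roles", [])
    raise NotImplementedError(target_rule)

member = {"roles": ["_member_"]}
admin = {"roles": ["admin"]}

# Both users may *make* the volume:create_from_image call...
assert check_policy("", member) and check_policy("", admin)
# ...but an admin-only rule would reject the member; and even with an
# unrestricted target, whether the call *succeeds* still depends on
# image access, which is exactly the failure mode in this bug.
assert not check_policy("rule:admin_api", member)
```

The unrestricted targets quoted in comment #2 therefore only gate entry to the API, not access to the backing image.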

Comment 5 Brian Rosmaita 2019-05-08 03:02:51 UTC
Andreas:

Just an FYI -- the glance_store cinder driver was experimental until Rocky (0.26.0).
That being said, there don't seem to be any code changes between Queens and Rocky that would impact this issue.

OK, the situation is:
(1) the image has 'public' visibility, so as far as Glance is concerned, any authenticated user has access to it
(2) the Glance log shows that glance_store is getting a 404 from Cinder when it tries to fetch the image from the Cinder store
(3) but, you can see that the image is there by going around Glance's back and looking directly in Cinder
(4) and, the customer reports that a user with admin credentials *can* create a volume from the image

So I suspect a permissions problem here.  There are two possibilities:

(A) When Glance uses the Cinder driver for its image store, the operator has the option to set the account information for a user (usually the glance service account) who "owns" all the data in the backend store.  (This is different from the owner of any particular image; Glance keeps track of who owns which image in the glance database.)  If the account options are not set, then the glance_store cinder driver uses the user info in the context (that is, the tenant_id and user_id of the user who made the Cinder call to create the volume).  Since an admin typically has access to all data, and since someone with role:admin is usually recognized as an admin by most services, I think the reason the create-volume-from-image succeeds when an admin makes the call is that the glance_store cinder driver is configured to use the credentials in the context.

To see if this is the case, you need to look in the glance-api.conf in the [glance_store] section.  There will be a bunch of options with the 'cinder_' prefix.  The ones we're interested in are:
* cinder_store_auth_address
* cinder_store_user_name
* cinder_store_password
* cinder_store_project_name

The glance_store cinder driver is programmed such that if any one of the above is not specified, then the user information in the calling context is used instead.  As far as why this used to work in Newton but doesn't in Queens, maybe the config wasn't updated properly?  The code that does this was introduced in Mitaka and hasn't changed much.  (There was a change in between Newton and Queens to switch to using the keystoneauth1 library instead of using the keystoneclient, but I don't think that would have changed anything.)
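That fallback can be sketched as follows. The option names are the real cinder_store_* ones listed above; the helper and its call shape are invented for illustration and are not glance_store code:

```python
# Sketch of the credential selection described above: only when ALL of
# the cinder_store_* options are set does glance_store talk to Cinder
# as the dedicated service account; otherwise it reuses the identity of
# whoever made the call. The helper is hypothetical.

def pick_cinder_credentials(conf, context):
    required = ("cinder_store_auth_address", "cinder_store_user_name",
                "cinder_store_password", "cinder_store_project_name")
    if all(conf.get(opt) for opt in required):
        # Service account that "owns" all data in the backend store.
        return {"user": conf["cinder_store_user_name"],
                "project": conf["cinder_store_project_name"]}
    # Fallback: the caller's own user_id/tenant_id, which fails when
    # the backing volume lives in another tenant (this bug's scenario).
    return {"user": context["user_id"], "project": context["tenant_id"]}
```

With the fallback in effect, an admin caller happens to work (admin sees all volumes) while a plain _member_ in another tenant gets the 404, which matches the symptoms above.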

(B) The other possibility is on the Cinder side.  Since the glance_store cinder driver is being used by Glance, Glance needs to have permission to do the "volume:get" action in the *Cinder* policy.json or policy.yaml file.  So it could be that there's a weird rule for that action preventing the Glance user (who would be the user configured by the cinder_* options mentioned in (A)) from being able to get data out of Cinder.  I think this is pretty unlikely, but you might as well check.
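One way to rule out (B) is to inspect the rule guarding "volume:get" in Cinder's policy file. A minimal stdlib sketch, assuming the stock policy.json layout (the helper is illustrative; adjust the path per deployment):

```python
import json

def volume_get_rule(policy_text):
    """Return the rule string guarding 'volume:get' in a policy.json body."""
    policy = json.loads(policy_text)
    # A missing key (defaulting to "") or an empty string means the
    # action is not restricted to any particular role.
    return policy.get("volume:get", "")

# e.g.:
# with open("/etc/cinder/policy.json") as f:
#     print(volume_get_rule(f.read()))
```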

Comment 8 Andreas Karis 2019-05-08 15:51:49 UTC
Hi,

I think I can reproduce this issue in OSP 13 in my env, so I'll test your suggestions next and provide the data from my env.

-------------------

Here's my first test with OSP 13.

Apply steps from:
https://docs.openstack.org/cinder/latest/admin/blockstorage-volume-backed-image.html

Set the following settings:
~~~
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/cinder/etc/cinder/cinder.conf glance_store stores
Section not found: glance_store
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/glance_api/etc/glance/glance-api.conf glance_store stores
http,swift,cinder
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/glance_api/etc/glance/glance-api.conf DEFAULT show_multiple_locations
True
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/cinder/etc/cinder/cinder.conf DEFAULT allowed_direct_url_schemes
cinder
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/cinder/etc/cinder/cinder.conf DEFAULT image_upload_use_cinder_backend 
true
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/cinder/etc/cinder/cinder.conf DEFAULT image_upload_use_internal_tenant 
true
~~~
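The same options can be checked without crudini using Python's stdlib configparser. A small sketch; the helper is illustrative, and the paths are the containerized ones from this environment:

```python
import configparser

def get_opt(conf_text, section, option, default=None):
    """Look up one option from an ini-style config body."""
    # interpolation=None: oslo config files may contain raw '%' signs
    # that would otherwise trip ConfigParser's interpolation.
    cp = configparser.ConfigParser(interpolation=None)
    cp.read_string(conf_text)
    # fallback covers both a missing option and a missing section,
    # mirroring crudini's "Section not found" case above.
    return cp.get(section, option, fallback=default)

# e.g.:
# text = open("/var/lib/config-data/puppet-generated/glance_api"
#             "/etc/glance/glance-api.conf").read()
# get_opt(text, "glance_store", "stores")   # "http,swift,cinder" here
```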

Restart glance and cinder containers:
~~~
docker ps | egrep 'cinder|glance' | awk '{print $NF}' | xargs docker restart
~~~

Create a cinder volume from an existing glance image:
~~~
(overcloud) [stack@undercloud-1 ~]$ glance image-list | grep cirros
| a840282f-b914-4ae4-8185-fd3415b4452b | cirros                                 |
(overcloud) [stack@undercloud-1 ~]$ cinder create --image-id a840282f-b914-4ae4-8185-fd3415b4452b --display_name=bootable_volume 1
+--------------------------------+--------------------------------------+
| Property                       | Value                                |
+--------------------------------+--------------------------------------+
| attachments                    | []                                   |
| availability_zone              | nova                                 |
| bootable                       | false                                |
| consistencygroup_id            | None                                 |
| created_at                     | 2019-05-07T21:33:05.000000           |
| description                    | None                                 |
| encrypted                      | False                                |
| id                             | 2e1cd04b-62bd-4a37-892c-8731c0019568 |
| metadata                       | {}                                   |
| migration_status               | None                                 |
| multiattach                    | False                                |
| name                           | bootable_volume                      |
| os-vol-host-attr:host          | None                                 |
| os-vol-mig-status-attr:migstat | None                                 |
| os-vol-mig-status-attr:name_id | None                                 |
| os-vol-tenant-attr:tenant_id   | a584030bcca840b696b172842b3d1c8c     |
| replication_status             | None                                 |
| size                           | 1                                    |
| snapshot_id                    | None                                 |
| source_volid                   | None                                 |
| status                         | creating                             |
| updated_at                     | None                                 |
| user_id                        | 5f57f23fcf104b41a44738c769f0830c     |
| volume_type                    | None                                 |
+--------------------------------+--------------------------------------+
~~~

Create a glance image from the volume:
~~~
(overcloud) [stack@undercloud-1 ~]$ openstack image create --disk-format raw --container-format bare bootable_volume_image
(...)
| id               | 4173209e-4fe5-4cb5-9dda-975821493f45                 |
(...)
~~~

~~~
(overcloud) [stack@undercloud-1 ~]$ glance location-add 4173209e-4fe5-4cb5-9dda-975821493f45 --url cinder://2e1cd04b-62bd-4a37-892c-8731c0019568
(...)   |
| direct_url       | cinder://2e1cd04b-62bd-4a37-892c-8731c0019568                              |
| disk_format      | raw                                                                        |
| file             | /v2/images/4173209e-4fe5-4cb5-9dda-975821493f45/file                       |
| id               | 4173209e-4fe5-4cb5-9dda-975821493f45                                       |
| locations        | [{"url": "cinder://2e1cd04b-62bd-4a37-892c-8731c0019568", "metadata": {}}] |
(...)
(overcloud) [stack@undercloud-1 ~]$ openstack image set 4173209e-4fe5-4cb5-9dda-975821493f45 --public
(overcloud) [stack@undercloud-1 ~]$ glance image-show 4173209e-4fe5-4cb5-9dda-975821493f45
+------------------+----------------------------------------------------------------------------+
| Property         | Value                                                                      |
+------------------+----------------------------------------------------------------------------+
| checksum         | None                                                                       |
| container_format | bare                                                                       |
| created_at       | 2019-05-07T22:07:07Z                                                       |
| direct_url       | cinder://2e1cd04b-62bd-4a37-892c-8731c0019568                              |
| disk_format      | raw                                                                        |
| id               | 4173209e-4fe5-4cb5-9dda-975821493f45                                       |
| locations        | [{"url": "cinder://2e1cd04b-62bd-4a37-892c-8731c0019568", "metadata": {}}] |
| min_disk         | 0                                                                          |
| min_ram          | 0                                                                          |
| name             | bootable_volume_image                                                      |
| owner            | a584030bcca840b696b172842b3d1c8c                                           |
| protected        | False                                                                      |
| size             | 1073741824                                                                 |
| status           | active                                                                     |
| tags             | []                                                                         |
| updated_at       | 2019-05-07T22:32:50Z                                                       |
| virtual_size     | None                                                                       |
| visibility       | public                                                                     |
+------------------+----------------------------------------------------------------------------+
~~~

Create a user, test project, assign user as _member_ to test project:
~~~
(overcloud) [stack@undercloud-1 ~]$ openstack project create test
+-------------+----------------------------------+
| Field       | Value                            |
+-------------+----------------------------------+
| description |                                  |
| domain_id   | default                          |
| enabled     | True                             |
| id          | 7bfedc571c954949b7d27da0ef9efa4d |
| is_domain   | False                            |
| name        | test                             |
| parent_id   | default                          |
| tags        | []                               |
+-------------+----------------------------------+
(overcloud) [stack@undercloud-1 ~]$ openstack user create test --password test --project test
+---------------------+----------------------------------+
| Field               | Value                            |
+---------------------+----------------------------------+
| default_project_id  | 7bfedc571c954949b7d27da0ef9efa4d |
| domain_id           | default                          |
| enabled             | True                             |
| id                  | ab94920254ce4ea091bb9cfe4ecd3e5a |
| name                | test                             |
| options             | {}                               |
| password_expires_at | None                             |
+---------------------+----------------------------------+
(overcloud) [stack@undercloud-1 ~]$ openstack role add --user ab94920254ce4ea091bb9cfe4ecd3e5a --project 7bfedc571c954949b7d27da0ef9efa4d 9fe2ff9ee4384b1894a90878d3e92bab
~~~

Create volume from public glance image:
~~~
(overcloud) [stack@undercloud-1 ~]$ source testrc
(overcloud) [stack@undercloud-1 ~]$ nova list
+----+------+--------+------------+-------------+----------+
| ID | Name | Status | Task State | Power State | Networks |
+----+------+--------+------------+-------------+----------+
+----+------+--------+------------+-------------+----------+
(overcloud) [stack@undercloud-1 ~]$ glance image-list
+--------------------------------------+-----------------------+
| ID                                   | Name                  |
+--------------------------------------+-----------------------+
| 4173209e-4fe5-4cb5-9dda-975821493f45 | bootable_volume_image |
+--------------------------------------+-----------------------+
(overcloud) [stack@undercloud-1 ~]$ cinder list
+----+--------+------+------+-------------+----------+-------------+
| ID | Status | Name | Size | Volume Type | Bootable | Attached to |
+----+--------+------+------+-------------+----------+-------------+
+----+--------+------+------+-------------+----------+-------------+
(overcloud) [stack@undercloud-1 ~]$ cinder create --image-id 4173209e-4fe5-4cb5-9dda-975821493f45 --display_name=test 1
+------------------------------+--------------------------------------+
| Property                     | Value                                |
+------------------------------+--------------------------------------+
| attachments                  | []                                   |
| availability_zone            | nova                                 |
| bootable                     | false                                |
| consistencygroup_id          | None                                 |
| created_at                   | 2019-05-07T22:37:40.000000           |
| description                  | None                                 |
| encrypted                    | False                                |
| id                           | 2ade3397-7307-49d9-bd83-d6bc8a2dccd8 |
| metadata                     | {}                                   |
| multiattach                  | False                                |
| name                         | test                                 |
| os-vol-tenant-attr:tenant_id | 7bfedc571c954949b7d27da0ef9efa4d     |
| replication_status           | None                                 |
| size                         | 1                                    |
| snapshot_id                  | None                                 |
| source_volid                 | None                                 |
| status                       | creating                             |
| updated_at                   | None                                 |
| user_id                      | ab94920254ce4ea091bb9cfe4ecd3e5a     |
| volume_type                  | None                                 |
+------------------------------+--------------------------------------+
(overcloud) [stack@undercloud-1 ~]$ watch cinder list
(overcloud) [stack@undercloud-1 ~]$ cinder show 2ade3397-7307-49d9-bd83-d6bc8a2dccd8
+------------------------------+--------------------------------------+
| Property                     | Value                                |
+------------------------------+--------------------------------------+
| attached_servers             | []                                   |
| attachment_ids               | []                                   |
| availability_zone            | nova                                 |
| bootable                     | false                                |
| consistencygroup_id          | None                                 |
| created_at                   | 2019-05-07T22:37:40.000000           |
| description                  | None                                 |
| encrypted                    | False                                |
| id                           | 2ade3397-7307-49d9-bd83-d6bc8a2dccd8 |
| metadata                     |                                      |
| multiattach                  | False                                |
| name                         | test                                 |
| os-vol-tenant-attr:tenant_id | 7bfedc571c954949b7d27da0ef9efa4d     |
| replication_status           | None                                 |
| size                         | 1                                    |
| snapshot_id                  | None                                 |
| source_volid                 | None                                 |
| status                       | error                                |
| updated_at                   | 2019-05-07T22:37:50.000000           |
| user_id                      | ab94920254ce4ea091bb9cfe4ecd3e5a     |
~~~

This yields the following ERROR messages in the logs:
~~~
[root@overcloud-controller-0 cinder]# pwd
/var/log/containers/cinder
[root@overcloud-controller-0 cinder]#  grep ERROR cinder-volume.log
(...)
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager Traceback (most recent call last):
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager     result = task.execute(**arguments)
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 1020, in execute
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager     **volume_spec)
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 929, in _create_from_image
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager     image_service)
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 812, in _create_from_image_cache_or_download
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager     backend_name) as tmp_image:
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager   File "/usr/lib64/python2.7/contextlib.py", line 17, in __enter__
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager     return self.gen.next()
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 764, in fetch
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager     fetch_verify_image(context, image_service, image_id, tmp)
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 378, in fetch_verify_image
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager     None, None)
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 304, in fetch
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager     tpool.Proxy(image_file))
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/glance.py", line 333, in download
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager     for chunk in image_chunks:
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager TypeError: 'NoneType' object is not iterable
2019-05-07 22:37:45.416 41 ERROR cinder.volume.manager 
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server [req-b69ec362-a582-4f16-bbca-843d854e05a8 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Exception during message handling: TypeError: 'NoneType' object is not iterable
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "<string>", line 2, in create_volume
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/objects/cleanable.py", line 207, in wrapper
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 690, in create_volume
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     _run_flow()
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 682, in _run_flow
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     flow_engine.run()
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     for _state in self.run_iter(timeout=timeout):
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     failure.Failure.reraise_if_any(er_failures)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 336, in reraise_if_any
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     failures[0].reraise()
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 343, in reraise
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     six.reraise(*self._exc_info)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     result = task.execute(**arguments)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 1020, in execute
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     **volume_spec)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 929, in _create_from_image
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     image_service)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 812, in _create_from_image_cache_or_download
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     backend_name) as tmp_image:
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python2.7/contextlib.py", line 17, in __enter__
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     return self.gen.next()
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 764, in fetch
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     fetch_verify_image(context, image_service, image_id, tmp)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 378, in fetch_verify_image
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     None, None)
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 304, in fetch
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     tpool.Proxy(image_file))
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/glance.py", line 333, in download
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server     for chunk in image_chunks:
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server TypeError: 'NoneType' object is not iterable
2019-05-07 22:37:46.846 41 ERROR oslo_messaging.rpc.server 
2019-05-07 22:37:47.070 41 DEBUG oslo_db.sqlalchemy.engines [req-b69ec362-a582-4f16-bbca-843d854e05a8 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python2.7/site-packages/oslo_db/sqlalchemy/engines.py:290
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager Traceback (most recent call last):
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager     result = task.execute(**arguments)
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 1020, in execute
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager     **volume_spec)
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 929, in _create_from_image
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager     image_service)
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 812, in _create_from_image_cache_or_download
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager     backend_name) as tmp_image:
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager   File "/usr/lib64/python2.7/contextlib.py", line 17, in __enter__
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager     return self.gen.next()
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 764, in fetch
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager     fetch_verify_image(context, image_service, image_id, tmp)
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 378, in fetch_verify_image
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager     None, None)
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 304, in fetch
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager     tpool.Proxy(image_file))
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/glance.py", line 333, in download
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager     for chunk in image_chunks:
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager TypeError: 'NoneType' object is not iterable
2019-05-07 22:37:47.457 41 ERROR cinder.volume.manager 
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server [req-b69ec362-a582-4f16-bbca-843d854e05a8 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Exception during message handling: TypeError: 'NoneType' object is not iterable
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "<string>", line 2, in create_volume
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/objects/cleanable.py", line 207, in wrapper
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 690, in create_volume
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     _run_flow()
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 682, in _run_flow
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     flow_engine.run()
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     for _state in self.run_iter(timeout=timeout):
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     failure.Failure.reraise_if_any(er_failures)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 336, in reraise_if_any
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     failures[0].reraise()
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 343, in reraise
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     six.reraise(*self._exc_info)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     result = task.execute(**arguments)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 1020, in execute
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     **volume_spec)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 929, in _create_from_image
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     image_service)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 812, in _create_from_image_cache_or_download
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     backend_name) as tmp_image:
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python2.7/contextlib.py", line 17, in __enter__
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     return self.gen.next()
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 764, in fetch
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     fetch_verify_image(context, image_service, image_id, tmp)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 378, in fetch_verify_image
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     None, None)
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 304, in fetch
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     tpool.Proxy(image_file))
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/glance.py", line 333, in download
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server     for chunk in image_chunks:
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server TypeError: 'NoneType' object is not iterable
2019-05-07 22:37:49.042 41 ERROR oslo_messaging.rpc.server 
2019-05-07 22:37:49.180 41 DEBUG oslo_db.sqlalchemy.engines [req-b69ec362-a582-4f16-bbca-843d854e05a8 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python2.7/site-packages/oslo_db/sqlalchemy/engines.py:290
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager Traceback (most recent call last):
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager     result = task.execute(**arguments)
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 1020, in execute
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager     **volume_spec)
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 929, in _create_from_image
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager     image_service)
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 812, in _create_from_image_cache_or_download
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager     backend_name) as tmp_image:
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager   File "/usr/lib64/python2.7/contextlib.py", line 17, in __enter__
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager     return self.gen.next()
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 764, in fetch
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager     fetch_verify_image(context, image_service, image_id, tmp)
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 378, in fetch_verify_image
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager     None, None)
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 304, in fetch
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager     tpool.Proxy(image_file))
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager   File "/usr/lib/python2.7/site-packages/cinder/image/glance.py", line 333, in download
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager     for chunk in image_chunks:
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager TypeError: 'NoneType' object is not iterable
2019-05-07 22:37:49.803 41 ERROR cinder.volume.manager 
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server [req-b69ec362-a582-4f16-bbca-843d854e05a8 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Exception during message handling: TypeError: 'NoneType' object is not iterable
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "<string>", line 2, in create_volume
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/objects/cleanable.py", line 207, in wrapper
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 690, in create_volume
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     _run_flow()
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 682, in _run_flow
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     flow_engine.run()
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     for _state in self.run_iter(timeout=timeout):
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     failure.Failure.reraise_if_any(er_failures)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 336, in reraise_if_any
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     failures[0].reraise()
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 343, in reraise
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     six.reraise(*self._exc_info)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     result = task.execute(**arguments)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 1020, in execute
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     **volume_spec)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 929, in _create_from_image
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     image_service)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 812, in _create_from_image_cache_or_download
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     backend_name) as tmp_image:
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python2.7/contextlib.py", line 17, in __enter__
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     return self.gen.next()
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 764, in fetch
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     fetch_verify_image(context, image_service, image_id, tmp)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 378, in fetch_verify_image
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     None, None)
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 304, in fetch
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     tpool.Proxy(image_file))
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/glance.py", line 333, in download
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server     for chunk in image_chunks:
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server TypeError: 'NoneType' object is not iterable
2019-05-07 22:37:51.251 41 ERROR oslo_messaging.rpc.server 
[root@overcloud-controller-0 cinder]# 
~~~

And in glance:
~~~
[root@overcloud-controller-0 glance]# pwd
/var/log/containers/glance
[root@overcloud-controller-0 glance]# grep ERROR *
api.log:2019-05-07 22:06:33.060 1 DEBUG glance.common.config [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python2.7/site-packages/oslo_config/cfg.py:2894
api.log:2019-05-07 22:07:07.718 25 DEBUG oslo_db.sqlalchemy.engines [req-67b83b94-f413-45b6-8266-2357b1da558c 5f57f23fcf104b41a44738c769f0830c a584030bcca840b696b172842b3d1c8c - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python2.7/site-packages/oslo_db/sqlalchemy/engines.py:290
api.log:2019-05-07 22:10:20.902 1 DEBUG glance.common.config [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python2.7/site-packages/oslo_config/cfg.py:2894
api.log:2019-05-07 22:11:15.436 26 DEBUG oslo_db.sqlalchemy.engines [req-c83ca189-f03f-4edd-8727-94d2040264f4 5f57f23fcf104b41a44738c769f0830c a584030bcca840b696b172842b3d1c8c - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python2.7/site-packages/oslo_db/sqlalchemy/engines.py:290
api.log:2019-05-07 22:37:45.349 26 ERROR glance_store._drivers.cinder [req-13144ca5-903e-4ed0-bba4-b1b49c2acb8e ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Failed to get image size due to volume can not be found: 2e1cd04b-62bd-4a37-892c-8731c0019568: NotFound: Volume 2e1cd04b-62bd-4a37-892c-8731c0019568 could not be found. (HTTP 404) (Request-ID: req-588ce795-3166-4d85-8b5d-8f430f1aefec)
api.log:2019-05-07 22:37:45.351 26 ERROR glance.location [req-13144ca5-903e-4ed0-bba4-b1b49c2acb8e ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Glance tried all active locations to get data for image 4173209e-4fe5-4cb5-9dda-975821493f45 but all have failed.: NotFound: Failed to get image size due to volume can not be found: 2e1cd04b-62bd-4a37-892c-8731c0019568
api.log:2019-05-07 22:37:47.442 26 ERROR glance_store._drivers.cinder [req-e62f850b-93ea-47ba-9b3d-b61058e94747 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Failed to get image size due to volume can not be found: 2e1cd04b-62bd-4a37-892c-8731c0019568: NotFound: Volume 2e1cd04b-62bd-4a37-892c-8731c0019568 could not be found. (HTTP 404) (Request-ID: req-ff04c5ed-6b29-4145-bd84-791ad6138b45)
api.log:2019-05-07 22:37:47.443 26 ERROR glance.location [req-e62f850b-93ea-47ba-9b3d-b61058e94747 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Glance tried all active locations to get data for image 4173209e-4fe5-4cb5-9dda-975821493f45 but all have failed.: NotFound: Failed to get image size due to volume can not be found: 2e1cd04b-62bd-4a37-892c-8731c0019568
api.log:2019-05-07 22:37:49.788 26 ERROR glance_store._drivers.cinder [req-09b5f87f-9ffc-4b0e-8c19-3dd0f27ba0a1 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Failed to get image size due to volume can not be found: 2e1cd04b-62bd-4a37-892c-8731c0019568: NotFound: Volume 2e1cd04b-62bd-4a37-892c-8731c0019568 could not be found. (HTTP 404) (Request-ID: req-7a9ca608-1d5c-4df3-85f5-4cc9c5c1267c)
api.log:2019-05-07 22:37:49.789 26 ERROR glance.location [req-09b5f87f-9ffc-4b0e-8c19-3dd0f27ba0a1 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Glance tried all active locations to get data for image 4173209e-4fe5-4cb5-9dda-975821493f45 but all have failed.: NotFound: Failed to get image size due to volume can not be found: 2e1cd04b-62bd-4a37-892c-8731c0019568
~~~
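The cinder-store failures above explain the TypeError in the cinder log: once Glance has exhausted all active locations for the image, the client ends up handing Cinder a None body instead of an iterable of chunks, and cinder/image/glance.py iterates it blindly. A minimal Python sketch of that failure mode (function names are illustrative, not the actual Glance API):

```python
def download(image_chunks):
    # Mirrors the pattern in cinder/image/glance.py: iterate whatever the
    # image service returned, assuming it is an iterable of byte chunks.
    data = b""
    for chunk in image_chunks:  # raises TypeError when image_chunks is None
        data += chunk
    return data


def safe_download(image_chunks):
    # Defensive variant: surface a missing body as an explicit error
    # instead of letting the bare TypeError bubble up.
    if image_chunks is None:
        raise IOError("image has no accessible data (all locations failed)")
    return download(image_chunks)
```

With a valid chunk list both functions return the concatenated bytes; with None, `download` reproduces the exact "'NoneType' object is not iterable" seen in the traceback, while `safe_download` fails with a message that points at the real cause.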


Comment 9 Andreas Karis 2019-05-08 16:39:57 UTC
~~~
To see if this is the case, you need to look in the glance-api.conf in the [glance_store] section.  There will be a bunch of options with the 'cinder_' prefix.  The ones we're interested in are:
* cinder_store_auth_address
* cinder_store_user_name
* cinder_store_password
* cinder_store_project_name
~~~
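As a reference point, a [glance_store] section with the cinder store configured might look like the following sketch (all values are placeholders, not taken from this environment):

~~~
[glance_store]
stores = cinder,file,http
default_store = cinder
cinder_store_auth_address = https://192.0.2.10:5000/v3
cinder_store_user_name = cinder
cinder_store_password = <service password>
cinder_store_project_name = service
~~~

When these options are unset (as the grep below shows for this deployment), glance_store falls back to the requesting user's own credentials, which cannot see a backing volume owned by another tenant.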

~~~
[root@overcloud-controller-0 ~]# grep cinder_ /var/lib/config-data/puppet-generated/glance_api/etc/glance/* | grep -v ':#'
[root@overcloud-controller-0 ~]# 
~~~

Comment 10 Andreas Karis 2019-05-08 16:43:02 UTC
And:
~~~
[root@overcloud-controller-0 ~]# find /var/lib/config-data/puppet-generated/cinder -name '*policy*'
[root@overcloud-controller-0 ~]# docker ps | grep cinder | awk '{print $NF}' | tr '\n' ' '
openstack-cinder-volume-docker-0 cinder_api_cron cinder_scheduler cinder_api [root@overcloud-controller-0 ~]# docker exec -it openstack-cinder-volume-docker-0 oslopolicy-policy-generator --namespace cinder | egrep 'volume:create'
"volume:create_volume_metadata": "rule:admin_or_owner"
"volume:create_from_image": ""
"volume:create_snapshot": "rule:admin_or_owner"
"volume:create": ""
"volume:create_transfer": "rule:admin_or_owner"
[root@overcloud-controller-0 ~]# 
~~~

Comment 11 Andreas Karis 2019-05-08 16:48:25 UTC
For reference, in a default OSP 10 environment I see a policy.json file with:
~~~
[root@control-0 ~]# grep volume:create /etc/cinder/policy.json 
    "volume:create": "",
    "volume:create_volume_metadata": "rule:admin_or_owner",
    "volume:create_snapshot": "rule:admin_or_owner",
    "volume:create_transfer": "rule:admin_or_owner",
[root@control-0 ~]# 
~~~

Comment 12 Andreas Karis 2019-05-08 16:58:42 UTC
I copied the policy file from my OSP 10 to OSP 13 and restarted the containers, but the result is still the same:

[root@overcloud-controller-0 ~]# vi /var/lib/config-data/puppet-generated/cinder/etc/cinder/policy.json
[root@overcloud-controller-0 ~]# docker ps | grep cinder | awk '{print $NF}'
openstack-cinder-volume-docker-0
cinder_api_cron
cinder_scheduler
cinder_api
[root@overcloud-controller-0 ~]# docker ps | grep cinder | awk '{print $NF}' | xargs docker restart
openstack-cinder-volume-docker-0
cinder_api_cron
cinder_scheduler
cinder_api


(overcloud) [stack@undercloud-1 ~]$  cinder create --image-id 4173209e-4fe5-4cb5-9dda-975821493f45 --display_name=test 1
cinder list
+------------------------------+--------------------------------------+
| Property                     | Value                                |
+------------------------------+--------------------------------------+
| attachments                  | []                                   |
| availability_zone            | nova                                 |
| bootable                     | false                                |
| consistencygroup_id          | None                                 |
| created_at                   | 2019-05-08T16:54:04.000000           |
| description                  | None                                 |
| encrypted                    | False                                |
| id                           | 35f51468-90be-461a-8d0f-5e4bb39ff97a |
| metadata                     | {}                                   |
| multiattach                  | False                                |
| name                         | test                                 |
| os-vol-tenant-attr:tenant_id | 7bfedc571c954949b7d27da0ef9efa4d     |
| replication_status           | None                                 |
| size                         | 1                                    |
| snapshot_id                  | None                                 |
| source_volid                 | None                                 |
| status                       | creating                             |
| updated_at                   | None                                 |
| user_id                      | ab94920254ce4ea091bb9cfe4ecd3e5a     |
| volume_type                  | None                                 |
+------------------------------+--------------------------------------+
(overcloud) [stack@undercloud-1 ~]$ cinder list
+--------------------------------------+----------+------+------+-------------+----------+-------------+
| ID                                   | Status   | Name | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+----------+------+------+-------------+----------+-------------+
| 2ade3397-7307-49d9-bd83-d6bc8a2dccd8 | error    | test | 1    | -           | false    |             |
| 35f51468-90be-461a-8d0f-5e4bb39ff97a | creating | test | 1    | -           | false    |             |
+--------------------------------------+----------+------+------+-------------+----------+-------------+
(overcloud) [stack@undercloud-1 ~]$ cinder list
+--------------------------------------+--------+------+------+-------------+----------+-------------+
| ID                                   | Status | Name | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+--------+------+------+-------------+----------+-------------+
| 2ade3397-7307-49d9-bd83-d6bc8a2dccd8 | error  | test | 1    | -           | false    |             |
| 35f51468-90be-461a-8d0f-5e4bb39ff97a | error  | test | 1    | -           | false    |             |
+--------------------------------------+--------+------+------+-------------+----------+-------------+

Comment 13 Brian Rosmaita 2019-05-08 17:33:57 UTC
Andreas: thanks for the info.  I don't think it's a policy problem.  Could you please do the following:

(1) verify that you can 'source' admin credentials and create a volume from image 4173209e-4fe5-4cb5-9dda-975821493f45

(2) if (1) succeeds, try adding the cinder_* config to the glance-api.conf file in the [glance_store] section.

To get the values, use the shell where you sourced the admin credentials.

a) cinder_store_auth_address - do 'openstack catalog show identity' to get the endpoint (or look in clouds.yaml).
It should look something like 'https://192.168.122.4/identity'. I think you want to set it to use v3, so you'd use 'https://192.168.122.4/identity/v3' as the value

b) cinder_store_user_name - look in /etc/cinder/cinder.conf in the [DEFAULT] section, there should be a line like
cinder_internal_tenant_user_id = <user_id>
Do 'openstack user show <user_id>' to find the name.

c) cinder_store_project_name - look in cinder.conf for cinder_internal_tenant_project_id, and then do
'openstack project show <project_id>' to find the name.

d) cinder_store_password - yeah ... I think what you're going to find is that the username for (b) is 'cinder' and that the project name for (c) is 'service'. Look in the cinder.conf file in the [keystone_authtoken] section; if the 'username' is 'cinder' (or whatever you found for (b)) and the 'project_name' is 'service' (or whatever you found for (c)), then use the value that's set for the 'password' config option.

Then restart glance-api, open a new shell, source your testrc to get regular user credentials into the environment, and then try to create a volume from image 4173209e-4fe5-4cb5-9dda-975821493f45

Comment 14 Andreas Karis 2019-05-08 17:53:34 UTC
1)

~~~
(overcloud) [stack@undercloud-1 ~]$ source overcloudrc
(overcloud) [stack@undercloud-1 ~]$  cinder create --image-id 4173209e-4fe5-4cb5-9dda-975821493f45 --display_name=test 1
+--------------------------------+--------------------------------------+
| Property                       | Value                                |
+--------------------------------+--------------------------------------+
| attachments                    | []                                   |
| availability_zone              | nova                                 |
| bootable                       | false                                |
| consistencygroup_id            | None                                 |
| created_at                     | 2019-05-08T16:57:46.000000           |
| description                    | None                                 |
| encrypted                      | False                                |
| id                             | a1dcceb5-0214-4e8e-87c6-e02a90239c26 |
| metadata                       | {}                                   |
| migration_status               | None                                 |
| multiattach                    | False                                |
| name                           | test                                 |
| os-vol-host-attr:host          | None                                 |
| os-vol-mig-status-attr:migstat | None                                 |
| os-vol-mig-status-attr:name_id | None                                 |
| os-vol-tenant-attr:tenant_id   | a584030bcca840b696b172842b3d1c8c     |
| replication_status             | None                                 |
| size                           | 1                                    |
| snapshot_id                    | None                                 |
| source_volid                   | None                                 |
| status                         | creating                             |
| updated_at                     | None                                 |
| user_id                        | 5f57f23fcf104b41a44738c769f0830c     |
| volume_type                    | None                                 |
+--------------------------------+--------------------------------------+
(overcloud) [stack@undercloud-1 ~]$ cinder list
+--------------------------------------+-----------+-----------------+------+-------------+----------+-------------+
| ID                                   | Status    | Name            | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+-----------+-----------------+------+-------------+----------+-------------+
| 2e1cd04b-62bd-4a37-892c-8731c0019568 | available | bootable_volume | 1    | -           | true     |             |
| a1dcceb5-0214-4e8e-87c6-e02a90239c26 | available | test            | 1    | -           | true     |             |
+--------------------------------------+-----------+-----------------+------+-------------+----------+-------------+
(overcloud) [stack@undercloud-1 ~]$ cinder delete a1dcceb5-0214-4e8e-87c6-e02a90239c26
Request to delete volume a1dcceb5-0214-4e8e-87c6-e02a90239c26 has been accepted.
~~~

2) I get stuck here: ;-)

~~~
(overcloud) [stack@undercloud-1 ~]$ openstack catalog show identity
+-----------+-------------------------------------+
| Field     | Value                               |
+-----------+-------------------------------------+
| endpoints | regionOne                           |
|           |   public: http://10.0.0.12:5000     |
|           | regionOne                           |
|           |   admin: http://192.168.24.18:35357 |
|           | regionOne                           |
|           |   internal: http://172.16.2.14:5000 |
|           |                                     |
| id        | 9757786bc0514a7fba14dd71d897f72f    |
| name      | keystone                            |
| type      | identity                            |
+-----------+-------------------------------------+
~~~


~~~
[root@overcloud-controller-0 ~]# grep cinder_internal_tenant_user_id /var/lib/config-data/puppet-generated/cinder/etc/cinder/cinder.conf 
#cinder_internal_tenant_user_id = <None>
~~~

Comment 15 Andreas Karis 2019-05-08 17:55:03 UTC
~~~
[root@overcloud-controller-0 ~]# grep cinder_internal_tenant_project_id /var/lib/config-data/puppet-generated/cinder/etc/cinder/cinder.conf 
#cinder_internal_tenant_project_id = <None>
~~~

Comment 16 Andreas Karis 2019-05-08 18:05:16 UTC
Because in cinder.conf I found:
~~~
auth_url=http://172.16.2.14:5000
username=cinder
password=H9pvbYaz9eVJfmeGkdP7A98pD
user_domain_name=Default
project_name=service
project_domain_name=Default
~~~

I'm going to use the same credentials:
~~~
(overcloud) [stack@undercloud-1 ~]$ openstack user list
+----------------------------------+-------------------------+
| ID                               | Name                    |
+----------------------------------+-------------------------+
| 05ed6385f7584c518133821512fa6192 | glance                  |
| 085ed337aebb474a9d1c661372b92ff6 | cinder                  |
| 20109638f01145dc910f66c21671a9e6 | placement               |
| 3b2ff8becdf54587b3a0656dd9975a93 | neutron                 |
| 464b63e39c4f4901b4764c7903e14e63 | swift                   |
| 4e64f209094d42528daaebdfd4d717b3 | aodh                    |
| 4f21fdef509e42f4bad0d7f83d72d8c2 | heat                    |
| 553c844efb37441da4e9a1cfc933aca0 | octavia                 |
| 5f57f23fcf104b41a44738c769f0830c | admin                   |
| 611bd3ead59346518c75c28cf64305fa | nova                    |
| 8799e138543d4bc59c1ee3bf09f58a48 | panko                   |
| 9c99d3f5a3c349d7b906ca8777eda741 | ceilometer              |
| a08ed76b3e2f4559950f5d3f0579eb37 | heat-cfn                |
| ab94920254ce4ea091bb9cfe4ecd3e5a | test                    |
| d11376ac98194dbd9669ec6753fb89bf | gnocchi                 |
| d9136eaad04f4cbf84ae019c8a270094 | heat_stack_domain_admin |
+----------------------------------+-------------------------+
(overcloud) [stack@undercloud-1 ~]$ openstack project list
+----------------------------------+---------+
| ID                               | Name    |
+----------------------------------+---------+
| 7bfedc571c954949b7d27da0ef9efa4d | test    |
| a584030bcca840b696b172842b3d1c8c | admin   |
| fb8d792da60f4aebb3252a1cc97ace5e | service |
+----------------------------------+---------+
~~~

So I updated cinder.conf with:
~~~
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/cinder/etc/cinder/cinder.conf DEFAULT cinder_internal_tenant_user_id
085ed337aebb474a9d1c661372b92ff6
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/cinder/etc/cinder/cinder.conf DEFAULT cinder_internal_tenant_project_id
fb8d792da60f4aebb3252a1cc97ace5e
[root@overcloud-controller-0 ~]# 
~~~
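For the record, those two options were written with 'crudini --set' against the same path. A minimal re-runnable sketch of what that amounts to, using a scratch file rather than the real cinder.conf:

~~~shell
# Scratch stand-in for the real cinder.conf; in the deployment this was
#   crudini --set /var/lib/config-data/puppet-generated/cinder/etc/cinder/cinder.conf \
#       DEFAULT <key> <value>
CONF=$(mktemp)
printf '[DEFAULT]\n' > "$CONF"

# A naive append works here because the keys are not yet set in the scratch file.
for kv in 'cinder_internal_tenant_user_id=085ed337aebb474a9d1c661372b92ff6' \
          'cinder_internal_tenant_project_id=fb8d792da60f4aebb3252a1cc97ace5e'; do
  printf '%s\n' "$kv" >> "$CONF"
done

grep '^cinder_internal' "$CONF"
~~~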

And glance with:
~~~
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/glance_api/etc/glance/glance-api.conf glance_store cinder_store_auth_address
http://172.16.2.14:5000
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/glance_api/etc/glance/glance-api.conf glance_store cinder_store_user_name
cinder
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/glance_api/etc/glance/glance-api.conf glance_store cinder_store_project_name
service
[root@overcloud-controller-0 ~]# crudini --get /var/lib/config-data/puppet-generated/glance_api/etc/glance/glance-api.conf glance_store cinder_store_password
H9pvbYaz9eVJfmeGkdP7A98pD
~~~
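Since the glance_store credentials are supposed to mirror cinder's [keystone_authtoken] entries, a quick consistency check along these lines can catch typos. This is a sketch only: the sample files stand in for the two real config paths, and the flat (section-less) layout is an assumption for brevity:

~~~shell
# Stand-ins for cinder.conf [keystone_authtoken] and glance-api.conf [glance_store]
cat > /tmp/ks.conf <<'EOF'
username=cinder
project_name=service
EOF
cat > /tmp/gs.conf <<'EOF'
cinder_store_user_name=cinder
cinder_store_project_name=service
EOF

# val KEY FILE -- print the value of KEY from a flat key=value file
val() { awk -F= -v k="$1" '$1==k{print $2}' "$2"; }

[ "$(val username /tmp/ks.conf)" = "$(val cinder_store_user_name /tmp/gs.conf)" ] \
  && [ "$(val project_name /tmp/ks.conf)" = "$(val cinder_store_project_name /tmp/gs.conf)" ] \
  && echo "glance_store credentials match keystone_authtoken"
~~~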

And restarted the services:
~~~
[root@overcloud-controller-0 ~]# docker ps | egrep 'cinder|glance' | awk '{print $NF}' | xargs docker restart
openstack-cinder-volume-docker-0
cinder_api_cron
glance_api
cinder_scheduler
cinder_api
~~~

~~~
(overcloud) [stack@undercloud-1 ~]$ . overcloudrc 
(overcloud) [stack@undercloud-1 ~]$  cinder create --image-id 4173209e-4fe5-4cb5-9dda-975821493f45 --display_name=test 1
+--------------------------------+--------------------------------------+
| Property                       | Value                                |
+--------------------------------+--------------------------------------+
| attachments                    | []                                   |
| availability_zone              | nova                                 |
| bootable                       | false                                |
| consistencygroup_id            | None                                 |
| created_at                     | 2019-05-08T18:04:06.000000           |
| description                    | None                                 |
| encrypted                      | False                                |
| id                             | ebe93e4f-961f-4139-8ab0-9bd72031ee2e |
| metadata                       | {}                                   |
| migration_status               | None                                 |
| multiattach                    | False                                |
| name                           | test                                 |
| os-vol-host-attr:host          | None                                 |
| os-vol-mig-status-attr:migstat | None                                 |
| os-vol-mig-status-attr:name_id | None                                 |
| os-vol-tenant-attr:tenant_id   | a584030bcca840b696b172842b3d1c8c     |
| replication_status             | None                                 |
| size                           | 1                                    |
| snapshot_id                    | None                                 |
| source_volid                   | None                                 |
| status                         | creating                             |
| updated_at                     | None                                 |
| user_id                        | 5f57f23fcf104b41a44738c769f0830c     |
| volume_type                    | None                                 |
+--------------------------------+--------------------------------------+
(overcloud) [stack@undercloud-1 ~]$ cinder list
+--------------------------------------+-----------+-----------------+------+-------------+----------+-------------+
| ID                                   | Status    | Name            | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+-----------+-----------------+------+-------------+----------+-------------+
| 2e1cd04b-62bd-4a37-892c-8731c0019568 | available | bootable_volume | 1    | -           | true     |             |
| ebe93e4f-961f-4139-8ab0-9bd72031ee2e | available | test            | 1    | -           | true     |             |
+--------------------------------------+-----------+-----------------+------+-------------+----------+-------------+
~~~

~~~
(overcloud) [stack@undercloud-1 ~]$ source testrc
(overcloud) [stack@undercloud-1 ~]$  cinder create --image-id 4173209e-4fe5-4cb5-9dda-975821493f45 --display_name=test 1
+------------------------------+--------------------------------------+
| Property                     | Value                                |
+------------------------------+--------------------------------------+
| attachments                  | []                                   |
| availability_zone            | nova                                 |
| bootable                     | false                                |
| consistencygroup_id          | None                                 |
| created_at                   | 2019-05-08T18:04:30.000000           |
| description                  | None                                 |
| encrypted                    | False                                |
| id                           | 9ac05805-6adf-4cc9-a61f-8f8bc4e96125 |
| metadata                     | {}                                   |
| multiattach                  | False                                |
| name                         | test                                 |
| os-vol-tenant-attr:tenant_id | 7bfedc571c954949b7d27da0ef9efa4d     |
| replication_status           | None                                 |
| size                         | 1                                    |
| snapshot_id                  | None                                 |
| source_volid                 | None                                 |
| status                       | creating                             |
| updated_at                   | None                                 |
| user_id                      | ab94920254ce4ea091bb9cfe4ecd3e5a     |
| volume_type                  | None                                 |
+------------------------------+--------------------------------------+
(overcloud) [stack@undercloud-1 ~]$ cinder list
+--------------------------------------+--------+------+------+-------------+----------+-------------+
| ID                                   | Status | Name | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+--------+------+------+-------------+----------+-------------+
| 2ade3397-7307-49d9-bd83-d6bc8a2dccd8 | error  | test | 1    | -           | false    |             |
| 35f51468-90be-461a-8d0f-5e4bb39ff97a | error  | test | 1    | -           | false    |             |
| 9ac05805-6adf-4cc9-a61f-8f8bc4e96125 | error  | test | 1    | -           | false    |             |
+--------------------------------------+--------+------+------+-------------+----------+-------------+
~~~

Comment 17 Andreas Karis 2019-05-08 18:14:35 UTC
Though what I get *now* is:
~~~
2019-05-08 18:07:06.693 42 DEBUG cinder.volume.manager [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Task 'cinder.volume.flows.manager.create_volume.OnFailureRescheduleTask;volume:create' (9fa7a03d-5c31-489a-930e-aaa253f9434d) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
2019-05-08 18:07:06.695 42 DEBUG cinder.volume.flows.manager.create_volume [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Updating volume 6c065b44-63b1-4be1-a7a9-56812a8dd9ef with {'host': None, 'scheduled_at': datetime.datetime(2019, 5, 8, 18, 7, 6, 695187)}. _pre_reschedule /usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py:124
2019-05-08 18:07:06.736 42 DEBUG cinder.volume.flows.manager.create_volume [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Volume 6c065b44-63b1-4be1-a7a9-56812a8dd9ef: re-scheduling SchedulerAPI.create_volume attempt 3 due to HTTPInternalServerError (HTTP 500) _reschedule /usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py:151
2019-05-08 18:07:06.795 42 DEBUG cinder.volume.flows.manager.create_volume [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Volume 6c065b44-63b1-4be1-a7a9-56812a8dd9ef: re-scheduled _post_reschedule /usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py:163
2019-05-08 18:07:06.797 42 DEBUG oslo_concurrency.processutils [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Running cmd (subprocess): env LC_ALL=C lvs --noheadings --unit=g -o vg_name,name,size --nosuffix cinder-volumes/volume-6c065b44-63b1-4be1-a7a9-56812a8dd9ef execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:372
2019-05-08 18:07:08.118 42 DEBUG oslo_concurrency.processutils [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] CMD "env LC_ALL=C lvs --noheadings --unit=g -o vg_name,name,size --nosuffix cinder-volumes/volume-6c065b44-63b1-4be1-a7a9-56812a8dd9ef" returned: 5 in 1.322s execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:409
2019-05-08 18:07:08.121 42 DEBUG oslo_concurrency.processutils [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] u'env LC_ALL=C lvs --noheadings --unit=g -o vg_name,name,size --nosuffix cinder-volumes/volume-6c065b44-63b1-4be1-a7a9-56812a8dd9ef' failed. Not Retrying. execute /usr/lib/python2.7/site-packages/oslo_concurrency/processutils.py:457
2019-05-08 18:07:08.121 42 INFO cinder.brick.local_dev.lvm [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Logical Volume not found when querying LVM info. (vg_name=cinder-volumes, lv_name=volume-6c065b44-63b1-4be1-a7a9-56812a8dd9ef
2019-05-08 18:07:08.126 42 WARNING cinder.volume.manager [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Task 'cinder.volume.flows.manager.create_volume.OnFailureRescheduleTask;volume:create' (9fa7a03d-5c31-489a-930e-aaa253f9434d) transitioned into state 'REVERTED' from state 'REVERTING' with result 'True'
2019-05-08 18:07:08.130 42 DEBUG cinder.volume.manager [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Task 'cinder.volume.flows.manager.create_volume.ExtractVolumeRefTask;volume:create' (271bdc3b-e063-4310-b17c-f036415f705f) transitioned into state 'REVERTING' from state 'SUCCESS' _task_receiver /usr/lib/python2.7/site-packages/taskflow/listeners/logging.py:194
2019-05-08 18:07:08.133 42 WARNING cinder.volume.manager [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Task 'cinder.volume.flows.manager.create_volume.ExtractVolumeRefTask;volume:create' (271bdc3b-e063-4310-b17c-f036415f705f) transitioned into state 'REVERTED' from state 'REVERTING' with result 'None'
2019-05-08 18:07:08.137 42 WARNING cinder.volume.manager [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Flow 'volume_create_manager' (f7b475cb-8077-442d-b4f1-2f2bd5790a70) transitioned into state 'REVERTED' from state 'RUNNING'
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server [req-36144ad2-e36f-4c7c-90c1-07e3b733548c ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Exception during message handling: HTTPInternalServerError: HTTPInternalServerError (HTTP 500)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "<string>", line 2, in create_volume
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/objects/cleanable.py", line 207, in wrapper
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 690, in create_volume
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     _run_flow()
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 682, in _run_flow
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     flow_engine.run()
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 247, in run
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     for _state in self.run_iter(timeout=timeout):
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/engine.py", line 340, in run_iter
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     failure.Failure.reraise_if_any(er_failures)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 336, in reraise_if_any
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     failures[0].reraise()
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/types/failure.py", line 343, in reraise
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     six.reraise(*self._exc_info)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/taskflow/engines/action_engine/executor.py", line 53, in _execute_task
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     result = task.execute(**arguments)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 1020, in execute
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     **volume_spec)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 929, in _create_from_image
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     image_service)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/volume/flows/manager/create_volume.py", line 812, in _create_from_image_cache_or_download
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     backend_name) as tmp_image:
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python2.7/contextlib.py", line 17, in __enter__
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     return self.gen.next()
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 764, in fetch
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     fetch_verify_image(context, image_service, image_id, tmp)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 378, in fetch_verify_image
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     None, None)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/image_utils.py", line 304, in fetch
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     tpool.Proxy(image_file))
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/glance.py", line 324, in download
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     _reraise_translated_image_exception(image_id)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/glance.py", line 563, in _reraise_translated_image_exception
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     six.reraise(type(new_exc), new_exc, exc_trace)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/glance.py", line 322, in download
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     image_chunks = self._client.call(context, 'data', image_id)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/cinder/image/glance.py", line 201, in call
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     return getattr(controller, method)(*args, **kwargs)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/glanceclient/common/utils.py", line 545, in inner
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     return RequestIdProxy(wrapped(*args, **kwargs))
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/glanceclient/v2/images.py", line 208, in data
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     resp, body = self.http_client.get(url)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/keystoneauth1/adapter.py", line 304, in get
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     return self.request(url, 'GET', **kwargs)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/glanceclient/common/http.py", line 349, in request
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     return self._handle_response(resp)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server   File "/usr/lib/python2.7/site-packages/glanceclient/common/http.py", line 98, in _handle_response
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server     raise exc.from_response(resp, resp.content)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server HTTPInternalServerError: HTTPInternalServerError (HTTP 500)
2019-05-08 18:07:08.139 42 ERROR oslo_messaging.rpc.server 
^C
~~~

These settings are clearly not taken into consideration by glance:
~~~
[root@overcloud-controller-0 ~]# tail -n2 /var/log/containers/glance/api.log
2019-05-08 18:12:55.948 25 DEBUG eventlet.wsgi.server [-] (25) accepted ('172.16.2.17', 43664) server /usr/lib/python2.7/site-packages/eventlet/wsgi.py:883
2019-05-08 18:12:55.952 25 INFO eventlet.wsgi.server [-] 172.16.2.17 - - [08/May/2019 18:12:55] "GET /healthcheck HTTP/1.0" 200 137 0.003297
[root@overcloud-controller-0 ~]# grep cinder_store /var/log/containers/glance/*
[root@overcloud-controller-0 ~]# 
~~~

~~~
[root@overcloud-controller-0 ~]# docker ps | egrep 'glance' | awk '{print $NF}' | xargs docker restart
glance_api
[root@overcloud-controller-0 ~]# 
2019-05-08 18:13:47.547 1 DEBUG glance_store.backend [-] Attempting to import store cinder _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.547 1 DEBUG glance_store.backend [-] Attempting to import store file _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.548 1 DEBUG glance_store.backend [-] Attempting to import store glance.store.cinder.Store _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.548 1 DEBUG glance_store.backend [-] Attempting to import store glance.store.filesystem.Store _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.549 1 DEBUG glance_store.backend [-] Attempting to import store glance.store.http.Store _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.549 1 DEBUG glance_store.backend [-] Attempting to import store glance.store.rbd.Store _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.549 1 DEBUG glance_store.backend [-] Attempting to import store glance.store.sheepdog.Store _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.550 1 DEBUG glance_store.backend [-] Attempting to import store glance.store.swift.Store _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.550 1 DEBUG glance_store.backend [-] Attempting to import store glance.store.vmware_datastore.Store _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.551 1 DEBUG glance_store.backend [-] Attempting to import store http _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.551 1 DEBUG glance_store.backend [-] Attempting to import store no_conf _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.552 1 DEBUG glance_store.backend [-] Attempting to import store rbd _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.552 1 DEBUG glance_store.backend [-] Attempting to import store sheepdog _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.552 1 DEBUG glance_store.backend [-] Attempting to import store swift _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.553 1 DEBUG glance_store.backend [-] Attempting to import store vmware _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.553 1 DEBUG glance_store.backend [-] Registering options for group glance_store register_opts /usr/lib/python2.7/site-packages/glance_store/backend.py:160
2019-05-08 18:13:47.554 1 DEBUG glance_store.backend [-] Registering options for group glance_store register_opts /usr/lib/python2.7/site-packages/glance_store/backend.py:160
2019-05-08 18:13:47.555 1 DEBUG glance_store.backend [-] Attempting to import store swift _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.557 1 DEBUG glance_store.capabilities [-] Store glance_store._drivers.swift.store.SingleTenantStore doesn't support updating dynamic storage capabilities. Please overwrite 'update_capabilities' method of the store to implement updating logics if needed. update_capabilities /usr/lib/python2.7/site-packages/glance_store/capabilities.py:97
2019-05-08 18:13:47.558 1 DEBUG glance_store.backend [-] Registering store swift with schemes ('swift+https', 'swift', 'swift+http', 'swift+config') create_stores /usr/lib/python2.7/site-packages/glance_store/backend.py:278
2019-05-08 18:13:47.558 1 DEBUG glance_store.driver [-] Late loading location class glance_store._drivers.swift.store.StoreLocation get_store_location_class /usr/lib/python2.7/site-packages/glance_store/driver.py:89
2019-05-08 18:13:47.558 1 DEBUG glance_store.location [-] Registering scheme swift+https with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f33b90>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.559 1 DEBUG glance_store.location [-] Registering scheme swift+http with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f33b90>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.559 1 DEBUG glance_store.location [-] Registering scheme swift with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f33b90>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.560 1 DEBUG glance_store.location [-] Registering scheme swift+config with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f33b90>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.560 1 DEBUG glance_store.backend [-] Attempting to import store http _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.560 1 DEBUG glance_store.capabilities [-] Store glance_store._drivers.http.Store doesn't support updating dynamic storage capabilities. Please overwrite 'update_capabilities' method of the store to implement updating logics if needed. update_capabilities /usr/lib/python2.7/site-packages/glance_store/capabilities.py:97
2019-05-08 18:13:47.561 1 DEBUG glance_store.backend [-] Registering store http with schemes ('http', 'https') create_stores /usr/lib/python2.7/site-packages/glance_store/backend.py:278
2019-05-08 18:13:47.561 1 DEBUG glance_store.driver [-] Late loading location class glance_store._drivers.http.StoreLocation get_store_location_class /usr/lib/python2.7/site-packages/glance_store/driver.py:89
2019-05-08 18:13:47.562 1 DEBUG glance_store.location [-] Registering scheme http with {'location_class': <class 'glance_store._drivers.http.StoreLocation'>, 'store': <glance_store._drivers.http.Store object at 0x7feec8f24a50>, 'store_entry': 'http'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.562 1 DEBUG glance_store.location [-] Registering scheme https with {'location_class': <class 'glance_store._drivers.http.StoreLocation'>, 'store': <glance_store._drivers.http.Store object at 0x7feec8f24a50>, 'store_entry': 'http'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.562 1 DEBUG glance_store.backend [-] Attempting to import store cinder _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.563 1 WARNING glance_store._drivers.cinder [-] Cinder store is considered experimental. Current deployers should be aware that the use of it in production right now may be risky.
2019-05-08 18:13:47.564 1 DEBUG glance_store.capabilities [-] Store glance_store._drivers.cinder.Store doesn't support updating dynamic storage capabilities. Please overwrite 'update_capabilities' method of the store to implement updating logics if needed. update_capabilities /usr/lib/python2.7/site-packages/glance_store/capabilities.py:97
2019-05-08 18:13:47.564 1 DEBUG glance_store.backend [-] Registering store cinder with schemes ('cinder',) create_stores /usr/lib/python2.7/site-packages/glance_store/backend.py:278
2019-05-08 18:13:47.564 1 DEBUG glance_store.driver [-] Late loading location class glance_store._drivers.cinder.StoreLocation get_store_location_class /usr/lib/python2.7/site-packages/glance_store/driver.py:89
2019-05-08 18:13:47.565 1 DEBUG glance_store.location [-] Registering scheme cinder with {'location_class': <class 'glance_store._drivers.cinder.StoreLocation'>, 'store': <glance_store._drivers.cinder.Store object at 0x7feed413efd0>, 'store_entry': 'cinder'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.565 1 DEBUG glance_store.backend [-] Attempting to import store swift _load_store /usr/lib/python2.7/site-packages/glance_store/backend.py:231
2019-05-08 18:13:47.567 1 DEBUG glance_store.capabilities [-] Store glance_store._drivers.swift.store.SingleTenantStore doesn't support updating dynamic storage capabilities. Please overwrite 'update_capabilities' method of the store to implement updating logics if needed. update_capabilities /usr/lib/python2.7/site-packages/glance_store/capabilities.py:97
2019-05-08 18:13:47.568 1 DEBUG glance_store.driver [-] Late loading location class glance_store._drivers.swift.store.StoreLocation get_store_location_class /usr/lib/python2.7/site-packages/glance_store/driver.py:89
2019-05-08 18:13:47.568 1 DEBUG glance_store.location [-] Registering scheme swift+https with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f3e5d0>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.569 1 DEBUG glance_store.location [-] Registering scheme swift+https with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f3e5d0>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.569 1 DEBUG glance_store.location [-] Registering scheme swift with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f3e5d0>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.570 1 DEBUG glance_store.location [-] Registering scheme swift+https with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f3e5d0>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.570 1 DEBUG glance_store.location [-] Registering scheme swift+http with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f3e5d0>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.570 1 DEBUG glance_store.location [-] Registering scheme swift with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f3e5d0>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.571 1 DEBUG glance_store.location [-] Registering scheme swift+https with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f3e5d0>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.571 1 DEBUG glance_store.location [-] Registering scheme swift+http with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f3e5d0>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.571 1 DEBUG glance_store.location [-] Registering scheme swift with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f3e5d0>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
2019-05-08 18:13:47.572 1 DEBUG glance_store.location [-] Registering scheme swift+config with {'location_class': <class 'glance_store._drivers.swift.store.StoreLocation'>, 'store': <glance_store._drivers.swift.store.SingleTenantStore object at 0x7feec8f3e5d0>, 'store_entry': 'swift'} register_scheme_map /usr/lib/python2.7/site-packages/glance_store/location.py:88
[root@overcloud-controller-0 ~]# 
~~~

Or is glance different wrt to reporting parameters when in DEBUG mode?

Comment 18 Andreas Karis 2019-05-08 18:16:06 UTC
And glance reports:

[root@overcloud-controller-0 ~]# tail -f !$ | grep ERROR
tail -f /var/log/containers/glance/api.log | grep ERROR
2019-05-08 18:15:42.665 25 DEBUG oslo_db.sqlalchemy.engines [req-65ae86cb-5d45-47dc-b426-bd2782f3d4ba ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python2.7/site-packages/oslo_db/sqlalchemy/engines.py:290
2019-05-08 18:15:44.011 25 DEBUG glance_store._drivers.cinder [req-4aadc7d7-0255-48ab-b926-a2ee59ae5df2 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Cinderclient connection created for user cinder using URL: http://172.16.2.14:5000. get_cinderclient /usr/lib/python2.7/site-packages/glance_store/_drivers/cinder.py:360
2019-05-08 18:15:44.041 25 ERROR glance_store._drivers.cinder [req-4aadc7d7-0255-48ab-b926-a2ee59ae5df2 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Failed to get image volume 2e1cd04b-62bd-4a37-892c-8731c0019568: n/a (HTTP 300): ClientException: n/a (HTTP 300)
2019-05-08 18:15:44.041 25 ERROR glance_store._drivers.cinder [req-4aadc7d7-0255-48ab-b926-a2ee59ae5df2 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Failed to get image volume 2e1cd04b-62bd-4a37-892c-8731c0019568: n/a (HTTP 300): ClientException: n/a (HTTP 300)
2019-05-08 18:15:44.042 25 ERROR glance.location [req-4aadc7d7-0255-48ab-b926-a2ee59ae5df2 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Glance tried all active locations to get data for image 4173209e-4fe5-4cb5-9dda-975821493f45 but all have failed.: BackendException: Failed to get image volume 2e1cd04b-62bd-4a37-892c-8731c0019568: n/a (HTTP 300)
2019-05-08 18:15:45.928 25 DEBUG glance_store._drivers.cinder [req-3c91f918-1b78-4076-80fd-4718fbc5a233 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Cinderclient connection created for user cinder using URL: http://172.16.2.14:5000. get_cinderclient /usr/lib/python2.7/site-packages/glance_store/_drivers/cinder.py:360
2019-05-08 18:15:45.939 25 ERROR glance_store._drivers.cinder [req-3c91f918-1b78-4076-80fd-4718fbc5a233 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Failed to get image volume 2e1cd04b-62bd-4a37-892c-8731c0019568: n/a (HTTP 300): ClientException: n/a (HTTP 300)
2019-05-08 18:15:45.939 25 ERROR glance_store._drivers.cinder [req-3c91f918-1b78-4076-80fd-4718fbc5a233 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Failed to get image volume 2e1cd04b-62bd-4a37-892c-8731c0019568: n/a (HTTP 300): ClientException: n/a (HTTP 300)
2019-05-08 18:15:45.940 25 ERROR glance.location [req-3c91f918-1b78-4076-80fd-4718fbc5a233 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Glance tried all active locations to get data for image 4173209e-4fe5-4cb5-9dda-975821493f45 but all have failed.: BackendException: Failed to get image volume 2e1cd04b-62bd-4a37-892c-8731c0019568: n/a (HTTP 300)
2019-05-08 18:15:47.778 25 DEBUG glance_store._drivers.cinder [req-5c581b28-0f4c-4fd7-b00c-3bfdc40ed77f ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Cinderclient connection created for user cinder using URL: http://172.16.2.14:5000. get_cinderclient /usr/lib/python2.7/site-packages/glance_store/_drivers/cinder.py:360
2019-05-08 18:15:47.788 25 ERROR glance_store._drivers.cinder [req-5c581b28-0f4c-4fd7-b00c-3bfdc40ed77f ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Failed to get image volume 2e1cd04b-62bd-4a37-892c-8731c0019568: n/a (HTTP 300): ClientException: n/a (HTTP 300)
2019-05-08 18:15:47.788 25 ERROR glance_store._drivers.cinder [req-5c581b28-0f4c-4fd7-b00c-3bfdc40ed77f ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Failed to get image volume 2e1cd04b-62bd-4a37-892c-8731c0019568: n/a (HTTP 300): ClientException: n/a (HTTP 300)
2019-05-08 18:15:47.789 25 ERROR glance.location [req-5c581b28-0f4c-4fd7-b00c-3bfdc40ed77f ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Glance tried all active locations to get data for image 4173209e-4fe5-4cb5-9dda-975821493f45 but all have failed.: BackendException: Failed to get image volume 2e1cd04b-62bd-4a37-892c-8731c0019568: n/a (HTTP 300)
^C
[root@overcloud-controller-0 ~]# ^C
[root@overcloud-controller-0 ~]#

Comment 19 Alan Bishop 2019-05-08 18:17:48 UTC
I haven't been closely following this bz, but bear in mind support for configuring glance's cinder_store_* settings just shipped in 13z6 (see bug #1646955).

Comment 20 Brian Rosmaita 2019-05-08 18:23:13 UTC
@Alan: thanks for the info, I didn't realize that.

@Andreas: I think all those HTTP 300 responses in the glance log are from the identity service giving a "multiple choices" response.  Try setting [glance_store]/cinder_store_auth_address in glance-api.conf to http://172.16.2.14:5000/v3 and see what happens.

Comment 21 Andreas Karis 2019-05-08 19:45:44 UTC
Now i get:

[root@overcloud-controller-0 ~]# tail -f /var/log/containers/glance/api.log | grep ERROR
2019-05-08 19:43:45.259 1 ERROR glance.common.wsgi [-] Not respawning child 124, cannot recover from termination
2019-05-08 19:43:45.164 124 ERROR oslo.privsep.daemon [-] [Errno 1] Operation not permitted
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon [req-c83ff4d3-51ec-4d99-8585-b86e5203b98b ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Error while sending initial PING to privsep: [Errno 32] Broken pipe: error: [Errno 32] Broken pipe
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon Traceback (most recent call last):
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/oslo_privsep/daemon.py", line 181, in exchange_ping
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon     reply = self.send_recv((Message.PING.value,))
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/oslo_privsep/comm.py", line 163, in send_recv
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon     self.writer.send((myid, msg))
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/oslo_privsep/comm.py", line 55, in send
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon     self.writesock.sendall(buf)
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/eventlet/greenio/base.py", line 390, in sendall
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon     tail = self.send(data, flags)
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/eventlet/greenio/base.py", line 384, in send
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon     return self._send_loop(self.fd.send, data, flags)
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/eventlet/greenio/base.py", line 371, in _send_loop
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon     return send_method(data, *args)
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon error: [Errno 32] Broken pipe
2019-05-08 19:43:45.278 26 ERROR oslo.privsep.daemon 
2019-05-08 19:43:49.972 1 ERROR glance.common.wsgi [-] Not respawning child 140, cannot recover from termination
2019-05-08 19:43:49.912 140 ERROR oslo.privsep.daemon [-] [Errno 1] Operation not permitted
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon [req-bcfd82ef-52cf-4147-a73b-dd136a70990a ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Error while sending initial PING to privsep: [Errno 32] Broken pipe: error: [Errno 32] Broken pipe
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon Traceback (most recent call last):
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/oslo_privsep/daemon.py", line 181, in exchange_ping
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon     reply = self.send_recv((Message.PING.value,))
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/oslo_privsep/comm.py", line 163, in send_recv
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon     self.writer.send((myid, msg))
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/oslo_privsep/comm.py", line 55, in send
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon     self.writesock.sendall(buf)
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/eventlet/greenio/base.py", line 390, in sendall
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon     tail = self.send(data, flags)
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/eventlet/greenio/base.py", line 384, in send
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon     return self._send_loop(self.fd.send, data, flags)
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/eventlet/greenio/base.py", line 371, in _send_loop
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon     return send_method(data, *args)
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon error: [Errno 32] Broken pipe
2019-05-08 19:43:49.996 26 ERROR oslo.privsep.daemon 
2019-05-08 19:44:00.627 1 ERROR glance.common.wsgi [-] Not respawning child 156, cannot recover from termination
2019-05-08 19:44:00.540 156 ERROR oslo.privsep.daemon [-] [Errno 1] Operation not permitted
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon [req-a7dc3723-0b98-4fa1-b840-c40d34aca0f4 ab94920254ce4ea091bb9cfe4ecd3e5a 7bfedc571c954949b7d27da0ef9efa4d - default default] Error while sending initial PING to privsep: [Errno 32] Broken pipe: error: [Errno 32] Broken pipe
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon Traceback (most recent call last):
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/oslo_privsep/daemon.py", line 181, in exchange_ping
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon     reply = self.send_recv((Message.PING.value,))
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/oslo_privsep/comm.py", line 163, in send_recv
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon     self.writer.send((myid, msg))
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/oslo_privsep/comm.py", line 55, in send
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon     self.writesock.sendall(buf)
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/eventlet/greenio/base.py", line 390, in sendall
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon     tail = self.send(data, flags)
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/eventlet/greenio/base.py", line 384, in send
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon     return self._send_loop(self.fd.send, data, flags)
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon   File "/usr/lib/python2.7/site-packages/eventlet/greenio/base.py", line 371, in _send_loop
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon     return send_method(data, *args)
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon error: [Errno 32] Broken pipe
2019-05-08 19:44:00.632 26 ERROR oslo.privsep.daemon 


Perhaps I'm using the wrong endpoint url? (I'm using the internal one)

(overcloud) [stack@undercloud-1 ~]$ openstack catalog show identity
+-----------+-------------------------------------+
| Field     | Value                               |
+-----------+-------------------------------------+
| endpoints | regionOne                           |
|           |   public: http://10.0.0.12:5000     |
|           | regionOne                           |
|           |   admin: http://192.168.24.18:35357 |
|           | regionOne                           |
|           |   internal: http://172.16.2.14:5000 |
|           |                                     |
| id        | 9757786bc0514a7fba14dd71d897f72f    |
| name      | keystone                            |
| type      | identity                            |
+-----------+-------------------------------------+

Comment 22 Andreas Karis 2019-05-08 19:46:33 UTC
https://bugzilla.redhat.com/show_bug.cgi?id=1658367

Comment 23 Andreas Karis 2019-05-08 19:52:14 UTC
So that matches: https://bugzilla.redhat.com/show_bug.cgi?id=1658367#c22

But I'd like to keep both the file backend and the cinder backend, how'd I do this? I'm only configuring all the cinder stuff to fix what's wrong with the images with a cinder backing, obviously.

Comment 24 Andreas Karis 2019-05-08 20:00:06 UTC
Might the containerization be what's causing the customer's issues in the first place from OSP 10 to OSP 13?

--------------------

 Alan Bishop 2019-01-11 15:03:00 UTC

Sorry, no, that's not sufficient.

Setting "GlanceBackend: cinder" adds two critical settings that are used when the glance-api container is started:

1) Two docker volume mounts required to use the iSCSI service [1]
2) Run the container in privileged mode [2], which is also required to access iSCSI services

The second item is what I believe will eliminate your privsep failure, but you need both.

[1] https://github.com/openstack/tripleo-heat-templates/blob/stable/queens/docker/services/glance-api.yaml#L197
[2] https://github.com/openstack/tripleo-heat-templates/blob/stable/queens/docker/services/glance-api.yaml#L210

--------------------

I'm going to test this in OSP 10 now, just in case ..

Comment 25 Alan Bishop 2019-05-13 16:19:49 UTC
My statement in bug #1658367 comment #22 was definitely related to the glance service running in a container, but glance's ability to use cinder for its backend was introduced in OSP 12.

Comment 26 Alan Bishop 2019-05-22 19:28:06 UTC
When creating a volume from an image, cinder asks glance for the image data,
but tenant A is unable to access the cinder volume were the image is stored.
This causes this glance error:

ERROR glance_store._drivers.cinder Failed to get image size due to volume can
not be found: XXX: NotFound: Volume XXX could not be found. (HTTP 404)

Although a glance image is public, when it's stored in cinder then access to
the volume is controlled by cinder's access policies. When a user in tenant A
attempts to generate a volume from the image, glance may allow the operation
(the image is public), but users in tenant A and not allowed access to volumes
created by users in tenant B.

The glance-api failure then causes cinder to fail. The "'NoneType' object is
not iterable" actually a separate issue, fixed in bug #1639820.

The root problem is that glance needs its cinder_store_XXX settings configured
so that all image volumes are accessed via glance itself, and not the
user. For that to happen, you need the fix for bug #1646955 (which is part
of the 13z6 release).

Once glance's cinder_store settings are configured, any new image uploaded by
users in tenant B should be accessible to users in tenent A.

However! Configuring glance's cinder_store settings does affect existing
images. Fortunately, it's possible to "fix" these otherwise broken images by
transfering them from their original owner to the glance service.

The following sequence should work:

1) Install the fix for bug #1646955 (i.e. update to 13z6)

2) As an admin, run "cinder list --all-tenants" to identify the volumes
   associated with glance images.

3) Each user who created an image needs to create a volume transfer record.
   Record the request ID and auth token for each transfer request.

$ openstack volume transfer request create <cinder's image volume ID>

4) As an admin, transfer the volume(s) to the glance service

$ OS_USERNAME=glance OS_PROJECT_NAME=service OS_TENANT_NAME=service \
  OS_PASSWORD=<glance's service password> \
  openstack volume transfer request accept <request ID> <auth token>

Now, users in tenant A should be able to create volumes from images that were
previously uploaded by users in tenant B.

Comment 28 Alan Bishop 2019-09-17 22:10:25 UTC
Updating to the latest 13z release will resolve the problem (see comment #26 for further details). The fix that configures the cinder_store settings was included 13z6.


Note You need to log in before you can comment on or make changes to this bug.