Customer is facing the following error in nova:

~~~
ERROR oslo_service.service NovaException: Unsupported VIF type binding_failed convert '_nova_to_osvif_vif_binding_failed'
~~~

Even after updating the DB as proposed in https://access.redhat.com/solutions/3465681, the problem persists.

In neutron/server.log we found what appears to be the root cause of the problem:

~~~
ERROR oslo_db.sqlalchemy.exc_filters [req-c5e094e8-e792-4d2d-9847-352f10326483 - - - - -] DBAPIError exception wrapped from (pymysql.err.InternalError) (1305, u'SAVEPOINT sa_savepoint_17 does not exist') [SQL: u'ROLLBACK TO SAVEPOINT sa_savepoint_17'] (Background on this error at: http://sqlalche.me/e/2j85): InternalError: (1305, u'SAVEPOINT sa_savepoint_17 does not exist')
~~~

This is very similar to:

https://bugzilla.redhat.com/show_bug.cgi?id=1567627
https://review.opendev.org/#/c/326927/
https://review.opendev.org/#/c/326927/6/neutron/db/api.py

That fix was merged in 2016 and is present in the code:

~~~
[root@undercloud-1 ~]# grep 'def is_retriable' -A10 -n /usr/lib/python2.7/site-packages/neutron/db/api.py
59:def is_retriable(e):
60-    if getattr(e, '_RETRY_EXCEEDED', False):
61-        return False
62-    if _is_nested_instance(e, (db_exc.DBDeadlock, exc.StaleDataError,
63-                               db_exc.DBConnectionError,
64-                               db_exc.DBDuplicateEntry, db_exc.RetryRequest,
65-                               obj_exc.NeutronDbObjectDuplicateEntry)):
66-        return True
67-    # looking savepoints mangled by deadlocks. see bug/1590298 for details.
68-    return _is_nested_instance(e, db_exc.DBError) and '1305' in str(e)
69-
~~~

The code has changed a bit since then, but unless there is a regression, that specific upstream bug should be fixed.
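To illustrate why the observed SAVEPOINT error should already be handled: the `is_retriable` check above classifies MySQL error 1305 wrapped in a `DBError` as retriable, so neutron's retry decorator should re-run the transaction rather than let it bubble up. Below is a minimal, self-contained sketch of that classification logic; the `DBError`/`DBDeadlock` stubs and the simplified `_is_nested_instance` are stand-ins for the real oslo.db classes and neutron helper (the real helper also walks wrapped inner exceptions), so this is an approximation, not the actual neutron code.

```python
class DBError(Exception):
    """Stand-in for oslo_db.exception.DBError (assumption, not the real class)."""

class DBDeadlock(DBError):
    """Stand-in for oslo_db.exception.DBDeadlock."""

def _is_nested_instance(e, etypes):
    # Simplified: the real neutron helper also inspects e.inner_exception
    # chains produced by oslo.db's exception wrapping.
    return isinstance(e, etypes)

def is_retriable(e):
    """Mirror of the retry classification shown in the grep output above."""
    if getattr(e, '_RETRY_EXCEEDED', False):
        return False
    if _is_nested_instance(e, (DBDeadlock,)):
        return True
    # Savepoints mangled by deadlocks surface as MySQL error 1305
    # ("SAVEPOINT ... does not exist"); treat those as retriable too.
    return _is_nested_instance(e, DBError) and '1305' in str(e)

# The exact error seen in neutron/server.log is classified as retriable:
err = DBError("(1305, u'SAVEPOINT sa_savepoint_17 does not exist')")
print(is_retriable(err))  # True
```

If the customer's traceback shows this error escaping the retry decorator anyway, that would point at a regression in how the wrapped exception reaches `is_retriable`, rather than at the check itself.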