Description of problem:

Currently, the only way to determine the size of a collection or sub-collection that is not included in the API summary information (which only contains vms, hosts, users, and storage_domains) is to iterate through all pages of the list method, like so:

  dc_list = []
  dc_page_index = 1
  dc_page_current = api.datacenters.list(query="page %s" % dc_page_index)
  while len(dc_page_current) != 0:
      dc_list = dc_list + dc_page_current
      dc_page_index = dc_page_index + 1
      dc_page_current = api.datacenters.list(query="page %s" % dc_page_index)
  print len(dc_list)

The REST API and SDK should ideally expose the size of all collections more directly.
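For illustration, the paging loop above can be wrapped in a reusable helper. This is a sketch, not part of the SDK: `count_collection` and `fake_page` are hypothetical names, and the fake backend merely stands in for a call like `api.datacenters.list(query="page N")` (it is written in Python 3 syntax, unlike the Python 2 snippet above).

```python
def count_collection(list_page):
    """Count items in a paged collection by walking every page.

    `list_page` is any callable that takes a 1-based page index and
    returns the items on that page (an empty list past the last page).
    """
    total = 0
    page = 1
    items = list_page(page)
    while items:
        total += len(items)
        page += 1
        items = list_page(page)
    return total

# Hypothetical backend standing in for api.datacenters.list:
# 23 items served in pages of 10.
data = list(range(23))

def fake_page(n, size=10):
    return data[(n - 1) * size:n * size]

print(count_collection(fake_page))  # 23
```

The point of the helper is to make the cost visible: counting requires transferring every record in the collection, which is exactly the overhead this report asks to avoid.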
1. I'm not sure this works for the RESTful API.
2. I'm not convinced that knowing the collection size ahead of paging/looping gives any benefit to the client.

What I do think is that the client should be able to retrieve the entire collection, using max=-1 or the search dialect.
(In reply to comment #1)

> 2. I'm not convinced that knowing the collection size ahead of
> paging/looping gives any benefit to the client.

This isn't the only reason you might want the size, though. In my scenario I was working on an overview dashboard application with summary totals. The problem is that to generate those summary totals you effectively have to page through every record in every collection you are interested in, just to determine the size.

> What I do think is that the client should be able to retrieve the entire
> collection, using max=-1 or the search dialect.

I think, based on the discussions I lurked on in #rhev on Freenode, this is what many users/developers expect. Obviously, the documentation would need to point out the performance risks of using this operation in large environments.
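To make the max=-1 proposal concrete, here is a simulated sketch of the semantics being suggested; none of this is real SDK API. `fetch_all`, `fake_page`, and the data set are hypothetical, and the special value -1 meaning "no limit" mirrors the convention proposed in comment #1.

```python
def fetch_all(list_page, max_items=-1):
    """Retrieve an entire paged collection in one call.

    max_items=-1 means "no limit" (fetch everything), mirroring the
    max=-1 semantics proposed in the discussion; any non-negative
    value truncates the result to that many items.
    """
    results = []
    page = 1
    items = list_page(page)
    while items:
        results.extend(items)
        if 0 <= max_items <= len(results):
            return results[:max_items]
        page += 1
        items = list_page(page)
    return results

# Hypothetical backend: 23 items served in pages of 10.
data = list(range(23))

def fake_page(n, size=10):
    return data[(n - 1) * size:n * size]

everything = fetch_all(fake_page)      # all 23 items
first_five = fetch_all(fake_page, 5)   # truncated to [0, 1, 2, 3, 4]
```

Even with a client-side helper like this, the server still streams every record, which is why a documentation warning about large environments would matter if such an option were exposed.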
After analyzing the complexity required to implement this (especially for sub-collections implemented using queries), the cost in terms of performance, and the benefit to the user, we have come to the conclusion that it isn't worth implementing.