Description of problem:
The scheduler ignores the free hugepages of NUMA nodes when a VM uses NUMA pinning together with hugepages (1G). Only free memory is taken into the calculation, which is far too low per NUMA node since most of it is reserved for hugepages.

Version-Release number of selected component (if applicable):
ovirt-engine-4.3.4.3-1.el7.noarch
vdsm-4.30.17-1.el7.x86_64
ovirt-release-host-node-4.3.4-1.el7.noarch

How reproducible:
Reserve 3/4 of the RAM for hugepages (1G), then try to pin a VM with larger memory requirements (e.g. 32 GB) to NUMA nodes.

Steps to Reproduce:
1. My EPYC 7281 based servers (dual socket) have 8 NUMA nodes, each with 32 GB of memory, for a total of 256 GB of system memory.
2. Reserve 192 x 1 GB hugepages on the kernel cmdline:
   default_hugepagesz=1G hugepagesz=1G hugepages=192
   This reserves 24 hugepages on each NUMA node.
3. Pin a VM using 32 GB (custom property hugepages=1048576) to NUMA nodes 0-3 of CPU socket 1.
4. Start the VM.

Actual results:
The VM can't be started. Error message in the UI:
"The host foo did not satisfy internal filter NUMA because cannot accommodate memory of VM's pinned virtual NUMA nodes within host's physical NUMA nodes"

Expected results:
The VM should start and use 8 hugepages = 8 GB per NUMA node 0-3 for its 32 GB of memory.

Additional info:
The system has enough free hugepages on NUMA nodes 0-3:

grep "" /sys/devices/system/node/*/hugepages/hugepages-1048576kB/free_hugepages
/sys/devices/system/node/node0/hugepages/hugepages-1048576kB/free_hugepages:24
/sys/devices/system/node/node1/hugepages/hugepages-1048576kB/free_hugepages:22
/sys/devices/system/node/node2/hugepages/hugepages-1048576kB/free_hugepages:22
/sys/devices/system/node/node3/hugepages/hugepages-1048576kB/free_hugepages:24
/sys/devices/system/node/node4/hugepages/hugepages-1048576kB/free_hugepages:22
/sys/devices/system/node/node5/hugepages/hugepages-1048576kB/free_hugepages:14
/sys/devices/system/node/node6/hugepages/hugepages-1048576kB/free_hugepages:17
/sys/devices/system/node/node7/hugepages/hugepages-1048576kB/free_hugepages:19

This was already reported in https://bugzilla.redhat.com/show_bug.cgi?id=1720558, but the description of that bug changed and no longer shows the root cause of the problem.
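For comparison, a minimal sketch (illustrative only, not the engine's actual NUMA filter code) of the per-node accounting the expected result implies, assuming the 1048576 kB page size from the custom property and the VM's memory spread evenly across its four pinned nodes:

# Illustrative check: does each pinned host NUMA node have enough free
# 1G hugepages for its share of the VM, based on free_hugepages rather
# than the node's free (non-hugepage) memory?
HP_KB = 1048576                 # 1 GiB hugepages, per the custom property
VM_MEM_GB = 32                  # VM memory size
PINNED_NODES = [0, 1, 2, 3]     # virtual NUMA nodes pinned to host nodes 0-3

need_per_node = VM_MEM_GB // len(PINNED_NODES)   # 8 x 1G pages per node

for node in PINNED_NODES:
    path = (f"/sys/devices/system/node/node{node}"
            f"/hugepages/hugepages-{HP_KB}kB/free_hugepages")
    with open(path) as f:
        free = int(f.read())
    print(f"node{node}: need {need_per_node} x 1G pages, free {free}: "
          f"{'OK' if free >= need_per_node else 'INSUFFICIENT'}")

With the free_hugepages values listed above, nodes 0-3 each report at least 22 free 1G pages, so the 8 pages per node the VM needs would fit easily; only the ordinary free memory per node is too small, which matches the filter rejection.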
Sure, but the root cause is still the same as that bug. Closing as a duplicate, and we'll track there. *** This bug has been marked as a duplicate of bug 1720558 ***