Bug 1741792 - Add clevis RPMs to RHV-H image / repo [NEEDINFO]
Summary: Add clevis RPMs to RHV-H image / repo
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: ovirt-host
Version: 4.3.6
Hardware: Unspecified
OS: Unspecified
Target Milestone: ovirt-4.4.0
Target Release: 4.4.0
Assignee: Yuval Turgeman
QA Contact: shiyi lei
Depends On: 1781184
Blocks: 1759015 1760262
Reported: 2019-08-16 06:36 UTC by John Call
Modified: 2020-08-04 13:27 UTC (History)

Fixed In Version: ovirt-host-4.4.0-0.2.alpha
Doc Type: Bug Fix
Doc Text:
Previously, using LUKS alone was a problem because the RHV Manager could reboot a node using Power Management commands. However, the node would not reboot because it was waiting for the user to enter a decrypt/open/unlock passphrase. This release fixes the issue by adding clevis RPMs to the Red Hat Virtualization Host (RHVH) image. As a result, a Manager can automatically unlock/decrypt/open an RHVH using TPM or NBDE.
Clone Of:
: 1759015 (view as bug list)
Last Closed: 2020-08-04 13:27:17 UTC
oVirt Team: Node
Target Upstream Version:
rdlugyhe: needinfo? (nlevy)

Attachments (Terms of Use)
kickstart to deploy RHV-H via PXE (including cleanup from previous installs) (2.57 KB, text/plain)
2019-08-16 06:52 UTC, John Call

System ID Private Priority Status Summary Last Updated
Red Hat Product Errata RHEA-2020:3246 0 None None None 2020-08-04 13:27:55 UTC

Description John Call 2019-08-16 06:36:33 UTC
Description of problem:
The rhel-7-server-rhvh-4-rpms repo should have the clevis RPMs and dependencies available (or pre-installed in the image.)  Customers are requesting at-rest encryption for their RHHI and RHV deployments.  Using LUKS alone is a problem because RHVM can use Power Management commands to reboot a node.  The node will never reboot because it is waiting forever for the user to type in the decrypt/open/unlock passphrase.  The clevis RPMs will allow for automatic unlock/decrypt/open via TPM or NBDE.

Additional info:
Using LUKS - https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/7/html-single/security_guide/index#sec-Using_LUKS_Disk_Encryption

Auto-unlock via Network (or TPM) - https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/7/html-single/security_guide/index#sec-Policy-Based_Decryption
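To make the two unlock paths concrete: each corresponds to a clevis "pin" that takes a small JSON configuration string. This is only an illustrative sketch; the Tang hostname and LUKS device below are hypothetical placeholders, not values from this report:

```shell
# Sketch only: both clevis pins take a JSON config string.
# The hostname and device are placeholders.
TANG_PIN='{"url":"http://tang-srv.example.com"}'   # NBDE: unlock via a Tang server
TPM2_PIN='{"pcr_ids":"7"}'                         # TPM: seal to the local TPM 2.0 chip, PCR 7

# The pin config is passed as the last argument to clevis, e.g.:
#   clevis bind luks -d /dev/sda3 tang "$TANG_PIN"
#   clevis bind luks -d /dev/sda3 tpm2 "$TPM2_PIN"
echo "$TANG_PIN"
echo "$TPM2_PIN"
```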

The required RPMs (and their dependencies) are available in the rhel-7-server-rpms repo like this:

# yum history info clevis
Return-Code    : Success
Command Line   : install clevis*
Transaction performed with:
    Installed     rpm-4.11.3-40.el7.x86_64                    installed
    Installed     subscription-manager-1.24.13-1.el7.x86_64   installed
    Installed     yum-3.4.3-163.el7.noarch                    installed
    Installed     yum-plugin-versionlock-1.1.31-52.el7.noarch installed
Packages Altered:
    Install     clevis-7-8.el7.x86_64          @rhel-7-server-rpms
    Install     clevis-dracut-7-8.el7.x86_64   @rhel-7-server-rpms
    Install     clevis-luks-7-8.el7.x86_64     @rhel-7-server-rpms
    Install     clevis-systemd-7-8.el7.x86_64  @rhel-7-server-rpms
    Dep-Install jose-10-1.el7.x86_64           @rhel-7-server-rpms
    Dep-Install libjose-10-1.el7.x86_64        @rhel-7-server-rpms
    Dep-Install libluksmeta-8-2.el7.x86_64     @rhel-7-server-rpms
    Dep-Install luksmeta-8-2.el7.x86_64        @rhel-7-server-rpms
    Dep-Install tpm2-abrmd-1.1.0-11.el7.x86_64 @rhel-7-server-rpms
    Dep-Install tpm2-tools-3.0.4-3.el7.x86_64  @rhel-7-server-rpms
    Dep-Install tpm2-tss-1.4.0-3.el7.x86_64    @rhel-7-server-rpms

I can provide kickstart examples and simplified luks/clevis/tang instructions if that would help...

Comment 1 John Call 2019-08-16 06:50:51 UTC
Here are my simplified steps to test Clevis.

1. Install the Tang server...
[root@tang-srv ~]# yum -y install tang
[root@tang-srv ~]# systemctl enable tangd.socket --now
[root@tang-srv ~]# firewall-cmd --add-port 80/tcp --permanent
[root@tang-srv ~]# firewall-cmd --reload
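As a quick sanity check (my suggestion, not part of the original steps): Tang advertises its signing keys over plain HTTP at /adv, so you can confirm the server is reachable before touching any client. The hostname is a placeholder:

```
# curl -sf http://tang-srv.example.com/adv
```

A healthy server returns a JSON JWK advertisement; an empty response or error means the socket or firewall step above needs another look.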

2. Install RHV-H with encryption (see the attached kickstart file, or check the encryption box in the Anaconda GUI).

3. Configure the Clevis client, and rebuild the initramfs.
   (change /dev/sda3 to whatever is appropriate)
# curl -o /etc/yum.repos.d/clevis.repo http://people.redhat.com/jcall/clevis.repo
# yum -y install 'clevis*'

# lsblk -o +VENDOR,MODEL
# clevis bind luks -d /dev/sda3 tang '{"url":""}'

# echo 'kernel_cmdline="ip=dhcp"' > /etc/dracut.conf.d/clevis-nbde.conf
# dracut -fv

# systemctl enable clevis-luks-askpass.path
# reboot
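One optional check before the reboot (again my suggestion, not an original step): clevis 7 records its bindings in LUKS metadata slots, so you can verify the bind took effect with the luksmeta tool it pulls in as a dependency. Same placeholder device as above:

```
# luksmeta show -d /dev/sda3
```

A successful bind shows up as an active slot carrying the clevis UUID.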

Comment 2 John Call 2019-08-16 06:52:51 UTC
Created attachment 1604298 [details]
kickstart to deploy RHV-H via PXE (including cleanup from previous installs)

kickstart to deploy RHV-H via PXE (including cleanup from previous installs)

Comment 3 John Call 2019-08-16 06:56:05 UTC
(In reply to John Call from comment #1)
> # clevis bind luks -d /dev/sda3 tang '{"url":""}'

Oops, I co-located Tang on a server of mine that was already using port 80, so I changed it to use port 300 (with some systemd and SELinux foo).

The line above should be simplified to just this...
# clevis bind luks -d /dev/sda3 tang '{"url":""}'

Comment 24 John Call 2019-10-29 14:42:11 UTC
(In reply to John Call from comment #1)
> Here is my simplified steps to test Clevis.
> # echo 'kernel_cmdline="ip=dhcp"' > /etc/dracut.conf.d/clevis-nbde.conf
> # dracut -fv

I realized at some point that I also had to ask dracut to omit the creation of ifcfg files.  Add one more line to the dracut drop-in configuration file like this...
# cat /etc/dracut.conf.d/clevis-nbde.conf
### Use DHCP on a specific interface, and don't create /etc/sysconfig/network-scripts/ifcfg-* files during boot
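For reference, a drop-in combining the DHCP setting from comment 1 with the ifcfg suppression might look like the fragment below. This is a reconstruction assuming dracut's standard omit_dracutmodules directive, so verify against dracut.conf(5) on your build:

```
### Use DHCP, and don't create /etc/sysconfig/network-scripts/ifcfg-* files during boot
kernel_cmdline="ip=dhcp"
omit_dracutmodules+=" ifcfg "
```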

Comment 32 shiyi lei 2019-12-02 03:46:19 UTC
Test Version:

1. install rhvh-
2. check clevis packages:
   # rpm -qa | grep clevis
3. the query result is:

Test result:
The clevis packages were pre-installed in the RHVH image without a subscription to RHSM.
This bug has been fixed in the latest RHVH 4.4.0 build, so I am moving the status to "VERIFIED".

Comment 36 errata-xmlrpc 2020-08-04 13:27:17 UTC
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory (RHV RHEL Host (ovirt-host) 4.4), and where to find the updated files, follow the link below.

If the solution does not work for you, open a new bug report.

