Bug 2196825

Summary: satellite-installer fails with error "Ruby Integer outside of Puppet Integer max range" if a filesystem with 8.00 EiB is mounted
Product: Red Hat Satellite Reporter: Stefan Meyer <smeyer>
Component: Installation Assignee: satellite6-bugs <satellite6-bugs>
Status: CLOSED MIGRATED QA Contact: Satellite QE Team <sat-qe-bz-list>
Severity: high Docs Contact:
Priority: high    
Version: 6.11.4 CC: ahumbe, ehelms, jpasqual, mkalyat, rlavi, wclark, yferszt
Target Milestone: Unspecified Keywords: MigratedToJIRA, Triaged, Upgrades
Target Release: Unused   
Hardware: Unspecified   
OS: Unspecified   
Whiteboard:
Fixed In Version: Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of: Environment:
Last Closed: 2024-06-06 16:14:00 UTC Type: Bug
Regression: --- Mount Type: ---
Documentation: --- CRM:
Verified Versions: Category: ---
Embargoed:

Description Stefan Meyer 2023-05-10 11:04:07 UTC
Description of problem:
satellite-installer fails with error "Ruby Integer outside of Puppet Integer max range" if a filesystem with 8.00 EiB is mounted

Version-Release number of selected component (if applicable):
- Satellite & Capsule 6.11.4

How reproducible:
- The customer has a NetApp file share mounted via NFS v4

Steps to Reproduce:
1. Mount a filesystem with a size of 9223371665325424640 bytes
2. Run satellite-installer

Actual results:
The satellite installer fails with the following error:

  [ERROR ] [configure] Evaluation Error: Error while evaluating a '=>' expression, Use of a Ruby Integer outside of Puppet Integer max range, got '0x8000000000000000' (file: /usr/share/foreman-installer/modules/foreman_proxy/manifests/register.pp, line: 23, column: 7) on node XXX

Expected results:
The satellite installer should complete successfully regardless of how large a mounted filesystem is

Additional info:

Facter output for the affected mount (device, paths and addresses redacted):

...
/redacted => {
    available => "8.00 EiB",
    available_bytes => 9223371665325424640,
    capacity => "0.00%",
    device => "redacted",
    filesystem => "nfs4",
    options => [
      "rw",
      "relatime",
      "vers=4.1",
      "rsize=1048576",
      "wsize=1048576",
      "namlen=255",
      "hard",
      "proto=tcp",
      "timeo=600",
      "retrans=2",
      "sec=sys",
      "clientaddr=redacted",
      "local_lock=none",
      "addr=redacted"
    ],
    size => "8.00 EiB",
    size_bytes => 9223372036854775808,
    used => "346.01 GiB",
    used_bytes => 371529351168
  },
...
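
The hexadecimal value in the error, 0x8000000000000000, corresponds to size_bytes above: 9223372036854775808 = 2^63, which is exactly one more than Puppet's Integer maximum of 2^63 - 1, while available_bytes (9223371665325424640) still fits within the range. A minimal plain-Ruby sketch of that arithmetic (no Puppet required):

puppet_integer_max = 2**63 - 1            # 9223372036854775807, upper bound of Puppet's Integer type
size_bytes         = 9223372036854775808  # size_bytes reported for the 8.00 EiB NFS mount

puts format('0x%x', size_bytes)                  # => 0x8000000000000000, the value from the error
puts size_bytes > puppet_integer_max             # => true: one past the maximum
puts 9223371665325424640 > puppet_integer_max    # => false: available_bytes is still in range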

Comment 1 Ewoud Kohl van Wijngaarden 2023-05-11 14:30:41 UTC
I did some minimal testing by defining a custom fact with just a large number:

cat > lib/facter/large.rb <<EOF
Facter.add(:large) do
  setcode do
    9223371665325424640
  end
end
EOF
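
For completeness, a minimal sketch for checking that the custom fact resolves outside of Puppet, assuming the facter Ruby gem is on the load path and the fact file sits under lib/facter as in the heredoc above:

require 'facter'

Facter.search('lib/facter')  # add the directory containing large.rb to the custom-fact search path
puts Facter.value(:large)    # expected to print 9223371665325424640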

Then applying the following manifest does work (with Puppet 6.28.0):

if $facts['large'] > 1000 {
  file { '/tmp/bla':
    ensure => file,
    content => String($facts['large']),
  }
}

My next guess was some conversion in custom types, but even there, with a small custom type, I couldn't easily reproduce it:

Puppet::Type.newtype(:largenumber) do
  ensurable
  newparam(:name)
  newproperty(:content)
end

Puppet::Type.type(:largenumber).provide(:default) do
  require 'json' # needed for JSON.load in prefetch as well as create/flush below

  mk_resource_methods

  def self.instances
    []
  end

  def self.prefetch(resources)
    resources.each do |name, resource|
      if File.exist?(name)
        resource.provider = new(ensure: :present, content: JSON.load(File.read(name)))
      else
        resource.provider = new(ensure: :absent)
      end
    end
  end

  def create
    require 'json'
    File.write(name, JSON.dump(resource.should(:content)))
    @property_hash[:ensure] = :present
    @property_hash[:content] = resource.should(:content)
  end

  def flush
    require 'json'
    File.write(name, JSON.dump(@property_hash[:content]))
  end

  def exists?
    return File.file?(name)
  end
end

That also happily wrote out the content.

So it needs some further investigation to find out where it's happening.
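
One detail that may matter here: the value used in the test fact above, 9223371665325424640, is still below Puppet's Integer maximum of 2^63 - 1, whereas the value in the installer error, 0x8000000000000000, is 2^63 exactly (the size_bytes fact). A hypothetical variant of the fact using that boundary value instead:

# Hypothetical variant of lib/facter/large.rb that returns 2**63 exactly,
# i.e. the 0x8000000000000000 value from the installer error; unlike
# 9223371665325424640, this is one past Puppet's Integer maximum.
Facter.add(:large) do
  setcode do
    2**63 # 9223372036854775808
  end
end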

Comment 8 Eric Helms 2024-06-06 16:14:00 UTC
This BZ has been automatically migrated to the issues.redhat.com Red Hat Issue Tracker. All future work related to this report will be managed there.

Due to differences in account names between systems, some fields were not replicated. Be sure to add yourself to the Jira issue's "Watchers" field to continue receiving updates, and add others to the "Need Info From" field to continue requesting information.

To find the migrated issue, look in the "Links" section for a direct link to the new issue location. The issue key will have an icon of 2 footprints next to it, and begin with "SAT-" followed by an integer.  You can also find this issue by visiting https://issues.redhat.com/issues/?jql= and searching the "Bugzilla Bug" field for this BZ's number, e.g. a search like:

"Bugzilla Bug" = 1234567

In the event you have trouble locating or viewing this issue, you can file an issue by sending mail to rh-issues. You can also visit https://access.redhat.com/articles/7032570 for general account information.

Comment 9 Red Hat Bugzilla 2024-10-05 04:26:04 UTC
The needinfo request[s] on this closed bug have been removed as they have been unresolved for 120 days.