Bug 1576608
| Summary: | System.TermInfoReader cannot handle new NCurses 6 TermInfo files | | |
|---|---|---|---|
| Product: | Fedora | Reporter: | Dirk Hoffmann <hoffmann> |
| Component: | mono | Assignee: | Xavier Lamien <lxtnow> |
| Status: | CLOSED DUPLICATE | QA Contact: | Fedora Extras Quality Assurance <extras-qa> |
| Severity: | high | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | 28 | CC: | alexl, chkr, duffy, h.pillay, itamar, john.j5live, lxtnow, mbarnes, pokorra.mailinglists, rhughes, rstrode, sandmann |
| Target Milestone: | --- | | |
| Target Release: | --- | | |
| Hardware: | x86_64 | | |
| OS: | Linux | | |
| Whiteboard: | | | |
| Fixed In Version: | | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2018-06-07 04:41:20 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
Description

Dirk Hoffmann 2018-05-09 23:29:56 UTC

Exactly the same issue. I cleaned out pdfmod and reinstalled it, and it gave the same error. The system is a brand-new Fedora 28 installation (i.e. not upgraded, 64-bit).

It's a known bug in mono (pdfmod uses mono). Here's info on the bug from upstream: https://github.com/mono/mono/issues/6752

In the meantime, a workaround is to set TERM=xterm before running pdfmod. That should let you run it. There is already a fix upstream, as I understand it, so for Fedora to fix this we'd need updated mono packages.

I can confirm that the workaround does solve the issue for the time being on my systems. Since I use pdfmod as my viewer with mutt, I've added the following to the mailcap file (I added the export TERM=xterm portion):

application/pdf; { set -m \; /bin/mv -T %s %s.mv \; ( export TERM=xterm\; pdfmod %s.mv \; /bin/rm %s.mv \; ) & } \; disown -a

*** This bug has been marked as a duplicate of bug 1580447 ***

(In reply to Harish Pillay from comment #3)
> I can confirm that the workaround does solve the issue for the time being on
> my systems.

+1 Thanks!
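The mailcap wrapper above can be sketched as a standalone script. This is a minimal, hypothetical illustration of the same rename/view/cleanup dance: the temp file is renamed so mutt cannot delete it out from under the viewer, and TERM=xterm is forced so mono's System.TermInfoReader gets a terminfo entry it can parse. `cat` stands in for `pdfmod` here so the sketch runs without mono installed.

```shell
#!/bin/sh
# Hypothetical sketch of the mailcap workaround; 'cat' substitutes for pdfmod.
tmp="$(mktemp)"
printf 'fake pdf contents' > "$tmp"

/bin/mv -T "$tmp" "$tmp.mv"           # take ownership of the temp file
( export TERM=xterm; cat "$tmp.mv" )  # run the viewer with the forced TERM
/bin/rm "$tmp.mv"                     # clean up our renamed copy
# prints "fake pdf contents"
```

The `set -m`/`disown -a` parts of the real mailcap entry are left out of the sketch; in mailcap they serve to background the viewer so mutt is not blocked while the PDF is open.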