Description of problem:
Ran my script: /etc/scripts/yumcheck

  #!/bin/env bash
  #
  service autofs restart && \
  yum clean headers metadata dbcache expire-cache rpmdb && \
  createrepo -v -C --update /var/cache/yum/18/i386/_lan/packages && \
  yum install -y --disablerepo=_lan --downloadonly `cat /mnt/lists/18/i386/18_32.join` && \
  yum install --downloadonly --enablerepo=updates-testing,koji kernel kernel-PAE kernel-doc

Saw the following:

  ./yumcheck
  Redirecting to /bin/systemctl restart autofs.service
  Loaded plugins: fastestmirror, lan, langpacks, protectbase, tidy-cache
  Cleaning repos: _lan adobe-linux-i386 fedora fedora-abrt fedora-calibre fedora-chromium-stable
                : fedora-pulseaudio-backport fedora-yum-rawhide fedora-yumex fedorautils livna remi
                : rpmfusion-free rpmfusion-free-updates rpmfusion-nonfree rpmfusion-nonfree-updates
                : updates
  Error: Error making cache directory: /var/cache/yum/18/i386/_lan error was: [Errno 17] File exists: '/var/cache/yum/18/i386/_lan'

  cat /var/log/messages | grep createrepo
  Jun  8 15:34:01 yumser32 yum[2684]: Erased: createrepo-0.9.9-5.fc16.noarch
  Jun 11 09:01:10 yumser32 abrt: detected unhandled Python exception in '/usr/share/createrepo/genpkgmetadata.py'
  Jun 13 09:01:09 yumser32 abrt: detected unhandled Python exception in '/usr/share/createrepo/genpkgmetadata.py'

This is a KVM F18.i386 guest on an F18.x86_64 host. The host is an NFS-based yum server and takes care of 64-bit; the guest takes care of 32-bit. The yum cache is exported as /opt/udshare (a bind mount of /var/cache/yum), and the guest mounts /opt/udshare on /var/cache/yum.

The script lines work when run as individual commands, just not together. The cache is mounted; I have checked with "mount" on the guest:
  yumser.frankly3d.home:/opt/udshare on /var/cache/yum type nfs4 (rw,relatime,vers=4.0,rsize=8192,wsize=8192,namlen=255,hard,proto=tcp,port=0,timeo=600,retrans=2,sec=sys,clientaddr=192.168.1.10,local_lock=none,addr=192.168.0.7)

Version-Release number of selected component:
createrepo-0.9.9-21.fc18

Additional info:
reporter:       libreport-2.1.4.28.g07a3
cmdline:        /usr/bin/python -t /usr/share/createrepo/genpkgmetadata.py -v /var/cache/yum/18/i386/_lan/packages
dso_list:       yum-metadata-parser-1.1.4-7.fc18.i686
executable:     /usr/share/createrepo/genpkgmetadata.py
kernel:         3.9.5-200.fc18.i686.PAE
runlevel:       N 5
uid:            0

Truncated backtrace:
sqlitecachec.py:61:getOtherdata:TypeError: Can not create packages table: disk I/O error

Traceback (most recent call last):
  File "/usr/share/createrepo/genpkgmetadata.py", line 294, in <module>
    main(sys.argv[1:])
  File "/usr/share/createrepo/genpkgmetadata.py", line 272, in main
    mdgen.doRepoMetadata()
  File "/usr/lib/python2.7/site-packages/createrepo/__init__.py", line 984, in doRepoMetadata
    rp.getOtherdata(complete_path, csum)
  File "/usr/lib/python2.7/site-packages/sqlitecachec.py", line 61, in getOtherdata
    self.repoid))
TypeError: Can not create packages table: disk I/O error

Local variables in innermost frame:
checksum: 'f4c447af9ae3521d835136ef000a584f0275ca2cf645f684a0cfc918fe66f564'
self: <sqlitecachec.RepodataParserSqlite instance at 0x959a56c>
location: '/var/cache/yum/18/i386/_lan/packages/.repodata/other.xml.gz'
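The "[Errno 17] File exists" error on a path that is also reported as missing is the classic check-then-create race: on an autofs-managed path, the directory may not be visible when it is tested for, yet exist by the time mkdir runs. A minimal sketch of an EEXIST-tolerant directory-creation helper that sidesteps this race (this is an illustration, not yum's actual code; `ensure_dir` is a hypothetical name):

```python
import errno
import os
import tempfile

def ensure_dir(dpath, mode=0o755):
    """Create dpath if needed, tolerating a directory that appears
    concurrently (EEXIST) -- e.g. when autofs mounts the path between
    an exists() check and makedirs()."""
    try:
        os.makedirs(dpath, mode)
    except OSError as e:
        # EEXIST is only acceptable if the path really is a directory now.
        if e.errno != errno.EEXIST or not os.path.isdir(dpath):
            raise

# Calling it repeatedly must not raise, unlike a bare check-then-makedirs.
base = tempfile.mkdtemp()
target = os.path.join(base, "18", "i386", "_lan")
ensure_dir(target)
ensure_dir(target)
```

The key point is that EEXIST is treated as success rather than as an error, so a directory materialising "underneath" the caller is harmless.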
Created attachment 760644 [details] File: backtrace
Created attachment 760645 [details] File: core_backtrace
Created attachment 760646 [details] File: environ
It works if "service autofs restart" is commented out. Even "service autofs restart | sleep 1(2)m" still triggers the "File exists" failure.
Seems that the autofs magic confuses yum: /var/cache/yum/18/i386/_lan probably does not exist at (1), but exists by the time (2) runs:

  (1) if os.path.exists(dpath) and os.path.isdir(dpath):
          return
      try:
  (2)     os.makedirs(dpath, mode=0755)
      except OSError, e:
          msg = "%s: %s %s: %s" % ("Error making cache directory",
                                   dpath, "error was", e)
          raise Errors.RepoError, msg

I think you can probably work around this by adding "ls /var/cache/yum/ >/dev/null" or something similar between the autofs restart and "yum clean"...
Thank you, "dir /var/cache/yum/i386" fixed it. Apologies for any noise. I have closed this as "errata".
Great! Adding the --ghost option to /etc/autofs/auto.master would probably fix this, too.
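For reference, --ghost (equivalent to the "browse" option) makes autofs pre-create the mount-point directories of a map, so they exist even before anything is mounted. A hypothetical master-map entry with it enabled (the map name auto.cache is illustrative, not taken from this report):

```
# Pre-create mount points so they are visible before mounting
/var/cache  /etc/auto.cache  --ghost
```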