An integer signedness issue leading to a heap-based buffer overflow was found in the way PHP implemented its scandir() function. If scandir() was used to list files and directories from a directory containing a very large number of files, it could cause PHP to crash or, under some conditions, execute arbitrary code with the permissions of the user running PHP.
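For reference, a minimal sketch of the scandir() usage pattern described above (the path and output below are illustrative, not taken from the report); the overflow would only be reachable when the scanned directory holds an extremely large number of entries:

<?php
// Hypothetical directory that untrusted users can fill with files.
$dir = '/var/www/uploads';

// scandir() returns an array of entry names, or false on failure.
$entries = scandir($dir);
if ($entries === false) {
    die("scandir() failed on $dir\n");
}
printf("%d entries in %s\n", count($entries), $dir);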
Upstream commit for 5.3/5.4: https://github.com/php/php-src/commit/fc74503792b1ee92e4b813690890f3ed38fa3ad5
(In reply to comment #3)
> https://github.com/php/php-src/commit/fc74503792b1ee92e4b813690890f3ed38fa3ad5

http://git.php.net/?p=php-src.git;a=commitdiff;h=fc74503792b1ee92e4b813690890f3ed38fa3ad5
This is public and fixed in 5.4.5 and 5.3.15: Fixed potential overflow in _php_stream_scandir (CVE-2012-2688) (http://www.php.net/ChangeLog-5.php#5.3.15)
Currently, 5.3.15 and 5.4.5 are in testing for Fedora 16 and 17, respectively.
https://access.redhat.com/security/cve/CVE-2012-2688 states that a fix may be coming for this issue, but based on the comments in this bug I do not see any movement for any of the Red Hat provided packages. Is there any update? I know of several RHEL customers who are looking for a fix to this issue.
To clarify, since the description does not indicate how many files are required to trigger this flaw: the directory that the PHP scandir() function is run on must contain more entries than what PHP defines as INT_MAX, which in RHEL 6 is:

main/php.h:229:#define INT_MAX 2147483647

That means you would need more than 2,147,483,647 files in the directory being scanned for this to be a problem. One way to mitigate this is to check, before adding or uploading files to the directory, how many are already in it. Set an upper limit of one million or even ten million files (I suspect severe performance problems would appear well before those limits), and refuse to add new files once the limit is reached. That keeps any script from running scandir() on a directory with too many files (although I do not believe it would be easy to accumulate that many files in a directory without someone noticing severe performance degradation first). A minimal sketch of such a check follows.
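A minimal sketch of that mitigation, assuming a hypothetical upload directory and an illustrative limit (the function name, constant, and value below are not from this report):

<?php
// Hypothetical upper bound, orders of magnitude below INT_MAX, as suggested above.
define('MAX_FILES_PER_DIR', 1000000);

// Refuse to store a new file if the target directory already holds too many entries.
function safe_store_upload($tmpPath, $dir, $name) {
    $entries = scandir($dir);          // still safe at this size
    if ($entries === false) {
        return false;                  // cannot inspect the directory
    }
    // scandir() includes "." and "..", so subtract them from the count.
    if (count($entries) - 2 >= MAX_FILES_PER_DIR) {
        return false;                  // limit reached, reject the new file
    }
    return move_uploaded_file($tmpPath, $dir . '/' . basename($name));
}

The check itself still calls scandir(), which is safe at these sizes; the point is simply to keep the entry count far below INT_MAX.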
This bug is now being flagged as "high severity" in PCI-DSS. Running CentOS 6 and haven't seen this one fixed yet.
(In reply to comment #14)
> This bug is now being flagged as "high severity" in PCI-DSS. Running CentOS
> 6 and haven't seen this one fixed yet.

See the statement in comment #9 of this bug / https://access.redhat.com/security/cve/CVE-2012-2688.
This issue has been addressed in the following products:

  Red Hat Enterprise Linux 6

Via RHSA-2013:0514 https://rhn.redhat.com/errata/RHSA-2013-0514.html
This issue has been addressed in the following products:

  Red Hat Enterprise Linux 5

Via RHSA-2013:1307 https://rhn.redhat.com/errata/RHSA-2013-1307.html
This issue has been addressed in the following products:

  Red Hat Enterprise Linux 5

Via RHSA-2013:1814 https://rhn.redhat.com/errata/RHSA-2013-1814.html