Description of problem:
I'm starting to use php-v8js in some projects. It depends on v8 >= 3.24.6. I'm aware that v8 is a fast-moving target, but the version currently shipped is now over three years old. From the php-v8js developers: "The reason is that V8 introduced isolates meanwhile and V8Js requires them". Many thanks.

Version-Release number of selected component (if applicable):
v8-3.14.5.10

How reproducible:
n/a

Steps to Reproduce:
n/a

Actual results:
n/a

Expected results:
n/a

Additional info:
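For context, the "isolates" the V8Js developers mention refer to the embedding API introduced after 3.14, in which every script execution happens inside an explicitly created v8::Isolate. The snippet below is a rough sketch of that API, loosely following the upstream embedder "Hello World" for 5.x-era headers; exact signatures drift between V8 releases, so treat it as illustrative rather than exact.

// Sketch of the isolate-based V8 embedding API (roughly 5.x-era headers).
// Signatures change between releases; illustrative only.
#include <cstdio>
#include <libplatform/libplatform.h>
#include <v8.h>

int main() {
  // One-time process setup.
  v8::Platform* platform = v8::platform::CreateDefaultPlatform();
  v8::V8::InitializePlatform(platform);
  v8::V8::Initialize();

  // Each context lives inside its own isolate -- the feature V8Js needs
  // and that the 3.14 API does not provide.
  v8::Isolate::CreateParams params;
  params.array_buffer_allocator =
      v8::ArrayBuffer::Allocator::NewDefaultAllocator();
  v8::Isolate* isolate = v8::Isolate::New(params);
  {
    v8::Isolate::Scope isolate_scope(isolate);
    v8::HandleScope handle_scope(isolate);
    v8::Local<v8::Context> context = v8::Context::New(isolate);
    v8::Context::Scope context_scope(context);

    v8::Local<v8::String> source =
        v8::String::NewFromUtf8(isolate, "'Hello from an isolate'");
    v8::Local<v8::Script> script =
        v8::Script::Compile(context, source).ToLocalChecked();
    v8::Local<v8::Value> result = script->Run(context).ToLocalChecked();
    v8::String::Utf8Value utf8(result);
    std::printf("%s\n", *utf8);
  }
  isolate->Dispose();
  v8::V8::Dispose();
  v8::V8::ShutdownPlatform();
  delete platform;
  return 0;
}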
The linked gist is a Dockerfile that builds, in a CentOS 7 container, a compiled libv8 newer than 5.3 and a v8js.so matching the PHP version in use (PHP is taken from the IUS repository). It is configurable for the libv8 version, the v8js version, and the PHP version, defaulting to php56u, v8js 0.6.4, and libv8 5.7.492.69. (It is provided as a Dockerfile simply to isolate the build environment and keep the build process consistent.) By all means, if this helps expedite creating this package, please use it (with or without attribution).
https://gist.github.com/Logos01/ddf8e988fa11464d878eac846b843f2b
Fedora added a v8-314 package in F25, and now ships v8-314 3.14 and v8 6.2. Would it be possible to do something similar in EPEL7?

1. Branch and build v8-314 for EPEL7.
2. Update v8 in EPEL7 to 6.2 (probably merge back from rawhide).
2a. Make sure to remove the v8-314 virtual Provides from v8 in EPEL7.

This would allow shipping libv8.so.3 and libv8.so.6 in parallel. Packages already built against v8 3.14 (libv8.so.3) will start resolving their dependency to v8-314 instead. Applications that are compatible with libv8.so.6 would continue to BuildRequire v8-devel and will link against the new soname the next time they are rebuilt. Applications that are not compatible would need to change their BuildRequires to v8-314-devel (camotics already does this); see the sketch below for the kind of code that forces this.
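To make the compatibility split concrete, here is a rough, hypothetical sketch of pre-isolate code in the 3.14 style: it builds against v8-314-devel (libv8.so.3) but will not compile against the 6.2 headers, where these entry points either require an explicit v8::Isolate or have been removed. Packages written this way are the ones that need their BuildRequires switched to v8-314-devel.

// Illustrative 3.14-era embedding calls (no isolates); not taken from any
// particular package. Code like this needs v8-314-devel, not v8-devel.
#include <v8.h>

void run_legacy_snippet() {
  v8::HandleScope handle_scope;                    // no isolate argument
  v8::Persistent<v8::Context> context = v8::Context::New();
  v8::Context::Scope context_scope(context);

  v8::Handle<v8::String> source = v8::String::New("1 + 1");
  v8::Handle<v8::Script> script = v8::Script::Compile(source);
  v8::Handle<v8::Value> result = script->Run();
  (void)result;                                    // result handling omitted

  context.Dispose();                               // pre-isolate Persistent API
}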
EPEL 7 entered end-of-life (EOL) status on 2024-06-30.

EPEL 7 is no longer maintained, which means that it will not receive any further security or bug fix updates. As a result we are closing this bug.