Bug 1420540

Summary: RFE: Easy way to build directly from spec files
Product: [Community] Copr
Reporter: Megh Parikh <meghprkh>
Component: backend
Assignee: clime
Status: CLOSED CURRENTRELEASE
Severity: unspecified
Priority: unspecified
Version: unspecified
CC: brian, clime, john.ellson, mike, msuchy, praiskup
Target Milestone: ---
Target Release: ---
Keywords: FutureFeature
Hardware: Unspecified
OS: Unspecified
Type: Bug
Last Closed: 2017-07-07 20:00:10 UTC
Bug Depends On: 1384609

Description Megh Parikh 2017-02-08 22:40:52 UTC
Many projects have a simple spec file with the source URL specified in Source0; to build them, you just download the sources and make an SRPM using fedpkg. While I may be wrong (I am not an experienced packager), could a feature be added that simply allows uploading the SPEC file (with the sources fetched and the SRPM built using fedpkg automatically)?
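
For reference, a minimal sketch of the manual workflow this RFE asks Copr to automate, assuming rpmdevtools (for spectool) and rpm-build are installed and that "example.spec" stands in for the real spec file:

    # fetch the sources listed in Source0/SourceN into the current directory
    spectool -g -C . example.spec
    # build only the source RPM from the spec plus the downloaded sources
    rpmbuild -bs example.spec --define "_sourcedir $PWD" --define "_srcrpmdir $PWD"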

Comment 1 clime 2017-02-09 08:01:37 UTC
Hmm, this is a very interesting idea...

Comment 2 Pavel Raiskup 2017-02-09 08:25:44 UTC
FWIW, something like that was discussed in:
https://lists.fedorahosted.org/archives/list/copr-devel@lists.fedorahosted.org/thread/34DIQOJ77ZJOJTAA2CFJGAKMURM4FAAW/

Where Mirek suggested that before we add something like this into
Copr, we should have it as a separate project.  So I started:
https://github.com/praiskup/srpm-tools

We use that SRPM generator for CI purposes in the pgjdbc project:
https://github.com/pgjdbc/pgjdbc/blob/master/packaging/rpm/.srpmgen
https://copr.fedorainfracloud.org/coprs/g/pgjdbc/pgjdbc-travis/builds/

Comment 3 Pavel Raiskup 2017-02-09 08:41:21 UTC
We should probably resolve bug 1384609 first, because downloading the
upstream tarball for each chroot would be a sort of DoS.

Comment 4 John Ellson 2017-02-09 21:25:40 UTC
I like the goal, but for some projects the tar.gz and the .spec are the products of a separate build stage that generates the distribution-independent source package.

I'd like a CI tool that could automate that stage and virtualize the storage of the intermediate products:

e.g. for graphviz, the first stage of building runs on a single host that has some extra tools installed (see the sketch after this list):

    - clones the source from git.
    - runs ./autogen.sh
          -- runs tools like automake, autoconf, libtoolize, swig, yacc, lex, ...
          -- extracts version info from the last git checkin, writes it into the .spec
    - runs "make dist"  (and so omits unused stuff from git)
    - stores the tar.gz (and the contained .spec) somewhere public  (which relates to bug 1384609)
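
A hedged sketch of that first stage (the git URL, tool list, and publishing step are illustrative; graphviz's real scripts may differ):

    # stage 1: run on the one host that has the generator toolchain installed
    git clone <graphviz-git-url> && cd graphviz
    ./autogen.sh      # automake/autoconf/libtoolize/swig/yacc/lex; writes version info into the .spec
    ./configure       # if autogen.sh does not already run it
    make dist         # distro-independent tarball, without unused files from git
    # then copy the resulting graphviz-*.tar.gz (with the generated .spec inside) somewhere public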


I'm currently looking at Pavel's "srpm-tools".  This looks like it's heading in the right direction.  Currently I've only installed it, and I'm still a bit confused about its use of .spec files: are there two specs, one for the SRPMs and a second for the RPMs?

Comment 5 Pavel Raiskup 2017-02-10 08:40:35 UTC
(In reply to John Ellson from comment #4)
> I like the goal,  but for some projects, the tar.gz and the .spec, are the
> products of a separate build stage that generates the distribution-independent
> source package.
> 
> I'd like a CI tool that could automate that stage, and virtualize the
> storage of the intermediate products:
> 
> e.g. for graphviz, the first stage of building is on a single host that has
> some extra tools installed:  
> 
>     - clones the source from git.
>     - runs ./autogen.sh
>           -- runs tools like automake, autoconf, libtoolize, swig, yacc, lex,
> ...
>           -- extracts version info from last git checkin, writes into .spec
>     - runs "make dist"  (and so omits unused stuff from git)
>     - stores the tar.gz (and the contained .spec) somewhere public  ( which
> relates to bug 1384609 )

This all makes sense, but it looks more like a "push" feature (e.g. driven by
travis-ci or a similar tool) than a "pull" feature (doable by copr).
Especially the '.. stores the tar.gz somewhere public' part.

> I'm currently looking at Pavel's "srpm-tools".   This looks like its heading
> in the right direction.   Currently I've only installed it, and I'm still a
> bit confused about its use of .spec files:  are there two specs, one for the
> SRPMs and a second for the RPMS ?

There are still two major things to be done in srpm-tools: autotools support
and an automatic 'version bump' (by which I mean patching the Version: or
Release: tag in the spec file on the fly).  What I really want to keep is
having one static (and manually edited) spec file for both CI and "real
builds".  The autotools support is tricky because it makes the SRPM build
Turing-complete and thus brings security consequences for the user.

Because of the lack of this functionality in srpm-tools, in pgjdbc we only
have a "postgresql-jdbc.spec.template" file which is instantiated by travis.
Is that what you are asking about?

Comment 6 Miroslav Suchý 2017-02-10 11:51:18 UTC
@john - The problem is *very* general and very big.
We could add (besides tito and mock-scm) support to e.g. run "make srpm" in a git checkout and let people put the rule for how to generate it in the makefile. But surprisingly, a lot of people generate even the makefile. OK, let's allow the user to enter any command that will generate the srpm, e.g. "./autogen.sh && make srpm". But what packages need to be installed before this command can be run? The SRPM has build requires, which we could use. OK then, let people point to commands like "sudo dnf install MYREQUIRES && ./autogen.sh && make srpm". But EL7 uses YUM and not DNF, and the packages are named differently. Can we have some macro substitution there? ... You see, this is getting awfully big and complicated.

I will give a thumbs up to any tool which generates an SRPM with one simple command (and is packaged for Fedora). Supporting general tools like automake is not the right way, as everybody uses them differently and it will not satisfy everybody (not even a major fraction).
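
To make "one simple command" concrete, a hypothetical helper a project could ship might look like this (the script name, the package list, and the dnf/yum difference are all illustrative assumptions, not an existing Copr interface):

    #!/bin/bash
    # make-srpm.sh -- hypothetical "generate the SRPM with one command" wrapper
    set -e
    sudo dnf install -y autoconf automake libtool   # on EL7 this would be yum, possibly with differently named packages
    ./autogen.sh
    ./configure
    make dist
    rpmbuild -ts ./*.tar.gz --define "_srcrpmdir $PWD"   # SRPM from the freshly generated tarball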

Comment 7 John Ellson 2017-02-10 15:29:03 UTC
@Pavel - pull vs. push - the distro-independent product just has to be accessible to the next stage at a public URL, so either should be workable; i.e. it doesn't have to be in some third-party public space.

But note that it is the creation of the distro-independent package that triggers the second-stage builds, rather than the original git commit directly.

@Miroslav - I see the product of the source build as a tar.gz rather than an SRPM.  I'd like to support Debian, Windows, and MacOS builds as well as Red Hat builds, all from the same distro-independent sources.

Building the SRPM for Red Hat builders is then just an "rpmbuild -ts xxx.tar.gz".

We need a convention, but it could be a separate "static" .spec (and perhaps a "static" Makefile) in the git repo for generating the distro-independent source.  Only the second, i.e. generated, .spec in the distro-independent source would contain version info.

Currently, for graphviz, this "convention" is "./autogen.sh; make dist", but it is true that this doesn't codify its toolchain requirements.

I'm not hung up on autotools being the right answer.  I think what we're missing here is agreement that, for many projects, building from git is a two-stage process, and that there is a need for a hand-off between the two stages.

In my mind, the first stage is:
    git -> distro-independent-source-package
and the second stage is:
    distro-independent-source-package -> distro-specific-binary-packages

There is only a need for one first stage, but there are many second stages, and they are not all Red Hat.

I can see the question coming:  "Why not just a one-stage build from git?"
     - a strong desire to not store anything that can be generated in git
     - problem of version number in spec file
     - avoidance of special toolchain requirements on all but one platform
     - avoidance of duplicated build effort for distro-independent products:
           -- autoconf, automake, libtool, 
           -- yacc, lex, swig, 
           -- documentation generators
     - sources needed for binaries != all sources in git repo

Comment 8 Brian J. Murrell 2017-04-24 12:58:09 UTC
https://pagure.io/copr/copr/issue/60 requests much the same, FWIW.

Comment 9 Mike Goodwin 2017-05-06 17:34:12 UTC
I'm mostly interested in being able to point copr toward my spec repository, because I have almost no affiliation with most of the things I'm packaging. Therefore I maintain my own separate repo of specs for these packages, and at present I'm stuck uploading SRPM files for each build, or at best I could automate that locally with some machine of mine running a cron job and copr-cli. None of that is ideal.

I basically have a bash alias that effectively does what I want, which is why I fail to see why copr couldn't include a simple option to do it:

alias mockbuild='\
        rpmlint *.spec &&
        spectool -g *.spec &&
        __mockf() {
                local SRPM=$( mock "$@" -v --resultdir . --buildsrpm --spec *.spec --sources . 2>&1 | awk "/ Wrote: /{ print gensub(\".*/([^/]+)$\",\"\\\1\",\"g\",\$3) }" )
                mock "$@" --resultdir . "${SRPM:?ERROR: Missing SRPM}"; }; __mockf'


Surely that doesn't account for a lot of edge cases, but I know its behavior upfront, so the lack of generality doesn't bother me; it accomplishes a few things...

1. Downloads the source with spectool
2. Builds the SRPM with mock (Why can't we at least do this?) 
3. Kicks off the mock build with the resultant SRPM 

This is effectively all I'm looking to do as someone who maintains a _repo of specs_, versus a developer of projects who can place specs within them. I stress "repo of specs" because it's clear that a lot of this was not designed around the idea that people would be packaging others' free software in copr, and I think that needs to change.

Comment 10 Mike Goodwin 2017-05-06 17:40:28 UTC
https://bugzilla.redhat.com/show_bug.cgi?id=1397508

This ticket also expresses basically the same sentiment as mine.

Comment 11 clime 2017-05-15 13:03:33 UTC
(In reply to Megh Parikh from comment #0)
> Many projects have a simple spec file with the source URL specified in
> Source0; to build them, you just download the sources and make an SRPM
> using fedpkg. While I may be wrong (I am not an experienced packager),
> could a feature be added that simply allows uploading the SPEC file (with
> the sources fetched and the SRPM built using fedpkg automatically)?

It's possible in COPR by using Tito with GitAnnex (https://m0dlx.com/blog/Reproducible_builds_on_Copr_with_tito_and_git_annex.html). However, see bug https://bugzilla.redhat.com/show_bug.cgi?id=1450950 - it might also affect you.

Could you test the Tito + GitAnnex method on your package? It would be super-useful feedback.

Comment 12 clime 2017-07-07 20:00:10 UTC
Building directly from .spec files is now possible as of https://pagure.io/copr/copr/c/df9d0f994b643f9437b83ae8d7dfb3afd6b7938f?branch=master. You can use the URL or upload methods to submit an updated .spec file.

Comment 13 Brian J. Murrell 2017-07-09 13:05:34 UTC
Will this work with tito such that one only needs the .spec file in a tito project/subdir and no longer needs the tarball as well?

Any ETA for when this will be live on Fedora Copr?  :-)  Or how would one know when a release with this feature has gone live on Fedora Copr, so that one knows when one can take advantage of it?

Comment 14 Brian J. Murrell 2017-07-30 13:17:07 UTC
Does anyone have any answers to the above (comment #13)?

Comment 15 clime 2017-11-03 05:08:06 UTC
(In reply to Brian J. Murrell from comment #13)
> Will this work with tito such that one only needs the .spec file in a tito
> project/subdir and not the tarball also any more?

Hello, you can use the SCM source type for this. Tito has NoTgzBuilder for these purposes. Or you can use `rpkg`, which does not need any additional configuration (like 'tito.props' for tito).
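
For reference, a minimal sketch of the tito variant (assuming the repository carries only the .spec; the exact tito.props contents here are illustrative):

    # tito.props next to the .spec, telling tito not to expect a tarball in git
    [buildconfig]
    builder = tito.builder.NoTgzBuilder
    tagger = tito.tagger.VersionTagger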

Comment 16 Mike Goodwin 2017-11-03 17:02:59 UTC
The biggest problem, though, is what do you do when you don't own the repo?

If you want to build upstream projects with SCM, it seems the only option is to fork the project, include your .spec, and then use that fork for SCM building. The problem is that now you're back to managing things like merging changes from upstream into the fork, which rather negates the whole point.

It would be really nice to be able to point to a git repository and side-car a .spec.

Comment 17 clime 2017-11-06 11:47:56 UTC
The idea is that you can reference the upstream sources from within the .spec file as described here: https://fedoraproject.org/wiki/Packaging:SourceURL#Git_Hosting_Services. Your own repository contains the .spec file, that specfile references the upstream sources, and you add patches as needed. The second option, which you described, is forking the upstream repo and using Git merging instead of downstream patching.
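
As a hedged illustration of that SourceURL pattern (all names and versions are placeholders):

    # excerpt of a downstream-maintained spec referencing upstream sources on GitHub
    Name:     example
    Version:  1.2.3
    Source0:  https://github.com/OWNER/example/archive/v%{version}/example-%{version}.tar.gz
    Patch0:   0001-downstream-fix.patch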

But yes, there is one problem with this... you cannot automate building from upstream like that. It would be nice to have the following workflow available:

1) set up your Git repo with the spec (and patches or original unpacked sources) that references upstream
2) set up a package for this downstream repo in your COPR project
3) ask the upstream to add a webhook for your COPR project
4) have the upstream rebuilt using your downstream repo with the .spec file

Now the problem is that COPR currently does not allow this, because when a webhook event is sent from GitHub/GitLab/..., the clone URL of the originating repository is checked against the Clone URL in your COPR package definition, and these two must match for the package rebuild to be triggered. We could add an optional "Upstream Clone URL" field to the COPR package definition to be able to specify the Upstream URL, which would be checked as well, and the rebuild would be launched if either of the two (Upstream/Downstream) URLs matched. If you care about this functionality or have a completely different idea, you can open a new RFE. We will be happy to implement this kind of support for a truly Continuous workflow.