From: Hartmut Goebel
Subject: Re: [GNUnet-developers] Proposal: Make GNUnet Great Again?
Date: Sat, 9 Feb 2019 13:06:28 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Thunderbird/60.4.0

On 09.02.19 at 12:18, Christian Grothoff wrote:
>   Is this really what you believe will attract new developers!?
>   Dozens of repositories with crazy dependency chains?

I'm not going into the dependency details, as I'm not aware of the
dependencies. But basically a layered approach would make sense IMHO;
if the slicing leads to circular dependencies, that would be a sign
that the layering is wrong.

This could easily be worked around by providing some "meta-repo"
containing a few scripts for downloading all repos and maybe some
meta-Makefile. Such a meta-repo could also be used to create a single
release tarball (mono-TGZ).
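
For illustration, here is a minimal sketch of such a download script
(the repo URLs and the slicing are made up; the real list would live
in the meta-repo):

    #!/usr/bin/env python3
    """Hypothetical meta-repo helper: clone or update all component repos."""
    import subprocess
    from pathlib import Path

    # Made-up example slicing, for illustration only.
    REPOS = [
        "https://git.gnunet.org/gnunet-util.git",
        "https://git.gnunet.org/gnunet-core.git",
        "https://git.gnunet.org/gnunet-apps.git",
    ]

    for url in REPOS:
        name = url.rsplit("/", 1)[-1]
        if name.endswith(".git"):
            name = name[:-len(".git")]
        dest = Path(name)
        if dest.is_dir():
            # Already cloned: just fast-forward to the latest state.
            subprocess.run(["git", "-C", str(dest), "pull", "--ff-only"],
                           check=True)
        else:
            subprocess.run(["git", "clone", url], check=True)

A meta-Makefile could then simply iterate over the same list to build
each repo in dependency order.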


>   I also do not quite see what you want to do with the GitLab CI
>   that could not properly handle the bigger repo we have today
>   (especially given that Buildbot is fine with it today)

I'm looking at this from both the CI perspective and the packaging
perspective.

Assume we have smaller (but reasonably sliced) repos:

  * The total number of build triggers is the same as for a larger repo
    (assuming each push is a trigger).
  * The number of build triggers per repo goes down (I can't estimate
    by how much).
  * The build time of each repo is shorter.
  * If the build passes, CI/CD could "deploy" a fresh "binary package"
    which dependent packages can consume, speeding up their builds
    (see the sketch after this list).
  * When packaging, smaller repos/archives are quicker to package,
    both in terms of build time and of understanding how to build them.
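
As a rough sketch of that "binary package" idea (the artifact store
path, package name, and staging directory are all assumptions for
illustration, not an existing GNUnet setup):

    #!/usr/bin/env python3
    """Hypothetical CI deploy step: publish a successful build as a
    tarball that dependent repos' CI jobs can unpack instead of
    rebuilding the dependency from source."""
    import subprocess
    import tarfile
    from pathlib import Path

    ARTIFACTS = Path("/var/ci-artifacts")  # assumed shared artifact store

    # Key the package by the commit that was built.
    commit = subprocess.run(
        ["git", "rev-parse", "HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    package = ARTIFACTS / f"gnunet-util-{commit}.tar.gz"

    # "stage" is assumed to hold the result of `make install DESTDIR=...`.
    if not package.exists():
        with tarfile.open(package, "w:gz") as tar:
            tar.add("stage", arcname=".")

A dependent repo's CI job would then look up the tarball for the
dependency's current commit and unpack it instead of building the
dependency from source.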

Assume we have a huge repo:

  * The total number of build triggers is the same as for smaller
    repos (assuming each push is a trigger).
  * The build time of each repo is (much) longer, since the whole repo
    is built from scratch: as no files from a previous build are
    available, everything has to be rebuilt.
  * Developers get the CI results later and sit around waiting for
    them. (One of my projects takes 1:30 hours to finish CI, which is
    quite annoying.)
  * When packaging (.deb, .rpm, Guix), huge repos/archives are much
    more tedious to package: the build time is long, the test time is
    long, and if anything fails or new patches are required, you start
    over. (Some of the KDE packages take 15 minutes to build; iterating
    on this is really painful!)

    (Some of these issues could be mitigated when configuring GitLab
    CI; see
    https://docs.gitlab.com/ce/ci/yaml/README.html#onlychanges-and-exceptchanges)

Also, from a developer's perspective a huge repo has some drawbacks:
e.g. when switching branches or bisecting, git touches a lot of files,
all of which need to be rebuilt, and that takes time.

just my 2c

-- 
Kind regards
Hartmut Goebel
Dipl.-Informatiker (univ), CISSP, CSSLP, ISO 27001 Lead Implementer
Information Security Management, Security Governance, Secure Software
Development

Goebel Consult, Landshut
http://www.goebel-consult.de

Blog:
https://www.goe-con.de/blog/steuerbehoerden-torpetieren-freie-software
Column:
https://www.goe-con.de/hartmut-goebel/cissp-gefluester/2012-01-in-die-cloud-in-die-cloud-aber-wo-soll-die-sein


