guix-devel

Re: Maybe a way to get a few more developers to work on Guix?


From: Csepp
Subject: Re: Maybe a way to get a few more developers to work on Guix?
Date: Sun, 25 Jun 2023 17:53:25 +0200

Nicolas Graves <ngraves@ngraves.fr> writes:

> On 2023-06-24 13:08, Csepp wrote:
>
>> Nicolas Graves via "Development of GNU Guix and the GNU System 
>> distribution." <guix-devel@gnu.org> writes:
>>
>>> https://www.bpifrance.fr/nos-appels-a-projets-concours/appel-a-projets-communs-numeriques-pour-lintelligence-artificielle-generative
>>>
>>> Here's a call for proposal in French which could match a Guix project
>>> with a focus on code generation through LLMs. This could itself help
>>> Guix generate (and fix) package definitions.
>>
>> Mandatory reading for anyone interested in this:
>> https://limited.systems/articles/climate-cost-of-ai-revolution/
>
> I fully agree on that as well. I don't think training a full-fledged LLM
> is anywhere near worth it, but I've heard that just fine-tuning an open
> model with quantized weights might get the job done and bring training
> and energy costs down to a few hundred bucks.

From the "Key points" section right at the beginning of the linked article:
"Training of large AI models is not the problem"
"Large-scale use of large AI models would be unsustainable"

It's not the cost in money that I care about, it's the cost in
emissions, which is not included in the few hundred bucks
(i.e., it's an externality).

However, using other machine learning models that can actually run
efficiently might make sense.
But really, what we need for package availability is more support code
in the form of better importers and updaters, better testing, better
coverage, more accessible workflows (the email workflow could certainly
be improved by *a lot*), better bug tracking, and so on.

There are also well-founded concerns about the quality of the code
that LLMs produce.


