From: Christopher Baines
Subject: bug#70175: [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
Date: Fri, 05 Apr 2024 12:35:02 +0100
User-agent: mu4e 1.12.2; emacs 29.3
John Fremlin via Guix-patches via <guix-patches@gnu.org> writes:
> OpenBLAS is recommended by https://github.com/ggerganov/llama.cpp
>
> Change-Id: Iaf6f22252da13e2d6f503992878b35b0da7de0aa
> ---
> gnu/packages/machine-learning.scm | 5 ++++-
> 1 file changed, 4 insertions(+), 1 deletion(-)
Looks good to me; I tweaked the commit message a bit and pushed this to
master as d8a63bbcee616f224c10462dbfb117ec009c50d8.
Chris
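
The diff body is not reproduced in this message, only the diffstat for
gnu/packages/machine-learning.scm. Purely as an illustration of what such a
change can look like (not the committed diff: the variant name
llama-cpp-with-openblas is hypothetical, and the CMake flag names are
assumptions based on upstream llama.cpp documentation at the time), a
package variant adding openblas and the BLAS configure flags might be
sketched as:

;; Hedged sketch: derive a variant of the existing llama-cpp package
;; that adds openblas as an input and enables BLAS-accelerated prompt
;; processing via CMake flags.  Flag names and the variant name are
;; illustrative assumptions, not the contents of the pushed commit.
(use-modules (guix packages)
             (guix gexp)
             (guix utils)
             (gnu packages machine-learning)
             (gnu packages maths))

(define llama-cpp-with-openblas
  (package
    (inherit llama-cpp)
    (name "llama-cpp-with-openblas")
    (arguments
     (substitute-keyword-arguments (package-arguments llama-cpp)
       ((#:configure-flags flags #~'())
        ;; Prepend the BLAS-related flags to whatever flags the base
        ;; package already passes to CMake.
        #~(append (list "-DLLAMA_BLAS=ON"
                        "-DLLAMA_BLAS_VENDOR=OpenBLAS")
                  #$flags))))
    (inputs
     ;; Keep the existing inputs and add openblas.
     (modify-inputs (package-inputs llama-cpp)
       (append openblas)))))

The committed change instead modifies the llama-cpp definition in
gnu/packages/machine-learning.scm directly, but the ingredients are the
same kind: an openblas input plus configure flags telling the build to
link against it.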