From: John Fremlin
Subject: [bug#70175] [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
Date: Wed, 3 Apr 2024 23:46:25 -0400
OpenBLAS is recommended by upstream (https://github.com/ggerganov/llama.cpp)
for faster prompt processing.

Change-Id: Iaf6f22252da13e2d6f503992878b35b0da7de0aa
---
gnu/packages/machine-learning.scm | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)
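For context, and not part of the change itself: the two flags added below
through #:configure-flags correspond to the manual CMake invocation upstream
describes for an OpenBLAS-enabled build. A rough out-of-Guix sketch, assuming
OpenBLAS and pkg-config are already installed on the system:

    # Configure and build llama.cpp against OpenBLAS by hand (illustrative only).
    mkdir build && cd build
    cmake .. -DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS
    cmake --build . --config Release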
diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index 225bff0ca2..ea3674ce3e 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -542,6 +542,8 @@ (define-public llama-cpp
(build-system cmake-build-system)
(arguments
(list
+ #:configure-flags
+ '(list "-DLLAMA_BLAS=ON" "-DLLAMA_BLAS_VENDOR=OpenBLAS")
#:modules '((ice-9 textual-ports)
(guix build utils)
((guix build python-build-system) #:prefix python:)
@@ -576,8 +578,9 @@ (define-public llama-cpp
(lambda _
(copy-file "bin/main" (string-append #$output
"/bin/llama")))))))
(inputs (list python))
+ (native-inputs (list pkg-config))
(propagated-inputs
- (list python-numpy python-pytorch python-sentencepiece))
+ (list python-numpy python-pytorch python-sentencepiece openblas))
(home-page "https://github.com/ggerganov/llama.cpp")
(synopsis "Port of Facebook's LLaMA model in C/C++")
(description "This package provides a port to Facebook's LLaMA collection
base-commit: 1441a205b1ebb610ecfae945b5770734cbe8478c
--
2.41.0
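Not part of the patch, but a quick way to check that the flags took effect:
run the renamed binary and look at the system_info banner it prints at
startup, which reports whether BLAS support was compiled in. A sketch,
assuming some local GGUF model file:

    # guix build prints the store output path of the freshly built package.
    out=$(guix build llama-cpp)
    # The startup banner should now report something like "BLAS = 1".
    "$out"/bin/llama -m ./some-model.gguf -p "hello" -n 8 2>&1 | grep BLAS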