* doc/guix.texi (Invoking guix publish): Document zstd compression.
---
doc/guix.texi | 18 ++++++++++++------
guix/scripts/publish.scm | 31 ++++++++++++++++++-------------
tests/publish.scm | 16 ++++++++++++++++
3 files changed, 46 insertions(+), 19 deletions(-)
diff --git a/doc/guix.texi b/doc/guix.texi
index b12cb11bdf..ed38f2e37b 100644
--- a/doc/guix.texi
+++ b/doc/guix.texi
@@ -12329,17 +12329,23 @@ server socket is open and the signing key has been
read.
@item --compression[=@var{method}[:@var{level}]]
@itemx -C [@var{method}[:@var{level}]]
Compress data using the given @var{method} and @var{level}. @var{method} is
-one of @code{lzip} and @code{gzip}; when @var{method} is omitted, @code{gzip}
-is used.
+one of @code{lzip}, @code{zstd}, and @code{gzip}; when @var{method} is
+omitted, @code{gzip} is used.
When @var{level} is zero, disable compression. The range 1 to 9 corresponds
to different compression levels: 1 is the fastest, and 9 is the best
(CPU-intensive). The default is 3.
-Usually, @code{lzip} compresses noticeably better than @code{gzip} for a small
-increase in CPU usage; see
-@uref{https://nongnu.org/lzip/lzip_benchmark.html,benchmarks on the lzip Web
-page}.
+Usually, @code{lzip} compresses noticeably better than @code{gzip} for a
+small increase in CPU usage; see
+@uref{https://nongnu.org/lzip/lzip_benchmark.html,benchmarks on the lzip
+Web page}. However, @code{lzip} achieves low decompression throughput
+(on the order of 50@tie{}MiB/s on modern hardware), which can become a
+bottleneck when downloading over a fast network connection.
+
+The compression ratio of @code{zstd} is between that of @code{lzip} and
+that of @code{gzip}; its main advantage is a
+@uref{https://facebook.github.io/zstd/,high decompression speed}.
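+
+For instance, a server that favors fast decompression on the client
+side might be started along these lines, where the port and
+compression level shown are merely illustrative:
+
+@example
+guix publish -p 8080 -C zstd:3
+@end example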
Unless @option{--cache} is used, compression occurs on the fly and
the compressed streams are not