gnunet-svn

[gnurl] 201/411: MANUAL: update examples to resolve without redirects


From: gnunet
Subject: [gnurl] 201/411: MANUAL: update examples to resolve without redirects
Date: Wed, 13 Jan 2021 01:20:16 +0100

This is an automated email from the git hooks/post-receive script.

nikita pushed a commit to branch master
in repository gnurl.

commit 021f2c25fd7fbe2887ed156354f8919c0d06a412
Author: Daniel Gustafsson <daniel@yesql.se>
AuthorDate: Wed Sep 30 21:05:14 2020 +0200

    MANUAL: update examples to resolve without redirects
    
    www.netscape.com is redirecting to a cookie consent form on Aol, and
    cool.haxx.se isn't responding to FTP anymore. Replace them with examples
    that resolve, in case users try out the commands while reading the
    manual.
    
    Closes #6024
    Reviewed-by: Daniel Stenberg <daniel@haxx.se>
    Reviewed-by: Emil Engler <me@emilengler.com>
---
 docs/MANUAL.md | 16 ++++++++--------
 1 file changed, 8 insertions(+), 8 deletions(-)

diff --git a/docs/MANUAL.md b/docs/MANUAL.md
index 7063dbc8b..e4c7d7971 100644
--- a/docs/MANUAL.md
+++ b/docs/MANUAL.md
@@ -2,9 +2,9 @@
 
 ## Simple Usage
 
-Get the main page from Netscape's web-server:
+Get the main page from a web-server:
 
-    curl http://www.netscape.com/
+    curl https://www.example.com/
 
 Get the README file the user's home directory at funet's ftp-server:
 
@@ -16,7 +16,7 @@ Get a web page from a server using port 8000:
 
 Get a directory listing of an FTP site:
 
-    curl ftp://cool.haxx.se/
+    curl ftp://ftp.funet.fi
 
 Get the definition of curl from a dictionary:
 
@@ -24,7 +24,7 @@ Get the definition of curl from a dictionary:
 
 Fetch two documents at once:
 
-    curl ftp://cool.haxx.se/ http://www.weirdserver.com:8000/
+    curl ftp://ftp.funet.fi/ http://www.weirdserver.com:8000/
 
 Get a file off an FTPS server:
 
@@ -61,13 +61,13 @@ Get a file from an SMB server:
 
 Get a web page and store in a local file with a specific name:
 
-    curl -o thatpage.html http://www.netscape.com/
+    curl -o thatpage.html http://www.example.com/
 
 Get a web page and store in a local file, make the local file get the name of
 the remote document (if no file name part is specified in the URL, this will
 fail):
 
-    curl -O http://www.netscape.com/index.html
+    curl -O http://www.example.com/index.html
 
 Fetch two files and store them with their remote names:
 
@@ -657,11 +657,11 @@ Download with `PORT` but use 192.168.0.10 as our IP address to use:
 
 Get a web page from a server using a specified port for the interface:
 
-    curl --interface eth0:1 http://www.netscape.com/
+    curl --interface eth0:1 http://www.example.com/
 
 or
 
-    curl --interface 192.168.1.10 http://www.netscape.com/
+    curl --interface 192.168.1.10 http://www.example.com/
 
 ## HTTPS
 

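The point of the change above is that manual examples should answer directly rather than redirect. A minimal shell sketch of that check (assumption: the helper function and the status codes passed to it are illustrative; in practice the code would come from `curl -s -o /dev/null -w '%{http_code}' --head "$url"`, using curl's documented `--head` and `-w` write-out options):

```shell
# Classify an HTTP status code the way the commit's rationale implies:
# a good manual example returns 2xx directly, not a 3xx redirect.
check_no_redirect() {
  # Hypothetical helper: takes the status code as an argument instead of
  # hitting the network, so the logic can be inspected offline.
  code=$1
  case "$code" in
    2??) echo ok ;;        # direct success, suitable for a manual example
    3??) echo redirect ;;  # like the old www.netscape.com behavior
    *)   echo error ;;     # e.g. host no longer responding, like cool.haxx.se
  esac
}

check_no_redirect 200   # prints "ok"
check_no_redirect 301   # prints "redirect"
```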