bug-wget

Re: Re bugs on setup


From: Darshit Shah
Subject: Re: Re bugs on setup
Date: Tue, 08 Mar 2022 15:43:40 +0100
User-agent: Cyrus-JMAP/3.5.0-alpha0-4778-g14fba9972e-fm-20220217.001-g14fba997

What is not working? You've only sent us back the help output of Wget.

On Tue, Mar 8, 2022, at 04:32, Leon Munro wrote:
> Email bug reports, questions, discussions to <bug-wget@gnu.org>
>
> I set up Termux but once I got here nothing works?
>
>
> Welcome to Termux!
>
> Communities: https://termux.org/community
> Gitter chat: https://gitter.im/termux/termux
> IRC channel: #termux on libera.chat
>
> Working with packages:
>
>  * Search packages:   pkg search <query>
>  * Install a package: pkg install <package>
>  * Upgrade packages:  pkg upgrade
>
> Subscribing to additional repositories:
>
>  * Root:     pkg install root-repo
>  * X11:      pkg install x11-repo
>
> Report issues at https://termux.org/issues
>
>
> You are likely using a very old version of Termux,
> probably installed from the Google Play Store.
> There are plans in the near future to remove the
> Termux apps from the Play Store so that new users
> cannot install them and to **disable** them for
> existing users with app updates to prevent the use
> of outdated app versions. Instead, you are
> encouraged to move to F-Droid or Github sources
> (see [1]). You can backup all your current Termux
> data before uninstallation and then restore it later
> by following instructions in the wiki [2]. Check
> the changelog [3] for all the new features and fixes
> that you are currently missing. Check [4] for why
> this is being done.
>
> [1] https://github.com/termux/termux-app#installation
> [2] https://wiki.termux.com/wiki/Backing_up_Termux
> [3] https://github.com/termux/termux-app/releases
> [4] https://github.com/termux/termux-app#google-play-store-deprecated
>
> ~ $ curl -LO https://packages.termux.org/apt/termux-main/pool/main/t/termux-keyring/termux-keyring_2.4_all.deb
> apt install ./termux-keyring_2.4_all.deb
> apt update && apt dist-upgrade -yq
>   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
>                                  Dload  Upload   Total   Spent    Left  Speed
> 100   153  100   153    0     0     40      0  0:00:03  0:00:03 --:--:--    40
> Reading package lists... Error!
> E: Invalid archive signature
> E: Internal error, could not locate member
> control.tar{.lz4,.gz,.xz,.bz2,.lzma,}
> E: Could not read meta data from
> /data/data/com.termux/files/home/termux-keyring_2.4_all.deb
> E: The package lists or status file could not be parsed or opened.
> Hit:1 https://termux.mentality.rip/game-packages-24 games InRelease
> Hit:2 https://grimler.se/termux-packages-24 stable InRelease
> Hit:3 https://termux.mentality.rip/science-packages-24 science InRelease
> Reading package lists... Done
> Building dependency tree... Done
> Reading state information... Done
> All packages are up to date.
> Reading package lists...
> Building dependency tree...
> Reading state information...
> Calculating upgrade...
> 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
> ~ $ pkg install wget
> Checking availability of current mirror: ok
> Reading package lists... Done
> Building dependency tree... Done
> Reading state information... Done
> wget is already the newest version (1.21.3-1).
> 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
> ~ $ wget -0 install-nethunter-termux https://offs.ec/2MceZWr
> wget: invalid option -- '0'
> Usage: wget [OPTION]... [URL]...
>
> Try `wget --help' for more options.
> ~ $ wget -o install-nethunter-termux https://offs.ec/2MceZWr
> ~ $ chmod +x install-nethunter-termux
> ~ $ wget --help
> GNU Wget 1.21.3, a non-interactive network retriever.
> Usage: wget [OPTION]... [URL]...
>
> Mandatory arguments to long options are mandatory for short options too.
>
> Startup:
>   -V,  --version                   display the version of Wget and exit
>   -h,  --help                      print this help
>   -b,  --background                go to background after startup
>   -e,  --execute=COMMAND           execute a `.wgetrc'-style command
>
> Logging and input file:
>   -o,  --output-file=FILE          log messages to FILE
>   -a,  --append-output=FILE        append messages to FILE
>   -d,  --debug                     print lots of debugging information
>   -q,  --quiet                     quiet (no output)
>   -v,  --verbose                   be verbose (this is the default)
>   -nv, --no-verbose                turn off verboseness, without being quiet
>        --report-speed=TYPE         output bandwidth as TYPE.  TYPE can be bits
>   -i,  --input-file=FILE           download URLs found in local or external FILE
>   -F,  --force-html                treat input file as HTML
>   -B,  --base=URL                  resolves HTML input-file links (-i -F)
>                                      relative to URL
>        --config=FILE               specify config file to use
>        --no-config                 do not read any config file
>        --rejected-log=FILE         log reasons for URL rejection to FILE
>
> Download:
>   -t,  --tries=NUMBER              set number of retries to NUMBER (0 unlimits)
>        --retry-connrefused         retry even if connection is refused
>        --retry-on-http-error=ERRORS    comma-separated list of HTTP errors to retry
>   -O,  --output-document=FILE      write documents to FILE
>   -nc, --no-clobber                skip downloads that would download to
>                                      existing files (overwriting them)
>        --no-netrc                  don't try to obtain credentials from .netrc
>   -c,  --continue                  resume getting a partially-downloaded file
>        --start-pos=OFFSET          start downloading from zero-based position OFFSET
>        --progress=TYPE             select progress gauge type
>        --show-progress             display the progress bar in any verbosity mode
>   -N,  --timestamping              don't re-retrieve files unless newer than
>                                      local
>        --no-if-modified-since      don't use conditional if-modified-since get
>                                      requests in timestamping mode
>        --no-use-server-timestamps  don't set the local file's timestamp by
>                                      the one on the server
>   -S,  --server-response           print server response
>        --spider                    don't download anything
>   -T,  --timeout=SECONDS           set all timeout values to SECONDS
>        --dns-timeout=SECS          set the DNS lookup timeout to SECS
>        --connect-timeout=SECS      set the connect timeout to SECS
>        --read-timeout=SECS         set the read timeout to SECS
>   -w,  --wait=SECONDS              wait SECONDS between retrievals
>                                      (applies if more then 1 URL is to be
>                                      retrieved)
>        --waitretry=SECONDS         wait 1..SECONDS between retries of a retrieval
>                                      (applies if more then 1 URL is to be
>                                      retrieved)
>        --random-wait               wait from 0.5*WAIT...1.5*WAIT secs between retrievals
>                                      (applies if more then 1 URL is to be
>                                      retrieved)
>        --no-proxy                  explicitly turn off proxy
>   -Q,  --quota=NUMBER              set retrieval quota to NUMBER
>        --bind-address=ADDRESS      bind to ADDRESS (hostname or IP) on local host
>        --limit-rate=RATE           limit download rate to RATE
>        --no-dns-cache              disable caching DNS lookups
>        --restrict-file-names=OS    restrict chars in file names to ones OS allows
>        --ignore-case               ignore case when matching files/directories
>   -4,  --inet4-only                connect only to IPv4 addresses
>   -6,  --inet6-only                connect only to IPv6 addresses
>        --prefer-family=FAMILY      connect first to addresses of specified family,
>                                      one of IPv6, IPv4, or none
>        --user=USER                 set both ftp and http user to USER
>        --password=PASS             set both ftp and http password to PASS
>        --ask-password              prompt for passwords
>        --use-askpass=COMMAND       specify credential handler for requesting
>                                      username and password.  If no COMMAND is
>                                      specified the WGET_ASKPASS or the SSH_ASKPASS
>                                      environment variable is used.
>        --no-iri                    turn off IRI support
>        --local-encoding=ENC        use ENC as the local encoding for IRIs
>        --remote-encoding=ENC       use ENC as the default remote encoding
>        --unlink                    remove file before clobber
>        --xattr                     turn on storage of metadata in extended file attributes
>
> Directories:
>   -nd, --no-directories            don't create directories
>   -x,  --force-directories         force creation of directories
>   -nH, --no-host-directories       don't create host directories
>        --protocol-directories      use protocol name in directories
>   -P,  --directory-prefix=PREFIX   save files to PREFIX/..
>        --cut-dirs=NUMBER           ignore NUMBER remote directory components
>
> HTTP options:
>        --http-user=USER            set http user to USER
>        --http-password=PASS        set http password to PASS
>        --no-cache                  disallow server-cached data
>        --default-page=NAME         change the default page name (normally
>                                      this is 'index.html'.)
>   -E,  --adjust-extension          save HTML/CSS documents with proper extensions
>        --ignore-length             ignore 'Content-Length' header field
>        --header=STRING             insert STRING among the headers
>        --compression=TYPE          choose compression, one of auto, gzip and none. (default: none)
>        --max-redirect              maximum redirections allowed per page
>        --proxy-user=USER           set USER as proxy username
>        --proxy-password=PASS       set PASS as proxy password
>        --referer=URL               include 'Referer: URL' header in HTTP request
>        --save-headers              save the HTTP headers to file
>   -U,  --user-agent=AGENT          identify as AGENT instead of Wget/VERSION
>        --no-http-keep-alive        disable HTTP keep-alive (persistent connections)
>        --no-cookies                don't use cookies
>        --load-cookies=FILE         load cookies from FILE before session
>        --save-cookies=FILE         save cookies to FILE after session
>        --keep-session-cookies      load and save session (non-permanent) cookies
>        --post-data=STRING          use the POST method; send STRING as the data
>        --post-file=FILE            use the POST method; send contents of FILE
>        --method=HTTPMethod         use method "HTTPMethod" in the request
>        --body-data=STRING          send STRING as data. --method MUST be set
>        --body-file=FILE            send contents of FILE. --method MUST be set
>        --content-disposition       honor the Content-Disposition header when
>                                      choosing local file names (EXPERIMENTAL)
>        --content-on-error          output the received content on server errors
>        --auth-no-challenge         send Basic HTTP authentication information
>                                      without first waiting for the server's
>                                      challenge
>
> HTTPS (SSL/TLS) options:
>        --secure-protocol=PR        choose secure protocol, one of auto, SSLv2,
>                                      SSLv3, TLSv1, TLSv1_1, TLSv1_2, TLSv1_3 and PFS
>        --https-only                only follow secure HTTPS links
>        --no-check-certificate      don't validate the server's certificate
>        --certificate=FILE          client certificate file
>        --certificate-type=TYPE     client certificate type, PEM or DER
>        --private-key=FILE          private key file
>        --private-key-type=TYPE     private key type, PEM or DER
>        --ca-certificate=FILE       file with the bundle of CAs
>        --ca-directory=DIR          directory where hash list of CAs is stored
>        --crl-file=FILE             file with bundle of CRLs
>        --pinnedpubkey=FILE/HASHES  Public key (PEM/DER) file, or any number
>                                    of base64 encoded sha256 hashes preceded by
>                                    'sha256//' and separated by ';', to verify
>                                    peer against
>        --random-file=FILE          file with random data for seeding the SSL PRNG
>
>        --ciphers=STR           Set the priority string (GnuTLS) or cipher list string (OpenSSL) directly.
>                                    Use with care. This option overrides --secure-protocol.
>                                    The format and syntax of this string depend on the specific SSL/TLS engine.
>
> HSTS options:
>        --no-hsts                   disable HSTS
>        --hsts-file                 path of HSTS database (will override default)
>
> FTP options:
>        --ftp-user=USER             set ftp user to USER
>        --ftp-password=PASS         set ftp password to PASS
>        --no-remove-listing         don't remove '.listing' files
>        --no-glob                   turn off FTP file name globbing
>        --no-passive-ftp            disable the "passive" transfer mode
>        --preserve-permissions      preserve remote file permissions
>        --retr-symlinks             when recursing, get linked-to files (not dir)
>
> FTPS options:
>        --ftps-implicit                 use implicit FTPS (default port is 990)
>        --ftps-resume-ssl               resume the SSL/TLS session started in the control connection when
>                                          opening a data connection
>        --ftps-clear-data-connection    cipher the control channel only; all the data will be in plaintext
>        --ftps-fallback-to-ftp          fall back to FTP if FTPS is not supported in the target server
>
> WARC options:
>        --warc-file=FILENAME        save request/response data to a .warc.gz file
>        --warc-header=STRING        insert STRING into the warcinfo record
>        --warc-max-size=NUMBER      set maximum size of WARC files to NUMBER
>        --warc-cdx                  write CDX index files
>        --warc-dedup=FILENAME       do not store records listed in this CDX file
>        --no-warc-compression       do not compress WARC files with GZIP
>        --no-warc-digests           do not calculate SHA1 digests
>        --no-warc-keep-log          do not store the log file in a WARC record
>        --warc-tempdir=DIRECTORY    location for temporary files created by the
>                                      WARC writer
>
> Recursive download:
>   -r,  --recursive                 specify recursive download
>   -l,  --level=NUMBER              maximum recursion depth (inf or 0 for infinite)
>        --delete-after              delete files locally after downloading them
>   -k,  --convert-links             make links in downloaded HTML or CSS point to
>                                      local files
>        --convert-file-only         convert the file part of the URLs only (usually known as the basename)
>        --backups=N                 before writing file X, rotate up to N backup files
>   -K,  --backup-converted          before converting file X, back up as X.orig
>   -m,  --mirror                    shortcut for -N -r -l inf --no-remove-listing
>   -p,  --page-requisites           get all images, etc. needed to display HTML page
>        --strict-comments           turn on strict (SGML) handling of HTML comments
>
> Recursive accept/reject:
>   -A,  --accept=LIST               comma-separated list of accepted extensions
>   -R,  --reject=LIST               comma-separated list of rejected extensions
>        --accept-regex=REGEX        regex matching accepted URLs
>        --reject-regex=REGEX        regex matching rejected URLs
>        --regex-type=TYPE           regex type (posix|pcre)
>   -D,  --domains=LIST              comma-separated list of accepted domains
>        --exclude-domains=LIST      comma-separated list of rejected domains
>        --follow-ftp                follow FTP links from HTML documents
>        --follow-tags=LIST          comma-separated list of followed HTML tags
>        --ignore-tags=LIST          comma-separated list of ignored HTML tags
>   -H,  --span-hosts                go to foreign hosts when recursive
>   -L,  --relative                  follow relative links only
>   -I,  --include-directories=LIST  list of allowed directories
>        --trust-server-names        use the name specified by the redirection
>                                      URL's last component
>   -X,  --exclude-directories=LIST  list of excluded directories
>   -np, --no-parent                 don't ascend to the parent directory
>
> Email bug reports, questions, discussions to <bug-wget@gnu.org>
> and/or open issues at https://savannah.gnu.org/bugs/?func=additem&group=wget.
> ~ $
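On the wget invocations in the transcript: `-0` (the digit zero) is rejected because the option is the capital letter `-O` (`--output-document=FILE`), and the lowercase `-o` used afterwards is `--output-file=FILE`, which only redirects wget's log. So `install-nethunter-termux` most likely ended up containing log text rather than the downloaded script. Assuming the goal is to fetch and run that installer, a minimal corrected sequence would be something like:

    wget -O install-nethunter-termux https://offs.ec/2MceZWr   # capital -O: write the downloaded document to this file
    chmod +x install-nethunter-termux
    ./install-nethunter-termux                                  # running the installer is the presumed next step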

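A side note on the keyring step earlier in the transcript: curl received only 153 bytes and apt then reported "Invalid archive signature", so the file that landed on disk was almost certainly not a real .deb (more likely an error or redirect page). A minimal sketch of how one might check the download before handing it to apt, assuming the same URL and a stock Termux shell (the `file` utility may first need `pkg install file`):

    # add -f so curl fails on HTTP errors instead of saving the error page; -L follows redirects as before
    curl -fLO https://packages.termux.org/apt/termux-main/pool/main/t/termux-keyring/termux-keyring_2.4_all.deb

    # a real Debian package is an 'ar' archive and is much larger than 153 bytes
    ls -l termux-keyring_2.4_all.deb
    file termux-keyring_2.4_all.deb    # should report a Debian binary package

    # only install it once the file looks sane
    apt install ./termux-keyring_2.4_all.deb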

