Re: [Bug-wget] [ALPHA] wget parallel branch release 1.14.127-ced4
From: Ray Satiro
Subject: Re: [Bug-wget] [ALPHA] wget parallel branch release 1.14.127-ced4
Date: Mon, 20 May 2013 19:44:42 -0700 (PDT)
----- Original Message -----
> From: Giuseppe Scrivano <address@hidden>
> To: Ray Satiro <address@hidden>
> Cc: "address@hidden" <address@hidden>
> Sent: Sunday, May 19, 2013 6:47 PM
> Subject: Re: [Bug-wget] [ALPHA] wget parallel branch release 1.14.127-ced4
>
> Giuseppe Scrivano <address@hidden> writes:
>
>> ops... thanks to have reported it. I am going to prepare a new tarball.
>
> and here we go:
>
> ftp://alpha.gnu.org/gnu/wget/wget-1.14.128-8212.tar.gz
> ftp://alpha.gnu.org/gnu/wget/wget-1.14.128-8212.tar.gz.sig
Thanks. I was able to build it with mingw after changing a few things:
http.c: In function 'create_authorization_line':
http.c:4189:29: error: request for member 'ntlm' in something not a structure
or union
http.c:4194:32: error: request for member 'ntlm' in something not a structure
or union
http.c:4198:5: error: duplicate case value
http.c:4188:5: error: previously used here
if (!ntlm_input (&pconn.ntlm, au))
The problem is that I was building with threads enabled, and pconn is a pointer
in that case, so it should be &pconn->ntlm.
url.c: In function 'url_file_name':
url.c:1621:56: error: '_PC_NAME_MAX' undeclared (first use in this function)
url.c:1621:56: note: each undeclared identifier is reported only once for each
function it appears in
max_length = get_max_length (fnres.base, fnres.tail, _PC_NAME_MAX) -
CHOMP_BUFFER;
The problem is that there's no _PC_NAME_MAX on Windows. This code has come up
before; background here:
http://lists.gnu.org/archive/html/bug-wget/2012-10/msg00005.html
Filenames are typically limited to 255 characters. So assuming fnres contains
the full path, e.g. C:\whatever\somewebsite.com, you could do this:
max_length = MAX_PATH - ( fnres.tail + CHOMP_BUFFER + 2 );
The 2 is for the path separator and the null terminator. Then sanity check:
if( max_length > 255 )
max_length = 255;
I've attached a patch that does basically that, and sets max_length = 0 if
there's no room for a filename. I don't understand why we shouldn't just stop
when max_length is 0.
> From: Anthony Bryan <address@hidden>
[...]
> on OS X 10.8, I get:
>
> $ wget --metalink LibreOffice_4.0.3_MacOS_x86.dmg.meta4 -d
> DEBUG output created by Wget 1.14.128-8212 on darwin12.3.0.
>
> URI encoding = 'US-ASCII'
> URI content encoding = 'US-ASCII'
> Segmentation fault: 11
I tried that metalink file and it crashed here as well.
> From: Bykov Aleksey <address@hidden>
[...]
> 3. Saves temporary files to "\\filename" (drive's root), and does not
> remove them.
[...]
> 5. There's some error in the checksum calculation. Sorry, I can't understand
> where, and I can't launch it in gdb. The resulting file's SHA256 is correct,
> but differs from the value calculated by wget.
I was able to download using .metalink files, but I also had the problem where
the hashes were reportedly incorrect, and wget would write to \\ like this:
C:\Users\Internet\Desktop\test>c:\wget\root\bin\wget --metalink
ubuntu-13.04-server-i386.metalink
--2013-05-20 21:13:25--
http://releases.ubuntumirror.dei.uc.pt/13.04/ubuntu-13.04-server-i386.iso
Resolving releases.ubuntumirror.dei.uc.pt (releases.ubuntumirror.dei.uc.pt)...
193.136.212.167
Connecting to releases.ubuntumirror.dei.uc.pt
(releases.ubuntumirror.dei.uc.pt)|193.136.212.167|:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 721420288 (688M) [application/octet-stream]
Saving to: '\\s7o0.'
100%[======================================>] 721,420,288 1.19MB/s in 12m 56s
2013-05-20 21:26:22 (908 KB/s) - '\\s7o0.' saved [721420288/721420288]
Verifying(ubuntu-13.04-server-i386.iso) failed: md5 hashes are different.
Verifying(ubuntu-13.04-server-i386.iso) failed.
The file ends up being written to the virtualstore:
C:\Users\Internet\AppData\Local\VirtualStore\s7o0
73d595b804149fca9547ed94db8ff44f *s7o0
and the hash is correct:
73d595b804149fca9547ed94db8ff44f *ubuntu-13.04-server-i386.iso
Are multiple pieces supposed to be downloaded by wget simultaneously? I watched
the TCP connections during the download and there was only one, receiving the
whole file.
For anyone on Windows who is going to help with the testing: you'll need to
build libmetalink, which depends on an XML parser like expat. Here's what it
looked like when I did it:
tar xvfz expat-2.1.0.tar.gz
cd expat-2.1.0
./configure --prefix=/c/wget/root --disable-shared --enable-static
make
make install
tar xvfj libmetalink-0.1.2.tar.bz2
cd libmetalink-0.1.2
./configure --prefix=/c/wget/root --disable-shared --enable-static
make
make install
Apply the attached patch to wget. Since I used the static libraries, I had to
set LIBMETALINK_LIBS:
LIBMETALINK_LIBS=`pkg-config --libs --static libmetalink` ./configure
--prefix=/c/wget/root --with-libidn=/c/wget/root
--with-libintl-prefix=/c/wget/root --with-libiconv-prefix=/c/wget/root
--enable-nls --enable-ipv6 --enable-iri --enable-ntlm --disable-rpath
--with-ssl=openssl --enable-metalink --enable-threads=windows > config.out 2>&1
I uploaded the build if you don't want to build it yourself:
http://sourceforge.net/projects/getgnuwin32/files/getgnuwin32/test%20builds/wget-1.14.128-8212__mingw__2013-05-19.zip/download
Attachment: wget-1.14.128-8212.patch (binary data)