
Re: [Bug-wget] Using wget to measure web response times?


From: ohaya
Subject: Re: [Bug-wget] Using wget to measure web response times?
Date: Tue, 22 Jun 2010 16:31:33 -0400

---- Micah Cowan <address@hidden> wrote: 
> On 06/22/2010 08:01 AM, Keisial wrote:
> > Giuseppe Scrivano writes:
> >> <address@hidden> writes
> >>> I have been using wget, together with 'time', with the following command 
> >>> line parameters:
> >>>
> >>> time wget --page-requisites --secure-protocol=SSLv3 --load-cookies 
> >>> cookies.txt --keep-session-cookies 
> >>> https://portal.foo.com/test/appmanager/portal/desktop
> >>>
> >>> However, when I do this, on the same environment where we see > 30 
> >>> seconds response to a browser, I get 'real' response times from the 
> >>> 'time' command of 3-5 seconds.  That would SEEM to indicate that the 
> >>> browser is taking 28+ seconds to render the portal page.
> >>>
> >>> So I was wondering:  Is this a valid test and use of wget?  And, am I 
> >>> interpreting the test results correctly?
> >>>     
> >> Using wget do you get _exactly_ the same information that you get using
> >> the browser?
> >>
> >> Giuseppe
> >>   
> > 
> > Maybe the slow time is in getting some css/javascript/image file
> > included from the web page?
> 
> It's always possible that the server responds differently based on
> something like the User-Agent or even Accept-Encodings header; for
> instance if it sees Accept-Encodings: gzip, it might gzip the content
> before sending or something (though if that adds 27 seconds, it might
> not have been a good idea...); or it might dynamically generate some
> (CSS?) file based on the User-Agent string.
> 
> JavaScript does seem a real possibility, since wget will never interpret
> JavaScript code.
> 
> I'd probably try running tcpdump (Unix systems) to capture web packets,
> to see if it's really the traffic that's taking that long, or if there's
> a significant pause in there somewhere. Might be good to save a log from
> that, and try to scan for GET URLs to see if the browser asks for pages
> that wget does not.
> 
> -- 
> HTH,
> Micah J. Cowan
> http://micah.cowan.name/
> 


Hi All (Micah, Giuseppe, and Keisial),

I did some further analysis/review. 

As further background, the URL I was using was for the initial page of a 
WebLogic portal app.

Also, from reviewing the files that wget retrieved in my tests, there is 
quite a bit of JavaScript, and in particular the page uses Dojo. So, given 
what was said earlier, i.e., that wget won't 'follow' the JavaScript, it 
sounds like wget may not be the best tool for this. I did do some testing 
yesterday/last night where I disabled JavaScript in the browser (IE), and 
when I did that, I was able to get to that initial page in about 5 seconds.

Also as mentioned earlier, the one environment where we're seeing the longer 
response times is our production network, and it's locked down; in 
particular, we don't have tcpdump or similar tools available.
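For what it's worth, when tcpdump isn't an option, a rough per-phase breakdown of a single request (connect, first byte, full body) can be had from the Python standard library alone. A sketch, assuming Python is available on the locked-down host; `time_request` is a hypothetical helper and the host/path would be the portal's:

```python
# Time the phases of one GET request: TCP/TLS connect, time to first
# byte (status line + headers), and total time to download the body.
import http.client
import time

def time_request(host, path="/", port=None, use_tls=False):
    """Return (status, connect_s, first_byte_s, total_s) for one GET."""
    cls = http.client.HTTPSConnection if use_tls else http.client.HTTPConnection
    conn = cls(host, port, timeout=60)
    t0 = time.monotonic()
    conn.connect()                # TCP (and TLS, if any) handshake
    connect_s = time.monotonic() - t0
    conn.request("GET", path)
    resp = conn.getresponse()     # blocks until status line + headers arrive
    first_byte_s = time.monotonic() - t0
    resp.read()                   # drain the whole body
    total_s = time.monotonic() - t0
    conn.close()
    return resp.status, connect_s, first_byte_s, total_s
```

Comparing the first-byte time against the total would at least show whether the delay is server-side page generation or the transfer itself, which is part of what the tcpdump capture would have revealed.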

So, at this point, I'm not sure what the next step is. We're also not 
allowed to have things like Firebug, which makes this whole thing more 
difficult :(...

I appreciate the comments!

Thanks again,
Jim




