Tools to crawl a site and compile payload statistics for each URL?


My goal is to crawl a given site and log statistics on the total payload of each page: the original document, CSS, JS, images, etc. By payload I mean the number of bytes downloaded when the page loads. I would like to build a graph showing the "heaviest" pages on my site so that they can be dealt with proactively.

Does anyone know of any tools or techniques for doing this? My preference is something that integrates well with a PHP or Python web app.
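To illustrate the kind of measurement I mean, here is a minimal stdlib-only Python sketch: it parses a page's HTML for asset references (images, scripts, stylesheets) and sums the byte sizes of the document plus its assets. The `fetch` callable is an assumption, so any HTTP client can be plugged in (and stubbed for testing); it is not a full crawler.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class AssetExtractor(HTMLParser):
    """Collects URLs of assets a page references: images, scripts, stylesheets."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.assets.append(urljoin(self.base_url, attrs["src"]))
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.assets.append(urljoin(self.base_url, attrs["href"]))

def page_payload(url, fetch):
    """Total bytes for a page: the HTML document plus every asset it references.
    `fetch(url)` must return the raw response body as bytes."""
    html = fetch(url)
    parser = AssetExtractor(url)
    parser.feed(html.decode("utf-8", errors="replace"))
    total = len(html)
    for asset in set(parser.assets):  # count each distinct asset once
        total += len(fetch(asset))
    return total
```

In practice `fetch` could be something like `lambda u: requests.get(u).content`; a real tool would also follow internal links to cover the whole site and record per-URL totals for the graph.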

The tools I have seen so far only seem to cover part of this, so I may end up writing a bit of code myself.
