I want to use wget (on Linux) to download every zip file from a site. Nothing else, just the zip files. Basically, I want it to spider through the entire site, find every zip file, and download it.
I'm certain this is possible, but I'm not quite sure how. Right now I'm using
wget -r -A zip http://comeandlive.com/downloads/
to download all the zip files, but I feel like that's not getting them all, only the ones linked directly from that page.
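For reference, here is a variant I suspect might spider deeper. The flags are from the wget man page, but the combination is my guess at what's missing, so I'm not sure it's right:

```shell
# Untested guess at a more thorough invocation:
#   -r       recurse into linked pages
#   -l inf   remove the depth limit (wget's default recursion depth is 5)
#   -np      never ascend above /downloads/ while recursing
#   -A zip   keep only .zip files (HTML pages are fetched so their links
#            can be followed, then deleted because they don't match)
wget -r -l inf -np -A zip http://comeandlive.com/downloads/
```

If the zip files live on a different host than the pages linking to them, I assume I'd also need -H (span hosts), but that seems dangerous without a -D domain list to rein it in.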