Re: Batch downloads
- Posted by "C. K. Lester" <cklester at yahoo.com> Sep 18, 2003
mhc> I'm trying to find a quick 'n' dirty solution to this problem:
mhc> There are about 1000 log files sitting on a remote http server in the
mhc> format: "acct%06d.log" (numbered from 1 to 1000ish), that I want a copy of
mhc> sitting in a local directory.
mhc> They change fairly regularly, so I need to get them on a regular basis -
mhc> preferably each morning.
mhc> The machine that will do this doesn't have any telnet/ssh access (esp not
mhc> anonymous), so an http get is the preferred method.

Sounds pretty simple... something like:

-- pseudo code
for t = 1 to 1000 do
    name = sprintf("acct%06d.log", t)
    text = get_url( URL & "/" & name )
    save_as_local_file( text, name )
end for

You'll want to investigate PatRat's Asynchronous HTTP and Mark Smith's
WebShepherd, or just do a search of EUPHORIA's submitted files for
"http web".
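For anyone outside Euphoria, the same loop is easy to sketch in Python with only the standard library. This is just an illustration of the idea, not the poster's code; BASE_URL is a made-up placeholder, and there's no error handling or scheduling (you'd run it from cron each morning).

```python
import urllib.request

BASE_URL = "http://example.com/logs"  # hypothetical server root -- substitute yours

def log_name(t):
    # Same filename format the original post describes: acct%06d.log
    return "acct%06d.log" % t

def fetch_logs(count=1000):
    # Plain HTTP GET for each file, saved under the same name locally.
    for t in range(1, count + 1):
        name = log_name(t)
        with urllib.request.urlopen(BASE_URL + "/" + name) as resp:
            data = resp.read()
        with open(name, "wb") as f:  # writes into the current directory
            f.write(data)
```

Since the files "change fairly regularly," this naive version re-downloads everything every run; a smarter variant could send If-Modified-Since headers, but the brute-force loop matches the quick 'n' dirty spirit of the question.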