1. Batch downloads

I'm trying to find a quick 'n' dirty solution to this problem:
There are about 1000 log files sitting on a remote http server in the 
format: "acct%06d.log" (numbered from 1 to 1000ish), that I want a copy of 
sitting in a local directory.
They change fairly regularly, so I need to get them on a regular basis - 
preferably each morning.
The machine that will do this doesn't have any telnet/ssh access (esp not 
anonymous), so an http get is the preferred method.

I'm sure that a problem like this has cropped up before, so if someone has a 
similar program sitting on their boxen...

otherwise how would I go about coding it?
=====================================================
.______<-------------------\__
/ _____<--------------------__|===
||_    <-------------------/
\__| Mr Trick


2. Re: Batch downloads

mhc> I'm trying to find a quick 'n' dirty solution to this problem:
mhc> There are about 1000 log files sitting on a remote http server in the 
mhc> format: "acct%06d.log" (numbered from 1 to 1000ish), that I want a copy of 
mhc> sitting in a local directory.
mhc> They change fairly regularly, so I need to get them on a regular basis - 
mhc> preferably each morning.
mhc> The machine that will do this doesn't have any telnet/ssh access (esp not 
mhc> anonymous), so an http get is the preferred method.

Sounds pretty simple... something like:

--pseudo code
for t=1 to 1000 do
    name = sprintf("acct%06d.log", {t})  -- the loop index fills in %06d
    text = get_url( URL & "/" & name )   -- URLs use forward slashes
    save_as_local_file( text , name )
end for

You'll want to investigate PatRat's Asynchronous HTTP and Mark Smith's
WebShepherd, or just do a search of EUPHORIA's submitted files for
"http web."
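In case it helps to see the loop end-to-end, here is the same idea sketched in Python rather than Euphoria, only because Python ships with an HTTP client; the base URL and local directory names are placeholders, not the real server:

```python
import os
import urllib.request

def log_name(i):
    # Matches the "acct%06d.log" naming scheme from the original post.
    return "acct%06d.log" % i

def fetch_logs(base_url, count, dest_dir):
    # Download acct000001.log .. acctNNNNNN.log into dest_dir.
    os.makedirs(dest_dir, exist_ok=True)
    for i in range(1, count + 1):
        name = log_name(i)
        url = base_url.rstrip("/") + "/" + name  # URLs use forward slashes
        urllib.request.urlretrieve(url, os.path.join(dest_dir, name))

# Example (placeholder server name):
# fetch_logs("http://www.example.com/logs", 1000, "local_logs")
```

Run daily from whatever scheduler the machine has (cron, Task Scheduler) to get the "each morning" refresh.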


3. Re: Batch downloads

Hello Mr. Trick,

Take a look at:

  http://www.kiraly.com/software/utilities/graburl/

If you install the graburl.exe file in a directory in your PATH (e.g.
C:\WINDOWS or C:\WINNT), then a Euphoria program can call the graburl
executable via a system call.  For example:

  system("graburl http://www.google.com/ > C:\\google.htm", 2)

For quick and dirty, try this (untested):

  sequence cmd
  for i = 1 to 1000 do
    cmd = sprintf("graburl http://www.server.com/acct%06d.log > C:\\acct%06d.log", {i, i})
    system(cmd, 2)
  end for

Note that you need to put the correct URL reference in for your purposes,
but I hope you get the idea.
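To show what the loop is assembling, here is a sketch of the command string, written in Python purely to illustrate the formatting; www.server.com is a placeholder exactly as in the snippet above:

```python
import subprocess

def build_cmd(i):
    # Both %06d slots get the same index, mirroring the sprintf() call
    # above; the second %06d names the local output file.
    return "graburl http://www.server.com/acct%06d.log > C:\\acct%06d.log" % (i, i)

# The > redirection needs a shell to interpret it, hence shell=True:
# subprocess.run(build_cmd(1), shell=True)
```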

Hope this helps.

Regards,

Andy Cranston.

At 00:56 19/09/03 +1000, you wrote:
>
>
>I'm trying to find a quick 'n' dirty solution to this problem:
>There are about 1000 log files sitting on a remote http server in the 
>format: "acct%06d.log" (numbered from 1 to 1000ish), that I want a copy of 
>sitting in a local directory.
>They change fairly regularly, so I need to get them on a regular basis - 
>preferably each morning.
>The machine that will do this doesn't have any telnet/ssh access (esp not 
>anonymous), so an http get is the preferred method.
>
>I'm sure that a problem like this has cropped up before, so if someone has a 
>similar program sitting on their boxen...
>
>otherwise how would I go about coding it?
>=====================================================
>.______<-------------------\__
>/ _____<--------------------__|===
>||_    <-------------------/
>\__| Mr Trick
>

