1. Small Project for a Lucky Person

Hey! I need a program that will retrieve a web page at set intervals. Anybody
got something like this or want to make it for me? 8) Thanks!!

-=ck
"Programming in a state of Euphoria."
http://www.cklester.com/euphoria/


2. Re: Small Project for a Lucky Person

cklester wrote:
> 
> Hey! I need a program that will retrieve a web page at set intervals. Anybody
> got something like this or want to make it for me? 8) Thanks!!
> 
> -=ck
> "Programming in a state of Euphoria."
> http://www.cklester.com/euphoria/

I wrote a program to do that already. Once I find it I'll email it to you or
post it here tomorrow.


The Euphoria Standard Library project :
    http://esl.sourceforge.net/
The Euphoria Standard Library mailing list :
    https://lists.sourceforge.net/lists/listinfo/esl-discussion


3. Re: Small Project for a Lucky Person

D. Newhall wrote:
> cklester wrote:
> > Hey! I need a program that will retrieve a web page at set intervals.
> > Anybody
> > got something like this or want to make it for me? 8) Thanks!!
> 
> I wrote a program to do that already. Once I find it I'll email it to you or
> post it here tomorrow.

I just knew somebody had one! Thanks!! :)

-=ck
"Programming in a state of Euphoria."
http://www.cklester.com/euphoria/


4. Re: Small Project for a Lucky Person

cklester wrote:
> 
> Hey! I need a program that will retrieve a web page at set intervals. Anybody
> got something like this or want to make it for me? 8) Thanks!!
> 
> -=ck
> "Programming in a state of Euphoria."
> http://www.cklester.com/euphoria/

I have a program that will get the webpage once. It shouldn't be hard to make it
repeat the procedure. The guts of it came from urlmon.ew in the archives.

http://www.rapideuphoria.com/cgi-bin/asearch.exu?dos=on&win=on&lnx=on&gen=on&keywords=urlmon

That should be all you need.

Don Cole
A Bug is an un-documented feature.
A Feature is a documented Bug.
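
For illustration, a minimal Euphoria sketch of the "repeat it at intervals" wrapper
Don describes; fetch_page() below is a hypothetical placeholder for the one-shot
download code built on urlmon.ew (its exact interface isn't shown in this thread),
and the delay uses only the built-in time():

-- Sketch only: fetch_page() stands in for the urlmon.ew-based single fetch.
constant PAGE_URL = "http://www.example.com/"  -- page to poll (example value)
constant INTERVAL = 60                         -- seconds between fetches (example value)

procedure fetch_page(sequence the_url)
    -- placeholder: put the urlmon.ew-based one-shot download here
    puts(1, "fetching " & the_url & "\n")
end procedure

atom deadline
while 1 do
    fetch_page(PAGE_URL)
    deadline = time() + INTERVAL
    while time() < deadline do
        -- crude wait on the built-in time(); sleep() could replace this
        -- if your Euphoria version provides it
    end while
end while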


5. Re: Small Project for a Lucky Person

cklester wrote:
> 
> Hey! I need a program that will retrieve a web page at set intervals. Anybody
> got something like this or want to make it for me? 8) Thanks!!
> 

#!/bin/sh

while [ true ]
do
 wget $1
 sleep $2
done

Could easily be translated to Euphoria and run on Windows if you download wget
for Windows.

Regards, Alexander Toresson


6. Re: Small Project for a Lucky Person

Alexander Toresson wrote:
> 
> cklester wrote:
> > 
> > Hey! I need a program that will retrieve a web page at set intervals.
> > Anybody
> > got something like this or want to make it for me? 8) Thanks!!
> > 
> 
> #!/bin/sh
> 
> while [ true ]
> do
>  wget $1
>  sleep $2
> done
> 
> Could easily be translated to Euphoria and run on Windows if you download wget
> for Windows.
> 

Sorry, that should have been wget -r $1, not wget $1. The latter doesn't replace
an existing file; it saves the new copy under a numbered name instead. I figure
you wanted the former. Btw, $1 is the first command-line argument (the URL to
download) and $2 is the second (the number of seconds to wait between downloads).

Regards, Alexander Toresson


7. Re: Small Project for a Lucky Person

Alexander Toresson wrote:
> 
> Alexander Toresson wrote:
> > 
> > cklester wrote:
> > > 
> > > Hey! I need a program that will retrieve a web page at set intervals.
> > > Anybody
> > > got something like this or want to make it for me? 8) Thanks!!
> > > 
> > 
> > #!/bin/sh
> > 
> > while [ true ]
> > do
> >  wget $1
> >  sleep $2
> > done
> > 
> > Could easily be translated to Euphoria and run on Windows if you download
> > wget for Windows.
> > 
> 
> Sorry, that should have been wget -r $1, not wget $1. The latter doesn't replace
> an existing file; it saves the new copy under a numbered name instead. I figure
> you wanted the former. Btw, $1 is the first command-line argument (the URL to
> download) and $2 is the second (the number of seconds to wait between downloads).
> 

Hehe, I was wrong again :) -r does have the desired effect; however, it also
triggers recursive downloading (which is not at all what we want). -N does the
right thing, with the added niceness that it skips the download entirely if the
local copy's timestamp is as new as or newer than the remote file's.

Regards, Alexander Toresson
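
Putting the corrections together, a hedged Euphoria sketch of the same loop,
shelling out to wget -N through the built-in system(); it assumes wget (for
example, wget for Windows) is installed and on the PATH, and it takes the URL
and the interval in seconds as command-line arguments:

-- Sketch only, not tested code.
include get.e                    -- for value(), to read the interval argument

sequence args, url, v
atom interval, deadline

args = command_line()            -- {interpreter, program file, arg1, arg2, ...}
if length(args) < 4 then
    puts(1, "usage: <interpreter> getpage.ex <url> <seconds>\n")
    abort(1)
end if
url = args[3]
v = value(args[4])               -- {status, value}
if v[1] != GET_SUCCESS then
    puts(1, "the interval must be a number of seconds\n")
    abort(1)
end if
interval = v[2]

while 1 do
    system("wget -N " & url, 2)  -- -N: skip the download unless the remote copy is newer
    deadline = time() + interval
    while time() < deadline do
        -- crude wait on the built-in time(); sleep() could replace this
        -- if your Euphoria version provides it
    end while
end while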

