Re: A controversial topic

jimcbrown said...
katsmeow said...

IIRC it wasn't solved, it was disallowed,

Ah right, task_yield() getting compiled out in std/net/http.e's execute_request()

katsmeow said...

which defeated the point of keeping http.e from blocking the entire program if the server hung.

That's... not quite how it worked. The sockets library has a receive() method that respects a timeout. If the server hung, the receive() method would time out, no data would be received, and the routine would give up and return.
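A minimal sketch of that timeout-based receive, in Python rather than OE (Python's socket names, not std/net's - the pattern is the same: a hung server just makes the receive time out and the routine returns whatever it got):

```python
import socket

def fetch_with_timeout(host, port, request, timeout=5.0):
    """Return whatever data arrives before the timeout, or b"" if
    nothing does. A hung server makes recv() raise socket.timeout,
    and the routine simply returns instead of blocking forever."""
    data = b""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(request)
        s.settimeout(timeout)
        try:
            while True:
                chunk = s.recv(4096)
                if not chunk:          # server closed the connection
                    break
                data += chunk
        except socket.timeout:         # server hung: give up, keep what we have
            pass
    return data
```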

Also, tasks are a form of cooperative multitasking, but the sockets library didn't play nicely with that - if receive() waited too long before timing out, there was no way for other tasks to continue in the meantime. That would have required preemptive multitasking - or threads.
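The distinction can be illustrated with a toy cooperative scheduler in Python (generators standing in for OE tasks; all names here are made up, not OE's API). The fix for cooperative multitasking is to poll the socket and yield when nothing is ready, instead of sitting inside a blocking receive() while every other task starves:

```python
import select, socket

def recv_task(sock, out):
    """Cooperative task: poll the socket instead of blocking on recv().
    Yields control whenever no data is ready, so other tasks can run."""
    sock.setblocking(False)
    while True:
        ready, _, _ = select.select([sock], [], [], 0)  # 0 = poll, never block
        if ready:
            chunk = sock.recv(4096)
            if not chunk:           # peer closed: task is done
                return
            out.append(chunk)
        yield                       # the cooperative "task_yield()"

def ticker_task(log, n):
    """Another task that must keep running while the download proceeds."""
    for i in range(n):
        log.append(i)
        yield

def run(tasks):
    """Round-robin scheduler, in the spirit of a cooperative task switcher."""
    while tasks:
        tasks = [t for t in tasks if next(t, StopIteration) is not StopIteration]
```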

Or, not reading the receive queue until there was data in it. I wrote an http fetcher and a daemon in mIRC, from scratch, and the server was threaded. So it hurt personally when such things were refused by OE.

jimcbrown said...

Where the task_yield() was useful was to allow other tasks to run in between grabbing globs of data, if the http library was busy downloading a huge chunk of data in small bites. Other tasks get to run if we're downloading a big multigig DVD or something over the web.

I guess the reason this hasn't been a big problem is that most http stuff I've seen with Eu has been either fully interpreted, or only on *nix if translated. On *nix, the translated shared library can be loaded, then fork() can be called. The http call then runs inside the child process, without blocking the main code that runs in the parent process.
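A minimal sketch of that fork() arrangement in Python (os.fork is the same POSIX call; fetch_in_child and do_fetch are made-up names, and the fetch callback stands in for the call into the translated library):

```python
import os

def fetch_in_child(do_fetch):
    """Run a potentially blocking fetch in a forked child so the parent
    keeps running. The parent gets the child's pid back immediately and
    can waitpid() whenever it likes instead of blocking on the network."""
    pid = os.fork()
    if pid == 0:                    # child process: do the work, then exit
        try:
            do_fetch()
            os._exit(0)
        except Exception:
            os._exit(1)
    return pid                      # parent process: returns at once
```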

I was on one of the slowest internet lines in the usa at the time; everything timed out. Lag times of 10 minutes were not uncommon on irc. Setting a 2 minute timeout on http was reasonable. Wget.exe's resume function was useful every day here in the olden daze.

One of my favorite weather sites updated radar images every 15 minutes, sometimes it took longer than 15 minutes to get a 120kbyte gif. I took to passing 20 second timeouts to wget (which i had wrapped in OE, called via system()), because it was faster to kill the whole process at my end and restart, than wait to see if i was still online with normal timeouts.
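That wrap-wget-with-a-hard-timeout trick maps directly onto running a subprocess with a timeout; a Python stand-in for the OE system() wrapper (run_with_timeout is a made-up helper, not anything from OE or wget):

```python
import subprocess

def run_with_timeout(cmd, timeout):
    """Run an external command, killing it if it exceeds the timeout.
    Returns its stdout on success, or None on timeout or failure -
    the kill-and-restart-early strategy described above."""
    try:
        result = subprocess.run(cmd, capture_output=True, timeout=timeout)
        return result.stdout if result.returncode == 0 else None
    except subprocess.TimeoutExpired:
        return None

# e.g. run_with_timeout(["wget", "-q", "-O", "-", url], 20)
```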

I took to running a PHP httpfetch script on my web host, to cut news sites' pages down to 20kbytes, because maybe i could get that without timing out.

