RE: De-allocating memory - time issue


Well, if you ain't got the RAM, you ain't got the RAM, and nothing is 
going to help much, but I recently wrote some code to emulate 
multi-dimensional sequences in direct memory.

I had a 3-dimensional sequence with dimensions something like 50 x 100 x 
3000, giving me room to store 15 million atoms. (Which I had enough RAM 
for, so disk-swapping was not the issue.) Just trying to fill it took 10 
minutes or so, even when I pre-allocated it, and trying to grab or 
update values was incredibly slow.
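
(By pre-allocated I mean building the whole nested sequence up front; a 
rough sketch, with a made-up variable name:)

    sequence s3d
    s3d = repeat(repeat(repeat(0, 3000), 100), 50)  -- 50 x 100 x 3000 zeros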

Seeing as it was of fixed size once created (although the elements 
needed to be updated all the time), I figured I could just poke & peek 
the values directly into memory.  But I needed the ability to reference 
an index in 3 dimensions (e.g. {25,2,1500}), so I wrote some functions 
that calculated the correct flat address and poked or peeked it for me.
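
In outline it looks something like this (a simplified sketch of the 
idea, not the exact code I'd upload; the function names and constants 
here are made up):

    include machine.e  -- allocate() and free()

    constant D1 = 50, D2 = 100, D3 = 3000  -- emulated dimensions
    constant ELEM = 4                      -- 4 bytes per integer element

    atom base
    base = allocate(D1 * D2 * D3 * ELEM)

    -- turn a 3-dimensional index like {25,2,1500} into a flat byte address
    function addr3(integer i, integer j, integer k)
        return base + (((i-1) * D2 + (j-1)) * D3 + (k-1)) * ELEM
    end function

    procedure put3(integer i, integer j, integer k, integer val)
        poke4(addr3(i, j, k), val)
    end procedure

    function get3(integer i, integer j, integer k)
        return peek4s(addr3(i, j, k))  -- signed 4-byte read
    end function

    put3(25, 2, 1500, 42)
    ? get3(25, 2, 1500)  -- prints 42

    free(base)  -- releasing the whole block is one call

And when you're done with it, freeing the block is a single call, with 
no millions of reference counts to unwind.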

Doing it this way is obviously slower than using a normal sequence of 
normal size, and the emulated sequence has to be of fixed size, but for 
giant multi-dimensional sequences there is no comparison, as my code 
runs at the same speed no matter how big the sequence you're emulating 
is (as long as you've got the memory).

If anyone is interested, I can prepare the code for public consumption 
and upload it...

-- Andy


petelomax at blueyonder.co.uk wrote:
> I have a *HUGE* object of some 10 million entries. (I now know the
> exact size 'cos I coded a status window to let me know)
> [please read this all the way thru before responding]
> 
> Given that I have 48MB of ram, that 10 million means I am running
> out of physical memory (4 bytes per integer, yep), and the ole
> PC begins to disk thrash.
> 
> So, I code a "Cancel" button, press it and the program waddles on
> longer than expected.
> 
> So I trace it to the line
> 
> 	 stack={}
> 
> That line alone takes 3 minutes to execute.
> 
> Now, I kind of understand that difficulties abound in the general
> case when deallocating ten million entries, most of which are copies
> of copies of copies (however many it takes to get to ten million).
> 
> stack starts out as approx repeat({},55000), of which only the middle
> 3200 entries have been updated, each such that stack[i] is set to
> stack[i+/-1]&{x,y}.
> 
> So I guess reference counts are off the scale.
> 
> The annoying point here, then, assuming it *is* the problem, is that
> the ref count handling is, in this unusual case, wholly irrelevant.
> 
> I definitely doubt there is any way to improve this, and it really is
> not any kind of major problem in any sense, but on the off-chance, I
> am just asking if anyone has any suggestions.
> 
> Thanks
> Pete
> 
>
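
For anyone following along, my reading of the construction Pete 
describes is roughly this (the index range and the x/y values are 
invented for illustration; Pete grows entries from both neighbours, but 
one direction is shown here):

    sequence stack
    integer x, y
    x = 0
    y = 0
    stack = repeat({}, 55000)
    for i = 27401 to 30600 do  -- a hypothetical "middle 3200" range
        stack[i] = stack[i-1] & {x, y}  -- previous entry plus one point
    end for

Entry n of the 3200 holds 2*n atoms, so the whole lot comes to roughly 
3200 * 3201 = ~10 million atoms, which is what stack = {} has to free 
in one go.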

