Re: Is There a Maximum malloc Limitation?

bryanso said...

I have a Mac Pro with 32 GB of RAM. I'm using the Mac OS version of Euphoria 4.0.4, and I found it impossible to allocate 1 GB of RAM:

include std/eumem.e 
 
integer g = 1000000000 
printf(1, "  1G = %d\n", g) 
 
atom mem = malloc(g) 

This is not allocating 1 GB of RAM. eumem:malloc() takes an element count, not a byte count, so it's trying to allocate a sequence with 1,000,000,000 elements. There is a small (~20 byte) overhead for each sequence, and each element uses 4 bytes in 32-bit Euphoria, so you're really asking for over 4 GB (1,000,000,000 × 4 bytes, plus overhead) with this call. That's as much address space as your OS will give a 32-bit process in total, and of course other things are already allocated in your memory space.
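
If the goal is a flat 1 GB buffer of bytes rather than a managed sequence, std/machine.e can reserve raw memory directly. A minimal sketch (assuming raw byte storage suits your data; the failure check is my addition):

include std/machine.e 
 
-- Ask for 10^9 raw bytes; allocate() takes a byte count, not an 
-- element count, and returns 0 if the memory can't be allocated. 
atom ptr = allocate(1000000000) 
if ptr = 0 then 
    puts(1, "allocation failed\n") 
else 
    poke(ptr, 255)      -- write one byte 
    ? peek(ptr)         -- read it back: prints 255 
    free(ptr)           -- release the block when done 
end if 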

bryanso said...

That seems to be a very restrictive limitation. I'm working on an AI best-first-search app that has a huge search space, so I need to use as much memory as possible.

Is it possible to recompile Euphoria source in Mac OS to increase the limit?

Does Euphoria for Linux 64-bit have the same limit?

The limit is different in 64-bit Euphoria. You have practically unlimited memory space (48 bits of address space, or 256 TB). However, sequences are limited to 2,147,483,647 elements (based on the data type used for the length). If you actually try to use a sequence that big, you'll certainly have performance problems.
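
For scale, each sequence element on 64-bit Euphoria takes 8 bytes (my assumption, based on the word size), so the original billion-element request comes to roughly 8 GB and would fit in 32 GB of RAM:

-- 1,000,000,000 elements * 8 bytes/element = ~8 GB on 64-bit 
sequence s = repeat(0, 1000000000) 
printf(1, "length = %d\n", length(s)) 
-- The element count itself is capped at 2,147,483,647, so 
-- repeat(0, 3000000000) would fail no matter how much RAM you have. 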

bryanso said...

PS: I'm not implementing PBR. I'm relying on map.e, which uses malloc internally.

In 4.1, map.e received some significant upgrades. It's a lot faster, and you might have better luck with it (even on 32-bit), though given your description of what you're doing, you'll probably still run into memory problems.
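
For reference, here is a minimal sketch of the std/map.e usage in question; the keys and values are made up for illustration:

include std/map.e 
 
map states = map:new() 
map:put(states, "start", 0)        -- store key/value pairs 
map:put(states, "goal", 42) 
? map:get(states, "goal", -1)      -- prints 42 
? map:get(states, "missing", -1)   -- prints the default, -1 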

Matt
