1. Is There a Maximum malloc Limitation?

I have a Mac Pro with 32GB RAM. I'm using the Mac OS version of Euphoria 4.0.4. I found that it's not possible to allocate 1GB of RAM:

include std/eumem.e 
 
integer g = 1000000000 
printf(1, "  1G = %d\n", g) 
 
atom mem = malloc(g) 
ram_space[mem][1] = 10 
 
printf(1, "No issue allocating %d mem: %d\n", {g, ram_space[mem][1]}) 
 
free(mem) 

I got a memory error (the allocation succeeded when I tried 500M).

  1G = 1000000000 
eui(8014,0xa075b1d4) malloc: *** mach_vm_map(size=4000002048) failed (error code=3) 
*** error: can't allocate region 
*** set a breakpoint in malloc_error_break to debug 
 
/Users/bryanso/EUPHORIA4/include/std/eumem.e:47 in function malloc()  
Your program has run out of memory. 
One moment please...  

This seems to be a very restrictive limitation. I'm working on an AI best-first-search app that has a huge search space, so I need to use as much memory as possible.

Is it possible to recompile the Euphoria source on Mac OS to increase the limit?

Does Euphoria for Linux 64-bit have the same limit?

Thanks

PS: I'm not implementing PBR; I am relying on map.e, which uses malloc internally.


2. Re: Is There a Maximum malloc Limitation?

bryanso said...

I have a Mac Pro with 32GB RAM. I'm using the Mac OS version of Euphoria 4.0.4. I found that it's not possible to allocate 1GB of RAM:

include std/eumem.e 
 
integer g = 1000000000 
printf(1, "  1G = %d\n", g) 
 
atom mem = malloc(g) 

This is not allocating 1GB of RAM. It's trying to allocate a sequence with 1,000,000,000 elements. There is a small (~20-byte) overhead for each sequence, and each element (in 32-bit Euphoria) uses 4 bytes, so you're really trying to allocate over 4GB of RAM with this call (note the size=4000002048 in your mach_vm_map error). That's as much as your OS will allow a 32-bit process to have in total, and of course other things are already allocated in your memory space.
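
For contrast, here is a sketch of mine (not from the original post): if the goal really is a flat 1GB byte buffer rather than a billion-element sequence, std/machine.e's allocate() requests raw bytes directly, so the request is roughly 1GB instead of 4GB:

-- sketch only: raw allocation via std/machine.e, not eumem.e 
include std/machine.e 
 
atom addr = allocate(1000000000)   -- ~1GB of raw bytes; returns 0 on failure 
if addr = 0 then 
    puts(1, "allocation failed\n") 
else 
    poke(addr, 10)                 -- write a single byte 
    printf(1, "first byte: %d\n", peek(addr)) 
    free(addr) 
end if 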

bryanso said...

Seems to be a very restrictive limitation. I'm working on AI best-first-search app that has a huge search space therefore need to use as much memory as possible.

Is it possible to recompile Euphoria source in Mac OS to increase the limit?

Does Euphoria for Linux 64-bit have the same limit?

The limit is different in 64-bit Euphoria. You have a practically unlimited address space (48 bits, or 256TB). However, sequences are limited to 2,147,483,647 elements (based on the data type used for sequence lengths). If you actually try to use a sequence that big, you'll certainly have performance problems.
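
To spell out the arithmetic behind those figures (this just restates the numbers above, nothing is measured):

-- 48-bit address space and the 32-bit signed sequence-length limit 
atom addr_space = power(2, 48)      -- 281474976710656 bytes = 256TB 
atom max_len = power(2, 31) - 1     -- 2147483647 elements 
printf(1, "address space: %.0f bytes\n", {addr_space}) 
printf(1, "max sequence length: %.0f elements\n", {max_len}) 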

bryanso said...

ps I'm not implementing PBR. I am relying on map.e which uses malloc internally.

In 4.1, map.e had some significant upgrades. It's a lot faster, and you might have better luck with it (even on 32-bit), though you'll probably still run into some memory problems, given your description of what you're doing.
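
For reference, a minimal map.e usage sketch (my example, not code from the thread; new(), put() and get() are the std/map.e routines, while the names states and v are made up):

include std/map.e 
 
map states = map:new() 
map:put(states, "start", {0, 0}) 
map:put(states, "goal", {7, 3}) 
 
object v = map:get(states, "goal", -1)  -- -1 is returned if the key is absent 
? v                                     -- prints {7,3} 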

Matt


3. Re: Is There a Maximum malloc Limitation?

Thanks for the explanation, Matt.


4. Re: Is There a Maximum malloc Limitation?

It is not very good programming practice to have extremely large variables or variables with an enormously large number of elements.
I remember that at one point in my programming, I made 26 separate databases of street names just to reduce the number of records in each database.


5. Re: Is There a Maximum malloc Limitation?

Steady said...

It is not very good programming practice to have extremely large variables or variables with an enormously large number of elements.
I remember that at one point in my programming, I made 26 separate databases of street names just to reduce the number of records in each database.

I don't agree, if you mean that as a general programming principle or philosophy. It may be true for Euphoria 4.

In a best-first-search application, I really hope my programming language's built-in hashmap can make use of all or most of my RAM. I know I can create my own hashmap, much as you could split your database into 26 partitions.
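
For illustration only, a sketch of that kind of hand-rolled partitioning (my code, not from the thread): string keys are spread across 26 std/map.e maps by first letter, echoing the street-name databases; store, fetch and part_of are made-up names:

-- hypothetical sketch: 26 smaller maps instead of one huge one 
include std/map.e 
 
sequence parts = repeat(0, 26) 
for i = 1 to 26 do 
    parts[i] = map:new() 
end for 
 
function part_of(sequence key) 
    integer c = key[1] 
    if c >= 'a' and c <= 'z' then 
        c -= 32                    -- fold to upper case 
    end if 
    return c - 'A' + 1             -- assumes keys start with a letter 
end function 
 
procedure store(sequence key, object val) 
    map:put(parts[part_of(key)], key, val) 
end procedure 
 
function fetch(sequence key, object deflt) 
    return map:get(parts[part_of(key)], key, deflt) 
end function 
 
store("Main Street", 42) 
? fetch("Main Street", -1)         -- prints 42 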

We wouldn't be having this discussion, and you wouldn't have had to make 26 databases of street names, if 64-bit computing had been the norm when Euphoria got started. We have been working around a programming-language limitation, and I don't think that should be considered good programming practice.


6. Re: Is There a Maximum malloc Limitation?

bryanso said...
Steady said...

It is not very good programming practice to have extremely large variables or variables with an enormously large number of elements.
I remember that at one point in my programming, I made 26 separate databases of street names just to reduce the number of records in each database.

I don't agree, if you mean that as a general programming principle or philosophy. It may be true for Euphoria 4.

In a best-first-search application, I really hope my programming language's built-in hashmap can make use of all or most of my RAM. I know I can create my own hashmap, much as you could split your database into 26 partitions.

We wouldn't be having this discussion, and you wouldn't have had to make 26 databases of street names, if 64-bit computing had been the norm when Euphoria got started. We have been working around a programming-language limitation, and I don't think that should be considered good programming practice.

It is NOT a question of what a language can or cannot do with a large object or variable, nor of whether a particular database can store more records or a variable can hold more elements.

The problem you have to address is the time taken in accessing a record or an element, in searching and reindexing (a database) or extracting (a record or element). The more monolithic the variable or database you are dealing with, the longer each operation takes. You can use any type of search, a B-tree, a C-tree, ... or a Z-tree; searching a single large database will always take longer than searching one of the smaller databases you first select. Compacting a large database in memory is likewise a time-consuming operation.

For a straightforward illustration, look at a hard drive that has been used for over three years. You can run defrag software to get it tidied up and compacted, or do as I do: just copy everything to another drive or partition and reformat the original drive.

In real life, where thousands of searches, extractions, adds, inserts and deletes are taking place, it is far better to work with a larger number of variables or databases than with a single monolithic, memory-hogging variable or database, or a variable with billions of elements.


7. Re: Is There a Maximum malloc Limitation?

Steady said...
bryanso said...
Steady said...

It is not very good programming practice to have extremely large variables or variables with an enormously large number of elements.
I remember that at one point in my programming, I made 26 separate databases of street names just to reduce the number of records in each database.

I don't agree, if you mean that as a general programming principle or philosophy. It may be true for Euphoria 4.

In a best-first-search application, I really hope my programming language's built-in hashmap can make use of all or most of my RAM. I know I can create my own hashmap, much as you could split your database into 26 partitions.

We wouldn't be having this discussion, and you wouldn't have had to make 26 databases of street names, if 64-bit computing had been the norm when Euphoria got started. We have been working around a programming-language limitation, and I don't think that should be considered good programming practice.

It is NOT a question of what a language can or cannot do with a large object or variable, nor of whether a particular database can store more records or a variable can hold more elements.

The problem you have to address is the time taken in accessing a record or an element, in searching and reindexing (a database) or extracting (a record or element). The more monolithic the variable or database you are dealing with, the longer each operation takes. You can use any type of search, a B-tree, a C-tree, ... or a Z-tree; searching a single large database will always take longer than searching one of the smaller databases you first select. Compacting a large database in memory is likewise a time-consuming operation.

For a straightforward illustration, look at a hard drive that has been used for over three years. You can run defrag software to get it tidied up and compacted, or do as I do: just copy everything to another drive or partition and reformat the original drive.

In real life, where thousands of searches, extractions, adds, inserts and deletes are taking place, it is far better to work with a larger number of variables or databases than with a single monolithic, memory-hogging variable or database, or a variable with billions of elements.

There is, however, one way to search a vast database that performs very fast: a recursive binary search, using any type of variable and treating it as a binary value. A binary search takes at most about log2(N) steps for N sorted records (32 steps cover roughly four billion entries), but it can finish after as few as 3 steps.
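
For illustration (my sketch, not code from the thread), a recursive binary search over a sorted Euphoria sequence; bsearch, haystack and needle are made-up names:

-- recursive binary search; returns the index of needle, or 0 if absent 
function bsearch(sequence haystack, object needle, integer lo, integer hi) 
    integer mid, c 
    if lo > hi then 
        return 0 
    end if 
    mid = floor((lo + hi) / 2) 
    c = compare(haystack[mid], needle) 
    if c = 0 then 
        return mid 
    elsif c < 0 then 
        return bsearch(haystack, needle, mid + 1, hi) 
    else 
        return bsearch(haystack, needle, lo, mid - 1) 
    end if 
end function 
 
sequence data = {2, 3, 5, 7, 11, 13, 17} 
? bsearch(data, 11, 1, length(data))    -- prints 5 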
