Re: Is There a Maximum malloc Limitation?

Steady said...
bryanso said...
Steady said...

It is not very good programming practice to have extremely large variables, or variables with an enormous number of elements.
I remember at one point in my programming I made 26 separate databases of street names, just to reduce the number of records in each database.

I don't agree, if you mean that as a general programming principle or philosophy. It may be true for Euphoria 4.

In a best-first-search application, I really hope my programming language's built-in hashmap can make use of all, or most, of my RAM. I know I can build my own hashmap, much as you split your database into 26 partitions.
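To make the workaround concrete, here is a minimal sketch (in Python rather than Euphoria, and with made-up street names) of the "26 partitions" idea: instead of one monolithic map, records are bucketed by the first letter of the key, and a lookup first selects the small bucket and then searches within it.

```python
# Hypothetical sketch: partitioning street-name records by first letter,
# the same workaround as keeping 26 separate databases.
names = ["Abbey Road", "Baker Street", "Broadway", "Church Lane"]

# One monolithic map: name -> record id
monolithic = {name: i for i, name in enumerate(names)}

# Up to 26 partitions, keyed by the first letter of the name
partitions = {}
for i, name in enumerate(names):
    partitions.setdefault(name[0].upper(), {})[name] = i

def lookup(name):
    # Select the small partition first, then search only within it
    return partitions.get(name[0].upper(), {}).get(name)

print(lookup("Baker Street"))  # -> 1
print(lookup("Church Lane"))   # -> 3
```

Both layouts answer the same queries; the partitioned one just bounds how much data any single search has to touch.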

We wouldn't be having this discussion, and you wouldn't have had to make 26 databases out of street names, if 64-bit computing had been the norm when Euphoria got started. We have been working around a programming-language limitation; I don't think that should be considered good programming practice.

It is NOT a question of what a language can or cannot do with a large object or variable, nor of whether a particular database can store more records or a variable can hold more elements.

The problem you have to address is the time taken to access a record or an element: searching and reindexing (a database) or extracting (a record or element). The more monolithic the variable or database you are dealing with, the more time each operation takes. Whatever search structure you use, a B-tree, a C-tree, ... or a Z-tree, searching one large database will always take longer than first selecting one of the smaller databases and searching that. Compacting a large database in memory is likewise a time-consuming operation.
For a straightforward analogy, look at a hard drive that has been in use for over three years. You can run defrag software to get it tidied up and compacted, or do as I do: just copy everything to another drive or partition and reformat the original drive.
In real life, where thousands of searches, extractions, adds, inserts and deletes are taking place, it is far better to work with a larger number of variables or databases than with one monolithic, memory-hogging variable or database, or a variable with billions of elements.
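The argument above can be put in terms of worst-case comparison counts. A quick back-of-the-envelope sketch (in Python, with an assumed 26-million-record table as the example size): a binary or balanced-tree search costs about log2(N) comparisons, so pre-selecting one of 26 partitions shaves off roughly log2(26) ≈ 5 comparisons per search.

```python
import math

total = 26_000_000  # assumed size of the monolithic table

# Worst-case comparisons for a binary (or balanced-tree) search
steps_single = math.ceil(math.log2(total))

# After first selecting 1 of 26 equal partitions
steps_partitioned = math.ceil(math.log2(total // 26))

print(steps_single, steps_partitioned)  # -> 25 20
```

The per-search saving is modest, but over thousands of operations, and once reindexing and compaction costs are added, the smaller working set also matters for memory pressure.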

There is, however, one way to search a vast database that performs fastest:
a recursive binary search, usable with any type of variable
by treating it as a binary value.
It will take at most 32 steps for 64-bit values, but could finish after as few as 3 steps.
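A minimal sketch of such a recursive binary search (in Python, over an assumed sorted list of integer keys): each call halves the remaining range, so the worst case is about log2(N) steps, and 32 steps already cover roughly 4 billion records; a lucky match can end the search after just a few steps.

```python
def bsearch(sorted_vals, target, lo=0, hi=None):
    # Recursive binary search: each call halves the range, so it needs
    # at most ceil(log2(n)) steps; 32 steps cover ~4 billion records.
    if hi is None:
        hi = len(sorted_vals) - 1
    if lo > hi:
        return -1  # not found
    mid = (lo + hi) // 2
    if sorted_vals[mid] == target:
        return mid
    if sorted_vals[mid] < target:
        return bsearch(sorted_vals, target, mid + 1, hi)
    return bsearch(sorted_vals, target, lo, mid - 1)

vals = list(range(0, 1000, 7))  # sorted integer keys, stand-ins for records
print(bsearch(vals, 700))  # -> 100
```

The recursion depth is the step count the post refers to; an iterative version behaves identically but avoids call overhead.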
