Re: Searching for Fastest Way to Input a File

mindwalker said...
mattlewis said...

It's actually a limit of sequences. After all, 1073741823 should be enough sequence elements for everyone!

My point is there are imaginable applications where this sequence size limitation could make the task even more difficult.

I'm pretty sure matt's comment is a joke, in line with the supposed (and likely apocryphal) Bill Gates quote that "640K of RAM should be enough for everyone".

In 64-bit, I agree. There's no sane reason to have a limit.

In 32-bit, I'd still agree in principle. However, a 32-bit process can address at most 2GB of memory by default (there are ways around that, but they only push the limit back to 4GB, not counting things like PAE), and the minimum size of a sequence element is 4 bytes. In practice that means you can't get much past this size (if at all) anyway, and trying to use multiple sequences to get around the limit is doomed to fail for the same reason.
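To make the arithmetic concrete, here's a back-of-the-envelope check (sketched in Python rather than Euphoria): a sequence of the maximum 2^30 - 1 elements, at the 4-byte minimum per element, already needs roughly 4GB, so the length limit and the 32-bit address space run out at about the same point.

```python
MAX_ELEMENTS = 2**30 - 1       # 1073741823, the sequence length limit
MIN_ELEMENT_BYTES = 4          # minimum storage per sequence element

total_bytes = MAX_ELEMENTS * MIN_ELEMENT_BYTES
print(total_bytes)             # 4294967292 bytes, just under 4 GiB
print(total_bytes > 2 * 2**30) # True: already past the default 2 GiB limit
print(total_bytes < 2**32)     # True, but only by 4 bytes
```

So even a single maximum-length sequence of the cheapest possible elements would not fit in a default 32-bit process, before counting the program itself or any other data.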

In short, no 32-bit process can model the entire galaxy (or work on similar-sized datasets) while holding all the information in conventional addressing memory space.

If you really want to process datasets this big, you either 1) use a different algorithm (one that loads slices of the dataset one at a time or something), 2) use a 64-bit process, or 3) both. In light of these restrictions on 32-bit processes, it can be argued that the current limitation makes sense by providing a (slightly) more user friendly error message than a mere OOM.
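Option 1 is just streaming: read and process a fixed-size slice at a time so memory use stays bounded by the slice size, not the file size. A minimal sketch (in Python for brevity; the hypothetical `process_in_chunks` name and 64KB chunk size are my choices, not anything from the thread):

```python
import os
import tempfile

def process_in_chunks(path, chunk_size=64 * 1024):
    """Stream a file in fixed-size slices instead of loading it whole."""
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)   # at most chunk_size bytes in memory
            if not chunk:                # empty read means end of file
                break
            total += len(chunk)          # stand-in for real per-slice work
    return total

# Usage: the byte count matches the file size even though the whole
# file is never held in memory at once.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"x" * 200_000)
print(process_in_chunks(tmp.name))       # 200000
os.remove(tmp.name)
```

The same shape works in Euphoria with `open`/`get_bytes` in a loop; the point is only that peak memory is `chunk_size`, so the sequence length limit never comes into play.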
