Re: source code speed question

mattlewis said...
Bernie said...

In DLL.E, C_INT, C_LONG, etc. are constants initialized in HEX.

When using, say, C_LONG, does the program convert it to DECIMAL each time it's used in the user's code, or is it converted to DECIMAL only ONCE, when it's defined as a constant?

I know it is more convenient to use HEX in the source, but wouldn't it be faster to use all DECIMAL in the constants so no conversion would be required?

Bernie,

I'm not sure what you're asking. A number is a number. To the computer, it's a bunch of bits, neither decimal nor hexadecimal. The radix only affects the display of a number.
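For example (just a sketch; the constant names here are made up, not from dll.e), a value written in hex and the same value written in decimal end up as the identical stored number:

constant HEX_FORM = #12B   -- written as hex in the source
constant DEC_FORM = 299    -- written as decimal in the source

-- Both hold the same bit pattern; only the source text differed.
? HEX_FORM               -- prints 299
? DEC_FORM               -- prints 299
? (HEX_FORM = DEC_FORM)  -- prints 1 (true)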

Perhaps it's the sign that's the issue? Take a look at the docs for Atoms and Sequences. In particular:

TFM said...

Hex numbers are always positive, unless you add a minus sign in front of the # character. So for instance #FFFFFFFF is a huge positive number (4294967295), not -1, as some machine-language programmers might expect.
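To illustrate that point (again just a sketch, not taken from the manual):

? #FFFFFFFF          -- prints 4294967295, a large positive atom
? -#1                -- prints -1; the minus sign goes in front of the '#'
? (#FFFFFFFF = -1)   -- prints 0: these are two different Euphoria values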

Matt

Matt:

Because you are using a character-by-character scan of the number.

For example, using the same number:

You see a HEX number in the code as '#' '1' '2' 'B'

You see a DECIMAL number in the code as '2' '9' '9'

Which is faster to scan?
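Here is a rough sketch (not the actual Euphoria scanner, just an illustration; scan_hex and scan_decimal are made-up helpers) of the kind of character-by-character conversion being described. In a typical interpreter front end this runs once per literal, when the source is read, in either radix:

function scan_decimal(sequence text)
    atom n
    n = 0
    for i = 1 to length(text) do
        n = n * 10 + (text[i] - '0')
    end for
    return n
end function

function scan_hex(sequence text)  -- leading '#' assumed already consumed
    atom n
    integer c
    n = 0
    for i = 1 to length(text) do
        c = text[i]
        if c >= '0' and c <= '9' then
            n = n * 16 + (c - '0')
        else
            n = n * 16 + (c - 'A' + 10)
        end if
    end for
    return n
end function

? scan_hex("12B")      -- prints 299
? scan_decimal("299")  -- prints 299

Either way the cost is a handful of character operations per literal, paid once when the literal is scanned.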
