Re: We're All Doomed (Programmers, That Is)
- Posted by ags <eu at 531pi.co.nz>
Mar 03, 2006
ags wrote:
> I was looking to see whether the define for time_t was an explicit signed int
> or the compiler somehow converted that to the machine specific word size.
>
> My guess is that in the world of C 'signed int' means just that, ie 32 bits.
> Upgrading to a 64-bit architecture might not necessarily automatically promote
> this to 'signed long'.
I checked the comp.lang.c FAQ, since I remembered it had something about how big
an int actually is (the pasted bit is a little messy, sorry):
--begin FAQ
From these values, it can be inferred that char is at least 8 bits, short int
and int are at least 16 bits, and long int is at least 32 bits. (The signed and
unsigned versions of each type are guaranteed to have the same size.) Under ANSI
C, the maximum and minimum values for a particular machine can be found in the
header file <limits.h>; here is a summary:
Base type   Minimum size (bits)   Minimum value (signed)   Maximum value (signed)   Maximum value (unsigned)
char        8                     -127                     127                      255
short       16                    -32,767                  32,767                   65,535
int         16                    -32,767                  32,767                   65,535
long        32                    -2,147,483,647           2,147,483,647            4,294,967,295
(These values are the minimums guaranteed by the Standard. Many implementations
allow larger values, but portable programs shouldn't depend on it.)
--end FAQ
Checking limits.h on Linux, it says int is 32 bits, and _if_ you're on a 64-bit
processor, only long long is 64 bits.
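If you want to see what your own compiler and libc actually give you, a quick
test along these lines (just a rough sketch, nothing clever) prints the real
values out of <limits.h>:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Print the actual ranges and sizes this implementation provides;
       the cast to unsigned long keeps the printf portable to old compilers. */
    printf("CHAR_BIT     = %d\n", CHAR_BIT);
    printf("INT_MAX      = %d\n", INT_MAX);
    printf("LONG_MAX     = %ld\n", LONG_MAX);
    printf("sizeof(int)  = %lu\n", (unsigned long)sizeof(int));
    printf("sizeof(long) = %lu\n", (unsigned long)sizeof(long));
    return 0;
}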
So I'd say time_t would remain 32 bits even on a 512-bit processor...
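For completeness, the same trick works on time_t itself (rough sketch again;
only needs <time.h>), so anyone curious can check their own system rather than
guessing:

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* time_t is an arithmetic type, but its width and signedness are
       implementation-defined, so measure rather than assume. */
    printf("sizeof(time_t) = %lu bytes\n", (unsigned long)sizeof(time_t));
    printf("time_t is %s\n", (time_t)-1 < (time_t)0 ? "signed" : "unsigned");
    return 0;
}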
Gary