How is the size of an integer decided?
- Is it based on the processor, the compiler, or the OS?
Answer Posted / lanchai
Different hardware platforms can use different sizes for an integer, and you may also get different sizes under different operating systems, because the compilers for those systems target different hardware and data models to begin with. Also note that "sizeof" is not a run-time command but a compile-time operator: the compiler works out its value when the program is built (see the Wikipedia article on sizeof).
So strictly speaking, it is the target hardware (the processor), reflected through the compiler's data model, that determines the size of an integer.
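For illustration, here is a minimal sketch (assuming a hosted C11 compiler such as gcc or clang) that prints the sizes the compiler chose for this particular build, and uses a static assertion to show that sizeof is evaluated at compile time:

#include <stdio.h>
#include <assert.h>   /* static_assert (C11) */

int main(void)
{
    /* sizeof is a compile-time operator, so it can be used in a
       compile-time check; the C standard only guarantees that
       int is at least 16 bits wide. */
    static_assert(sizeof(int) >= 2, "int must be at least 16 bits");

    /* The actual values depend on the target's data model, e.g.
       ILP32 (32-bit targets), LP64 (64-bit Linux/macOS) or
       LLP64 (64-bit Windows). */
    printf("char : %zu byte\n",  sizeof(char));
    printf("short: %zu bytes\n", sizeof(short));
    printf("int  : %zu bytes\n", sizeof(int));
    printf("long : %zu bytes\n", sizeof(long));
    printf("ptr  : %zu bytes\n", sizeof(void *));
    return 0;
}

On a typical 64-bit Linux build (LP64) this prints int = 4 and long = 8, while a 64-bit Windows build (LLP64) prints long = 4: the same source code, but different compiler targets give different sizes.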