How is the size of an integer decided?
- Is it based on the processor, the compiler, or the OS?
Answer posted by lanchai
Different hardware systems can use different sizes for an integer. You might also see different numbers under different operating systems, because the hardware those OSes run on differs to begin with, and because each compiler follows the ABI of the platform it targets. Note that "sizeof" is not a run-time function: it is an operator whose result is worked out by the compiler at compile time (see the Wikipedia article on sizeof).
So, strictly speaking, the compiler decides (the exact size of int is implementation-defined; the C standard only requires minimum ranges), but the compiler normally picks a size that matches the word size and ABI of the target processor and OS. In that sense it is the hardware (or processor) that ultimately drives the size of an integer.
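For illustration, here is a minimal sketch in plain standard C (nothing platform-specific assumed) that prints the sizes the compiler chose for the integer types. Compiling it with a 32-bit versus a 64-bit compiler, or with different compilers on the same machine, can produce different numbers:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* sizeof is evaluated at compile time, so these values reflect
       the compiler's target ABI, not the machine the binary happens
       to be copied to later. */
    printf("sizeof(short)     = %zu\n", sizeof(short));
    printf("sizeof(int)       = %zu\n", sizeof(int));
    printf("sizeof(long)      = %zu\n", sizeof(long));
    printf("sizeof(long long) = %zu\n", sizeof(long long));

    /* The standard only guarantees minimum ranges,
       e.g. INT_MAX must be at least 32767. */
    printf("INT_MAX           = %d\n", INT_MAX);
    return 0;
}

As a rough guide, a typical LP64 system (64-bit Linux or macOS) prints 2, 4, 8, 8, while 64-bit Windows (LLP64) keeps long at 4 bytes on the same hardware, which shows that the compiler's ABI, guided by the processor, makes the final call.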