How is the size of an integer decided?
- Is it based on the processor, the compiler, or the OS?
Answers were sorted based on users' feedback
Answer / lanchai
Different hardware systems might use a different size for an integer,
so you might get a different number under different OSes simply because
the hardware running those OSes was different to begin with. Also,
sizeof is not a runtime call but an operator whose result is computed
by the compiler at compile time (see the Wikipedia article on sizeof).
So strictly speaking, it is the hardware (or processor) that
determines the size of an integer.
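
A minimal sketch (assuming a C99-capable compiler) that shows both points: what size this particular implementation picked, and that sizeof is a compile-time constant expression:

#include <stdio.h>

int main(void)
{
    /* sizeof is evaluated by the compiler, not at runtime; its result
       is a constant expression, so it can even size an array. */
    char buf[sizeof(int)];

    printf("int is %zu bytes on this implementation\n", sizeof(int));
    printf("buf therefore holds %zu bytes\n", sizeof buf);
    return 0;
}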
Is This Answer Correct ? | 11 Yes | 2 No |
Answer / akshay
The compiler.
If the compiler is a 16-bit compiler, then int is 2 bytes (Turbo C).
If the compiler is 32-bit, then int is 4 bytes (VC++).
But for a 32-bit compiler the processor must be compatible, i.e. 32-bit,
so the processor also decides it.
The same again for the OS: if the OS is 16-bit, how are you going to
run a 32-bit compiler on it?
So it is a confusing question.
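
One thing worth adding: the C standard itself only sets a floor, and the compiler's data model fills in the rest. A sketch (assuming a hosted C compiler) that makes the floor visible:

#include <limits.h>
#include <stdio.h>

/* The standard only fixes minimum ranges: int must cover at least
   -32767..32767, i.e. at least 16 bits. The compiler's data model
   chooses the actual width, within what the target supports. */
#if INT_MAX == 32767
#define MODEL "a 16-bit int (e.g. Turbo C on DOS)"
#else
#define MODEL "an int wider than 16 bits"
#endif

int main(void)
{
    printf("This compiler gives %s: %zu bytes, INT_MAX = %d\n",
           MODEL, sizeof(int), INT_MAX);
    return 0;
}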
Is This Answer Correct ? | 5 Yes | 2 No |
Answer / sandeep
It's the CPU (processor) which decides the size.
Please see the link below:
http://en.wikipedia.org/wiki/Integer_(computer_science)
Is This Answer Correct ? | 2 Yes | 1 No |
Answer / hrishikesh
The answer is the compiler, and it is not OS-, machine-, or
processor-dependent, because you can port different OSes onto the
same processor.
In Turbo C, int is 2 bytes, while in gcc it is 4 bytes.
So here neither your processor changes nor your kernel, as all of
them originated from UNIX.
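
If the point is that the width should not depend on which compiler you picked, C99's <stdint.h> fixed-width types sidestep the issue entirely (a minimal sketch, assuming a C99 compiler):

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    int16_t a = 1234;   /* exactly 16 bits with every conforming compiler */
    int32_t b = 56789;  /* exactly 32 bits */

    printf("a = %" PRId16 " (%zu bytes), b = %" PRId32 " (%zu bytes)\n",
           a, sizeof a, b, sizeof b);
    return 0;
}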
Is This Answer Correct ? | 2 Yes | 1 No |
Answer / mangesh
Every book says it depends on the machine; some machines may have the
same size for int and long int.
By "machine" they mean the processor only, because it can't be either
the compiler or the OS, as those are part of the virtual, i.e.
software, side, not the machine.
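
That int/long overlap is easy to observe; a quick check (ILP32 and LP64 are the common data-model names, and other targets may print different numbers):

#include <stdio.h>

int main(void)
{
    /* On an ILP32 target both lines print 4; on LP64 (64-bit
       Linux/Unix) long grows to 8 while int stays at 4. */
    printf("sizeof(int)      = %zu\n", sizeof(int));
    printf("sizeof(long int) = %zu\n", sizeof(long int));
    return 0;
}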
Is This Answer Correct ? | 8 Yes | 8 No |
Answer / rakesh
It should be the compiler, because Turbo C and VC have different sizes for integer.
Of course the OS and processor interfere, but only in deciding whether a compiler can be installed at all; after that it is the compiler's work.
Is This Answer Correct ? | 2 Yes | 2 No |