How is the size of an integer decided?
- Is it based on the processor, the compiler, or the OS?
Answer Posted / vishal
All three play a role:

Processor: the width of the ALU (16-bit vs 32-bit) influences what int size is natural for the machine, so int differs across processor generations.

OS: the OS's data model affects integer type sizes. For example, int is 4 bytes on both 64-bit Windows and 64-bit Linux, but long is 4 bytes on Windows (LLP64) and 8 bytes on Linux (LP64).

Compiler: ultimately the compiler fixes the sizes. An old 16-bit C compiler such as Turbo C uses a 2-byte int, while a modern compiler on the same machine uses 4 bytes; Java, by contrast, defines int as exactly 4 bytes on every platform.

Conclusion: operating systems are designed around the operation capacity of the ALU, and compilers work at a somewhat higher layer, so in practice the compiler (following the platform's ABI) decides the size of int. The C standard itself only requires that int be at least 16 bits.
Explain with the aid of an example why arrays of structures don’t provide an efficient representation when it comes to adding and deleting records internal to the array.
Explain threaded binary trees?
When should you not use a type cast?
What is the difference between abs() and fabs() functions?
What is difference between main and void main?
Write a program to identify if a given binary tree is balanced or not.
What is a nested loop?
What is pass by value in c?
What is string length in c?
Can we declare variables anywhere in c?
A parameter passed between a calling program and a called program is called a: a) variable b) constant c) argument d) all of the above
What do the 'c' and 'v' in argc and argv stand for?
How can I avoid the abort, retry, fail messages?
How can I swap two values without using a temporary?
Describe the order of precedence with regards to operators in C.