What happens when we add more entries than the size limit (capacity × load factor) to a HashMap?
Answer posted by kanu butani
When the number of entries in the hash table exceeds the
product of the load factor and the current capacity, the
hash table is rehashed (that is, internal data structures
are rebuilt) so that the hash table has approximately twice
the number of buckets.
As a general rule, the default load factor (0.75) offers a
good tradeoff between time and space costs. Higher values
decrease the space overhead but increase the lookup cost
(reflected in most of the operations of the HashMap class,
including get and put). The expected number of entries in
the map and its load factor should be taken into account
when setting its initial capacity, so as to minimize the
number of rehash operations. If the initial capacity is
greater than the maximum number of entries divided by the
load factor, no rehash operations will ever occur.
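As a minimal sketch of the rule above: a default HashMap (capacity 16, load factor 0.75) resizes once its entry count exceeds the threshold 16 × 0.75 = 12, while pre-sizing the map with an initial capacity greater than expectedEntries / loadFactor avoids rehashing entirely. The class and variable names below are illustrative, not from the original answer.

```java
import java.util.HashMap;
import java.util.Map;

public class HashMapResizeDemo {
    public static void main(String[] args) {
        // Default HashMap: capacity 16, load factor 0.75.
        // The internal resize threshold is 16 * 0.75 = 12, so adding
        // the 13th entry triggers a rehash into ~32 buckets.
        Map<Integer, String> defaultMap = new HashMap<>();
        for (int i = 0; i < 13; i++) {
            defaultMap.put(i, "value" + i);
        }

        // To avoid rehashing for n expected entries, choose an initial
        // capacity greater than n / loadFactor, as the answer states.
        int expected = 1000;
        Map<Integer, String> presized =
                new HashMap<>((int) (expected / 0.75f) + 1);
        for (int i = 0; i < expected; i++) {
            presized.put(i, "value" + i);
        }

        System.out.println(defaultMap.size()); // 13 (resize already happened)
        System.out.println(presized.size());   // 1000 (no rehash needed)
    }
}
```

Note that a resize is transparent to callers: all entries remain reachable after rehashing; only the internal bucket array is rebuilt, which is why choosing a good initial capacity is purely a performance concern.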