(1) HashMap series: load factor 0.75

(2) HashMap series: tree threshold 8, degradation threshold 6

(3) HashMap series: power-of-two expansion

(4) HashMap series: putting elements (miss it and you'll regret it forever!)

Red-black tree series:

1. "Algorithm – Simple": introduction to N-ary trees

2. "Algorithm – Simple": red-black tree rotation

3. "Algorithm – Simple": red-black tree insertion

4. "Algorithm – Simple": red-black tree deletion

1. Foreword

It is recommended that you skim the articles listed above before reading this one; it will make the ideas behind HashMap much easier to follow.

2. Lazy initialization of HashMap

We usually create a HashMap like this:

Map<K, V> map = new HashMap<>();

HashMap default constructor:

public HashMap() {
    this.loadFactor = DEFAULT_LOAD_FACTOR; // all other fields defaulted
}

We can see that the constructor only initializes the load factor (0.75); the backing array is not initialized, i.e. no memory is allocated for it.

The table is only created when the first key-value pair is put (both put and putIfAbsent call putVal):

public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) {
        Node<K,V>[] tab; Node<K,V> p; int n, i;
        if ((tab = table) == null || (n = tab.length) == 0)
            n = (tab = resize()).length; // when table is null, resize() performs the initial allocation
        ......
    }
}

HashMap.resize:

public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    static final int DEFAULT_INITIAL_CAPACITY = 1 << 4; // aka 16
    static final float DEFAULT_LOAD_FACTOR = 0.75f;

    final Node<K,V>[] resize() {
        Node<K,V>[] oldTab = table;
        int oldCap = (oldTab == null) ? 0 : oldTab.length;
        int oldThr = threshold;
        int newCap, newThr = 0;
        if (oldCap > 0) {
            ......
        }
        else if (oldThr > 0) // initial capacity was placed in threshold
            newCap = oldThr;
        else {               // zero initial threshold signifies using defaults
            newCap = DEFAULT_INITIAL_CAPACITY;
            newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
        }
        ......
        @SuppressWarnings({"rawtypes","unchecked"})
        Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap]; // create the new bucket array
        ......
        return newTab;
    }
}
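
To make the lazy allocation visible, here is a minimal sketch of my own (not JDK code) that reads the package-private table field by reflection; the field name table is a JDK 1.8 internal, and reflective access is assumed to be permitted in your runtime:

import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.Map;

public class LazyInitDemo {
    public static void main(String[] args) throws Exception {
        Map<String, String> map = new HashMap<>();

        Field tableField = HashMap.class.getDeclaredField("table");
        tableField.setAccessible(true);

        // before the first put: the backing array has not been allocated yet
        System.out.println("before put: table = " + tableField.get(map)); // null

        map.put("k", "v");

        // after the first put: resize() has allocated the default 16-slot array
        Object[] table = (Object[]) tableField.get(map);
        System.out.println("after put:  table.length = " + table.length); // 16
    }
}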

3. Adding objects

3.1 HashMap.put / HashMap.putIfAbsent

public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    public V put(K key, V value) {
        return putVal(hash(key), key, value, false, true);
    }

    // only inserts when the key is absent; returns the existing value if one is already mapped
    @Override
    public V putIfAbsent(K key, V value) {
        return putVal(hash(key), key, value, true, true);
    }
}
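
A quick usage sketch of my own showing the difference between the two: put always overwrites and returns the previous value, while putIfAbsent only stores the value when the key is absent (or is currently mapped to null):

import java.util.HashMap;
import java.util.Map;

public class PutVsPutIfAbsent {
    public static void main(String[] args) {
        Map<String, Integer> map = new HashMap<>();

        System.out.println(map.put("a", 1));         // null (no previous mapping)
        System.out.println(map.put("a", 2));         // 1    (old value returned, value overwritten)

        System.out.println(map.putIfAbsent("a", 3)); // 2    (key present, value NOT replaced)
        System.out.println(map.get("a"));            // 2

        map.put("b", null);
        System.out.println(map.putIfAbsent("b", 4)); // null (old value was null, so it IS replaced)
        System.out.println(map.get("b"));            // 4
    }
}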

3.2 HashMap.putVal

public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) {
        Node<K,V>[] tab; Node<K,V> p; int n, i;
        if ((tab = table) == null || (n = tab.length) == 0)
            n = (tab = resize()).length;
        if ((p = tab[i = (n - 1) & hash]) == null)
            tab[i] = newNode(hash, key, value, null);
        else {
            Node<K,V> e; K k;
            if (p.hash == hash &&
                ((k = p.key) == key || (key != null && key.equals(k))))
                e = p;
            else if (p instanceof TreeNode)
                e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
            else {
                for (int binCount = 0; ; ++binCount) {
                    if ((e = p.next) == null) {
                        p.next = newNode(hash, key, value, null);
                        if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                            treeifyBin(tab, hash);
                        break;
                    }
                    if (e.hash == hash &&
                        ((k = e.key) == key || (key != null && key.equals(k))))
                        break;
                    p = e;
                }
            }
            if (e != null) { // existing mapping for key
                V oldValue = e.value;
                if (!onlyIfAbsent || oldValue == null)
                    e.value = value;
                afterNodeAccess(e);
                return oldValue;
            }
        }
        ++modCount;
        if (++size > threshold)
            resize();
        afterNodeInsertion(evict);
        return null;
    }
}

The code is getting more complex, so let me break it down and analyze it piece by piece:

  • Check whether this is the first insertion and the table still needs to be initialized (this was analyzed in the lazy-initialization section above and is not repeated here)
public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) {
        Node<K,V>[] tab; Node<K,V> p; int n, i;
        if ((tab = table) == null || (n = tab.length) == 0)
            n = (tab = resize()).length;
        ......
    }
}
  • If the bucket at the index computed from the hash is null, insert the new node directly (a small sketch of the index computation follows the code below)
public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) {
        ......
        if ((p = tab[i = (n - 1) & hash]) == null)
            tab[i] = newNode(hash, key, value, null);
        ......
    }
}
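
The expression (n - 1) & hash is just a fast modulo: because the table length n is always a power of two, n - 1 is a mask of the low bits. A small illustration of my own (not JDK code):

public class IndexDemo {
    public static void main(String[] args) {
        int n = 16; // table length is always a power of two
        for (int hash : new int[]{1, 17, 33, -5}) {
            // for power-of-two n, (n - 1) & hash keeps only the low bits,
            // which equals a non-negative "hash mod n"
            System.out.println(hash + " -> index " + ((n - 1) & hash)
                    + " (Math.floorMod = " + Math.floorMod(hash, n) + ")");
        }
    }
}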

When the bucket is not empty, there are three cases; see the comments in the code below:

  • Case A: the bucket at the computed index already holds an element; check whether it is the same key (same hash, then reference equality or key.equals);
  • Case B: the bucket holds a red-black tree; search it (return the node directly if found) or insert a new tree node (if not found);
  • Case C: the bucket holds a singly linked list; walk it looking for the same key (the same check as case A); if the key is not found, append the new node to the list and then decide whether the list needs to be converted to a red-black tree;
public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) {
        ......
        // compute the array index => i, and fetch the element at that index => p
        if ((p = tab[i = (n - 1) & hash]) == null)
            ......
        else {
            Node<K,V> e; K k;
            // A: same hash and equal key
            if (p.hash == hash &&
                ((k = p.key) == key || (key != null && key.equals(k))))
                e = p;
            // B: putTreeVal:
            // 1. if the key is found in the tree, return that node;
            // 2. otherwise insert a new tree node and return null
            else if (p instanceof TreeNode)
                e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
            // C: walk the singly linked list
            else {
                for (int binCount = 0; ; ++binCount) {
                    if ((e = p.next) == null) {
                        p.next = newNode(hash, key, value, null); // not found: append to the tail
                        if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st: maybe convert to a red-black tree
                            treeifyBin(tab, hash);
                        break; // break with e = p.next = null
                    }
                    // found: still needs the same check as case A
                    if (e.hash == hash &&
                        ((k = e.key) == key || (key != null && key.equals(k))))
                        break; // break with e = p.next != null
                    p = e; // p = p.next, continue the loop
                }
            }
            ......
        }
        ......
    }
}
  • Handling a key that already exists
public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) {
        ......
        else {
            ......
            // e != null means the key already exists in the map
            if (e != null) { // existing mapping for key
                V oldValue = e.value; // take the old value
                // replace (overwrite) with the new value if replacement is allowed, or if the old value is null
                if (!onlyIfAbsent || oldValue == null)
                    e.value = value;
                afterNodeAccess(e);
                // return the old value (put always replaces; putIfAbsent only replaces a null value)
                return oldValue;
            }
        }
        ......
    }
}
  • After a new element is added, decide whether the table needs to grow (growth is only considered when the buckets are not enough); but every new element, wherever it lands, increments size by 1 (see the threshold sketch after the code below)
public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) {
        ......
        // whether the element went into an empty bucket, a linked list, or a red-black tree, size is incremented by 1;
        // if size exceeds the threshold, the table is resized
        if (++size > threshold)
            resize();
        afterNodeInsertion(evict); // hook used by LinkedHashMap
        return null; // a non-null return above would have meant the key already existed
    }
}
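
To see the threshold in action with ordinary, non-colliding keys: the default threshold is 16 * 0.75 = 12, so the table grows while the 13th entry is being put. A minimal sketch of my own (capacity() is a package-private JDK 1.8 internal, read here by reflection, the same trick as in the test in section 4; reflective access is assumed to be allowed):

import java.lang.reflect.Method;
import java.util.HashMap;

public class ThresholdDemo {
    public static void main(String[] args) throws Exception {
        HashMap<Integer, Integer> map = new HashMap<>();
        Method capacity = HashMap.class.getDeclaredMethod("capacity");
        capacity.setAccessible(true);

        for (int i = 1; i <= 13; i++) {
            map.put(i, i);
            // threshold = 16 * 0.75 = 12, so the 13th put pushes size past it and the capacity becomes 32
            System.out.println("size = " + map.size() + ", capacity = " + capacity.invoke(map));
        }
    }
}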

4. HashMap Extreme Test (must see!)

When I reached that last point, I suspected there was something wrong with the source I was reading, so I wrote a test case to verify my guess (reminder: my source is JDK 1.8).

Here’s my guess:

  • Premise: according to the source, size is incremented by 1 regardless of where the new object is added (bucket, linked list, or red-black tree);
  • Construction: I create a special class whose hash code is always the same fixed value but whose equals always returns false; this way every insertion produces a hash collision;
  • Conclusion: I will state it up front: yes, a resize does happen, but there is a surprise;

The special class:

public class SpecialClass {
    @Override
    public int hashCode() {
        return 1; // every instance has the same hash, forcing collisions
    }

    @Override
    public boolean equals(Object obj) {
        return false; // force any two objects to be treated as different
    }
}

Test code:

import java.lang.reflect.Method;
import java.util.HashMap;

public class Main {
    public static void main(String[] args) {
        HashMap<SpecialClass, SpecialClass> map = new HashMap<>();
        Method capacity = null;
        try {
            // reflection: read the package-private capacity() method
            capacity = map.getClass().getDeclaredMethod("capacity");
            capacity.setAccessible(true);
            for (int i = 0; i < 13; i++) {
                SpecialClass specialClass = new SpecialClass();
                map.put(specialClass, specialClass);
                System.out.println("[" + i + "] map.size = " + map.size()
                        + ", capacity = " + capacity.invoke(map));
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

Actual output:

 [0] map.size = 1, capacity = 16
 [1] map.size = 2, capacity = 16
 [2] map.size = 3, capacity = 16
 [3] map.size = 4, capacity = 16
 [4] map.size = 5, capacity = 16
 [5] map.size = 6, capacity = 16
 [6] map.size = 7, capacity = 16
 [7] map.size = 8, capacity = 16
 [8] map.size = 9, capacity = 32
 [9] map.size = 10, capacity = 64
[10] map.size = 11, capacity = 64
[11] map.size = 12, capacity = 64
[12] map.size = 13, capacity = 64

Each time a new element is added, size does increase by 1.

  • [0] goes into an empty bucket;
  • [1] ~ [7] go into a singly linked list;
  • [8] is appended to the list first; according to the source it should now be converted to a red-black tree, but as explained in "HashMap series: tree threshold 8, degradation threshold 6", treeification only kicks in once capacity reaches 64, and indeed the printed log shows a resize instead;
  • [9] triggers yet another resize;
  • [10] ~ [12] cause no further resizes;

You must have been eager to hear my analysis, OK, GO!

From the analysis above, the surprise must come from the conversion of the singly linked list to a red-black tree, so let's look at the treeification code directly:

public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) {
        ......
        // compute the array index => i, and fetch the element at that index => p
        if ((p = tab[i = (n - 1) & hash]) == null)
            ......
        else {
            // case A
            ......
            // case B (red-black tree)
            ......
            // case C (singly linked list)
            else {
                for (int binCount = 0; ; ++binCount) {
                    if ((e = p.next) == null) {
                        p.next = newNode(hash, key, value, null);
                        // after the node is appended to the list, decide whether the list should be treeified
                        if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                            treeifyBin(tab, hash);
                        break; // break with e = p.next = null
                    }
                    ......
                }
            }
            ......
        }
        ......
    }
}

Singly linked list to red-black tree (treeifyBin):

public class HashMap<K,V> extends AbstractMap<K,V> implements Map<K,V>, Cloneable, Serializable {
    final void treeifyBin(Node<K,V>[] tab, int hash) {
        int n, index; Node<K,V> e;
        // the key point: if the current capacity (number of buckets) is still < MIN_TREEIFY_CAPACITY (64),
        // the table is resized instead of converting the list into a tree
        if (tab == null || (n = tab.length) < MIN_TREEIFY_CAPACITY)
            resize();
        else if ((e = tab[index = (n - 1) & hash]) != null) {
            // the actual treeification
            ......
        }
    }
}

And this is where you notice something strange:

  • Current capacity = 16: when [8] is put, treeifyBin is called, and since capacity < 64 the table is resized to capacity = 32. But! Because every element has the same hash, after the resize all of the elements still stay on one and the same linked list (there is no split into low and high lists), so the singly linked list length is 8 - 1 = 7 (not counting the first element, which sits in the bucket itself);
  • Current capacity = 32: when [9] is put, treeifyBin is called again and the table is resized to capacity = 64. Likewise, all of the elements remain on the list; at this point the list length is 9 - 1 = 8 (not counting the first element in the bucket);
  • Current capacity = 64: the next element is appended to the end of the list, the treeification condition is finally met, and the list is converted to a red-black tree;
  • Subsequent colliding elements are inserted directly into the red-black tree (a small sketch after this list verifies that a pre-sized table treeifies right away);
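
Here is a minimal verification sketch of my own, reusing the SpecialClass defined above: if the map is pre-sized so that the table already holds 64 buckets, the 9th colliding element is treeified immediately and the capacity never changes. The internal names table and TreeNode are JDK 1.8 internals read via reflection, so treat this as an assumption about that version:

import java.lang.reflect.Field;
import java.util.HashMap;

public class TreeifyDemo {
    public static void main(String[] args) throws Exception {
        // initialCapacity 64 => the first resize allocates 64 buckets, so MIN_TREEIFY_CAPACITY is already met
        HashMap<SpecialClass, SpecialClass> map = new HashMap<>(64);

        Field tableField = HashMap.class.getDeclaredField("table");
        tableField.setAccessible(true);

        for (int i = 0; i < 9; i++) {
            SpecialClass s = new SpecialClass();
            map.put(s, s);
        }

        Object[] table = (Object[]) tableField.get(map);
        for (Object bin : table) {
            if (bin != null) {
                // all keys collide into one bucket; after the 9th put that bucket holds TreeNodes, not plain Nodes
                System.out.println("table.length = " + table.length
                        + ", node class = " + bin.getClass().getSimpleName()); // expected: TreeNode
            }
        }
    }
}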

Maybe you never noticed this, or maybe you knew it and never said it; either way, I found it and had to share it with you!

If you have any questions, you can leave a message.