Preface

Data structure categories include: linear lists, linked lists, queues, stacks, hash tables, trees, graphs, and so on. The related algorithms include binary search, sorting, indexing, and greedy algorithms (you can refer to online materials or a dedicated data structures and algorithms course). This chapter is an introduction covering linear lists, linked lists, and hash tables.

What is performance tuning?

Intuitively, good performance means smooth interaction — in particular, a good user experience even on devices with weaker hardware. What are the concrete criteria?

  • 1. Fluency: fast startup, fast page rendering, quick response to input
  • 2. Stability: no ANRs and no crashes
  • 3. Resource efficiency: small memory footprint, small APK size, low battery and network usage

How can data structures be optimized for performance?

1. Scenario considerations

We have various table structures for data storage. How do we choose?

  • 1. Arrays are physically contiguous and fixed in size. These two properties make insertion and removal expensive, but lookup and update fast.
  • 2. Sequential lists (such as ArrayList) are physically contiguous, logically contiguous, and dynamically sized. Insertion and removal are costly (an entire block of elements must be shifted), but lookup and update are fast.
  • 3. LinkedList is physically non-contiguous but logically contiguous. Nodes can be added or removed dynamically by relinking, without moving other elements.
  • 4. A hash table (array + linked list) combines the advantages of both arrays and linked lists.
  • 5. SparseArray combines two parallel arrays: keys in one array, values in the other, in one-to-one correspondence. Removing an entry only marks the slot as deleted rather than compacting the arrays immediately, so the slot can be reused by a later insertion.
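The ArrayList-vs-LinkedList trade-off above can be seen directly: inserting at the head forces ArrayList to shift its whole backing array on every call, while LinkedList only relinks two nodes. A minimal sketch (the class name `InsertDemo` and the element count are illustrative, not from the source):

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

class InsertDemo {
    // Insert n elements at the head of the given list and return elapsed time.
    // Head insertion makes ArrayList call System.arraycopy each time (O(n) per
    // insert), while LinkedList just links a new first node (O(1) per insert).
    static long headInsert(List<Integer> list, int n) {
        long start = System.nanoTime();
        for (int i = 0; i < n; i++) {
            list.add(0, i);
        }
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        int n = 20_000;
        long arrayListNs = headInsert(new ArrayList<>(), n);
        long linkedListNs = headInsert(new LinkedList<>(), n);
        System.out.println("ArrayList  head-insert: " + arrayListNs / 1_000_000 + " ms");
        System.out.println("LinkedList head-insert: " + linkedListNs / 1_000_000 + " ms");
    }
}
```

The absolute timings depend on the device, but the gap widens as n grows, which is what the scenario analysis above predicts.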

2. Structural analysis

1. ArrayList

Core methods: add and remove. The relevant source code:

```java
public boolean add(E e) {
    ensureCapacityInternal(size + 1);  // Increments modCount!!
    elementData[size++] = e;           // the element is appended directly at the tail
    return true;
}

public void add(int index, E element) {
    if (index > size || index < 0)
        throw new IndexOutOfBoundsException(outOfBoundsMsg(index));
    ensureCapacityInternal(size + 1);  // Increments modCount!!
    // Shift every element after index one slot to the right.
    // Arguments: source array, source index, target array, target index, length to copy
    System.arraycopy(elementData, index, elementData, index + 1, size - index);
    elementData[index] = element;
    size++;
}

private void grow(int minCapacity) {
    // overflow-conscious code
    int oldCapacity = elementData.length;
    int newCapacity = oldCapacity + (oldCapacity >> 1);  // grow by 1.5x
    if (newCapacity - minCapacity < 0)
        newCapacity = minCapacity;
    if (newCapacity - MAX_ARRAY_SIZE > 0)
        newCapacity = hugeCapacity(minCapacity);
    // minCapacity is usually close to size, so this is a win:
    elementData = Arrays.copyOf(elementData, newCapacity);
}
```
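Since grow() copies the entire backing array each time capacity runs out, an easy optimization when the final size is known is to pre-size the list and skip the regrowth cycle entirely. A small sketch (the class name `PresizeDemo` is illustrative):

```java
import java.util.ArrayList;
import java.util.List;

class PresizeDemo {
    // Passing the expected size to the constructor allocates the backing
    // array once, so ensureCapacityInternal never has to call grow() and
    // Arrays.copyOf is never invoked during the fill loop.
    static List<Integer> fill(int n) {
        List<Integer> list = new ArrayList<>(n); // one allocation, no regrowth
        for (int i = 0; i < n; i++) {
            list.add(i);
        }
        return list;
    }

    public static void main(String[] args) {
        System.out.println(fill(1000).size()); // prints 1000
    }
}
```

With the default constructor, filling 1000 elements would trigger several grow() calls (10 → 15 → 22 → …), each one a full array copy.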

2. LinkedList

Core methods: add and remove. The relevant source code:

```java
public boolean add(E e) {
    linkLast(e);
    return true;
}

public void add(int index, E element) {
    checkPositionIndex(index);
    if (index == size)
        linkLast(element);
    else
        linkBefore(element, node(index));
}

void linkBefore(E e, Node<E> succ) {
    // assert succ != null;
    final Node<E> pred = succ.prev;
    final Node<E> newNode = new Node<>(pred, e, succ);
    succ.prev = newNode;
    if (pred == null)
        first = newNode;
    else
        pred.next = newNode;
    size++;
    modCount++;
}
```
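Note that add(int index, E element) calls node(index), which walks the links from the head or tail to find the insertion point. The same walk happens on every get(i), so index-based loops over a LinkedList are O(n²); iterating walks the links once. A sketch of the difference (the class name `TraverseDemo` is illustrative):

```java
import java.util.LinkedList;
import java.util.List;

class TraverseDemo {
    // get(i) on a LinkedList calls node(i), which traverses from the head
    // (or tail) each time: O(n) per call, O(n^2) for the whole loop.
    static long sumByIndex(List<Integer> list) {
        long sum = 0;
        for (int i = 0; i < list.size(); i++) sum += list.get(i);
        return sum;
    }

    // The enhanced for-loop uses the list's iterator, which follows the
    // next pointers once: O(n) total.
    static long sumByIterator(List<Integer> list) {
        long sum = 0;
        for (int v : list) sum += v;
        return sum;
    }

    public static void main(String[] args) {
        List<Integer> list = new LinkedList<>();
        for (int i = 1; i <= 100; i++) list.add(i);
        System.out.println(sumByIndex(list));    // 5050
        System.out.println(sumByIterator(list)); // 5050
    }
}
```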

3. HashMap

HashMap uses a load factor of 0.75 and must handle hash collisions. When a key is put into a bucket that already contains that key, the new value replaces the old one. As for the usage scenarios and differences among HashMap, LinkedHashMap, ConcurrentHashMap, and Hashtable: ConcurrentHashMap is thread-safe and its core uses CAS optimistic locking; Hashtable is also thread-safe but performs poorly because it uses a pessimistic lock, and neither its keys nor its values may be null. The put method:

```java
public V put(K key, V value) {
    return putVal(hash(key), key, value, false, true);
}

// Every object has an int hashCode; XOR its high 16 bits into the low
// bits so both halves influence the bucket index
static final int hash(Object key) {
    int h;
    return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}

final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) {
    Node<K,V>[] tab; Node<K,V> p; int n, i;
    if ((tab = table) == null || (n = tab.length) == 0)
        n = (tab = resize()).length;
    if ((p = tab[i = (n - 1) & hash]) == null)
        tab[i] = newNode(hash, key, value, null);
    else {
        Node<K,V> e; K k;
        if (p.hash == hash &&
            ((k = p.key) == key || (key != null && key.equals(k))))
            e = p;
        else if (p instanceof TreeNode)
            e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
        else {
            for (int binCount = 0; ; ++binCount) {
                if ((e = p.next) == null) {
                    p.next = newNode(hash, key, value, null);
                    if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                        treeifyBin(tab, hash);
                    break;
                }
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    break;
                p = e;
            }
        }
        if (e != null) { // existing mapping for key
            V oldValue = e.value;
            if (!onlyIfAbsent || oldValue == null)
                e.value = value;
            afterNodeAccess(e);
            return oldValue;
        }
    }
    ++modCount;
    if (++size > threshold)
        resize();
    afterNodeInsertion(evict);
    return null;
}
```

4. SparseArray
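The real SparseArray lives in android.util and is not available off-device, so here is a minimal plain-Java sketch of its core idea as described above: sorted int keys in one array, values in a parallel array, lookup by binary search, and no per-entry boxing or Entry objects (this sketch omits the delayed-deletion marking that the Android class performs on remove):

```java
import java.util.Arrays;

// Illustrative sketch of the two-parallel-array structure; the class name
// IntSparseArray and all member names are assumptions, not Android APIs.
class IntSparseArray<E> {
    private int[] keys = new int[8];
    private Object[] values = new Object[8];
    private int size;

    @SuppressWarnings("unchecked")
    public E get(int key) {
        // Keys are kept sorted, so lookup is O(log n) with no hashing.
        int i = Arrays.binarySearch(keys, 0, size, key);
        return i >= 0 ? (E) values[i] : null;
    }

    public void put(int key, E value) {
        int i = Arrays.binarySearch(keys, 0, size, key);
        if (i >= 0) {
            values[i] = value;      // existing key: overwrite in place
        } else {
            i = ~i;                 // binarySearch encodes the insertion point
            if (size == keys.length) {
                keys = Arrays.copyOf(keys, size * 2);
                values = Arrays.copyOf(values, size * 2);
            }
            // Shift both arrays in lockstep to keep key[i] paired with value[i]
            System.arraycopy(keys, i, keys, i + 1, size - i);
            System.arraycopy(values, i, values, i + 1, size - i);
            keys[i] = key;
            values[i] = value;
            size++;
        }
    }

    public int size() {
        return size;
    }
}
```

Compared with a HashMap<Integer, E>, this layout avoids autoboxing the int keys and the per-entry Node allocations, at the cost of O(log n) lookup and array shifting on insert — which is why SparseArray suits small maps with int keys on Android.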