This is the 14th day of my participation in the November Gengwen Challenge. Check out the event details: The Last Gengwen Challenge of 2021

Today I worked through two interview questions that are fairly common and classic.

1. Talk about dropout

Goal of Dropout: mitigate overfitting and improve the model's ability to generalize

The principle of Dropout can be summarized as follows: during forward propagation, each neuron's activation is set to zero with some probability p. This prevents the network from becoming overly dependent on particular local features, making the model generalize better.

p is the drop probability, i.e. the probability that a neuron is deactivated. To keep the expected activations consistent between training and testing, there are two modes: upscale_in_train and downscale_in_infer.

```
upscale_in_train (scale up during training):
    train:     out = input * mask / (1.0 - p)
    inference: out = input

downscale_in_infer (scale down at inference):
    train:     out = input * mask
    inference: out = input * (1.0 - p)
```

For example, suppose a layer has 10 neurons and the drop probability is set to 0.2 during training: on average only 8 neurons stay active in each forward pass, whereas at test time Dropout is not applied and all 10 are active. To keep the expected output of the layer the same in both phases, you can either scale the activations of the retained neurons by 1/(1-0.2) = 1.25 during training (upscale_in_train), or multiply all activations by 0.8 at test time (downscale_in_infer), since 10 * 0.8 = 8.
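The two modes above can be sketched in a few lines of plain Python. This is a minimal illustration, not a framework implementation; the function names `dropout_train` and `dropout_infer` are made up for this example:

```python
import random

def dropout_train(x, p, mode="upscale_in_train", rng=None):
    """Training-time dropout: keep each element with probability 1 - p."""
    rng = rng or random.Random(0)
    mask = [1.0 if rng.random() >= p else 0.0 for _ in x]
    if mode == "upscale_in_train":
        # Scale up retained activations so the expected output equals x.
        return [xi * mi / (1.0 - p) for xi, mi in zip(x, mask)]
    # downscale_in_infer: no scaling during training.
    return [xi * mi for xi, mi in zip(x, mask)]

def dropout_infer(x, p, mode="upscale_in_train"):
    """Inference-time behavior of the two modes."""
    if mode == "upscale_in_train":
        return list(x)              # identity: scaling was done at train time
    return [xi * (1.0 - p) for xi in x]  # downscale_in_infer scales at test time
```

With p = 0.2, `dropout_train` in upscale_in_train mode keeps roughly 80% of the elements and divides them by 0.8, so the mean output stays close to the mean input, which is exactly why the two phases end up consistent.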

2. Algorithm: determine whether it is a balanced binary tree

Definition of a balanced binary tree: for every node, the heights of its left and right subtrees differ by at most 1.

A tree is balanced when its left and right subtrees are both balanced and their heights differ by at most 1, so we recursively compute the height of each node's left and right subtrees and check that the difference does not exceed 1. The reference code is as follows:

```python
# Definition for a binary tree node.
# class TreeNode:
#     def __init__(self, val=0, left=None, right=None):
#         self.val = val
#         self.left = left
#         self.right = right

class Solution:
    def isBalanced(self, root: TreeNode) -> bool:
        if not root:
            return True
        # Balanced iff the two subtree heights differ by at most 1
        # and both subtrees are themselves balanced.
        return (abs(self.getHeight(root.right) - self.getHeight(root.left)) <= 1
                and self.isBalanced(root.left)
                and self.isBalanced(root.right))

    def getHeight(self, root: TreeNode) -> int:
        if not root:
            return 0
        return max(self.getHeight(root.left), self.getHeight(root.right)) + 1
```
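One thing worth knowing for the follow-up question: the top-down recursion recomputes subtree heights at every node, giving O(n log n) to O(n²) time. A common bottom-up variant computes height and checks balance in a single pass, using -1 as a sentinel for "already unbalanced". A sketch (standalone function rather than the Solution class above):

```python
class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def is_balanced(root: TreeNode) -> bool:
    # Returns the subtree height, or -1 as soon as an imbalance is found.
    def height(node):
        if not node:
            return 0
        lh = height(node.left)
        if lh == -1:
            return -1           # left subtree already unbalanced: stop early
        rh = height(node.right)
        if rh == -1:
            return -1
        if abs(lh - rh) > 1:
            return -1           # this node violates the balance condition
        return max(lh, rh) + 1

    return height(root) != -1
```

Because each node is visited exactly once, this version runs in O(n) time.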