This is the seventh day of my participation in the August More Text Challenge.

1. Programming question: search a rotated sorted array for a target element

Ideas:

    1. Brute force: iterate through the array and check whether the target exists.
    2. Binary search: after rotation, at least one half of the array around the midpoint is still sorted, so binary search can still be applied.

Reference code:

from typing import List

class Solution:
    def search(self, nums: List[int], target: int) -> int:
        # One-liner alternative: return nums.index(target) if target in nums else -1
        left, right = 0, len(nums) - 1
        while left <= right:
            medium = (left + right) // 2
            if nums[medium] == target:
                return medium
            if nums[left] <= nums[medium]:  # left half [left, medium] is sorted
                if nums[left] <= target < nums[medium]:
                    right = medium - 1
                else:
                    left = medium + 1
            else:                           # right half [medium, right] is sorted
                if nums[medium] < target <= nums[right]:
                    left = medium + 1
                else:
                    right = medium - 1
        return -1
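A quick sanity check of the idea, as a standalone function so it runs on its own (the example arrays are the usual LeetCode-style rotated arrays, chosen here for illustration):

```python
from typing import List

def search(nums: List[int], target: int) -> int:
    # Binary search on a rotated sorted array: at each step,
    # at least one half around the midpoint is sorted.
    left, right = 0, len(nums) - 1
    while left <= right:
        mid = (left + right) // 2
        if nums[mid] == target:
            return mid
        if nums[left] <= nums[mid]:               # left half is sorted
            if nums[left] <= target < nums[mid]:
                right = mid - 1
            else:
                left = mid + 1
        else:                                     # right half is sorted
            if nums[mid] < target <= nums[right]:
                left = mid + 1
            else:
                right = mid - 1
    return -1

print(search([4, 5, 6, 7, 0, 1, 2], 0))  # 4
print(search([4, 5, 6, 7, 0, 1, 2], 3))  # -1
```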

2. Implement cosine similarity

Cosine similarity judges how similar two vectors are by the angle between them:

      • The larger the angle between two vectors, the farther apart they are; the maximum possible angle is 180 degrees.
      • The smaller the angle, the closer they are; the minimum is 0 degrees, where the two vectors coincide exactly.

So the greater the cosine similarity, the more similar the vectors are.

Calculation formula:

\cos (\theta)=\frac{a \cdot b}{\|a\|\|b\|}=\frac{\sum_{i=1}^{n} a_{i} \times b_{i}}{\sqrt{\sum_{i=1}^{n}\left(a_{i}\right)^{2}} \times \sqrt{\sum_{i=1}^{n}\left(b_{i}\right)^{2}}}

Method for calculating cosine similarity:

    • NumPy:
import numpy as np

def cal_cosineSimilarity_numpy(a, b):
    dot = a * b                           # element-wise product
    a_len = np.linalg.norm(a, axis=1)     # row-wise L2 norms
    b_len = np.linalg.norm(b, axis=1)
    cosineSimilarity = dot.sum(axis=1) / (a_len * b_len)
    return cosineSimilarity
    • PyTorch
import torch

def methodOne(a, b):
    dot = torch.mul(a, b)            # element-wise product
    a_len = torch.norm(a, dim=1)     # row-wise L2 norms
    b_len = torch.norm(b, dim=1)
    return dot.sum(dim=1) / (a_len * b_len)

# Get the similarity directly with the built-in
def methodTwo(a, b):
    return torch.cosine_similarity(a, b, dim=1)
    • Sklearn
from sklearn.metrics.pairwise import cosine_similarity

def cal_cosineSimilarity_sklearn(a, b):
    cosineSimilarity = cosine_similarity(a, b)
    return cosineSimilarity
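All three routes compute the same formula. A NumPy-only sanity check of the formula itself (self-contained, so it runs without PyTorch or scikit-learn installed; the sample vectors are made up for illustration):

```python
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Row-wise cosine similarity for 2-D arrays of shape (n, d)."""
    dot = (a * b).sum(axis=1)  # row-wise dot products
    return dot / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))

a = np.array([[1.0, 2.0, 3.0]])
b = np.array([[4.0, 5.0, 6.0]])
print(cosine_sim(a, b))  # approximately [0.9746]
```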

3. Validate a binary search tree (BST)

Binary search trees have the following characteristics:

      • The left subtree of a node contains only values smaller than the node's value.
      • The right subtree of a node contains only values greater than the node's value.
      • Both the left and right subtrees must themselves be binary search trees.

Ideas:

By the definition of a binary search tree: if a node's left subtree is non-empty, all values in the left subtree are less than the node's value; if its right subtree is non-empty, all values in the right subtree are greater than the node's value; and both subtrees must themselves be binary search trees.

Design a recursive function check(root, max_num, min_num) that examines the subtree rooted at root and determines whether every node's value lies strictly inside the open interval (min_num, max_num). If the current node's value falls outside the interval, return False; otherwise recursively check both subtrees with tightened bounds.

Note: When recursively calling the left subtree, you need to change the upper bound max_num to root.val; To recursively call the right subtree, you need to change the lower bound min_num to root.val.

The entry point of the recursion is check(root, float("inf"), -float("inf")), where float("inf") represents positive infinity.

Reference code:

# Definition for a binary tree node.
# class TreeNode:
#     def __init__(self, val=0, left=None, right=None):
#         self.val = val
#         self.left = left
#         self.right = right
class Solution:
    def isValidBST(self, root: TreeNode) -> bool:
        return  self.valTree(root, float("inf"), -float("inf"))
    def valTree(self, root: TreeNode, max_num, min_num):
        if not root: return True
        val = root.val 
        if val >= max_num or val <= min_num: return False
        if not self.valTree(root.left, val, min_num): return False
        if not self.valTree(root.right, max_num, val):return False
        return True
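The recursion can be exercised with a small self-contained example; TreeNode is defined inline so the snippet runs on its own, and the invalid tree is the classic [5,1,4,null,null,3,6] case, where the 3 in the right subtree violates the root's lower bound:

```python
class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def is_valid_bst(root) -> bool:
    # Every node must lie strictly inside an open value interval.
    def check(node, low, high):
        if not node:
            return True
        if not (low < node.val < high):
            return False
        return check(node.left, low, node.val) and check(node.right, node.val, high)
    return check(root, float("-inf"), float("inf"))

valid = TreeNode(2, TreeNode(1), TreeNode(3))
invalid = TreeNode(5, TreeNode(1), TreeNode(4, TreeNode(3), TreeNode(6)))
print(is_valid_bst(valid), is_valid_bst(invalid))  # True False
```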

4. Implement randomInt(7) using randomInt(5) (the idea is enough)

randomInt(5): generates an integer in [1, 2, 3, 4, 5] with equal probability

randomInt(7): generates an integer in [1, 2, 3, 4, 5, 6, 7] with equal probability

Ideas:

      • (randomInt(5) - 1) produces [0, 1, 2, 3, 4] with equal probability;
      • (randomInt(5) - 1) * 5 produces [0, 5, 10, 15, 20] with equal probability;
      • Adding the two produces every integer in [0, 1, 2, ..., 24] with equal probability (1/25 each);

(The multiplier must be 5: with 2, 3, or 4 instead, some sums would collide, so the result would no longer be uniform.)

      • Keep only the 21 values in [0, 20], rejecting 21–24 and retrying; since 21 is a multiple of 7, tmp_num % 7 + 1 is uniform over [1, 2, 3, 4, 5, 6, 7].

Reference code:

import random

def rand5():
    return random.randint(1, 5)

def rand7():
    while True:
        tmp_num = (rand5() - 1) * 5 + (rand5() - 1)  # uniform over [0, 24]
        if tmp_num <= 20:                            # reject 21..24 and retry
            break
    return tmp_num % 7 + 1

# Empirical frequency check
ls = []
for _ in range(10000):
    ls.append(rand7())
for i in range(1, 8):
    print(ls.count(i) / len(ls), end=" ")
# out: 0.144 0.1459 0.14 0.1438 0.1444 0.1373 0.1446

5. Programming question: partition a linked list

Split the list:

Given the head node of a linked list and a value x, partition the list so that all nodes with values less than x come before all nodes with values greater than or equal to x, preserving the original relative order of the nodes within each partition.


Ideas:

Maintain two linked lists, small and large: small collects, in order, all nodes with values less than x, and large collects, in order, all nodes with values greater than or equal to x. After traversing the original list, terminate the large list and point the tail of the small list at the head of the large list to complete the partition.

Reference code:

# Definition for singly-linked list.
# class ListNode:
#     def __init__(self, val=0, next=None):
#         self.val = val
#         self.next = next
class Solution:
    def partition(self, head: ListNode, x: int) -> ListNode:
        small = ListNode()
        large = ListNode()
        smallHead, largeHead = small, large
        while head:
            if head.val < x:
                small.next = head
                small = small.next
            else:
                large.next = head
                large = large.next
            head = head.next
        large.next = None
        small.next = largeHead.next
        return smallHead.next
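A self-contained round trip of the approach above; from_list and to_list are illustrative helpers for building and reading a list, not part of the LeetCode interface:

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def partition(head, x):
    small, large = ListNode(), ListNode()          # dummy heads
    small_head, large_head = small, large
    while head:
        if head.val < x:
            small.next = head
            small = small.next
        else:
            large.next = head
            large = large.next
        head = head.next
    large.next = None                 # cut any leftover tail
    small.next = large_head.next      # splice the two partitions
    return small_head.next

def from_list(vals):
    head = None
    for v in reversed(vals):
        head = ListNode(v, head)
    return head

def to_list(node):
    out = []
    while node:
        out.append(node.val)
        node = node.next
    return out

print(to_list(partition(from_list([1, 4, 3, 2, 5, 2]), 3)))  # [1, 2, 2, 4, 3, 5]
```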

6. How to mitigate overfitting? What are common image augmentation methods?

Common ways to alleviate overfitting:

      • Reduce model complexity
      • Add more training data: train the model on a larger dataset
      • Data augmentation
      • Regularization: L1, L2, adding BN layers
      • Adding a Dropout strategy
      • Early stopping
      • Re-clean the data: remove obviously abnormal samples
      • Use ensemble learning: combine multiple models to reduce the overfitting risk of any single model

Common data augmentation methods:

      • Flip horizontally/vertically
      • Random rotation
      • Random scaling
      • Random shear
      • Color, contrast enhancement
      • CutOut
      • CutMix
      • Mixup
      • Mosaic
      • Random Erasing
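Two of the augmentations above (horizontal flip and Random Erasing) can be sketched in plain NumPy. This is a simplified illustration on an (H, W, C) array, not any particular library's API; real Random Erasing also randomizes the patch size and fill value:

```python
import numpy as np

rng = np.random.default_rng(0)

def horizontal_flip(img: np.ndarray) -> np.ndarray:
    # Reverse the width axis of an (H, W, C) image.
    return img[:, ::-1, :]

def random_erasing(img: np.ndarray, h: int = 8, w: int = 8) -> np.ndarray:
    # Zero out a randomly placed h x w patch (simplified Random Erasing).
    out = img.copy()
    top = rng.integers(0, img.shape[0] - h + 1)
    left = rng.integers(0, img.shape[1] - w + 1)
    out[top:top + h, left:left + w, :] = 0
    return out

img = rng.integers(1, 256, size=(32, 32, 3))   # dummy image, all pixels nonzero
aug = random_erasing(horizontal_flip(img))
print(aug.shape)  # (32, 32, 3)
```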

7. What are the gradient descent methods?

There are three kinds of gradient descent algorithms as follows:

      • Stochastic gradient descent method: SGD
      • Batch gradient descent method: BGD
      • Mini-batch gradient descent method: MBGD
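The three variants differ only in how many samples feed each gradient step: all of them (BGD), a single one (SGD), or a small batch (MBGD). A minimal NumPy sketch of mini-batch gradient descent on least-squares linear regression, with made-up synthetic data for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)

w = np.zeros(3)
lr, batch_size = 0.1, 32          # batch_size=1 -> SGD, batch_size=len(X) -> BGD
for epoch in range(100):
    idx = rng.permutation(len(X))             # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        b = idx[start:start + batch_size]
        grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)  # MSE gradient on the batch
        w -= lr * grad

print(np.round(w, 2))  # close to [1.0, -2.0, 0.5]
```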

8. What are the properties of Sigmoid? What activation functions are there?

Sigmoid function properties:

      • Domain: (−∞, +∞);
      • Range: (0, 1);
      • The function is continuous and smooth over its entire domain;
      • It is differentiable everywhere, and its derivative is:

\sigma^{\prime}(x)=\sigma(x)(1-\sigma(x))

      • The output lies between 0 and 1 and is symmetric about the point (0, 0.5); the slope is steepest near x = 0 and flattens toward the tails.
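These properties are easy to verify numerically; a self-contained NumPy check that compares the derivative identity σ'(x) = σ(x)(1 − σ(x)) against a central finite-difference estimate:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-10, 10, 1001)
s = sigmoid(x)

assert np.all((s > 0) & (s < 1))        # range is (0, 1)
assert np.isclose(sigmoid(0.0), 0.5)    # symmetric about (0, 0.5)

# Derivative identity vs. finite differences
h = 1e-5
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
analytic = s * (1 - s)
assert np.allclose(numeric, analytic, atol=1e-8)
print("all sigmoid properties verified")
```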

Common activation functions:

      • Sigmoid
      • Tanh
      • ReLU, Leaky ReLU, PReLU (Parametric ReLU)
      • ELU, GELU
      • Swish
      • SELU
