Hi, everyone. I'm Old Three. I recently quit my job without another one lined up.

Two days ago, I had an interview that ended after only ten minutes.

Here’s what happened:

Interviewer: Can you talk about the data structure of HashMap?

Old Three: Array + linked list + red-black tree, blah blah blah…

Interviewer: What is the search complexity of a red-black tree?

Old Three: O(log n).

Interviewer: What's the base of that logarithm?

Old Three: Wait… O(log n) has a base?

Interviewer: Doesn't it?

Embarrassed to death…

Interviewer: What about the time complexity of quicksort? What’s the base?

Old Three smiled awkwardly…

Interviewer: Well, I have no further questions. That’s all for this interview.


After the interview, Old Three couldn't calm down, and hurried off to look it up.

O(log n) does have a base!

Take a look at the definition of time complexity:

In algorithm analysis, the total number of statements executed, T(n), is a function of the problem size n. We analyze how T(n) changes as n grows and determine the order of magnitude of T(n). The time complexity of an algorithm, i.e. its time measure, is written T(n) = O(f(n)). It means that as the problem size n increases, the algorithm's running time grows at the same rate as f(n), where f(n) is some function of the problem size n. This is called the asymptotic time complexity of the algorithm, or time complexity for short.

It's a little abstract, right? Let's work through an example to see what it means.

int n = 10;
int count = 1;
while (count < n) {
    count = count * 2;
    System.out.println(count);
}

Let's see: each pass through the loop multiplies count by 2, bringing it one step closer to n. In other words, if the loop runs x times before count reaches n, then 2^x = n, which gives x = log₂(n). So this loop's time complexity is O(log₂n), and the base is 2.
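To sanity-check that, here's a minimal sketch of my own (the class and method names are invented for illustration) that counts the loop's iterations for a few values of n and compares them with log₂n:

public class LogBaseDemo {
    // Count how many doublings it takes for count to reach n.
    static int doublings(int n) {
        int count = 1, steps = 0;
        while (count < n) {
            count *= 2;
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 1_000, 1_000_000}) {
            // Math.log is the natural log, so divide to get log base 2.
            double log2n = Math.log(n) / Math.log(2);
            System.out.printf("n=%-9d doublings=%-3d log2(n)=%.2f%n",
                    n, doublings(n), log2n);
        }
    }
}

The iteration count is always ⌈log₂n⌉: exactly the logarithmic growth the definition describes.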

Mystery solved: O(log n) does have a base.

What determines this base?

Logarithmic time complexity in algorithms almost always comes from the divide-and-conquer idea, and the base is determined by how many pieces the problem is split into at each step. Splitting in half gives base 2, splitting into thirds gives base 3, and so on.
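Binary search is the classic case: the search range is halved each iteration, so it takes about log₂n steps to shrink n elements down to one. A minimal sketch of my own, assuming nothing beyond standard Java:

public class BinarySearchDemo {
    // Each iteration halves [lo, hi], so the loop runs about log2(n) times.
    static int binarySearch(int[] sorted, int target) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2; // avoids overflow of (lo + hi)
            if (sorted[mid] == target) return mid;
            if (sorted[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        int[] a = {1, 3, 5, 7, 9, 11};
        System.out.println(binarySearch(a, 7)); // prints 3
    }
}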

But the base of O(log n) doesn't actually matter much!

So the question is: why don't we ever write the base?

It can't just be because the base is too much trouble to write…

Recall the definition: T(n) = O(f(n)) means that as the problem size n grows, the algorithm's running time grows at the same rate as f(n). This is the asymptotic time complexity of the algorithm.

Suppose we want to compare the growth rates of two functions, f(n) and g(n). How do we do it?

You can use a limit, as in calculus. For two logarithms with bases a and b, the change-of-base formula gives:

    lim(n→∞) log_a(n) / log_b(n) = lim(n→∞) (ln n / ln a) / (ln n / ln b) = ln b / ln a

Old Three has long since forgotten his calculus, haha, so no rigorous derivation here; in short, the final result is a constant that does not depend on n.

That is, for large n, logarithmic functions with different bases differ only by a constant factor.
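A quick numeric check of that claim (a sketch of my own, using java.lang.Math): the ratio of log₂n to log₃n stays pinned at the same constant no matter how large n grows:

public class LogRatioDemo {
    public static void main(String[] args) {
        // Compare log base 2 and log base 3 as n grows.
        for (long n = 10; n <= 10_000_000_000L; n *= 100) {
            double log2 = Math.log(n) / Math.log(2);
            double log3 = Math.log(n) / Math.log(3);
            // The ratio is always ln(3)/ln(2) ≈ 1.585, regardless of n.
            System.out.printf("n=%-12d log2(n)/log3(n) = %.4f%n", n, log2 / log3);
        }
    }
}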

So no matter what the base is, logarithmic-level complexity means the same thing asymptotically: the trend of the algorithm's running time as the data size grows is the same.

So O(log n) is enough to represent a logarithm of any base.


It took an hour, but my stock of useless knowledge has grown again.

To summarize: O(log n) does have a base, but you don't need to write it out.

