Meaning of time complexity

What exactly is time complexity? Let’s imagine a scenario:

One day, Xiao Hui and Da Huang joined a company……

A day later, Xiao Hui and Da Huang each delivered their code; both programs did the same thing.

Da Huang's code takes 100 milliseconds to run and takes up 5 MB of memory.

Xiao Hui's code takes 100 seconds to run and takes up 500 MB of memory.

So…

As you can see, there are two very important metrics for measuring the quality of code:

1. Running time

2. Memory usage

Number of basic operations performed

To illustrate the number of times a basic operation is performed, let’s use four real-life scenarios:

Scene 1. Give Xiao Hui a loaf of bread 10 inches long. Xiao Hui eats 1 inch every 3 days; how many days does it take to eat the whole loaf?

The answer, of course, is 3 × 10 = 30 days.

What if the bread is N inches long?

At this point, it takes 3 × n = 3n days to eat the whole loaf.

If we express this relative time as a function, we can call it T(n) = 3n.

Scene 2. Xiao Hui is given a loaf of bread 16 inches long. Every 5 days, Xiao Hui eats half of what remains. How many days does it take for Xiao Hui to eat the bread down to 1 inch?

How many times do you divide 16 by 2 to get 1? This is the logarithm from mathematics: the base-2 logarithm of 16, written log₂16, which equals 4.

So it takes 5 × log₂16 = 5 × 4 = 20 days to eat the bread down to 1 inch.

What if the bread is N inches long?

It takes 5 × log n = 5 log n days (in algorithm analysis, log conventionally means the base-2 logarithm), so let's call it T(n) = 5 log n.
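The halving count can be checked with a short sketch (the class and method names here are my own, not from the article):

```java
public class LogDays {
    // Count how many times n inches of bread can be halved
    // before only 1 inch remains, i.e. log2(n) for powers of two.
    static int halvings(int n) {
        int count = 0;
        while (n > 1) {
            n /= 2;
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 16;
        int days = 5 * halvings(n); // 5 days per halving
        System.out.println(days);   // 5 * log2(16) = 5 * 4 = 20
    }
}
```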

Scene 3. Give Xiao Hui a piece of bread 10 inches long and a chicken leg. It takes Xiao Hui 2 days to eat the chicken leg. How many days does it take Xiao Hui to eat the whole chicken leg?

The answer, of course, is 2 days, because eating the chicken leg has nothing to do with the 10-inch loaf of bread.

What if the bread is N inches long?

No matter how long the bread is, the time it takes to eat the chicken leg is still 2 days, which is T(n) = 2.

Scene 4. Give Xiao Hui a piece of bread 10 inches long. It takes Xiao Hui 1 day to eat the first inch, 2 days for the second inch, 3 days for the third inch…… Each additional inch takes one day longer than the inch before it. So how many days does it take Xiao Hui to eat the whole loaf?

The answer is the sum from 1 to 10, which is 55 days.

What if the bread is N inches long?

To eat the whole loaf, you need 1 + 2 + 3 + …… + (n-1) + n = (1+n)*n/2 = 0.5n^2 + 0.5n days.

Let's call it T(n) = 0.5n^2 + 0.5n.
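The closed-form sum can be sanity-checked against a plain loop (a small sketch; the names are mine):

```java
public class SumDays {
    // Total days to eat n inches when the i-th inch takes i days:
    // the loop computes 1 + 2 + ... + n directly.
    static int totalDays(int n) {
        int days = 0;
        for (int i = 1; i <= n; i++) {
            days += i;
        }
        return days;
    }

    public static void main(String[] args) {
        System.out.println(totalDays(10)); // 55, matching (1+10)*10/2
    }
}
```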

The same idea applies to the number of times a program’s basic operations are performed. The four scenarios above correspond to the four most common execution modes of programs:

Scenario 1: T(n) = 3n; the number of executions is linear.

```java
void eat1(int n){
    for(int i=0; i<n; i++){
        System.out.println("wait a day");
        System.out.println("wait a day");
        System.out.println("eat an inch of bread");
    }
}
```

Scenario 2: T(n) = 5 log n; the number of executions is logarithmic.

```java
void eat2(int n){
    for(int i=1; i<n; i*=2){
        System.out.println("wait a day");
        System.out.println("wait a day");
        System.out.println("wait a day");
        System.out.println("wait a day");
        System.out.println("eat half the bread");
    }
}
```

Scenario 3: T(n) = 2; the number of executions is constant.

```java
void eat3(int n){
    System.out.println("wait a day");
    System.out.println("eat a chicken leg");
}
```

Scenario 4: T(n) = 0.5n^2 + 0.5n; the number of executions is a polynomial (quadratic).

```java
void eat4(int n){
    for(int i=0; i<n; i++){
        for(int j=0; j<i; j++){
            System.out.println("wait a day");
        }
        System.out.println("eat an inch of bread");
    }
}
```

Asymptotic time complexity

Can we analyze and compare the running time of a piece of code directly with the function T(n), the number of basic operations performed? In practice, that is still difficult.

Suppose the relative time of algorithm A is T(n) = 100n and the relative time of algorithm B is T(n) = 5n^2. Which one runs longer? It depends on the value of n.
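We can see this dependence on n directly: 100n = 5n^2 when n = 20, so B is faster below that point and A is faster above it. A quick sketch (the names are my own framing of this comparison):

```java
public class Crossover {
    // Relative time functions of the two hypothetical algorithms.
    static long timeA(long n) { return 100 * n; }   // T(n) = 100n
    static long timeB(long n) { return 5 * n * n; } // T(n) = 5n^2

    public static void main(String[] args) {
        for (long n : new long[]{10, 20, 100}) {
            System.out.println("n=" + n
                + "  A=" + timeA(n)
                + "  B=" + timeB(n));
        }
        // At n=10, B (500) beats A (1000); at n=20 they tie (2000 each);
        // at n=100, A (10000) beats B (50000).
    }
}
```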

To estimate running time independent of such details, computer science uses the concept of asymptotic time complexity, defined as follows:

If there exists a function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a constant not equal to zero, then f(n) is said to be of the same order of magnitude as T(n).

This is denoted T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.

Because asymptotic time complexity is expressed with a capital O, it is also known as big O notation.

How do you derive the time complexity? There are a few principles:

  1. If the running time is of constant order of magnitude, represent it with the constant 1.
  2. Keep only the highest-order term in the time function.
  3. If a highest-order term exists, omit the coefficient in front of it.

Let’s go back to the last four scenarios.

Scenario 1:

T(n) = 3n

The highest-order term is 3n; omitting the coefficient 3, the time complexity is:

T(n) = O(n)

Scenario 2:

T(n) = 5 log n

The highest-order term is 5 log n; omitting the coefficient 5, the time complexity is:

T(n) = O(log n)

Scenario 3:

T(n) = 2

There is only a constant order of magnitude, so the time complexity is:

T(n) = O(1)

Scenario 4:

T(n) = 0.5n^2 + 0.5n

The highest-order term is 0.5n^2; omitting the coefficient 0.5, the time complexity is:

T(n) = O(n^2)

Which of these four time complexities takes longer, and which saves more time? A little thought leads to the conclusion:

O(1) < O(log n) < O(n) < O(n^2)

There are a variety of algorithms in the programming world, and in addition to the four scenarios above, there are many different forms of time complexity, such as:

O(n log n), O(n^3), O(m*n), O(2^n), O(n!), and so on.
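As one example from this list, O(n log n) arises when a linear outer loop wraps a logarithmic inner loop. A minimal sketch in the same spirit as the bread scenarios (the names are mine, not the article's):

```java
public class NLogN {
    // Counts the basic operations of a loop nest: the outer loop runs
    // n times, and each outer iteration runs the halving inner loop,
    // giving roughly n * log2(n) operations in total.
    static int countOps(int n) {
        int ops = 0;
        for (int i = 0; i < n; i++) {        // O(n) outer loop
            for (int j = n; j > 1; j /= 2) { // O(log n) inner loop
                ops++;
            }
        }
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(countOps(16)); // 16 * log2(16) = 64
    }
}
```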

As we travel through the ocean of code, we will continue to encounter such time-complexity algorithms.

Huge differences in time complexity

Let's look at an example:

The relative time scale of algorithm A is T(n) = 100n, with time complexity O(n).

The relative time scale of algorithm B is T(n) = 5n^2, with time complexity O(n^2).

Algorithm A runs on an old computer in Xiao Hui's home, while algorithm B runs on a supercomputer 100 times faster than the old computer.

So, as the input size n increases, which of the two algorithms will run faster?

As can be seen from the table, when the value of n is very small, the running time of algorithm A is much larger than that of algorithm B. When n reaches about 1000, the running times of the two algorithms are close. As n grows larger and larger, reaching one hundred thousand or one million, the advantage of algorithm A begins to show, while algorithm B becomes slower and slower and the gap grows more and more obvious.
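This comparison can be reproduced numerically: since the supercomputer is 100 times faster, algorithm B's effective time scale is 5n^2/100. A quick sketch under that assumption (names and framing are mine):

```java
public class Race {
    // Relative running times: A on the old machine, B on a machine
    // 100x faster, so B's relative time is divided by 100.
    static double timeA(double n) { return 100 * n; }
    static double timeB(double n) { return 5 * n * n / 100; }

    public static void main(String[] args) {
        for (double n : new double[]{100, 2000, 100000}) {
            System.out.printf("n=%.0f  A=%.0f  B=%.0f%n",
                n, timeA(n), timeB(n));
        }
        // For small n, B wins despite its worse complexity; with these
        // exact constants the two meet where 100n = 0.05n^2, i.e. n = 2000;
        // for large n, A pulls far ahead.
    }
}
```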

This is the difference in time complexity.

A few final points:

It was only while writing this popular-science article on time complexity that Xiao Hui realized that explaining basic concepts is much harder than explaining specific algorithms. Thank you for your valuable comments on this article!