
Revisiting Data Structures, Chapter 1: Introduction to Data Structures and Algorithms

King Wang

January 16, 2022
Preface

A while ago I bought two books: 《Big talk design patterns》 and 《Big talk data structure》. I have actually read both of them, but afterwards I was still confused; read quickly, forget quickly. As my teachers used to say, a good memory is no match for a worn pen. So I decided to read them again and take notes this time. The notes in this column come from 《Big talk data structure》.

Basic concepts and terms of data

Here I present them in chart form:

[Figure: basic concepts and terms of data]

Classification of logical structures:

[Figure: classification of logical structures]

Classification of physical structures:

[Figure: classification of physical structures]

Abstract data types

Data type:

A general term for a set of values with the same properties, together with the operations defined on that set.
In the C language, data is divided into two kinds according to its values:

  • Atomic types: basic types that cannot be decomposed further, including integer, real (floating-point), and character types.
  • Structured types: composed of several component types and can be decomposed further. For example, an integer array is composed of several integer elements.

Abstract data types

An abstract data type (Abstract Data Type, ADT) is a mathematical model together with a set of operations defined on that model. The definition of an abstract data type depends only on its logical characteristics; it has nothing to do with how it is represented or implemented inside the computer.

Summary:

[Figure: summary of abstract data types]

Algorithm

Definition

An algorithm is a description of the steps for solving a specific problem; in a computer it takes the form of a finite sequence of instructions, each of which represents one or more operations.

Characteristics of the algorithm

[Figure: characteristics of an algorithm]

Algorithm design requirements

[Figure: algorithm design requirements]

The measurement of the algorithm

[Figure: how algorithms are measured]

The asymptotic growth of a function

Given two functions f(n) and g(n), if there exists an integer N such that for all n > N, f(n) is always larger than g(n), then we say that f(n) grows asymptotically faster than g(n).

The time complexity of the algorithm

When analyzing an algorithm, the total number of statement executions T(n) is a function of the problem size n; we then analyze how T(n) changes with n and determine the order of magnitude of T(n). The time complexity of the algorithm, that is, its time measure, is written T(n) = O(f(n)). It means that as the problem size n increases, the growth rate of the algorithm's running time is the same as the growth rate of f(n); this is called the asymptotic time complexity of the algorithm, or time complexity for short. Here f(n) is some function of the problem size n.
Because a capital O is used to express the time complexity in this way, it is called big-O notation.

Deriving the big-O order

1. Replace all additive constants in the run count with the constant 1.
2. In the modified run-count function, keep only the highest-order term.
3. If the highest-order term exists and its coefficient is not 1, remove the coefficient. The result is the big-O order.

Constant order

int sum = 0, n = 100;   /* executes once */
sum = (1 + n) * n / 2;  /* executes once */
printf("%d", sum);      /* executes once */


This algorithm runs f(n) = 3 statements. Applying our big-O derivation method, the first step changes the constant 3 to 1. When we then look for a highest-order term to keep, there is none at all, so the time complexity of this algorithm is O(1).

If the statement sum = (1+n)*n/2 were repeated 10 times, as below:

int sum = 0, n = 100;   /* executes once (statement 1)  */
sum = (1 + n) * n / 2;  /* executes once (statement 2)  */
sum = (1 + n) * n / 2;  /* executes once (statement 3)  */
sum = (1 + n) * n / 2;  /* executes once (statement 4)  */
sum = (1 + n) * n / 2;  /* executes once (statement 5)  */
sum = (1 + n) * n / 2;  /* executes once (statement 6)  */
sum = (1 + n) * n / 2;  /* executes once (statement 7)  */
sum = (1 + n) * n / 2;  /* executes once (statement 8)  */
sum = (1 + n) * n / 2;  /* executes once (statement 9)  */
sum = (1 + n) * n / 2;  /* executes once (statement 10) */
sum = (1 + n) * n / 2;  /* executes once (statement 11) */
printf("%d", sum);      /* executes once (statement 12) */


In fact, no matter what n is, the two pieces of code above differ only in executing 3 statements versus 12. An algorithm whose running time is a constant, independent of the problem size n, is said to have O(1) time complexity, also called constant order.
Note: whatever the constant is, we always write O(1); it can never be O(3), O(12), or any other number. This is a mistake beginners often make.
Conclusion: when the number of executions is a constant, it is written O(1).

Linear order

Loop structures make linear-order analysis considerably more involved. To determine the order of an algorithm, we often need to determine how many times a particular statement or set of statements runs; so when analyzing an algorithm's complexity, the key is to analyze its loop structures.
In the following code, the time complexity of the loop is O(n), because the code in the loop body executes n times:

int i = 0;
for (i = 0; i < n; i++)
{
    /* a sequence of steps with O(1) time complexity */
}


The code in the loop body executes n times, so this is written O(n).

Logarithmic order

int count = 1;
while (count < n)
{
    count = count * 2;  /* a sequence of steps with O(1) time complexity */
}


Each time count is multiplied by 2, it gets one step closer to n. In other words, the loop exits once enough 2s have been multiplied together to reach or exceed n. From 2^x = n we get x = log₂ n, so the time complexity of this loop is O(log n).

Square order

int i, j;
for (i = 0; i < n; i++)
{
    for (j = 0; j < n; j++)
    {
        /* a sequence of steps with O(1) time complexity */
    }
}


The outer loop simply repeats, n times, an inner loop whose time complexity is O(n). So the time complexity of this code is O(n²).

Common time complexity

[Figure: common time complexities, from 《Big talk data structure》]

Worst case vs. average

Worst case

The worst-case running time is a kind of guarantee: the actual running time will never exceed it. In applications this is one of the most important requirements; usually, unless otherwise specified, the running times we discuss are worst-case running times.

Average case

The average running time is the most meaningful measure of all, because it is the expected running time: when we run a piece of program code, this is the time we expect to see. In practice, however, the average running time is difficult to obtain by analysis, and is usually estimated from a certain amount of experimental data.

The spatial complexity of the algorithm

The space complexity of an algorithm is measured by the storage space the algorithm requires. It is written S(n) = O(f(n)), where n is the problem size and f(n) is a function of n describing the storage occupied.
