“This is the fourth day of my participation in the Gwen Challenge in November. Check out the details: The Last Gwen Challenge in 2021.”

preface

Hello! Friend!!!

Thank you very much for reading Haihong's article. If there is any mistake in it, please point it out~

 

Self-introduction ଘ(੭, ᵕ)੭

Nickname: Haihong

Tag: programmer monkey | C++ contestant | student

Introduction: I got to know programming through the C language and then transferred to a computer major; I was fortunate to win some national and provincial awards… already confirmed. Currently learning C++/Linux/Python

Learning experience: solid foundation + more notes + more code + more thinking + learn English well!

 

Still at the beginner stage of machine learning

This article serves only as my own study notes, for building a knowledge system and for later review

Know what it is, and know why it is so!

6.4 Linear Transformation

Definition 4: Mapping

Let $A$ and $B$ be two non-empty sets. If for every element $\alpha$ in $A$ there is, by a certain rule, a uniquely determined element $\beta$ in $B$ corresponding to it,

then this rule is called a mapping from set $A$ to set $B$. Denote this mapping by $T$ and write


$$\beta=T(\alpha) \quad \text{or} \quad \beta=T\alpha \quad (\alpha \in A,\ \beta \in B)$$

Let $\alpha_1 \in A$ and $T(\alpha_1)=\beta_1$; this means that the mapping $T$ maps the element $\alpha_1$ to $\beta_1$.

  • $\beta_1$ is called the image of $\alpha_1$ under the mapping $T$
  • $\alpha_1$ is called the preimage (source) of $\beta_1$ under the mapping $T$
  • $A$ is called the source set of the mapping $T$
  • The set of all images is called the image set, denoted $T(A)$: $T(A)=\{\beta=T(\alpha) \mid \alpha\in A\}$, where $T(A) \subset B$
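
For a concrete illustration (my own example, not from the textbook): take $A=B=\mathbb{R}$ and the rule $T(\alpha)=\alpha^2$. Every real $\alpha$ has a definite image $\alpha^2$, so $T$ is a mapping from $\mathbb{R}$ to $\mathbb{R}$, and its image set is $T(\mathbb{R})=\{\beta \mid \beta\ge 0\}\subset\mathbb{R}$.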

Definition 5: Linear transformation

Let $V_n$ and $U_m$ be $n$-dimensional and $m$-dimensional linear spaces respectively, and let $T$ be a mapping from $V_n$ to $U_m$. If the mapping $T$ satisfies

(1) for any $\alpha_1,\alpha_2 \in V_n$: $T(\alpha_1+\alpha_2)=T(\alpha_1)+T(\alpha_2)$

(2) for any $\alpha \in V_n$ and $\lambda \in \mathbb{R}$: $T(\lambda\alpha)=\lambda T(\alpha)$

then $T$ is called a linear mapping (linear transformation) from $V_n$ to $U_m$
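
A quick illustration (my own example, not from the textbook): in $\mathbb{R}^3$, the projection $T(x_1,x_2,x_3)^T=(x_1,x_2,0)^T$ satisfies both conditions, so it is a linear mapping. By contrast, a translation $T(\alpha)=\alpha+\alpha_0$ with a fixed $\alpha_0\neq 0$ gives $T(\alpha_1+\alpha_2)=\alpha_1+\alpha_2+\alpha_0$ while $T(\alpha_1)+T(\alpha_2)=\alpha_1+\alpha_2+2\alpha_0$, so condition (1) fails and a translation is not linear.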


Linear transformations have some properties:

(1) $T(0)=0$, $T(-\alpha)=-T(\alpha)$

(2) if $\beta=k_1\alpha_1+k_2\alpha_2+\cdots+k_m\alpha_m$, then $T\beta=k_1T\alpha_1+k_2T\alpha_2+\cdots+k_mT\alpha_m$

(3) if $\alpha_1,\alpha_2,\ldots,\alpha_m$ are linearly dependent, then $T(\alpha_1),T(\alpha_2),\ldots,T(\alpha_m)$ are also linearly dependent

Note: if $\alpha_1,\alpha_2,\ldots,\alpha_m$ are linearly independent, then $T(\alpha_1),T(\alpha_2),\ldots,T(\alpha_m)$ are not necessarily linearly independent. For example, if $T$ maps every vector to the zero vector, then no matter which linearly independent $\alpha_1,\alpha_2,\ldots,\alpha_m$ we take, the images $T(\alpha_1),T(\alpha_2),\ldots,T(\alpha_m)$ are all zero vectors and hence linearly dependent rather than independent.
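
A minimal numerical sketch of this note (my own illustration, not from the textbook, assuming NumPy is available): a rank-deficient matrix map sends two linearly independent vectors to linearly dependent images.

```python
import numpy as np

# A rank-deficient matrix: its second row is a multiple of the first,
# so the induced map T(x) = A @ x collapses a direction.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Two linearly independent vectors in R^2
a1 = np.array([1.0, 0.0])
a2 = np.array([0.0, 1.0])
print(np.linalg.matrix_rank(np.column_stack([a1, a2])))          # 2 -> independent

# Their images under T are linearly dependent (rank 1)
print(np.linalg.matrix_rank(np.column_stack([A @ a1, A @ a2])))  # 1 -> dependent
```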

(4) The image set $T(V_n)$ of a linear transformation $T$ is a linear space, called the image space of the linear transformation $T$

Proof:

Let $\beta_1,\beta_2 \in T(V_n)$.

Then

$$\beta_1=T(\alpha_1),\quad \beta_2=T(\alpha_2),\quad \text{where } \alpha_1,\alpha_2 \in V_n$$

Closure under addition:


$$\beta_1+\beta_2=T(\alpha_1)+T(\alpha_2)=T(\alpha_1+\alpha_2)\in T(V_n)$$

Closure under scalar multiplication:

Let $\lambda \in \mathbb{R}$ and $\beta_1 \in T(V_n)$.

Then


$$\lambda\beta_1=\lambda T(\alpha_1)=T(\lambda\alpha_1)\in T(V_n)$$

The eight laws of the linear-space operations also hold; I won't go into the details here

In summary, the image set $T(V_n)$ of the linear transformation $T$ is a linear space

(5) The set $S_T=\{\alpha \mid \alpha\in V_n,\ T\alpha=0\}$ of all $\alpha$ such that $T\alpha=0$ is a linear space; $S_T$ is called the kernel of the linear transformation $T$

Proof:

Closure under addition:

Let $\alpha_1,\alpha_2\in S_T$; then we have


$$T(\alpha_1)=0,\quad T(\alpha_2)=0$$

then


$$T(\alpha_1+\alpha_2)=T(\alpha_1)+T(\alpha_2)=0+0=0$$

So $(\alpha_1+\alpha_2) \in S_T$

Closure under scalar multiplication:

Let $\lambda \in \mathbb{R}$ and $\alpha_1 \in S_T$; then


$$T(\lambda\alpha_1)=\lambda T(\alpha_1)=0$$

So $\lambda\alpha_1 \in S_T$

To sum up, $S_T$ is a linear space
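
A small numerical check of this closure argument (my own sketch, not from the textbook, assuming NumPy is available): take two vectors in the kernel of $T(x)=Ax$ for a singular $A$ and verify that their sum and a scalar multiple are still mapped to the zero vector.

```python
import numpy as np

# A singular matrix, so the kernel S_T contains non-zero vectors
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Two vectors from the kernel: A @ v = 0 (both are multiples of (2, -1))
v1 = np.array([ 2.0, -1.0])
v2 = np.array([-4.0,  2.0])
print(A @ v1, A @ v2)    # both print the zero vector

# Closure: the sum and any scalar multiple stay in the kernel
lam = 3.5
print(A @ (v1 + v2))     # zero vector
print(A @ (lam * v1))    # zero vector
```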

Examples

Example 10

Let the $n$-th order matrix $A$ be


$$A=\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix}=(a_1,a_2,\cdots,a_n)$$

where


$$a_{i}=\begin{bmatrix} a_{1i}\\ a_{2i}\\ \vdots\\ a_{ni} \end{bmatrix}$$

Define the transformation $y=T(x)$ in $\mathbb{R}^n$ by


$$T(x)=Ax \quad (x\in \mathbb{R}^n)$$

Show that $T$ is a linear transformation

Note: an element of $\mathbb{R}^n$ is an $n\times 1$ matrix (a column vector)


To show that $T$ is a linear transformation, we need to prove that $T(\alpha_1+\alpha_2)=T(\alpha_1)+T(\alpha_2)$ and $T(\lambda\alpha)=\lambda T(\alpha)$

Proof:

Let $a,b \in \mathbb{R}^n$ and $\lambda \in \mathbb{R}$; then


$$T(a+b)=A(a+b)=Aa+Ab=T(a)+T(b)$$


$$T(\lambda a)=A(\lambda a)=\lambda Aa=\lambda T(a)$$

So $T$ is a linear transformation
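
The proof can also be sanity-checked numerically. A minimal sketch (my own, not from the textbook, assuming NumPy is available) that verifies both linearity conditions for a concrete matrix and random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4
A = rng.standard_normal((n, n))   # an arbitrary n-by-n matrix
a = rng.standard_normal(n)
b = rng.standard_normal(n)
lam = 2.5

T = lambda x: A @ x               # the transformation T(x) = A x

# Condition (1): T(a + b) = T(a) + T(b)
print(np.allclose(T(a + b), T(a) + T(b)))    # True

# Condition (2): T(lambda * a) = lambda * T(a)
print(np.allclose(T(lam * a), lam * T(a)))   # True
```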

supplement


$$y=T(x)=Ax=(a_1,a_2,\cdots,a_n)\begin{bmatrix} x_1\\ x_2\\ \vdots\\ x_n \end{bmatrix}=x_1a_1+x_2a_2+\cdots+x_na_n$$

So we have


$$T(\mathbb{R}^n)=\{y=x_1a_1+x_2a_2+\cdots+x_na_n \mid x_1,x_2,\ldots,x_n\in\mathbb{R}\}$$

In other words:

  • The image space of the linear transformation $T$ is exactly the vector space spanned by $a_1,a_2,\ldots,a_n$
  • The kernel $S_T$ of $T$ is the solution space of the homogeneous linear system $Ax=0$ (see the sketch below)
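
A short numerical sketch of these two facts (my own illustration, not from the textbook, assuming NumPy is available): the image $Ax$ equals the column combination $x_1a_1+\cdots+x_na_n$, and a basis of the kernel can be read off from the right-singular vectors of $A$ whose singular values are (numerically) zero.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 4.0]])    # singular: column 3 = column 1 + column 2
x = np.array([2.0, -1.0, 0.5])

# T(x) = A x is the combination x_1*a_1 + x_2*a_2 + x_3*a_3 of the columns,
# so the image space is spanned by the columns a_1, ..., a_n
columns_combo = sum(x[i] * A[:, i] for i in range(3))
print(np.allclose(A @ x, columns_combo))   # True

# Kernel S_T = solution space of A x = 0: take the right-singular vectors
# belonging to (numerically) zero singular values
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[s < 1e-10]
print(np.allclose(A @ null_basis.T, 0))    # True: these vectors solve A x = 0
```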

conclusion

Description:

  • Reference: Linear Algebra, 5th edition, Department of Mathematics, Tongji University
  • The concepts from the book are explained together with some of my own understanding and thinking

This essay is just a study note, recording a process from 0 to 1

I hope it helps you; if there are any mistakes, corrections from fellow readers are welcome~

I am haihong ଘ(੭, ᵕ)੭

If you think it’s ok, please give it a thumbs up

Thanks for your support ❤️