"When you perform elementary row operations on the augmented matrix [A|In], you are essentially applying the same operations simultaneously to both A and In."

"From an algebraic perspective, this means you are left-multiplying the entire augmented matrix by the same sequence of elementary matrices:"

Ek⋯E2E1[A|In]

"According to the definition of matrix multiplication (specifically, how it applies to block matrices), this is equivalent to:"

[(Ek⋯E1)A | (Ek⋯E1)In]

How to prove this step:

This step is crucial for understanding why the augmented matrix method for finding the inverse works. It is not an application of the distributive property; rather, it follows directly from the definition of matrix multiplication applied blockwise (i.e., block matrix multiplication).

Let's first prove the case for a single elementary matrix E.

Proof of E[A|In]=[EA|EIn]:

Let A be an n×n matrix, and In be the n×n identity matrix.
The augmented matrix [A|In] is an n×(n+n), i.e., n×(2n), matrix.
Let E be an n×n elementary matrix (representing a single row operation).

We can consider the augmented matrix [A|In] as a single nร—(2n) matrix, whose columns are split into two blocks:

Let C=[A|In]. We can write C in terms of its column vectors:
C=[c1 c2 ⋯ cn | cn+1 ⋯ c2n]
Where c1,…,cn are the column vectors of A, and cn+1,…,c2n are the column vectors of In.

Recall the fundamental definition of matrix multiplication: when a matrix E left-multiplies another matrix C, each column of the resulting matrix EC is E multiplied by the corresponding column of C.
So,

EC=[Ec1 Ec2 ⋯ Ecn | Ecn+1 ⋯ Ec2n]
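The column rule used here can be checked numerically. A minimal sketch in NumPy, using a hypothetical 3×3 elementary matrix E (a row swap) and a random 3×6 matrix C standing in for [A|In]:

```python
import numpy as np

# Column rule of matrix multiplication: column j of E@C equals
# E times column j of C. Values below are illustrative only.
rng = np.random.default_rng(0)
E = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])   # elementary matrix: swap rows 2 and 3
C = rng.standard_normal((3, 6))   # stands in for [A | In] with n = 3

EC = E @ C
for j in range(C.shape[1]):
    # each column of the product is E applied to that column of C
    assert np.allclose(EC[:, j], E @ C[:, j])
```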

Now, let's examine the blocks of the resulting matrix EC:

  1. The first n columns of EC: These are Ec1,Ec2,…,Ecn. By the definition of matrix multiplication, these are precisely the column vectors that form the matrix product EA.
    Therefore, the left block of the result is EA.

  2. The last n columns of EC: These are Ecn+1,Ecn+2,…,Ec2n. Similarly, these are precisely the column vectors that form the matrix product EIn.
    Therefore, the right block of the result is EIn.

Combining these two observations, we have demonstrated that:

E[A|In]=[EA|EIn]
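This identity is easy to verify numerically. A short NumPy check, using a hypothetical 3×3 matrix A and an elementary matrix E that adds twice row 1 to row 3:

```python
import numpy as np

# Hypothetical invertible 3x3 matrix A for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
I = np.eye(3)
E = np.eye(3)
E[2, 0] = 2.0                      # elementary matrix for R3 <- R3 + 2*R1

aug = np.hstack([A, I])            # the augmented matrix [A | In]
lhs = E @ aug                      # E[A | In]
rhs = np.hstack([E @ A, E @ I])    # [EA | EIn]
assert np.allclose(lhs, rhs)       # the two blocks transform identically
```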

Generalization to a Sequence of Elementary Matrices:

This principle extends straightforwardly to a sequence of elementary matrices E1,E2,…,Ek. We can apply the proven principle iteratively:

  1. Start with E1: E1[A|In]=[E1A|E1In]
  2. Apply E2 to the result: E2(E1[A|In])=E2[E1A|E1In]=[E2(E1A)|E2(E1In)]
  3. Continue this process for all k elementary matrices: (Ek⋯E2E1)[A|In]=[(Ek⋯E2E1)A|(Ek⋯E2E1)In]

This is why, when you perform a sequence of elementary row operations on the augmented matrix [A|In], the left block A is transformed into the product (Ek⋯E1)A, and the right block In is simultaneously transformed into (Ek⋯E1)In. If the process successfully transforms A into In (i.e., (Ek⋯E1)A=In), then the product (Ek⋯E1) must be A⁻¹, and the right side becomes A⁻¹In=A⁻¹.
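The whole procedure can be sketched as a small Gauss-Jordan routine. This is a minimal illustration, assuming every pivot encountered is nonzero (no row swaps or numerical safeguards); it row-reduces [A|In] until the left block is In and returns the right block, which is then A⁻¹:

```python
import numpy as np

def inverse_via_augmented(A):
    """Row-reduce [A | In]; once the left block is In, the
    right block holds (Ek...E1)In = A^-1.
    Assumes all pivots are nonzero (no row swaps)."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])  # build [A | In]
    for i in range(n):
        M[i] /= M[i, i]                 # scale row i so the pivot is 1
        for r in range(n):
            if r != i:
                M[r] -= M[r, i] * M[i]  # clear column i in the other rows
    return M[:, n:]                     # right block = A^-1

# Hypothetical invertible 2x2 example.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
A_inv = inverse_via_augmented(A)
assert np.allclose(A_inv @ A, np.eye(2))
```

Each loop iteration corresponds to left-multiplying the augmented matrix by one or more elementary matrices, which is exactly the transformation (Ek⋯E1)[A|In] described above.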