Mathematics Asked on February 20, 2021
I’ve been asked to prove that the following statements are equivalent for any $n \times n$ matrix $A$:

1. $AB = 0$ for some nonzero $n \times n$ matrix $B$.
2. $CA = 0$ for some nonzero $n \times n$ matrix $C$.
How do I show this equivalence? I’m able to see that if $(1)$ holds, the columns of A are linearly dependent and the rows of B are linearly dependent. How do I go ahead?
Thanks!
P.S.
The course I’m currently doing, however, has only covered matrix multiplication, elementary matrices, and systems of equations – so it’d be great if you could provide a proof along those lines!
Also, I was wondering if there’s a general closed form, or some other way to describe, the matrices $A$ that satisfy $(1)$ and $(2)$?
If $AB = 0$ for some nonzero $B$, then $Ax = 0$ for some nonzero $x$ (in particular, some nonzero column of $B$). Since this means $A$ has linearly dependent columns and $A$ is square, $A^T$ also has linearly dependent columns, so $A^T y = 0$ for some nonzero $y$. Now consider $C^T = [y \mid 0 \mid \cdots \mid 0]$. Then $A^T C^T = 0$. Taking transposes, $CA = 0$, as required.
Note that this shows both directions: if $CA = 0$ for some nonzero $C$, then transposing gives $A^T B = 0$ for the nonzero matrix $B = C^T$. Now apply the first direction to $A^T$: there exists a nonzero $D$ such that $DA^T = 0$. Transposing again, $AD^T = 0$, which proves the other direction.
A closed-form way to describe such matrices is as the set of singular $n \times n$ matrices: $\{A : \det(A) = 0\}$.
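The construction above can be checked numerically. This is a small sketch (the $3 \times 3$ matrix below is a made-up singular example, not one from the question); it finds a null vector of $A$ and of $A^T$ via the SVD, builds $B$ and $C$ as in the proof, and confirms $AB = 0$, $CA = 0$, and $\det(A) = 0$:

```python
# Numerical sketch of the proof's construction (assumption: A below is
# a hypothetical singular example, chosen only for illustration).
import numpy as np

n = 3
A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])  # singular: row3 = 2*row2 - row1

# A nonzero x with Ax = 0: the right-singular vector for the zero
# singular value (last row of Vt in the SVD).
_, _, Vt = np.linalg.svd(A)
x = Vt[-1]
B = np.zeros((n, n))
B[:, 0] = x            # B = [x | 0 | 0] is nonzero and AB = 0

# Same trick on A^T gives y with A^T y = 0; set C^T = [y | 0 | 0].
_, _, Wt = np.linalg.svd(A.T)
y = Wt[-1]
C = np.zeros((n, n))
C[0, :] = y            # C is nonzero and CA = 0

print(np.allclose(A @ B, 0), np.allclose(C @ A, 0))
print(np.isclose(np.linalg.det(A), 0))
```

The SVD is only used here as a convenient way to extract a null vector; any method of solving $Ax = 0$ would do.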
Correct answer by Drew Brady on February 20, 2021
Let's not use rank, determinants or linear spaces.
If, somehow, we can construct a nonzero $C$ such that $CA=0$, then we are done.
$AB=0$ (with $B \neq 0$) $\implies$ the columns of $A$ are linearly dependent.
Convert $A$ to row-reduced echelon form. Note that $A$ and $A^T$ have the same number of pivots (leading entries). Let the number of pivots be $m < n$.
Therefore, the columns of $A^T$ are also linearly dependent. Hence, there exist numbers $c_1, c_2, \ldots, c_n$ (not all zero) such that $A^T\begin{bmatrix}c_1\\c_2\\\vdots\\c_n\end{bmatrix}=0$
Let $P$ be a matrix one of whose columns is $\begin{bmatrix}c_1\\c_2\\\vdots\\c_n\end{bmatrix}$ and whose remaining columns are zero.
Hence, we have $A^TP=0\implies (A^TP)^T=0^T\implies P^TA=0$
Put $P^T=C$ and we are done.
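The pivot-counting step and the construction of $P$ can be illustrated with exact arithmetic in sympy (the $3 \times 3$ matrix below is a made-up singular example, not from the question):

```python
# Sketch of the RREF/pivot argument (assumption: A is a hypothetical
# singular example with proportional rows, used only to illustrate).
from sympy import Matrix, zeros

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])   # rows 1 and 2 are proportional, so m < n

# A and A^T have the same number of pivots in row-reduced echelon form.
_, pivots_A = A.rref()
_, pivots_At = A.T.rref()
assert len(pivots_A) == len(pivots_At) == 2   # m = 2 < n = 3

# Columns of A^T are dependent, so A^T c = 0 has a nonzero solution c.
c = A.T.nullspace()[0]
P = zeros(3, 3)
P[:, 0] = c              # P = [c | 0 | 0]

assert A.T * P == zeros(3, 3)    # A^T P = 0
assert P.T * A == zeros(3, 3)    # hence C = P^T satisfies CA = 0
```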
Answered by Koro on February 20, 2021
If there exists $B \neq 0$ such that $AB = 0$, then $\mathrm{rank}(A) < n$: otherwise $A$ is nonsingular, which forces $AX = 0$ to admit only the zero solution. Hence the rows of $A$ are linearly dependent, i.e., there exists a nonzero row vector $c$ such that $cA = 0$. Extending $c$ to an $n \times n$ matrix $C$ by appending zero rows completes the proof.
Answered by Zhanxiong on February 20, 2021