Suppose we have an n by n matrix A. How do we know whether there is an inverse matrix A^-1 such that the product A * A^-1 is the n by n identity matrix?


If the determinant of the matrix A, det(A), is not zero, then the matrix has an inverse. This property can be found in any textbook on higher algebra or on the theory of matrices.
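For concreteness, here is a minimal Python/NumPy sketch of that determinant test; the example matrix and the 1e-12 tolerance are illustrative choices, not part of the answer above:

```python
import numpy as np

# Illustrative 2x2 example; any square matrix works the same way.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

det_A = np.linalg.det(A)
print("det(A) =", det_A)

# In exact arithmetic, A is invertible iff det(A) != 0.  With floating
# point, compare against a small tolerance instead of exact zero.
if abs(det_A) > 1e-12:
    A_inv = np.linalg.inv(A)
    print("A @ A_inv =\n", A @ A_inv)   # numerically the identity
else:
    print("A appears singular; no inverse.")
```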




Yes, Jose Vegas is right. In fact, if det(A) is non-zero, then A.Inv(A) = Inv(A).A = I, the identity matrix. However, for large n it is expensive to compute det(A). If you apply Gaussian elimination instead, and at some point during the elimination a diagonal element becomes zero and cannot be made non-zero by an elementary row exchange, then the matrix is singular and the inverse does not exist.
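A hedged sketch of that elimination test, assuming a floating-point input matrix; the function name and the threshold for "numerically zero" are my own illustrative choices:

```python
import numpy as np

def is_invertible_by_elimination(A, tol=1e-12):
    """Gaussian elimination with partial pivoting.

    Returns False as soon as a pivot column has no usable non-zero
    entry (the matrix is singular); True otherwise.
    """
    U = np.array(A, dtype=float)   # work on a copy
    n = U.shape[0]
    for k in range(n):
        # Choose the largest entry in column k (rows k..n-1) as pivot.
        p = k + np.argmax(np.abs(U[k:, k]))
        if abs(U[p, k]) < tol:
            return False           # no non-zero pivot available -> singular
        U[[k, p]] = U[[p, k]]      # elementary row exchange
        # Eliminate the entries below the pivot.
        for i in range(k + 1, n):
            U[i, k:] -= (U[i, k] / U[k, k]) * U[k, k:]
    return True

print(is_invertible_by_elimination([[1, 2], [2, 4]]))  # False (rank 1)
print(is_invertible_by_elimination([[1, 2], [3, 4]]))  # True
```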

If A is square, you need det(A) non-zero. If A is not square and you don't need the other condition A*A^(-1) = I, then you are looking for "left inverses", and that is a long story. I would check Wikipedia for help. Cheers!
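For the non-square case, here is a small illustrative sketch of a left inverse for a tall matrix with full column rank, using the standard formula (A^T A)^(-1) A^T; the example matrix is arbitrary:

```python
import numpy as np

# A tall 3x2 matrix of full column rank (rank 2).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Left inverse: L @ A = I, but A @ L is NOT the identity in general.
L = np.linalg.inv(A.T @ A) @ A.T    # coincides with np.linalg.pinv(A) here

print(np.round(L @ A, 10))          # 2x2 identity
print(np.round(A @ L, 10))          # 3x3 matrix, not the identity
```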

It depends on the matrix. If it has integer entries, you can do exact Gauss-Jordan elimination: if you do not end up with a zero row, the matrix is invertible. Of course, computing the determinant is more efficient for small n. Another method is to compute the eigenvalues: if zero is not among them, then again A is invertible. There are close to ten different equivalent criteria you could use for this task.
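A sketch of the exact (rational-arithmetic) Gauss-Jordan variant for integer matrices mentioned above; the function name is hypothetical and the test matrices are arbitrary examples:

```python
from fractions import Fraction

def is_invertible_exact(A):
    """Gauss-Jordan elimination over the rationals.

    Intended for integer (or rational) entries, so there is no
    floating-point question of what counts as zero.  Returns True
    iff every column yields a non-zero pivot.
    """
    M = [[Fraction(x) for x in row] for row in A]
    n = len(M)
    for k in range(n):
        # Find a row at or below k with a non-zero entry in column k.
        pivot = next((i for i in range(k, n) if M[i][k] != 0), None)
        if pivot is None:
            return False                    # no pivot -> singular
        M[k], M[pivot] = M[pivot], M[k]     # row exchange
        for i in range(n):
            if i != k and M[i][k] != 0:
                factor = M[i][k] / M[k][k]
                M[i] = [a - factor * b for a, b in zip(M[i], M[k])]
    return True

print(is_invertible_exact([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))  # False
print(is_invertible_exact([[2, 0, 1], [0, 1, 0], [1, 0, 1]]))  # True
```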
How you choose to show the existence of an inverse really depends on the matrix. There are instances where finding det(A) is far more difficult than establishing invertibility by other means.
In the other case, when A is rank-deficient, the product of A and any candidate A^-1 has rank at most the minimum of the ranks of A and A^-1, which is less than n. Therefore you cannot obtain the identity matrix, which has full rank n.



In this way, even if the matrix is large, we can determine whether A is invertible without computing its determinant.
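A short illustration of that rank-based check, which never forms the determinant; the example matrices are arbitrary, and the rank is computed by NumPy from the singular values:

```python
import numpy as np

n = 3
A_singular = np.array([[1.0, 2.0, 3.0],
                       [2.0, 4.0, 6.0],    # multiple of row 1 -> rank < n
                       [0.0, 1.0, 1.0]])
A_regular  = np.eye(n) + np.diag([1.0, 2.0, 3.0])

for name, A in [("singular", A_singular), ("regular", A_regular)]:
    r = np.linalg.matrix_rank(A)           # based on the SVD internally
    print(f"{name}: rank = {r}, invertible = {r == n}")
```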
R. Mittal's answer was sufficient in the 19th century (when nobody considered large matrices). Today the only reasonable way is to compute a singular value decomposition (SVD) and inspect the singular values (i.e., let the computer do that). If at least one of them is zero (I won't discuss the question of what counts as zero for the results of lengthy numerical computations), then the matrix has no classical inverse. It does have a pseudo-inverse (Penrose inverse), which is closely related to the SVD and can replace the classical inverse in many applications where one would classically use the inverse. Peter's contribution to that side of the problem is valuable. Of course, all of this is in Wikipedia, and everybody who works with matrices in all but the most trivial applications needs to know it!
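A minimal NumPy sketch of this SVD-based check; the example matrix and the tolerance formula are illustrative choices, not prescribed by the answer above:

```python
import numpy as np

# A rank-1 (hence singular) example matrix.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

s = np.linalg.svd(A, compute_uv=False)            # singular values, descending
tol = max(A.shape) * np.finfo(float).eps * s[0]   # one common cutoff choice

if s[-1] > tol:
    A_inv = np.linalg.inv(A)       # classical inverse exists
else:
    A_inv = np.linalg.pinv(A)      # fall back to the Penrose pseudo-inverse
    print("smallest singular value ~ 0; using the pseudo-inverse")

print(A_inv)
```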