top | item 37112895

necroforest | 2 years ago

kind of. you can decompose an arbitrary real matrix into symmetric and antisymmetric components: R = S + A, with S = (R + R^T)/2 and A = (R - R^T)/2. Since A = -A^T (antisymmetric), for any real vector x, <x, Ax> = <x, A^T x> = -<x, Ax> => <x, Ax> = 0. So for any matrix where <x, Rx> > 0, you can add an arbitrary antisymmetric matrix and keep the same induced quadratic form. People typically enforce symmetry in their definitions because the symmetric part is the only part that contributes to the quadratic form, and it is "nicer" to work with (always diagonalizable, real eigenvalues — positive ones in the definite case, etc.)
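A quick numerical sketch of the decomposition above (matrix and vector are just random examples): split R into its symmetric and antisymmetric parts and check that only S contributes to the quadratic form.

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.standard_normal((4, 4))

S = (R + R.T) / 2   # symmetric part
A = (R - R.T) / 2   # antisymmetric part: A.T == -A
x = rng.standard_normal(4)

print(np.allclose(R, S + A))             # True
print(np.isclose(x @ A @ x, 0.0))        # True: antisymmetric part drops out
print(np.isclose(x @ R @ x, x @ S @ x))  # True: same quadratic form
```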

This should generalize easily to the complex/Hermitian case.

dataflow | 2 years ago

Thanks! But I think you might've missed a subtlety here:

> This should generalize easily to the complex/Hermitian case.

This doesn't seem to be true, in that it's actually impossible to have a non-Hermitian matrix C such that x†Cx > 0 for all x over the complex numbers. Whereas over the real numbers, you can have a matrix R with x'Rx > 0 for all x even though R is non-symmetric.

The subtlety here is that x itself can be complex in the complex case, which further constrains C to be Hermitian - see the Wikipedia link I posted above.

In other words, "complex definiteness" is actually a stronger condition than "real definiteness", even for matrices without an imaginary part.
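The subtlety can be made concrete (the example matrix here is mine, not from the thread): R below is real and non-symmetric, and x^T R x = x1^2 + x2^2 > 0 for every nonzero real x, yet for a complex vector x the form x†Rx is not even real, so R is not "complex positive definite".

```python
import numpy as np

R = np.array([[1.0, 1.0],
              [-1.0, 1.0]])   # identity plus an antisymmetric part

x_real = np.array([0.3, -2.0])
print(x_real @ R @ x_real)       # 4.09 > 0: real-definite despite asymmetry

x_cplx = np.array([1.0, 1.0j])
q = np.vdot(x_cplx, R @ x_cplx)  # x† R x (vdot conjugates its first argument)
print(q)                         # (2+2j): not real, so not positive
```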

necroforest | 2 years ago

>it's actually impossible to have a non-Hermitian matrix C such that x†Cx > 0

Nice catch, it's been a few years since I had to think about these details.

hgomersall | 2 years ago

I spent ages looking for a proof of the complex case during my PhD. Most proofs of CG (conjugate gradients) begin "assuming a real, positive definite matrix".
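For what it's worth, CG does carry over to the Hermitian positive definite case essentially unchanged, provided the inner products use the conjugate transpose. A minimal sketch (the solver and test matrix are illustrative, not from the thread):

```python
import numpy as np

def cg(A, b, tol=1e-10, maxiter=200):
    """Conjugate gradients for Hermitian positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs_old = np.vdot(r, r).real  # vdot conjugates its first argument
    for _ in range(maxiter):
        Ap = A @ p
        # p† A p is real (and positive) because A is Hermitian definite
        alpha = rs_old / np.vdot(p, Ap).real
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = np.vdot(r, r).real
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Hermitian positive definite test matrix: M† M + I
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = M.conj().T @ M + np.eye(5)
b = rng.standard_normal(5) + 1j * rng.standard_normal(5)

x = cg(A, b)
print(np.allclose(A @ x, b))  # True
```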