I am using numpy to compute the eigenvalues and eigenvectors of a circulant matrix. Here is my code (the rows Hji for j = 1, 2, ..., 6 are predefined):
>>> import numpy as np
>>> H = np.array([H1i, H2i, H3i, H4i, H5i, H6i])
>>> H
array([[ 0., 1., 0., 0., 0., 1.],
[ 1., 0., 1., 0., 0., 0.],
[ 0., 1., 0., 1., 0., 0.],
[ 0., 0., 1., 0., 1., 0.],
[ 0., 0., 0., 1., 0., 1.],
[ 1., 0., 0., 0., 1., 0.]])
>>> from numpy import linalg as LA
>>> w, v = LA.eig(H)
>>> w
array([-2., 2., 1., -1., -1., 1.])
>>> v
array([[ 0.40824829, -0.40824829, -0.57735027, 0.57732307, 0.06604706,
0.09791921],
[-0.40824829, -0.40824829, -0.28867513, -0.29351503, -0.5297411 ,
-0.4437968 ],
[ 0.40824829, -0.40824829, 0.28867513, -0.28380804, 0.46369403,
-0.54171601],
[-0.40824829, -0.40824829, 0.57735027, 0.57732307, 0.06604706,
-0.09791921],
[ 0.40824829, -0.40824829, 0.28867513, -0.29351503, -0.5297411 ,
0.4437968 ],
[-0.40824829, -0.40824829, -0.28867513, -0.28380804, 0.46369403,
0.54171601]])
The eigenvalues are correct. However, for the eigenvectors, I found that they are not linearly independent:
>>> V = np.zeros((6,6))
>>> for i in range(6):
... for j in range(6):
... V[i,j] = np.dot(v[:,i], v[:,j])
...
>>> V
array([[ 1.00000000e+00, -2.77555756e-17, -2.49800181e-16,
-3.19189120e-16, -1.11022302e-16, 2.77555756e-17],
[ -2.77555756e-17, 1.00000000e+00, -1.24900090e-16,
-1.11022302e-16, -8.32667268e-17, 0.00000000e+00],
[ -2.49800181e-16, -1.24900090e-16, 1.00000000e+00,
-1.52655666e-16, 8.32667268e-17, -1.69601044e-01],
[ -3.19189120e-16, -1.11022302e-16, -1.52655666e-16,
1.00000000e+00, 1.24034735e-01, -8.32667268e-17],
[ -1.11022302e-16, -8.32667268e-17, 8.32667268e-17,
1.24034735e-01, 1.00000000e+00, -1.66533454e-16],
[ 2.77555756e-17, 0.00000000e+00, -1.69601044e-01,
-8.32667268e-17, -1.66533454e-16, 1.00000000e+00]])
>>>
You can see that there are off-diagonal terms (check V[2,5] = -1.69601044e-01), which means they are not linearly independent vectors. Since this is a Hermitian matrix, how can its eigenvectors be dependent?
By the way, I also computed it with MATLAB, which returns the correct values:
V =
0.4082 -0.2887 -0.5000 0.5000 0.2887 -0.4082
-0.4082 -0.2887 0.5000 0.5000 -0.2887 -0.4082
0.4082 0.5774 0 0 -0.5774 -0.4082
-0.4082 -0.2887 -0.5000 -0.5000 -0.2887 -0.4082
0.4082 -0.2887 0.5000 -0.5000 0.2887 -0.4082
-0.4082 0.5774 0 0 0.5774 -0.4082
D =
-2.0000 0 0 0 0 0
0 -1.0000 0 0 0 0
0 0 -1.0000 0 0 0
0 0 0 1.0000 0 0
0 0 0 0 1.0000 0
0 0 0 0 0 2.0000
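(Side note: the definitions of H1i, ..., H6i are not shown in the post; assuming they follow the circulant pattern visible in the printed matrix, H can be rebuilt by cyclically shifting the first row with np.roll — a minimal sketch:)

```python
import numpy as np

# Assumption (not from the original post): the predefined rows H1i..H6i
# are cyclic shifts of the first row [0, 1, 0, 0, 0, 1], i.e. H is the
# adjacency matrix of a 6-cycle. Row j is the base row rolled right by j.
base = np.array([0., 1., 0., 0., 0., 1.])
H = np.array([np.roll(base, j) for j in range(6)])
print(H)
```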
Answer 0 (score: 2)
The results returned by eig are perfectly fine:

>>> np.allclose(v.dot(np.diag(w)).dot(LA.inv(v)), H)
True

Note that the output of eig corresponds to a decomposition of the input matrix of the form v * diag(w) * inv(v), which applies to any diagonalizable matrix. Since eig treats H as having no special structure, the returned eigenvectors are not expected to have any special structure such as orthogonality, either. (Do not confuse orthogonality with linear independence: the columns of v are indeed linearly independent, which can be verified simply by checking that LA.det(v) is nonzero.)

The function eigh knows that the input matrix is Hermitian and returns a more convenient, i.e. orthonormal, set of eigenvectors.
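Both checks from this answer can be run directly — a short sketch verifying that eig's decomposition reconstructs H and that its eigenvector columns are linearly independent:

```python
import numpy as np
from numpy import linalg as LA

H = np.array([[0., 1., 0., 0., 0., 1.],
              [1., 0., 1., 0., 0., 0.],
              [0., 1., 0., 1., 0., 0.],
              [0., 0., 1., 0., 1., 0.],
              [0., 0., 0., 1., 0., 1.],
              [1., 0., 0., 0., 1., 0.]])
w, v = LA.eig(H)

# eig's output satisfies H = v @ diag(w) @ inv(v) up to rounding error
assert np.allclose(v @ np.diag(w) @ LA.inv(v), H)

# a nonzero determinant means the columns of v are linearly independent,
# even though they are not orthogonal
det = LA.det(v)
print("det(v) =", det)
assert abs(det) > 1e-10
```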
Answer 1 (score: 1)
For Hermitian and symmetric matrices you should use a different function: eigh.
import numpy as np
from numpy import linalg as LA
H = np.array([[ 0., 1., 0., 0., 0., 1.],
[ 1., 0., 1., 0., 0., 0.],
[ 0., 1., 0., 1., 0., 0.],
[ 0., 0., 1., 0., 1., 0.],
[ 0., 0., 0., 1., 0., 1.],
[ 1., 0., 0., 0., 1., 0.]])
w, v = LA.eigh(H)
V = np.zeros((6,6))
for i in range(6):
for j in range(6):
V[i,j] = np.dot(v[:,i], v[:,j])
w
Out[19]: array([-2., -1., -1., 1., 1., 2.])
v
Out[20]:
array([[-0.40824829, -0.57735027, 0. , 0. , 0.57735027,
0.40824829],
[ 0.40824829, 0.28867513, -0.5 , -0.5 , 0.28867513,
0.40824829],
[-0.40824829, 0.28867513, 0.5 , -0.5 , -0.28867513,
0.40824829],
[ 0.40824829, -0.57735027, 0. , 0. , -0.57735027,
0.40824829],
[-0.40824829, 0.28867513, -0.5 , 0.5 , -0.28867513,
0.40824829],
[ 0.40824829, 0.28867513, 0.5 , 0.5 , 0.28867513,
0.40824829]])
V
Out[21]:
array([[ 1.00000000e+00, 8.32667268e-17, 2.77555756e-17,
8.32667268e-17, -2.08166817e-16, 0.00000000e+00],
[ 8.32667268e-17, 1.00000000e+00, 5.55111512e-17,
5.55111512e-17, -2.22044605e-16, -1.11022302e-16],
[ 2.77555756e-17, 5.55111512e-17, 1.00000000e+00,
0.00000000e+00, 2.77555756e-17, 1.11022302e-16],
[ 8.32667268e-17, 5.55111512e-17, 0.00000000e+00,
1.00000000e+00, 8.32667268e-17, 5.55111512e-17],
[ -2.08166817e-16, -2.22044605e-16, 2.77555756e-17,
8.32667268e-17, 1.00000000e+00, 0.00000000e+00],
[ 0.00000000e+00, -1.11022302e-16, 1.11022302e-16,
5.55111512e-17, 0.00000000e+00, 1.00000000e+00]])
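The double loop above is just the Gram matrix of the eigenvector columns; it can be written as a single matrix product, and for eigh the result is the identity — a minimal sketch:

```python
import numpy as np
from numpy import linalg as LA

H = np.array([[0., 1., 0., 0., 0., 1.],
              [1., 0., 1., 0., 0., 0.],
              [0., 1., 0., 1., 0., 0.],
              [0., 0., 1., 0., 1., 0.],
              [0., 0., 0., 1., 0., 1.],
              [1., 0., 0., 0., 1., 0.]])
w, v = LA.eigh(H)

# V[i, j] = dot(v[:, i], v[:, j]) for all i, j in one product:
V = v.T @ v

# eigh returns orthonormal eigenvectors, so V is the identity matrix
assert np.allclose(V, np.eye(6))

# eigh also returns the eigenvalues in ascending order
assert np.all(np.diff(w) >= 0)
```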