Proving $v_1 + v_2$ is not an eigenvector of $A$
Let $\lambda_1$ and $\lambda_2$ be two distinct eigenvalues of an $n \times n$ matrix $A$, and let $v_1$ and $v_2$ be the corresponding eigenvectors. Prove that $v_1 + v_2$ is not an eigenvector of $A$.
Is this how you set this up? Unsure where to begin.
$A(v_1+v_2) = Av_1 + Av_2$
$A(v_1+v_2) = \lambda_1 v_1 + \lambda_2 v_2$
...
linear-algebra matrices
asked Nov 27 at 23:15 by jake; edited Nov 27 at 23:34 by platty
Have you already encountered a proof that with the same assumptions ($\lambda_1$ and $\lambda_2$ are two distinct eigenvalues of an $n \times n$ matrix $A$, $v_1$ and $v_2$ are their eigenvectors), $v_1$ and $v_2$ are linearly independent? If so, then use that. If not, then that is the place to start.
– Misha Lavrov
Nov 27 at 23:20
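(For reference, a standard short argument for that linear-independence fact, using only the definitions above: suppose $a v_1 + b v_2 = 0$. Applying $A$ gives $a\lambda_1 v_1 + b\lambda_2 v_2 = 0$, while multiplying the original relation by $\lambda_2$ gives $a\lambda_2 v_1 + b\lambda_2 v_2 = 0$. Subtracting,
$$ a(\lambda_1 - \lambda_2) v_1 = 0, $$
so $a = 0$ because $v_1 \neq 0$ and $\lambda_1 \neq \lambda_2$; then $b v_2 = 0$ forces $b = 0$.)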
2 Answers
Answer (5 votes)
By contradiction:
If $v_1 + v_2$ is an eigenvector of $A$, then there exists an eigenvalue $\lambda$ such that $$ A(v_1 + v_2) = \lambda(v_1 + v_2) = \lambda v_1 + \lambda v_2.$$
However, since $v_1$ and $v_2$ are eigenvectors and $A$ is linear, we have
$$ A(v_1 + v_2) = A(v_1) + A(v_2) = \lambda_1 v_1 + \lambda_2 v_2.$$
Therefore
$$ \lambda v_1 + \lambda v_2 = \lambda_1 v_1 + \lambda_2 v_2$$
$$ \iff$$
$$ (\lambda - \lambda_1) v_1 + (\lambda - \lambda_2) v_2 = 0. $$
Since $\lambda_1 \neq \lambda_2$, the vectors $v_1$ and $v_2$ are linearly independent, so
$$ \lambda - \lambda_1 = 0 \qquad \lambda - \lambda_2 = 0.$$
So $\lambda = \lambda_1 = \lambda_2$, which is a contradiction.
answered Nov 27 at 23:29 by Digitalis
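As an illustrative sanity check of the statement (not part of the proof), here is a minimal numpy sketch with a hypothetical diagonal matrix whose eigenvalues $2$ and $5$ are distinct; the componentwise ratio at the end shows $A(v_1+v_2)$ is not a scalar multiple of $v_1+v_2$:

```python
import numpy as np

# Hypothetical example: a diagonal matrix with two distinct eigenvalues.
A = np.diag([2.0, 5.0])
v1 = np.array([1.0, 0.0])   # eigenvector for lambda_1 = 2
v2 = np.array([0.0, 1.0])   # eigenvector for lambda_2 = 5

w = v1 + v2                 # candidate vector v1 + v2
Aw = A @ w                  # A(v1 + v2) = (2, 5)

# If w were an eigenvector, Aw would equal lambda * w for a single scalar lambda,
# so the componentwise ratios Aw_i / w_i would all agree.
print(Aw / w)               # [2. 5.] -- the ratios differ, so v1 + v2 is not an eigenvector
```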
Answer (3 votes)
What you have above is true. It may help to write $\lambda_2 = \lambda_1 + c$ where $c \neq 0$. Then you can write $A(v_1 + v_2) = \lambda_1 (v_1 + v_2) + c v_2$. From here, you should be able to argue that $v_2$ is not parallel to $v_1 + v_2$, based on the assumption that $\lambda_1 \neq \lambda_2$, so $v_1 + v_2$ cannot be an eigenvector of $A$.
answered Nov 27 at 23:21 by platty
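A sketch of how the hinted argument could be completed: if $v_1 + v_2$ were an eigenvector with eigenvalue $\mu$, then
$$ \mu(v_1 + v_2) = A(v_1 + v_2) = \lambda_1(v_1 + v_2) + c v_2, $$
so $(\mu - \lambda_1)(v_1 + v_2) = c v_2$ with $c \neq 0$. Since $v_2 \neq 0$, this forces $\mu \neq \lambda_1$, and $v_2$ would be parallel to $v_1 + v_2$. But then $v_1 = (v_1 + v_2) - v_2$ would also be a multiple of $v_2$, contradicting the linear independence of $v_1$ and $v_2$.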