Linear operator and inner product
Theorem: Let $V$ be a finite-dimensional inner product space with an orthonormal basis $\mathcal B=\{b_1,\ldots,b_n\}$. Let $L$ be a linear operator on $V$, and let $A = [L]_{\mathcal B}$ be the matrix associated with $L$. Then the entries of $A$ are $$A_{ij} = \langle b_i, Lb_j\rangle.$$
If the basis $\mathcal B$ is only orthogonal (not necessarily orthonormal), is it true that $$A_{ij}=\frac{\langle b_i, Lb_j\rangle}{\langle b_i, b_i\rangle}\,?$$
linear-algebra vector-spaces linear-transformations norm
edited Dec 13 '18 at 0:45 by Saucy O'Path
asked Dec 13 '18 at 0:18 by user398843
@AlexVong Sorry, it should be finite, but I would also like to know whether it is still true when $V$ is infinite-dimensional. – user398843, Dec 13 '18 at 0:41
2 Answers
Assume that $\mathcal{B}=\{b_1,b_2,\ldots,b_n\}$, where $n:=\dim(V)$. Since $$L(b_j)=\sum_{k=1}^n\,A_{k,j}\,b_k\text{ for each }j\in\{1,2,\ldots,n\}=:[n]\,,$$
we have
$$\bigl\langle b_i,L(b_j)\bigr\rangle=\sum_{k=1}^n\,A_{k,j}\,\langle b_i,b_k\rangle\text{ for all }i,j\in[n]\,.$$
If $\mathcal{B}$ is an orthogonal basis, then
$$\bigl\langle b_i,L(b_j)\bigr\rangle=A_{i,j}\,\langle b_i,b_i\rangle\text{ for all }i,j\in[n]\,,$$ proving your claim.
In general, let $\langle\,\cdot\,,\,\cdot\,\rangle$ be a nondegenerate symmetric bilinear form on $V$ and let $\{\beta_1,\beta_2,\ldots,\beta_n\}$ be the dual basis of $\{b_1,b_2,\ldots,b_n\}$, so that $\langle \beta_i,b_j\rangle =\delta_{i,j}$ for all $i,j\in[n]$, where $\delta$ is the Kronecker delta. Then the matrix $[A_{i,j}]_{i,j\in[n]}$ of $L$ in the basis $\mathcal{B}=\{b_1,b_2,\ldots,b_n\}$ is given by
$$A_{i,j}=\bigl\langle \beta_i,L(b_j)\bigr\rangle\text{ for all }i,j\in [n]\,.$$
In your particular case,
$$\beta_i=\frac{b_i}{\langle b_i,b_i\rangle}\text{ for every }i\in[n]\,.$$
Remark. If the base field is $\mathbb{C}$, we can also take $\langle\,\cdot\,,\,\cdot\,\rangle$ to be a nondegenerate sesquilinear form on $V$ that is antilinear in the first entry and linear in the second entry. The argument is the same.
edited Dec 13 '18 at 0:50
answered Dec 13 '18 at 0:42 by Batominovski
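(Not part of the original answer.) The dual-basis formula above is easy to sanity-check numerically. The sketch below is a minimal illustration assuming NumPy; the basis, the operator $L$, and all variable names are made up for the example. It builds an orthogonal but non-orthonormal basis of $\mathbb R^3$ with the standard inner product, forms $\beta_i = b_i/\langle b_i,b_i\rangle$, and compares $\langle\beta_i, L(b_j)\rangle$ with the matrix of $L$ computed directly from the definition.

    import numpy as np

    rng = np.random.default_rng(0)

    # Orthogonal (but not orthonormal) basis of R^3: scale the columns of a
    # random orthogonal matrix by different positive factors.
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    B = Q * np.array([1.0, 2.0, 0.5])        # columns are b_1, b_2, b_3
    L = rng.normal(size=(3, 3))              # an arbitrary operator (standard basis)

    # Matrix of L in the basis B from the definition L(b_j) = sum_k A_{k,j} b_k,
    # i.e. A = B^{-1} L B.
    A_def = np.linalg.inv(B) @ L @ B

    # Dual basis beta_i = b_i / <b_i, b_i>, then A_{i,j} = <beta_i, L(b_j)>.
    norms_sq = np.sum(B * B, axis=0)         # <b_i, b_i>
    Beta = B / norms_sq                      # columns are beta_1, ..., beta_n
    A_formula = Beta.T @ L @ B               # (i, j) entry is <beta_i, L b_j>

    print(np.allclose(A_def, A_formula))     # expected: True

The check passes because, for an orthogonal basis, $\langle\beta_i, L(b_j)\rangle = \langle b_i, Lb_j\rangle/\langle b_i,b_i\rangle$, which is exactly the formula asked about in the question.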
Yes, it is, because $[L]_{\mathcal B}=[\mathrm{id}]^{\mathcal B'}_{\mathcal B}\,[L]_{\mathcal B'}\,[\mathrm{id}]_{\mathcal B'}^{\mathcal B}$, where $\mathcal B'=\left\{\frac{b_i}{\sqrt{\langle b_i,b_i\rangle}}\right\}_{i=1}^n$.
By the previous lemma, $\left([L]_{\mathcal B'}\right)_{ij}=\frac{\langle b_i,Lb_j\rangle}{\sqrt{\langle b_j,b_j\rangle\langle b_i,b_i\rangle}}$. Moreover, $\left([\mathrm{id}]^{\mathcal B'}_{\mathcal B}\right)_{ij}=\delta_{ij}\frac1{\sqrt{\langle b_i,b_i\rangle}}$ and $\left([\mathrm{id}]_{\mathcal B'}^{\mathcal B}\right)_{ij}=\delta_{ij}\sqrt{\langle b_i,b_i\rangle}$, so $$\left([L]_{\mathcal B}\right)_{ij}=\sum_{k,h}\left([\mathrm{id}]^{\mathcal B'}_{\mathcal B}\right)_{ik}\left([L]_{\mathcal B'}\right)_{kh}\left([\mathrm{id}]_{\mathcal B'}^{\mathcal B}\right)_{hj}=\sum_{k,h}\delta_{ik}\,\frac1{\sqrt{\langle b_i,b_i\rangle}}\,\frac{\langle b_k,Lb_h\rangle}{\sqrt{\langle b_k,b_k\rangle\langle b_h,b_h\rangle}}\,\delta_{hj}\,\sqrt{\langle b_h,b_h\rangle}=\frac{\langle b_i,Lb_j\rangle}{\langle b_i,b_i\rangle}\,.$$
edited Dec 13 '18 at 0:40
answered Dec 13 '18 at 0:24 by Saucy O'Path
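(Again, not part of the original answer.) The change-of-basis identity can be checked numerically in the same way. In this minimal NumPy sketch, with illustrative names, the diagonal matrices D and D_inv play the roles of $[\mathrm{id}]_{\mathcal B'}^{\mathcal B}$ and $[\mathrm{id}]^{\mathcal B'}_{\mathcal B}$.

    import numpy as np

    rng = np.random.default_rng(1)

    # Orthogonal (but not orthonormal) basis as the columns of B, plus an operator L.
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    B = Q * np.array([3.0, 0.25, 1.5])
    L = rng.normal(size=(3, 3))

    norms = np.sqrt(np.sum(B * B, axis=0))   # sqrt(<b_i, b_i>)
    B_prime = B / norms                      # the normalized (orthonormal) basis B'
    D = np.diag(norms)                       # [id]_{B'}^{B}: B-coordinates -> B'-coordinates
    D_inv = np.diag(1.0 / norms)             # [id]^{B'}_{B}: B'-coordinates -> B-coordinates

    L_Bprime = B_prime.T @ L @ B_prime       # [L]_{B'} via the orthonormal-basis theorem
    L_B_change = D_inv @ L_Bprime @ D        # [L]_B from the change-of-basis identity

    L_B_direct = np.linalg.inv(B) @ L @ B    # [L]_B computed directly
    print(np.allclose(L_B_change, L_B_direct))  # expected: True

Because the two diagonal change-of-basis matrices only rescale rows and columns, the product collapses to $\langle b_i, Lb_j\rangle/\langle b_i,b_i\rangle$, matching the computation in the answer.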