General question on norm and distance in a normed vector space E and a continuous linear form u.
Let E be a normed vector space and let u be a continuous linear form on E with kernel H.
We want to show that $\forall x \in E \setminus H,\ \|u\| \le \frac{|u(x)|}{d(x, H)}.$
For all $\lambda \in \mathbb{R} \setminus \{0\}$ and all $y \in H$,
$$\frac{|u(\lambda x + y)|}{\|\lambda x + y\|} = \frac{|\lambda|\,|u(x)|}{|\lambda|\,\left\|x+\frac{1}{\lambda}y\right\|} = \frac{|u(x)|}{\left\|x+\frac{1}{\lambda}y\right\|} \le \frac{|u(x)|}{d(x,H)}.$$
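Here $d(x,H)$ denotes the distance from $x$ to the subspace $H$, i.e. (with the usual infimum definition) $d(x,H) = \inf_{h \in H} \|x - h\|$.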
My question is about the last inequality (in particular, the denominator), which compares the norm on E with the distance $d(x,H)$.
So, if I am correct:
$$d(x,H) \le d\!\left(x,\tfrac{1}{\lambda}y\right) = \left\|x-\tfrac{1}{\lambda}y\right\| \le \left\|x+\tfrac{1}{\lambda}y\right\|$$
So here, to compare the distance and the norm on E, we assume that $\|w\| := d(0,w)$ for all $w \in E$; that is, the norm on the normed vector space E is the distance from a point $w$ to the origin $0$? (My confusion is that E is a general normed vector space.) Can someone please elaborate, thanks.
functional-analysis normed-spaces
asked Dec 28 '18 at 16:06
metalder9
1 Answer
You are definitely on the right track; simply adding a minus sign will solve the problem:
$$d(x,H)\leq d\!\left(x,-\frac1{\lambda}y\right)=\left\|x+\frac1{\lambda}y\right\|$$
answered Dec 28 '18 at 19:37
SmileyCraft
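To spell the step out (a short sketch, using only that $H$ is a linear subspace): if $y \in H$ then $-\frac{1}{\lambda}y \in H$ as well, so by the definition of the distance to $H$,
$$d(x,H) = \inf_{h \in H}\|x - h\| \le \left\|x - \left(-\tfrac{1}{\lambda}y\right)\right\| = \left\|x + \tfrac{1}{\lambda}y\right\|.$$
Since every $z \in E \setminus H$ can be written as $\lambda x + y$ with $\lambda = u(z)/u(x) \ne 0$ and $y = z - \lambda x \in H$, while $|u(z)|/\|z\| = 0$ for nonzero $z \in H$, taking the supremum over all nonzero $z$ gives the stated bound $\|u\| \le \frac{|u(x)|}{d(x,H)}$.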
I see, the way you did it is more direct. So, does the distance between two vectors in E equal the norm of the difference of the two vectors? How do we know this when no particular norm is specified in the question (E is a general normed vector space)? I would be interested if you could say more about the connection between the distance as shown and the norm on E.
– metalder9
Dec 28 '18 at 20:25
Every norm naturally induces a metric by $d(x,y):=\|x-y\|$. You can check that the defining properties of a norm imply the defining properties of a metric here. Therefore, whenever we talk about a normed vector space, the induced metric is always assumed, unless specified otherwise.
– SmileyCraft
Dec 28 '18 at 20:30
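For completeness, here is a quick check of that claim (a sketch, using only the norm axioms). With $d(x,y):=\|x-y\|$:
- $d(x,y)=0 \iff \|x-y\|=0 \iff x=y$ (definiteness);
- $d(y,x)=\|y-x\|=\|-(x-y)\|=|-1|\,\|x-y\|=d(x,y)$ (symmetry, by absolute homogeneity);
- $d(x,z)=\|x-z\|=\|(x-y)+(y-z)\|\le\|x-y\|+\|y-z\|=d(x,y)+d(y,z)$ (triangle inequality, from subadditivity of the norm).
In particular, $d(0,w)=\|0-w\|=\|w\|$, which is exactly the identification $\|w\| = d(0,w)$ asked about in the question.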