Integral inequality for the length of a curve
Let $f:\mathbb{R}\to\mathbb{R}$ be a continuously differentiable function. Prove that for any $a,b\in\mathbb{R}$,
$$\left(\int_a^b\sqrt{1+(f'(x))^2}\,dx\right)^2\ge (a-b)^2+(f(b)-f(a))^2.$$
I think the mean value theorem should handle this, but I couldn't make it work; I also tried the Cauchy–Schwarz inequality without reaching a conclusion.

Tags: real-analysis, inequality, arc-length
– Conrad (9 hours ago): The smallest distance between the two points $(a, f(a))$ and $(b, f(b))$ is the straight-line distance, which is your RHS (its square root, of course, but the same squaring applies to the LHS); conclude from there.

– Matematleta (9 hours ago): @Conrad But this is exactly what is to be proved, since the LHS is the definition of arc length.

– Conrad (6 hours ago): This is classic stuff: it can be done locally using a Taylor approximation, making the curve piecewise linear and using elementary geometry, or, as in the various answers, with various inequalities.
asked 10 hours ago by RAM_3R · edited 30 mins ago by Martin Sleziak
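For reference, the Cauchy–Schwarz attempt mentioned in the question can be made to work by applying the pointwise Cauchy–Schwarz inequality in $\mathbb{R}^2$ to the fixed vector $w=(b-a,\,f(b)-f(a))$. The following is only an illustrative sketch of that route (it is not one of the posted answers):

```latex
% Sketch of the Cauchy–Schwarz route, assuming a < b (swap a and b otherwise).
% Let w = (b-a, f(b)-f(a)), so |w|^2 = (b-a)^2 + (f(b)-f(a))^2.
\[
|w|^2 \;=\; \int_a^b \bigl[\,1\cdot(b-a) + f'(x)\,(f(b)-f(a))\,\bigr]\,dx
\;\le\; \int_a^b \sqrt{1+f'(x)^2}\;|w|\,dx ,
\]
% where the inequality is Cauchy–Schwarz applied pointwise to (1, f'(x)) and w.
% If w = 0 the claim is trivial; otherwise divide by |w| and square to get
% the stated inequality.
```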
4 Answers
Notice that the function $y \mapsto \sqrt{1+y^2}$ is strictly convex. So by Jensen's inequality (for $a<b$; the case $a>b$ follows by swapping $a$ and $b$, and $a=b$ is trivial),
$$ \frac{1}{b-a} \int_{a}^{b} \sqrt{1 + f'(x)^2} \, \mathrm{d}x \geq \sqrt{1 + \left(\frac{1}{b-a}\int_{a}^{b} f'(x) \, \mathrm{d}x\right)^2} = \sqrt{1 + \left(\frac{f(b) - f(a)}{b-a} \right)^2}. $$
Multiplying both sides by $b-a$ and squaring proves the desired inequality. Moreover, by strict convexity, equality holds if and only if $f'$ is constant on $[a, b]$.

answered 7 hours ago by Sangchul Lee
– Nastar (7 hours ago): This is really nice!
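As a quick numerical sanity check of the bound in this answer, here is a short Python sketch; the choice $f(x)=\sin x$ on $[0,2]$ is an arbitrary illustration, not part of the answer:

```python
# Numerical sanity check of the arc-length inequality (illustrative only).
# Assumes f(x) = sin(x) on [a, b] = [0, 2]; any C^1 function would do.
import numpy as np

def arc_length(f_prime, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of sqrt(1 + f'(x)^2) over [a, b]."""
    dx = (b - a) / n
    x = a + (np.arange(n) + 0.5) * dx          # midpoints of the n subintervals
    return np.sum(np.sqrt(1.0 + f_prime(x) ** 2)) * dx

a, b = 0.0, 2.0
lhs = arc_length(np.cos, a, b) ** 2                 # (arc length)^2
rhs = (a - b) ** 2 + (np.sin(b) - np.sin(a)) ** 2   # (chord length)^2
print(lhs, rhs, lhs >= rhs)                         # the arc-length side should dominate
```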
Note that for every complex-valued integrable function $\phi:[a,b]\to\Bbb C$, it holds that
$$
\left|\int_a^b \phi(x)\, dx\right|\le \int_a^b|\phi(x)|\, dx.
$$
Let $\phi(x)=1+if'(x)$. Then we can see that
$$\begin{align*}
\left|\int_a^b \phi(x)\, dx\right|&=\left|(b-a)+i(f(b)-f(a))\right|\\&=\sqrt{(b-a)^2+(f(b)-f(a))^2}
\end{align*}$$
and
$$
\int_a^b|\phi(x)|\, dx=\int_a^b \sqrt{1+(f'(x))^2}\, dx.
$$
Now, the desired inequality follows.

answered 6 hours ago by Song
– Sangchul Lee (6 hours ago): (+1) Amazing, this should be the accepted answer! Anyway, is there any reason to work with $\mathbb{C}$ rather than $\mathbb{R}^2$, with $\phi(x) = \gamma'(x)$ and $\gamma(x) = (x, f(x))$?

– Song (6 hours ago): Umm, sorry, I see no specific reason, since both are essentially the same version of the triangle inequality in integral form. I just preferred the $\Bbb C$-version because it can be easily derived from the real triangle inequality: if $f$ is real-valued and integrable, then $\pm \int_a^b f\le \int_a^b |f|$. Thank you!

– Matematleta (4 hours ago): This is slick. I wish I could upvote this answer twice.
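The identity and the triangle inequality used in this answer can also be checked numerically; a minimal sketch, again assuming $f(x)=\sin x$ on $[0,2]$ purely for illustration:

```python
# Checks |integral of phi| <= integral of |phi| for phi(x) = 1 + i f'(x),
# with f = sin on [0, 2] (illustrative choice of f and interval).
import numpy as np

a, b, n = 0.0, 2.0, 200_000
dx = (b - a) / n
x = a + (np.arange(n) + 0.5) * dx              # midpoint grid

phi = 1.0 + 1j * np.cos(x)                     # phi(x) = 1 + i f'(x), f = sin
lhs = abs(np.sum(phi) * dx)                    # |integral of phi| ~ |(b-a) + i(f(b)-f(a))|
rhs = np.sum(np.abs(phi)) * dx                 # integral of |phi| = arc length

chord = np.hypot(b - a, np.sin(b) - np.sin(a)) # should match lhs up to discretization error
print(lhs, chord, rhs, lhs <= rhs)
```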
An easy way to do this is to note that since distance is invariant under rotations, without loss of generality we may assume that $f(a)=f(b).$ And now, since $\sqrt{1+(f'(x))^2}\ge 1$ on $[a,b]$, the function in $C^1([a,b])$ that minimizes the integral coincides with the function $f$ that minimizes the integrand pointwise, and clearly this happens when $f'(x)=0$ for all $x\in [a,b]$, that is, when $f$ is constant on $[a,b].$ Then $f(x)=f(a)$ and the result follows.

If you want to do this without the WLOG assumption, then argue as follows:
Let $\epsilon>0,\ f\in C^1([a,b])$, and choose a partition $P=\{a,x_1,\cdots,x_{n-2},b\}$.
The length of the polygonal path obtained by joining the points $(x_i,f(x_i))$ is $\sum_i \sqrt{(\Delta x_i)^2+(\Delta y_i)^2}$, and this is clearly $\ge \sqrt{(b-a)^2+(f(b)-f(a))^2}$. (You can make this precise by an induction argument on $n$, using the triangle inequality.)
And this is true for $\textit{any}$ partition $P$.
But the above sum is also $\sum_i\sqrt{1+\left(\frac{\Delta y_i}{\Delta x_i}\right)^2}\,\Delta x_i$, and now, upon applying the MVT, we see that what we have is a Riemann sum for $\sqrt{1+(f'(x))^2}$.
To finish, choose $P$ such that $\left|\int^b_a\sqrt{1+(f'(x))^2}\,dx- \sum_i\sqrt{1+(f'(c_i))^2}\,\Delta x_i \right|<\epsilon$. (The $c_i$ are the numbers $x_{i-1}<c_i<x_{i}$ obtained from the MVT.) Then
$$\sqrt{(b-a)^2+(f(b)-f(a))^2}\le \sum_i\sqrt{1+(f'(c_i))^2}\,\Delta x_i<\int^b_a\sqrt{1+(f'(x))^2}\,dx+\epsilon.$$
Since $\epsilon$ is arbitrary, the result follows after squaring.

For a slick way to do this, use a variational argument: assuming a minimizer $f$ exists, consider $f+t\phi$ where $t$ is a real parameter and $\phi$ is an arbitrary $C^1([a,b])$ function with $\phi(a)=\phi(b)=0$, so that the endpoints are unchanged.
Substitute it into the integral:
$l(t)=\int_a^b \sqrt{1+(f'+t\phi')^2}\,dx$.
Since $f$ minimizes this integral, the derivative of $l$ at $t=0$ must equal zero. Then
$0=l'(0)= \int_a^b \dfrac{f'\phi'}{\sqrt{1+(f')^2}}\,dx$.
After an integration by parts, we get
$\dfrac{f'}{\sqrt{1+(f')^2}} = c$ for some constant $c\in \mathbb R,$ from which it follows that $f'$ is constant. And this means, of course, that the graph of $f$ is a straight line connecting $(a,f(a))$ and $(b,f(b)).$ The desired inequality follows.

edited 7 hours ago, answered 9 hours ago by Matematleta
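The polygonal-path part of this argument is easy to see numerically: for any partition, the inscribed polygon is at least as long as the chord, and its length approaches the arc-length integral as the partition is refined. A small Python sketch (again with the arbitrary choice $f=\sin$ on $[0,2]$):

```python
# Inscribed polygonal lengths for f = sin on [0, 2]: each is >= the chord length,
# and they increase toward the arc length as the partition is refined (illustrative sketch).
import numpy as np

f, a, b = np.sin, 0.0, 2.0
chord = np.hypot(b - a, f(b) - f(a))           # straight-line distance between endpoints

for n in (1, 2, 4, 16, 256, 4096):
    x = np.linspace(a, b, n + 1)               # uniform partition with n subintervals
    poly = np.sum(np.hypot(np.diff(x), np.diff(f(x))))  # length of the inscribed polygon
    print(n, poly, poly >= chord)
```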
Expanding on what @Conrad said, the shortest distance between two points is the length of the straight line between them, and your RHS is exactly the square of that distance from $(a, f(a))$ to $(b, f(b))$.
Now if we assume $\left(\int_a^b\sqrt{1+(f'(x))^2}\,dx\right)^2 < (a-b)^2+(f(b)-f(a))^2$, then, since the integral on the left is the length of the curve joining these two points, we have contradicted the fact that the shortest distance between $(a, f(a))$ and $(b, f(b))$ is $\sqrt{(a-b)^2+(f(b)-f(a))^2}$. Therefore it must be the case that $\left(\int_a^b\sqrt{1+(f'(x))^2}\,dx\right)^2 \geq (a-b)^2+(f(b)-f(a))^2.$

answered 9 hours ago by se2018 (new contributor)