The mean value inequality and equalities
In this file we prove the following facts:
- `convex.norm_image_sub_le_of_norm_deriv_le` : if `f` is differentiable on a convex set `s`
  and the norm of its derivative is bounded by `C`, then `f` is Lipschitz continuous on `s`
  with constant `C`; also a variant in which what is bounded by `C` is the norm of the
  difference of the derivative from a fixed linear map (see the displayed inequality after
  this list).
- `image_le_of*`, `image_norm_le_of_*` : several similar lemmas deducing `f x ≤ B x` or
  `∥f x∥ ≤ B x` from upper estimates on `f'` or `∥f'∥`, respectively. These lemmas differ by
  their assumptions:
  - `of_liminf_*` lemmas assume that the limit inferior of some ratio is less than `B' x`;
  - `of_deriv_right_*`, `of_norm_deriv_right_*` lemmas assume that the right derivative
    or its norm is less than `B' x`;
  - `of_*_lt_*` lemmas assume a strict inequality whenever `f x = B x` or `∥f x∥ = B x`;
  - `of_*_le_*` lemmas assume a non-strict inequality everywhere on `[a, b)`;
  - the name of a lemma ends with `'` if (1) it assumes that `B` is continuous on `[a, b]`
    and has a right derivative at every point of `[a, b)`, and (2) the lemma has a
    counterpart assuming that `B` is differentiable everywhere on `ℝ`.
- `norm_image_sub_le_*_segment` : if the derivative of `f` on `[a, b]` is bounded above by a
  constant `C`, then `∥f x - f a∥ ≤ C * ∥x - a∥`; several versions deal with the right
  derivative and the derivative within `[a, b]` (`has_deriv_within_at` or `deriv_within`).
- `convex.is_const_of_fderiv_within_eq_zero` : if a function has derivative `0` on a convex
  set `s`, then it is a constant on `s`.
- `exists_ratio_has_deriv_at_eq_ratio_slope` and `exists_ratio_deriv_eq_ratio_slope` :
  Cauchy's Mean Value Theorem.
- `exists_has_deriv_at_eq_slope` and `exists_deriv_eq_slope` : Lagrange's Mean Value Theorem.
- `domain_mvt` : Lagrange's Mean Value Theorem, applied to a segment in a convex domain.
- `convex.image_sub_lt_mul_sub_of_deriv_lt`, `convex.mul_sub_lt_image_sub_of_lt_deriv`,
  `convex.image_sub_le_mul_sub_of_deriv_le`, `convex.mul_sub_le_image_sub_of_le_deriv` :
  if `∀ x, C (</≤/>/≥) (f' x)`, then `C * (y - x) (</≤/>/≥) (f y - f x)` whenever `x < y`.
- `convex.mono_of_deriv_nonneg`, `convex.antimono_of_deriv_nonpos`,
  `convex.strict_mono_of_deriv_pos`, `convex.strict_antimono_of_deriv_neg` : if the
  derivative of a function is non-negative/non-positive/positive/negative, then the function
  is monotone/monotonically decreasing/strictly monotone/strictly monotonically decreasing.
- `convex_on_of_deriv_mono`, `convex_on_of_deriv2_nonneg` : if the derivative of a function
  is increasing or its second derivative is nonnegative, then the original function is convex.
- `strict_fderiv_of_cont_diff` : a `C^1` function over the reals is strictly differentiable.
  (This is a corollary of the mean value inequality.)
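In more familiar notation, the estimate behind `convex.norm_image_sub_le_of_norm_deriv_le` (stated below in several `fderiv` flavours) reads: if `f` is differentiable on a convex set `s` and `∥f' x∥ ≤ C` for all `x ∈ s`, then

$$\|f(y) - f(x)\| \;\le\; C\,\|y - x\| \qquad \text{for all } x, y \in s,$$

i.e. `f` is Lipschitz on `s` with constant `C`.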
One-dimensional fencing inequalities
General fencing theorem for continuous functions with an estimate on the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `f a ≤ B a`;
- `B` has right derivative `B'` at every point of `[a, b)`;
- for each `x ∈ [a, b)` the right-side limit inferior of `(f z - f x) / (z - x)` is bounded above by a function `f'` (written out after this statement);
- we have `f' x < B' x` whenever `f x = B x`.

Then `f x ≤ B x` everywhere on `[a, b]`.
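Written out, the liminf hypothesis used in these lemmas requires, for every $x \in [a, b)$,

$$\liminf_{z \to x^{+}} \frac{f(z) - f(x)}{z - x} \;\le\; f'(x),$$

where `f'` is just a bounding function; in particular `f` itself is not assumed to be differentiable in the liminf-based versions.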
General fencing theorem for continuous functions with an estimate on the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `f a ≤ B a`;
- `B` has derivative `B'` everywhere on `ℝ`;
- for each `x ∈ [a, b)` the right-side limit inferior of `(f z - f x) / (z - x)` is bounded above by a function `f'`;
- we have `f' x < B' x` whenever `f x = B x`.

Then `f x ≤ B x` everywhere on `[a, b]`.
General fencing theorem for continuous functions with an estimate on the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `f a ≤ B a`;
- `B` has right derivative `B'` at every point of `[a, b)`;
- for each `x ∈ [a, b)` the right-side limit inferior of `(f z - f x) / (z - x)` is bounded above by `B'`.

Then `f x ≤ B x` everywhere on `[a, b]`.
General fencing theorem for continuous functions with an estimate on the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `f a ≤ B a`;
- `B` has right derivative `B'` at every point of `[a, b)`;
- `f` has right derivative `f'` at every point of `[a, b)`;
- we have `f' x < B' x` whenever `f x = B x`.

Then `f x ≤ B x` everywhere on `[a, b]`.
General fencing theorem for continuous functions with an estimate on the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `f a ≤ B a`;
- `B` has derivative `B'` everywhere on `ℝ`;
- `f` has right derivative `f'` at every point of `[a, b)`;
- we have `f' x < B' x` whenever `f x = B x`.

Then `f x ≤ B x` everywhere on `[a, b]`.
General fencing theorem for continuous functions with an estimate on the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `f a ≤ B a`;
- `B` has derivative `B'` everywhere on `ℝ`;
- `f` has right derivative `f'` at every point of `[a, b)`;
- we have `f' x ≤ B' x` on `[a, b)`.

Then `f x ≤ B x` everywhere on `[a, b]`.
Vector-valued functions `f : ℝ → E`
General fencing theorem for continuous functions with an estimate on the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `∥f a∥ ≤ B a`;
- `B` has right derivative `B'` at every point of `[a, b)`;
- for each `x ∈ [a, b)` the right-side limit inferior of `(∥f z∥ - ∥f x∥) / (z - x)` is bounded above by a function `f'`;
- we have `f' x < B' x` whenever `∥f x∥ = B x`.

Then `∥f x∥ ≤ B x` everywhere on `[a, b]`.
General fencing theorem for continuous functions with an estimate on the norm of the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `∥f a∥ ≤ B a`;
- `f` and `B` have right derivatives `f'` and `B'` respectively at every point of `[a, b)`;
- the norm of `f'` is strictly less than `B'` whenever `∥f x∥ = B x`.

Then `∥f x∥ ≤ B x` everywhere on `[a, b]`. We use one-sided derivatives in the assumptions to make this theorem work for piecewise differentiable functions.
General fencing theorem for continuous functions with an estimate on the norm of the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `∥f a∥ ≤ B a`;
- `f` has right derivative `f'` at every point of `[a, b)`;
- `B` has derivative `B'` everywhere on `ℝ`;
- the norm of `f'` is strictly less than `B'` whenever `∥f x∥ = B x`.

Then `∥f x∥ ≤ B x` everywhere on `[a, b]`. We use one-sided derivatives in the assumptions to make this theorem work for piecewise differentiable functions.
General fencing theorem for continuous functions with an estimate on the norm of the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `∥f a∥ ≤ B a`;
- `f` and `B` have right derivatives `f'` and `B'` respectively at every point of `[a, b)`;
- we have `∥f' x∥ ≤ B' x` everywhere on `[a, b)`.

Then `∥f x∥ ≤ B x` everywhere on `[a, b]`. We use one-sided derivatives in the assumptions to make this theorem work for piecewise differentiable functions.
General fencing theorem for continuous functions with an estimate on the norm of the derivative.
Let `f` and `B` be continuous functions on `[a, b]` such that

- `∥f a∥ ≤ B a`;
- `f` has right derivative `f'` at every point of `[a, b)`;
- `B` has derivative `B'` everywhere on `ℝ`;
- we have `∥f' x∥ ≤ B' x` everywhere on `[a, b)`.

Then `∥f x∥ ≤ B x` everywhere on `[a, b]`. We use one-sided derivatives in the assumptions to make this theorem work for piecewise differentiable functions.
A function on `[a, b]` with the norm of the right derivative bounded by `C` satisfies `∥f x - f a∥ ≤ C * (x - a)`.

A function on `[a, b]` with the norm of the derivative within `[a, b]` bounded by `C` satisfies `∥f x - f a∥ ≤ C * (x - a)`, `has_deriv_within_at` version.

A function on `[a, b]` with the norm of the derivative within `[a, b]` bounded by `C` satisfies `∥f x - f a∥ ≤ C * (x - a)`, `deriv_within` version.

A function on `[0, 1]` with the norm of the derivative within `[0, 1]` bounded by `C` satisfies `∥f 1 - f 0∥ ≤ C`, `has_deriv_within_at` version.

A function on `[0, 1]` with the norm of the derivative within `[0, 1]` bounded by `C` satisfies `∥f 1 - f 0∥ ≤ C`, `deriv_within` version.
Vector-valued functions `f : E → F`
The mean value theorem on a convex set: if the derivative of a function is bounded by `C`, then the function is `C`-Lipschitz. Version with `has_fderiv_within`.

The mean value theorem on a convex set: if the derivative of a function within this set is bounded by `C`, then the function is `C`-Lipschitz. Version with `fderiv_within`.

The mean value theorem on a convex set: if the derivative of a function is bounded by `C`, then the function is `C`-Lipschitz. Version with `fderiv`.

Variant of the mean value inequality on a convex set, using a bound on the difference between the derivative and a fixed linear map, rather than a bound on the derivative itself. Version with `has_fderiv_within`.

Variant of the mean value inequality on a convex set. Version with `fderiv_within`.

Variant of the mean value inequality on a convex set. Version with `fderiv`.
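Writing `φ` for the fixed continuous linear map (the notation here is illustrative), these variants state: if `∥f' x - φ∥ ≤ C` for every `x` in the convex set `s`, then

$$\|f(y) - f(x) - \varphi(y - x)\| \;\le\; C\,\|y - x\| \qquad \text{for all } x, y \in s.$$

Taking `φ = 0` recovers the basic `C`-Lipschitz estimate above.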
If a function has zero Fréchet derivative at every point of a convex set, then it is a constant on this set.
The mean value theorem on a convex set in dimension 1: if the derivative of a function is bounded by `C`, then the function is `C`-Lipschitz. Version with `has_deriv_within`.

The mean value theorem on a convex set in dimension 1: if the derivative of a function within this set is bounded by `C`, then the function is `C`-Lipschitz. Version with `deriv_within`.

The mean value theorem on a convex set in dimension 1: if the derivative of a function is bounded by `C`, then the function is `C`-Lipschitz. Version with `deriv`.
Functions `[a, b] → ℝ`
Cauchy's Mean Value Theorem, `has_deriv_at` version.

Cauchy's Mean Value Theorem, extended `has_deriv_at` version.

Lagrange's Mean Value Theorem, `has_deriv_at` version.

Cauchy's Mean Value Theorem, `deriv` version.

Cauchy's Mean Value Theorem, extended `deriv` version.
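For reference, with `a < b`, `f` (and `g`) continuous on `[a, b]` and differentiable on `(a, b)`, these theorems produce a point `c ∈ (a, b)` such that

$$\bigl(g(b) - g(a)\bigr)\,f'(c) \;=\; \bigl(f(b) - f(a)\bigr)\,g'(c) \qquad \text{(Cauchy, in the division-free ratio-slope form),}$$

$$f'(c) \;=\; \frac{f(b) - f(a)}{b - a} \qquad \text{(Lagrange).}$$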
Let `f` be a function continuous on a convex (or, equivalently, connected) subset `D` of the real line. If `f` is differentiable on the interior of `D` and `C < f'`, then `f` grows faster than `C * x` on `D`, i.e., `C * (y - x) < f y - f x` whenever `x, y ∈ D`, `x < y`.

Let `f : ℝ → ℝ` be a differentiable function. If `C < f'`, then `f` grows faster than `C * x`, i.e., `C * (y - x) < f y - f x` whenever `x < y`.
Let `f` be a function continuous on a convex (or, equivalently, connected) subset `D` of the real line. If `f` is differentiable on the interior of `D` and `C ≤ f'`, then `f` grows at least as fast as `C * x` on `D`, i.e., `C * (y - x) ≤ f y - f x` whenever `x, y ∈ D`, `x ≤ y`.

Let `f : ℝ → ℝ` be a differentiable function. If `C ≤ f'`, then `f` grows at least as fast as `C * x`, i.e., `C * (y - x) ≤ f y - f x` whenever `x ≤ y`.
Let `f` be a function continuous on a convex (or, equivalently, connected) subset `D` of the real line. If `f` is differentiable on the interior of `D` and `f' < C`, then `f` grows slower than `C * x` on `D`, i.e., `f y - f x < C * (y - x)` whenever `x, y ∈ D`, `x < y`.

Let `f : ℝ → ℝ` be a differentiable function. If `f' < C`, then `f` grows slower than `C * x`, i.e., `f y - f x < C * (y - x)` whenever `x < y`.
Let `f` be a function continuous on a convex (or, equivalently, connected) subset `D` of the real line. If `f` is differentiable on the interior of `D` and `f' ≤ C`, then `f` grows at most as fast as `C * x` on `D`, i.e., `f y - f x ≤ C * (y - x)` whenever `x, y ∈ D`, `x ≤ y`.

Let `f : ℝ → ℝ` be a differentiable function. If `f' ≤ C`, then `f` grows at most as fast as `C * x`, i.e., `f y - f x ≤ C * (y - x)` whenever `x ≤ y`.
Let `f` be a function continuous on a convex (or, equivalently, connected) subset `D` of the real line. If `f` is differentiable on the interior of `D` and `f'` is positive, then `f` is a strictly monotonically increasing function on `D`.

Let `f : ℝ → ℝ` be a differentiable function. If `f'` is positive, then `f` is a strictly monotonically increasing function.

Let `f` be a function continuous on a convex (or, equivalently, connected) subset `D` of the real line. If `f` is differentiable on the interior of `D` and `f'` is nonnegative, then `f` is a monotonically increasing function on `D`.

Let `f : ℝ → ℝ` be a differentiable function. If `f'` is nonnegative, then `f` is a monotonically increasing function.
Let `f` be a function continuous on a convex (or, equivalently, connected) subset `D` of the real line. If `f` is differentiable on the interior of `D` and `f'` is negative, then `f` is a strictly monotonically decreasing function on `D`.

Let `f` be a function continuous on a convex (or, equivalently, connected) subset `D` of the real line. If `f` is differentiable on the interior of `D` and `f'` is nonpositive, then `f` is a monotonically decreasing function on `D`.
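These monotonicity lemmas are essentially the `C = 0` case of the growth estimates above: if `0 ≤ f'` on the interior of `D`, then for `x, y ∈ D` with `x ≤ y` the estimate `C * (y - x) ≤ f y - f x` specializes to

$$0 \;\le\; f(y) - f(x),$$

i.e. `f` is monotonically increasing on `D`; the strict and decreasing versions follow in the same way from the strict and sign-reversed estimates.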
If a function `f` is continuous on a convex set `D ⊆ ℝ`, is differentiable on its interior, and `f'` is monotone on the interior, then `f` is convex on `D`.
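One standard way to see this (not necessarily the proof used here): for `x < y < z` in `D`, Lagrange's Mean Value Theorem provides `a ∈ (x, y)` and `b ∈ (y, z)` with

$$\frac{f(y) - f(x)}{y - x} \;=\; f'(a) \;\le\; f'(b) \;=\; \frac{f(z) - f(y)}{z - y},$$

where the middle inequality uses monotonicity of `f'`; this slope inequality is equivalent to convexity of `f` on `D`. A nonnegative second derivative gives monotonicity of `f'` by the lemmas above.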
Functions `f : E → ℝ`
Lagrange's Mean Value Theorem, applied to convex domains.
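Spelled out (the variable names are illustrative): if `f : E → ℝ` is differentiable on a convex set `s` and `x, y ∈ s`, then there is a point `c` on the segment joining `x` and `y` such that

$$f(y) - f(x) \;=\; f'(c)\,(y - x),$$

where `f'(c)` is the Fréchet derivative of `f` at `c`, applied to the vector `y - x`.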
Vector-valued functions `f : E → F`. Strict differentiability.
Over the reals, a continuously differentiable function is strictly differentiable.
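Here strict differentiability of `f` at `x` means the two-point estimate

$$f(p) - f(q) - f'(x)\,(p - q) \;=\; o\bigl(\|p - q\|\bigr) \qquad \text{as } (p, q) \to (x, x),$$

which is stronger than ordinary differentiability (recovered by fixing `q = x`); the mean value inequality is what upgrades continuity of the derivative to this two-point bound.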