From Wikipedia, the free encyclopedia
In mathematics, the Landau–Kolmogorov inequality, named after Edmund Landau and Andrey Kolmogorov, is the following family of interpolation inequalities between different derivatives of a function f defined on a subset T of the real numbers:[1]

$$\|f^{(k)}\|_{L_\infty(T)} \le C(n, k, T)\, \|f\|_{L_\infty(T)}^{1 - k/n}\, \|f^{(n)}\|_{L_\infty(T)}^{k/n}, \qquad 1 \le k < n.$$
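The sup-norm case is easy to sanity-check numerically. The sketch below is an illustration, not part of the article: the test function f(x) = sin x + 0.3 sin 2x and the grid resolution are arbitrary choices. It evaluates both sides of the inequality on R for n = 2, k = 1 (where, as noted below, the sharp constant is √2), approximating the sup norms of the periodic test function on a dense grid over one period.

```python
# Hedged numerical check of the Landau-Kolmogorov inequality on R with
# n = 2, k = 1 and sharp constant sqrt(2):
#     ||f'|| <= sqrt(2) * ||f||^(1/2) * ||f''||^(1/2)   (sup norms)
# The test function and grid size are illustrative choices.
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 100_000)  # one period of a 2*pi-periodic f
f = np.sin(x) + 0.3 * np.sin(2.0 * x)       # arbitrary smooth test function
f1 = np.cos(x) + 0.6 * np.cos(2.0 * x)      # exact first derivative
f2 = -np.sin(x) - 1.2 * np.sin(2.0 * x)     # exact second derivative

lhs = np.abs(f1).max()                      # approximates ||f'||_inf
rhs = np.sqrt(2.0) * np.abs(f).max() ** 0.5 * np.abs(f2).max() ** 0.5
print(lhs <= rhs)  # the theorem guarantees True for any such f
```

Since the inequality holds for every bounded f with bounded second derivative, the check must pass for any smooth periodic test function substituted above.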
For k = 1, n = 2 and T = [c,∞) or T = R, the inequality was first proved by Edmund Landau[2] with the sharp constants C(2, 1, [c,∞)) = 2 and C(2, 1, R) = √2. Following contributions by Jacques Hadamard and Georgiy Shilov, Andrey Kolmogorov found the sharp constants for arbitrary n and k:[3]

$$C(n, k, \mathbb{R}) = a_{n-k}\, a_n^{-1 + k/n},$$
where the $a_n$ are the Favard constants, given by $a_n = \frac{4}{\pi}\sum_{j=0}^{\infty}\left[\frac{(-1)^j}{2j+1}\right]^{n+1}$.
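Kolmogorov's formula is straightforward to evaluate numerically. The sketch below is an illustration; the series truncation length is an assumption, chosen so the slowly converging even-index series is accurate to a few parts in 10^7. It computes the Favard constants from their standard alternating series and recovers Landau's constant √2 for n = 2, k = 1.

```python
# Sketch: evaluate the Favard constants a_n and Kolmogorov's sharp constant
#     C(n, k, R) = a_{n-k} * a_n**(-1 + k/n).
# The truncation length is an illustrative assumption.
import numpy as np

def favard(n: int, terms: int = 2_000_000) -> float:
    """a_n = (4/pi) * sum_{j>=0} [(-1)^j / (2j+1)]^(n+1), truncated."""
    j = np.arange(terms)
    return float(4.0 / np.pi * np.sum(((-1.0) ** j / (2 * j + 1)) ** (n + 1)))

def sharp_constant(n: int, k: int) -> float:
    """Kolmogorov's sharp constant C(n, k, R) on the whole real line."""
    return favard(n - k) * favard(n) ** (-1.0 + k / n)

# Landau's case n = 2, k = 1 recovers C(2, 1, R) = sqrt(2).
print(sharp_constant(2, 1))  # ~1.4142
```

As a cross-check, favard(1) approximates π/2 and favard(2) approximates π²/8, the first two Favard constants.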
Following work by Matorin and others, the extremising functions were found by Isaac Jacob Schoenberg;[4] explicit forms for the sharp constants, however, are still unknown.
There are many generalisations, which are of the form

$$\|f^{(k)}\|_{L_q(T)} \le K\, \|f\|_{L_p(T)}^{\alpha}\, \|f^{(n)}\|_{L_r(T)}^{1-\alpha}.$$
Here all three norms can be different from each other (from L1 to L∞, with p=q=r=∞ in the classical case) and T may be the real axis, semiaxis or a closed segment.
The Kallman–Rota inequality generalizes the Landau–Kolmogorov inequalities from the derivative operator to more general contractions on Banach spaces.[5]