5

The standard definition is that a function $f:\mathbb{R}^n\to \mathbb{R}$ is differentiable at a point $x$ if there exists a linear map $\mathrm{d}f_x: \mathbb{R}^n \to \mathbb{R}$ such that

$$f(x+h) = f(x) + \mathrm{d}f_x(h) + \epsilon \|h\|$$

where $\epsilon\to 0$ as $h\to 0$. This is stronger than the existence of all partial (or directional) derivatives, but weaker than their continuity. However, when talking about higher differentiability, one usually switches to partial derivatives, requiring them to be continuous in order to prove basic properties.
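For concreteness, here is a quick numerical sketch (a standard example, not from this thread) of the gap between directional derivatives and differentiability: $f(x,y) = x^2y/(x^2+y^2)$ has every directional derivative at the origin, but $v \mapsto D_vf(0)$ is not linear, so no linear map $\mathrm{d}f_0$ can exist.

```python
import math

def f(x, y):
    # Standard example: all directional derivatives exist at the origin,
    # yet f is not differentiable there.
    return 0.0 if (x, y) == (0.0, 0.0) else x * x * y / (x * x + y * y)

def directional_derivative(vx, vy, t=1e-8):
    # Difference quotient f(tv)/t approximating D_v f(0).
    return f(t * vx, t * vy) / t

d_e1 = directional_derivative(1.0, 0.0)    # derivative along e_1 is 0
d_e2 = directional_derivative(0.0, 1.0)    # derivative along e_2 is 0
d_diag = directional_derivative(1.0, 1.0)  # along (1,1) it is 1/2, not 0 + 0
print(d_e1, d_e2, d_diag)
```

Since $D_{(1,1)}f(0) = \tfrac12 \neq D_{e_1}f(0) + D_{e_2}f(0)$, the directional derivatives fail linearity and differentiability fails.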

Suppose that instead we define $f$ to be "twice differentiable" at $x$ if in addition to $\mathrm{d}f_x$ as above there exists a quadratic form $\mathrm{d}^2f_x$ such that

$$f(x+h) = f(x) + \mathrm{d}f_x(h) + \frac{1}{2}\mathrm{d}^2f_x(h) + \epsilon \|h\|^2$$

where $\epsilon\to 0$ as $h\to 0$. This is true if $f$ has continuous second-order partials (it's the multidimensional Taylor expansion, with $\mathrm{d}^2f_x$ the Hessian matrix).
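As a sanity check on that claim, here is a small numerical sketch (an added illustration; the function $f(x,y)=\sin(x)e^y$ and base point are chosen arbitrarily): for a $C^2$ function, the remainder of the quadratic expansion divided by $\|h\|^2$ should tend to $0$.

```python
import math

# For the C^2 function f(x,y) = sin(x) e^y, the second-order Taylor
# remainder should be o(|h|^2) at any point; we check at a = (0.3, 0.2).
a = (0.3, 0.2)

def f(x, y):
    return math.sin(x) * math.exp(y)

# Gradient and Hessian of f at a, computed by hand.
fx = math.cos(a[0]) * math.exp(a[1])
fy = math.sin(a[0]) * math.exp(a[1])
fxx = -math.sin(a[0]) * math.exp(a[1])
fxy = math.cos(a[0]) * math.exp(a[1])
fyy = math.sin(a[0]) * math.exp(a[1])

def remainder_ratio(h1, h2):
    # |f(a+h) - quadratic Taylor polynomial| / |h|^2
    quad = fxx * h1 * h1 + 2 * fxy * h1 * h2 + fyy * h2 * h2
    taylor = f(*a) + fx * h1 + fy * h2 + 0.5 * quad
    return abs(f(a[0] + h1, a[1] + h2) - taylor) / (h1 * h1 + h2 * h2)

ratios = [remainder_ratio(10.0**-k, 10.0**-k) for k in range(1, 5)]
print(ratios)  # shrinks roughly linearly in |h|, hence tends to 0
```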

  1. Does this imply that all second-order partial derivatives of $f$ exist?
  2. If so, does it imply that the mixed second-order partials are equal?
Mike Shulman
  • I am not sure. I will think some about this. My definition of higher order differentiability would be the following:

    First $f$ needs to be differentiable in a neighborhood of $p$. Then $f$ is twice differentiable at $p$ if there is a bilinear form $D^2f(p)$ such that $Df(p+v_1)(v_2) = Df(p)(v_2)+D^2f(p)(v_1,v_2)+\epsilon |v_1||v_2|$, where $\epsilon\to 0$ as $v_1\to 0$.

    For some reason it is very hard to find sources where people talk about the second derivative as a bilinear form rather than just a quadratic form.

    – Steven Gubkin May 06 '14 at 21:31
  • You may also be interested in my online course with Jim Fowler here: http://ximera.osu.edu/course/kisonecat/m2o2c2/course/activity/week1/. It is a little buggy, and a little sparse in terms of examples, but it covers this perspective on higher order derivatives in a kind of "interactive textbook" format. – Steven Gubkin May 06 '14 at 21:36
  • See Chapter XIII, sections 3--6 of Lang's "Real and Functional Analysis" for a very elegant treatment under which higher derivatives are defined via multilinear maps and moreover the $p$th higher derivative is genuinely the "derivative" of the $(p-1)$th. One virtue of this approach is that it permits both a formulation and proof of the higher-dimensional Taylor formula which looks and feels exactly like the 1-dimensional case (with the mess of factorials hidden away within a clean formalism); of course, one can bust out coordinates and recover the usual messier explicit version from that. – user76758 May 06 '14 at 21:47
  • I sort of doubt this even implies that $f$ is differentiable except at $x$. For example, let $n = 1$, let $w(t)$ be your typical nowhere-differentiable continuous (i.e. Weierstrass) function, and let $f(t) = t^2 + t^3 w(t)$. Then this is "twice differentiable" according to you, but only at $t = 0$. – Ryan Reich May 06 '14 at 21:54
  • @StevenGubkin: Toby Bartels has convinced me that the best point of view on the second derivative is as the second differential $\mathrm{d}^2f = \sum_{i,j}\partial_{i,j}f\,\mathrm{d}x_i\,\mathrm{d}x_j + \sum_i \partial_i f\,\mathrm{d}^2x_i$. E.g. this has the advantage of yielding the correct chain rule when you "calculate by substitution". But I don't really know how to express that as any sort of approximation. – Mike Shulman May 07 '14 at 03:09
  • Fedor Petrov posted this on MO's big list of common false beliefs in mathematics: http://mathoverflow.net/a/25899/5963. – Noah Stein May 07 '14 at 10:15
  • The first derivative of $f$ at $a$ is the linear function of $h$ that best approximates $f(a+h)-f(a)$. The second derivative of $f$ at $a$ is the bilinear function of $(h_1,h_2)$ that best approximates $f(a+h_1+h_2)-f(a+h_1)-f(a+h_2)+f(a)$. Etc. – Tom Goodwillie May 07 '14 at 12:13
  • @MikeShulman Do you have any references for this point of view? Of course, the first derivatives must be present because this is how second derivatives transform. So, invariantly, jet bundles should come into the picture somehow. – Steven Gubkin May 07 '14 at 15:00
  • Maybe you could get Toby to write an answer to his question here http://mathoverflow.net/questions/60474/is-there-a-convenient-differential-calculus-for-cojets/133864#133864, if he has worked something out? – Steven Gubkin May 07 '14 at 15:01
  • @StevenGubkin Making formal sense of this point of view is still kind of a work in progress, but you can see what we've got at http://ncatlab.org/nlab/show/cogerm+differential+form and the nForum discussions linked at the bottom, plus http://nforum.mathforge.org/discussion/5817/cojet-differential-forms/. One issue is that so far (related to this question) it makes the most sense for things that are already known to be smooth; the cogerm differential is a generalization of "all the directional derivatives", not necessarily implying true differentiability. – Mike Shulman May 07 '14 at 15:22
  • @TomGoodwillie (1) is there a reference that develops higher derivatives using that point of view? (2) can you rephrase it in terms of the quadratic form? (3) does second-differentiability in that sense imply equality of mixed partials? – Mike Shulman May 07 '14 at 15:53
  • (1) Maybe the Lang chapters mentioned above? (2) I don't think so. (3) Yes: $f(a+h_1+h_2)-f(a+h_1)-f(a+h_2)+f(a)$ is a symmetric function of $h_1$ and $h_2$ and therefore so is its best bilinear approximation. – Tom Goodwillie May 07 '14 at 17:34
  • @TomGoodwillie I'm looking at the Lang chapters now, and I only see the second derivative of $f:E\to F$ defined as the derivative of $D f : E \to L(E,F)$ (for $E$ and $F$ vector spaces). He does of course use the expression $f(x+v+w)-f(x+w)-f(x+v)+f(x)$ in proving symmetry of a continuous second derivative, but I don't see it in a characterization of second differentiability. – Mike Shulman May 10 '14 at 00:54
  • I don't know the Lang book. I was just going by the comment above. – Tom Goodwillie May 10 '14 at 01:21
  • In case anyone is still watching this question, I am puzzled. On p. 183 of Strichartz' "The Way of Analysis", after proving the second-order Taylor approximation to a $C^2$ function, he says "We will not discuss the problem of obtaining a converse kind of statement, deducing the existence of the second derivative from the existence of quadratic polynomial approximations, because such theorems are extremely difficult to prove and have few applications." But that implies that such theorems exist, which the answers to this question seem to deny. Can anyone guess what he had in mind? – Mike Shulman May 12 '14 at 22:16
  • Even for $n=1$, it does not even imply continuity at points other than $x$, so that second-order differentiability does not even make sense there. Standard example: $f(t):=e^{-1/t^2}\chi_{\mathbb{Q}}$, which has a polynomial expansion of every order at $0$, and is discontinuous at every other point. – Pietro Majer Sep 17 '14 at 11:20

3 Answers

8

The funny thing is that I got almost the same question from one of my students last semester. The idea was to neglect both the linear and quadratic parts of the hypothetical expansion (without loss of generality, we may assume they are zero), and focus on $\epsilon \|h\|^2$. We came to the following (obvious) one-dimensional counterexample:

$$f(h) = \begin{cases} h^3 \sin(1/h), & h \neq 0, \\ 0, & h = 0. \end{cases}$$
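A quick numerical sketch of why this works (an added illustration): the expansion holds at $0$ with $\mathrm{d}f_0 = \mathrm{d}^2f_0 = 0$, since $f(h)/h^2 = h\sin(1/h) \to 0$, yet the quotient $f'(t)/t = 3t\sin(1/t) - \cos(1/t)$ oscillates, so $f''(0)$ does not exist.

```python
import math

def f(h):
    # The counterexample: f(h) = h^3 sin(1/h) for h != 0, and f(0) = 0.
    return 0.0 if h == 0.0 else h**3 * math.sin(1.0 / h)

def fprime(t):
    # Hand-computed derivative for t != 0: f'(t) = 3t^2 sin(1/t) - t cos(1/t).
    return 3 * t * t * math.sin(1 / t) - t * math.cos(1 / t)

# The quadratic expansion holds with zero linear and quadratic parts:
# f(h)/h^2 = h sin(1/h) -> 0 as h -> 0.
flat = [f(10.0**-k) / (10.0**-k) ** 2 for k in range(1, 6)]

# But f''(0) = lim f'(t)/t cannot exist: the quotient 3t sin(1/t) - cos(1/t)
# tends to -1 along t = 1/(2k*pi) and to +1 along t = 1/((2k+1)*pi).
along_minus = [fprime(1 / (2 * k * math.pi)) * (2 * k * math.pi) for k in (10, 100, 1000)]
along_plus = [fprime(1 / ((2 * k + 1) * math.pi)) * ((2 * k + 1) * math.pi) for k in (10, 100, 1000)]
print(flat, along_minus, along_plus)
```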

  • This example makes it clear that the issue is not that the first derivative might fail to exist (except at the point in question), so that it can't possibly have its own derivative. In this case, the first derivative exists everywhere and is even continuous, just not differentiable at zero. – Toby Bartels Apr 08 '19 at 18:15
5

No, if I understand the question right, not even in one variable. Let $f(x)$ be $x^3g(x)$ where $g$ is bounded and, except at $x=0$, infinitely differentiable. If $g$ oscillates fast enough near $x=0$ then even though it is the sum of a degree two polynomial (namely zero) and an error term of the kind you are allowing, it will not have a second derivative at $0$.

3

Thanks Michal and Tom for the answers. Let me improve on them slightly with an example of a function which is well-approximated by a polynomial of every degree near 0 (in fact, the zero polynomial), but is still not even twice differentiable there:

$$f(x) = e^{-1/x^2} \sin(e^{1/x^2}) $$
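A numerical sketch of why this example works (an added illustration, with the sample points $x_k = 1/\sqrt{\ln(k\pi)}$ chosen for convenience): $|f(x)| \le e^{-1/x^2}$, so the zero polynomial approximates $f$ to every order at $0$; but $f'(x) = (2/x^3)\bigl(e^{-1/x^2}\sin(e^{1/x^2}) - \cos(e^{1/x^2})\bigr)$, and at $x_k$ we have $e^{1/x_k^2} = k\pi$, so $|f'(x_k)/x_k| = 2/x_k^4 + o(1)$ is unbounded and $f''(0)$ cannot exist.

```python
import math

def f(x):
    # Flat to all orders at 0, yet not twice differentiable there:
    # f(x) = exp(-1/x^2) sin(exp(1/x^2)), f(0) = 0.
    return 0.0 if x == 0.0 else math.exp(-1 / x**2) * math.sin(math.exp(1 / x**2))

def fprime(x):
    # Hand-computed derivative for x != 0; note exp(-1/x^2) = 1/e below.
    e = math.exp(1 / x**2)
    return (2 / x**3) * (math.sin(e) / e - math.cos(e))

# Along x_k = 1/sqrt(ln(k*pi)) we have exp(1/x_k^2) = k*pi, so cos(k*pi) = +/-1
# and |f'(x_k)/x_k| is approximately 2/x_k^4, which grows without bound.
quotients = []
for k in (10**3, 10**5, 10**7):
    x = 1 / math.sqrt(math.log(k * math.pi))
    quotients.append(abs(fprime(x) / x))
print(quotients)  # increasing without bound, so f''(0) cannot exist
```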

Mike Shulman