Can derivatives be defined as anti-integrals?




I see integrals defined as anti-derivatives but for some reason I haven't come across the reverse. Both seem equally implied by the fundamental theorem of calculus.



This emerged as a sticking point in this question.










calculus · terminology

asked 8 hours ago by mjc
  • In a certain sense this is precisely what we do when we give coordinate-independent definitions of the divergence and curl of a vector field: we can define them as the limit of flux or circulation integrals taken over smaller and smaller regions. You see this approach a lot in engineering and physics books.
    – symplectomorphic
    8 hours ago




















4 Answers

Let $f(x)=0$ for all real $x$.



Here is one anti-integral for $f$:



$$ g(x) = \begin{cases} x & \text{when } x \in \mathbb Z \\ 0 & \text{otherwise} \end{cases} $$
in the sense that $\int_a^b g(x)\,dx = f(b)-f(a)$ for all $a,b$.



How do you explain that the slope of $f$ at $x=5$ is not $g(5)=5$?





The idea works better if we restrict all the functions we ever look at to "sufficiently nice" ones -- for example, we could insist that everything is real analytic.



Merely looking for a continuous anti-integral wouldn't suffice to recover the usual concept of derivative, because then something like
$$ x \mapsto \begin{cases} 0 & \text{when } x=0 \\ x^2\sin(1/x) & \text{otherwise} \end{cases} $$
wouldn't have a derivative on $\mathbb R$ (which it does by the usual definition).
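The pathological anti-integral above can be checked numerically. Here is a minimal sketch (the helper names `g` and `left_riemann_sum` are mine, and the finite sum is only a stand-in for the limit of Riemann sums):

```python
def g(x):
    # the pathological anti-integral: g(x) = x at integers, 0 elsewhere
    return x if x == int(x) else 0.0

def left_riemann_sum(func, a, b, n):
    # left Riemann sum with n subintervals of [a, b]
    h = (b - a) / n
    return sum(func(a + i * h) for i in range(n)) * h

# With n = 10007 (a prime), the sample points i*(6/10007) never land on a
# nonzero integer, so the sum is 0 = f(6) - f(0) for f identically zero...
print(left_riemann_sum(g, 0.0, 6.0, 10007))

# ...yet g(5) = 5, while the slope of f at x = 5 is 0.
print(g(5))
```

Since $g$ is bounded and discontinuous only on a set of measure zero, it is Riemann integrable with integral $0$ over every interval, which is exactly what the sums converge to.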






answered 8 hours ago, edited 7 hours ago by Henning Makholm















  • Is this a restriction on the FTC? It sounds like you're saying integration and differentiation aren't always one-to-one inverses of each other. Would it be fair to say that integration and differentiation each have (at least) two definitions: one via limits, one as the inverse of each other; and the latter definition is less general?
    – mjc, 8 hours ago

  • @mjc: Indeed they are not exact inverse operations -- it's an elementary caveat that a function that has an antiderivative $F$ also has many other antiderivatives, namely $F+c$ for every constant $c$! In other words, "differentiation" is not an injective operation, so strictly speaking it cannot have an inverse operation. In the same way, a function can have several anti-(indefinite-)integrals -- this is less well known only because anti-integral is not a very common concept.
    – Henning Makholm, 7 hours ago

  • Good textbooks will phrase the FTC very carefully such that these observations don't invalidate what it says. It will be a good exercise to look very closely at the FTC in your favorite textbook and figure out exactly how, say, the existence of many different antiderivatives doesn't conflict with the claim.
    – Henning Makholm, 7 hours ago

  • If $c$ could be any constant, does that mean that every integral is surjective on $\mathbb N$?
    – mjc, 6 hours ago

  • @mjc: No. Just because for every number there is some indefinite integral that has that number as a value doesn't mean that each indefinite integral separately takes all values. For example, take the constant function $f(x)=0$. Every constant function is an indefinite integral of this $f$, but a constant function is very far from being surjective.
    – Henning Makholm, 6 hours ago




















In a sense your question is very natural. Let's take an informal approach to it, and then see where the technicalities arise. (That's how a lot of research mathematics works, by the way! Have an intuitive idea, and then try to implement it carefully. The devil is always in the details.)



So, one way to tell the familiar story of one-variable calculus is as follows:




  1. Define the derivative $f'$ of a function $f$ as the limit of the difference quotient, $h^{-1}(f(x+h)-f(x))$, as $hto0$.

  2. Define an anti-derivative of a function $f$ as a function $F$ for which $F'=f$.

  3. Define the definite integral of a function $f$ over $[a,b]$, say as the limit of Riemann sums.

  4. Discover that (2) and (3) are related, in the sense that
    $$\int_a^b f = F(b)-F(a)$$
    so long as $F$ is any anti-derivative of $f$.




Now, your idea is that you can imagine doing this the other way around, as follows:




  1. Define the definite integral of a function $f$ over an interval $[a,b]$, say as a limit of Riemann sums.

  2. Define an anti-integral of a function $F$ as a function $f$ for which
    $$F(x)-F(0)=\int_0^x f$$

  3. Define the derivative of a function, as the limit of the difference quotient.

  4. Discover that (2) and (3) are related, in the sense that
    one anti-integral of $f$ is just $f'$, so long as $f'$ is defined.
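Both stories can be sketched numerically. In this illustration (the helper names are mine; a finite difference quotient and a midpoint Riemann sum stand in for the true limits):

```python
import math

def derivative(f, x, h=1e-6):
    # the limit of the difference quotient, approximated with a small h
    return (f(x + h) - f(x)) / h

def integral(f, a, b, n=100_000):
    # the limit of Riemann sums, approximated with a midpoint sum
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f, F = math.sin, lambda x: -math.cos(x)   # F is an anti-derivative of f

# standard story, step 4: the definite integral equals F(b) - F(a)
print(integral(f, 0.0, 1.0), F(1.0) - F(0.0))

# flipped story, step 4: f' is an anti-integral of f,
# i.e. f(x) - f(0) equals the integral of f' from 0 to x
print(f(1.0) - f(0.0), integral(lambda t: derivative(f, t), 0.0, 1.0))
```

In both directions the two printed numbers agree to many digits, which is the Fundamental Theorem showing up numerically for this well-behaved $f$.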




The trouble in both stories arises in steps 2 and 4. (Step 4, in both stories, is a form of the Fundamental Theorem.)



The Problem with Step 2



In both the standard and the flipped story, step 2 poses existence and uniqueness problems.



In the standard story, an anti-derivative of $f$ may not even exist; one sufficient condition is to require that $f$ be continuous, but that is not necessary. And even if you do require that $f$ be continuous, you're always going to have non-uniqueness. Thus "anti-differentiation" construed as an operation is not really a bona fide "inverse" operation, because it is not single-valued. Or in other words, differentiation is not injective: it identifies many different functions. (Exactly which functions it identifies depends on the topology of the domain they're defined on.)



In the flipped story, again note that we certainly will never have uniqueness. Given any anti-integral $f$, you can find infinitely many others by changing the values of $f$ at a set of measure zero. We also aren't guaranteed existence of an anti-integral for a given $F$, and this time not even the continuity of $F$ will serve as a sufficient condition. What we need is even stronger, "absolute continuity."



The Problem with Step 4



In the standard story, the catch is in "so long as $F$ is any anti-derivative of $f$." The problem is that not every Riemann integrable function has an anti-derivative. If we want to guarantee an anti-derivative, we can impose the additional hypothesis that $f$ is continuous (which is again sufficient but not necessary).



A similar problem arises in the flipped scenario: given an arbitrary $f$, it might not have an anti-integral. The fundamental theorem for Lebesgue integrals shows that it's both necessary and sufficient to require that $f$ be absolutely continuous. But given the fact that integrals are not sensitive to values on a set of measure zero, the best conclusion we can draw in that case is that an anti-integral of $f$ equals $f'$ "almost everywhere" (meaning, everywhere except on a set of measure zero).






























    From the point of view of analysis (as hinted at in Henning Makholm's answer) the issue is that the mapping $I: f' \to f$ is extremely not one-to-one. When you try to invert it, you find that a great many functions are possible "anti-integrals" of a given function. While this does occur for $d: f \to f'$ as well, there is a robust mathematical theory about how to address this and how to describe the set of anti-derivatives of a given function. For example, if $f$ is defined on $[a,b]$ then all antiderivatives of $f$ are of the form $$F_i(x)=c_i + \int_a^x f(t)\,dt$$ for constants $c_i$. Although in some contexts the situation becomes more complicated (for example, if we look at $1/x$ defined on $[-1,0)\cup(0,1]$ then you have two constants, one for each side) there is a whole field that studies what happens for various domains.



    The situation for inverting $I$ is a lot less rosy. For one thing, if you take any finite subset of the domain you can move the function's values around however you like without changing the integral. More generally, as long as two functions agree except on a set of measure zero they will have the same integral. As far as I know there are no known ways to fruitfully analyze such a set of functions (a statement that has deep repercussions in machine learning and functional analysis).
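The first point can be seen concretely: changing a function at a single point is invisible to the integral. A small sketch (names mine; a midpoint Riemann sum stands in for the integral, and with an even number of subintervals the altered point is simply never sampled):

```python
def f(x):
    return x * x

def f_tweaked(x):
    # identical to f except at the single point x = 0.5 (a set of measure zero)
    return 100.0 if x == 0.5 else f(x)

def midpoint_sum(func, a, b, n):
    # midpoint Riemann sum with n subintervals of [a, b]
    h = (b - a) / n
    return sum(func(a + (i + 0.5) * h) for i in range(n)) * h

# with n = 1000 even, no midpoint (i + 0.5)/1000 equals 0.5,
# so both sums are identical term by term
print(midpoint_sum(f, 0.0, 1.0, 1000))
print(midpoint_sum(f_tweaked, 0.0, 1.0, 1000))
```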



    A second issue is that integrating doesn't always ensure that you can differentiate. There are a wide variety of functions $f$ such that the anti-integral doesn't (or doesn't have to) produce a differentiable function! For example, if $1_{\mathbb Q}$ denotes the function that takes on the value $1$ on rational inputs and $0$ on irrational inputs, this function has a Lebesgue integral of $0$ (a similar example works for the Riemann integral but it's more work). If you take the anti-integral of $f(x)=0$ and get $1_{\mathbb Q}$, you can't differentiate and get back $f(x)=0$, because $1_{\mathbb Q}$ is not differentiable.



    A commenter mentions vector calculus, and it is true that something like this happens in vector calculus, but with a couple of massive caveats.






























      Not exactly an answer to your question, but related:



      One thing that has always bugged me, in a sense, is this: when you have higher-order derivatives of $f(x)$, instead of writing more and more primes as in $f''(x)$ and $f'''(x)$, people start using a number in parentheses. So the 5th-order derivative is $f^{(5)}(x)$.



      However, this notation doesn't translate back to integrals, for some reason. If you have a function $f(x)$, and $f^{(1)}(x)$ is the first derivative of $f$, then wouldn't it make sense to denote the anti-derivative/integral of $f$ as $f^{(-1)}(x)$?



      So for example if $f(x)$ is some acceleration function, then the speed function is $f^{(-1)}(x)$ and the position function is $f^{(-2)}(x)$. I think this is good notation.
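As a toy illustration of this proposed notation (the helper name is mine; a trapezoid sum stands in for the integral, and both "negative-order derivatives" are taken from rest at $0$):

```python
def integrate_from_zero(f, x, n=200):
    # numerically integrate f from 0 to x with the trapezoid rule
    h = x / n
    return sum((f(i * h) + f((i + 1) * h)) / 2.0 for i in range(n)) * h

accel = lambda t: 2.0                                   # f(t) = 2
velocity = lambda t: integrate_from_zero(accel, t)      # "f^(-1)": 2t
position = lambda t: integrate_from_zero(velocity, t)   # "f^(-2)": t**2

print(velocity(3.0))   # ~6.0
print(position(3.0))   # ~9.0
```

The trapezoid rule is exact on constants and (near-)linear integrands, so the two values come out essentially at $2\cdot 3 = 6$ and $3^2 = 9$.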





















      • This notation does appear to be used in the contexts of repeated integration and fractional calculus. That said, this really seems like a comment more than an answer...
        – Semiclassical, 8 hours ago

      • You may find this of interest.
        – J.G., 8 hours ago














      Your Answer








      StackExchange.ready(function() {
      var channelOptions = {
      tags: "".split(" "),
      id: "69"
      };
      initTagRenderer("".split(" "), "".split(" "), channelOptions);

      StackExchange.using("externalEditor", function() {
      // Have to fire editor after snippets, if snippets enabled
      if (StackExchange.settings.snippets.snippetsEnabled) {
      StackExchange.using("snippets", function() {
      createEditor();
      });
      }
      else {
      createEditor();
      }
      });

      function createEditor() {
      StackExchange.prepareEditor({
      heartbeatType: 'answer',
      autoActivateHeartbeat: false,
      convertImagesToLinks: true,
      noModals: true,
      showLowRepImageUploadWarning: true,
      reputationToPostImages: 10,
      bindNavPrevention: true,
      postfix: "",
      imageUploader: {
      brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
      contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/4.0/"u003ecc by-sa 4.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
      allowUrls: true
      },
      noCode: true, onDemand: true,
      discardSelector: ".discard-answer"
      ,immediatelyShowMarkdownHelp:true
      });


      }
      });















      draft saved

      draft discarded
















      StackExchange.ready(
      function () {
      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3354566%2fcan-derivatives-be-defined-as-anti-integrals%23new-answer', 'question_page');
      }
      );

      Post as a guest















      Required, but never shown

























      4 Answers
      4






      active

      oldest

      votes








      4 Answers
      4






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      7














      $begingroup$

      Let $f(x)=0$ for all real $x$.



      Here is one anti-integral for $f$:



      $$ g(x) = begin{cases} x &text{when }xinmathbb Z \ 0 & text{otherwise} end{cases} $$
      in the sense that $int_a^b g(x),dx = f(b)-f(a)$ for all $a,b$.



      How do you explain that the slope of $f$ at $x=5$ is not $g(5)=5$?





      The idea works better if we restrict all the functions we ever look at to "sufficiently nice" ones -- for example, we could insist that everything is real analytic.



      Merely looking for a continuous anti-integral wouldn't suffice to recover the usual concept of derivative, because then something like
      $$ x mapsto begin{cases} 0 & text{when }x=0 \ x^2sin(1/x) & text{otherwise} end{cases} $$
      wouldn't have a derivative on $mathbb R$ (which is does by the usual definition).






      share|cite|improve this answer











      $endgroup$















      • $begingroup$
        Is this a restriction on the FTC? It sounds like you're saying integration and differentiation aren't always one-to-one inverses of each other. Would it be fair to say that integration and differentiation each have (at least) two definitions: one via limits, one as the inverse of each other; and the latter definition is less general?
        $endgroup$
        – mjc
        8 hours ago






      • 2




        $begingroup$
        @mjc: Indeed they are not exact inverse operations -- it's an elementary caveat that a function that has an antiderivative $F$ also have many other antiderivatives, namely $F+c$ for every constant $c$! In other words, "differentiation" is not an injective operation, so strictly speaking it cannot have an inverse operation. In the same way, a function can have several anti-(indefinite-)integrals -- this is less well known only because anti-integral is not a very common concept.
        $endgroup$
        – Henning Makholm
        7 hours ago








      • 2




        $begingroup$
        Good textbooks will phrase the FTC very carefully such that these observations doesn't invalidate what it says. It will be a good exercise to look very closely at the FTC in your favorite textbook and figure out exactly how, say, the existence of many different antiderivatives doesn't conflict with the claim.
        $endgroup$
        – Henning Makholm
        7 hours ago










      • $begingroup$
        If $c$ could be any constant, does that mean that every integral is surjective on N?
        $endgroup$
        – mjc
        6 hours ago








      • 1




        $begingroup$
        @mjc: No. Just because for every number there is some indefinite integral that has that number as a value, doesn't mean that each indefinite integral separately takes all values. For example, take the constant function $f(x)=0$. Every constant function is an indefinite integral of this $f$, but a constant function is very far from being surjective.
        $endgroup$
        – Henning Makholm
        6 hours ago
















      7














      $begingroup$

      Let $f(x)=0$ for all real $x$.



      Here is one anti-integral for $f$:



      $$ g(x) = begin{cases} x &text{when }xinmathbb Z \ 0 & text{otherwise} end{cases} $$
      in the sense that $int_a^b g(x),dx = f(b)-f(a)$ for all $a,b$.



      How do you explain that the slope of $f$ at $x=5$ is not $g(5)=5$?





      The idea works better if we restrict all the functions we ever look at to "sufficiently nice" ones -- for example, we could insist that everything is real analytic.



      Merely looking for a continuous anti-integral wouldn't suffice to recover the usual concept of derivative, because then something like
      $$ x mapsto begin{cases} 0 & text{when }x=0 \ x^2sin(1/x) & text{otherwise} end{cases} $$
      wouldn't have a derivative on $mathbb R$ (which is does by the usual definition).






      share|cite|improve this answer











      $endgroup$















      • $begingroup$
        Is this a restriction on the FTC? It sounds like you're saying integration and differentiation aren't always one-to-one inverses of each other. Would it be fair to say that integration and differentiation each have (at least) two definitions: one via limits, one as the inverse of each other; and the latter definition is less general?
        $endgroup$
        – mjc
        8 hours ago






      • 2




        $begingroup$
        @mjc: Indeed they are not exact inverse operations -- it's an elementary caveat that a function that has an antiderivative $F$ also have many other antiderivatives, namely $F+c$ for every constant $c$! In other words, "differentiation" is not an injective operation, so strictly speaking it cannot have an inverse operation. In the same way, a function can have several anti-(indefinite-)integrals -- this is less well known only because anti-integral is not a very common concept.
        $endgroup$
        – Henning Makholm
        7 hours ago








      • 2




        $begingroup$
        Good textbooks will phrase the FTC very carefully such that these observations doesn't invalidate what it says. It will be a good exercise to look very closely at the FTC in your favorite textbook and figure out exactly how, say, the existence of many different antiderivatives doesn't conflict with the claim.
        $endgroup$
        – Henning Makholm
        7 hours ago










      • $begingroup$
        If $c$ could be any constant, does that mean that every integral is surjective on N?
        $endgroup$
        – mjc
        6 hours ago








      • 1




        $begingroup$
        @mjc: No. Just because for every number there is some indefinite integral that has that number as a value, doesn't mean that each indefinite integral separately takes all values. For example, take the constant function $f(x)=0$. Every constant function is an indefinite integral of this $f$, but a constant function is very far from being surjective.
        $endgroup$
        – Henning Makholm
        6 hours ago














      7














      7










      7







      $begingroup$

      Let $f(x)=0$ for all real $x$.



      Here is one anti-integral for $f$:



      $$ g(x) = begin{cases} x &text{when }xinmathbb Z \ 0 & text{otherwise} end{cases} $$
      in the sense that $int_a^b g(x),dx = f(b)-f(a)$ for all $a,b$.



      How do you explain that the slope of $f$ at $x=5$ is not $g(5)=5$?





      The idea works better if we restrict all the functions we ever look at to "sufficiently nice" ones -- for example, we could insist that everything is real analytic.



      Merely looking for a continuous anti-integral wouldn't suffice to recover the usual concept of derivative, because then something like
      $$ x mapsto begin{cases} 0 & text{when }x=0 \ x^2sin(1/x) & text{otherwise} end{cases} $$
      wouldn't have a derivative on $mathbb R$ (which is does by the usual definition).






      share|cite|improve this answer











      $endgroup$



      Let $f(x)=0$ for all real $x$.



      Here is one anti-integral for $f$:



      $$ g(x) = begin{cases} x &text{when }xinmathbb Z \ 0 & text{otherwise} end{cases} $$
      in the sense that $int_a^b g(x),dx = f(b)-f(a)$ for all $a,b$.



      How do you explain that the slope of $f$ at $x=5$ is not $g(5)=5$?





      The idea works better if we restrict all the functions we ever look at to "sufficiently nice" ones -- for example, we could insist that everything is real analytic.



      Merely looking for a continuous anti-integral wouldn't suffice to recover the usual concept of derivative, because then something like
      $$ x mapsto begin{cases} 0 & text{when }x=0 \ x^2sin(1/x) & text{otherwise} end{cases} $$
      wouldn't have a derivative on $mathbb R$ (which is does by the usual definition).







      share|cite|improve this answer














      share|cite|improve this answer



      share|cite|improve this answer








      edited 7 hours ago

























      answered 8 hours ago









      Henning MakholmHenning Makholm

      255k18 gold badges338 silver badges583 bronze badges




      255k18 gold badges338 silver badges583 bronze badges















      • $begingroup$
        Is this a restriction on the FTC? It sounds like you're saying integration and differentiation aren't always one-to-one inverses of each other. Would it be fair to say that integration and differentiation each have (at least) two definitions: one via limits, one as the inverse of each other; and the latter definition is less general?
        $endgroup$
        – mjc
        8 hours ago






      • 2




        $begingroup$
        @mjc: Indeed they are not exact inverse operations -- it's an elementary caveat that a function that has an antiderivative $F$ also have many other antiderivatives, namely $F+c$ for every constant $c$! In other words, "differentiation" is not an injective operation, so strictly speaking it cannot have an inverse operation. In the same way, a function can have several anti-(indefinite-)integrals -- this is less well known only because anti-integral is not a very common concept.
        $endgroup$
        – Henning Makholm
        7 hours ago








      • 2




        $begingroup$
        Good textbooks will phrase the FTC very carefully such that these observations doesn't invalidate what it says. It will be a good exercise to look very closely at the FTC in your favorite textbook and figure out exactly how, say, the existence of many different antiderivatives doesn't conflict with the claim.
        $endgroup$
        – Henning Makholm
        7 hours ago










      • $begingroup$
        If $c$ could be any constant, does that mean that every integral is surjective on N?
        $endgroup$
        – mjc
        6 hours ago








      • 1




        $begingroup$
        @mjc: No. Just because for every number there is some indefinite integral that has that number as a value, doesn't mean that each indefinite integral separately takes all values. For example, take the constant function $f(x)=0$. Every constant function is an indefinite integral of this $f$, but a constant function is very far from being surjective.
        $endgroup$
        – Henning Makholm
        6 hours ago


















      • $begingroup$
        Is this a restriction on the FTC? It sounds like you're saying integration and differentiation aren't always one-to-one inverses of each other. Would it be fair to say that integration and differentiation each have (at least) two definitions: one via limits, one as the inverse of each other; and the latter definition is less general?
        $endgroup$
        – mjc
        8 hours ago






      • 2




        $begingroup$
        @mjc: Indeed they are not exact inverse operations -- it's an elementary caveat that a function that has an antiderivative $F$ also have many other antiderivatives, namely $F+c$ for every constant $c$! In other words, "differentiation" is not an injective operation, so strictly speaking it cannot have an inverse operation. In the same way, a function can have several anti-(indefinite-)integrals -- this is less well known only because anti-integral is not a very common concept.
        $endgroup$
        – Henning Makholm
        7 hours ago








      • 2




        $begingroup$
        Good textbooks will phrase the FTC very carefully such that these observations doesn't invalidate what it says. It will be a good exercise to look very closely at the FTC in your favorite textbook and figure out exactly how, say, the existence of many different antiderivatives doesn't conflict with the claim.
        $endgroup$
        – Henning Makholm
        7 hours ago










      • $begingroup$
        If $c$ could be any constant, does that mean that every integral is surjective on N?
        $endgroup$
        – mjc
        6 hours ago








      • 1




        $begingroup$
        @mjc: No. Just because for every number there is some indefinite integral that has that number as a value, doesn't mean that each indefinite integral separately takes all values. For example, take the constant function $f(x)=0$. Every constant function is an indefinite integral of this $f$, but a constant function is very far from being surjective.
        $endgroup$
        – Henning Makholm
        6 hours ago
















      $begingroup$
      Is this a restriction on the FTC? It sounds like you're saying integration and differentiation aren't always one-to-one inverses of each other. Would it be fair to say that integration and differentiation each have (at least) two definitions: one via limits, one as the inverse of each other; and the latter definition is less general?
      $endgroup$
      – mjc
      8 hours ago




      $begingroup$
      Is this a restriction on the FTC? It sounds like you're saying integration and differentiation aren't always one-to-one inverses of each other. Would it be fair to say that integration and differentiation each have (at least) two definitions: one via limits, one as the inverse of each other; and the latter definition is less general?
      $endgroup$
      – mjc
      8 hours ago




      2




      2




      @mjc: Indeed they are not exact inverse operations -- it's an elementary caveat that a function that has an antiderivative $F$ also has many other antiderivatives, namely $F+c$ for every constant $c$! In other words, differentiation is not an injective operation, so strictly speaking it cannot have an inverse operation. In the same way, a function can have several anti-(indefinite-)integrals -- this is less well known only because the anti-integral is not a very common concept.
      – Henning Makholm
      7 hours ago






      Good textbooks will phrase the FTC very carefully so that these observations don't invalidate what it says. It is a good exercise to look very closely at the FTC in your favorite textbook and figure out exactly how, say, the existence of many different antiderivatives doesn't conflict with the claim.
      – Henning Makholm
      7 hours ago




      If $c$ could be any constant, does that mean that every integral is surjective on $\mathbb{N}$?
      – mjc
      6 hours ago






      @mjc: No. Just because for every number there is some indefinite integral that takes that number as a value doesn't mean that each indefinite integral separately takes all values. For example, take the constant function $f(x)=0$. Every constant function is an indefinite integral of this $f$, but a constant function is very far from being surjective.
      – Henning Makholm
      6 hours ago





      In a sense your question is very natural. Let's take an informal approach to it, and then see where the technicalities arise. (That's how a lot of research mathematics works, by the way! Have an intuitive idea, and then try to implement it carefully. The devil is always in the details.)



      So, one way to tell the familiar story of one-variable calculus is as follows:




      1. Define the derivative $f'$ of a function $f$ as the limit of the difference quotient $h^{-1}(f(x+h)-f(x))$ as $h\to 0$.

      2. Define an anti-derivative of a function $f$ as a function $F$ for which $F'=f$.

      3. Define the definite integral of a function $f$ over $[a,b]$, say as the limit of Riemann sums.

      4. Discover that (2) and (3) are related, in the sense that
        $$\int_a^b f=F(b)-F(a)$$
        so long as $F$ is any anti-derivative of $f$.
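As a quick numerical sanity check of step 4, here is a small Python sketch; the function $f(x)=3x^2$, the interval $[1,2]$, and the step count are arbitrary choices of mine:

```python
# Compare a left Riemann sum of f(x) = 3x^2 over [1, 2] with
# F(b) - F(a) for the antiderivative F(x) = x^3.
def riemann_sum(f, a, b, n=100_000):
    """Left Riemann sum of f over [a, b] with n equal subintervals."""
    h = (b - a) / n
    return sum(f(a + k * h) for k in range(n)) * h

f = lambda x: 3 * x**2
F = lambda x: x**3

a, b = 1.0, 2.0
assert abs(riemann_sum(f, a, b) - (F(b) - F(a))) < 1e-3
```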




      Now, your idea is that you can imagine doing this the other way around, as follows:




      1. Define the definite integral of a function $f$ over an interval $[a,b]$, say as a limit of Riemann sums.

      2. Define an anti-integral of a function $F$ as a function $f$ for which
        $$F(x)-F(0)=\int_0^x f$$

      3. Define the derivative of a function, as the limit of the difference quotient.

      4. Discover that (2) and (3) are related, in the sense that
        one anti-integral of $F$ is just $F'$, so long as $F'$ is defined.
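The flipped step 4 can be sanity-checked the same way: numerically differentiate $F$ and verify that the result reconstructs $F(x)-F(0)$ when integrated. This is a sketch; $F(x)=x^2$, the step sizes, and the tolerance are my own choices:

```python
# Check that F' behaves as an anti-integral of F: F(x) - F(0) = int_0^x F'.
def derivative(F, x, h=1e-6):
    """Central difference approximation of F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

def riemann_sum(f, a, b, n=10_000):
    """Midpoint Riemann sum of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

F = lambda x: x**2
x = 1.5
lhs = F(x) - F(0)                                    # 2.25
rhs = riemann_sum(lambda t: derivative(F, t), 0, x)  # integral of ~2t
assert abs(lhs - rhs) < 1e-4
```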




      The trouble in both stories arises in steps 2 and 4. (Step 4, in both stories, is a form of the Fundamental Theorem.)



      The Problem with Step 2



      In both the standard and the flipped story, step 2 poses existence and uniqueness problems.



      In the standard story, an anti-derivative of $f$ may not even exist; one sufficient condition is to require that $f$ be continuous, but that is not necessary. And even if you do require that $f$ be continuous, you're always going to have non-uniqueness. Thus "anti-differentiation" construed as an operation is not really a bona fide "inverse" operation, because it is not single-valued. Or in other words, differentiation is not injective: it identifies many different functions. (Exactly which functions it identifies depends on the topology of the domain they're defined on.)



      In the flipped story, again note that we certainly will never have uniqueness. Given any anti-integral $f$, you can find infinitely many others by changing the values of $f$ on a set of measure zero. We also aren't guaranteed existence of an anti-integral for a given $F$, and this time not even the continuity of $F$ will serve as a sufficient condition. What we need is something even stronger: "absolute continuity."
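The non-uniqueness is easy to witness numerically. In this sketch (the grid size and the altered point are my own choices), a function changed at a single point yields exactly the same Riemann sum, because the sample grid never lands on that point:

```python
# f and g differ only at the single point x = 1/3, a set of measure zero,
# yet the midpoint Riemann sums below see identical values everywhere.
def riemann_sum(f, a, b, n=10_000):
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

f = lambda x: x
g = lambda x: 100.0 if x == 1 / 3 else x  # altered at a single point

assert riemann_sum(f, 0, 1) == riemann_sum(g, 0, 1)
```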



      The Problem with Step 4



      In the standard story, the catch is in "so long as $F$ is any anti-derivative of $f$." The problem is that not every Riemann integrable function has an anti-derivative. If we want to guarantee an anti-derivative, we can impose the additional hypothesis that $f$ is continuous (which is again sufficient but not necessary).
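To see the catch concretely, here is a sketch (the function and step sizes are my own choices) using the sign function, which is Riemann integrable but has no antiderivative on any interval containing $0$: its integral function is $|x|$, whose one-sided difference quotients at $0$ disagree.

```python
# sgn(t) is Riemann integrable, but its integral function F(x) = |x|
# fails to be differentiable at 0, so F is not an antiderivative of sgn
# there (and in fact sgn has none: derivatives satisfy the intermediate
# value property, which sgn violates).
def sgn(t):
    return -1.0 if t < 0 else 1.0

def integral(f, a, b, n=100_000):
    """Midpoint Riemann sum of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

F = lambda x: integral(sgn, 0.0, x) if x >= 0 else -integral(sgn, x, 0.0)

h = 1e-3
right = (F(h) - F(0)) / h  # one-sided difference quotients at 0
left = (F(0) - F(-h)) / h
assert abs(right - 1.0) < 1e-6 and abs(left + 1.0) < 1e-6
```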



      A similar problem arises in the flipped scenario: given an arbitrary $F$, it might not have an anti-integral. The fundamental theorem for Lebesgue integrals shows that it's both necessary and sufficient to require that $F$ be absolutely continuous. But given that integrals are not sensitive to values on a set of measure zero, the best conclusion we can draw in that case is that an anti-integral of $F$ equals $F'$ "almost everywhere" (meaning, everywhere except on a set of measure zero).






      answered 7 hours ago, edited 1 hour ago
      – symplectomorphic





              From the point of view of analysis (as hinted at in Henning Makholm's answer) the issue is that the mapping $I:f'\mapsto f$ is extremely far from one-to-one. When you try to invert it, you find that a great many functions are possible "anti-integrals" of a given function. While this occurs for $d:f\mapsto f'$ as well, there is a robust mathematical theory about how to address this and how to describe the set of anti-derivatives of a given function. For example, if $f$ is defined on $[a,b]$ then all antiderivatives of $f$ are of the form $$F_i(x)=c_i + \int_a^x f(t)\,dt$$ for constants $c_i$. Although in some contexts the situation becomes more complicated (for example, if we look at $1/x$ defined on $[-1,0)\cup(0,1]$ then you have two constants, one for each side), there is a whole field that studies what happens for various domains.



              The situation for inverting $I$ is a lot less rosy. For one thing, if you take any finite subset of the domain you can move the function's values around however you like without changing the integral. More generally, as long as two functions agree except on a set of measure zero they will have the same integral. As far as I know there are no known ways to fruitfully analyze such a set of functions (a statement that has deep repercussions in machine learning and functional analysis).



              A second issue is that integrating doesn't always ensure that you can differentiate. There are a wide variety of functions $f$ such that the anti-integral doesn't (or doesn't have to) produce a differentiable function! For example, if $1_{\mathbb{Q}}$ denotes the function that takes the value $1$ on rational inputs and $0$ on irrational inputs, this function has a Lebesgue integral of $0$ (a similar example works for the Riemann integral, but it's more work). If you take the anti-integral of $f(x)=0$ and get $1_{\mathbb{Q}}$, you can't differentiate it to get back $f(x)=0$, because $1_{\mathbb{Q}}$ is nowhere differentiable.



              A commenter mentions vector calculus, and it is true that something like this happens there, but with a couple of massive caveats.






              answered 7 hours ago, edited 7 hours ago
              – Stella Biderman





                      Not exactly an answer to your question, but related:



                      One thing that has always bugged me, in a sense, is this: for higher-order derivatives of $f(x)$, instead of writing $f'(x)$ and $f''(x)$, people start using a parenthesized number, so the 5th-order derivative is $f^{(5)}(x)$.



                      However, this notation doesn't translate back to integrals, for some reason. If you have a function $f(x)$, and $f^{(1)}(x)$ is the first derivative of $f$, then wouldn't it make sense to denote the anti-derivative/integral of $f$ as $f^{(-1)}(x)$?



                      So for example if $f(x)$ is some acceleration function, then the velocity function is $f^{(-1)}(x)$ and the position function is $f^{(-2)}(x)$, each determined only up to constants of integration. I think this is good notation.
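As a rough numerical sketch of this $f^{(-n)}$ idea (the helper names `f_power` and `antiderivative`, the grid size, and the choice of fixing every constant of integration to $0$ at $x=0$ are my own):

```python
# f_power(f, -1) integrates once from 0, f_power(f, -2) twice, mirroring
# the proposed f^(-n) notation (all integration constants fixed to 0).
def antiderivative(f, n=200):
    """Return x -> midpoint-rule integral of f from 0 to x."""
    def F(x):
        h = x / n
        return sum(f((k + 0.5) * h) for k in range(n)) * h
    return F

def f_power(f, m):
    """For m <= 0, apply |m| repeated numerical anti-derivatives."""
    for _ in range(-m):
        f = antiderivative(f)
    return f

# Constant acceleration a(t) = 2: velocity a^(-1)(t) = 2t, position a^(-2)(t) = t^2.
accel = lambda t: 2.0
assert abs(f_power(accel, -1)(3.0) - 6.0) < 1e-9
assert abs(f_power(accel, -2)(3.0) - 9.0) < 1e-6
```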






                      • This notation does appear to be used in the contexts of repeated integration and fractional calculus. That said, this really seems like a comment more than an answer...
                        – Semiclassical
                        8 hours ago






                      • You may find this of interest.
                        – J.G.
                        8 hours ago
















answered 8 hours ago
– Victor S.















• This notation does appear to be used in the contexts of repeated integration and fractional calculus. That said, this really seems like a comment more than an answer... – Semiclassical, 8 hours ago

• You may find this of interest. – J.G., 8 hours ago




































