In Section 15, when Servois expands \(F(x)\) in terms of \[\frac{x - p}{\alpha} ,\] he finds that the first-order coefficient is \[\Delta F (p) - \frac{1}{2}\Delta^{2} F (p) + \frac{1}{3}\Delta^{3} F (p) - \cdots.\] Lagrange’s work [1797] leads us to expect that the first-order coefficient in a power series expansion should tell us about the derivative of the function. Indeed, combining this formula with his results from Section 14, Servois is led to define the differential \(\mbox d\) of an arbitrary function \(z\) as follows: \[{\mbox d}z = \Delta z - \frac{1}{2}\Delta^{2}z + \frac{1}{3}\Delta^{3}z - \cdots.\] He then investigates the higher powers of the differential and derives several forms of the Taylor series in his equations (45)–(48).
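Although Servois worked purely formally, this definition can be tested numerically. The following minimal sketch is our own illustration, not Servois's procedure: the helper names, the choice of \(F(x) = e^x\), the increment \(\alpha = 0.1\), and the truncation at ten terms are all ours. It builds the forward difference \(\Delta\) with increment \(\alpha\) and sums the first several terms of the series, which should agree closely with \(\alpha F'(x)\).

```python
import math

def forward_difference(f, alpha):
    """Return the function x -> f(x + alpha) - f(x), i.e. Delta f."""
    return lambda x: f(x + alpha) - f(x)

def servois_differential(f, x, alpha, terms=10):
    """Partial sum of d f(x) = Delta f(x) - (1/2) Delta^2 f(x) + (1/3) Delta^3 f(x) - ..."""
    total = 0.0
    g = f
    for n in range(1, terms + 1):
        g = forward_difference(g, alpha)        # g is now Delta^n f
        total += (-1) ** (n + 1) * g(x) / n
    return total

alpha, x = 0.1, 1.0
print(servois_differential(math.exp, x, alpha))  # approximately 0.27183
print(alpha * math.exp(x))                       # alpha * F'(x), also approximately 0.27183
```

The alternating series converges quickly here because each successive difference is roughly a factor of \(\alpha\) smaller than the one before it.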
Of these, equation (48) is the easiest to recognize as a Taylor series: \[F(x) = F(x_0) + \frac{x}{\alpha} {\mbox d} F (x_0) + \frac{x^2}{1 \cdot 2 \cdot \alpha^2}{\mbox d}^2 F(x_0) + \frac{x^3}{1 \cdot 2 \cdot 3 \cdot \alpha^3}{\mbox d}^3 F(x_0) + \cdots.\] If we let \(z\) be the identity function, \(z = x\), and let \(\alpha\) be the increment in \(x\), then \(\Delta x = \alpha\) and \(\Delta^n x = 0\) for \(n > 1\). Using Servois’ definition of the differential, we have \({\mbox d} x = \alpha\). If we therefore replace \(\alpha\) in (48) with \({\mbox d} x\) and formally rearrange each term, we see that the coefficient of \(x^n\) is \[\frac{1}{n!} \frac{{\mbox d}^n F(x_0)}{{\mbox d} x^n} = \frac{F^{(n)}(x_0)}{n!},\] which is the familiar Taylor coefficient.
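The identification of \({\mbox d}^n F(x_0)/{\mbox d}x^n\) with \(F^{(n)}(x_0)\) can also be checked numerically. The sketch below is again our own illustration: the grid-based, truncated implementation of \({\mbox d}\) and the choices \(F = \sin\), \(x_0 = 0.3\), and \(\alpha = 0.05\) are ours. It applies the operator \({\mbox d}\) repeatedly to a table of sample values and compares \({\mbox d}^n F(x_0)/\alpha^n\) with \(F^{(n)}(x_0)\) for \(n = 1, 2, 3\).

```python
import math

def delta(samples):
    """Forward difference of a list of samples taken at spacing alpha."""
    return [samples[i + 1] - samples[i] for i in range(len(samples) - 1)]

def servois_d(samples, terms=10):
    """Apply d = Delta - (1/2)Delta^2 + (1/3)Delta^3 - ... (truncated) to the samples."""
    out = [0.0] * (len(samples) - terms)
    diff = samples
    for n in range(1, terms + 1):
        diff = delta(diff)                          # diff now holds Delta^n of the samples
        for i in range(len(out)):
            out[i] += (-1) ** (n + 1) * diff[i] / n
    return out

alpha, x0 = 0.05, 0.3
samples = [math.sin(x0 + j * alpha) for j in range(60)]
derivatives = [math.cos(x0), -math.sin(x0), -math.cos(x0)]   # sin', sin'', sin''' at x0

values = samples
for n in range(1, 4):
    values = servois_d(values)                       # values now approximates d^n sin on the grid
    print(values[0] / alpha ** n, derivatives[n - 1])  # the two numbers agree to several decimals
```

The agreement slowly deteriorates as \(n\) grows, since each application of the truncated series compounds the truncation and rounding error.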
In eighteenth-century infinitesimal calculus, differentials like \({\mbox d} x\) and \({\mbox d} y\) were thought of as infinitely small quantities. In definition (5) above, Servois has succeeded in defining the differential without recourse to the infinitely small, expressing it instead as an infinite series of finite differences. This provided him with a more satisfactory foundation for calculus; however, it made the derivation of the rules of calculus more difficult.
Definition (5) is easy to apply to polynomials. Consider the simple case of \(z = x^2\). Then \[\Delta z = (x + \alpha)^2 - x^2 = 2x \alpha + \alpha^2 \quad \mbox{and}\] \[\Delta^2 z = [2(x + \alpha) \alpha + \alpha^2] - [2x \alpha + \alpha^2] = 2\alpha^2.\] All higher-order differences are zero. (An easy induction shows that if \(z\) is a polynomial of degree \(n\), then \(\Delta^{n+1} z\) and all higher differences are 0.) Substituting the above differences into definition (5), we obtain \({\mbox d} z = 2x \alpha\). Understanding \(\alpha\) to be \({\mbox d} x\), we have \({\mbox d} z = 2x \, {\mbox d} x\). Definition (5) is not so easy to apply when \(z\) is a transcendental function. If we attempt the same procedure with \(\sin x\), we find that the differences never vanish, so no amount of manipulation reduces the series in definition (5) to a finite sum. Therefore, Servois had to invent a new function to aid in evaluating trigonometric differentials (see his Section 18).
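To see concretely what happens (our own illustration, using standard sum-to-product identities rather than anything in Servois's text), compute the first few differences of \(\sin x\): \[\Delta \sin x = \sin(x + \alpha) - \sin x = 2 \sin\frac{\alpha}{2} \cos\left(x + \frac{\alpha}{2}\right) \quad \mbox{and} \quad \Delta^{2} \sin x = -4 \sin^{2}\frac{\alpha}{2} \, \sin(x + \alpha),\] and, in general, \[\Delta^{n} \sin x = \left(2 \sin\frac{\alpha}{2}\right)^{n} \sin\left(x + \frac{n(\alpha + \pi)}{2}\right).\] None of these differences vanishes identically, so the series in definition (5) has infinitely many nonvanishing terms, in contrast to the polynomial case above.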