Can we say, according to calculus, that something infinitesimally small is equal to 0?
Answers
Answer:
No. It tends to zero, but that doesn't mean it is equal to zero.
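The modern formulation makes "tends to zero" precise with limits: the increment is kept nonzero throughout the algebra, and only at the end do we ask what value the expression approaches. A minimal sketch using Python's sympy library (assumed installed; the choice of f(x) = x² is just an illustration):

```python
# Limit-based view: the increment h is never set equal to zero;
# instead we ask what the difference quotient tends to as h -> 0.
import sympy as sp

x, h = sp.symbols('x h')

# Difference quotient for f(x) = x**2 with a nonzero increment h.
quotient = sp.simplify(((x + h)**2 - x**2) / h)   # equals 2*x + h

# Taking the limit discards the h term without ever treating h as 0.
derivative = sp.limit(quotient, h, 0)
print(derivative)  # 2*x
```

Note that h is an ordinary nonzero symbol during the cancellation; only the limit operation removes it, which is exactly the distinction between "tends to zero" and "is zero".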
Answer:
Step-by-step explanation:
When Isaac Newton and Gottfried Wilhelm Leibniz first formulated differential calculus they effectively made use of the concept of an infinitesimal, which they referred to as an infinitely small number, whatever that was supposed to mean. For example, Leibniz, who introduced the d notation for differentials, said in deriving d(xy)=xdy+ydx that this follows from d(xy) = (x+dx)(y+dy) − xy = xdy + ydx + dxdy and the omission of the quantity dxdy, which is infinitely small in comparison with the rest, for it is supposed that dx and dy are infinitely small.
In effect he was saying that the although infinitesimals are not zero the product of infinitesimals is zero.
Newton's analysis involved taking ratios of infinitesimals. Those terms for the ratio that which had an infinitesimal as a factor were equated to zero, or as he expressed it...
[T]erms which have [an infinitesimal] as a factor will be equivalent to nothing in respect to the others. I therefore cast them out…: