The very foundation of Differential Calculus
Although I am fairly naive when it comes to mathematics in general, I have been pondering some fundamental questions for a long time. There is an ongoing debate about who invented differential calculus, but I will not get into that. This article is about a core concept, 'limits', one of the founding principles of differential calculus.

It all started with the question of finding the slope of a curve at a given point. Consider a secant intersecting a continuous curve at any two points. Now, as we slide the secant gradually towards the point at which we need the tangent, the distance between the two intersection points decreases. When that distance 'tends to zero', you get the 'tangent' to the curve, which touches it at 'effectively' one point.

Now, what does 'tends to zero' really mean? When we say that the distance 'tends' to zero, we are saying that it is not actually zero but 'very close' to it. However, when do we say that something is 'very close'? 'Very close' does not imply 'equality' (it cannot, or else the whole premise falls flat). The 'Planck length' is often described as the smallest physically meaningful length in the known universe. Does 'very close' equate to one Planck length? An argument could be made that it does not, and that 'very close' implies something even smaller in value. So how small can that value be?

Values are 'counted' to denote 'something'. For example, we all know what '10 apples' means. We can even have 0.1 of an apple. We could, in principle, have even 10^(-5) of an apple. But there is a point where it ceases to BE an 'apple'. For example, it makes no sense to say 'a Planck length of an apple'. Thus I ponder: what does 'very close' actually mean? (Can a numeric value be smaller than what can possibly be 'counted' in the known universe?)
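To pin the secant construction down in symbols (standard textbook notation, not anything specific to this article): for a function f, the slope of the secant through the points (x, f(x)) and (x+h, f(x+h)) is the difference quotient, and the slope of the tangent is its limit as h tends to zero:

$$\text{secant slope} = \frac{f(x+h) - f(x)}{h}, \qquad f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}.$$

For what it is worth, the modern definition of this limit deliberately avoids naming any 'smallest' h: saying $\lim_{h \to 0} g(h) = L$ means that for every tolerance $\varepsilon > 0$ there is some $\delta > 0$ such that $|g(h) - L| < \varepsilon$ whenever $0 < |h| < \delta$. No single 'very close' value is ever singled out; the statement quantifies over all tolerances at once.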
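One can also watch 'tends to zero' happen numerically. The sketch below is my own toy example, not anything from the argument above: the function f(x) = x^2 and the point x = 1 are assumptions, chosen because the true tangent slope there is exactly 2. It prints the secant slope for ever smaller h.

```python
# Toy numerical sketch of the secant-to-tangent idea.
# Assumed example: f(x) = x**2 at x = 1, where the true tangent slope is 2.

def f(x: float) -> float:
    return x * x

def secant_slope(x: float, h: float) -> float:
    # Slope of the secant through (x, f(x)) and (x + h, f(x + h)).
    return (f(x + h) - f(x)) / h

x = 1.0
for exponent in range(1, 16, 2):
    h = 10.0 ** -exponent
    print(f"h = 1e-{exponent:<2d}  secant slope = {secant_slope(x, h):.12f}")
```

The printed slopes drift towards 2 without h ever needing to reach some final 'smallest' value. Amusingly, the computer does have its own floor: below roughly h = 10^(-8), the subtraction f(x+h) - f(x) loses precision in double-precision floating point and the printed slopes degrade, a machine analogue of the Planck-length worry, whereas the mathematical limit is defined so that it depends on no such floor.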