Spec138 Posted April 4, 2009

Let's say you start at point 0, and every step you take gets you halfway closer to point 1. You will NEVER reach point 1, ever. But the difference between the two becomes so infinitesimal that we simply say you arrive at point 1. Here is what you would get if you were to run a program that calculates this.

Isn't that the whole concept of infinity, that it's unattainable? Basically, you could say that this series: .9x10^0 + .9x10^-1 + .9x10^-2 + ... is equal to .999..., correct? Each time you add another term of the series, you reduce the error between the series and 1 by a factor of 10. Since the error shrinks at every step, after n terms it is 1/10^n of what it originally was, and in the limit it vanishes: lim (1 - 1/10^n) = 1 as n goes to infinity.
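The original program output didn't survive, but here is a minimal Python sketch of both ideas described above: the step-by-step halving toward 1, and the partial sums of 0.999... closing the gap by a factor of 10 per term. Variable names are illustrative, not from the original post.

```python
# Halving process: each step covers half the remaining distance to 1.
# After n steps the gap is exactly 2**-n: tiny, but never zero for finite n.
position = 0.0
for step in range(1, 11):
    position += (1 - position) / 2  # move halfway toward 1
    print(f"step {step:2d}: position = {position:.10f}, gap = {1 - position:.10f}")

# The 0.999... series works the same way, except the gap shrinks by a
# factor of 10 per term: after n terms the partial sum is 1 - 10**-n.
total = 0.0
for n in range(1, 11):
    total += 9 * 10**-n  # add the next digit 9 of 0.999...
    print(f"{n:2d} terms: partial sum = {total:.12f}, error = {1 - total:.1e}")
```

Neither loop ever produces exactly 1.0, which is the point of the post: the limit of the sequence is 1 even though no finite step reaches it.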