It was not a straw man; it was the point I was trying to make from my very first post. I made one simple comment saying that math was imperfect. Xei and Cmind then blatantly attacked me:

Nah, what's actually happened here is that I've repeatedly tried to explain that your queries were already addressed in the thread, and I also briefly summarised them for you, and you've repeatedly ignored me.

Your first comment was (paraphrasing) "I've struggled with this since I was little, and it's proof maths isn't perfect". That right there is enough to get alarm bells ringing for me, because this is not a particularly hard concept to grasp, and I certainly do not consider myself an amazing mathematician (though given that I studied Chemistry, I suppose I've got to be reasonably competent at it).

Last edited by Photolysis; 03-02-2011 at 09:02 PM.
I can't believe there's an actual argument going on here. Those proofs are faulty: at some point you either reach an infinite number of decimals, or you must have a ...3334 ending to allow it to become 1.

Last edited by A Roxxor; 03-16-2011 at 02:43 AM.
You fail a large amount.

While it may seem ridiculous... the way I see it is that 1 is separated from 0.999... by the fact that in Euclidean mathematics they are exactly one infinitesimally small step apart, similar to dividing a segment infinitely many times. You can always divide it once more.

And what is wrong with the algebra and calculus of the previous proofs?

"0.999...8" doesn't exist. There's no such concept, not even in abstract math. The mere fact that you would even bring up a thing like "oh, the ellipsis is an infinite number of 9s" (which implies that the 8 is the "infinity + 1"-th term), is absolutely ridiculous. |
|
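The point can be made concrete: every finite string of 9s falls short of 1 by exactly 1/10^n, so there is never a "last" position where an 8 could sit. A minimal sketch in Python (the `partial_sum` helper is illustrative, not from the thread):

```python
from fractions import Fraction

def partial_sum(n):
    """Exact sum of the first n terms 9/10 + 9/100 + ... + 9/10^n."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

# However many 9s you write down, the gap to 1 is exactly 1/10^n;
# there is no final digit left over to change into an 8.
for n in (1, 5, 20):
    assert Fraction(1) - partial_sum(n) == Fraction(1, 10**n)
```

Since the gap 1/10^n shrinks below any positive number, the only value the infinite decimal can denote is 1 itself.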
Last edited by PhilosopherStoned; 03-16-2011 at 11:41 AM.
Previously PhilosopherStoned
You stand in error. It is a convention of names. "Irrational" means the inability to provide a name using the given naming convention. The approximation of a name is still not a name. When you write 1/3 = .3333, you are actually using the equal sign in error.

Last edited by Philosopher8659; 03-16-2011 at 06:14 PM.
Please point out the error in my proof.

How do people not get that infinity means never ending?

It has zero applications really. If you tried to manufacture an object that was 0.999... metres across, even if you didn't know it equalled 1 metre, for all intents and purposes you would end up manufacturing an object 1 metre across, because you'd round up or whatever.

Last edited by Xei; 03-27-2011 at 05:17 PM.
You know, the "1/3 = 0.333..., therefore 0.999... = 3*0.333... = 3/3 = 1" proof says more about people than mathematics.

It is a proof if you accept that 1/3 = 0.333...

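The distinction between the truncation 0.333 and the repeating decimal 0.333... is exactly what's at stake here. A small Python sketch (illustrative only, not from the thread) separating exact fractions from truncated decimals:

```python
from fractions import Fraction
from decimal import Decimal

# Exact arithmetic: 1/3 really does satisfy 3 * (1/3) = 1.
assert 3 * Fraction(1, 3) == 1

# A truncated decimal like 0.333 is NOT 1/3; tripling it gives 0.999, not 1.
assert 3 * Decimal("0.333") == Decimal("0.999")
assert Decimal("0.999") != 1
```

The proof only works for the full repeating decimal; any finite truncation of 0.333... leaves a gap, which is why the objection "1/3 is only approximately 0.333" misses the target of the original argument.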
Last edited by PhilosopherStoned; 03-27-2011 at 09:00 PM.
So.... does this mean that infinity doesn't exist?

There's nothing here about infinity converging to a finite value. What's converging to a finite value is a sum of infinitely many finite numbers. For this to work, the terms have to get smaller "faster" than the sum gets bigger.

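That "faster" condition can be illustrated numerically. A quick Python sketch (hypothetical, not from the thread) contrasting a geometric series, whose terms shrink fast enough to give a finite sum, with the harmonic series, whose terms also shrink to zero but too slowly:

```python
from fractions import Fraction

# Geometric terms 9/10^n shrink fast enough: the partial sums approach 1,
# falling short by exactly 1/10^49 after 49 terms.
geometric = sum(Fraction(9, 10**n) for n in range(1, 50))
assert Fraction(1) - geometric == Fraction(1, 10**49)

# Harmonic terms 1/n also shrink to zero, but the partial sums grow without
# bound; after 2000 terms they have already passed 8.
harmonic = sum(1.0 / n for n in range(1, 2001))
assert harmonic > 8
```

So "terms go to zero" is necessary but not sufficient; it is the rate of shrinking that decides whether the infinite sum settles on a finite value.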
"Infinity" is an abstract concept. |
|
Yes, but infinity is still real. I know what you're saying completely, but the universe/nature/whatever this is will go on forever, even though forever is just a concept. The concept is just describing a real occurrence.

Definite integrals are defined in terms of areas under curves, which are in turn defined as limits of Riemann sums as the number of rectangles goes to infinity... the fact that this is equivalent to antidifferentiation, which I suppose is what you mean by Leibniz, is something that requires proof. Definite integrals are fundamentally an infinite sum, hence the elongated S that represents them.

Right, but none of that happened until the 1800s or so. It is intuitively very clear. I'm not saying it's right. It was the confusion that resulted from doing things that way that led us to rigour outside of geometry. But yeah, you're pretty much right.
