It's an understandable mistake and one that was historically widespread... I believe British mathematics after Newton was quite hindered by it; much more progress was made on the Continent, where mathematicians were much faster to accept complex numbers for what they were.
I think the root of the problem can be expressed in this way: we get our concept of number from abstraction from various physical instances. For instance, whole numbers such as the number 3 come from the perceived commonality between three fish and three rocks and three trees, and the consistent way in which this abstraction behaves. Addition and multiplication are similarly sourced. This is all very fine, but there are two pitfalls.
Firstly, you can assign some kind of special ontological status to the number: you may be so impressed by the abstraction that you say that the number 3 is 'real'. Whether this is correct naturally depends on what you define 'real' to mean; if you mean a specific, solid entity, then clearly the number 3 is not real, and neither is the number i, for that matter. If, however, you mean an abstraction that behaves in a consistent manner, then the number 3 is real, and so is the number i.
And secondly, you can identify that which is abstracted from with the abstraction itself, and end up with too narrow a definition. For instance, you may make the mistake of saying that multiplication of n by m just is the number of things you have when you have n lots of m things (fish, rocks, trees). Of course, this becomes very problematic when you come across irrational numbers, and you may, like the Pythagoreans, reject 'the number that multiplies by itself to give 2' as a 'hypothetical' or an 'absurdity'. The correct approach, as we now know, is simply to accommodate such entities into your abstraction, as long as it's still consistent. Not only does this give you a much richer theory with the potential to provide new truths about your simpler abstraction: it also turns out that there is a correct 'grounding' to abstract from, namely lengths (the diagonal of a unit square being the 'hypothetical' number).

If you view multiplication in the aforementioned non-abstract way, you will reject i, that is, 'the number which multiplies by itself to give -1', as something which is patently not 'real'. And within the very limited scope to which you have confined yourself, you are effectively correct: no grouping of groupings of physical objects will give you -1. But if we simply accommodate i, we again get a consistent system which is much more powerful than the old one, even in the old one's domain. And again, it actually turns out that a more general grounding, namely that of two-dimensional arrows, encapsulates your abstraction.
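To make the 'two-dimensional arrows' grounding concrete, here is a minimal sketch (the function names are mine, not anything standard): represent a + bi as the 2x2 matrix [[a, -b], [b, a]], so that i becomes a rotation by 90 degrees, and 'multiplying by itself' twice rotates an arrow by 180 degrees, i.e. negates it.

```python
# Represent the complex number a + bi as the 2x2 matrix [[a, -b], [b, a]].
# Then complex multiplication is just matrix multiplication, and i is the
# matrix that rotates a 2D arrow by 90 degrees.
def as_matrix(a, b):
    return [[a, -b], [b, a]]

def mat_mul(x, y):
    # Ordinary 2x2 matrix multiplication.
    return [[sum(x[r][k] * y[k][c] for k in range(2)) for c in range(2)]
            for r in range(2)]

i = as_matrix(0, 1)        # rotation by 90 degrees: [[0, -1], [1, 0]]
i_squared = mat_mul(i, i)  # rotating twice = rotation by 180 degrees

print(i_squared)  # [[-1, 0], [0, -1]], i.e. -1 times the identity
```

Nothing 'imaginary' is needed here: i squared really is -1, in the sense that two quarter-turns of an arrow reverse it. Python's built-in complex type gives the same answer, since `(1j) ** 2 == -1`.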