Wrong.

y = x^3 / (x - 5(.999~))

If 1 = .999~, then y would be undefined at x = 5. But when x = 5, that expression equals -5. You have to use limits to show it, but it's -5, not undefined.
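A quick numerical sketch of where the disagreement sits (this assumes .999~ is treated as exactly 1, so the denominator becomes x - 5): y is undefined at x = 5 itself, and you can only probe what happens near that point, which is why limits come into it at all.

```python
def y(x, r=1.0):
    # r stands in for .999~; under the assumption r = 1,
    # the denominator is x - 5, which is zero at x = 5.
    return x**3 / (x - 5 * r)

# Probe values on either side of x = 5:
for x in [4.9, 4.99, 5.01, 5.1]:
    print(x, y(x))

# At exactly x = 5 the division by zero fails:
try:
    y(5)
except ZeroDivisionError:
    print("y is undefined at x = 5")
```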

Limits are a weird thing. The proofs on the wiki are correct, but you can also prove that they don't work. That happens a lot with limits and calculus.