## What Is The Logarithm of a Negative Number?

Learning of imaginary numbers, things created to be the square roots of negative numbers, inspired me. It probably inspires anyone who’s the sort of person who’d become a mathematician. The trick was great. I wondered: could I do it? Could I find some other useful expansion of the number system?

The square root of a complex-valued number sounded like the obvious way to go, until a little later that week when I learned that’s just some other complex-valued numbers. The next thing I hit on: how about the logarithm of a negative number? Couldn’t that be a useful expansion of numbers?

No. It turns out you can make a sensible logarithm of negative, and even complex-valued, numbers using complex-valued numbers. The same goes for trigonometric and inverse trigonometric functions, tangents and arccosines and all that. There isn’t anything we can do with the normal mathematical operations that needs something bigger than the complex-valued numbers to play with. It is possible to expand on the complex-valued numbers: we can make quaternions and some more elaborate constructs beyond them. They don’t solve any particular shortcoming in complex-valued numbers, but they’ve got their uses. I never got anywhere near reinventing them. I don’t regret the time spent on that. There’s something useful in trying to invent something even if it fails.
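As a sketch of that closure property, Python’s standard `cmath` module keeps all of these operations inside the complex numbers. The function names are the real standard-library ones; the particular input values are just illustrative choices:

```python
import cmath

# The square root of a complex number is just another complex number:
z = cmath.sqrt(1 + 1j)
assert abs(z * z - (1 + 1j)) < 1e-12

# The arccosine of a real number outside [-1, 1] has no real value,
# but it does have a complex one -- nothing bigger than complex is needed.
# The principal value here is purely imaginary:
w = cmath.acos(2)
assert abs(w.real) < 1e-9
assert abs(cmath.cos(w) - 2) < 1e-9   # cos undoes acos, up to rounding
```

Note that `cmath` always hands back the principal value of each multi-valued function, which is the usual convention.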

One problem with mathematics — with all intellectual fields, really — is that it’s easy, when teaching, to give the impression that this stuff is the Word of God, built into the nature of the universe and inarguable. It’s so not. The stuff we find interesting and how we describe those things are the results of human thought, attempts to say what is interesting about a thing and what is useful. And what best approximates our ideas of what we would like to know. So I was happy to see this come across my Twitter feed:

That links to a 12-page paper by Deepak Bal, *Leibniz, Bernoulli, and the Logarithms of Negative Numbers*. It’s a review of how the idea of a logarithm of a negative number got developed over the course of the 18th century, and what great minds, like Gottfried Leibniz and John (I) Bernoulli, argued about as they found problems with the implications of what they were doing. (There were a lot of Bernoullis doing great mathematics, and even multiple John Bernoullis. The (I) is among the ways we keep them sorted out.) It’s worth a read, I think, even if you’re not all that versed in how to calculate logarithms. (But if you’d like to be better-versed, here’s the tail end of some thoughts about that.) The process of how a good idea like this comes to be is worth knowing.

Also: it turns out there’s not “the” logarithm of a complex-valued number. There’s infinitely many logarithms. But they’re a family, all strikingly similar, so we can pick one that’s convenient and just use that. Ask if you’re really interested.
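To make that concrete, here’s a minimal sketch using Python’s standard `cmath` module. The principal logarithm of $-1$ is $i \pi$, and the rest of the family differs from it by integer multiples of $2 \pi i$, since $e^{2 \pi i k} = 1$ for any integer $k$:

```python
import cmath
import math

# The principal logarithm of -1 is purely imaginary, equal to i*pi:
principal = cmath.log(-1)
print(principal)    # 3.141592653589793j

# Every member of the family log(-1) + 2*pi*i*k exponentiates back to -1:
for k in range(-2, 3):
    candidate = principal + 2 * math.pi * k * 1j
    assert abs(cmath.exp(candidate) - (-1)) < 1e-9
```

Picking the principal value, the one whose imaginary part lies in $(-\pi, \pi]$, is the usual convenient choice, and it’s the one `cmath.log` returns.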

## 16,000 and a Square

I reached my 16,000th page view, sometime on Thursday. That’s a tiny bit slower than I projected based on May’s readership statistics, but May was a busy month and I’ve had a little less time to write stuff this month, so I’m not feeling bad about that.

Meanwhile, while looking for something else, I ran across a bit about mathematical notation in Florian Cajori’s *A History of Mathematical Notations* which has left me with a grin since. The book is very good about telling the stories of just what the title suggests. It’s well worth dipping into, because everything you see written down is the result of a long process of experimentation and fiddling about to find the right balance of “expressing an idea clearly”, “expressing an idea concisely”, and “expressing an idea so it’s not too hard to work with”.

The idea here is the square of a variable, which these days we’d normally write as $a^2$. According to Cajori (section 304), René Descartes “preferred the notation $aa$ to $a^2$.” Cajori notes that Carl Gauss had this same preference and defended it on the grounds that doubling the symbol didn’t take any more (or less) space than the superscript 2 did. Cajori lists other great mathematicians who preferred doubling the letter for squaring, including Christiaan Huygens, Edmond Halley, Leonhard Euler, and Isaac Newton. Among mathematicians who preferred $a^2$ were Blaise Pascal, David Gregory (who was big in infinite series), and Gottfried Wilhelm Leibniz.

Well of course Newton and Leibniz would be on opposite sides of the $aa$ versus $a^2$ debate. How could the universe be sensible otherwise?

## Where Do Negative Numbers Come From?

Some time ago — and I forget when, I’m embarrassed to say, and can’t seem to find it because the search tool doesn’t work on comments — I was asked about how negative numbers got to be accepted. That’s a great question, particularly since while the origin of the idea of positive numbers is probably lost in prehistory, negative numbers definitely progressed over the past thousand years or so from something people might wildly speculate about to being a reasonably comfortable part of daily mathematics.

While searching for background information I ran across a doctoral thesis, *Making Sense Of Negative Numbers*, which is uncredited in the PDF I just linked to but appears to be by Dr Cecilia Kilhamn, of the University of Gothenburg, Sweden. Dr Kilhamn’s particular interest (here) is in how people learn to use negative numbers, so most of the thesis is about the conceptual difficulties people have when facing the minus sign (not least because it serves two roles, of marking a number as negative and of marking the subtraction operation), but the first chapters describe the historical process of developing the concept of negative numbers.