## Machines That Think About Logarithms

I confess that I picked up Edmund Callis Berkeley’s **Giant Brains: Or Machines That Think**, originally published in 1949, from the library shelf as a source of cheap ironic giggles. After all, what is funnier than an attempt to explain to a popular audience that, wild as it may be to contemplate, electrically driven machines could “remember” information and follow “programs” of instructions based on different conditions satisfied by that information? There’s a certain amount of that, though not as much as I imagined, along with a good number of descriptions of how the hardware of the various partly or fully electrical computing machines of the 1940s worked.

But a good part, and the most interesting part, of the book is about algorithms: ways to solve complicated problems without demanding too much computing power. This is fun to read because it showcases the ingenuity and creativity required to *do* useful work. The need for ingenuity will never leave us, since we will always want to compute things a little beyond our ability; but seeing how it’s done for a simple problem is instructive, if for nothing else than to learn the kinds of tricks you can use to get the most out of your computing resources.

The example that most struck me, and which I want to share, is from the chapter on the IBM Automatic Sequence-Controlled Calculator, built at Harvard at a cost of “somewhere near 3 or 4 hundred thousand dollars, if we leave out some of the cost of research and development, which would have been done whether or not this particular machine had ever been built”. It started working in April 1944, and wasn’t officially retired until 1959. It could store 72 numbers, each of 23 decimal digits. Like most computers (then and now) it could do addition and subtraction very quickly, at the then-blazing speed of about a third of a second; it could do multiplication tolerably quickly, in about six seconds; and division rather slowly, in about fifteen seconds.

The process I want to describe is the taking of logarithms, and *why* logarithms should be interesting to compute takes a little justification, although it’s implicit in how fast the different calculations get done. Logarithms let one replace the multiplication of numbers with addition, for a considerable saving in time; better, they let you replace division with subtraction. They further let you turn exponentiation and roots into multiplication and division, which are almost always faster to do. Many human senses seem to work on a logarithmic scale as well: we can tell that one weight is twice as heavy as another much more reliably than we can tell that one weight is four pounds heavier than another, or that one light is twice as bright as another rather than ten lumens brighter.
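To see the multiplication-becomes-addition identity at work, here is a small modern sketch in Python (the ASCC, of course, had no math library; this just illustrates the arithmetic):

```python
import math

a, b = 123.0, 456.0

# Direct multiplication...
product = a * b

# ...versus adding logarithms and converting back:
# log(a * b) = log(a) + log(b)
log_sum = math.log10(a) + math.log10(b)
recovered = 10 ** log_sum

print(product)                           # 56088.0
print(math.isclose(recovered, product))  # True
```

With a printed table of logarithms, those two lookups and one addition took far less human (or machine) effort than the long multiplication.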

What the logarithm of a number is depends on some other, fixed, quantity, known as the base. In principle any positive number other than 1 will do as a base; in practice, these days people mostly care about base e (a little over 2.718), the “natural” logarithm, because it has some nice analytic properties. Back in the day, which includes when this book was written, we also cared about base 10, the “common” logarithm, because we mostly work in base ten. I have heard of people who use base 2, but haven’t seen them myself and must regard them as an urban legend. The other bases are mostly used by people who are writing homework problems for the part of the class dealing with logarithms. To some extent it doesn’t matter what base you use: if you work out the logarithm in one base, you can convert it to the logarithm in any other base by a single multiplication.
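That conversion is just one multiplication (equivalently, one division by the logarithm of the new base). A quick sketch in Python, using the 2038 example that comes up below:

```python
import math

x = 2038.0

# log_10(x) = log_e(x) / log_e(10)
log10_via_e = math.log(x) / math.log(10)

print(round(log10_via_e, 4))    # 3.3092
print(round(math.log10(x), 4))  # 3.3092, computed directly
```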

The logarithm of some number, in your base, is the exponent to which you have to raise the base to get that number. For example, the logarithm of 100, in base 10, is 2, because 10^{2} is 100, and the logarithm of e^{1/3} (a touch greater than 1.3956), in base e, is 1/3. To dig deeper into my reserve of in-jokes, the logarithm of 2038, in base 10, is approximately 3.3092, because 10^{3.3092} is just about 2038. The logarithm of e, in base 10, is about 0.4343, and the logarithm of 10, in base e, is about 2.303. Your calculator will verify all that.
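Or, in place of a calculator, a few lines of Python will verify it:

```python
import math

print(math.log10(100))               # 2.0
print(round(math.e ** (1/3), 4))     # 1.3956
print(round(math.log10(2038), 4))    # 3.3092
print(round(math.log10(math.e), 4))  # 0.4343
print(round(math.log(10), 4))        # 2.3026
```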

All that talk about “approximately” should have given you some hint of the trouble with logarithms. They’re only really *easy* to compute if you’re looking for whole powers of your base, and then only if your base is 10 or 2 or something else simple like that. If you’re clever and determined you can work out, say, that the logarithm of 2, base 10, has to be close to 0.3. It’s fun to do that, but it’ll involve such reasoning as “two to the tenth power is 1,024, which is very close to ten to the third power, which is 1,000, so the logarithm of two to the tenth power must be about the same as the logarithm of ten to the third power”. That’s clever and fun, but it’s hardly systematic, and it doesn’t get you many digits of accuracy.
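Spelled out in a few lines of Python, the estimate is crude but strikingly close:

```python
import math

# 2**10 = 1024 is nearly 10**3 = 1000,
# so 10 * log10(2) is about 3, i.e. log10(2) is about 0.3.
estimate = 3 / 10
actual = math.log10(2)

print(estimate)                          # 0.3
print(round(actual, 5))                  # 0.30103
print(round(abs(actual - estimate), 5))  # 0.00103
```

Three correct digits is respectable for an argument you can do in your head; getting to 23 digits takes something more systematic.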

So when I pick up this thread I hope to explain one way to produce as many decimal digits of a logarithm as you could want, without asking for too much from your poor Automatic Sequence-Controlled Calculator.

## Boxing Pythagoras 6:33 pm

*Sunday, 24 August, 2014*

I definitely need this book, now. Computers have become so naturally ingrained in our lives that people often forget how incredible the concept once seemed.


## Joseph Nebus 8:40 pm

*Tuesday, 26 August, 2014*

Oh, yes, people do forget how fantastical computers were until quite recently. This book goes into a lot of the hardware, though, which I think is great. There’s something wonderful in how a piece of metal can be made to remember a fact, and it’s not really explained in introductions to computers anymore, even though memory is much more standard now than it was in the 1940s.

I’d picked up the book from the university library. I haven’t checked how available and expensive it is as a used book.


## chrisbrakeshow 3:04 am

*Tuesday, 26 August, 2014*

This is some pretty heavy stuff, man. Thanks for digging deep!

-john


## Joseph Nebus 8:41 pm

*Tuesday, 26 August, 2014*

I’m glad you like it. There’ll be a follow-up soon.


## chrisbrakeshow 6:25 am

*Wednesday, 27 August, 2014*

Awesome man, I will keep my eyes peeled and/or glued to the WP Reader!

-john


## howardat58 3:43 am

*Wednesday, 27 August, 2014*

I am a bit stuck with text-only in the comment section, but here goes:

It started with logs-like-products so I wrote the chord slope function for log as

(log(x + x*h) – log(x))/(x*h)

which quickly and to my surprise became

(1/x)*log(1+h)/h

This provided a rationale for the formal definition of log(x) as

integral(1/t) from t=1 to t=x

I then thought to try the standard “add up the rectangles” approach, but with unequal widths, in a geometric progression.

So for 5 intervals and k^5=x the interval points were 1, k, k^2, k^3, k^4 and k^5 (=x)

The sum of the rectangles came to 5*(1 – 1/k), which eventually, via the trapezoidal rule, gave the n-th estimate of log(x) as

n(root(x,n) – 1/root(x,n))

where root(x,n) is the nth root of x.

I tried it out with n as a power of 2, so square rooting is the only messy bit, and with n=2^10 I got log(e) = 0.9999…..

That is 4 decimal places in ten steps

Not brilliant but it works, and I have never seen anything like this before.


## Joseph Nebus 2:22 am

*Thursday, 28 August, 2014*

I’m still working out to my satisfaction the algebra behind this (in particular, the formula n(root(x, n) – 1/root(x, n)) keeps coming out at about twice the natural logarithm, and I’m not sure just where the two comes in, though I have suspicions), but I do believe you’re right about the basic approach. I believe it’s structurally similar to the “method of exhaustion” that the Ancient Greeks would use to work out the areas underneath a curve, long before there was a calculus to work out these problems. In any case it’s a good scheme.

Of course, the tradeoff for this is that you need the n-th root of the number whose logarithm you want. This might not be too bad, though, especially if you use an n that’s a power of two.
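If it helps to see it numerically: halving the formula does seem to land on the natural logarithm. A quick sketch in Python (the function name is mine, just for illustration), taking the n-th root by repeated square roots since n is a power of two:

```python
import math

def log_estimate(x, k=10):
    """Estimate ln(x) as (n/2) * (x**(1/n) - x**(-1/n)) with n = 2**k,
    where the n-th root is found by k repeated square roots."""
    n = 2 ** k
    r = x
    for _ in range(k):  # after k square-rootings, r is x**(1/n)
        r = math.sqrt(r)
    return (n / 2) * (r - 1 / r)

print(log_estimate(math.e))  # very close to 1.0
print(math.log(math.e))      # 1.0
```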


## howardat58 5:01 pm

*Thursday, 28 August, 2014*

Transcription error!

The divisor of 2 was scribbled so small that I missed it.

Yes, it should be divided by 2.

More in a separate post; the lines are getting shorter.


## Joseph Nebus 2:44 am

*Friday, 29 August, 2014*

I’m glad to have that sorted out.

If you’d like, I’d be happy to set your work up as a guest post on the main page. WordPress even allows the use of basic LaTeX commands, certainly in main posts; I’m not sure how it does in comments. (I’m afraid to try, given how hard LaTeX can be to get right on the first attempt, and that I haven’t found any way to preview or edit comments.)


## howardat58 3:26 am

*Friday, 29 August, 2014*

I would like that. Thank you.

What I use is a very simple program called mathedit, which lets you put the stuff in picture form (media), and I figured out how to modify the display size to fill the width. Have a look at my latest post to see the effect.

If I save any pics to my blogsite you can access them, put them in a post, and so on, using the URL.

I was about to start on this, with explanation, anyway, and I have figured out a way of improving the trapezium rule method dramatically.

If you email me at howard_at_58@yahoo.co.uk I can send you stuff directly to check out.

Gracias




## nebusresearch | My Math Blog Statistics, August 2014 11:52 pm

*Monday, 1 September, 2014*

[…] Machines That Think About Logarithms, so I’m predicting good things for the follow-up on machines which do something about logarithms. […]


## nebusresearch | Machines That Do Something About Logarithms 10:05 pm

*Wednesday, 3 September, 2014*

[…] I’m going to assume everyone reading this accepts that logarithms are worth computing, and try to describe how Harvard’s IBM Automatic Sequence-Controlled Calculator would work them… […]
