How To Build Infinite Numbers


I had missed it, as mentioned in the above tweet. The link is to a page on the Form And Formalism blog, reprinting a translation of one of Georg Cantor’s papers in which he founded the modern understanding of sets, of infinite sets, and of infinitely large numbers. Although it gets into pretty heady topics, it doesn’t actually require a mathematical background, at least as I look at it; it just requires a willingness to follow long chains of reasoning, which I admit is much harder than algebra.

Cantor — whom I’d talked a bit about in a recent Reading The Comics post — was deeply concerned and intrigued by infinity. His paper enters into that curious space where mathematics, philosophy, and even theology blend together, since it’s difficult to talk about the infinite without people thinking of God. I admit the philosophical side of the discussion is difficult for me to follow, and the theological side harder yet, but a philosopher or theologian would probably have symmetric complaints.

The translation is provided as scans of a typewritten document, so you can see what it was like trying to include mathematical symbols in non-typeset text in the days before LaTeX (which is great at it, but requires annoying amounts of setup) or HTML (which is mediocre at it, but requires less setup) or Word (I don’t use Word) were available. Somehow, folks managed to live through times like that, but it wasn’t pretty.

George Berkeley’s 329th Birthday


The stream of mathematics-trivia tweets brought to my attention that the 12th of March, 1685 [1], was the birthday of George Berkeley, who’d become the Bishop of Cloyne and be an important philosopher, and who’s gotten a bit of mathematical immortality for complaining about calculus. Granted everyone who takes it complains about calculus, but Berkeley had the good sorts of complaints, the ones that force people to think harder and more clearly about what they’re doing.

Berkeley — whose name I’m told by people I consider reliable was pronounced “barkley” — particularly protested the “fluxions” of calculus as it was practiced in the day in his 1734 tract The Analyst: Or A Discourse Addressed To An Infidel Mathematician, which as far as I know nobody I went to grad school with ever read either, so maybe you shouldn’t bother reading what I have to say about them.

Fluxions were meant to represent infinitesimally small quantities, which could be added to or subtracted from a number without changing the number, but which could be divided by one another to produce a meaningful answer. That’s a hard set of properties to quite rationalize — if you can add something to a number without changing the number, you’re adding zero; and if you’re dividing zero by zero you’re not doing division anymore — and yet calculus was doing just that. For example, if you want to find the slope of a curve at a single point on the curve, you’d take the x- and y-coordinates of that point, add an infinitesimally small number to the x-coordinate, see how much the y-coordinate has to change to still be on the curve, and then divide the change in the y-coordinate by that infinitesimally small change in the x-coordinate, both of them too small even to be numbers, and get something out of it.
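To make that concrete, here’s a quick sketch of the sort of calculation meant, using the parabola y = x^2 as the curve (my example, not one from Berkeley’s tract). Add a fluxion dx to the x-coordinate and form the quotient of the changes:

\[
\frac{(x + dx)^2 - x^2}{dx} = \frac{2x \, dx + (dx)^2}{dx} = 2x + dx .
\]

Then declare the leftover dx too small to matter and you have the slope 2x, which is the right answer, reached by exactly the move Berkeley objected to: dx was small enough to throw away at the end, but somehow not so small that dividing by it was meaningless.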

It works, at least if you’re doing the calculations right, and Berkeley supposed the calculations worked only because multiple logical errors cancelled one another out; but he termed these fluxions, with spectacularly good phrasing, “ghosts of departed quantities”, and it would take better than a century to put all his criticisms quite to rest. The result is the differential calculus we know today.
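For the record, the repair that eventually stuck replaces the fluxion with a limit. In the modern formulation the slope of the curve y = f(x) at a point is

\[
f'(x) = \lim_{h \to 0} \frac{f(x + h) - f(x)}{h} ,
\]

where h is always an ordinary nonzero number and we only ask what value the quotient approaches as h shrinks, so nothing has to be both negligible and divisible at the same time.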

I should point out that it’s not as if mathematicians playing with their shiny new calculus tools were being irresponsible in using differentials and integrals despite Berkeley’s criticisms. Mathematical concepts work a good deal like inventions, in that it’s not clear what is really good about them until they’re used, and it’s not clear what has to be made better until there’s a body of experience working with them and seeing where the flaws are. And Berkeley was hardly being unreasonable in insisting on logical rigor in mathematics.

[1] Berkeley was born in Ireland. I have found it surprisingly hard to get a clear answer about when Ireland switched from the Julian to the Gregorian calendar, so I have no idea whether this birthdate is old style or new style, and for that matter whether the 1685 represents the civil year or the historical year. Perhaps it suffices to say that Berkeley was born sometime around this time of year, a long while ago.

November 2013’s Statistics


Hi again. I was hesitant to look at this month’s statistics, as I pretty much fell off the face of the earth for a week there and didn’t have the chance to do the serious thinking that’s needed for mathematics writing. The result’s almost exactly the dropoff in readership I might have predicted: from 440 views in October down to 308, and from 220 unique visitors down to 158. That’s almost an unchanged number of views per visitor, 2.00 dropping to 1.95, so at least the people still interested in me are sticking around.

The countries sending me the most viewers were as ever the United States, then Austria (hi, Elke, and thank you), the United Kingdom, and then Canada. Sending me a single visitor each were Bulgaria, Cyprus, Czech Republic, Ethiopia, France, Jordan, Lebanon, Nepal, New Zealand, Russia, Singapore, Slovenia, Switzerland, and Thailand. This is also a drop in the number of single-viewer countries, and stalwarts Finland and the Netherlands are off the list. Slovenia’s the only country making a repeat appearance from last month, in fact.

The most popular articles the past month were:

And I apologize for not having produced many essays the past couple weeks, and can only fault myself for being more fascinated by some problems in my day job that’ve been taking up time and mental energy and waking me in the middle of the night with stuff I should try. I’ll be back to normal soon, I’m sure. Don’t tell my boss.

Philosophical Origins of Computers


Scott Pellegrino here talks a bit about Boole’s laws, logic, and set theory, and looks to be building up to computation and information theory, if his writing continues along the line promised here.

The Modern Dilettante

As indicated by my last post, I’d really like to tie philosophical contributions to mathematics to the rise of the computer. I’d like to jump from Leibniz to Boole, since Boole got the ball rolling toward finally bringing to fruition the possibility Leibniz first speculated on.

In graduate school, I came across a series of lectures by a former head of the entire research and development division of IBM, which covered, in a surprising level of detail, the philosophical origins of the computer industry. To be honest, it’s the sort of subject that really should be book-length. But I think it really is a great contemporary example of exactly what philosophy is supposed to be: discovering new methods of analysis that, as they develop, are spun out of philosophy and given birth as a new independent (or semi-independent) field apart from their philosophical origins. Theoretical linguistics is a…


Who Was Karl Pearson?


An offhanded joke in the Usenet newsgroup alt.fan.cecil-adams — a great spot for offhanded jokes, as the audience is reasonably demanding — about baseball being a game of statistics but this being ridiculous prompted me to say I hoped the Pearson Chi-Squared Test had a good season, since it was at the core of my baseball statistics fantasy team. One respondent asked if this was connected to Pearson Publishing, which has recently screwed up its composition of standardized tests for New York State quite severely, including giving as a reading comprehension assignment a bit of nonsense composed to have no meaning, and making twenty mistakes in the non-English translation of a math exam. There’s no connection of which I’m aware; but why not take a couple paragraphs to talk about Karl Pearson?
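For anyone wondering what the test in that joke actually is: Pearson’s chi-squared statistic, in its standard textbook form, is

\[
\chi^2 = \sum_{i} \frac{(O_i - E_i)^2}{E_i} ,
\]

where the O_i are the counts observed in each category and the E_i are the counts some hypothesis predicts; a large value of the statistic is evidence that the hypothesis fits the data poorly.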
