A Little More Talk About What We Talk About When We Talk About How Interesting What We Talk About Is


I had been talking about how much information there is in the outcome of basketball games, or tournaments, or the like. I wanted to fill in at least one technical term, to match some of the others I’d given.

In this information-theory context, an experiment is just anything that could have different outcomes. A team can win or can lose or can tie in a game; that makes the game an experiment. The outcomes are the team wins, or loses, or ties. A team can get a particular score in the game; that makes that game a different experiment. The possible outcomes are the team scores zero points, or one point, or two points, or so on up to whatever the greatest possible score is.

If you know the probability p of each of the different outcomes, and since this is a mathematics thing we suppose that you do, then we have what I was calling the information content of the outcome of the experiment. That’s a number, measured in bits, and given by the formula

\sum_{j} - p_j \cdot \log\left(p_j\right)

The sigma summation symbol means to evaluate the expression to the right of it for every value of some index j. The p_j means the probability of outcome number j. And the logarithm may be that of any base, although if we use base two then we have an information content measured in bits. Those are the same bits as are in the bytes that make up the megabytes and gigabytes in your computer. You can see this number as an estimate of how many well-chosen yes-or-no questions you’d have to ask to pick the actual result out of all the possible ones.
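
That formula translates almost directly into a few lines of code. Here is a minimal sketch in Python; the function name entropy_bits is my own, and the win/lose/tie probabilities are made up purely for illustration.

import math

def entropy_bits(probabilities):
    # Shannon entropy, in bits, of a list of outcome probabilities.
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A made-up win/lose/tie experiment: 50 percent win, 40 percent lose, 10 percent tie.
print(entropy_bits([0.5, 0.4, 0.1]))   # roughly 1.36 bits

A sure thing, probability 1, gives 0 bits; two equally likely outcomes give exactly 1 bit.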

I’d called this the information content of the experiment’s outcome. That’s an idiosyncratic term, chosen because I wanted to hide what it’s normally called. The normal name for this is the “entropy”.

To be more precise, it’s known as the “Shannon entropy”, after Claude Shannon, pioneer of the modern theory of information. However, the equation defining it looks the same as one that defines the entropy of statistical mechanics, that thing everyone knows is always increasing and somehow connected with stuff breaking down. Well, almost the same. The statistical mechanics one multiplies the sum by a constant number called the Boltzmann constant, after Ludwig Boltzmann, who did so much to put statistical mechanics in its present and very useful form. We aren’t thrown by that. The statistical mechanics entropy describes energy that is in a system but that can’t be used. It’s almost background noise, present but nothing of interest.

Is this Shannon entropy the same entropy as in statistical mechanics? This gets into some abstract grounds. If two things are described by the same formula, are they the same kind of thing? Maybe they are, although it’s hard to see what kind of thing might be shared by “how interesting the score of a basketball game is” and “how much unavailable energy there is in an engine”.

The legend has it that when Shannon was working out his information theory he needed a name for this quantity. John von Neumann, the mathematician and pioneer of computer science, suggested, “You should call it entropy. In the first place, a mathematical development very much like yours already exists in Boltzmann’s statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage.” There are variations of the quote, but they have the same structure and punch line. The anecdote appears to trace back to an April 1961 seminar at MIT given by one Myron Tribus, who claimed to have heard the story from Shannon. I am not sure whether it is literally true, but it does express a feeling about how people understand entropy that is true.

Well, these entropies have the same form. And they’re given the same name, give or take a modifier of “Shannon” or “statistical” or some other qualifier. They’re even often given the same symbol; normally a capital S or maybe an H is used as the quantity of entropy. (H tends to be more common for the Shannon entropy, but your equation would be understood either way.)

I’m not comfortable saying they’re the same thing, though. After all, we use the same formula to calculate a batting average and to work out the average time of a commute. But we don’t think those are the same thing, at least not more generally than “they’re both averages”. These entropies measure different kinds of things. They have different units that just can’t be sensibly converted from one to another. And the statistical mechanics entropy has many definitions that not just don’t have parallels for information, but wouldn’t even make sense for information. I would call these entropies siblings, with strikingly similar profiles, but not more than that.

But let me point out something about the Shannon entropy. It is low when an outcome is predictable. If the outcome is unpredictable, presumably knowing the outcome will be interesting, since there is no guessing what it might be; that is when the entropy is at its highest. But an absolutely random outcome, one where every result is as likely as every other, also has a high entropy. And that’s boring: there’s no reason for the outcome to be one option instead of another. Somehow, as looked at by the measure of entropy, the most interesting of outcomes and the most meaningless of outcomes blur together. There is something wondrous and strange in that.

Reading the Comics, April 22, 2015: April 21, 2015 Edition


I try to avoid doing Reading The Comics entries back-to-back since I know they can get a bit repetitive. How many ways can I say something is a student-resisting-the-word-problem joke? But if Comic Strip Master Command is going to send a half-dozen strips at least mentioning mathematical topics in a single day, how can I resist the challenge? Worse, what might they have waiting for me tomorrow? So here’s a bunch of comic strips from the 21st of April, 2015:

Mark Anderson’s Andertoons plays on the idea of a number being used up. I’m most tickled by this one. I have heard that the New York Yankees may be running short on uniform numbers after having so many retired. It appears they’ve only retired 17 numbers, but they do need numbers for a 40-player roster as well as managers and coaches and other participants. Also, and this delights me, two numbers are retired for two people each. (Number 8, for Yogi Berra and Bill Dickey, and Number 42, for Jackie Robinson and Mariano Rivera.)

Continue reading “Reading the Comics, April 22, 2015: April 21, 2015 Edition”

Reading the Comics, April 20, 2015: History of Mathematics Edition


This is a bit of a broad claim, but it seems Comic Strip Master Command was thinking of all mathematics one lead-time ago. There’s a comic about the original invention of mathematics, and another showing off 20th century physics equations. This seems as much of the history of mathematics as one could reasonably expect from the comics page.

Mark Anderson’s Andertoons gets its traditional appearance around here with the April 17th strip. It features a bit of arithmetic that is indeed lovely but wrong.

Continue reading “Reading the Comics, April 20, 2015: History of Mathematics Edition”

The Leap-The-Dips at Lakemont Park, Altoona, Pennsylvania, as photographed by Joseph Nebus in July 2013 from the edge of the launch platform.

Roller Coaster Immortality Update!


Several years ago I had the chance to go to Lakemont Park, in Altoona, Pennsylvania. It’s a lovely and very old amusement park, featuring the oldest operating roller coaster, Leap The Dips. As roller coasters go it’s not very large and not very fast, but it’s a great ride. It does literally and without exaggeration leap off the track, though not far enough to be dangerous. I recommend the park and the ride to people who have cause to be in the middle of Pennsylvania.

I wondered whether any boards in it might date from the original construction in 1902 by the E. Joy Morris company. If we make some assumptions we can turn this into a probability problem. It’s the kind of problem whose answer always seems to come out to 1/e. (The problem is “what is the probability that any particular piece of wood has lasted 100 years, if a piece of wood has a one percent chance of needing replacement every year?”) That’s a probability of about 37 percent. But I doubted this answer meant anything. My skepticism came from wondering why every piece of wood should be equally likely to survive every year. Different pieces serve different structural roles, and will be exposed to the elements differently. How can I be sure that the probability one piece needs replacement is independent of the probability some other piece needs replacement? But if they’re not independent then my calculation doesn’t give a relevant answer.
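
As a quick check of that arithmetic, here is a tiny sketch in Python, under the same assumption that each piece independently has a one percent chance of needing replacement each year:

import math

p_survive_one_year = 0.99            # assumed: one percent replacement chance per year
years = 100
print(p_survive_one_year ** years)   # about 0.366
print(math.exp(-1))                  # 1/e, about 0.368, for comparison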

The Leap-The-Dips roller coaster at Lakemont Park, Altoona, Pennsylvania.

A recent post on the Usenet roller coaster enthusiast newsgroup rec.roller-coaster, in a discussion titled “Age a coaster should be preserved”, suggests I was right in my skepticism. Derek Gee writes:

According to the video documentary the park produced around 1999, all of the original upright lumber was found to be in excellent shape. The E. Joy Morris company had waterproofed it by sealing it in ten coats of paint and it was old-growth hardwood. All the horizontal lumber was replaced as I recall.

I am aware this is not an academically rigorous answer to the question of how much of the roller coaster’s original construction is still in place. But it is a lead. It suggests that quite a bit of the antique ride is as antique as could be.

Reading the Comics, April 15, 2015: Tax Day Edition


Since it is mid-April, and most of the comic strips at Comics Kingdom and GoComics.com are based in the United States, Comic Strip Master Command ordered quite a few comics about taxes. Most of those are simple grumbling, but the subject naturally comes around to arithmetic and calculation and sometimes even logic. Thus, this is a Tax Day edition, though it’s bookended with Mutt and Jeff.

Bud Fisher’s Mutt And Jeff (April 11) — a rerun from goodness only knows when, and almost certainly neither written nor drawn by Bud Fisher at that point — recounts a joke that has the form of a word problem in which a person’s age is deduced from information about the age. It’s an old form, but jokes about cutting the Gordian knot are probably always going to be reliable. I’m reminded there’s a story of Thomas Edison giving a new hire, a mathematician, the problem of working out the volume of a light bulb. Edison got impatient with the mathematician treating it as a calculus problem — the volume of a rotationally symmetric object like a bulb is the sort of thing you can do by the end of Freshman Calculus — and instead filled a bulb with water, poured the water into a graduated cylinder, and read the volume off that.
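
For the record, the calculus the mathematician presumably reached for is the disc method: if the bulb’s profile is some radius r(z) at each height z between a and b (the r, a, and b are just my notation for this sketch), then the volume is

\pi \cdot \int_{a}^{b} \left[ r\left(z\right) \right]^2 \, dz

The hard part is writing down a good r(z) for an actual light bulb, which is rather Edison’s point.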

Calculus under 50: vectors and stuff. Calculus over 50: diet and exercise problems.
Sandra Bell-Lundy’s Between Friends for the 12th of April, 2015. The link will likely expire around the 12th of May.

Sandra Bell-Lundy’s Between Friends (April 12) uses Calculus as the shorthand for “the hardest stuff you might have to deal with”. The symbols on the left-hand side are fair enough, although I’d think of them more as precalculus or linear algebra or physics. They do parse well enough, as long as I suppose that what sure look like a couple of extraneous + signs are meant to be “t”s. But “t” is a common enough variable in calculus problems, usually representing time, sometimes just representing “some parameter whose value we don’t really care about, but we don’t want it to be x”, and it does look an awful lot like a plus sign there. On the right side, I have no idea what a root of forty minutes on a treadmill might be. It’s symbolic.

Continue reading “Reading the Comics, April 15, 2015: Tax Day Edition”

Spontaneity and the performance of work


Joseph Nebus:

I’d wanted just to point folks to the latest essay in the CarnotCycle blog. This thermodynamics piece is a bit about how work gets done, and how it relates to two kinds of variables describing systems. The two kinds are known as intensive and extensive variables, and considering them helps guide us to a different way to regard physical problems.

Originally posted on carnotcycle:


Imagine a perfect gas contained by a rigid-walled cylinder equipped with a frictionless piston held in position by a removable external agency such as a magnet. There are finite differences in the pressure (P₁ > P₂) and volume (V₂ > V₁) of the gas in the two compartments, while the temperature can be regarded as constant.

If the constraint on the piston is removed, will the piston move? And if so, in which direction?

Common sense, otherwise known as dimensional analysis, tells us that differences in volume (dimensions L³) cannot give rise to a force. But differences in pressure (dimensions ML⁻¹T⁻²) certainly can. There will be a net force of P₁ − P₂ per unit area of piston, driving it to the right.

– – – –

The driving force

In thermodynamics, there exists a set of variables which act as “generalised forces” driving a system from one state to…


Reading the Comics, April 10, 2015: Getting Into The Story Problem Edition


I know it’s been like forever, or four days, since the last time I had a half-dozen or so mathematically themed comic strips to write about, but if Comic Strip Master Command is going to order cartoonists to give me stuff to write about I’m not going to turn them away. Several seemed to me about the struggle to get someone to buy into a story — the thing being asked after in a word problem, perhaps, or about the ways mathematics is worth knowing, or just how the mathematics in a joke’s setup are presented — and how skepticism about these things can turn up. So I’ll declare that the theme of this collection.

Steve Sicula’s Home And Away started a sequence on April 7th about “is math really important?”, with the father trying to argue that it’s so very useful. I’m not sure anyone’s ever really been convinced by the argument that “this is useful, therefore it’s important, therefore it’s interesting”. Lots of things are useful or important while staying fantastically dull to all but a select few souls. I would like to think a better argument for learning mathematics is that it’s beautiful, and astounding, and it allows you to discover new ways of studying the world; it can offer all the joy of any art, even as it has a practical side. Anyway, the sequence goes on for several days, and while I can’t say the arguments get very convincing on any side, they do allow for a little play with the fourth wall that I usually find amusing in comics which don’t do that much.

Continue reading “Reading the Comics, April 10, 2015: Getting Into The Story Problem Edition”

Doesn’t The Other Team Count? How Much?


I’d worked out an estimate of how much information content there is in a basketball score, by which I was careful to say the score that one team manages in a game. I wasn’t able to find out what the actual distribution of real-world scores was like, unfortunately, so I made up a plausible-sounding guess: that college basketball scores would be distributed among the imaginable numbers (whole numbers from zero through … well, infinitely large numbers, though in practice probably not more than 150) according to a very common distribution called the “Gaussian” or “normal” distribution, that the arithmetic mean score would be about 65, and that the standard deviation, a measure of how spread out the distribution of scores is, would be about 10.

If those assumptions are true, or are at least close enough to true, then there are something like 5.4 bits of information in a single team’s score. Put another way, if you were trying to divine the score by asking someone who knew it a series of carefully-chosen questions, like, “is the score less than 65?” or “is the score more than 39?”, with at each stage each question equally likely to be answered yes or no, you could expect to hit the exact score with usually five, sometimes six, such questions.
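
If you would like to check that figure, here is a little sketch in Python. It builds the distribution I assumed, whole-number scores from 0 through 150 weighted by a normal curve with mean 65 and standard deviation 10, and adds up the entropy:

import math

mean, stddev = 65.0, 10.0
# Weight each whole-number score from 0 through 150 by the normal density, then renormalize.
weights = [math.exp(-((k - mean) ** 2) / (2 * stddev ** 2)) for k in range(151)]
total = sum(weights)
probabilities = [w / total for w in weights]
print(sum(-p * math.log2(p) for p in probabilities if p > 0))   # roughly 5.4 bits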

Continue reading “Doesn’t The Other Team Count? How Much?”

Reading the Comics, April 6, 2015: Little Infinite Edition


As I warned, there were a lot of mathematically-themed comic strips the last week, and here I can at least get us through the start of April. This doesn’t include the strips that ran today, the 7th of April by my calendar, because I have to get some serious-looking men to look at my car and I just know they’re going to disapprove of what my CV joint covers look like, even though I’ve done nothing to them. But I won’t be reading most of today’s comic strips until after that’s done, and so commenting on them later.

Mark Anderson’s Andertoons (April 3) makes its traditional appearance in my roundup, in this case with a business-type guy declaring infinity to be “the loophole of all loopholes!” I think that’s overstating things a fair bit, but strange and very counter-intuitive things do happen when you try to work out a problem in which infinities turn up. For example: in ordinary arithmetic, the order in which you add together a bunch of real numbers makes no difference. If you want to add together infinitely many real numbers, though, it is possible to have them add to different numbers depending on what order you add them in. Most unsettlingly, if the series converges but doesn’t converge absolutely, those infinitely many real numbers can be made to add up to literally any real number you like, just by choosing the order in which you add them. And then things get really weird.
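
To make that concrete, here is a small sketch in Python of the rearrangement trick, using the alternating harmonic series 1 - 1/2 + 1/3 - 1/4 and so on, which in its usual order adds up to the natural logarithm of 2, about 0.693. The target of 2.0 is an arbitrary choice of mine; the same greedy scheme steers the partial sums toward any real number you care to name.

import itertools

target = 2.0    # arbitrary; any real number would do
positives = (1.0 / n for n in itertools.count(1, 2))    # 1, 1/3, 1/5, ...
negatives = (-1.0 / n for n in itertools.count(2, 2))   # -1/2, -1/4, ...
partial_sum = 0.0
for _ in range(100000):
    # Take a positive term while below the target, a negative term while above it.
    partial_sum += next(positives) if partial_sum < target else next(negatives)
print(partial_sum)    # settles in very near 2.0, not near 0.693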

Keith Tutt and Daniel Saunders’s Lard’s World Peace Tips (April 3) is the other strip in this roundup to at least name-drop infinity. I confess I don’t see how “being infinite” would help in bringing about world peace, but I suppose being finite hasn’t managed the trick just yet so we might want to think outside the box.

Continue reading “Reading the Comics, April 6, 2015: Little Infinite Edition”

But How Interesting Is A Basketball Score?


When I worked out how interesting, in an information-theory sense, a basketball game — and from that, a tournament — might be, I supposed there was only one thing that might be interesting about the game: who won? Or to be exact, “did (this team) win”? But that isn’t everything we might want to know about a game. For example, we might want to know what a team scored. People often do. So how to measure this?

The answer was given, in embryo, in my first piece about how interesting a game might be. If you can list all the possible outcomes of something that has multiple outcomes, and how probable each of those outcomes is, then you can describe how much information there is in knowing the result. It’s the sum, for all of the possible results, of the quantity negative one times the probability of the result times the logarithm-base-two of the probability of the result. When we were interested only in whether a team won or lost there were just two possible outcomes, which made for some fairly simple calculations, and showed that the information content of a game can be as high as 1 — if the team is equally likely to win or to lose — or as low as 0 — if the team is sure to win, or sure to lose. And the units of this measure are bits, the same kind of thing we use to measure (in groups of bits called bytes) how big a computer file is.
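
For the record, with just two outcomes and a probability p that the team wins, that sum works out to

- p \cdot \log_2\left(p\right) - \left(1 - p\right) \cdot \log_2\left(1 - p\right)

which equals 1 when p is exactly one-half and falls to 0 as p approaches either 0 or 1.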

Continue reading “But How Interesting Is A Basketball Score?”