Reading the Comics, February 3, 2017: Counting Edition


And now I can close out last week’s mathematically-themed comic strips. Two of them are even about counting, which is enough for me to make that the name of this set.

John Allen’s Nest Heads for the 2nd mentions a probability and statistics class and something it’s supposed to be good for. I would agree that probability and statistics are probably (I can’t find a better way to write this) the most practically useful mathematics one can learn. At least once you’re past arithmetic. They’re practical by birth; humans began studying them because they offer guidance in uncertain situations. And one can use many of their tools without needing more than arithmetic.

I’m not so staunchly anti-lottery as many mathematics people are. I’ll admit I play it myself, when the jackpot is large enough. When the expectation value of the prize gets to be positive, it’s harder to rationalize not playing. This happens only once or twice a year, but it’s fun to watch and see when it happens. I grant it’s a foolish way to use two dollars (two tickets are my limit), but you know? My budget is not so tight I can’t spend four dollars foolishly a year. Besides, I don’t insist on winning one of those half-billion-dollar prizes. I imagine I’d be satisfied if I brought in a mere $10,000.
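For a sense of the arithmetic, here's a back-of-envelope expectation-value check in Python. The jackpot and odds here are illustrative figures on the scale of a big United States multi-state lottery, not any real drawing's numbers, and a real calculation would also have to account for taxes, lump-sum discounts, shared jackpots, and the smaller prizes.

```python
# Illustrative figures only: a jackpot and odds on the scale of a big
# multi-state lottery, not any real drawing's numbers.
jackpot = 1_500_000_000      # dollars
odds = 292_000_000           # about 1 chance in this many per ticket
ticket_price = 2             # dollars

expected_winnings = jackpot / odds       # naive expected payout per ticket
print(expected_winnings)                 # about 5.14 dollars
print(expected_winnings > ticket_price)  # True: the naive expectation is positive
```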

'Hey, Ruthie's Granny, how old are you?' 'You can't count that high, James.' 'I can too!' 'Fine! Start at one and I'll tell you when you get to my age.' '1, 2, 3, 4, 11, 22, 88, 99, 200, a gazillion!' 'Very good! It's somewhere between 22 and a gazillion!' 'Gazowie!'
Rick Detorie’s One Big Happy for the 3rd of February, 2017. A ‘gazillion’ is actually a surprisingly low number, hovering as it does somewhere around 212. Fun fact!

Rick Detorie’s One Big Happy for the 3rd continues my previous essay’s bit of incompetence at basic mathematics, here, counting. But working out that her age is between 22 and a gazillion may be worth doing. It’s a common mathematical challenge to find a correct number starting from little information about it. Usually we find it by locating bounds: the number must be larger than this and smaller than that. And then get the bounds closer together. Stop when they’re close enough for our needs, if we’re numerical mathematicians. Stop when the bounds are equal to each other, if we’re analytic mathematicians. That can take a lot of work. Many problems in number theory amount to “improve our estimate of the lowest (or highest) number for which this is true”. We have to start somewhere.
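That squeeze-the-bounds routine is, in miniature, the bisection method. Here's a sketch in Python, hunting the square root of 2 starting from the bounds 1 and 2, with the numerical mathematician's stopping rule of "close enough":

```python
def bisect_sqrt(target, lo, hi, tolerance=1e-6):
    """Tighten lower and upper bounds on the square root of target."""
    while hi - lo > tolerance:
        mid = (lo + hi) / 2
        if mid * mid < target:
            lo = mid   # the root must be above the midpoint
        else:
            hi = mid   # the root must be at or below it
    return lo, hi

lo, hi = bisect_sqrt(2, 1, 2)
print(lo, hi)   # both near 1.41421...
```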

Samson’s Dark Side of the Horse for the 3rd is a counting-sheep joke and I was amused that the counting went so awry here. On looking over the strip again for this essay, though, I realize I read it wrong. It’s the fences that are getting counted, not the sheep. Well, it’s a cute little sheep having the same problems counting that Horace has. We don’t tend to do well counting more than around seven things at a glance. We can get a bit farther if we can group things together and spot that, say, we have four groups of four fences each. That works and it’s legitimate; we’re counting and we get the right count out of it. But it does feel like we’re doing something different from how we count, say, three things at a glance.

Mick Mastroianni and Mason Mastroianni’s Dogs of C Kennel for the 3rd is about the world’s favorite piece of statistical mechanics, entropy. There’s room for quibbling about what exactly we mean by thermodynamics saying all matter is slowly breaking down. But the gist is fair enough. It’s still mysterious, though. To say that the disorder of things is always increasing forces us to think about what we mean by disorder. It’s easy to think we have an idea what we mean by it. It’s hard to make that a completely satisfying definition. In this way it’s much like randomness, which is another idea often treated as the same as disorder.

Bill Amend’s FoxTrot Classics for the 3rd reprinted the comic from the 10th of February, 2006. Mathematics teachers always want to see how you get your answers. Why? … Well, there are different categories of mistakes someone can make. One can set out trying to solve the wrong problem. One can set out trying to solve the right problem in a wrong way. One can set out solving the right problem in the right way and get lost somewhere in the process. Or one can be doing just fine and somewhere along the line change an addition to a subtraction and get what looks like the wrong answer. Each of these is a different kind of mistake. Knowing what kinds of mistakes people make is key to helping them not make these mistakes. They can get on to making more exciting mistakes.


Reading the Comics, May 3, 2016: Lots Of Images Edition


After the heavy pace of March and April I figure to take it easy and settle to about a three-a-week schedule around here. That doesn’t mean that Comic Strip Master Command wants things to be too slow for me. And this time they gave me more comics than usual that have expiring URLs. I don’t think I’ve had this many pictures to include in a long while.

Bill Whitehead’s Free Range for the 28th presents an equation-solving nightmare. From my experience, this would be … a great pain, yes. But it wouldn’t be a career-wrecking mess. Typically a problem that’s hard to solve is hard because you have no idea what to do. Given an expression, you’re allowed to do anything that doesn’t change its truth value. And many approaches might look promising without quite resolving to something useful. The real breakthrough is working out what approach should be used. For an astrophysics problem, there are some classes of key decisions to make. One class is what to include and what to omit in the model. Another class is what to approximate — and how — versus what to treat exactly. Another class is what sorts of substitutions and transformations turn the original expression into one that reveals what you want. Those are the hard parts, and those are unlikely to have been forgotten. Applying those may be tedious, and I don’t doubt it would be anguishing to have the finished work wiped out. But it wouldn’t set one back years either. It would just hurt.

Christopher Grady’s Lunarbaboon for the 29th I classify as the “anthropomorphic numerals” joke for this essay. Boy, have we all been there.

'Numbers are boring!' complains the audience. 'Not so. They contain high drama and narrative. Here's an expense account that was turned in to me last week. Can you create a *story* based on these numbers?' 'Once upon a time, a guy was fired for malfeasance ... ' 'If you skip right to the big finish, sure.'
Bill Holbrook’s On The Fastrack for the 29th of April, 2016. Spoiler: there aren’t any numbers in the second panel.

Bill Holbrook’s On The Fastrack for the 29th continues the storyline about Fi giving her STEM talk. She is right, as I see it, in attributing drama and narrative to numbers. This is most easily seen in the sorts of finance and accounting mathematics which the character does. And the inevitable answer to “numbers are boring” (or “mathematics is boring”) is surely to show how they are about people. Even abstract mathematics is about things (some) people find interesting, and that must be about the people too.

'Look, Grandpa! I got 100% on my math test! Do you know what that means? It means that out of ten questions, I got at least half of them correct!' 'It must be that new, new, new math.' 'So many friendly numbers!'
Rick Detorie’s One Big Happy for the 3rd of May, 2016. Ever notice how many shirt pockets Grandpa has? I’m not saying it’s unrealistic, just that it’s more than the average.

Rick Detorie’s One Big Happy for the 3rd is a confused-mathematics joke. Grandpa tosses off a New Math joke that’s reasonably age-appropriate too, which is always nice to see in a comic strip. I don’t know how seriously to take Ruthie’s assertion that a 100% means she only got at least half of the questions correct. It could be a cartoonist grumbling about how kids these days never learn anything, the same way every past generation of cartoonists had complained. But Ruthie is also the sort of perpetually-confused, perpetually-confusing character who would get the implications of a 100% on a test wrong. Or would state them weirdly, since yes, a 100% does imply getting at least half the test’s questions right.

Border Collies, as we know, are highly intelligent. 'Yup, the math confirms it --- we can't get by without people.'
Niklas Eriksson’s Carpe Diem for the 3rd of May, 2016. I’m a little unnerved there seems to be a multiplication x at the end of the square root vinculum on the third line there.

Niklas Eriksson’s Carpe Diem for the 3rd uses the traditional board full of mathematical symbols as signifier of intelligence. There are some interesting mixes of symbols here. The c², for example, isn’t wrong for mathematics. But it does evoke Einstein and physics. There’s the curious mix of the symbol π and the approximation 3.14. But then I’m not sure how we would get from any of this to a proposition like “whether we can survive without people”.

'What comes after eleven?' 'I can't do it. I don't have enough fingers to count on!' Tiger hands him a baseball glove. 'Use this.'
Bud Blake’s Tiger for the 3rd of May, 2016. How did Punkinhead get up to eleven?

Bud Blake’s Tiger for the 3rd is a cute little kids-learning-to-count thing. I suppose it doesn’t really need to be here. But Punkinhead looks so cute wearing his tie dangling down onto the floor, the way kids wear their ties these days.

Tony Murphy’s It’s All About You for the 3rd name-drops algebra. I think what the author really wanted here was arithmetic, if the goal is to figure out the right time based on four clocks. They seem to be trying to do a simple arithmetic mean of the time on the four clocks, which is fair if we make some assumptions about how clocks drift away from the correct time. Mostly those assumptions are that the clocks all started right and are equally likely to drift backwards or forwards, and do that drifting at the same rate. If some clocks are more reliable than others, then, their claimed time should get more weight than the others. And something like that must be at work here. The mean of 7:56, 8:02, 8:07, and 8:13, uncorrected, is 8:04 and thirty seconds. That’s not close enough to 8:03 “and five-eighths” unless someone’s been calculating wrong, or supposing that 8:02 is more probably right than 8:13 is.
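The unweighted and weighted means are easy to check. Here's a sketch in Python; the weights in the second half are invented purely for illustration:

```python
clocks = ['7:56', '8:02', '8:07', '8:13']

def minutes_past_eight(hhmm):
    """Convert a clock reading to minutes after 8:00 (negative if before)."""
    h, m = hhmm.split(':')
    return (int(h) - 8) * 60 + int(m)

offsets = [minutes_past_eight(c) for c in clocks]   # [-4, 2, 7, 13]
mean = sum(offsets) / len(offsets)
print(mean)   # 4.5, that is, 8:04 and thirty seconds

# Weighted mean: trust one clock more than the others.
# These weights are made up for the example.
weights = [1, 3, 1, 1]   # say the 8:02 clock is known to keep better time
weighted = sum(w * o for w, o in zip(weights, offsets)) / sum(weights)
print(weighted)   # about 3.67 minutes, pulled toward the trusted clock
```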

A Leap Day 2016 Mathematics A To Z: Uncountable


I’m drawing closer to the end of the alphabet. While I have got choices for ‘V’ and ‘W’ set, I’ll admit that I’m still looking for something that inspires me in the last couple letters. Such inspiration might come from anywhere. HowardAt58, of that WordPress blog, gave me the notion for today’s entry.

Uncountable.

What are we doing when we count things?

Maybe nothing. We might be counting just to be doing something. Or we might be counting because we want to do nothing. Counting can be a good way into a restful state. Fair enough. Just because we do something doesn’t mean we care about the result.

Suppose we do care about the result of our counting. Then what is it we do when we count? The mechanism is straightforward enough. We pick out things and say, or imagine saying, “one, two, three, four,” and so on. Or we at least imagine the numbers along with the things being numbered. When we run out of things to count, we take whatever the last number was. That’s how many of the things there were. Why are there eight light bulbs in the chandelier fixture above the dining room table? Because there are not nine.

That’s how lay people count anyway. Mathematicians would naturally have a more sophisticated view of the business. A much more powerful counting scheme. Concepts in counting that go far beyond what you might work out in first grade.

Yeah, so that’s what most of us would figure. Things don’t get much more sophisticated than that, though. This probably is because the idea of counting is tied to the theory of sets. And the theory of sets grew, in part, to come up with a logically solid base for arithmetic. So many of the key ideas of set theory are so straightforward they hardly seem to need explaining.

We build the idea of “countable” off of the nice, familiar numbers 1, 2, 3, and so on. That set’s called the counting numbers. They’re the numbers that everybody seems to recognize as numbers. Not just people. Even animals seem to understand at least the first couple of counting numbers. Sometimes these are called the natural numbers.

Take a set of things we want to study. We’re interested in whether we can match the things in that set one-to-one with the things in the counting numbers. We don’t have to use all the counting numbers. But we can’t use the same counting number twice. If we’ve matched one chandelier light bulb with the number ‘4’, we mustn’t match a different bulb with the same number. Similarly, if we’ve got the number ‘4’ matched to one bulb, we mustn’t match ‘4’ with another bulb at the same time.

If we can do this, then our set’s countable. If we really wanted, we could pick the counting numbers in order, starting from 1, and match up all the things with counting numbers. If we run out of things, then we have a finitely large set. The last number we used to match anything up with anything is the size, or in the jargon, the cardinality of our set. We might not care about the cardinality, just whether the set is finite. Then we can pick counting numbers as we like in no particular order. Just use whatever’s convenient.

But what if we don’t run out of things? And it’s possible we won’t. Suppose our set is the negative whole numbers: -1, -2, -3, -4, -5, and so on. We can match each of those to a counting number many ways. We always can. But there’s an easy way. Match -1 to 1, match -2 to 2, match -3 to 3, and so on. Why work harder than that? We aren’t going to run out of negative whole numbers. And we aren’t going to find any we can’t match with some counting number. And we aren’t going to have to match two different negative numbers to the same counting number. So what we have here is an infinitely large, yet still countable, set.

So a set of things can be countable and finite. It can be countable and infinite. What else is there to be?

There must be something. It’d be peculiar to have a classification that everything was in, after all. At least it would be peculiar except for people studying what it means to exist or to not exist. And most of those people are in the philosophy department, where we’re scared of visiting. So we must mean there’s some such thing as an uncountable set.

The idea means just what you’d guess if you didn’t know enough mathematics to be tricky. Something is uncountable if it can’t be counted. It can’t be counted if there’s no way to match it up, one thing-to-one thing, with the counting numbers. We have to somehow run out of counting numbers.

It’s not obvious that we can do that. Some promising approaches don’t work. For example, the set of all the integers — 1, 2, 3, 4, 5, and all that, and 0, and the negative numbers -1, -2, -3, -4, -5, and so on — is still countable. Match the counting number 1 to 0. Match the counting number 2 to 1. Match the counting number 3 to -1. Match 4 to 2. Match 5 to -2. Match 6 to 3. Match 7 to -3. And so on.
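That zig-zag matchup is easy to write down as a single rule. A quick sketch in Python:

```python
def integer_for(k):
    """Match counting number k = 1, 2, 3, ... to an integer, no repeats."""
    # 1 -> 0, 2 -> 1, 3 -> -1, 4 -> 2, 5 -> -2, and so on.
    return k // 2 if k % 2 == 0 else -(k // 2)

print([integer_for(k) for k in range(1, 8)])   # [0, 1, -1, 2, -2, 3, -3]
```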

Even ordered pairs of the counting numbers don’t do it. We can match the counting number 1 to the pair (1, 1). Match the counting number 2 to the pair (2, 1). Match the counting number 3 to (1, 2). Match 4 to (3, 1). Match 5 to (2, 2). Match 6 to (1, 3). Match 7 to (4, 1). Match 8 to (3, 2). And so on. We can achieve similar staggering results with ordered triplets, quadruplets, and more. Ordered pairs of integers, positive and negative? Longer to do, yes, but just as doable.
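That ordered-pair matchup walks down one diagonal after another: first the pair whose coordinates sum to 2, then the pairs summing to 3, and so on. A sketch:

```python
def pairs_in_counting_order(n):
    """First n ordered pairs of counting numbers, diagonal by diagonal."""
    out = []
    total = 2   # each diagonal holds the pairs with a fixed coordinate sum
    while len(out) < n:
        for a in range(total - 1, 0, -1):
            out.append((a, total - a))
            if len(out) == n:
                break
        total += 1
    return out

print(pairs_in_counting_order(8))
# [(1, 1), (2, 1), (1, 2), (3, 1), (2, 2), (1, 3), (4, 1), (3, 2)]
```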

So are there any uncountable things?

Sure. Wouldn’t be here if there weren’t. For example: think about the set that’s all the ways to pick things from a set. I sense your confusion. Let me give you an example. Suppose we have the set of three things. They’re the numbers 1, 2, and 3. We can make a bunch of sets out of things from this set. We can make the set that just has ‘1’ in it. We can make the set that just has ‘2’ in it. Or the set that just has ‘3’ in it. We can also make the set that has just ‘1’ and ‘2’ in it. Or the set that just has ‘2’ and ‘3’ in it. Or the set that just has ‘3’ and ‘1’ in it. Or the set that has all of ‘1’, ‘2’, and ‘3’ in it. And we can make the set that hasn’t got any of these in it. (Yes, that does too count as a set.)

So from a set of three things, we were able to make a collection of eight sets. If we had a set of four things, we’d be able to make a collection of sixteen sets. With five things to start from, we’d be able to make a collection of thirty-two sets. This collection of sets we call the “power set” of our original set, and if there’s one thing we can say about it, it’s that it’s bigger than the set we start from.
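The power set can be built mechanically, and its size doubles with each new element: 2ⁿ sets from n things. In Python:

```python
from itertools import combinations

def power_set(things):
    """Every way to pick things from the set, the empty pick included."""
    items = list(things)
    return [set(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

print(len(power_set({1, 2, 3})))      # 8
print(len(power_set({1, 2, 3, 4})))   # 16
```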

The power set for a finite set, well, that’ll be much bigger. But it’ll still be finite. Still be countable. What about the power set for an infinitely large set?

And the power set of the counting numbers, the collection of all the ways you can make a set of counting numbers, is really big. Is it uncountably big?

Let’s step back. Remember when I said mathematicians don’t get “much more” sophisticated than matching up things to the counting numbers? Here’s a little bit of that sophistication. We don’t have to match stuff up to counting numbers if we like. We can match the things in one set to the things in another set. If it’s possible to match them up one-to-one, with nothing missing in either set, then the two sets have to be the same size. The same cardinality, in the jargon.

So. The set of the numbers 1, 2, 3, has to have a smaller cardinality than its power set. Want to prove it? Do this exactly the way you imagine. You run out of things in the original set before you run out of things in the power set, so there’s no making a one-to-one matchup between the two.

With the infinitely large yet countable set of the counting numbers … well, the same result holds. It’s harder to prove. You have to show that there’s no possible way to match the infinitely many things in the counting numbers to the infinitely many things in the power set of the counting numbers. (The easiest way to do this is by contradiction. Imagine that you have made such a matchup, pairing everything in your power set to everything in the counting numbers. Then you go through your matchup and put together a collection that isn’t accounted for. Whoops! So you must not have matched everything up in the first place. Why not? Because you can’t.)
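The same diagonal trick can be watched in miniature with a finite set: match each element to some subset, collect the elements that aren't in their own match, and that collection is never among the matches. A sketch in Python; the particular matchup here is arbitrary, and the argument works whatever matchup you pick:

```python
S = {1, 2, 3}
# An arbitrary attempted matchup of elements to subsets of S.
match = {1: frozenset({1, 2}), 2: frozenset(), 3: frozenset({3})}

# The diagonal set: everything not inside the subset it's matched with.
diagonal = frozenset(x for x in S if x not in match[x])
print(diagonal)                     # frozenset({2})
print(diagonal in match.values())   # False, no matter the matchup
```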

But the result holds. The power set of the counting numbers is some other set. It’s infinitely large, yes. And it’s so infinitely large that it’s somehow bigger than the counting numbers. It is uncountable.

There’s more than one uncountably large set. Of course there are. We even know of some of them. For example, there’s the set of real numbers. Three-quarters of my readers have been sitting anxiously for the past eight paragraphs wondering if I’d ever get to them. There’s good reason for that. Everybody feels like they know what the real numbers are. And the proof that the real numbers are a larger set than the counting numbers is easy to understand. An eight-year-old could master it. You can find that proof well-explained within the first ten posts of pretty much every mathematics blog other than this one. (I was saving the subject. Then I finally decided I couldn’t explain it any better than everyone else has done.)

Are the real numbers the same size, the same cardinality, as the power set of the counting numbers?

Sure, they are.

No, they’re not.

Whichever you like. This is one of the many surprising mathematical results of the surprising 20th century. Starting from the common set of axioms about set theory, it’s undecidable whether the set of real numbers is as big as the power set of the counting numbers. You can assume that it is. This is known as the Continuum Hypothesis. And you can do fine mathematical work with it. You can assume that it is not. This is known as the … uh … Rejecting the Continuum Hypothesis. And you can do fine mathematical work with that. What’s right depends on what work you want to do. Either is consistent with the starting hypothesis. You are free to choose either, or if you like, neither.

My understanding is that most set theory finds it more productive to suppose that they’re not the same size. I don’t know why this is. I know enough set theory to lead you to this point, but not past it.

But that the question can exist tells you something fascinating. You can take the power set of the power set of the counting numbers. And this gives you another, even vaster, uncountably large set. As enormous as the collection of all the ways to pick things out of the counting numbers is, this power set of the power set is even vaster.

We’re not done. There’s the power set of the power set of the power set of the counting numbers. And the power set of that. Much as geology teaches us to see Deep Time, and astronomy Deep Space, so power sets teach us to see Deep … something. Deep Infinity, perhaps.

Fish, Re-Counted


We had a bit of a surprise with our goldfish. Longtime readers might remember my string of essays describing how one might count fish by something other than the process of actually counting them all. In preparing our pond, which isn’t deep enough to be safe against a harsh winter, last October we set out traps and caught, we believed, all of them. There turned out to be 53 of them when I posted an update.

Several dozen goldfish, most of them babies, within a 150-gallon rubber stock tank, their wintering home.
Stock photograph of our goldfish in a stock tank for the winter. Previous winter.

A couple of weeks after that — on Thanksgiving, it happens — we caught one more fish. This brought the total to 54. And I either failed to make note of it or I can’t find the note I made of it. Such happens.

In getting the pond ready for the spring, and the return of our goldfish to the outdoors, we found another one! It was just this orange thing dug into the muck of the pool, and we thought initially it was something that had fallen in and gotten lost. A heron scarer, was my love’s first guess. The pond thermometer that sank without trace some years back was mine. I used the grabber to poke at it and woke up a pretty sulky goldfish. It went over to some algae where we couldn’t so easily bother it.

So that brings our fish count to 55, for those keeping track. Fortunately, it was a very gentle winter in our parts. We’re hoping to bring the goldfish back out to the pond in the next week or two. Our best estimate for the carrying capacity of the pond is 65 to 130 goldfish, so, we will see whether the goldfish do anything about this slight underpopulation.

Reading the Comics, February 2, 2016: Pre-Lottery Edition


So a couple weeks ago one of the multi-state lotteries in the United States reached a staggering jackpot of one and a half billion dollars. And it turns out that “a couple weeks” is about the lead time most syndicated comic strip artists maintain. So there’s a rash of lottery-themed comic strips. There’s enough of them that I’m going to push those off to the next Reading the Comics installment. I’ll make do here with what Comic Strip Master Command sent us before thoughts of the lottery infiltrated folks’ heads.

Punkinhead: 'I was counting to five and couldn't remember what came after seven.' Tiger: 'If you're counting to five nothing comes after seven.' Punkinhead: 'I thought sure he would know.'
Bud Blake’s Tiger for the 28th of January, 2016. I do like Punkinhead’s look of dismay in the second panel that Tiger has failed him.

Bud Blake’s Tiger for the 28th of January (a rerun; Blake’s been dead a long while) is a cute one about kids not understanding numbers. And about expectations of those who know more than you, I suppose. I’d say this is my favorite of this essay’s strips. Part of that is that it reminds me of a bit in one of the lesser Wizard of Oz books. In it the characters have to count by twos to seventeen to make a successful wish. That’s the sort of problem you expect in fairy lands and quick gags.

Mort Walker’s Beetle Bailey (Vintage) from the 7th of July, 1959 (reprinted the 28th of January) also tickles me. It uses the understanding of mathematics as stand-in for the understanding of science. I imagine it’s also meant to stand in for intelligence. It’s also a good riff on the Sisyphean nature of teaching. The equations on the board at the end almost look meaningful. At least, I can see some resemblance between them and the equations describing orbital mechanics. Camp Swampy hasn’t got any obvious purpose or role today. But the vintage strips reveal it had some role in orbital rocket launches. This was in the late 50s, when orbital rockets were brand new.

General: 'How's your project coming along to teach the men some science, Captain?' Captain: 'Wonderful, sir. Six months ago they didn't know what the square root of four was! Now they don't know what this [ blackboard full of symbols ] is!'
Mort Walker’s Beetle Bailey (Vintage) for the 7th of July, 1959. This is possibly the brightest I’ve ever seen Beetle, and he doesn’t know what he’s looking at.

Matt Lubchansky’s Please Listen To Me for the 28th of January is a riff on creationist “teach the controversy” nonsense. So we get some nonsense about a theological theory of numbers. Historically, especially in the western tradition, much great mathematics was done by theologians. Lazy histories of science make out religion as the relentless antagonist to scientific knowledge. It’s not so.

The equation from the last panel, F(s) = \mathcal{L}\left\{f(t)\right\} = \int_0^{\infty} e^{-st} f(t) dt , is a legitimate one. It describes the Laplace Transform of the function f(t). It’s named for Pierre-Simon Laplace. That name might be familiar from mathematical physics, astronomy, the “nebular” hypothesis of planet formation, probability, and so on. Laplace transforms have many uses. One is in solving differential equations. They can change a differential equation, hard to solve, to a polynomial, easy to solve. Then by inverting the Laplace transform you can solve the original, hard, differential equation.
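A crude numerical check of the transform needs nothing but the integral itself. This sketch approximates the Laplace transform of f(t) = e^{-t} at s = 2 by the midpoint rule; the exact answer is 1/(s+1) = 1/3. (Real work would use tables or a symbolic package, not this.)

```python
import math

def laplace_numeric(f, s, upper=60.0, steps=200_000):
    """Approximate the integral of e^(-s t) f(t) dt from 0 to upper."""
    dt = upper / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt   # midpoint rule
        total += math.exp(-s * t) * f(t) * dt
    return total

approx = laplace_numeric(lambda t: math.exp(-t), s=2.0)
print(approx)   # close to 1/3
```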

Another major use that I’m familiar with is signal processing. Often we will have some data, a signal, that changes in time or in space. The Laplace transform lets us look at the frequency distribution. That is, what regularly rising and falling patterns go in to making up the signal (or could)? If you’ve taken a bit of differential equations this might sound like it’s just Fourier series. It’s related. (If you don’t know what a Fourier series might be, don’t worry. I bet we’ll come around to discussing it someday.) It might also remind readers here of the z-transform and yes, there’s a relationship.

The transform also shows itself in probability. We’re often interested in the probability distribution of a quantity. That’s what the possible values it might have are, and how likely each of those values is. The Laplace transform lets us switch between the probability distribution and a thing called the moment-generating function. I’m not sure of an efficient way of describing what good that is. If you do, please, leave a comment. But it lets you switch from one description of a thing to another. And your problem might be easier in the other description.

John McPherson’s Close To Home for the 30th of January uses mathematics as the sort of thing that can have an answer just, well, you see it. I suppose only geography would lend itself to a joke like this (“What state is Des Moines in?”)

Wally explains to the Pointy-Haired Boss that he's in the Zeno's Paradox phase of the project, in which 'every step we take gets us halfway closer to launch', a pace that he hopes 'it will look' like he's keeping up. First week in, he is.
Scott Adams’s Dilbert for the 31st of January. The link will probably expire around the end of February or start of March.

Scott Adams’s Dilbert for the 31st of January mentions Zeno’s Paradox, some two and a half thousand years old and still going strong. I haven’t heard the paradox used as an excuse to put off doing work. It does remind me of the old saw that half your time is spent on the first 90 percent of the project, and half your time on the remaining 10 percent. It’s absurd but truthful, as so many things are.
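Zeno-as-project-schedule is easy to simulate: each step covers half of whatever remains, so the remainder shrinks geometrically without ever reaching zero in finitely many steps. A toy sketch:

```python
remaining = 1.0   # the whole project, normalized to 1
for step in range(1, 21):
    remaining /= 2   # each step gets halfway closer to launch

print(remaining)        # 1 / 2**20: tiny, but never exactly zero
print(1.0 - remaining)  # fraction 'done' after twenty steps
```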

Samson’s Dark Side Of The Horse for the 2nd of February (I’m skipping some lottery strips to get here) plays on the merger of the ideas of “turn my life completely around” and “turn around 360 degrees”. A perfect 360 degree rotation would be an “identity transformation”, leaving the thing it’s done to unchanged. But I understand why the terms merged. As with many English words or terms, “all the way around” can mean opposite things.

But anyone playing pinball or taking time-lapse photographs or just listening to Heraclitus can tell you. Turning all the way around does not leave you quite what you were before. People aren’t perfect at rotations, and even if they were, the act of breaking focus and coming back to it changes what one’s doing.

Fish, Counted


A couple months ago I wrote about the problem of counting the number of goldfish in the backyard pond. For those who’d missed it:

  • How To Count Fish, which presented a way to estimate a population by simply doing two samplings of the population.
  • How To Re-Count Fish, which described some of the numerical problems in estimation-based population samples.
  • How Not To Count Fish, which threatened to collapse the entire project under fiddly practical problems.

Spring finally arrived, and about a month ago we finally stopped having nights that touched freezing. So we moved the goldfish which had been wintering over in the basement out to the backyard. This also let us count just how many goldfish we’d caught, and I thought folks might like to know what the population did look like.

The counting didn’t require probabilistic methods this time. Instead we took the fish from the traps and set up a correspondence between them and an ordered subset of positive whole numbers. This is the way you describe “just counting” so that it sounds either ferociously difficult or like a game. Whether it’s difficult or a game depends on whether you were a parent or a student back when the New Math was a thing. My love and I were students.

Altogether then there were fifty goldfish that had wintered over in the stock tank in the basement: eight adults and 42 baby fish. (Possibly nine and 41; one of the darker goldfish is small for an adult, but large for a baby.) Over the spring I identified at least three baby fish that had wintered over outdoors successfully. It was a less harsh winter than the one before. So there are now at least 53 goldfish in the pond. There are surely more on the way, but we haven’t seen any new babies yet.

A rock-lined circular goldfish pond, with goldfish.
53, or possibly more, goldfish are within this pond.

Also this spring we finally actually measured the pond. We’d previously estimated it to be about ten feet in diameter and two feet deep, implying a carrying capacity of about 60 goldfish if some other assumptions are made. Now we’ve learned it’s nearer twelve feet in diameter and twenty inches deep. Call that two meters radius and half a meter height. That’s a volume of about 6.3 cubic meters, or 6300 liters, or enough volume of water for about 80 goldfish. We’ll see what next fall brings.
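The volume arithmetic, for anyone checking: a cylinder two meters in radius and half a meter deep. The liters-per-goldfish figure in the last line is a rough rule of thumb I'm supplying to match the estimate above, not a measured constant.

```python
import math

radius_m = 2.0      # about twelve feet across
depth_m = 0.5       # about twenty inches deep
volume_m3 = math.pi * radius_m ** 2 * depth_m
liters = 1000 * volume_m3

print(round(volume_m3, 1))   # 6.3 cubic meters
print(round(liters / 80))    # about 79 fish, at an assumed ~80 liters apiece
```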

A Summer 2015 Mathematics A To Z: characteristic


Characteristic function. (Not the probability one.)

Today’s entry in my mathematical A-To-Z challenge is easier than the bijection function was. This is the characteristic function. Its domain is any set, any collection of things you like. This can be real numbers, it can be regions of space, it can be houses in a neighborhood. Its range, however, is just the two numbers 0 and 1. Its rule — well, that’s the trick. It’s not right to say there’s “the” characteristic function. There are many characteristic functions. It’s just they all look alike. This is the way they look.

To define a characteristic function we need some subset of the domain. A subset is just a collection of things that are also all in another set. So we want a subset — let me give it the name D — of the domain. This subset D can have one thing in it, or a couple of things, or even everything that’s in the domain. The one rule is that D can’t have something in it which isn’t also in the domain. Otherwise, anything goes. (It’s even fine if D doesn’t have anything in it.)

Now, the rule for the characteristic function for D is that the function for any given item in the domain — use x as a name for that — is equal to 1 if x is in D, and is equal to 0 if x is not in D. The function is usually written as the Greek letter chi (χ), or the letter I, or the number 1 put in some kind of fancy heavy font, with the D as a subscript so we know which characteristic function it is.

For example. Suppose the domain is the counting numbers. Suppose the subset D is the prime numbers: 2, 3, 5, 7, 11, 13, and so on. Then the characteristic function looks like this:

For the number x, the value χ_D(x) is:

1 → 0
2 → 1
3 → 1
4 → 0
5 → 1
6 → 0
7 → 1
8 → 0
9 → 0
10 → 0

… and so on.
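A characteristic function is also easy to write as actual code. Here is a sketch in Python of the characteristic function of the primes, which reproduces the table above:

```python
def is_prime(n):
    """Whether the counting number n is prime."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def chi_primes(x):
    """Characteristic function of the set D of primes: 1 inside D, 0 outside."""
    return 1 if is_prime(x) else 0

# For x = 1 through 10 this gives 0, 1, 1, 0, 1, 0, 1, 0, 0, 0.
values = [chi_primes(x) for x in range(1, 11)]
```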

Some might ask why create, much less care about, such a boring function? These are people who’ve never had to count how many rows on a large spreadsheet satisfied some complicated set of conditions. That trick where you create a column with a rule like IF((C$2 > 80 AND C$2 < 90 AND C$4 > '01/01/2013' AND C$4 < '05/01/2013'), 1, 0), and then add up the column, to find out how many things had a value between 80 and 90 and a date between the start of January and the start of May, 2013? That’s using a characteristic function to figure out how large a collection of things is.
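The same trick translates directly out of the spreadsheet. Here is a sketch in Python, with made-up sample rows, of counting how many entries satisfy both the value and the date conditions:

```python
from datetime import date

# Made-up sample rows: a value and a date for each.
rows = [
    (85, date(2013, 2, 14)),
    (79, date(2013, 3, 1)),
    (88, date(2012, 12, 31)),
    (82, date(2013, 4, 30)),
]

def chi(value, when):
    """Characteristic function of the set of rows meeting both conditions."""
    in_range = 80 < value < 90
    in_window = date(2013, 1, 1) < when < date(2013, 5, 1)
    return 1 if (in_range and in_window) else 0

# Adding up the characteristic function counts the rows in the set.
count = sum(chi(v, w) for v, w in rows)
```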

Characteristic functions offer ways of breaking down a complicated set into smaller ones, all of which share some property. This can be used just to work out how large the collections of things sharing different properties are. It can also be a way to break a big problem into several smaller problems. We hope those smaller problems are simple enough that we’re making less work for ourselves overall, despite increasing the number of problems. And that’s a good trick, one mathematicians rely on a lot.

Reading the Comics, January 20, 2014


I’m getting to wonder whether cartoonists really do think about mathematics only when schools are in session; there was a frightening lull in mathematics-themed comic strips this month and I was getting all ready to write about something meaningful, like how Gaussian integration works, instead. But they came around, possibly because the kids went back to school and couldn’t resist answering word problems about driving places so they can divide apples again.

Carla Ventresca and Henry Beckett’s On A Claire Day (January 3) really just name-drops mathematics, as a vaguely unpleasant thing intruding on a conversation, even though Paul’s just dropped in a bit of silliness that, if I’m not reading it wrongly, is tautological anyway. There’s a fair question at work here, though: can “how good” a person is be measured? Obviously, it can’t if nobody tries; but could they succeed at all?

It sounds a bit silly, but then, measuring something like the economic state of a nation was not even imagined until surprisingly recently: most of the economic measures we have postdate World War II. One can argue whether they measure well what they’re supposed to represent, but there’s not much dispute anymore about the idea that economic health can be measured. When James Webb, later famous for managing NASA during the bulk of the Space Race, was Assistant Secretary of State in the Truman administration, he tried to get foreign relations measured in a similar way. The idea was mocked as ridiculous (the joke was apparently something along the lines of a person rushing in to announce “Bulgaria is down two points!”, which is probably funnier if you haven’t grown up playing Civilization-style grand strategy games), and he gave up on that fight in favor of completing a desperately needed reorganization of the department.

I don’t know how I would measure a person’s goodness, but I could imagine a process of coming up with some things that could be measured, and trying them out, and seeing how well the measurements match what it feels they should be measuring. This is all probably too much work for a New Year’s Resolution, but it might get someone their thesis project.

Steve Moore’s In The Bleachers (January 14) comes back with a huge pile of equations standing as a big, complicated explanation for something. It doesn’t look to me like the description has much to do with describing balls bouncing, however, which is a bit of a disappointment given previous strips that name-drop Lev Landau or pull up implicit differentiation when they don’t even need it. Maybe Moore wasn’t able to find something that looked good before deadline.

Bill Hinds’s Cleats (January 16, rerun) is just the sort of straightforward pun I actually more expect out of FoxTrot (see below).

Nate Frakes’s Break of Day (January 19) shows an infant trying to count sheep and concluding she’s too young to. Interesting to me is that the premise of the joke might actually be wrong: humans appear to have at least a rough sense of numbers, at least for things like counting and addition, from a surprisingly early age. This is a fascinating thing to learn about, both because it’s remarkable that humans should have a natural aptitude for arithmetic, and because of how difficult it is to come up with tests for understanding quantity and counting and addition that work on people with whom you can’t speak and who can’t be given instruction on how to respond to a test. Stanislas Dehaene’s The Number Sense: How The Mind Creates Mathematics describes some of this, although I’m hesitant to recommend it uncritically because I know I’m not well-read in the field. It’s somewhere to start learning, though.

Chip Sansom’s The Born Loser (January 20) could be the start of a word problem in translating from percentiles to rankings and, for that matter, vice-versa. It’s convenient to switch a ranking to percentiles because that makes it easier to compare groups of different sizes. And many statistical tools, particularly the z-score, can be thought of as ways of meaningfully comparing standing across groups of different sizes.
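The ranking-to-percentile translation is simple enough to sketch. This little Python function assumes no ties and that rank 1 is the best in the group:

```python
# A sketch of turning a ranking into a percentile, assuming no ties
# and that rank 1 is the best in the group.
def percentile(rank, group_size):
    """Percentage of the group that a given rank outdoes."""
    return 100 * (group_size - rank) / group_size

# Fifth out of twenty and twenty-fifth out of a hundred are comparable:
# both outdo 75 percent of their groups.
fifth_of_twenty = percentile(5, 20)
twentyfifth_of_hundred = percentile(25, 100)
```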

Bill Amend’s FoxTrot (January 20, rerun) is the reliable old figure-eight ice skating gag. I hope people won’t think worse of me for feeling that Droopy did it better.

T Lewis and Michael Fry’s Over The Hedge (January 20) uses a spot of the Fundamental Theorem of Calculus (rendered correctly) to stand in for “a really hard thought”. Calculus is probably secure in having that reputation: it’s about the last mathematics that the average person might be expected to take, and it introduces many new symbols and concepts that can be staggering (even the polymath Isaac Asimov found he just couldn’t grasp the subject), and so many of its equations are just beautiful to look at. The integral sign seems to me to have some graphic design sense that, for example, matrices or the polynomial representations of knots just don’t manage.

Counting From 52 to 11,108


fluffy once again brings to my attention the work of Inder J Taneja, who got into the Annals of Improbable Research for a fun parlor-game sort of project a couple of months ago. This was for coming up with ways to make (most of) the numbers from 44 up to 1,000 using the digits 1 through 9 in order (ascending and descending), in combinations of addition, multiplication, and exponentiation. Taneja got back into Improbable this weekend with a follow-up project, listing the numbers that can be formed all the way out to a pleasant 11,111.

Taneja’s paper, available at arxiv.org, is that rare mathematics paper that you don’t need to be a mathematician to read, although it isn’t going to strike anyone as very enlightening. The ingenuity involved in many of them is impressive, though, and Taneja lists some interesting things such as how many numbers in a given range can’t be made by the digits in ascending or descending order. (Remarkably, to me at least, everything from 1,001 to 2,000 can be done in ascending or descending order.)
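The ascending-order game is easy to experiment with yourself. Here is a brute-force sketch in Python that joins the digits 1 through 9, in order, with addition, multiplication, or concatenation; Taneja’s scheme also allows exponentiation, which I leave out here just to keep the results small:

```python
from itertools import product

def ascending_representations(target):
    """Expressions using the digits 1-9 in ascending order that equal target."""
    digits = "123456789"
    found = []
    # Choose, for each of the eight gaps between digits:
    # concatenation (empty string), addition, or multiplication.
    for ops in product(["", "+", "*"], repeat=8):
        expr = "".join(d + op for d, op in zip(digits, ops)) + "9"
        if eval(expr) == target:
            found.append(expr)
    return found

# For example, 100 = 1+2+3+4+5+6+7+8*9.
```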

Continue reading “Counting From 52 to 11,108”

Counting To 52


fluffy brought to my attention a cute, amusing little bit from the Annals of Improbable Research, itself passing on some work by one Inder J Taneja. Taneja worked out a paper, available from arxiv.org, which lists results to the sort of mathematical puzzle that’s open to anyone with some paper and a pencil and some desire to do some recreational stuff.

Continue reading “Counting To 52”

Trivial Little Baseball Puzzle


I’ve been reading a book about the innovations of baseball so that’s probably why it’s on my mind. And this isn’t important and I don’t expect it to go anywhere, but it did cross my mind, so, why not give it 200 words where they won’t do any harm?

Imagine one half-inning in a baseball game; imagine that there’s no substitutions or injuries or anything requiring the replacement of a batter. Also suppose there are none of those freak events like when a batter hits out of order and the other team doesn’t notice (or pretends not to notice), the sort of things which launch one into the wonderful and strange world of stuff baseball does because they did it that way in 1835 when everyone playing was striving to be a Gentleman.

What’s the maximum number of runs that could be scored while still having at least one player not get a run?

Continue reading “Trivial Little Baseball Puzzle”