In Our Time podcast repeats episode on Zeno’s Paradoxes


It seems like barely yesterday I was giving people a tip about this podcast. In Our Time, a BBC panel-discussion programme about topics of general interest, this week repeated an episode about Zeno’s Paradoxes. It originally ran in 2016.

The panel this time is two philosophers and a mathematician, which is probably about the correct blend to get the topic down. The mathematician here is Marcus du Sautoy, with the University of Oxford, who’s a renowned mathematics popularizer in his own right. That said, I think he falls into a trap that we STEM types often fall into in talking about Zeno: thinking the problem is merely “how can we talk about an infinity of something”. Or “how can we talk about an infinitesimal of something”. Mathematicians have what seems to be a pretty good hold on how to do these calculations. But that we can provide a logically coherent way to talk about, say, how a line can be composed of points with no length does not tell us where the length of a line comes from. Still, du Sautoy knows rather a few things that I don’t. (The philosophers are Barbara Sattler, with the University of St Andrews, and James Warren, with the University of Cambridge. I know nothing further of either of them.)

The episode also discusses the Quantum Zeno Effect. This is physics, not mathematics, but it’s unsettling nonetheless. The time-evolution of certain systems can be stopped, or accelerated, by frequent measurements of the system. This is not something Zeno would have been pondering. But it is a challenge to our intuition about how change ought to work.

I’ve written some of my own thoughts about some of Zeno’s paradoxes, as well as on the Sorites paradox, which is discussed along the way in this episode. And the episode has prompted new thoughts in me, particularly about what it might mean to do infinitely many things. And what a “thing” might be. This is probably a topic Zeno was hoping listeners would ponder.

My 2019 Mathematics A To Z: Zeno’s Paradoxes


Today’s A To Z term was nominated by Dina Yagodich, who runs a YouTube channel with a host of mathematics topics. Zeno’s Paradoxes exist in the intersection of mathematics and philosophy. Mathematics majors like to declare that they’re all easy. The Ancient Greeks didn’t understand infinite series or infinitesimals like we do. Now they’re no challenge at all. This reflects a belief that philosophers must be silly people who haven’t noticed that one can, say, exit a room.

This is your classic STEM-attitude of missing the point. We may suppose that Zeno of Elea occasionally exited rooms himself. That is a supposition, though. Zeno, like most philosophers who lived before Socrates, we know from other philosophers making fun of him a century after he died. Or at least trying to explain what they thought he was on about. Modern philosophers are expected to present others’ arguments as well and as strongly as possible. This even — especially — when describing an argument they want to say is the stupidest thing they ever heard. Or, to use the lingo, when they wish to refute it. Ancient philosophers had no such compulsion. They did not mind presenting someone else’s argument sketchily, if they supposed everyone already knew it. Or even badly, if they wanted to make the other philosopher sound ridiculous. Between that and the sparse nature of the record, we have to guess a bit about what Zeno precisely said and what he meant. This is all right. We have some idea of things that might reasonably have bothered Zeno.

And they have bothered philosophers for thousands of years. They are about change. The ones I mean to discuss here are particularly about motion. And there are things we do not understand about change. This essay will not answer what we don’t understand. But it will, I hope, show something about why that’s still an interesting thing to ponder.

Cartoony banner illustration of a coati, a raccoon-like animal, flying a kite in the clear autumn sky. A skywriting plane has written 'MATHEMATIC A TO Z'; the kite, with the letter 'S' on it, completes the word 'MATHEMATICS'.
Art by Thomas K Dye, creator of the web comics Projection Edge, Newshounds, Infinity Refugees, and Something Happens. He’s on Twitter as @projectionedge. You can get to read Projection Edge six months early by subscribing to his Patreon.

Zeno’s Paradoxes.

When we capture a moment by photographing it we add lies to what we see. We impose a frame on its contents, discarding what is off-frame. We rip an instant out of its context. And that's before considering how we stage photographs, making people smile and stop tilting their heads. We forgive many of these lies. The things excluded from or the moments around the one photographed might not alter what the photograph represents. Making everyone smile can convey the emotional average of the event in a way that no individual moment represents. Arranging people to stand in frame can convey the participation in a way a candid photograph would not.

But there remains the lie that a photograph is “a moment”. It is no such thing. We notice this when the photograph is blurred. It records all the light passing through the lens while the shutter is open. A photograph records an eighth of a second. A thirtieth of a second. A thousandth of a second. But still, some time. There is always the ghost of motion in a picture. If we do not see it, it is because our photograph’s resolution is too coarse. If we could photograph something with infinite fidelity we would see, even in still life, the wobbling of the molecules that make up a thing.

A photograph of a blurry roller coaster passing through a vertical loop.
One of the many loops of Vortex, a roller coaster at Kings Island amusement park from 1987 to 2019. Taken by me on the last day of the ride’s operation; this was one of the roller coaster’s runs after 7 pm, at the close of the park on the last day of the season.

Which implies something fascinating to me. Think of a reel of film. Here I mean old-school pre-digital film, the thing that’s a great strip of pictures, a new one shown 24 times per second. Each frame of film is a photograph, recording some split-second of time. How much time is actually in a film, then? How long, cumulatively, was a camera shutter open during a two-hour film? I use pre-digital, strip-of-film movies for convenience. Digital films offer the same questions, but with different technical points. And I do not want the writing burden of describing both analog and digital film technologies. So I will stick to the long sequence of analog photographs model.

Let me imagine a movie. One of an ordinary everyday event; an actuality, to use the terminology of 1898. A person overtaking a walking tortoise. Look at the strip of film. There are many frames which show the person behind the tortoise. There are many frames showing the person ahead of the tortoise. When are the person and the tortoise at the same spot?

We have to put in some definitions. Fine; do that. Say we mean when the leading edge of the person’s nose overtakes the leading edge of the tortoise’s, as viewed from our camera. Or, since there must be blur, when the center of the blur of the person’s nose overtakes the center of the blur of the tortoise’s nose.

Do we have the frame when that moment happened? I’m sure we have frames from the moments before, and frames from the moments after. But the exact moment? Are you positive? If we zoomed in, would it actually show the person is a millimeter behind the tortoise? That the person is a hundredth of a millimeter ahead? A thousandth of a hair’s width behind? Suppose that our camera is very good. It can take frames representing as small a time as we need. Does it ever capture that precise moment? To the point that we know, no, it’s not the case that the tortoise is one-trillionth the width of a hydrogen atom ahead of the person?

If we can’t show the frame where this overtaking happened, then how do we know it happened? To put it in terms a STEM major will respect, how can we credit a thing we have not observed with happening? … Yes, we can suppose it happened if we suppose continuity in space and time. Then it follows from the intermediate value theorem. But then we are begging the question. We impose the assumption that there is a moment of overtaking. This does not prove that the moment exists.

Fine, then. What if time is not continuous? If there is a smallest moment of time? … If there is, then, we can imagine a frame of film that photographs only that one moment. So let’s look at its footage.

One thing stands out. There’s finally no blur in the picture. There can’t be; there’s no time during which to move. We might not catch the moment that the person overtakes the tortoise. It could “happen” in-between moments. But at least we have a moment to observe at leisure.

So … what is the difference between a picture of the person overtaking the tortoise, and a picture of the person and the tortoise standing still? A movie of the two walking should be different from a movie of the two pretending to be department store mannequins. What, in this frame, is the difference? If there is no observable difference, how does the universe tell whether, next instant, these two should have moved or not?

A mathematical physicist may toss in an answer. Our photograph is only of positions. We should also track momentum. Momentum carries within it the information of how position changes over time. We can’t photograph momentum, not without getting blurs. But analytically? If we interpret a photograph as “really” tracking the positions of a bunch of particles? To the mathematical physicist, momentum is as good a variable as position, and it’s as measurable. We can imagine a hyperspace photograph that gives us an image of positions and momentums. So, STEM types show up the philosophers finally, right?

Hold on. Let’s allow that somehow we get changes in position from the momentum of something. Hold off worrying about how momentum gets into position. Where does a change in momentum come from? In the mathematical physics problems we can do, the change in momentum has a value that depends on position. In the mathematical physics problems we have to deal with, the change in momentum has a value that depends on position and momentum. But that value? Put it in words. That value is the change in momentum. It has the same relationship to acceleration that momentum has to velocity. For want of a real term, I’ll call it acceleration. We need more variables. An even more hyperspatial film camera.

… And does acceleration change? Where does that change come from? That is going to demand another variable, the change-in-acceleration. (The “jerk”, according to people who want to tell you that “jerk” is a commonly used term for the change-in-acceleration, and no one else.) And the change-in-change-in-acceleration. Change-in-change-in-change-in-acceleration. We have to invoke an infinite regression of new variables. We got here because we wanted to suppose it wasn’t possible to divide a span of time infinitely many times. This seems like a lot to build into the universe to distinguish a person walking past a tortoise from a person standing near a tortoise. And then we still must admit not knowing how one variable propagates into another. That a person is wide is not usually enough explanation of how they are growing taller.

Numerical integration can model this kind of system with time divided into discrete chunks. It teaches us some ways that this can make logical sense. It also shows us that our projections will (generally) be wrong. At least unless we do things like have an infinite number of steps of time factor into each projection of the next timestep. Or use the forecast of future timesteps to correct the current one. Maybe use both. These are … not impossible. But being “ … not impossible” is not to say satisfying. (We allow numerical integration to be wrong by quantifying just how wrong it is. We call this an “error”, and have techniques that we can use to keep the error within some tolerated margin.)
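To make the error-quantifying point concrete, here is a toy sketch of my own (not anything from the essay): Euler's method, the simplest numerical integration scheme, steps time forward in discrete chunks, and its projections are wrong by an amount that shrinks as the chunks do.

```python
import math

def euler(x0, rate, dt, steps):
    """Integrate dx/dt = rate * x by stepping time in discrete chunks."""
    x = x0
    for _ in range(steps):
        x += rate * x * dt  # project the next timestep from the current state
    return x

# Exact answer for dx/dt = -x with x(0) = 1, evaluated at t = 1.
exact = math.exp(-1.0)
coarse = euler(1.0, -1.0, 0.1, 10)      # ten big chunks of time
fine = euler(1.0, -1.0, 0.001, 1000)    # a thousand small ones

# The projection is (generally) wrong, but we can say just how wrong.
error_coarse = abs(coarse - exact)
error_fine = abs(fine - exact)
```

The errors here are a couple percent for the coarse run and a few hundredths of a percent for the fine one, which is the "keep the error within some tolerated margin" business in miniature.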

So where has the movement happened? The original scene had movement to it. The movie seems to represent that movement. But that movement doesn’t seem to be in any frame of the movie. Where did it come from?

We can have properties that appear in a mass which don’t appear in any component piece. No molecule of a substance has a color, but a big enough mass does. No atom of iron is ferromagnetic, but a chunk might be. No grain of sand is a heap, but enough of them are. The Ancient Greeks knew this; we call it the Sorites paradox, after Eubulides of Miletus. (“Sorites” means “heap”, as in heap of sand. But if you had to bluff through a conversation about ancient Greek philosophers you could probably get away with making up a quote you credit to Sorites.) Could movement be, in the term mathematical physicists use, an intensive property? But intensive properties are obvious to the outside observer of a thing. We are not outside observers to the universe. It’s not clear what it would mean for there to be an outside observer to the universe. Even if there were, what space and time are they observing in? And aren’t their space and their time and their observations vulnerable to the same questions? We’re in danger of insisting on an infinite regression of “universes” just so a person can walk past a tortoise in ours.

We can say where movement comes from when we watch a movie. It is a trick of perception. Our eyes take some time to understand a new image. Our brains insist on forming a continuous whole story even out of disjoint ideas. Our memory fools us into remembering a continuous line of action. That a movie moves is entirely an illusion.

You see the implication here. Surely Zeno was not trying to lead us to understand all motion, in the real world, as an illusion? … Zeno seems to have been trying to support the work of Parmenides of Elea. Parmenides is another pre-Socratic philosopher. So we have about four words that we’re fairly sure he authored, and we’re not positive what order to put them in. Parmenides was arguing about the nature of reality, and what it means for a thing to come into or pass out of existence. He seems to have been arguing something like that there was a true reality that’s necessary and timeless and changeless. And there’s an apparent reality, the thing our senses observe. And in our sensing, we add lies which make things like change seem to happen. (Do not use this to get through your PhD defense in philosophy. I’m not sure I’d use it to get through your Intro to Ancient Greek Philosophy quiz.) That what we perceive as movement is not what is “really” going on is, at least, imaginable. So it is worth asking questions about what we mean for something to move. What difference there is between our intuitive understanding of movement and what logic says should happen.

(I know someone wishes to throw down the word Quantum. Quantum mechanics is a powerful tool for describing how many things behave. It implies limits on what we can simultaneously know about the position and the time of a thing. But there is a difference between “what time is” and “what we can know about a thing’s coordinates in time”. Quantum mechanics speaks more to the latter. There are also people who would like to say Relativity. Relativity, special and general, implies we should look at space and time as a unified set. But this does not change our questions about continuity of time or space, or where to find movement in both.)

And this is why we are likely never to finish pondering Zeno’s Paradoxes. In this essay I’ve only discussed two of them: Achilles and the Tortoise, and The Arrow. There are two other particularly famous ones: the Dichotomy, and the Stadium. The Dichotomy is the one about how, to get somewhere, you have to get halfway there. But to get halfway there, you have to get a quarter of the way there. And an eighth of the way there, and so on. The Stadium is the hardest of the four great paradoxes to explain. This is in part because the earliest writings we have about it don’t make clear what Zeno was trying to get at. I can think of something which seems consistent with what’s described, and contrary-to-intuition enough to be interesting. I’m satisfied to ponder that one. But other people may have different ideas of what the paradox should be.

There are a handful of other paradoxes which don’t get so much love, although one of them is another version of the Sorites Paradox. Some of them the Stanford Encyclopedia of Philosophy dubs “paradoxes of plurality”. These ask how many things there could be. It’s hard to judge just what Zeno was getting at with this. We know that one argument had three parts, and only two of them survive. Trying to fill in that gap is a challenge. We want to fill in the argument we would make, projecting from our modern idea of this plurality. It’s not Zeno’s idea, though, and we can’t know how close our projection is.

I don’t have the space to make a thematically coherent essay describing these all, though. The set of paradoxes have demanded thought, even just to come up with a reason to think they don’t demand thought, for thousands of years. We will, perhaps, have to keep trying again to fully understand what it is we don’t understand.


And with that — I find it hard to believe — I am done with the alphabet! All of the Fall 2019 A-to-Z essays should appear at this link. Additionally, the A-to-Z sequences of this and past years should be at this link. Tomorrow and Saturday I hope to bring up some mentions of specific past A-to-Z essays. Next week I hope to share my typical thoughts about what this experience has taught me, and some other writing about this writing.

Thank you, all who’ve been reading, and who’ve offered topics, comments on the material, or questions about things I was hoping readers wouldn’t notice I was shorting. I’ll probably do this again next year, after I’ve had some chance to rest.

Reading the Comics, June 24, 2017: Saturday Morning Breakfast Cereal Edition


Somehow this is not the title of every Reading The Comics review! But it is for this post and we’ll explore why below.

Piers Baker’s Ollie and Quentin for the 18th is a Zeno’s Paradox-based joke. This uses the most familiar of Zeno’s Paradoxes, about the problem of covering any distance needing infinitely many steps to be done in a finite time. Zeno’s Paradoxes are often dismissed these days (probably were then, too), on the grounds that the Ancient Greeks Just Didn’t Understand about convergence. Hardly; they were as smart as we are. Zeno had a set of paradoxes, built on the questions of whether space and time are infinitely divisible or whether they’re not. Any answer to one paradox implies problems in the others. There are things we still don’t really understand about infinity and infinitesimals and continuity. Someday I should do a proper essay about them.
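For anyone who wants the convergence in numbers rather than argument, here is a little sketch of my own (assuming the familiar halving of distances): the partial sums 1/2 + 1/4 + 1/8 + … get as close to the full distance as you like, without any "last" step ever being needed.

```python
def dichotomy_partial_sum(n):
    """Sum the first n of Zeno's halving steps: 1/2 + 1/4 + ... + 1/2^n."""
    return sum(0.5 ** k for k in range(1, n + 1))

after_10 = dichotomy_partial_sum(10)   # already past 99.9 percent of the way
after_50 = dichotomy_partial_sum(50)   # within 2^-50 of the whole distance
```

The sums creep up on 1 but never reach it at any finite step, which is exactly why "the Greeks didn't understand convergence" misses what is interesting about the puzzle.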

Dave Coverly’s Speed Bump for the 18th is not exactly an anthropomorphic-numerals joke. It is about making symbols manifest in the real world, at least. The greater-than and less-than signs as we know them were created by the English mathematician Thomas Harriot, and introduced to the world in his posthumous Artis Analyticae Praxis (1631). He also had an idea of putting a . between the numerals of an expression and the letters multiplied by them, for example, “4.x” to mean four times x. We mostly do without that now, taking multiplication as assumed if two meaningful quantities are put next to one another. But we will use, now, a vertically-centered dot to separate terms multiplied together when that helps our organization. The equals sign we trace to the 16th century mathematician Robert Recorde, whose 1557 Whetstone of Witte uses long but recognizable equals signs. The = sign went into hibernation after that, though, until the 17th century, and even then it took some time to become well-used. So it often is with symbols.

Mr Tanner: 'Today we'll talk about where numbers come from. Take zero, for instance ... Quincy, do you know who invented the zero?' Quincy: 'I'm not sure, Mr Tanner, but from the grades I get it must have been one of my teachers.'
Ted Shearer’s Quincy for the 25th of April, 1978 and rerun the 19th of June, 2017. The question does make me wonder how far Mr Tanner was going to go with this. The origins of zero and one are great stuff for class discussion. Two, also. But what about three? Five? Ten? Twelve? Minus one? Irrational numbers, if the class has got up to them? How many students are going to be called on to talk about number origins? And how many truly different stories are there?

Ted Shearer’s Quincy for the 25th of April, 1978 and rerun the 19th of June, starts from the history of zero. It’s worth noting there are a couple of threads woven together in the concept of zero. One is the idea of “nothing”, which we’ve had just forever. I mean, the idea that there isn’t something to work with. Another is the idea of the … well, the additive identity, there being some number that’s one less than one and two less than two. That you can add to anything without changing the thing. And then there’s symbols. There’s the placeholder for “there are no examples of this quantity here”. There’s the denotation of … well, the additive identity. All these things are zeroes, and if you listen closely, they are not quite the same thing. Which is not weird. Most words mean a collection of several concepts. We’re lucky the concepts we mean by “zero” are so compatible in meaning. Think of the poor person trying to understand the word “bear”, or “cleave”.

John Deering’s Strange Brew for the 19th is a “New Math” joke, fittingly done with cavemen. Well, numerals were new things once. Amusing to me is that — while I’m not an expert — in quite a few cultures the symbol for “one” was pretty much the same thing, a single slash mark. It’s hard not to suppose that numbers started out with simple tallies, and the first thing to tally might get dressed up a bit with serifs or such but is, at heart, the same thing you’d get jabbing a sharp thing into a soft rock.

Guy Gilchrist’s Today’s Dogg for the 19th I’m sure is a rerun and I think I’ve featured it here before. So be it. It’s silly symbol-play and dog arithmetic. It’s a comic strip about how dogs are cute; embrace it or skip it.

Zach Weinersmith’s Saturday Morning Breakfast Cereal is, properly speaking, reruns when it appears on GoComics.com. For whatever reason Weinersmith ran a patch of mathematics strips there this past week. So let me bundle all that up. On the 19th he did a joke mathematicians get a lot, about how the only small talk anyone has about mathematics is how they hated mathematics. I’m not sure mathematicians have it any better than any other teachers, though. Have you ever known someone to say, “My high school gym class gave me a greater appreciation of the world”? Or talk about how grade school history opened their eyes to the wonders of the subject? It’s a sad thing. But there are a lot of things keeping teachers from making students feel joy in their subjects.

For the 21st Weinersmith makes a statistician joke. I can wrangle some actual mathematics out of an otherwise correctly-formed joke. How do we ever know that something is true? Well, we gather evidence. But how do we know the evidence is relevant? Even if the evidence is relevant, how do we know we’ve interpreted it correctly? Even if we have interpreted it correctly, how do we know that it shows what we want to know? Statisticians become very familiar with hypothesis testing, which amounts to the question, “does this evidence indicate that some condition is implausibly unlikely”? And they can do great work with that. But “implausibly unlikely” is not the same thing as “false”. A person knowledgeable enough, and honest, turns out to find few things that can be said for certain.
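A toy numerical version of that distinction, mine rather than the strip's: suppose a coin comes up heads 15 times in 20 flips. The one-sided p-value says this is implausibly unlikely for a fair coin, by the usual standards. It does not say the coin is unfair.

```python
from math import comb

def p_value_heads(n, k):
    """Chance of k or more heads in n flips of a genuinely fair coin."""
    return sum(comb(n, j) for j in range(k, n + 1)) / 2 ** n

p = p_value_heads(20, 15)
# p is about 0.02: implausibly unlikely by the usual 0.05 threshold,
# yet a perfectly fair coin still does this about one time in fifty.
```

Which is the whole gap between "implausibly unlikely" and "false": rare things happen, at exactly the rate the p-value names.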

The June 23rd strip is one I’ve seen go around Mathematics Twitter several times; it’s about the ways in which mathematical literacy would destroy modern society. It’s a cute and flattering portrait of mathematics’ power, which is probably why mathematicians like passing it back and forth. But … well, how would “logic” keep people from being fooled by scams? What makes a scam work is that the premise seems logical. And real-world problems — as opposed to logic-class problems — are rarely completely resolvable by deductive logic. There have to be the assumptions, the logical gaps, and the room for humbuggery that allow hoaxes and scams to slip through. And does anyone need a logic class to not “buy products that do nothing”? And what is “nothing”? I have more keychains than I have keys to chain, even if we allow for emergencies and reasonable unexpected extra needs. This doesn’t stop my buying keychains as souvenirs. Does a Penn Central-logo keychain “do nothing” merely because it sits on the windowsill rather than hold any sort of key? If so, was my love foolish to buy it as a present? Granted that buying a lottery ticket is a foolish use of money; is my life any worse for buying that than, say, a peanut butter cup that I won’t remember having eaten a week afterwards? As for credit cards — it’s not clear to me that people max out their credit cards because they don’t understand they will have to pay it back with interest. My experience has been people max out their credit cards because they have things they must pay for and no alternative but going further into debt. That people need more money is a problem of society, yes, but it’s not clear to me that a failure to understand differential equations is at the heart of it. (Also, really, differential equations are overkill to understand credit card debt. A calculator with a repeat-the-last-operation feature and ten minutes to play is enough.)
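The ten-minutes-with-a-calculator claim can be sketched out, with numbers of my own invention: multiply the balance by one-plus-the-monthly-rate, once per month, and watch.

```python
def balance_after_months(principal, monthly_rate, months):
    """Credit card growth: repeat 'multiply by (1 + rate)' once per month."""
    balance = principal
    for _ in range(months):
        balance *= 1 + monthly_rate  # the repeat-the-last-operation button
    return balance

# A $1000 balance at 1.5 percent per month, untouched for a year:
year_later = balance_after_months(1000, 0.015, 12)   # roughly $1195.62
```

No differential equations required; the compounding is just the same multiplication done over and over.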

A Leap Day 2016 Mathematics A To Z: Wlog


Wait for it.

Wlog.

I’d like to say a good word for boredom. It needs the good words. The emotional state has an appalling reputation. We think it’s the sad state someone’s in when they can’t find anything interesting. It’s not. It’s the state in which we are so desperate for engagement that anything is interesting enough.

And that isn’t a bad thing! Finding something interesting enough is a precursor to noticing something curious. And curiosity is a precursor to discovery. And discovery is a precursor to seeing a fuller richness of the world.

Think of being stuck in a waiting room, deprived of reading materials or a phone to play with or much of anything to do. But there is a clock. Your classic analog-face clock. Its long minute hand sweeps out the full 360 degrees of the circle once every hour, 24 times a day. Its short hour hand sweeps out that same arc every twelve hours, only twice a day. Why is the big unit of time marked with the short hand? Good question, I don’t know. Probably, ultimately, because it changes so much less than the minute hand that it doesn’t need the attention that length draws to it.

But let our waiting mathematician get a little more bored, and think more about the clock. The hour and minute hand must sometimes point in the same direction. They do at 12:00 by the clock, for example. And they will at … a little bit past 1:00, and a little more past 2:00, and a good while after 9:00, and so on. How many times during the day will they point the same direction?

Well, one easy way to do this is to work out how long it takes the hands, once they’ve met, to meet up again. Presumably we don’t want to wait the whole hour-and-some-more-time for it. But how long is that? Well, we know the hands start out pointing the same direction at 12:00. The first time after that will be after 1:00. At exactly 1:00 the hour hand is 30 degrees clockwise of the minute hand. The minute hand will need five minutes to catch up to that. In those five minutes the hour hand will have moved another 2.5 degrees clockwise. The minute hand needs about four-tenths of a minute to catch up to that. In that time the hour hand moves — OK, we’re starting to see why Zeno was not an idiot. He never was.

But we have this roughly worked out. It’s about one hour, five and a half minutes between one time the hands meet and the next. In the course of twelve hours there’ll be time for them to meet up … oh, of course, eleven times. Over the course of the day they’ll meet up 22 times and we can get into a fight over whether midnight counts as part of today, tomorrow, or both days, or neither. (The answer: pretend the day starts at 12:01.)
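My own sketch of the arithmetic above, for anyone who wants to see the chase converge: the Zeno-style catch-up closes a twelfth of the remaining gap each pass, and it sums to the closed-form answer of five and five-elevenths minutes past the hour. And taking that interval as representative (a point the essay interrogates next), meetings come every 12/11 hours, eleven per twelve hours, 22 per day.

```python
# Minute hand: 6 degrees per minute. Hour hand: 0.5 degrees per minute.
gap = 30.0      # degrees the hour hand leads at exactly 1:00
elapsed = 0.0   # minutes past 1:00
for _ in range(60):          # each pass shrinks the gap by a factor of 12
    catch_up = gap / 6.0     # minutes for the minute hand to cover the gap
    elapsed += catch_up
    gap = catch_up * 0.5     # how far the hour hand crept ahead meanwhile

closed_form = 60.0 / 11.0    # the geometric series: 5 / (1 - 1/12) minutes

# Meetings repeat every 12/11 hours, hence 22 times in a 24-hour day.
meetings_per_day = round(24 / (12 / 11))
```

The loop is Zeno's regression made mechanical: infinitely many ever-smaller catch-ups, finite total.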

Hold on, though. How do we know that the time between the hands meeting up at 12:00 and the one at about 1:05 is the same as the time between the hands meeting up near 1:05 and the next one, sometime a little after 2:10? Or between that one and the one at a little past 3:15? What grounds do we have for saying this one interval is a fair representation of them all?

We can argue that it should be fairly enough. Imagine that all the markings were washed off the clock. It’s just two hands sweeping around in circles, one relatively fast, one relatively slow, forever. Give the clockface a spin. When the hands come together again rotate the clock so those two hands are vertical, the “12:00” position. Is this actually 12:00? … Well, we’ve got a one-in-eleven chance it is. It might be a little past 1:05; it might be that time a little past 6:30. The movement of the clock hands gives no hint what time it really is.

And that is why we’re justified taking this one interval as representative of them all. The rate at which the hands move, relative to each other, doesn’t depend on what the clock face behind it says. The rate is, if the clock isn’t broken, always the same. So we can use information about one special case that happens to be easy to work out to handle all the cases.

That’s the mathematics term for this essay. We can study the one specific case without loss of generality, or as it’s inevitably abbreviated, wlog. This is the trick of studying something possibly complicated, possibly abstract, by looking for a representative case. That representative case may tell us everything we need to know, at least about this particular problem. Generality means what you might figure from the ordinary English meaning of it: it means this answer holds in general, as opposed to in this specific instance.

Some thought has to go in to choosing the representative case. We have to pick something that doesn’t, somehow, miss out on a class of problems we would want to solve. We mustn’t lose the generality. And it’s an easy mistake to make, especially as a mathematics student first venturing into more abstract waters. I remember coming up against that often when trying to prove properties of infinitely long series. It’s so hard to reason about a bunch of numbers whose identities I have no idea of; why can’t I just use the sequence, oh, 1/1, 1/2, 1/3, 1/4, et cetera, and let that be good enough? Maybe 1/1, 1/4, 1/9, 1/16, et cetera for a second test, just in case? It’s because it takes time to learn how to safely handle infinities.
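Those two test sequences make the danger plain, in a sketch of my own with cutoffs I chose: partial sums of 1/k keep climbing forever, while partial sums of 1/k² settle down; whatever I "proved" from one sequence might not survive the other, to say nothing of sequences I never thought to try.

```python
def partial_sum(term, n):
    """Add up term(1) + term(2) + ... + term(n)."""
    return sum(term(k) for k in range(1, n + 1))

harmonic_small = partial_sum(lambda k: 1.0 / k, 10_000)
harmonic_large = partial_sum(lambda k: 1.0 / k, 100_000)
squares_small = partial_sum(lambda k: 1.0 / k ** 2, 10_000)
squares_large = partial_sum(lambda k: 1.0 / k ** 2, 100_000)

# The harmonic sums grow by about ln(10) for each factor-of-ten of terms;
# the 1/k^2 sums have all but stopped moving, closing in on pi^2 / 6.
```

Two sequences that look equally innocent behave in opposite ways, which is exactly how a carelessly chosen representative case loses the generality.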

It’s still worth doing. Few of us are good at manipulating things in the abstract. We have to spend more mental energy imagining the thing rather than asking the questions we want of it. Reducing that abstraction — even if it’s just a little bit, changing, say, from “an infinitely-differentiable function” to “a polynomial of high enough degree” — can rescue us. We can try out things we’re confident we understand, and derive from it things we don’t know.

I can’t say that a bored person observing a clock would deduce all this. Parts of it, certainly. Maybe all, if she thought long enough. I believe it’s worth noticing and thinking of these kinds of things. And it’s why I believe it’s fine to be bored sometimes.

Reading the Comics, February 23, 2016: No Students Resist Word Problems Edition


This week Comic Strip Master Command ordered the mention of some of the more familiar bits of mathematical-premise stock that aren’t students resisting word problems. This happens sometimes.

Rick Stromoski’s Soup to Nutz for the 18th of February finds a fresh joke in the infinite-monkeys problem. Well, it uses a thousand monkeys here, but that hardly matters. If you had one long-enough-lived monkey at the typewriter then, in principle, you could expect it to type the works of Shakespeare. It’s how long it takes that changes. In practice, it’s going to be too long to wait for anyway. I wonder if the monkeys will ever get computers to replace their typewriters.
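The arithmetic of “how long it takes” is worth a sketch. Everything in it is an assumption for illustration only: a 26-key typewriter, a six-letter target, a monkey typing two keys a second.

```python
# Back-of-the-envelope estimate of the monkey-at-a-typewriter wait.
# For a target of length L on a K-key typewriter, a given block of L
# keystrokes matches with probability (1/K)**L, so the expected wait
# is on the order of K**L keystrokes. (The exact expectation depends
# on the target's internal repetitions; this is just the rough size.)
keys = 26                        # assumed: letters only, no spaces
target = "HAMLET"                # assumed: a modest six-letter goal
attempts = keys ** len(target)   # order-of-magnitude expected keystrokes

strokes_per_second = 2           # assumed typing speed
seconds = attempts / strokes_per_second
years = seconds / (60 * 60 * 24 * 365.25)
print(f"about {attempts:,} keystrokes, call it {years:.0f} years")

# Each added letter multiplies the wait by 26; the full works of
# Shakespeare push the timescale far past the age of the universe.
```

Six letters is a few years’ wait; it is the exponential growth with the length of the target that makes the full Shakespeare hopeless in practice.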

Carol Lay’s Lay Lines for the 19th finds a fresh joke in Zeno’s Paradoxes. Lay particularly uses the most famous of Zeno’s Paradoxes. That’s the one about not being able to get anywhere because you have to get halfway there first, and halfway to that, in infinite regression. The other of Zeno’s Paradoxes that anyone who hasn’t just read the Wikipedia article on them can remember is Achilles and the Tortoise. It’s the question of how one can catch up to something. By the time you get to where the thing ahead of you is now, it’s gotten farther ahead still. And it does so again, in infinite regression. The third of the Paradoxes is about motion, depicted here as an arrow trying to fly through the air. Allow that speed is the distance travelled divided by the time it takes to travel. But suppose time can be divided into infinitesimally tiny units. Then the distance the arrow travels in that time will also be infinitesimally tiny. So how can its speed have any meaningful definition? And the last is a hard-to-follow thing about three rows of objects moving relative to one another. I don’t feel confident describing it because I only intermittently feel like I understand what the paradox is getting at. I believe it’s supposed to be a problem with understanding how speeds can add together.

Anyway, the point of the paradoxes is not something as trite as “silly Ancient Greeks didn’t understand calculus”. They had an awfully good understanding of what makes calculus work. The point is that either space and time are infinitely divisible or else they aren’t. Either possibility has consequences that challenge our intuitions of how space and time should work.

Dave Blazek’s Loose Parts for the 19th uses scientific notation. It’s a popular way to represent large (and small) numbers. It’s built on the idea that there are two interesting parts to a number: about how big it is, and what its leading values are. We use some base, nearly always 10, raised to a power to represent how big the number is. And we use the rest, a number between 1 and whatever the base is, to represent the leading values. Blazek’s channel 3 × 10³ is just channel 3000, though. My satellite TV package has channels numbering from 6 up through 9999, although not all of them are used. Many are empty. Still, it would be a more excessive number of options if he were on channel 3 × 10⁶, or 3,000,000.
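The split into “how big” and “leading values” can be made mechanical. A sketch, with `sci` being a helper name of my own invention:

```python
import math

def sci(x):
    """Split x into (mantissa, exponent) with x == mantissa * 10**exponent
    and 1 <= mantissa < 10."""
    exponent = math.floor(math.log10(abs(x)))
    mantissa = x / 10 ** exponent
    return mantissa, exponent

print(sci(3000))        # (3.0, 3): channel 3 x 10^3 is channel 3000
print(sci(3_000_000))   # (3.0, 6): channel 3 x 10^6 would be 3,000,000
```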

Russell Myers’s Broom Hilda for the 22nd shows Nerwin trying to learn addition by using a real-world model. I tend to be willing to let people use whatever tool they find works to learn something. But any learning aid has its limits, and trying to get around them can be challenging, or just creepy.

Dave Whamond’s Reality Check for the 22nd is another version of that rounding-up joke that’s gone around Comic Strip Master Command, and your friends’ Facebook timelines, several times now. Well, I enjoy how suspicious the sheep up front are.

'Hammie, we do NOT call the police for 'Homework Emergencies'!!' 'But Dad, these fractions are killing me!'
Rick Kirkman and Jerry Scott’s Baby Blues for the 23rd of February, 2016. You know, that’s an awfully tiny mirror above the keys. There’s no way Wanda and Darryl can even see their whole faces in it.

Rick Kirkman and Jerry Scott’s Baby Blues for the 23rd I include mostly because I wanted some pictures to include here. But mathematics is always a reliable choice when one needs scary school work to do. And I grant that fractions are particularly unsettling. There is something exotic in being told 1/2 is much bigger than 1/6, when one knows that 2 is so much smaller than 6. And just when one’s gotten comfortable with that, someone has you subtract one fraction from another.

In the olden days of sailors and shipping, the pay for a ship’s crew would be in shares of the take of the whole venture. The story I have read, but which I am not experienced enough to verify, depends on not understanding fractions. Naive sailors would demand, rather than the offered 96th (or whatever) share of the revenues, a 100th or a 150th or an even bigger-denominator share, which the paymasters would pretend to struggle with before assenting to. Perhaps it’s so. Not understanding finance is as old as finance. But it does also feel like a legend designed to answer the question of when anyone will ever need to know mathematics anyway.
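Python’s Fraction type makes the sailors’ (alleged) mistake concrete. The pound figure is invented for illustration:

```python
from fractions import Fraction

offered = Fraction(1, 96)    # the paymaster's offer
demanded = Fraction(1, 150)  # the "bigger number" the naive sailor wants

# The bigger denominator is the smaller share.
assert demanded < offered
take = 1000                  # invented: the venture's total, in pounds
print(f"1/96 share:  {float(take * offered):.2f} pounds")
print(f"1/150 share: {float(take * demanded):.2f} pounds")

# The same surprise that rattles Hammie: 1/2 beats 1/6 even though
# 2 is so much smaller than 6.
assert Fraction(1, 2) > Fraction(1, 6)
```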

Words: SERDS O O O - -; MENOV O - O - -; GENBIN - O - - O O; and LINKUE - O - O - O. The Professor's explanation of infinity seemed like it was OOOOO-OOOOOO.
David L Hoyt and Jeff Knurek’s Jumble for the 24th of February, 2016. The link will likely expire in late March. The third scrambled word reveals to me that ‘nebing’ is totally a word that some science fiction project should be able to use.

David L Hoyt and Jeff Knurek’s Jumble for the 24th is not necessarily a mathematics comic. It could be philosophy or theology or possibly some other fields. Still, I imagine you can have fun working this out even if the final surprise-answer jumped out at me before I looked at the other words.

Reading the Comics, February 2, 2016: Pre-Lottery Edition


So a couple weeks ago one of the multi-state lotteries in the United States reached a staggering jackpot of one and a half billion dollars. And it turns out that “a couple weeks” is about the lead time most syndicated comic strip artists maintain. So there’s a rash of lottery-themed comic strips. There’s enough of them that I’m going to push those off to the next Reading the Comics installment. I’ll make do here with what Comic Strip Master Command sent us before thoughts of the lottery infiltrated folks’ heads.

Punkinhead: 'I was counting to five and couldn't remember what came after seven.' Tiger: 'If you're counting to five nothing comes after seven.' Punkinhead: 'I thought sure he would know.'
Bud Blake’s Tiger for the 28th of January, 2016. I do like Punkinhead’s look of dismay in the second panel that Tiger has failed him.

Bud Blake’s Tiger for the 28th of January (a rerun; Blake’s been dead a long while) is a cute one about kids not understanding numbers. And about expectations of those who know more than you, I suppose. I’d say this is my favorite of this essay’s strips. Part of that is that it reminds me of a bit in one of the lesser Wizard of Oz books. In it the characters have to count by twos to seventeen to make a successful wish. That’s the sort of problem you expect in fairy lands and quick gags.

Mort Walker’s Beetle Bailey (Vintage) from the 7th of July, 1959 (reprinted the 28th of January) also tickles me. It uses the understanding of mathematics as stand-in for the understanding of science. I imagine it’s also meant to stand in for intelligence. It’s also a good riff on the Sisyphean nature of teaching. The equations on the board at the end almost look meaningful. At least, I can see some resemblance between them and the equations describing orbital mechanics. Camp Swampy hasn’t got any obvious purpose or role today. But the vintage strips reveal it had some role in orbital rocket launches. This was in the late 50s, when orbital rockets were brand new and as likely as not to fail.

General: 'How's your project coming along to teach the men some science, Captain?' Captain: 'Wonderful, sir. Six months ago they didn't know what the square root of four was! Now they don't know what this [ blackboard full of symbols ] is!'
Mort Walker’s Beetle Bailey (Vintage) for the 7th of July, 1959. This is possibly the brightest I’ve ever seen Beetle, and he doesn’t know what he’s looking at.

Matt Lubchansky’s Please Listen To Me for the 28th of January is a riff on creationist “teach the controversy” nonsense. So we get some nonsense about a theological theory of numbers. Historically, especially in the western tradition, much great mathematics was done by theologians. Lazy histories of science make out religion as the relentless antagonist to scientific knowledge. It’s not so.

The equation from the last panel, F(s) = \mathcal{L}\left\{f(t)\right\} = \int_0^{\infty} e^{-st} f(t) dt , is a legitimate one. It describes the Laplace Transform of the function f(t). It’s named for Pierre-Simon Laplace. That name might be familiar from mathematical physics, astronomy, the “nebular” hypothesis of planet formation, probability, and so on. Laplace transforms have many uses. One is in solving differential equations. They can change a differential equation, hard to solve, into a polynomial equation, easy to solve. Then by inverting the Laplace transform you can solve the original, hard, differential equation.
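The transform is easy to check numerically, at least for tame functions. Here’s a sketch using crude midpoint-rule integration, which is an illustration and not how anyone computes transforms in practice: for f(t) = e⁻ᵗ the transform should come out to 1/(s + 1).

```python
import math

def laplace(f, s, upper=50.0, steps=200_000):
    """Approximate the integral from 0 to `upper` of e^(-s t) f(t) dt
    by the midpoint rule. Assumes f decays fast enough that the tail
    past `upper` is negligible."""
    dt = upper / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        total += math.exp(-s * t) * f(t) * dt
    return total

f = lambda t: math.exp(-t)   # a function whose transform we know exactly
s = 2.0
print(laplace(f, s), 1 / (s + 1))   # both very near 1/3
```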

Another major use that I’m familiar with is signal processing. Often we will have some data, a signal, that changes in time or in space. The Laplace transform lets us look at the frequency distribution. That is, what regularly rising and falling patterns go into making up the signal (or could)? If you’ve taken a bit of differential equations this might sound like it’s just Fourier series. It’s related. (If you don’t know what a Fourier series might be, don’t worry. I bet we’ll come around to discussing it someday.) It might also remind readers here of the z-transform and yes, there’s a relationship.
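“What regular patterns make up the signal” can be shown with a brute-force discrete Fourier transform. (Actual signal processing uses the FFT; this direct sum is only to show the idea.)

```python
import cmath, math

def dft(samples):
    """Discrete Fourier transform by direct summation, O(n^2)."""
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

n = 64
# A signal hiding a pattern: 5 full cycles across the window.
signal = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]
magnitudes = [abs(c) for c in dft(signal)]
peak = magnitudes.index(max(magnitudes[: n // 2]))
print(peak)   # 5: the transform finds the hidden frequency
```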

The transform also shows itself in probability. We’re often interested in the probability distribution of a quantity. That is, what values the quantity might possibly have, and how likely each of those values is. The Laplace transform lets us switch between the probability distribution and a thing called the moment-generating function. I’m not sure of an efficient way of describing what good that is. If you do, please, leave a comment. But it lets you switch from one description of a thing to another. And your problem might be easier in the other description.

John McPherson’s Close To Home for the 30th of January uses mathematics as the sort of thing that can have an answer just, well, you see it. I suppose only geography would lend itself to a joke like this (“What state is Des Moines in?”)

Wally explains to the Pointy-Haired Boss that he's in the Zeno's Paradox phase of the project, in which 'every step we take gets us halfway closer to launch', a pace that he hopes 'it will look' like he's keeping up. First week in, he is.
Scott Adams’s Dilbert for the 31st of January. The link will probably expire around the end of February or start of March.

Scott Adams’s Dilbert for the 31st of January mentions Zeno’s Paradox, nearly two and a half thousand years old and still going strong. I haven’t heard the paradox used as an excuse to put off doing work. It does remind me of the old saw that half your time is spent on the first 90 percent of the project, and half your time on the remaining 10 percent. It’s absurd but truthful, as so many things are.

Samson’s Dark Side Of The Horse for the 2nd of February (I’m skipping some lottery strips to get here) plays on the merger of the ideas of “turn my life completely around” and “turn around 360 degrees”. A perfect 360 degree rotation would be an “identity transformation”, leaving the thing it’s done to unchanged. But I understand why the terms merged. As with many English words or terms, “all the way around” can mean opposite things.
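The identity-transformation point is easy to see with a rotation matrix; a sketch:

```python
import math

def rotate(point, degrees):
    """Rotate a 2D point about the origin by the given angle."""
    theta = math.radians(degrees)
    x, y = point
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

p = (3.0, 4.0)
print(rotate(p, 360))  # back to (3.0, 4.0), up to floating-point fuzz
print(rotate(p, 180))  # (-3.0, -4.0): actually facing the other way
```

Turning 360 degrees lands you where you started; it’s the 180-degree turn that changes your direction.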

But anyone playing pinball or taking time-lapse photographs or just listening to Heraclitus can tell you. Turning all the way around does not leave you quite what you were before. People aren’t perfect at rotations, and even if they were, the act of breaking focus and coming back to it changes what one’s doing.

Reading the Comics, January 15, 2015: Electric Brains and Klein Bottles Edition


I admit I don’t always find a theme running through Comic Strip Master Command’s latest set of mathematically-themed comics. The edition names are mostly so that I can tell them apart when I see a couple listed in the Popular Posts roundup anyway.

Little Iodine and her parents see an electronic brain capable of solving any problem; her father offers 'square root of 7,921 x^2 y^2'. It gets it correct, 89 xy. Little Iodine, inspired, makes her own. 'Where are you getting all the money for those ice cream cones and stuff?' her father demands. 'I made a 'lectric brain --- the kids pay me a nickel when they got homework --- here --- give me a problem.' He offers 9 times 16. The electric brain writes out 'Dere teecher, Plees xcuse my chidl for not doing his homwork'. 'And then these letters come out --- the kid gives it to the teacher and everything's okey --- '
Jimmy Hatlo’s Little Iodine for the 12th of January, 2016. Originally run the 7th of November, 1954.

Jimmy Hatlo’s Little Iodine is a vintage comic strip from the 1950s. It strikes me as an unlicensed adaptation of Baby Snooks, but that’s not something for me to worry about. The particular strip, originally from the 7th of November, 1954 (and just run the 12th of January this year) interests me for its ancient views of computers. It’s from the days they were called “electric brains”. I’m also impressed that the machine on display early on is able to work out the “square root of 7921 x² y²”. The square root of 7921 is no great feat. Being able to work with the symbols of x and y without knowing what they stand for, though, does impress me. I’m not sure there were computers which could handle that sort of symbolic manipulation in 1954. That sort of ability to work with a quantity by name rather than value is what we would buy Mathematica for, if we could afford it. It’s also at least a bit impressive that someone knows offhand that 89 squared is 7921. All told, I think this is my favorite of this essay’s set of strips. But it’s a weak field considering none of them are “students giving a snarky reply to a homework/exam/blackboard question”.

Joe Martin’s Willy and Ethel for the 13th of January is a percentages joke. Some might fault it for talking about people giving 110 percent, but of course, what is “100 percent”? If it’s the standard amount of work being done then it does seem like ten people giving 110 percent gets the job done as quickly as eleven people doing 100 percent. If work worked like that.

Willy asks his kid: 'OK, here's a question of my own that involves math and principles. If you're on an 11 man crew and 10 of them are giving 110%, do you have to show up for work?'
Joe Martin’s Willy and Ethel for the 13th of January, 2016. The link will likely expire in mid-February.

Steve Sicula’s Home and Away for the 13th (a rerun from the 8th of October, 2004) gives a wrongheaded application of a decent principle. The principle is that of taking several data points and averaging their value. The problem with data is that it’s often got errors in it. Something weird happened and it doesn’t represent what it’s supposed to. Or it doesn’t represent it well. By averaging several data points together we can minimize the influence of a fluke reading. Or if we’re measuring something that changes in time, we might use a running average of the last several sampled values. In this way a short-term spike or a meaningless flutter will be minimized. We can avoid wasting time reacting to something that doesn’t matter. (The cost of this, though, is that if a trend is developing we will notice it later than we otherwise would.) Still, sometimes a data point is obviously wrong.
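A running average in miniature, with an invented set of readings containing one obviously-wrong spike:

```python
def running_average(samples, window):
    """Average each value with up to window - 1 previous values."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

readings = [10, 10, 10, 90, 10, 10, 10]   # one fluke reading of 90
smoothed = running_average(readings, 3)
print(smoothed)
# The spike's influence is spread out and shrunk: no smoothed value
# comes anywhere near 90. The cost is that a real trend would show
# up a couple of samples late.
```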

Zach Weinersmith’s Saturday Morning Breakfast Cereal wanted my attention, and so on the 13th it did a joke about Zeno’s Paradox. There are actually four classic Zeno’s Paradoxes, although the one riffed on here I think is the most popular. This one — the idea that you can’t finish something (leaving a room is the most common form) because you have to get halfway done, and have to get halfway to being halfway done, and halfway to halfway to halfway to being done — is often resolved by people saying that Zeno just didn’t understand that an infinite series could converge. That is, that you can add together infinitely many numbers and get a finite number. I’m inclined to think Zeno did not, somehow, think it was impossible to leave rooms. What the paradoxes as a whole get to are questions about space and time: they’re either infinitely divisible or they’re not. And either way produces effects that don’t seem to quite match our intuitions.
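The convergence people invoke is easy to watch happen. Summing the halfway-there distances, as a sketch:

```python
# Partial sums of 1/2 + 1/4 + 1/8 + ... creep toward 1 without
# ever passing it.
distance = 0.0
for n in range(1, 51):
    distance += 1 / 2 ** n
    if n in (1, 2, 3, 10, 50):
        print(n, distance)

# After 50 halvings the runner is within about 10^-15 of the door,
# and no number of further terms pushes the total past 1.
```

Of course, whether a convergent sum settles the philosophical question is exactly what the paradox is asking.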

The next day Saturday Morning Breakfast Cereal does a joke about Klein bottles. These are famous topological constructs. At least they’re famous in the kinds of places people talk about topological constructs. The Klein bottle is much like the Möbius strip, a ribbon given a twist and joined back to its edge. You can imagine the Klein bottle similarly, as a cylinder stretched out into the fourth dimension, given a twist, then joined back to itself. We can’t really do this, what with it being difficult to craft four-dimensional objects. But we can imagine it, and it creates an object that doesn’t have a boundary, and has only one side. There’s no inside or outside. There’s no making this in the real world, but we can make nice-looking approximations, usually as bottles.

Ruben Bolling’s Super-Fun-Pak Comix for the 13th of January is an extreme installment of Chaos Butterfly. The trouble with touching Chaos Butterfly to cause disasters is that you don’t know — you can’t know — what would have happened had you not touched the butterfly. You change your luck, but there’s no way to tell whether for the better or worse. One of the commenters at Gocomics.com alludes to this problem.

Jon Rosenberg’s Scenes From A Multiverse for the 13th of January makes quite literal quantum mechanics talk about probability waves and quantum foam and the like. The wave formulation of quantum mechanics, the most popular and accessible one, describes what’s going on in equations that look much like the equations for things diffusing into space. And quantum mechanical problems are often solved by supposing that the probability distribution we’re interested in can be broken up into a series of sinusoidal waves. Representing a complex function as a set of waves is a common trick, not just in quantum mechanics, because it works so well so often. Sinusoidal waves behave in nice, predictable ways for most differential equations. So converting a hard differential equation problem into a long string of relatively easy differential equation problems is usually a good trade.

Tom Thaves’s Frank and Ernest for the 14th of January ties together the baffling worlds of grammar and negative numbers. It puts Frank and Ernest on panel with Euclid, who’s a fair enough choice to represent the foundation of (western) mathematics. He’s famous for the geometry we now call Euclidean. That’s the common everyday kind of blackboards and tabletops and solid cubes and spheres. But among his writings are compilations of arithmetic, as understood at the time. So if we know anyone in Ancient Greece to have credentials to talk about negative numbers it’s him. But the choice of Euclid traps the panel into an anachronism: the Ancient Greeks just didn’t think of negative numbers. They could work through “a lack of things” or “a shortage of something”, but a negative? That’s a later innovation. But it’s hard to think of a good rewriting of the joke. You might have Isaac Newton be consulted, but Newton makes normal people think of gravity and physics, confounding the mathematics joke. There’s a similar problem with Albert Einstein. Leibniz or Gauss should be good, but I suspect they’re not the household names that even Euclid is. And if we have to go “less famous mathematician than Gauss” we’re in real trouble. (No, not Andrew Wiles. Normal people know him as “the guy that proved Fermat’s thing”, and that’s too many words to fit on panel.) Perhaps the joke can’t be made to read cleanly and make good historic sense.

Reading The Comics, November 4, 2014: Will Pictures Ever Reappear Edition


I had assumed that at some point the good folks at Comics Kingdom would let any of their cartoonists do a panel that’s got mathematical content relevant enough for me to chat about, but apparently that’s just not happening. So for a third time in a row here’s a set of Gocomics-only comic strips, with reasonably stable links and images I don’t feel the need to include. Enjoy, please.

Fred Wagner’s Animal Crackers (October 26) presents an old joke: counting the number of animals by counting the number of legs and dividing by four. It’s only silly because it’s hard to imagine a case where it’s easier to count the legs on a bunch of animals than it is to count the animals themselves. But if it’s the case that every animal has exactly four legs, then there’s what’s called a one-to-one relationship between the set of animals and the set of animal legs. If you have some number of animals you have exactly four times that number of animal legs, and if you have some number of animal legs you have exactly one-fourth that number of animals. You can count whichever is more convenient for you and use that to get what you’re really interested in. Showing that such a one-to-one relationship exists between two interesting things can often be a start to doing more interesting problems, especially if you can show that the relationship also preserves some interesting interactions; if you have two ways to work out a problem, you can do the easier one.
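In code the trick is nothing more than multiplying or dividing by four, but it shows the shape of the correspondence (the names here are invented for illustration):

```python
def animals_from_legs(legs):
    """Recover the animal count from the leg count, assuming every
    animal has exactly four legs."""
    assert legs % 4 == 0, "the four-legs assumption has failed"
    return legs // 4

herd = ["cow", "horse", "sheep", "goat", "pig"]
legs = 4 * len(herd)             # count whichever is more convenient
print(animals_from_legs(legs))   # 5, same as counting the herd directly
```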

Mark Anderson’s Andertoons (October 27) riffs on the place value for numbers written in the familiar Arabic style. As befitting a really great innovation, place value becomes invisible when you’re familiar with it; it takes a little sympathy and imagination to remember the alienness of the idea that a “2” means different things based on how many digits are to the right (or, if it’s a decimal, to the left) of it.
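The “2 means different things” point, spelled out mechanically:

```python
# Each digit's contribution depends on how many digits sit to its right.
number = "222"
total = 0
for i, digit in enumerate(number):
    place = len(number) - 1 - i   # count of digits to the right
    value = int(digit) * 10 ** place
    total += value
    print(f"the {digit} with {place} digit(s) to its right contributes {value}")

print(total)   # 200 + 20 + 2 = 222
```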

Anthony Blades’s charming Bewley (October 27) has one of the kids insisting that instinct alone is enough to do maths problems. The work comes out disastrously bad, of course, or there’d not be a comic strip. However, my understanding is that people do have some instinctive understanding even of problems that would seem to have little survival application. One test I’ve seen demonstrating this asks people to give, without thinking, their answer to whether a multiplication problem might be right or wrong. It’s pretty quick for most people to say that “7 times 9 equals 12” has to be wrong; to say that “7 times 9 equals 59” is wrong takes longer, and that seems to reflect an idea that 59 is, if not the right answer, at least pretty close to it. There’s an instinctive plausibility at work there and it’s amazing to think people should have that. Zach Weinersmith’s Saturday Morning Breakfast Cereal for October 31 circles around this idea, with one person having little idea what 1,892,491,287 times 7,798,721,415 divided by 82,493,726,631 might be, but being pretty sure that “4” isn’t it.
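That instinct is really an order-of-magnitude estimate, and it can be spelled out. Rounding each of the strip’s numbers to one significant figure bounds the answer without doing the full multiplication:

```python
a = 1_892_491_287    # about 2e9
b = 7_798_721_415    # about 8e9
c = 82_493_726_631   # about 8e10

estimate = (2e9 * 8e9) / 8e10   # rough guess: 2e8
exact = a * b / c
print(f"estimate ~ {estimate:.1e}, exact ~ {exact:.3e}")
# Either way the answer is in the hundreds of millions, so "4" is
# implausible by about eight orders of magnitude.
```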

Saturday Morning Breakfast Cereal (October 30) also contains a mention of “cross products”, which are an interesting thing people learning vectors trip over. A cross product is defined for a pair of three-dimensional vectors, and the interesting thing is it’s a new vector that’s perpendicular to the two vectors multiplied together. The length of the cross product vector depends on the lengths of the two vectors multiplied together and the angle they make; the closer the two vectors multiplied together are, the smaller the cross product is, to the point that the cross product of two parallel vectors has length zero. The closer the two vectors multiplied together are to perpendicular the longer the cross product vector is.

More mysterious: if you swap the first vector and the second vector being cross-multiplied together, you get a cross product that’s the same size but pointing the opposite direction, pointing (say) down instead of up. Cross products have some areas where they’re particularly useful, especially in describing the movement of charged particles in magnetic fields.
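The perpendicularity, the parallel-vectors-give-zero case, and the sign flip on swapping can all be checked directly; a sketch:

```python
def cross(u, v):
    """Cross product of two 3-dimensional vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

u, v = (1, 0, 0), (0, 1, 0)
w = cross(u, v)
print(w)                    # (0, 0, 1): perpendicular to both inputs
print(cross(v, u))          # (0, 0, -1): swap the factors, flip the sign
print(cross(u, (2, 0, 0)))  # (0, 0, 0): parallel vectors, zero length

assert dot(w, u) == 0 and dot(w, v) == 0
```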

(There’s something that looks a lot like the cross product which exists for seven-dimensional vectors, but I’ve never even heard of anyone who had a use for it, so, you don’t need to do anything about it.)

Eric the Circle (November 2), this one by “dDave”, presents the idea that the points on a line might themselves be miniature Erics the Circle. What a line is made of is again one of those problems that straddles the line between mathematics and philosophy. It seems to be one of the problems of infinity that Zeno’s Paradoxes outlined so perfectly thousands of years ago. To shorten it to the point it becomes misleading: is a line made up of things that have some width? If they’re infinitesimals, things with no width, then how can an aggregate of things with no width come to have some width? But if they’re made up of things which have some width, how can there be infinitely many of them fitting into a finite space?

We can form good logical arguments about the convergence of infinite series — lining up, essentially, circles of ever-dwindling but ever-positive sizes so that the pile has a finite length — but that seems to suggest that space has to be made up of intervals of different widths, which seems silly; why couldn’t all the miniature circles be the same? In short, space is either infinitely divisible into identical things, or it is not, and neither one is completely satisfying.

Guy Gilchrist’s Nancy (November 2) uses math homework appearing in the clouds, although that’s surely because it’s easier to draw a division problem than it is to depict an assignment for social studies or English.

Todd Clark’s Lola (November 4) uses an insult-the-minor-characters variant of what seems to be the standard way of explaining fractions to kids, that of dividing a whole thing into smaller pieces and counting the number of smaller pieces. As physical interpretations of mathematical concepts goes I suppose that’s hard to beat.