## Reading the Comics, September 29, 2019: September 29, 2019 Edition

Several of the mathematically-themed comic strips from last week featured the fine art of calculation. So that was set to be my title for this week. Then I realized that all the comics worth some detailed mention were published last Sunday, and I do like essays that are entirely one-day affairs. There are a couple of other comic strips that mentioned mathematics tangentially and I’ll list those later this week.

John Hambrock’s The Brilliant Mind of Edison Lee for the 29th has Edison show off an organic computer. This is a person, naturally enough. Everyone can do some arithmetic in their heads, especially if we allow that approximate answers are often fine. People with good speed and precision have always been wonders, though. The setup may also riff on the ancient joke that a mathematician is a machine for turning coffee into theorems. (I would imagine that Hambrock has heard that joke. But it is enough to suppose that he’s aware many adult humans drink coffee.)

John Kovaleski’s Daddy Daze for the 29th sees Paul, the dad, working out the calculations his son (Angus) proposed. It’s a good bit of arithmetic that Paul’s doing in his head. The process of multiplying an insubstantial thing many, many times over until you get something of moderate size happens all the time. Much of integral calculus is based on the idea that we can add together infinitely many infinitesimal numbers, and from that get something understandable on the human scale. Saving nine seconds every other day is useless for actual activities, though. You need a certain fungibility in the thing conserved for the bother to be worth it.
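The accumulation is easy enough to sketch. A quick back-of-the-envelope in Python, taking the nine-seconds-every-other-day figure at face value (the yearly total is my illustration, not a number from the strip):

```python
# Accumulating a tiny saving: nine seconds every other day, over a year.
# The figures are illustrative; the strip's own arithmetic may differ.
seconds_saved = 9
days_per_year = 365

total_seconds = seconds_saved * (days_per_year / 2)  # every other day
total_minutes = total_seconds / 60

print(f"{total_seconds:.0f} seconds, about {total_minutes:.1f} minutes a year")
```

Which is not nothing, but you can’t spend twenty-seven scattered minutes on anything.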

Dan Thompson’s Harley for the 29th gets us into some comic strips not drawn by people named John. The comic has some mathematics in it qualitatively: the observation that you could jump a motorcycle farther, or higher, with more energy, and that you can get energy by rolling downhill. It’s here mostly because of the good fortune that another comic strip did a joke on the same topic, and did it quantitatively. That comic?

Bill Amend’s FoxTrot for the 29th. Young prodigies Jason and Marcus are putting serious calculation into their Hot Wheels track and working out the biggest loop-the-loop possible from a starting point. Their calculations are right, of course. Bill Amend, who’d been a physics major, likes putting authentic mathematics and mathematical physics in. The key is making sure the car moves fast enough in the loop that it stays on the track. This means the car experiences a centrifugal force that’s larger than the force of gravity. The centrifugal force on something moving in a circle is proportional to the square of the thing’s speed, and inversely proportional to the radius of the circle. This holds for a circle oriented in any direction, by the way.
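The stay-on-track condition at the top of the loop comes out of that proportionality: the centripetal acceleration v²/r has to be at least the gravitational acceleration g. A minimal sketch, with made-up numbers rather than anything from the strip:

```python
# Stay-on-track condition at the top of a vertical loop:
# centripetal acceleration v**2 / r must be at least g, so the
# track pushes on the car instead of the car falling away from it.
g = 9.81  # gravitational acceleration, m/s^2

def stays_on_track(v, r):
    """True if speed v (m/s) keeps a car on a loop of radius r (m)."""
    return v ** 2 >= g * r

print(stays_on_track(3.0, 0.5))  # 9.0 >= 4.905, so True
print(stays_on_track(2.0, 0.5))  # 4.0 <  4.905, so False
```

Note the radius in the denominator: a bigger loop demands a faster car, which is why there’s a biggest possible loop for any given starting height.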

So they need to know, if the car starts at the height A, how fast will it go at the top of the loop, at height B? If the car’s going fast enough at height B to stay on the track, it’s certainly going fast enough to stay on for the rest of the loop.

The hard part would be figuring the speed at height B. Or it would be hard if we tried calculating the forces, and thus acceleration, of the car along the track. This would be a tedious problem. It would depend on the exact path of the track, for example. And it would be a long integration problem, which is trouble. There aren’t many integrals we can actually calculate directly. Most of the interesting ones we have to do numerically or work on approximations of the actual thing. This is all right, though. We don’t have to do that integral. We can look at potential energy instead. This turns what would be a tedious problem into the first three lines of work. And one of those was “Kinetic Energy = Δ Potential Energy”.
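Those first three lines of work fit comfortably in a few lines of code. A sketch, assuming a frictionless track and a car released from rest, with a guessed-at starting height rather than the strip’s actual numbers: energy conservation gives the speed at the top of the loop, and the stay-on-track condition v² ≥ gr turns that into a largest workable radius.

```python
from math import sqrt

g = 9.81  # m/s^2

def speed_at_height(A, B):
    """Speed at height B of a frictionless car released from rest at height A.
    Kinetic energy gained = potential energy lost, so v = sqrt(2g(A - B))."""
    return sqrt(2 * g * (A - B))

def max_loop_radius(A):
    """Largest loop sitting on the floor (top at height 2r) that a drop
    from height A can sustain: 2g(A - 2r) >= g*r  implies  r <= 2A/5."""
    return 2 * A / 5

A = 1.0  # metres; a guess at a tabletop-scale track
r = max_loop_radius(A)
v_top = speed_at_height(A, 2 * r)
print(f"r = {r:.2f} m, v_top = {v_top:.2f} m/s")
# At this limiting radius, v_top**2 equals g*r exactly.
```

Notice the track’s shape never enters into it, only the heights, which is the whole charm of the energy argument.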

But as Peter observes, this does depend on supposing the track is frictionless. We always do this in basic physics problems. Friction is hard. It does depend on the exact path one follows, for example. And it depends on speed in complicated ways. We can make approximations to allow for friction losses, often based on experiment. Or try to make the problem one that has less friction, as Jason and Marcus are trying to do.

Jeffrey Caulfield and Alexandre Rouillard’s Mustard and Boloney for the 29th is the anthropomorphic numerals joke for the week. This is a slight joke to include here. But there were many comic strips of slight mathematical content. I intend to list them in an essay on Wednesday.

I plan for Tuesday to be a day for the Fall 2019 A-to-Z. Again, thank you for reading.

## Reading the Comics, July 3, 2018: Fine, Jef Mallett Wants My Attention Edition

Three of these essays in a row now that Jef Mallett’s Frazz has done something worth responding to. You know, the guy lives in the same metro area. He could just stop in and visit sometime. There’s a pinball league in town and everything. He could view it as good healthy competition.

Bill Hinds’s Cleats for the 1st is another instance of the monkeys-on-typewriters metaphor. The metaphor goes back at least as far as 1913, when Émile Borel wrote a paper on statistical mechanics and the reversibility problem. Along the way it was worth thinking of the chance of impossibly unlikely events, given enough time to happen. Monkeys at typewriters formed a great image for a generator of text that knows no content or plan. Given enough time, this random process should be able to produce all the finite strings of text, whatever their content. And the metaphor’s caught people’s fancy. I guess there’s something charming and Dadaist about monkeys doing office work. Borel started out with a million monkeys typing ten hours a day. Modern audiences sometimes make this an infinite number of monkeys typing without pause. This is a reminder of how bad we’re allowing pre-revolutionary capitalism to get.

Sometimes it’s cut down to a mere thousand monkeys, as in this example. Often it’s Shakespeare, but sometimes it’s other authors who get duplicated. Dickens seems like a popular secondary choice. In joke forms, the number of monkeys and the time it would take to duplicate something is held up as a measure of the quality of the original work. This comes from people who don’t understand the mathematics. Suppose the monkeys and typewriters are producing truly random strings of characters. Then the only thing that affects how long it takes them to duplicate some text is the length of the original text. How good the text is doesn’t enter into it.
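That length-only dependence is easy to see directly. A tiny demonstration, assuming a hypothetical typewriter with 27 equally likely keys (26 letters plus a space; any alphabet size works the same way):

```python
from fractions import Fraction

ALPHABET = 27  # hypothetical typewriter: 26 letters plus a space

def match_probability(text):
    """Probability that len(text) independent, uniformly random keystrokes
    reproduce text exactly. Depends only on the length, never the content."""
    return Fraction(1, ALPHABET) ** len(text)

# Two very different eleven-character strings, identical probability:
print(match_probability("to be or no") == match_probability("xqzzjkwvbnm"))  # True
```

Shakespeare and keyboard-mashing of the same length are exactly as hard to type at random; only a longer text takes longer.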

Jef Mallett’s Frazz for the 1st is about the comfort of knowing about things one does not know. And that’s fine enough. Frazz cites Fermat’s Last Theorem as a thing everyone knows of but doesn’t understand. And that choice confuses me. I’m not sure what there would be to Fermat’s Last Theorem that someone who had heard of it would not understand. The basic statement of it — if you have three positive whole numbers a, b, and c, then there’s no whole number n larger than 2 for which $a^n + b^n$ equals $c^n$ — has all of it.
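The statement really is that accessible: you can spot-check it by brute force. A sketch, searching small numbers and small exponents (illustration only, of course; the theorem covers all positive whole numbers and no finite search proves it):

```python
# Brute-force spot check of Fermat's Last Theorem for small values:
# no a**n + b**n == c**n should turn up for n > 2.
LIMIT = 30

counterexamples = [
    (a, b, c, n)
    for n in range(3, 6)            # exponents 3, 4, 5
    for a in range(1, LIMIT)
    for b in range(a, LIMIT)        # b >= a avoids duplicate pairs
    for c in range(b, 2 * LIMIT)    # c < 2*LIMIT covers all possible sums here
    if a ** n + b ** n == c ** n
]

print(counterexamples)  # an empty list
```

Running it finds nothing, as the theorem (now) guarantees.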

But “understanding” is a flexible concept. He might mean that people don’t know why the Theorem is true. Fair enough. Andrew Wiles and Richard Taylor’s proof is a long thing that goes deep into a field of mathematics that even most mathematicians don’t study. Why it should be true can be an interesting question, and one that’s hard to ever satisfyingly answer. What is the difference between a proof that something is true and an explanation for why it’s true? And before you say there’s not one, please consider that many mathematicians do experience a difference between seeing something proved and understanding why something is true.

And Frazz might also mean that nobody knows what use Fermat’s Last Theorem is. This is a fair complaint too. I’m not aware offhand of any interesting results which follow from its truth, nor of anything neat that would come about had it been false. It’s just one of those things that happens to be true, and that we’ve found to be pretty, perhaps because it is easy to ask whether it’s true and hard to answer. I don’t know.

Morrie Turner’s Wee Pals for the 2nd has a kid looking for a square root. We all have peculiar hobbies. His friends speak of it as though it’s a lost physical object. This is a hilarious misunderstanding until it strikes you that we speak about stuff like square roots “existing”. Indeed, the language of mathematics would be trashed if we couldn’t speak about numerical constructs “existing” somewhere to be “found”. But try to put “four” in a box and see what you get. That we mostly have little trouble understanding what we mean by showing some mathematical construct exists, and what we hope to do by looking for it, suggests we roughly know what we mean by the phrases. All right then; what is that, in terms a kid could understand?

Call S the number whose square root you’re looking for, and x0 your starting guess at that square root. Your first iteration, the first guess for a better answer, is to calculate the number $x_1 = \frac{1}{2}\left( x_0 + \frac{S}{x_0}\right)$. Typically, x1 will be closer to the square root of S than x0 was. And in any case, we can get closer still. Use x1 to calculate a new number. This is $x_2 = \frac{1}{2}\left( x_1 + \frac{S}{x_1}\right)$. And then x3 and x4 and x5 and so on. In theory, you never finish; you’re stuck finding an infinitely long sequence of better approximations to the square root. In practice, you finish; you find that you’re close enough to the square root. Well, the square root of a whole number is either a whole number (if it was a perfect square to start) or an irrational number. You were going to stop on an approximation sooner or later.
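The scheme, often called the Babylonian or Heron’s method, is short enough to write out. A sketch in Python, with a “close enough” cutoff standing in for the infinitely long sequence:

```python
def babylonian_sqrt(S, x0=1.0, tolerance=1e-12):
    """Approximate the square root of a positive number S by repeatedly
    replacing the guess x with the average of x and S/x, stopping once
    x*x is within a relative tolerance of S ("close enough")."""
    x = x0
    while abs(x * x - S) > tolerance * S:
        x = (x + S / x) / 2
    return x

print(babylonian_sqrt(2))  # about 1.4142135623730951
```

Any positive starting guess works; a bad one just costs a few extra iterations, since the number of correct digits roughly doubles each step.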