My Little 2021 Mathematics A-to-Z: Convex


Jacob Siehler, a friend from Mathstodon, and Assistant Professor at Gustavus Adolphus College, offered several good topics for the letter ‘C’. I picked the one that seemed to connect to the greatest number of other topics I’ve covered recently.

Convex

It’s easy to say what convex is, if we’re talking about shapes in ordinary space. A convex shape is one where the line connecting any two points inside the shape always stays inside the shape. Circles are convex. Triangles and rectangles too. Star shapes are not. Is a torus? That depends. If it’s a doughnut shape sitting in some bigger space, then it’s not convex. If the doughnut shape is all the space there is to consider, then it is. There’s a parallel here to prime numbers. Whether 5 is a prime depends on whether you think 5 is an integer, a real number, or a complex number.

Still, this seems easy to the point of boring. So how does Wolfram Mathworld match 337 items for ‘convex’? For a sense of scale, it has only 112 matches for ‘quadrilateral’. ‘Convex’ turns up almost as much as ‘quadratic’, which has 370 items. Why?

Why? It’s one of those terms that sneaks in everywhere. Some of it is obvious. There’s a concept called “star-convex”, which demands only that there be some point in the set that connects to every other point by a straight line staying inside the set. That’s a familiar mathematical trick, coming up with a less-demanding version of a property. There’s the “convex hull”, which is the smallest convex set that contains a given set of points. We even come up with “convex functions”, functions of real numbers. A function’s convex if the space above the graph of the function is convex. This seems like stretching the idea of convexity rather a lot.
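If you like poking at definitions with code, here’s a minimal sketch of both ideas, assuming you have NumPy and SciPy at hand. It finds the convex hull of a scattering of random points, and checks that a chord of a convex function lies on or above its graph:

```python
# A minimal sketch, not anything from the essay: the convex hull of random
# points, and the chord test for a convex function.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)
points = rng.random((20, 2))
hull = ConvexHull(points)                 # smallest convex set containing the points
print("indices of the hull's corners:", hull.vertices)

# f is convex when every chord lies on or above the graph:
# f(t*a + (1-t)*b) <= t*f(a) + (1-t)*f(b) for t in [0, 1].
f = lambda x: x ** 2
a, b, t = -1.0, 3.0, 0.25
print(f(t * a + (1 - t) * b) <= t * f(a) + (1 - t) * f(b))   # True
```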

Still, we wouldn’t coin such a term if we couldn’t use it. Well, if someone couldn’t use it. The saving thing here is the idea of “space”. We get our idea of what space is from looking around rooms and walking around hills and stuff. But what makes something a space, when we look at what’s essential? What we need are traits like: there are things. We can measure how far apart things are. We have some idea of paths between things. That’s not asking a lot.

So many things become spaces. And so convexity sneaks in everywhere. A convex function has nice properties if you’re looking for minimums. Or maximums; that’s as easy to do. And we look for minimums a lot. A large, practical set of mathematics is the search for optimum values, the set of values that maximize, or minimize, something. You may protest that not everything we’re interested in is a convex function. This is true. But a lot of what we are interested in is, or is approximately.

This gets into some surprising corners. Economics, for example. The mathematics of economics is often interested in how much of a thing you can make. But you have to put things in to make it. You expect, at least once the system is set up, that if you halve the components you put in you get half the thing out. Or double the components in and get double the thing out. But you can run out of the components. Or related stuff, like, floor space to store partly-complete product. Or transport available to send this stuff to the customer. Or time to get things finished. For our needs these are all “things you can run out of”.

And so we have a problem of linear programming. We have something or other we want to optimize. Call it y . It depends on a whole range of variables, which we describe as a vector \vec{x} . And we have constraints. Each of these is an inequality; we can represent that as demanding some functions of these variables be at most some numbers. We can bundle those functions together as a matrix called A . We can bundle those maximum numbers together as a vector called \vec{b} . So the problem is finding the \vec{x} that gives the best y while satisfying A\vec{x} \le \vec{b} . Also, we demand that none of these values be smaller than some minimum we might as well call 0. The range of all the possible values of these variables is a space. These constraints chop up that space, into a shape. Into a convex shape, of course, or this paragraph wouldn’t belong in this essay. If you need to be convinced of this, imagine taking a wedge of cheese and hacking away slices all the way through it. How do you cut a cave or a tunnel in it?
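If you want to see one of these solved, here’s a minimal sketch using SciPy’s linprog routine, with every number invented for illustration. linprog minimizes, so we maximize y by minimizing its negative:

```python
# A made-up problem: maximize y = 3*x1 + 5*x2, where making each product
# draws on two scarce things, and nothing can be made in negative amounts.
import numpy as np
from scipy.optimize import linprog

c = np.array([3.0, 5.0])            # value of each unit of product
A = np.array([[1.0, 2.0],           # how much of each scarce thing a unit uses
              [3.0, 1.0]])
b = np.array([14.0, 18.0])          # how much of each scarce thing exists

# linprog minimizes, so minimize -c.x subject to A x <= b and x >= 0.
result = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
print("best mix:", result.x, " best y:", -result.fun)   # x = (4.4, 4.8), y = 37.2
```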

So take this convex shape, called a polytope. That’s what we call a polygon or polyhedron if we don’t want to commit to any particular number of dimensions of space. (If we’re being careful. My suspicion is ‘polyhedron’ is more often said.) This makes a shape. Some point in that shape has the best possible value of y . (Also the worst, if that’s your thing.) Where is it? There is an answer, and it gives a pretext to share a fun story. The answer is that it’s on the outside, on one of the faces of the polytope. And you can find it by following along the edges of that polytope. This we know as the simplex method, or Dantzig’s Simplex Method if we must be more particular, for George Dantzig. Its success relies on looking at convex functions in convex spaces and how much this simplifies finding things.

Usually. The simplex method is one of polynomial-order complexity for normal, typical problems. That’s a measure of how much longer it takes to find an answer as you get more variables, more constraints, more work. Polynomial is okay, growing about the way it takes longer to multiply when you have more digits in the numbers. But there’s a worst case, in which the complexity grows exponentially. We shy away from exponential complexity because … you know, exponentials grow fast, given a chance. What saves us is that that’s a worst case, not a typical case. The convexity lets us set up our problem and, rather often, solve it well enough.

Now the story, a mutation of which it’s likely you encountered. George Dantzig, as a student in Jerzy Neyman’s statistics class, arrived late one day to find a couple problems on the board. He took these to be homework, and struggled with the harder-than-usual set, but turned them in, apologizing for their being late. Neyman accepted the work, and eventually got around to looking at it. This wasn’t the homework. This was some unsolved problems in statistics. Six weeks later Neyman had prepared them for publication. A year later, Neyman explained to Dantzig that all he needed to earn his PhD was to put these two papers together in a nice binder.

This cute story somehow escaped into the wild. It became an inspirational tale for more than mathematics grad students. That part’s easy to see; it has most everything inspiration needs. It mutated further, into the movie Good Will Hunting. I do not know whether the unsolved problems, work done in the late 1930s, relate to Dantzig’s simplex method, developed after World War II. It may be that they are connected only in their originator. But perhaps there is more to it than I realize.


I hope to finish off the word ‘Mathematics’ with the letter S next week. This week’s essay, and all the essays for the Little Mathematics A-to-Z, should be at this link. And all of this year’s essays, and all the A-to-Z essays from past years, should be at this link. Thank you for reading.

My All 2020 Mathematics A to Z: John von Neumann


Mr Wu, author of the Singapore Maths Tuition blog, suggested another biographical sketch for this year of biographies. Once again it’s of a person too complicated to capture in full in one piece, even at the length I’ve been writing. So I take a slice out of John von Neumann’s life here.

Color cartoon illustration of a coati in a beret and neckerchief, holding up a director's megaphone and looking over the Hollywood hills. The megaphone has the symbols + x (division obelus) and = on it. The Hollywood sign is, instead, the letters MATHEMATICS. In the background are spotlights, with several of them crossing so as to make the letters A and Z; one leg of the spotlights has 'TO' in it, so the art reads out, subtly, 'Mathematics A to Z'.
Art by Thomas K Dye, creator of the web comics Projection Edge, Newshounds, Infinity Refugees, and Something Happens. He’s on Twitter as @projectionedge. You can get to read Projection Edge six months early by subscribing to his Patreon.

John von Neumann.

In March 1919 the Hungarian People’s Republic, strained by Austria-Hungary’s loss in the Great War, collapsed. The Hungarian Soviet Republic, the world’s second Communist state, replaced it. It was a bad time to be a wealthy family in Budapest. The Hungarian Soviet lasted only a few months. It was crushed by the internal tension between city and countryside. By poorly-fought wars to restore the country’s pre-1914 borders. By the hostility of the Allied Powers. After the Communist leadership fled came a new Republic, and a pogrom. Europeans are never shy about finding reasons to persecute Jewish people. It was a bad time to be a Jewish family in Budapest.

Von Neumann was born to a wealthy, non-observant Jewish family in Budapest, in 1903. He acquired the honorific “von” in 1913, when his father Max Neumann was honored for service to the Austro-Hungarian Empire and paid for a hereditary appellation.

It is, once again, difficult to encompass von Neumann’s work, and genius, in one piece. He was recognized as a genius early. By 1923 he had published a logical construction for the counting numbers that’s still the modern default. His 1926 doctoral thesis was in set theory. He was invited to lecture on quantum theory at Princeton by 1929. He was one of the initial six mathematics professors at the Institute for Advanced Study. We have a thing called von Neumann algebras after his work. He gave the first rigorous proof of an ergodic theorem. He partly solved one of Hilbert’s problems. He studied non-linear partial differential equations. He was one of the inventors of the electronic computer as we know it, both the theoretical and the practical ideas.

And, the sliver I choose to focus on today, he made game theory into a coherent field.

The term “game theory” makes it sound like a trifle. We don’t call “genius” anyone who comes up with a better way to play tic-tac-toe. The utility of the subject appears when we notice what von Neumann thought he was writing about. Von Neumann’s first paper on this came in 1928. In 1944 he and Oskar Morgenstern published the textbook Theory Of Games And Economic Behavior. In Chapter 1, Section 1, they set their goals:

The purpose of this book is to present a discussion of some fundamental questions of economic theory which require a treatment different from that which they have found thus far in the literature. The analysis is concerned with some basic problems arising from a study of economic behavior which have been the center of attention of economists for a long time. They have their origin in the attempts to find an exact description of the endeavor of the individual to obtain a maximum of utility, or in the case of the entrepreneur, a maximum of profit.

Somewhere along the line von Neumann became interested in how economics worked. Perhaps because his family had money. Perhaps because he saw how one could model an “ideal” growing economy — matching price and production and demand — as a linear programming question. Perhaps because economics is a big, complicated field with many unanswered questions. There was, for example, little good idea of how attendees at an auction should behave. What is the rational way to bid, to get the best chances of getting the things one wants at the cheapest price?

In 1928, von Neumann abstracted all sorts of economic questions into a basic model. The model has almost no features, so very many games look like it. In this, you have a goal, and a set of options for what to do, and an opponent, who also has options of what to do. Also some rounds to achieve your goal. You can see how so abstract a structure describes many things one could do, from playing Risk to playing the stock market.

And von Neumann discovered that, in the right circumstances, you can find a rational way to bid at an auction. Or, at least, to get your best possible outcome whatever the other person does. The proof has the in-retrospect obviousness of brilliance. von Neumann used a fixed-point theorem. Fixed point theorems came to mathematics from thinking of functions as mappings. Functions match elements in a set called the domain to those in a set called the range. The function maps the domain into the range. If the range is also the domain? Then we can do an iterated mapping. Under the right circumstances, there’s at least one point that maps to itself.

In the light of game theory, a function is the taking of a turn. The domain and the range are the states of whatever’s in play. In this type of game, you know all the options everyone has. You know the state of the game. You know what the past moves have all been. You know what you and your opponent hope to achieve. So you can predict your opponent’s strategy. And therefore pick a strategy that gets you the best option available given your opponent is trying to do the same. So will your opponent. So you both end up with the best attainable outcome for the both of you; this is the minimax theorem.
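For the curious: finding a minimax strategy for a small zero-sum game reduces, by a standard trick, to exactly the linear programming problem described in the essay above. Here’s a sketch, with an invented payoff matrix; rescaling the strategy by the game’s value is the trick:

```python
# Sketch: the row player's minimax mixed strategy for a two-player zero-sum
# game, found by linear programming. The payoff matrix is invented; entries
# are what the row player wins for each pair of pure moves.
import numpy as np
from scipy.optimize import linprog

A = np.array([[ 1.0, -1.0],
              [-2.0,  3.0]])

shift = 1.0 - A.min()               # make all payoffs positive; shifts the value only
B = A + shift

# With x = p / v (p the mixed strategy, v the shifted game's value), this is:
# minimize sum(x) subject to B-transpose x >= 1, x >= 0.
m, n = B.shape
res = linprog(np.ones(m), A_ub=-B.T, b_ub=-np.ones(n), bounds=[(0, None)] * m)
v = 1.0 / res.x.sum()
print("strategy:", res.x * v, " value of the game:", v - shift)   # (5/7, 2/7), 1/7
```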

It may strike you that, given this, the game doesn’t need to be played anymore. Just pick your strategy, let your opponent pick one, and the winner is determined. So it would, if we played our strategies perfectly, and if we didn’t change strategies mid-game. I would chuckle at the mathematical view that we study a game to relieve ourselves of the burden of playing. But I know how many grand strategy video games I have that I never have time to play.

After this 1928 paper von Neumann went on to other topics for about a dozen years. Why create a field of mathematics and then do nothing with it? For one, we see it as a gap only because we are extracting, after the fact, this thread of his life. He had other work, particularly in quantum mechanics, operators, measure theory, and lattice theory. He surely did not see himself abandoning a new field. He saw, having found an interesting result, new interesting questions.

But Philip Mirowski’s 1992 paper What Were von Neumann and Morgenstern Trying to Accomplish? points out some context. In September 1930 Kurt Gödel announced his incompleteness proof. Any logical system complex enough has things which are true and can’t be proven. The system doesn’t have to be that complex. Mathematical rigor must depend on something outside mathematics. This shook von Neumann. He would say that after Gödel published, he never bothered reading another paper on symbolic logic. Mirowski believes this drove von Neumann into what we now call artificial intelligence. At least, into mathematics that draws from empirical phenomena. von Neumann needed time to recover from the shock. And needed the prodding of Morgenstern to return to economics.

After publication, Theory Of Games And Economic Behavior was … well, Mirowski calls it more “cited in reverence than actually read”. But game theory, as a concept? That took off. It seemed to offer a way to rationalize the world.

von Neumann would become a powerful public intellectual. He would join the Manhattan Project. He showed that the atomic bomb would be more destructive if it exploded kilometers above the ground, rather than at ground level. He was on the target selection committee which, ultimately, slated Hiroshima and Nagasaki for mass murder. He would become a consultant for the Weapons System Evaluation Group. They advised the United States Joint Chiefs of Staff on developing and using new war technology. He described himself, to a Senate committee, as “violently anti-communist and much more militaristic than the norm”. He is quoted in 1950 as remarking, “if you say why not bomb [ the Soviets ] tomorrow, I say, why not today? If you say today at five o’clock, I say why not one o’clock?”

The quote sounds horrifying. It makes game-theory sense, though. If war is inevitable, it is better fought when your opponent is weaker. And while the Soviet Union had won World War II, it was also ruined in the effort.

There is another game-theory-inspired horror for which we credit von Neumann. This is Mutual Assured Destruction. If any use of an atomic, or nuclear, weapon would destroy the instigator in retaliation, then no one would instigate war. So the nuclear powers need, not just nuclear arsenals. They need such vast arsenals that the remnant which survives the first strike can destroy the other powers in the second strike.

Perhaps the reasoning holds together. We did reach the dissolution of the Soviet Union without another atomic weapon used in anger. But it is hard to say that was rationally accomplished. There were at least two points, in 1962 and in 1983, when a world-ruining war could too easily have happened, by people following the “obvious” strategy.

Which brings a flaw of game theory, at least as applied to something as complicated as grand strategy. Game theory demands the rules be known, and agreed on. (At least that there is a way of settling rule disputes.) It demands we have the relevant information known truthfully. It demands we know what our actual goals are. It demands that we act rationally, and that our opponent acts rationally. It demands that we agree on what rational is. (Think of, in Doctor Strangelove, the Soviet choice to delay announcing its doomsday machine’s completion.) Few of these conditions obtain in grand strategy. They barely obtain in grand strategy games. von Neumann was aware of at least some of these limitations, though he did not live long enough to address them. He died of either bone, pancreatic, or prostate cancer, likely caused by radiation exposure working at Los Alamos.

Game theory has been, and is, a great tool in many fields. It gives us insight into human interactions. It does good work in economics, in biology, in computer science, in management. But we can come to very bad ends when we forget the difference between the game we play and the game we modelled. And if we forget that the game is value-indifferent. The theory makes no judgements about the ethical nature of the goal. It can’t, any more than the quadratic equation can tell us whether ‘x’ is the fielder who will catch the fly ball or the person who will be killed by a cannonball.

It makes an interesting parallel to the 19th century’s greatest fusion of mathematics and economics. This was utilitarianism, one famous attempt to bring scientific inquiry to the study of how society should be set up. Utilitarianism offers exciting insights into, say, how to allocate public services. But it struggles to explain why we should refrain from murdering someone whose death would be convenient. We need a reason besides the maximizing of utility.

No war is inevitable. One comes about only after many choices. Some are grand choices, such as a head of government issuing an ultimatum. Some are petty choices, such as the many people who enlist as the sergeants that make an army exist. We like to think we choose rationally. Psychological experiments, and experience, and introspection tell us we more often choose and then rationalize.

von Neumann was a young man, not yet in college, during the short life of the Hungarian Soviet Republic, and the White Terror that followed. I do not know his biography well enough to say how that experience motivated his life’s reasoning. I would not want to say that 1919 explained it all. The logic of a life is messier than that. I bring it up in part to fight the tendency of online biographic sketches to write as though he popped into existence, calculated a while, inspired a few jokes, and vanished. And to reiterate that even mathematics never exists without context. Even what seems to be a pure question about an abstract idea of a game is often inspired by a practical question. And that work is always done in a context that affects how we evaluate it.


Thank you all for reading. This grew a bit more serious than I had anticipated. This and all the other 2020 A-to-Z essays should appear at this link. Both the 2020 and all past A-to-Z essays should be at this link.

I am hosting the Playful Math Education Blog Carnival at the end of September, so I would appreciate any educational or recreational or fun mathematics material you know about. I’m hoping to publish next week and so hope that you can help me this week.

And, finally, I am open for mathematics topics starting with P, Q, and R to write about next month. I should be writing about them this month and getting ahead of deadline, but that seems not to be happening.

Reading the Comics, July 22, 2017: Counter-mudgeon Edition


I’m not sure there is an overarching theme to the past week’s gifts from Comic Strip Master Command. If there is, it’s that I feel like some strips are making cranky points and I want to argue against their cases. I’m not sure what the opposite of a curmudgeon is. So I shall dub myself, pending a better idea, a counter-mudgeon. This won’t last, as it’s not really a good name, but there must be a better one somewhere. We’ll see it, now that I’ve said I don’t know what it is.

Rabbits at a chalkboard. 'The result is not at all what we expected, Von Thump. According to our calculations, parallel universes may exist, and we may also be able to link them with our own by wormholes that, in strictly mathematical terms, end up in a black top hat.'
Niklas Eriksson’s Carpe Diem for the 17th of July, 2017. First, if anyone isn’t thinking of that Pixar short then I’m not sure we can really understand each other. Second, ‘von Thump’ is a fine name for a bunny scientist and if it wasn’t ever used in the rich lore of Usenet group alt.devilbunnies I shall be disappointed. Third, Eriksson made an understandable but unfortunate mistake in composing this panel. While both rabbits are wearing glasses, they’re facing away from the viewer. It’s always correct to draw animals wearing eyeglasses, or to photograph them so. But we should get to see them in full eyeglass pelage. You’d think they would teach that in Cartoonist School or something.

Niklas Eriksson’s Carpe Diem for the 17th features the blackboard full of equations as icon for serious, deep mathematical work. It also features rabbits, although probably not for their role in shaping mathematical thinking. Rabbits and their breeding were used in the simple toy model that gave us Fibonacci numbers, famously. And the population of Arctic hares gives those of us who’ve reached differential equations a great problem to do. The ecosystem in which Arctic hares live can be modelled very simply, as hares and a generic predator. We can model how the populations of both grow with simple equations that nevertheless give us surprises. In a rich, diverse ecosystem we see a lot of population stability: one year where an animal is a little more fecund than usual doesn’t matter much. In the sparse ecosystem of the Arctic, and the one we’re building worldwide, small changes can matter enormously. We can even produce deterministic chaos, in which if we knew exactly how many hares and predators there were, and exactly how many of them would be born and exactly how many would die, we could predict future populations. But the tiny difference between our attainable estimate and the reality, even if it’s as small as one hare too many or too few in our model, makes our predictions worthless. It’s thrilling stuff.
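If you want to watch those surprises develop, here’s a minimal sketch of the classic hares-and-generic-predator (Lotka-Volterra) model, stepped forward by Euler’s method. Every parameter is invented; the out-of-phase rise and fall of the two populations is the point:

```python
# A minimal sketch of the hare/predator model, all parameters invented.
# Hares breed; predators eat hares; predators starve without hares.
hares, predators = 40.0, 9.0
birth, predation, nourishment, death = 0.1, 0.02, 0.01, 0.1
dt = 0.01

for step in range(10_000):
    dh = (birth * hares - predation * hares * predators) * dt
    dp = (nourishment * hares * predators - death * predators) * dt
    hares, predators = hares + dh, predators + dp
    if step % 2_000 == 0:
        print(f"t = {step * dt:5.1f}   hares = {hares:6.2f}   predators = {predators:5.2f}")
```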

Vic Lee’s Pardon My Planet for the 17th reads, to me, as a word problem joke. The talk about how much change Marian should get back from Blake could be any kind of minor hassle in the real world where one friend covers the cost of something for another but expects to be repaid. But counting how many more nickels one person has than another? That’s of interest to kids and to story-problem authors. Who else worries about that count?

Fortune teller: 'All of your money problems will soon be solved, including how many more nickels Beth has than Jonathan, and how much change Marian should get back from Blake.'
Vic Lee’s Pardon My Planet for the 17th of July, 2017. I am surprised she had no questions about how many dimes Jonathan must have, although perhaps that will follow obviously from knowing the Beth nickel situation.

Jef Mallet’s Frazz for the 17th straddles that triple point joining mathematics, philosophy, and economics. It seems sensible, in an age that embraces the idea that everything can be measured, to try to quantify happiness. And it seems sensible, in age that embraces the idea that we can model and extrapolate and act on reasonable projections, to try to see what might improve our happiness. This is so even if it’s as simple as identifying what we should or shouldn’t be happy about. Caulfield is circling around the discovery of utilitarianism. It’s a philosophy that (for my money) is better-suited to problems like how ought the city arrange its bus lines than matters too integral to life. But it, too, can bring comfort.

Corey Pandolph’s Barkeater Lake rerun for the 20th features some mischievous arithmetic. I’m amused. It turns out that people do have enough of a number sense that very few people would let “17 plus 79 is 4,178” pass without comment. People might not be able to say exactly what it is, on a glance. If you answered that 17 plus 79 was 95, or 102, most people would need to stop and think about whether either was right. But they’re likely to know without thinking that it can’t be, say, 56 or 206. This, I understand, is so even for people who aren’t good at arithmetic. There is something amazing that we can do this sort of arithmetic so well, considering that there’s little obvious in the natural world that would need the human animal to add 17 and 79. There are things about how animals understand numbers which we don’t know yet.

Alex Hallatt’s Human Cull for the 21st seems almost a direct response to the Barkeater Lake rerun. Somehow “making change” is treated as the highest calling of mathematics. I suppose it has a fair claim to the title of mathematics most often done. Still, I can’t get behind Hallatt’s crankiness here, and not just because Human Cull is one of the most needlessly curmudgeonly strips I regularly read. For one, store clerks don’t need to do mathematics. The cash registers do all the mathematics that clerks might need to do, and do it very well. The machines are cheap, fast, and reliable. Not using them is an affectation. I’ll grant it gives some charm to antiques shops and boutiques where they write your receipt out by hand, but that’s for atmosphere, not reliability. And it is useful for the clerk to have a rough idea what the change should be. But that’s just to avoid the risk of mistakes getting through. No matter how mathematically skilled the clerk is, there’ll sometimes be a price entered wrong, or the customer’s money counted wrong, or a one-dollar bill put in the five-dollar bill’s tray, or a clerk picking up two nickels when three would have been more appropriate. We should have empathy for the people doing this work.

Reading the Comics, June 25, 2016: What The Heck, Why Not Edition


I had figured to do Reading the Comics posts weekly, and then last week went and gave me too big a flood of things to do. I have no idea what the rest of this week is going to look like. But given that I had four strips dated before last Sunday I’m going to err on the side of posting too much about comic strips.

Scott Metzger’s The Bent Pinky for the 24th uses mathematics as something that dogs can be adorable about not understanding. Thus all the heads tilted, as if it were me in a photograph. The graph here is from economics, which has long had a challenging relationship with mathematics. This particular graph is qualitative; it doesn’t exactly match anything in the real world. But it helps one visualize how we might expect changes in the price of something to affect its sales. A graph doesn’t need to be precise to be instructional.

Dave Whamond’s Reality Check for the 24th is this essay’s anthropomorphic-numerals joke. And it’s a reminder that something can be quite true without being reassuring. It plays on the difference between “real” numbers and things that really exist. It’s hard to think of a way that a number such as two could “really” exist that doesn’t also allow the square root of -1 to “really” exist.

And to be a bit curmudgeonly, it’s a bit sloppy to speak of “the square root of negative one”, even though everyone does. It’s all right to expand the idea of square roots to cover stuff it didn’t before. But there’s at least two numbers that would, squared, equal -1. We usually call them i and -i. Square roots naturally have this problem. Both +2 and -2, squared, give us 4. We pick out “the” square root by selecting the positive one of the two. But neither i nor -i is “positive”. (Don’t let the – sign fool you. It doesn’t count.) You can’t say either i or -i is greater than zero. It’s not possible to define a “greater than” or “less than” for complex-valued numbers. And that’s even before we get into quaternions, in which we summon two more “square roots” of -1 into existence. Octonions can be even stranger. I don’t blame 1 for being worried.
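Python’s built-in complex numbers will demonstrate both halves of this, if you care to see it:

```python
# Both i and -i square to -1, and the language refuses to rank them.
print((1j) ** 2, (-1j) ** 2)   # both equal -1 (any signed zero in the
                               # imaginary part is a floating-point artifact)
try:
    1j < -1j
except TypeError as err:
    print("no ordering on complex numbers:", err)
```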

Zach Weinersmith’s Saturday Morning Breakfast Cereal for the 24th is a pleasant bit of pop-mathematics debunking. I’ve explained in the past how I’m a doubter of the golden ratio. The Fibonacci Sequence has a bit more legitimate interest to it. Those are sequences of numbers in which the next term is the sum of the previous two terms. The famous one is 1, 1, 2, 3, 5, 8, 13, 21, et cetera. It may not surprise you to know that the Fibonacci Sequence has a link to the golden ratio. As it goes on, the ratio between one term and the next one gets close to the golden ratio.
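You can watch that ratio settle down in a few lines of code, if you like:

```python
# Ratios of consecutive Fibonacci numbers, closing in on the golden
# ratio, (1 + sqrt(5)) / 2, about 1.6180339887.
a, b = 1, 1
for _ in range(12):
    a, b = b, a + b
    print(b / a)
```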

The Harmonic Series is much more deeply weird. A series is the number we get from adding together everything in a sequence. The Harmonic Series grows out of the first sequence you’d imagine ever adding up. It’s 1 plus 1/2 plus 1/3 plus 1/4 plus 1/5 plus 1/6 plus … et cetera. The first time you hear of this you get the surprise: this sum doesn’t ever stop piling up. We say it ‘diverges’. It won’t on your computer; the floating-point arithmetic it does won’t let you add enormous numbers like ‘1’ to tiny numbers like ‘1/531,325,263,953,066,893,142,231,356,120’ and get the right answer. But if you actually added this all up, it would.

The proof gets a little messy. But it amounts to this: 1/2 plus 1/3 plus 1/4? That’s more than 1. 1/5 + 1/6 + 1/7 + 1/8 + 1/9 + 1/10 + 1/11 + 1/12? That’s also more than 1. 1/13 + 1/14 + 1/15 + et cetera up through + 1/32 + 1/33 + 1/34 is also more than 1. You need to pile up more and more terms each time, but a finite string of these numbers will add up to more than 1. So the whole series has to be more than 1 + 1 + 1 + 1 + 1 … and so more than any finite number.
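Here’s a sketch that builds those blocks greedily, adding terms until each block’s sum passes 1. It reproduces the essay’s groupings, and shows each block needing roughly e times as many terms as the one before:

```python
# Grouping the harmonic series into blocks that each add up to more than 1.
n = 2
for block in range(6):
    total, start = 0.0, n
    while total <= 1.0:
        total += 1.0 / n
        n += 1
    print(f"1/{start} + ... + 1/{n - 1} = {total:.4f}")
```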

That’s all amazing enough. And then the series goes on to defy all kinds of intuition. Obviously dropping a couple of terms from the series won’t change whether it converges or diverges. Multiplying alternating terms by -1, so you have (say) 1 – 1/2 + 1/3 – 1/4 + 1/5 et cetera, produces something that does converge. It equals the natural logarithm of 2. But if you take those terms and rearrange them, you can produce any real number, positive or negative, that you want.

And, as Weinersmith describes here, if you just skip the correct set of terms, you can make the sum converge. The ones with 9 in the denominator will be, then, 1/9, 1/19, 1/29, 1/90, 1/91, 1/92, 1/290, 1/999, those sorts of things. Amazing? Yes. Absurd? I suppose so. This is why mathematicians learn to be very careful when they do anything, even addition, infinitely many times.
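Here’s a sketch of that depleted sum — it’s known as the Kempner series — dropping every term whose denominator contains a 9. The partial sums creep upward toward a finite limit, about 22.92, though with glacial slowness:

```python
# Summing 1/n only over denominators with no digit 9. The partial sums
# keep growing, but they are heading for a finite limit, about 22.92.
total = 0.0
n = 0
for power in range(1, 7):
    while n < 10 ** power:
        n += 1
        if '9' not in str(n):
            total += 1.0 / n
    print(f"partial sum through {10 ** power:>9,}: {total:.4f}")
```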

John Deering’s Strange Brew for the 25th is a fear-of-mathematics joke. The sign the warrior’s carrying is legitimate algebra, at least so far as it goes. The right-hand side of the equation gets cut off. In time, it would get to the conclusion that x equals –19/2, or -9.5.

Reading the Comics, July 24, 2015: All The Popular Topics Are Here Edition


This week all the mathematically-themed comic strips seem to have come from Gocomics.com. Since that gives pretty stable URLs I don’t feel like I can include images of those comics. So I’m afraid it’s a bunch of text this time. I like to think you enjoy reading the text, though.

Mark Anderson’s Andertoons seemed to make its required appearance here with the July 20th strip. And the kid’s right about parentheses being very important in mathematics and “just” extra information in ordinary language. Parentheses as a way of grouping together terms appear as early as the 16th century, according to Florian Cajori. But the symbols wouldn’t become common for a couple of centuries. Cajori speculates that the use of parentheses in normal rhetoric may have slowed mathematicians’ acceptance of them. Vinculums — lines placed over a group of terms — and colons before and after the group seem to have been more popular. Leonhard Euler would use parentheses a good bit, and that settled things. Besides all his other brilliances, Euler was brilliant at setting notation. There are still other common ways of aggregating terms. But most of them are straight brackets or curled braces, which are almost the smallest possible changes from parentheses you can make.

Though his place was secure, Mark Anderson got in another strip the next day. This one’s based on the dangers of extrapolating mindlessly. One trouble with extrapolation is that if we just want to match the data we have then there are literally infinitely many possible extrapolations, each equally valid. But most of them are obvious garbage. If the high temperature the last few days was 78, 79, 80, and 81 degrees Fahrenheit, it may be literally true that we could extrapolate that to a high of 120,618 degrees tomorrow, but we’d be daft to believe it. If we understand the factors likely to affect our data we can judge what extrapolations are plausible and what ones aren’t. As ever, sanity checking, verifying that our results could be correct, is critical.
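To make that 120,618-degree forecast literal, here’s a sketch with NumPy. A degree-four polynomial passes exactly through the four observed highs and through the absurd fifth value; the data alone can’t rule it out:

```python
# Fitting a polynomial that matches the observed highs of 78, 79, 80, 81
# exactly -- and still "predicts" 120,618 degrees on day five.
import numpy as np

days = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
highs = np.array([78.0, 79.0, 80.0, 81.0, 120_618.0])
coeffs = np.polyfit(days, highs, 4)       # five points pin down a quartic exactly

print(np.polyval(coeffs, days[:4]))       # reproduces 78, 79, 80, 81
print(np.polyval(coeffs, 5.0))            # and yet: 120618.0
```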

Bill Amend’s FoxTrot Classics (July 20) continues Jason’s attempts at baking without knowing the unstated assumptions of baking. See above comments about sanity checking. At least he’s ruling out the obviously silly rotation angle. (The strip originally ran the 22nd of July, 2004. You can see it in color, there, if you want to see things like that.) Some commenters have gotten quite worked up about Jason saying “degrees Kelvin” when he need only say “Kelvin”. I can’t join them. Besides the phenomenal harmlessness of saying “degrees Kelvin”, it wouldn’t quite flow for Jason to think “350 degrees” short for “350 Kelvin” instead of “350 degrees Kelvin”.

Nate Frakes’s Break of Day (July 21) is the pure number wordplay strip for this roundup. This might be my favorite of this bunch, mostly because I can imagine the way it would be staged as a bit on The Muppet Show or a similar energetic and silly show. John Atkinson’s Wrong Hands for July 23 is this roundup’s general mathematics wordplay strip. And Mark Parisi’s Off The Mark for July 22nd is the mathematics-literalist strip for this roundup.

Ruben Bolling’s Tom The Dancing Bug (July 23, rerun) is nominally an economics strip. Its premise is that since rational people do what maximizes their reward for the risk involved, then pointing out clearly how the risks and possible losses have changed will change their behavior. Underlying this are assumptions from probability and statistics. The core is the expectation value. That’s an average of what you might gain, or lose, from the different outcomes of something. That average is weighted by the probability of each outcome. A strictly rational person who hadn’t ruled anything in or out would be expected to do the thing with the highest expected gain, or the smallest expected loss. That people do not do things this way vexes folks who have not known many people.
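The expectation value computation is small enough to show whole. A sketch, with an invented gamble:

```python
# Expected value: each outcome weighted by its probability, then summed.
# An invented gamble: win $10 at probability 0.2, win $1 at 0.5, lose $5 at 0.3.
outcomes = [10.0, 1.0, -5.0]
probabilities = [0.2, 0.5, 0.3]
expected = sum(p * x for p, x in zip(probabilities, outcomes))
print(expected)   # 2.0 + 0.5 - 1.5 = 1.0: the gamble gains a dollar, on average
```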

Realistic Modeling


“Economic Realism (Wonkish)”, a blog entry by Paul Krugman in The New York Times, discusses a paper, “Chameleons: The Misuse Of Mathematical Models In Finance And Economics”, by Paul Pfleiderer of Stanford University, which surprises me by including a color picture of a chameleon right there on the front page, and in an academic paper at that, and I didn’t know you could have color pictures included just for their visual appeal in academia these days. Anyway, Pfleiderer discusses the difficulty of what they term filtering, making sure that the assumptions one makes to build a model — which are simplifications and abstractions of the real-world thing in which you’re interested — aren’t too far out of line with the way the real thing behaves.

This challenge, which I think of as verification or validation, is important when you deal with pure mathematical or physical models. Some of that will be at the theoretical stage: is it realistic to model a fluid as if it had no viscosity? Unless you’re dealing with superfluid helium or something exotic like that, no, but you can do very good work that isn’t too far off. Or there’s a classic model of the way magnetism forms, known as the Ising model, which in a very special case — a one-dimensional line — is simple enough that a high school student could solve it. (Well, a very smart high school student, one who’s run across an exotic function called the hyperbolic cosine, could do it.) But that model is so simple that it can’t capture the phase change, the one where warming a magnet past a critical temperature makes it stop being magnetic. Is the model no good? If you aren’t interested in the phase change, it can still serve.
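For the curious, here is a sketch of the calculation that very smart high school student would do, for the open chain of N spins with no external field. Summing over each spin in turn, every bond contributes a factor of 2\cosh(\beta J) :

Z = \sum_{s_1 = \pm 1} \cdots \sum_{s_N = \pm 1} \exp\left( \beta J \sum_{i=1}^{N-1} s_i s_{i+1} \right) = 2 \left( 2\cosh(\beta J) \right)^{N-1}

That expression is a smooth function of temperature for any number of spins, and it stays smooth even as the number of spins grows without bound, which is the formal way of seeing that the one-dimensional model never produces the phase change.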

And then there is the numerical stage: if you’ve set up a computer program that is supposed to represent fluid flow, does it correctly find solutions? I’ve heard it claimed that the majority of time spent on a numerical project is spent in validating the results, and that isn’t even simply in finding and fixing bugs in the code. Even once the code is doing perfectly what we mean it to do, it must be checked that what we mean it to do is relevant to what we want to know.

Pfleiderer’s is an interesting paper and I think worth the read; despite its financial mathematics focus (and a brief chat about quantum mechanics) it doesn’t require any particularly specialized training. There are some discussions of particular financial models, but what’s important are the assumptions being made behind those models, and those are intelligible without prior training in the field.
