My 2018 Mathematics A To Z: Yamada Polynomial


I had another free choice. I thought I’d go back to one of the topics I knew and loved in grad school even though I didn’t have the time to properly study it then. It turned out I had forgotten some important points and spent a night crash-relearning knot theory. This isn’t a bad thing necessarily.

Cartoon of a thinking coati (it's a raccoon-like animal from Latin America); beside him are spelled out on Scrabble tiles, 'MATHEMATICS A TO Z', on a starry background. Various arithmetic symbols are constellations in the background.
Art by Thomas K Dye, creator of the web comics Newshounds, Something Happens, and Infinity Refugees. His current project is Projection Edge. And you can get Projection Edge six months ahead of public publication by subscribing to his Patreon. And he’s on Twitter as @Newshoundscomic.

Yamada Polynomial.

This is a thing which comes from graphs. Not the graphs you drew in algebra class; graphs as in graph theory. These are figures made of spots called vertices, with pairs of vertices connected by edges. There are many interesting things to study about these.

One path to take in understanding graphs is polynomials. Of course I would bring things back to polynomials. But there’s good reasons. These reasons come to graph theory by way of knot theory. That’s an interesting development since we usually learn graph theory before knot theory. But knot theory has the idea of representing these complicated shapes as polynomials.

There are a bunch of different polynomials for any given knot or graph. The oldest kind, the Alexander Polynomial, J W Alexander developed in the 1920s. And that was about it until the 1980s when suddenly everybody was coming up with good new polynomials. The definitions are different. They give polynomials that look different. Some are able to distinguish between a knot and the knot that’s its reflection across a mirror. Some, like the Alexander, aren’t. But they’re alike in some important ways. One is that they might not actually be, you know, polynomials. I mean, they’ll be the sum of numbers — whole numbers, even — times a variable raised to a power. The variable might be t, might be x. Might be something else, but it doesn’t matter. It’s a pure dummy variable. But the variable might be raised to a negative power, which isn’t really a polynomial. It might even be raised to, oh, one-half or three-halves, or minus nine-halves, or something like that. We can try saying this is “a polynomial in t-to-the-halves”. Mostly it’s because we don’t have a better name for it.

And going from a particular knot to a polynomial follows a pretty common procedure. At least it can, when you’re learning knot theory and feel a bit overwhelmed trying to prove stuff about “knot invariants” and “homologies” and all. Having a specific example can be such a comfort. You can work this out by an iterative process. Take a specific drawing of your knot. There are places where the strands of the knot cross over one another. For each of those crossings you ponder some alternate cases where the strands cross over in a different way. And then you add together some coefficient times the polynomial of this new, different knot. The coefficient you get by the rules of whatever polynomial you’re making. The new, different knots are, usually, no more complicated than what you started with. They’re often simpler knots. This is what saves you from an eternity of work. You’re breaking the knot down into more but simpler knots. Just the fact of doing that can be satisfying enough. Eventually you get to something really simple, like a circle, and declare that’s some basic polynomial. Then there’s a fair bit of adding up coefficients and powers and all that. Tedious but not hard.

Knots are made from a continuous loop of … we’ll just call it thread. It can fold over itself many times. It has to, really, or it hasn’t got a chance of being more interesting than a circle. A graph is different. That there are vertices seems to change things. Less than you’d think, though. The thread of a knot can cross over and under itself. Edges of a graph can cross over and under other edges. This isn’t too different. We can also imagine replacing a spot where two edges cross over and under each other with an intersection at a new vertex.

So we get to the Yamada polynomial by treating a graph an awful lot like we might treat a knot. Take the graph and split it up at each overlap. At each overlap we have something that looks, at least locally, kind of like an X. It has upper left, upper right, lower left, and lower right branches. The lower left connects to the upper right, and the upper left connects to the lower right. But these two edges don’t actually touch; one passes over the other. (By convention, the lower left going to the upper right is on top.)

There are three alternate graphs. One has the upper left connected to the lower left, and the upper right connected to the lower right. This looks like replacing the X with a )( pattern. The second alternate has the upper left connected to the upper right, and the lower left connected to the lower right. This looks like … well, that )( but rotated ninety degrees. I can’t show that without actually including a picture. The third alternate puts a vertex in the X. So now the upper left, upper right, lower left, and lower right all connect to the new vertex in the center.

Probably you’d agree that replacing the original X with a )( pattern, or its rotation, probably doesn’t make the graph any more complicated. And it might make the graph simpler. But adding that new vertex looks like trouble. It looks like it’s getting more complicated. We might get stuck in an infinite regression of more-complicated polynomials.

What saves us is the coefficient we’re multiplying the polynomials for these new graphs by. It’s called the “chromatic coefficient” and it reflects how many different colors you need to color in this graph. An edge needs to connect vertices of two different colors. And — what happens if an edge connects a vertex to itself? That is, the edge loops around back to where it started? That’s got a chromatic number of zero and the moment we get a single one of these loops anywhere in our graph we can stop calculating. We’re done with that branch of the calculations. This is what saves us.
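The Yamada recursion itself has more cases than I can fit here, but the break-into-simpler-graphs spirit, self-loop short-circuit included, is exactly how the chromatic polynomial gets computed by deletion–contraction. A minimal Python sketch, representing a graph as a vertex set plus an edge list:

```python
def chromatic_polynomial(vertices, edges, k):
    """Count proper k-colorings by deletion-contraction:
    P(G) = P(G minus an edge) - P(G with that edge contracted)."""
    # A self-loop means no proper coloring exists: stop this branch cold.
    if any(u == v for u, v in edges):
        return 0
    if not edges:
        return k ** len(vertices)   # every vertex colored freely
    (u, v), rest = edges[0], edges[1:]
    deleted = chromatic_polynomial(vertices, rest, k)
    # Contract the edge: merge v into u throughout the remaining edges.
    merged = [(u if a == v else a, u if b == v else b) for a, b in rest]
    contracted = chromatic_polynomial(vertices - {v}, merged, k)
    return deleted - contracted

# A triangle needs k(k-1)(k-2) colorings:
print(chromatic_polynomial({1, 2, 3}, [(1, 2), (2, 3), (1, 3)], 3))  # 6
```

Each recursive call has strictly fewer edges, so the recursion bottoms out; the self-loop check is the short-circuit described above.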

There’s a catch. It’s a catch that knot polynomials have, too. This scheme writes a polynomial not just for a particular graph but a particular way of rendering this graph. There are always other ways to draw it. If nothing else you can always twirl an edge over itself, into a loop like you get when Christmas tree lights start tangling themselves up. But you can move the vertices to different places. You can have an edge go outside the rest of the figure instead of inside, that sort of thing. Starting from a different rendition of the shape gets you to a different polynomial.

Superficially different, anyway. What you get from two different renditions of the same graph are polynomials different by your dummy variable raised to a whole number. Also maybe a plus-or-minus sign. You can see a difference between, say, t^{-1} - 2 + 3t (to make up an example) and t - 2t^2 + 3t^3 . But you can see that second polynomial is just t^2\left(t^{-1} - 2 + 3t\right) . It’s some confounding factor times something that is distinctive to the graph.
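Stripping out that confounding factor is easy to automate. A sketch, storing a Laurent polynomial as a dictionary from exponents to coefficients (the representation is my own convenience, nothing standard):

```python
def normalize(poly):
    """Shift a Laurent polynomial {exponent: coefficient} so its lowest
    surviving power is t^0, and flip signs so that coefficient is positive.
    Two renditions of the same graph should then normalize identically."""
    trimmed = {e: c for e, c in poly.items() if c != 0}
    low = min(trimmed)
    sign = 1 if trimmed[low] > 0 else -1
    return {e - low: sign * c for e, c in trimmed.items()}

p = {-1: 1, 0: -2, 1: 3}   # t^-1 - 2 + 3t
q = {1: 1, 2: -2, 3: 3}    # t - 2t^2 + 3t^3, which is t^2 times p
print(normalize(p) == normalize(q))  # True
```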

And that distinctive part, the thing that doesn’t change if you draw the graph differently? That’s the Yamada polynomial, at last. It’s a way to represent this collection of vertices and edges using only coefficients and exponents.

I would like to give an impressive roster of uses for these polynomials here. I’m afraid I have to let you down. There is the obvious use: if you suspect two graphs are really the same, despite how different they look, here’s a test. Calculate their Yamada polynomials and if they’re different, you know the graphs were different. It can be hard to tell. Get anything with more than, say, eight vertices and 24 edges in it and you’re not going to figure that out by sight.

I encountered the Yamada polynomial specifically as part of a textbook chapter about chemistry. It’s easy to imagine there should be great links between knots and graphs and the way that atoms bundle together into molecules. The shape of their structures describes what they will do. But I am not enough of a chemist to say how this description helps chemists understand molecules. It’s possible that it doesn’t: Yamada’s paper introducing the polynomial was published in 1989. My knot theory textbook might have brought it up because it looked exciting. There are trends and fashions in mathematical thought too. I don’t know what several more decades of work have done to the polynomial’s reputation. I’m glad to hear from people who know better.


There’s one more term in the Fall 2018 Mathematics A To Z to come. Will I get the article about it written before Friday? We’ll know on Saturday! At least I don’t have more Reading the Comics posts to write before Sunday.

Playful Mathematics Education Blog Carnival #121


Greetings one and all! Come, gather round! Wonder and spectate and — above all else — tell your friends of the Playful Mathematics Blog Carnival! Within is a buffet of delights and treats, fortifications for the mind and fire for the imagination.

121 is a special number. When I was a mere tot, growing in the wilds of suburban central New Jersey, it stood there. It held a spot of privilege in the multiplication tables on the inside front cover of composition books. On the forward diagonal, yet insulated from the borders. It anchors the safe interior. A square number, eleventh of that set in the positive numbers.

Cartoon of several circus tents, with numbered flags above them and balloons featuring arithmetic symbols. The text, in a carnival-poster font, is 'PLAYFUL MATH EDUCATION CARNIVAL'.
Art by Thomas K Dye, creator of the web comics Newshounds, Something Happens, and Infinity Refugees. His current project is Projection Edge. And you can get Projection Edge six months ahead of public publication by subscribing to his Patreon. And he’s on Twitter as @Newshoundscomic.

The First Tent

The first wonder to consider is Iva Sallay’s Find the Factors blog. She brings each week a sequence of puzzles, all factoring challenges. The result of each, done right, is a scrambling of the multiplication tables; it’s up to you the patron to find the scramble. She further examines each number in turn, finding its factors and its interesting traits. And, usually, the beginning of a new century of numbers opens a horserace, to see which of them has the greatest number of factorizations. She was, furthermore, the host of this Playful Mathematics Education Carnival for August of 2018.

121 is more than just a square. It is one of only two squares that are the sum of the first several powers of a prime number: it is 1 + 3 + 3^2 + 3^3 + 3^4 , a fantastic combination. The only other is 400, which is 1 + 7 + 7^2 + 7^3 ; Ljunggren proved back in 1943 that no third example exists.
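The skeptical patron can rediscover both numbers with a blunt search. A sketch, hedged: it only examines primes below 200 and sums below 10^30, so it is evidence rather than proof:

```python
from math import isqrt

def is_prime(n):
    return n > 1 and all(n % d for d in range(2, isqrt(n) + 1))

hits = []
for p in filter(is_prime, range(2, 200)):
    total, power = 1 + p, p          # two terms so far: 1 + p
    while total < 10 ** 30:
        power *= p
        total += power               # at least 1 + p + p^2 from here on
        if isqrt(total) ** 2 == total:
            hits.append((p, total))
print(hits)   # [(3, 121), (7, 400)]
```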

We look now for a moment at some astounding animals. From the renowned Dr Nic: Introducing Cat Maths cards, activities, games and lessons — a fine collection of feline companions, such toys as will entertain them. A dozen attributes each; twenty-seven value cards. These cats, and these cards, and these activity puzzles, promise games and delights, to teach counting, subtraction, statistics, and inference!

Next and no less incredible is the wooly Mathstodon. Christian Lawson-Perfect hosts this site, an instance of the open-source Twitter-like service Mastodon. Its focus: a place for people interested in mathematics to write of what they know. To date over 1,300 users have joined, and have shared nearly 25,000 messages. You need not join to read many of these posts — your host here has yet to — but may sample its wares as you like.


The Second Tent

121 is one of only two perfect squares known to be four less than the cube of a whole number. The great Fermat asserted that 4 and 121 are the only such numbers; no one has ever found a counter-example, and the assertion has since been proven true.
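Checking small cases takes only a moment, though of course a finite search is no substitute for the proof:

```python
from math import isqrt

# Look for perfect squares exactly four less than a cube: n^2 = m^3 - 4.
found = []
for m in range(2, 100_000):
    target = m ** 3 - 4
    root = isqrt(target)
    if root * root == target:
        found.append((m, root))
print(found)   # [(2, 2), (5, 11)]: 4 = 8 - 4 and 121 = 125 - 4
```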

Friends, do you know the secret to popularity? There is an astonishing truth behind it. Elias Worth of the MathSection blog explains the Friendship Paradox. This mind-warping phenomenon tells us your friends have more friends than you do. It will change forever how you look at your followers and following accounts.

And now to thoughts of learning. Stepping forward now is Monica Utsey, @Liveonpurpose47 of Chocolate Covered Boy Joy. Her declaration: “I incorporated Montessori Math materials with my right brain learner because he needed literal representations of the work we were doing. It worked and we still use it.” See now for yourself the representations, counting and comparing and all the joys of several aspects of arithmetic.

Take now a moment for your own fun. Blog Carnival patron and organizer Denise Gaskins wishes us to know: “The fun of mathematical coloring isn’t limited to one day. Enjoy these coloring resources all year ’round!” Happy National Coloring Book Day offers the title, and we may keep the spirit of National Coloring Book Day all the year round.

Confident in that? Then take on a challenge. Can you scroll down faster than Christian Lawson-Perfect’s web site can find factors? Prove your speed, prove your endurance, and see if you can overcome this infinite scroll.


The Third Tent

121 is a star number, the fifth of that select set. 121 identical items can be tiled to form a centered hexagram, a six-pointed star. You may have seen it in the German game of Chinese Checkers, as its board has 121 holes.
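Star numbers follow the formula 6n(n - 1) + 1, six triangular points wrapped around a hexagonal core, so a list comprehension confirms 121's place in line:

```python
# The n-th star number: 6*n*(n-1) + 1.
star = [6 * n * (n - 1) + 1 for n in range(1, 8)]
print(star)      # [1, 13, 37, 73, 121, 181, 253]
print(star[4])   # 121, fifth of that select set
```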

We come back again to teaching. “Many homeschoolers struggle with teaching their children math. Here are some tips to make it easier”, offers Denise Gaskins. Step forth and benefit from this FAQ: Struggling with Arithmetic, a collection of tips and thoughts and resources to help make arithmetic the more manageable.

Step now over to the arcade, and to the challenge of Pac-Man. This humble circle-inspired polygon must visit the entirety of a maze, and avoid ghosts as he does. Matthew Scroggs of Chalk Dust Magazine here seeks and shows us Optimal Pac-Man. Graph theory tells us there are thirteen billion different paths to take. Which of them is shortest? Which is fastest? Can it be known, and can it help you through the game?

And now a recreation, one to become useful if winter arrives. Think of the mysteries of the snowball rolling down a hill. How does it grow in size? How does it speed up? When does it stop? Rodolfo A Diaz, Diego L Gonzalez, Francisco Marin, and R Martinez satisfy your curiosity with Comparative kinetics of the snowball respect to other dynamical objects. Be warned! This material is best suited for the college-age student of the mathematical snow sciences.


The Fourth Tent

121 is furthermore the sixth of the centered octagonal numbers. 121 of a thing may be set into six concentric octagons of one, then two, then three, then four, then five, and then six of them on a side.
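Centered octagonal numbers follow 4n(n - 1) + 1, and they hide a bonus pattern: every one is the square of an odd number, since 4n^2 - 4n + 1 = (2n - 1)^2. A quick check:

```python
from math import isqrt

# The first six centered octagonal numbers, O(n) = 4*n*(n-1) + 1.
octagonal = [4 * n * (n - 1) + 1 for n in range(1, 7)]
print(octagonal)   # [1, 9, 25, 49, 81, 121]
# Each one is an odd number squared:
print(all(isqrt(o) ** 2 == o and isqrt(o) % 2 == 1 for o in octagonal))  # True
```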

To teach is to learn! And we have here an example of such learning. James Sheldon writing for the American Mathematical Society Graduate Student blog offers Teaching Lessons from a Summer of Taking Mathematics Courses. What secrets has Sheldon to reveal? Come inside and learn what you may.

And now step over to the games area. The game Entanglement wraps you up in knots, challenging you to find the longest knot possible. David Richeson of Division By Zero sees in this A game for budding knot theorists. What is the greatest score that could be had in this game? Can it ever be found? Only Richeson has your answer.

Step now back to the amazing Mathstodon. Gaze in wonder at the account @dudeney_puzzles. Since the September of 2017 it has brought out challenges from Henry Ernest Dudeney’s Amusements in Mathematics. Puzzles given, yes, with answers that follow along. The impatient may find Dudeney’s 1917 book on Project Gutenberg among other places.


The Fifth Tent

Sum the digits of 121; you will find that you have four. Take its prime factors, 11 and 11, and sum their digits; you will find that this is four again. This makes 121 a Smith number. These marvels of the ages were named by Albert Wilansky, in honor of his brother-in-law, a man known to history as Harold Smith, and whose telephone number of 4,937,775 was one such.
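The Smith test is a pleasant thing to code up. A sketch by trial division, good for numbers of modest size:

```python
def digit_sum(n):
    return sum(int(d) for d in str(n))

def prime_factors(n):
    """Prime factors with multiplicity, by trial division."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def is_smith(n):
    """A composite number whose digit sum matches the total digit sum
    of its prime factors, counted with multiplicity."""
    factors = prime_factors(n)
    if len(factors) < 2:      # primes (and 1) don't qualify
        return False
    return digit_sum(n) == sum(digit_sum(f) for f in factors)

print(is_smith(121))       # True: 1+2+1 = 4 = (1+1) + (1+1)
print(is_smith(4937775))   # True: Harold Smith's telephone number
```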

Now let us consider terror. What is it to enter a PhD program? Many have attempted it; some have made it through. Mathieu Besançon gives to you a peek behind academia’s curtain. A year in PhD describes some of this life.

And now to an astounding challenge. Imagine an assassin readies your death. Can you protect yourself? At all? Tai-Danae Bradley invites you to consider: Is the Square a Secure Polygon? This question takes you on a tour of geometries familiar and exotic. Learn how mathematicians consider how to walk between places on a torus — and the lessons this has for a square room. The fate of the universe itself may depend on the methods described herein — the techniques used to study it relate to those that study whether a physical system can return to its original state. And then J2kun turned this into code, Visualizing an Assassin Puzzle, for those who dare to program it.

Have you overcome this challenge? Then step into the world of linear algebra, and this delight from the Mathstodon account of Christian Lawson-Perfect. The puzzle is built on the wonders of eigenvectors, those marvels of matrix multiplication. They emerge from multiplication longer or shorter but unchanged in direction. Lawson-Perfect uses whole numbers, represented by Scrabble tiles, and finds a great matrix with a neat eigenvalue. Can you prove that this is true?


The Sixth Tent

Another wonder of the digits of 121. Take them apart, then put them together again. Contorted into the form 11^2 they represent the same number. 121 is, in the base ten commonly used in the land, a Friedman Number, second of that line. These marvels, in the Arabic, the Roman, or even the Mayan numeral schemes, are named for Erich Friedman, a figure of mystery from Stetson University.
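A real Friedman search allows any arithmetic expression built from a number's own digits. This toy sketch, a helper of my own invention, only tries splitting the digits into two numbers joined by a single operation, which is already enough to catch 121 = 11^2:

```python
from itertools import permutations

def friedman_witness(n):
    """Look for a nontrivial two-number expression made of n's own digits
    that evaluates back to n. A toy: full Friedman searches allow
    arbitrarily nested expressions."""
    digits = str(n)
    for perm in permutations(digits):
        for cut in range(1, len(perm)):
            a = int(''.join(perm[:cut]))
            b = int(''.join(perm[cut:]))
            candidates = [(f'{a}+{b}', a + b), (f'{a}*{b}', a * b)]
            if b < 30:                       # keep powers manageable
                candidates.append((f'{a}**{b}', a ** b))
            for expr, value in candidates:
                # Reject trivialities such as a+0, a*1, or a**1.
                if value == n and value not in (a, b):
                    return expr
    return None

print(friedman_witness(121))   # 11**2
print(friedman_witness(25))    # 5**2, the first Friedman number
```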

We draw closer to the end of this carnival’s attractions! To the left I show a tool for those hoping to write mathematics: Donald E Knuth, Tracy Larrabee, and Paul M Roberts’s Mathematical Writing. It’s a compilation of thoughts about how one may write to be understood, or to avoid being misunderstood. Either would be a marvel for the ages.

To the right please see Gregory Taylor’s web comic Any ~Qs. Taylor — @mathtans on Twitter — brings a world of math-tans, personifications of mathematical concepts, together for adventures and wordplay. And if the strip is not to your tastes, Taylor is working on ε Project, a serialized written story with new installments twice a month.

If you will look above you will see the marvels of curved space. On YouTube, Eigenchris hopes to learn differential geometry, and shares what he has learned. While he has a series under way, he suggested Episode 15, ‘Geodesics and Christoffel Symbols’, as one that new viewers could usefully try. Episode 16, ‘Geodesic Examples on Plane and Sphere’, puts this work to good use.

And as we reach the end of the fairgrounds, please take a moment to try Find the Factors Puzzle number 121, a challenge from 2014 that still speaks to us today!

And do always stop and gaze in awe at the fantastic and amazing geometrical constructs of Robert Loves Pi. You shall never see stellations of its like elsewhere!


The Concessions Tent

With no thought of the risk to my life or limb I read the newspaper comics for mathematical topics they may illuminate! You may gape in awe at the results here. And furthermore this week and for the remainder of this calendar year of 2018 I dare to explain one and only one mathematical concept for each letter of our alphabet! I remind the sensitive patron that I have already done not one, not two, not three, but four previous entries all finding mathematical words for the letter “X” — will there be one come December? There is but one way you might ever know.

Denise Gaskins coordinates the Playful Mathematics Education Blog Carnival. Upcoming scheduled carnivals, including the chance to volunteer to host it yourself, or to recommend your site for mention, are listed here. And October’s 122nd Playful Mathematics Education Blog Carnival is scheduled to be hosted by Arithmophobia No More, and may this new host have the best of days!

Mathematics Stuff To Read Or Listen To


I concede January was a month around here that could be characterized as “lazy”. Not that I particularly skimped on the Reading the Comics posts. But they’re relatively easy to do: the comics tell me what to write about, and I could do a couple paragraphs on most anything, apparently.

While I get a couple things planned out for the coming month, though, here’s some reading for other people.

The above links to a paper in the Proceedings of the National Academy of Sciences. It’s about something I’ve mentioned when talking about knots before. And it’s about something everyone with computer cables or, like the tweet suggests, holiday lights finds. The things coil up. Spontaneous knotting of an agitated string by Dorian M Raymer and Douglas E Smith examines when these knots are likely to form, and how likely they are. It’s not a paper for the lay audience, but there are a bunch of fine pictures. The paper doesn’t talk about Christmas lights, no matter what the tweet does, but the mathematics carries over to this.

MathsByAGirl, meanwhile, had a post midmonth listing a couple of mathematics podcasts. I’m familiar with one of them, BBC Radio 4’s A Brief History of Mathematics, which was a set of ten-to-twenty-minute sketches of historically important mathematics figures. I’ll trust MathsByAGirl’s taste on other podcasts. I’d spent most of this month finishing off a couple of audio books (David Hackett Fischer’s Washington’s Crossing which I started listening to while I was in Trenton for a week, because that’s the sort of thing I think is funny, and Robert Louis Stevenson’s Doctor Jekyll and Mister Hyde And Other Stories) and so fell behind on podcasts. But now there’s some more stuff to listen forward to.

And then I’ll wrap up with this from KeplerLounge. It looks to be the start of some essays about something outside the scope of my Why Stuff Can Orbit series. (Which I figure to resume soon.) We start off talking about orbits as if planets were “point masses”. Which is what the name suggests: a mass that fills up a single point, with no volume, no shape, no features. This makes the mathematics easier. The mathematics is just as easy if the planets are perfect spheres, whether hollow or solid. But real planets are not perfect spheres. They’re a tiny bit blobby. And they’re a little lumpy as well. We can ignore that if we’re doing rough estimates of how orbits work. But if we want to get them right we can’t ignore that anymore. And this essay describes some of how we go about dealing with that.

The End 2016 Mathematics A To Z: Unlink


This is going to be a fun one. It lets me get into knot theory again.

Unlink.

An unlink is what knot theorists call that heap of loose rubber bands in that one drawer compartment.

The longer way around. It starts with knots. I love knots. If I were stronger on abstract reasoning and weaker on computation I’d have been a knot theorist. At least a graph theorist anyway. The mathematical idea of a knot is inspired by a string tied together. In making it a mathematical idea we perfect the string. It becomes as thin as a line, though it curves as much as we want. It can stretch out or squash down as much as we want. It slides frictionlessly against itself. Gravity doesn’t make it drop any. This removes the hassles of real-world objects from it. It also means actual strings or yarns or whatever can’t be knots anymore. Only something that’s a loop which closes back on itself can be a knot. The knot you might make in a shoelace, to use an example, could be undone by pushing the tip back through the ‘knot’. Since our mathematical string is frictionless we can do that, effortlessly. We’re left with nothing.

But you can create a pretty good approximation to a mathematical knot if you have some kind of cable that can be connected to its own end. Loop the thing around as you like, connect end to end, and you’ve got it. I recommend the glow sticks sold for people to take to parties or raves or the like. They’re fun. If you tie it up so that the string (rope, glow stick, whatever) can’t spread out into a simple O shape no matter how you shake it up (short of breaking the cable) then you have a knot. There are many of them. Trefoil knots are probably the easiest to get, but if you’re short on inspiration try looking at Celtic knot patterns.

But if the string can be shaken out until it’s a simple O shape, the sort of thing you can place flat on a table, then you have an unknot. Just from this vocabulary you see why I like the subject so. Since this hasn’t quite got silly enough, let me assure you that an unknot is itself a kind of knot; we call it the trivial knot. It’s the knot that’s too simple to be a knot. I’m sure you were worried about that. I only hear people call it an unknot, but maybe there are heritages that prefer “trivial knot”.

So that’s knots. What happens if you have more than one thing, though? What if you have a couple of string-loops? Several cables. We know these things can happen in the real world, since we’ve looked behind the TV set or the wireless router and we know there’s somehow more cables there than there are even things to connect.

Even mathematicians wouldn’t want to ignore something so caught up with real-world implications. And we don’t. We get to them after we’re pretty comfortable working with knots. Describing them, working out the theoretical tools we’d use to un-knot a proper knot (spoiler: we cut things), coming up with polynomials that describe them, that sort of thing. When we’re ready for a new trick, we consider what happens if we have several knots. We call this bundle of knots a “link”. Well, what would you call it?

A link is a collection of knots. By talking about a link we expect that at least some of the knots are going to loop around each other. This covers a lot of possibilities. We could picture one of those construction-paper chains, made of intertwined loops, that are good for elementary school craft projects to be a link. We can picture a keychain with a bunch of keys dangling from it to be a link. (Imagine each key is a knot, just made of a very fat, metal “string”. C’mon, you can give me that.) The mass of cables hiding behind the TV stand is not properly a link, since it’s not properly made out of knots. But if you can imagine taking the ends of each of those wires and looping them back to the origins, then the somehow vaster mess you get from that would be a link again.

And then we come to an “unlink”. This has two pieces. The first is that it’s a collection of knots, yes, but knots that don’t interlink. We can pull them apart without any of them tugging the others along. The second piece is that each of the knots is itself an unknot. Trivial knots. Whichever you like to call them.

The “unlink” also gets called the “trivial link”, since it’s as boring a link as you can imagine. Manifested in the real world, well, an unbroken rubber band is a pretty good unknot. And a pile of unbroken rubber bands will therefore be an unlink.

If you get into knot theory you end up trying to prove stuff about complicated knots, and complicated links. Often these are easiest to prove by chopping up the knot or the link into something simpler. Maybe you chop those smaller pieces up again. And you can’t get simpler than an unlink. If you can prove whatever you want to show for that then you’ve got a step done toward proving your whole actually interesting thing. This is why we see unknots and unlinks enough to give them names and attention.

Some Mathematical Tweets To Read


Can’t deny that I will sometimes stockpile links of mathematics stuff to talk about. Sometimes I even remember to post it. Sometimes it’s a tweet like this, which apparently I’ve been carrying around since April:

I admit I do not know whether the claim is true. It’s plausible enough. English has many variants in England alone, and any trade will pick up its own specialized jargon. The words are fun as it is.

From the American Mathematical Society there’s this:

I talk a good bit about knot theory. It captures the imagination and it’s good for people who like to doodle. And it has a lot of real-world applications. Tangled wires, protein strands, high-energy plasmas, they all have knots in them. Some work by Paul Sutcliffe and Fabian Maucher, both of Durham University, studies tangled vortices. These are vortices that are, er, tangled together, just like you imagine. Knot theory tells us much about this kind of vortex. And it turns out these tangled vortices can untangle themselves and smooth out again, even without something to break them up and rebuild them. It gives hope for power cords everywhere.

Nerds have a streak which compels them to make blueprints of things. It can be part of the healthier side of nerd culture, the one that celebrates everything. The side that tries to fill in the real-world things that the thing-celebrated would have if it existed. So here’s a bit of news about doing that:

I like the attempt to map Sir Thomas More’s Utopia. It’s a fun exercise in matching stuff to a thin set of data. But as mentioned in the article, nobody should take it too seriously. The exact arrangement of things in Utopia isn’t the point of the book. More probably didn’t have a map for it himself.

(Although maybe. I believe I got this from Simon Garfield’s On The Map: A Mind-Expanding Exploration Of The Way The World Looks and apologize generally if I’ve got it wrong. My understanding is Robert Louis Stevenson drew a map of Treasure Island and used it to make sure references in the book were consistent. Then the map was lost in the mail to his publishers. He had to read his text and re-create it as best he could. Which, if true, makes the map all the better. It makes it so good a lost-map story that I start instinctively to doubt it; it’s so colorfully perfect, after all.)

And finally there’s this gem from the Magic Realism Bot:

Happy reading.

A Leap Day 2016 Mathematics A To Z: Polynomials


I have another request for today’s Leap Day Mathematics A To Z term. Gaurish asked for something exciting. This should be less challenging than Dedekind Domains. I hope.

Polynomials.

Polynomials are everything. Everything in mathematics, anyway. If humans study it, it’s a polynomial. If we know anything about a mathematical construct, it’s because we ran across it while trying to understand polynomials.

I exaggerate. A tiny bit. Maybe by three percent. But polynomials are big.

They’re easy to recognize. We can get them in pre-algebra. We make them out of a set of numbers called coefficients and one or more variables. The coefficients are usually either real numbers or complex-valued numbers. The variables we usually allow to be either real or complex-valued numbers. We take each coefficient and multiply it by some power of each variable. And we add all that up. So, polynomials are things that look like these things:

x^2 - 2x + 1
12 x^4 + 2\pi x^2 y^3 - 4x^3 y - \sqrt{6}
\ln(2) + \frac{1}{2}\left(x - 2\right) - \frac{1}{2 \cdot 2^2}\left(x - 2\right)^2 + \frac{1}{3 \cdot 2^3}\left(x - 2\right)^3 - \frac{1}{4 \cdot 2^4}\left(x - 2\right)^4  + \cdots
a_n x^n + a_{n - 1}x^{n - 1} + a_{n - 2}x^{n - 2} + \cdots + a_2 x^2 + a_1 x^1 + a_0

The first polynomial maybe looks nice and comfortable. The second may look a little threatening, what with it having two variables and a square root in it, but it’s not too weird. The third is an infinitely long polynomial; you’re supposed to keep going on in that pattern, adding even more terms. The last is a generic representation of a polynomial. Each number a_0, a_1, a_2, et cetera is some coefficient that we in principle know. It’s a good way of representing a polynomial when we want to work with it but don’t want to tie ourselves down to a particular example. The highest power we raise a variable to we call the degree of the polynomial. A second-degree polynomial, for example, has an x^2 in it, but not an x^3 or x^4 or x^{18} or anything like that. A third-degree polynomial has an x^3, but not x to any higher powers. Degree is a useful way of saying roughly how long a polynomial is, so it appears all over discussions of polynomials.
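In code, that generic representation is simply a list of coefficients, index matching power. A small sketch:

```python
# A polynomial a_0 + a_1 x + ... + a_n x^n as its coefficient list,
# index matching power. So x^2 - 2x + 1 becomes:
p = [1, -2, 1]

def degree(coeffs):
    """Index of the highest nonzero coefficient."""
    for i in range(len(coeffs) - 1, -1, -1):
        if coeffs[i] != 0:
            return i
    return 0   # the zero polynomial; its degree is a matter of convention

print(degree(p))            # 2
print(degree([7, 0, 0]))    # 0: just the constant 7
```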

But why do we like polynomials? Why like them so much that MathWorld lists 1,163 pages that mention polynomials?

It’s because they’re great. They do everything we’d ever want to do and they’re great at it. We can add them together as easily as we add regular old numbers. We can subtract them as well. We can multiply and divide them. There’s even prime polynomials, just like there are prime numbers. They take longer to work out, but they’re not harder.

And they do great stuff in advanced mathematics too. In calculus we want to take derivatives of functions. Polynomials, we always can. We get another polynomial out of that. So we can keep taking derivatives, as many as we need. (We might need a lot of them.) We can integrate too. The integration produces another polynomial. So we can keep doing that as long as we need to. (We need to do this a lot, too.) This lets us solve so many problems in calculus, which is about how functions work. It also lets us solve so many problems in differential equations, which is about systems whose change depends on the current state of things.
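The differentiate-and-get-another-polynomial claim is mechanical enough to show in a few lines of Python. (Again an illustrative sketch: a polynomial stored as its coefficients, lowest power first, which is my convention here, not anything official.)

```python
def derivative(coeffs):
    # d/dx of sum a_k x^k is sum k * a_k x^(k - 1): still a polynomial
    return [k * a for k, a in enumerate(coeffs)][1:] or [0]

def antiderivative(coeffs):
    # integral of a_k x^k is a_k / (k + 1) x^(k + 1); constant of integration 0
    return [0] + [a / (k + 1) for k, a in enumerate(coeffs)]

p = [1, -2, 1]              # x^2 - 2x + 1
print(derivative(p))        # [-2, 2], that is, 2x - 2
print(antiderivative(p))    # [0, 1.0, -1.0, 0.333...], that is, x - x^2 + x^3/3
```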

That’s great for analyzing polynomials, but what about things that aren’t polynomials?

Well, if a function is continuous, then it might as well be a polynomial. To be a little more exact, we can set a margin of error. And we can always find polynomials that are less than that margin of error away from the original function. The original function might be annoying to deal with. The polynomial that’s as close to it as we want, though, isn’t.

Not every function is continuous. Most of them aren’t. But most of the functions we want to do work with are, or at least are continuous in stretches. Polynomials let us understand the functions that describe most real stuff.
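One classical way to build those approximating polynomials, for a function on the interval from 0 to 1, is through Bernstein polynomials. Here’s a sketch in Python; this is one construction among several, and its convergence is famously slow, but it always works for a continuous function:

```python
from math import comb, cos

def bernstein_approx(f, n, x):
    """Degree-n Bernstein polynomial of f on [0, 1], evaluated at x."""
    return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

# The error at x = 0.3 shrinks as the degree n grows.
for n in (5, 50, 500):
    print(n, abs(bernstein_approx(cos, n, 0.3) - cos(0.3)))
```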

Nice for mathematicians, all right, but how about for real uses? How about for calculations?

Oh, polynomials are just magnificent. You know why? Because you can evaluate any polynomial as soon as you can add and multiply. (Also subtract, but we think of that as addition.) Remember, x^4 just means “x times x times x times x”, four of those x’s in the product. All these polynomials are easy to evaluate.

Even better, we don’t have to evaluate them. We can automate away the evaluation. It’s easy to set a calculator doing this work, and it will do it without complaint and with few unforeseeable mistakes.
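Horner’s rule is the usual trick for that automated evaluation: one multiply and one add per coefficient, and no powers computed at all. A sketch in Python, with coefficients stored lowest power first (my convention for illustration):

```python
def horner(coeffs, x):
    """Evaluate a_0 + a_1 x + ... + a_n x^n with n multiplies and n adds."""
    result = 0
    for a in reversed(coeffs):
        result = result * x + a
    return result

print(horner([1, -2, 1], 3))   # x^2 - 2x + 1 at x = 3, which is 4
```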

Now remember that thing where we can make a polynomial close enough to any continuous function? And we can always set a calculator to evaluate a polynomial? Guess what this means about continuous functions. We have a tool that lets us calculate stuff we would want to know. Things like arccosines and logarithms and Bessel functions and all that. And we get nice easy to understand numbers out of them. For example, that third polynomial I gave you above? That’s not just infinitely long. It’s also a polynomial that approximates the natural logarithm. Pick a positive number x that’s between 0 and 4 and put it in that polynomial. Calculate terms and add them up. You’ll get closer and closer to the natural logarithm of that number. You’ll get there faster if you pick a number near 2, but you’ll eventually get there for whatever number you pick. (Calculus will tell us why x has to be between 0 and 4. Don’t worry about it for now.)
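If you want to watch that convergence happen, here’s a sketch in Python of the partial sums of that logarithm series, compared against the library’s own logarithm:

```python
from math import log

def ln_series(x, terms):
    """Partial sum of the Taylor series of ln around 2:
    ln 2 + sum over n >= 1 of (-1)^(n-1) (x - 2)^n / (n * 2^n)."""
    total = log(2)
    for n in range(1, terms + 1):
        total += (-1)**(n - 1) * (x - 2)**n / (n * 2**n)
    return total

# The more terms we take, the closer we get to log(3).
for terms in (5, 10, 20):
    print(terms, abs(ln_series(3.0, terms) - log(3.0)))
```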

So through polynomials we can understand functions, analytically and numerically.

And they keep revealing things to us. We discovered complex-valued numbers because we wanted to find roots, values of x that make a polynomial of x equal to zero. Some formulas worked well for third- and fourth-degree polynomials. (They look like the quadratic formula, which solves second-degree polynomials. The big difference is nobody remembers what they are without looking them up.) But the formulas sometimes called for things that looked like square roots of negative numbers. Absurd! But if you carried on as if these square roots of negative numbers meant something, you got meaningful answers. And correct answers.
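Bombelli’s famous example makes the point concretely: x^3 = 15x + 4 has the perfectly real root x = 4, but Cardano’s formula reaches it only by way of the square root of -121. A sketch in Python, leaning on the language’s built-in complex arithmetic:

```python
import cmath

# Bombelli's example: x^3 + p x + q = 0 with p = -15, q = -4,
# that is, x^3 = 15x + 4, whose real root is 4.
p, q = -15.0, -4.0
disc = (q / 2)**2 + (p / 3)**3            # 4 - 125 = -121, negative!
u = (-q / 2 + cmath.sqrt(disc))**(1 / 3)  # cube root of 2 + 11i, namely 2 + i
root = u - p / (3 * u)                    # (2 + i) + (2 - i) = 4
print(root)                               # approximately (4+0j)
```

The intermediate values are honestly complex; they only cancel out at the end, which is exactly the absurdity that forced mathematicians to take these numbers seriously.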

We wanted formulas to solve fifth- and higher-degree polynomials exactly. We can do this with second and third and fourth-degree polynomials, after all. It turns out we can’t. Oh, we can solve some of them exactly. The attempt to understand why, though, helped us create and shape group theory, the study of things that look like but aren’t numbers.

Polynomials go on, sneaking into everything. We can look at a square matrix and discover its characteristic polynomial. This allows us to find beautifully-named things like eigenvalues and eigenvectors. These reveal secrets of the matrix’s structure. We can find polynomials in the formulas that describe how many ways to split up a group of things into a smaller number of sets. We can find polynomials that describe how networks of things are connected. We can find polynomials that describe how a knot is tied. We can even find polynomials that distinguish between a knot and the knot’s reflection in the mirror.
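For a 2-by-2 matrix the characteristic polynomial is small enough to do by hand: it’s lambda^2 minus (trace) lambda plus (determinant), and its roots are the eigenvalues. A sketch in Python, assuming real eigenvalues for simplicity:

```python
from math import sqrt

def eigenvalues_2x2(a, b, c, d):
    """Roots of the characteristic polynomial
    lambda^2 - (a + d) lambda + (a d - b c) of the matrix [[a, b], [c, d]].
    Assumes the eigenvalues are real, as for symmetric matrices."""
    trace, det = a + d, a * d - b * c
    r = sqrt(trace**2 - 4 * det)
    return (trace - r) / 2, (trace + r) / 2

print(eigenvalues_2x2(2, 1, 1, 2))   # (1.0, 3.0)
```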

Polynomials are everything.

Some More Mathematics Stuff To Read


And some more easy reading, because, why not? First up is a new Twitter account from Chris Lusto (Lustomatical), a high school teacher with interest in Mathematical Twitter. He’s constructed the Math-Twitter-Blog-o-Sphere Bot, which retweets postings of mathematics blogs. They’re drawn from his blogroll, and a set of posts comes up a couple of times per day. (I believe he’s running the bot manually, in case it starts malfunctioning, for now.) It could be a useful way to find something interesting to read, or if you’ve got your own mathematics blog, a way to let other folks know you want to be found interesting.

Also possibly of interest is Gregory Taylor’s Any ~Qs comic strip blog. Taylor is a high school teacher and an amateur cartoonist. He’s chosen the difficult task of drawing a comic about “math equations as people”. It’s always hard to do a narrowly focused web comic. You can see Taylor working out the challenges of writing and drawing so that both story and teaching purposes are clear. I would imagine, for example, people to giggle at least at “tangent pants” even if they’re not sure what a domain restriction would have to do with anything, or even necessarily mean. But it is neat to see someone trying to go beyond anthropomorphized numerals in a web comic. And, after all, Math With Bad Drawings has got the hang of it.

Finally, an article published in Notices of the American Mathematical Society, and which I found by some reference now lost to me. The essay, “Knots in the Nursery: (Cats) Cradle Song of James Clerk Maxwell”, is by Professor Daniel S Silver. It’s about the origins of knot theory, and particularly of a poem composed by James Clerk Maxwell. Knot theory was pioneered in the late 19th century by Peter Guthrie Tait. Maxwell is the fellow behind Maxwell’s Equations, the description of how electricity and magnetism propagate and affect one another. Maxwell’s also renowned in statistical mechanics circles for explaining, among other things, how the rings of Saturn could work. And it turns out he could write nice bits of doggerel, with references Silver usefully decodes. It’s worth reading for the mathematical-history content.

A Summer 2015 Mathematics A to Z Roundup


Since I’ve run out of letters there’s little dignified to do except end the Summer 2015 Mathematics A to Z. I’m still organizing my thoughts about the experience. I’m quite glad to have done it, though.

For the sake of good organization, here’s the set of pages that this project’s seen created:

Fibonacci’s Biased Scarf


Here is a neat bit of crochet work with a bunch of nice recreational-mathematics properties. The first is that the distance between yellow rows, or between blue rows, represents the start of the Fibonacci sequence of numbers. I’m not sure if the Fibonacci sequence is the most famous sequence of whole numbers but it’s certainly among the most famous, and it’s got interesting properties and historical context.
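The sequence itself is easy to generate, if you’d like to check the stripe counts against it; a quick Python sketch:

```python
def fibonacci_up_to(limit):
    """Fibonacci numbers 1, 1, 2, 3, ... not exceeding limit."""
    seq, a, b = [], 1, 1
    while a <= limit:
        seq.append(a)
        a, b = b, a + b
    return seq

print(fibonacci_up_to(34))   # [1, 1, 2, 3, 5, 8, 13, 21, 34]
```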

The second recreational-mathematics property is that the pattern is rotationally symmetric. Rotate it 180 degrees and you get back the original pattern, albeit with blue and yellow swapped. You can form a group out of the ways that it’s possible to rotate an object and get back something that looks like the original. Symmetry groups can be things of simple aesthetic beauty, describing scarf patterns and ways to tile floors and the like. They can also describe things of deep physical significance. Much of the ability of quantum chromodynamics to describe nuclear physics comes from these symmetry groups.

The logo at top of the page is of a trefoil knot, which I’d mentioned a couple weeks back. A trefoil knot isn’t perfectly described by its silhouette. Where the lines intersect you have to imagine the string (or whatever makes up the knot) passing twice, once above and once below itself. If you do that crossing-over and crossing-under consistently you get the trefoil knot, the simplest loop that isn’t an unknot, that can’t be shaken loose into a simple circle.

Knot Theorist


This scarf is totally biased. That’s not to say that it’s prejudiced, but that it was worked in the diagonal direction of the cloth.

My project was made from Julie Blagojevich’s free pattern Fibonacci’s Biased using Knit Picks Curio. The number of rows in each stripe is according to the numbers of the Fibonacci sequence up to 34. In other words, if you start at the blue side of the scarf and work your way right, the sequence of the number of yellow rows is 1, 1, 2, 3, 5, 8, 13, 21, 34. The sequence of the blue stripes are the same, but in the opposite direction. The effect is a rotationally symmetric scarf with few color changes at the edges and frequent color changes in the center. As I frequently tell my friends, math is beautiful.

If my geekiness hasn’t scared you away yet, here’s a random fun…


A Summer 2015 Mathematics A To Z: knot


Knot.

It’s a common joke that mathematicians shun things that have anything to do with the real world. You can see where the impression comes from, though. Even common mathematical constructs, such as “functions”, are otherworldly abstractions once a mathematician is done defining them precisely. It can look like mathematicians find real stuff to be too dull to study.

Knot theory goes against the stereotype. A mathematician’s knot is just about what you would imagine: threads of something that get folded and twisted back around themselves. Every now and then a knot theorist will get a bit of human-interest news going for the department by announcing a new way to tie a tie, or to tie a shoelace, or maybe something about why the Christmas tree lights get so tangled up. These are really parts of the field, and applications that almost leap off the page as one studies. It’s a bit silly, admittedly. The only way anybody needs to tie a tie is go see my father and have him do it for you, and then just loosen and tighten the knot for the two or three times you’ll need it. And there’s at most two ways of tying a shoelace anybody needs. Christmas tree lights are a bigger problem but nobody can really help with getting them untangled. But studying the field encourages a lot of sketches of knots, and they almost cry out to be done out of some real material.

One amazing thing about knots is that they can be described as mathematical expressions. There are multiple ways to encode a description for how a knot looks as a polynomial. An expression like t + t^3 - t^4 contains enough information to draw one knot as opposed to all the others that might exist. (In this case it’s a very simple knot, one known as the right-hand trefoil knot. A trefoil knot is a knot that looks like a trefoil, the three-leafed clover shape.) Indeed, it’s possible to describe knots with polynomials that let you distinguish between a knot and its mirror-image reflection.
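And these expressions are easy to evaluate as plain data. A sketch in Python, treating a knot polynomial as exponent-and-coefficient pairs (my own toy encoding); if I remember my knot theory right, plugging in t = -1 gives, up to sign, an invariant called the knot’s determinant, which for the trefoil is 3:

```python
def eval_knot_poly(terms, t):
    """Evaluate a knot polynomial given as {exponent: coefficient} pairs.
    Exponents may be negative, which ordinary polynomials don't allow."""
    return sum(coeff * t**exp for exp, coeff in terms.items())

trefoil = {1: 1, 3: 1, 4: -1}       # the expression t + t^3 - t^4
print(eval_knot_poly(trefoil, -1))  # -3; its absolute value is the determinant
```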

Biology, life, is knots. The DNA molecules that carry and transmit genes tangle up on themselves, creating knots. The molecules that DNA encodes, proteins and enzymes and all the other basic tools of cells, can be represented as knots. Since at this level the field is about how molecules interact you probably would expect that much of chemistry can be seen as the ways knots interact. Statistical mechanics, the study of unspeakably large numbers of particles, does as well. A field you can be introduced to by studying your sneaker runs through the most useful arteries of science.

That said, mathematicians do make their knots of unreal stuff. The mathematical knot is, normally, a one-dimensional thread rather than a cylinder of stuff like a string or rope or shoelace. No matter; just imagine you’ve got a very thin string. And we assume that it’s frictionless; the knot doesn’t get stuck on itself. As a result a mathematician just learning knot theory would snootily point out that however tightly wound up your extension cord is, it’s not actually knotted. You could in principle push one of the ends of the cord all the way through the knot and so loosen it into an untangled string, if you could push the cord from one end and if the cord didn’t get stuck on itself. So, yes, real-world knots are mathematically not knots. After all, something that just falls apart with a little push hardly seems worth the name “knot”.

My point is that mathematically a knot has to be a closed loop. And it’s got to wrap around itself in some sufficiently complicated way. A simple circle of string is not a knot. If “not a knot” sounds a bit childish you might use instead the Lewis Carrollian term “unknot”.

We can fix that, though, using a surprisingly common mathematical trick. Take the shoelace or rope or extension cord you want to study. And extend it: draw lines from either end of the cord out to the edge of your paper. (This is a great field for doodlers.) And then pretend that the lines go out and loop around, touching each other somewhere off the sheet of paper, as simply as possible. What had been an unknot is now not an unknot. Study wisely.

Reading the Comics, June 16, 2015: The Carefully Targeted Edition


The past several days produced a good number of comic strips mentioning mathematical topics. Strangely, they seem to be carefully targeted to appeal to me. Here’s how.

Mason Mastroianni, Mick Mastroianni, and Perri Hart’s B.C. (June 12) is your classic resisting-the-world-problems joke. I admit I haven’t done anything at this level of mathematics in a long while. I’m curious if actual teachers, or students, could say whether problems with ridiculous numbers of fruits actually appear in word problems, or if this is one of those motifs that’s popular despite a nearly imaginary base in the real world.

Dan Thompson’s Brevity (June 13) is aimed very precisely at the professional knot theorist. Also, mathematics includes a thing called knot theory which is almost exactly what you imagine. For a while it looked like I might get into knot theory, although ultimately I wasn’t able to find a problem interesting enough to work on that I was able to prove anything interesting about. I’m delighted a field that so many people wouldn’t imagine existed got a comic strip in this manner; I wonder if this is what dinosaur researchers felt when The Far Side was still in production.

Steve Sicula’s Home and Away (June 14) name-drops the New Math, though the term’s taken literally. The joke feels anachronistic to me. Would a kid that age have even heard of a previous generation’s effort to make mathematics about understanding what you’re doing and why? New Math (admittedly, on the way out) was my elementary school thing.

Mark Litzler’s Joe Vanilla (June 15) tickles me with the caption, “the clarity of the equation befuddles”. It’s a funny idea. Ideally, the point of an equation is to provide clarity and insight, maybe by solving it, maybe by forming it. A befuddling equation is usually a signal the problem needs to be thought out some more.

Lincoln Peirce’s Big Nate: First Class (June 16, originally run June 11, 1991) is aimed at the Mathletes out there. It throws in a slide rule mention for good measure. Given Nate’s Dad’s age in the 1991 setting it’s plausible he’d have had a slide rule. (He’s still the same age in the comic strip being produced today, so he wouldn’t have had one if the strip were redrawn.) I don’t remember being on a competitive mathematics team in high school, although I did participate in some physics contests. My recollection is that I was an inconsistent performer, though. I don’t think I had the slightly obsessive competitive urge needed to really excel in high school academic competition.

And Larry Wright’s Motley Classics (June 16, originally run June 16, 1987) is a joke about using algebra in the real world. Or at least in the world of soap operas. Back in 1987 (United States) soap operas were still a thing.

Reading the Comics, November 28, 2014: Greatest Hits Edition?


I don’t ever try speaking for Comic Strip Master Command, and it almost never speaks to me, but it does seem like this week’s strips mentioning mathematical themes were trying to stick to the classic subjects: anthropomorphized numbers, word problems, ways to measure time and space, under-defined probability questions, and sudoku. It feels almost like a reunion weekend to have all these topics come together.

Dan Thompson’s Brevity (November 23) is a return to the world-of-anthropomorphic-numbers kind of joke, and a pun on the arithmetic mean, which is after all the statistic which most lends itself to puns, just edging out the “range” and the “single-factor ANOVA F-Test”.

Phil Frank and Joe Troise’s The Elderberries (November 23, rerun) brings out word problem humor, using train-leaves-the-station humor as a representative of the kinds of thinking academics do. Nagging slightly at me is that I think the strip had established the Professor as one of philosophy and while it’s certainly not unreasonable for a philosopher to be interested in mathematics I wouldn’t expect this kind of mathematics to strike him as very interesting. But then there is the need to get the idea across in two panels, too.

Jonathan Lemon’s Rabbits Against Magic (November 25) brings up a way of identifying the time — “half seven” — which recalls one of my earliest essays around here, “How Many Numbers Have We Named?”, because the construction is one that I find charming and that I was glad to hear was still current. “Half seven” strikes me as similar in construction to saying a number as “five and twenty” instead of “twenty-five”, although I’m ignorant as to whether there actually is any similarity.

Scott Hilburn’s The Argyle Sweater (November 26) brings out a joke that I thought had faded out back around, oh, 1978, when the United States decided it wasn’t going to try converting to metric after all, now that we had two-liter bottles of soda. The curious thing about this sort of hyperconversion (it’s surely a satiric cousin to the hypercorrection that makes people mangle a sentence in the misguided hope of perfecting it) — besides that the “yard” in Scotland Yard is obviously not a unit of measure — is the notion that it’d be necessary to update idiomatic references that contain old-fashioned units of measurement. Part of what makes idioms anything interesting is that they can be old-fashioned while still making as much sense as possible; “in for a penny, in for a pound” is a sensible thing to say in the United States, where the pound hasn’t been legal tender since 1857; why would (say) “an ounce of prevention is worth a pound of cure” be any different? Other than that it’s about the only joke easily found on the ground once you’ve decided to look for jokes in the “systems of measurement” field.

Mark Heath’s Spot the Frog (November 26, rerun) I’m not sure actually counts as a mathematics joke, although it’s got me intrigued: Surly Toad claims to have a stick in his mouth to use to give the impression of a smile, or 37 (“Sorry, 38”) other facial expressions. The stick’s shown as a bundle of maple twigs, wound tightly together and designed to take shapes easily. This seems to me the kind of thing that’s grown as an application of knot theory, the study of, well, it’s almost right there in the name. Knots, the study of how strings of things can curl over and around and cross themselves (or other strings), seemed for a very long time to be a purely theoretical playground, not least because, to be addressable by theory, the knots had to be made of an imaginary material that could be stretched arbitrarily finely, and could be pushed frictionlessly through it, which allows for good theoretical work but doesn’t act a thing like a shoelace. Then I think everyone was caught by surprise when it turned out the mathematics of these very abstract knots also describe the way proteins and other long molecules fold, and unfold; and from there it’s not too far to discovering wonderful structures that can change almost by magic with slight bits of pressure. (For my money, the most astounding thing about knots is that you can describe thermodynamics — the way heat works — on them, but I’m inclined towards thermodynamic problems.)

Veronica was out of town for a week; Archie's test scores improved. This demonstrates that test scores aren't everything.
Henry Scarpelli and Craig Boldman’s Archie for the 28th of November, 2014. Clearly we should subject this phenomenon to scientific inquiry!

Henry Scarpelli and Craig Boldman’s Archie (November 28, rerun) offers an interesting problem: when Veronica was out of town for a week, Archie’s test scores improved. Is there a link? This kind of thing is awfully interesting to study, and awfully difficult to: there’s no way to run a truly controlled experiment to see whether Veronica’s presence affects Archie’s test scores. After all, he never takes the same test twice, even if he re-takes a test on the same subject (and even if the re-test were the exact same questions, he would go into it the second time with relevant experience that he didn’t have the first time). And a couple good test scores might be relevant, or might just be luck, or it might be that something else happened to change that week that we haven’t noticed yet. How can you trace down plausible causal links in a complicated system?

One approach is an experimental design that, at least in the psychology textbooks I’ve read, gets called A-B-A, or A-B-A-B, experiment design: measure whatever it is you’re interested in during a normal time, “A”, before whatever it is whose influence you want to see has taken hold. Then measure it for a time “B” where something has changed, like, Veronica being out of town. Then go back as best as possible to the normal situation, “A” again; and, if your time and research budget allow, going back to another stretch of “B” (and, hey, maybe even “A” again) helps. If there is an influence, it ought to appear sometime after “B” starts, and fade out again after the return to “A”. The more you’re able to replicate this the sounder the evidence for a link is.

(We’re actually in the midst of something like this around home: our pet rabbit was diagnosed with a touch of arthritis in his last checkup, but mildly enough and in a strange place, so we couldn’t tell whether it’s worth putting him on medication. So we got a ten-day prescription and let that run its course and have tried to evaluate whether it’s affected his behavior. This has proved difficult to say because we don’t really have a clear way of measuring his behavior, although we can say that the arthritis medicine is apparently his favorite thing in the world, based on his racing up to take the liquid and his trying to grab it if we don’t feed it to him fast enough.)

Ralph Hagen’s The Barn (November 28) has Rory the sheep wonder about the chances he and Stan the bull should be together in the pasture, given how incredibly vast the universe is. That’s a subtly tricky question to ask, though. If you want to show that everything that ever existed is impossibly unlikely you can work out, say, how many pastures there are on Earth, multiply it by an estimate of how many Earth-like planets there likely are in the universe, and take one divided by that number and marvel at Rory’s incredible luck. But that number’s fairly meaningless: among other obvious objections, wouldn’t Rory wonder the same thing if he were in a pasture with Dan the bull instead? And Rory wouldn’t be wondering anything at all if it weren’t for the accident by which he happened to be born; how impossibly unlikely was that? And that Stan was born too? (And, obviously, that all Rory and Stan’s ancestors were born and survived to the age of reproducing?)

Except that in this sort of question we seem to take it for granted, for instance, that all Stan’s ancestors would have done their part by existing and doing their part to bringing Stan around. And we’d take it for granted that the pasture should exist, rather than be a farmhouse or an outlet mall or a rocket base. To come up with odds that mean anything we have to work out what the probability space of all possible relevant outcomes is, and what the set of all conditions that satisfy the concept of “we’re stuck here together in this pasture” is.

Mark Pett’s Lucky Cow (November 28) brings up sudoku puzzles and the mystery of where they come from, exactly. This prompted me to wonder about the mechanics of making sudoku puzzles and while it certainly seems they could be automated pretty well, making your own amounts to just writing the digits one through nine nine times over, and then blanking out squares until the puzzle is hard. A casual search of the net suggests the most popular way of making sure you haven’t blanked out squares so that the puzzle becomes unsolvable (in this case, that there’s two or more puzzles that fit the revealed information) is to let an automated sudoku solver tell you. That’s true enough but I don’t see any mention of any algorithms by which one could check if you’re blanking out a solution-foiling set of squares. I don’t know whether that reflects there being no algorithm for this that’s more efficient than “try out possible solutions”, or just no algorithm being more practical. It’s relatively easy to make a computer try out possible solutions, after all.
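That try-out-possible-solutions check can itself be sketched with a little backtracking: count completions of the grid, and stop as soon as you find two. A Python sketch (my own toy code, not anybody’s production puzzle generator):

```python
def ok(grid, r, c, v):
    """Would placing v at (r, c) break no row, column, or 3x3 box?"""
    if any(grid[r][j] == v for j in range(9)): return False
    if any(grid[i][c] == v for i in range(9)): return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != v for i in range(3) for j in range(3))

def count_solutions(grid, limit=2):
    """Count completions of a 9x9 grid (0 = blank), stopping at `limit`.
    A puzzle is well-posed exactly when this returns 1."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                total = 0
                for v in range(1, 10):
                    if ok(grid, r, c, v):
                        grid[r][c] = v
                        total += count_solutions(grid, limit - total)
                        grid[r][c] = 0
                        if total >= limit:
                            return total
                return total
    return 1   # no blanks left: the filled grid itself is one solution

# A valid complete grid via a standard shift pattern, then blank two cells.
full = [[(i * 3 + i // 3 + j) % 9 + 1 for j in range(9)] for i in range(9)]
full[0][0] = full[4][4] = 0
print(count_solutions(full))   # 1: still uniquely solvable
```

A generator could keep blanking squares, re-running this check, and backing out of any blank that pushes the count to 2.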

A paper published by Mária Ercsey-Ravasz and Zoltán Toroczkai in Nature Scientific Reports in 2012 describes the recasting of the problem of solving sudoku into a deterministic, dynamical system, and matches the difficulty of a sudoku puzzle to chaotic behavior of that system. (If you’re looking at the article and despairing, don’t worry. Go to the ‘Puzzle hardness as transient chaotic dynamics’ section, and read the parts of the sentence that aren’t technical terms.) Ercsey-Ravasz and Toroczkai point out their chaos-theory-based definition of hardness matches pretty well, though not perfectly, the estimates of difficulty provided by sudoku editors and solvers. The most interesting (to me) result they report is that sudoku puzzles which give you the minimum information — 17 or 18 non-blank numbers to start — are generally not the hardest puzzles. 21 or 22 non-blank numbers seem to match the hardest of puzzles, though they point out that difficulty has got to depend on the positioning of the non-blank numbers and not just how many there are.