Tagged: knot theory

  • Joseph Nebus 6:00 pm on Wednesday, 1 February, 2017 Permalink | Reply
Tags: cables, knot theory, podcasts

    Mathematics Stuff To Read Or Listen To 


    I concede January was a month around here that could be characterized as “lazy”. Not that I particularly skimped on the Reading the Comics posts. But they’re relatively easy to do: the comics tell me what to write about, and I could do a couple paragraphs on most anything, apparently.

    While I get a couple things planned out for the coming month, though, here’s some reading for other people.

The above links to a paper in the Proceedings of the National Academy of Sciences. It’s about something I’ve mentioned when talking about knots before. And it’s about something everyone with computer cables or, as the tweet suggests, holiday lights finds: the things coil up. Spontaneous knotting of an agitated string by Dorian M Raymer and Douglas E Smith examines when these knots are likely to form, and how likely they are. It’s not a paper for the lay audience, but there are a bunch of fine pictures. The paper doesn’t talk about Christmas lights, no matter what the tweet does, but the mathematics carries over to them.

    MathsByAGirl, meanwhile, had a post midmonth listing a couple of mathematics podcasts. I’m familiar with one of them, BBC Radio 4’s A Brief History of Mathematics, which was a set of ten-to-twenty-minute sketches of historically important mathematics figures. I’ll trust MathsByAGirl’s taste on other podcasts. I’d spent most of this month finishing off a couple of audio books (David Hackett Fischer’s Washington’s Crossing which I started listening to while I was in Trenton for a week, because that’s the sort of thing I think is funny, and Robert Louis Stevenson’s Doctor Jekyll and Mister Hyde And Other Stories) and so fell behind on podcasts. But now there’s some more stuff to listen forward to.

    And then I’ll wrap up with this from KeplerLounge. It looks to be the start of some essays about something outside the scope of my Why Stuff Can Orbit series. (Which I figure to resume soon.) We start off talking about orbits as if planets were “point masses”. Which is what the name suggests: a mass that fills up a single point, with no volume, no shape, no features. This makes the mathematics easier. The mathematics is just as easy if the planets are perfect spheres, whether hollow or solid. But real planets are not perfect spheres. They’re a tiny bit blobby. And they’re a little lumpy as well. We can ignore that if we’re doing rough estimates of how orbits work. But if we want to get them right we can’t ignore that anymore. And this essay describes some of how we go about dealing with that.

     
  • Joseph Nebus 6:00 pm on Monday, 19 December, 2016 Permalink | Reply
Tags: knot theory, links

    The End 2016 Mathematics A To Z: Unlink 


    This is going to be a fun one. It lets me get into knot theory again.

    Unlink.

    An unlink is what knot theorists call that heap of loose rubber bands in that one drawer compartment.

The longer way around. It starts with knots. I love knots. If I were stronger on abstract reasoning and weaker on computation I’d have been a knot theorist. Or a graph theorist, at least. The mathematical idea of a knot is inspired by a string tied together. In making it a mathematical idea we perfect the string. It becomes as thin as a line, though it curves as much as we want. It can stretch out or squash down as much as we want. It slides frictionlessly against itself. Gravity doesn’t make it drop any. This removes the hassles of real-world objects from it. It also means actual strings or yarns or whatever can’t be knots anymore. Only something that’s a loop which closes back on itself can be a knot. The knot you might make in a shoelace, to use an example, could be undone by pushing the tip back through the ‘knot’. Since our mathematical string is frictionless we can do that, effortlessly. We’re left with nothing.

    But you can create a pretty good approximation to a mathematical knot if you have some kind of cable that can be connected to its own end. Loop the thing around as you like, connect end to end, and you’ve got it. I recommend the glow sticks sold for people to take to parties or raves or the like. They’re fun. If you tie it up so that the string (rope, glow stick, whatever) can’t spread out into a simple O shape no matter how you shake it up (short of breaking the cable) then you have a knot. There are many of them. Trefoil knots are probably the easiest to get, but if you’re short on inspiration try looking at Celtic knot patterns.

But if the string can be shaken out until it’s a simple O shape, the sort of thing you can place flat on a table, then you have an unknot. Just from this vocabulary you can see why I like the subject so. Since this hasn’t quite got silly enough, let me assure you that an unknot is itself a kind of knot; we call it the trivial knot. It’s the knot that’s too simple to be a knot. I’m sure you were worried about that. I only hear people call it an unknot, but maybe there are heritages that prefer “trivial knot”.

    So that’s knots. What happens if you have more than one thing, though? What if you have a couple of string-loops? Several cables. We know these things can happen in the real world, since we’ve looked behind the TV set or the wireless router and we know there’s somehow more cables there than there are even things to connect.

Even mathematicians wouldn’t want to ignore something so caught up with real-world implications. And we don’t. We get to them after we’re pretty comfortable working with knots. Describing them, working out the theoretical tools we’d use to un-knot a proper knot (spoiler: we cut things), coming up with polynomials that describe them, that sort of thing. When we’re ready for a new trick there we consider what happens if we have several knots. We call this bundle of knots a “link”. Well, what would you call it?

    A link is a collection of knots. By talking about a link we expect that at least some of the knots are going to loop around each other. This covers a lot of possibilities. We could picture one of those construction-paper chains, made of intertwined loops, that are good for elementary school craft projects to be a link. We can picture a keychain with a bunch of keys dangling from it to be a link. (Imagine each key is a knot, just made of a very fat, metal “string”. C’mon, you can give me that.) The mass of cables hiding behind the TV stand is not properly a link, since it’s not properly made out of knots. But if you can imagine taking the ends of each of those wires and looping them back to the origins, then the somehow vaster mess you get from that would be a link again.

    And then we come to an “unlink”. This has two pieces. The first is that it’s a collection of knots, yes, but knots that don’t interlink. We can pull them apart without any of them tugging the others along. The second piece is that each of the knots is itself an unknot. Trivial knots. Whichever you like to call them.

    The “unlink” also gets called the “trivial link”, since it’s as boring a link as you can imagine. Manifested in the real world, well, an unbroken rubber band is a pretty good unknot. And a pile of unbroken rubber bands will therefore be an unlink.

    If you get into knot theory you end up trying to prove stuff about complicated knots, and complicated links. Often these are easiest to prove by chopping up the knot or the link into something simpler. Maybe you chop those smaller pieces up again. And you can’t get simpler than an unlink. If you can prove whatever you want to show for that then you’ve got a step done toward proving your whole actually interesting thing. This is why we see unknots and unlinks enough to give them names and attention.

     
  • Joseph Nebus 6:00 pm on Wednesday, 17 August, 2016 Permalink | Reply
Tags: knot theory, sheep, Utopia

    Some Mathematical Tweets To Read 


    Can’t deny that I will sometimes stockpile links of mathematics stuff to talk about. Sometimes I even remember to post it. Sometimes it’s a tweet like this, which apparently I’ve been carrying around since April:

    I admit I do not know whether the claim is true. It’s plausible enough. English has many variants in England alone, and any trade will pick up its own specialized jargon. The words are fun as it is.

    From the American Mathematical Society there’s this:

    I talk a good bit about knot theory. It captures the imagination and it’s good for people who like to doodle. And it has a lot of real-world applications. Tangled wires, protein strands, high-energy plasmas, they all have knots in them. Some work by Paul Sutcliffe and Fabian Maucher, both of Durham University, studies tangled vortices. These are vortices that are, er, tangled together, just like you imagine. Knot theory tells us much about this kind of vortex. And it turns out these tangled vortices can untangle themselves and smooth out again, even without something to break them up and rebuild them. It gives hope for power cords everywhere.

    Nerds have a streak which compels them to make blueprints of things. It can be part of the healthier side of nerd culture, the one that celebrates everything. The side that tries to fill in the real-world things that the thing-celebrated would have if it existed. So here’s a bit of news about doing that:

    I like the attempt to map Sir Thomas More’s Utopia. It’s a fun exercise in matching stuff to a thin set of data. But as mentioned in the article, nobody should take it too seriously. The exact arrangement of things in Utopia isn’t the point of the book. More probably didn’t have a map for it himself.

    (Although maybe. I believe I got this from Simon Garfield’s On The Map: A Mind-Expanding Exploration Of The Way The World Looks and apologize generally if I’ve got it wrong. My understanding is Robert Louis Stevenson drew a map of Treasure Island and used it to make sure references in the book were consistent. Then the map was lost in the mail to his publishers. He had to read his text and re-create it as best he could. Which, if true, makes the map all the better. It makes it so good a lost-map story that I start instinctively to doubt it; it’s so colorfully perfect, after all.)

    And finally there’s this gem from the Magic Realism Bot:

    Happy reading.

     
  • Joseph Nebus 3:00 pm on Monday, 4 April, 2016 Permalink | Reply
Tags: knot theory

    A Leap Day 2016 Mathematics A To Z: Polynomials 


    I have another request for today’s Leap Day Mathematics A To Z term. Gaurish asked for something exciting. This should be less challenging than Dedekind Domains. I hope.

    Polynomials.

    Polynomials are everything. Everything in mathematics, anyway. If humans study it, it’s a polynomial. If we know anything about a mathematical construct, it’s because we ran across it while trying to understand polynomials.

    I exaggerate. A tiny bit. Maybe by three percent. But polynomials are big.

    They’re easy to recognize. We can get them in pre-algebra. We make them out of a set of numbers called coefficients and one or more variables. The coefficients are usually either real numbers or complex-valued numbers. The variables we usually allow to be either real or complex-valued numbers. We take each coefficient and multiply it by some power of each variable. And we add all that up. So, polynomials are things that look like these things:

    x^2 - 2x + 1
    12 x^4 + 2\pi x^2 y^3 - 4x^3 y - \sqrt{6}
\ln(2) + \frac{1}{2}\left(x - 2\right) - \frac{1}{2 \cdot 2^2}\left(x - 2\right)^2 + \frac{1}{3 \cdot 2^3}\left(x - 2\right)^3 - \frac{1}{4 \cdot 2^4}\left(x - 2\right)^4  + \cdots
    a_n x^n + a_{n - 1}x^{n - 1} + a_{n - 2}x^{n - 2} + \cdots + a_2 x^2 + a_1 x^1 + a_0

The first polynomial maybe looks nice and comfortable. The second may look a little threatening, what with it having two variables and a square root in it, but it’s not too weird. The third is an infinitely long polynomial; you’re supposed to keep going on in that pattern, adding even more terms. The last is a generic representation of a polynomial. Each number a_0, a_1, a_2, et cetera is some coefficient that we in principle know. It’s a good way of representing a polynomial when we want to work with it but don’t want to tie ourselves down to a particular example. The highest power we raise a variable to we call the degree of the polynomial. A second-degree polynomial, for example, has an x^2 in it, but not an x^3 or x^4 or x^{18} or anything like that. A third-degree polynomial has an x^3, but not x to any higher powers. Degree is a useful way of saying roughly how long a polynomial is, so it appears all over discussions of polynomials.

    But why do we like polynomials? Why like them so much that MathWorld lists 1,163 pages that mention polynomials?

It’s because they’re great. They do everything we’d ever want to do and they’re great at it. We can add them together as easily as we add regular old numbers. We can subtract them as well. We can multiply and divide them. There are even prime polynomials, just like there are prime numbers. They take longer to work out, but they’re not harder.

And they do great stuff in advanced mathematics too. In calculus we want to take derivatives of functions. Polynomials, we always can. We get another polynomial out of that. So we can keep taking derivatives, as many as we need. (We might need a lot of them.) We can integrate too. The integration produces another polynomial. So we can keep doing that as long as we need to. (We need to do this a lot, too.) This lets us solve so many problems in calculus, which is about how functions work. It also lets us solve so many problems in differential equations, which is about systems whose change depends on the current state of things.
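If you like to see that in code, here’s a minimal sketch of my own (nothing official, just an illustration): represent a polynomial by its list of coefficients, constant term first, and the power rule becomes a one-liner that hands you back another coefficient list.

```python
# Differentiate a polynomial stored as coefficients [a_0, a_1, a_2, ...],
# meaning a_0 + a_1 x + a_2 x^2 + ... . The power rule sends a_n x^n to
# n * a_n x^(n - 1), so the result is again a list of coefficients.
def differentiate(coefficients):
    return [n * a for n, a in enumerate(coefficients)][1:]

print(differentiate([1, -2, 1]))  # x^2 - 2x + 1 becomes 2x - 2, i.e. [-2, 2]
```

The output is one entry shorter, the degree dropping by one; differentiate often enough and you reach the empty list, standing in for zero.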

    That’s great for analyzing polynomials, but what about things that aren’t polynomials?

    Well, if a function is continuous, then it might as well be a polynomial. To be a little more exact, we can set a margin of error. And we can always find polynomials that are less than that margin of error away from the original function. The original function might be annoying to deal with. The polynomial that’s as close to it as we want, though, isn’t.
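Here’s a concrete little instance of that, a sketch of my own rather than anything from the essays linked: cosine is not a polynomial, but its degree-8 Taylor polynomial sits within a tiny margin of error of it everywhere between -1 and 1.

```python
import math

# Degree-8 Taylor polynomial for cosine: sum of (-1)^k x^(2k) / (2k)!
# for k = 0 through 4.
def cos_poly(x):
    return sum((-1) ** k * x ** (2 * k) / math.factorial(2 * k) for k in range(5))

# Largest disagreement with the true cosine across [-1, 1], sampled finely.
worst = max(abs(cos_poly(i / 100) - math.cos(i / 100)) for i in range(-100, 101))
print(worst < 1e-6)  # True: the polynomial is that close everywhere sampled
```

Want a smaller margin of error? Take more terms. That is the promise the approximation theorems make good on.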

    Not every function is continuous. Most of them aren’t. But most of the functions we want to do work with are, or at least are continuous in stretches. Polynomials let us understand the functions that describe most real stuff.

    Nice for mathematicians, all right, but how about for real uses? How about for calculations?

Oh, polynomials are just magnificent. You know why? Because you can evaluate any polynomial as soon as you can add and multiply. (Also subtract, but we think of that as addition.) Remember, x^4 just means “x times x times x times x”, four of those x’s in the product. All these polynomials are easy to evaluate.

    Even better, we don’t have to evaluate them. We can automate away the evaluation. It’s easy to set a calculator doing this work, and it will do it without complaint and with few unforeseeable mistakes.
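As a sketch of what that automation looks like (my example, using the scheme known as Horner’s rule), here’s a polynomial evaluator that uses nothing but one multiplication and one addition per coefficient:

```python
# Evaluate a polynomial given its coefficients [a_0, a_1, ..., a_n]
# (constant term first) at the point x, by Horner's rule:
# a_0 + x*(a_1 + x*(a_2 + ...)). Only addition and multiplication needed.
def evaluate_polynomial(coefficients, x):
    result = 0
    for a in reversed(coefficients):
        result = result * x + a
    return result

print(evaluate_polynomial([1, -2, 1], 3))  # x^2 - 2x + 1 at x = 3 gives 4
```

Notice there’s no exponentiation anywhere; the powers of x build up through the repeated multiplication, just as described above.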

Now remember that thing where we can make a polynomial close enough to any continuous function? And we can always set a calculator to evaluate a polynomial? Guess what this means about continuous functions. We have a tool that lets us calculate stuff we would want to know. Things like arccosines and logarithms and Bessel functions and all that. And we get nice, easy-to-understand numbers out of them. For example, that third polynomial I gave you above? That’s not just infinitely long. It’s also a polynomial that approximates the natural logarithm. Pick a positive number x that’s between 0 and 4 and put it in that polynomial. Calculate terms and add them up. You’ll get closer and closer to the natural logarithm of that number. You’ll get there faster if you pick a number near 2, but you’ll eventually get there for whatever number you pick. (Calculus will tell us why x has to be between 0 and 4. Don’t worry about it for now.)
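If you’d like to watch the convergence happen, here’s a quick sketch of mine that adds up terms of that logarithm series and compares the running total against a library logarithm:

```python
import math

# Partial sums of the Taylor series for ln(x) centered at 2:
# ln(2) + sum over n of (-1)^(n+1) (x - 2)^n / (n * 2^n).
def log_series(x, terms):
    total = math.log(2)
    for n in range(1, terms + 1):
        total += (-1) ** (n + 1) * (x - 2) ** n / (n * 2 ** n)
    return total

for terms in (1, 5, 20, 60):
    print(terms, abs(log_series(3, terms) - math.log(3)))  # error shrinks
```

Pick an x near 2 and the errors shrink quickly; pick one near the ends of the interval and they shrink slowly, just as promised.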

    So through polynomials we can understand functions, analytically and numerically.

    And they keep revealing things to us. We discovered complex-valued numbers because we wanted to find roots, values of x that make a polynomial of x equal to zero. Some formulas worked well for third- and fourth-degree polynomials. (They look like the quadratic formula, which solves second-degree polynomials. The big difference is nobody remembers what they are without looking them up.) But the formulas sometimes called for things that looked like square roots of negative numbers. Absurd! But if you carried on as if these square roots of negative numbers meant something, you got meaningful answers. And correct answers.

    We wanted formulas to solve fifth- and higher-degree polynomials exactly. We can do this with second and third and fourth-degree polynomials, after all. It turns out we can’t. Oh, we can solve some of them exactly. The attempt to understand why, though, helped us create and shape group theory, the study of things that look like but aren’t numbers.

    Polynomials go on, sneaking into everything. We can look at a square matrix and discover its characteristic polynomial. This allows us to find beautifully-named things like eigenvalues and eigenvectors. These reveal secrets of the matrix’s structure. We can find polynomials in the formulas that describe how many ways to split up a group of things into a smaller number of sets. We can find polynomials that describe how networks of things are connected. We can find polynomials that describe how a knot is tied. We can even find polynomials that distinguish between a knot and the knot’s reflection in the mirror.
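For a 2-by-2 matrix the whole chain, matrix to characteristic polynomial to eigenvalues, fits in a few lines. Here’s a sketch of my own, with a matrix chosen so the eigenvalues come out real:

```python
import math

# For the matrix [[a, b], [c, d]] the characteristic polynomial is
# lambda^2 - (a + d) lambda + (a d - b c); its roots are the eigenvalues.
def eigenvalues_2x2(a, b, c, d):
    trace = a + d
    determinant = a * d - b * c
    # Quadratic formula; assumes the discriminant is nonnegative.
    root = math.sqrt(trace ** 2 - 4 * determinant)
    return (trace + root) / 2, (trace - root) / 2

print(eigenvalues_2x2(2, 1, 1, 2))  # (3.0, 1.0)
```

Bigger matrices need more care, and in practice numerical methods, but the idea is the same: the matrix hands you a polynomial, and the polynomial hands back the matrix’s secrets.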

    Polynomials are everything.

     
    • gaurish 3:40 pm on Monday, 4 April, 2016 Permalink | Reply

      Beautiful post!
      Recently I studied Taylor’s Theorem & Weierstrass approximation theorem. These theorems illustrate your ideas :)


      • Joseph Nebus 6:38 pm on Monday, 4 April, 2016 Permalink | Reply

Thank you kindly. And yeah, the Taylor Theorem and Weierstrass Approximation Theorem are the ideas I was sneaking around without trying to get too technical. (Maybe I should start including a postscript of technical talk to these essays.)


  • Joseph Nebus 10:00 pm on Wednesday, 20 January, 2016 Permalink | Reply
Tags: knot theory, Peter Guthrie Tait

    Some More Mathematics Stuff To Read 


And some more easy reading, because, why not? First up is a new Twitter account from Chris Lusto (Lustomatical), a high school teacher with an interest in Mathematical Twitter. He’s constructed the Math-Twitter-Blog-o-Sphere Bot, which retweets postings of mathematics blogs. They’re drawn from his blogroll, and a set of posts comes up a couple of times per day. (I believe he’s running the bot manually, in case it starts malfunctioning, for now.) It could be a useful way to find something interesting to read, or if you’ve got your own mathematics blog, a way to let other folks know you want to be found interesting.

Also possibly of interest is Gregory Taylor’s Any ~Qs comic strip blog. Taylor is a high school teacher and an amateur cartoonist. He’s chosen the difficult task of drawing a comic about “math equations as people”. It’s always hard to do a narrowly focused web comic. You can see Taylor working out the challenges of writing and drawing so that both story and teaching purposes are clear. I would imagine, for example, people giggling at least at “tangent pants” even if they’re not sure what a domain restriction would have to do with anything, or even necessarily mean. But it is neat to see someone trying to go beyond anthropomorphized numerals in a web comic. And, after all, Math With Bad Drawings has got the hang of it.

Finally, an article published in Notices of the American Mathematical Society, and which I found by some reference now lost to me. The essay, “Knots in the Nursery: (Cats) Cradle Song of James Clerk Maxwell”, is by Professor Daniel S Silver. It’s about the origins of knot theory, and particularly of a poem composed by James Clerk Maxwell. Knot theory was pioneered in the late 19th century by Peter Guthrie Tait. Maxwell is the fellow behind Maxwell’s Equations, the description of how electricity and magnetism propagate and affect one another. Maxwell’s also renowned in statistical mechanics circles for explaining, among other things, how the rings of Saturn could work. And it turns out he could write nice bits of doggerel, with references Silver usefully decodes. It’s worth reading for the mathematical-history content.

     
    • elkement (Elke Stangl) 1:55 pm on Friday, 22 January, 2016 Permalink | Reply

      Your blog is really an awesome resource for all things math, no doubt!!


      • Joseph Nebus 5:02 am on Sunday, 24 January, 2016 Permalink | Reply

        That’s awfully kind of you to say. I’ve really just been grabbing the occasional thing that comes across my desk and passing that along, though, part of the great chain of vaguely sourced references.


        • elkement (Elke Stangl) 8:48 am on Sunday, 24 January, 2016 Permalink | Reply

          But ‘curating’ as they say today is an art, too, and after all you manage to make things accessible, e.g. by summarizing posts you reblog so neatly…. and manage to do so without much images!!


          • Joseph Nebus 10:12 pm on Tuesday, 26 January, 2016 Permalink | Reply

            Well, thank you again. I do feel like if I’m pointing to or reblogging someone else’s work I should provide a bit of context and original writing. It’s too easy to just pass around a link and say “here’s a good link”, which I wouldn’t blame anyone for doubting.


  • Joseph Nebus 3:00 pm on Friday, 24 July, 2015 Permalink | Reply
Tags: knot theory

    A Summer 2015 Mathematics A to Z Roundup 


    Since I’ve run out of letters there’s little dignified to do except end the Summer 2015 Mathematics A to Z. I’m still organizing my thoughts about the experience. I’m quite glad to have done it, though.

    For the sake of good organization, here’s the set of pages that this project’s seen created:

     
  • Joseph Nebus 7:21 pm on Tuesday, 14 July, 2015 Permalink | Reply
Tags: knot theory, scarves

    Fibonacci’s Biased Scarf 


    Here is a neat bit of crochet work with a bunch of nice recreational-mathematics properties. The first is that the distance between yellow rows, or between blue rows, represents the start of the Fibonacci sequence of numbers. I’m not sure if the Fibonacci sequence is the most famous sequence of whole numbers but it’s certainly among the most famous, and it’s got interesting properties and historical context.

    The second recreational-mathematics property is that the pattern is rotationally symmetric. Rotate it 180 degrees and you get back the original pattern, albeit with blue and yellow swapped. You can form a group out of the ways that it’s possible to rotate an object and get back something that looks like the original. Symmetry groups can be things of simple aesthetic beauty, describing scarf patterns and ways to tile floors and the like. They can also describe things of deep physical significance. Much of the ability of quantum chromodynamics to describe nuclear physics comes from these symmetry groups.

    The logo at top of the page is of a trefoil knot, which I’d mentioned a couple weeks back. A trefoil knot isn’t perfectly described by its silhouette. Where the lines intersect you have to imagine the string (or whatever makes up the knot) passing twice, once above and once below itself. If you do that crossing-over and crossing-under consistently you get the trefoil knot, the simplest loop that isn’t an unknot, that can’t be shaken loose into a simple circle.


    Knot Theorist

[Image: the Fibonacci scarf]

    This scarf is totally biased. That’s not to say that it’s prejudiced, but that it was worked in the diagonal direction of the cloth.

My project was made from Julie Blagojevich’s free pattern Fibonacci’s Biased using Knit Picks Curio. The number of rows in each stripe is according to the numbers of the Fibonacci sequence up to 34. In other words, if you start at the blue side of the scarf and work your way right, the sequence of the number of yellow rows is 1, 1, 2, 3, 5, 8, 13, 21, 34. The sequence of the blue stripes is the same, but in the opposite direction. The effect is a rotationally symmetric scarf with few color changes at the edges and frequent color changes in the center. As I frequently tell my friends, math is beautiful.

    If my geekiness hasn’t scared you away yet, here’s a random fun…

    View original post 46 more words
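The stripe counts in that pattern are easy to generate yourself, by the way; here’s a quick sketch of mine:

```python
# Fibonacci numbers up to a limit: each term is the sum of the previous
# two, giving the row counts for the scarf's yellow (and blue) stripes.
def fibonacci_up_to(limit):
    sequence, a, b = [], 1, 1
    while a <= limit:
        sequence.append(a)
        a, b = b, a + b
    return sequence

print(fibonacci_up_to(34))  # [1, 1, 2, 3, 5, 8, 13, 21, 34]
```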

     
  • Joseph Nebus 2:49 pm on Wednesday, 17 June, 2015 Permalink | Reply
Tags: biology, knot theory, trefoils

    A Summer 2015 Mathematics A To Z: knot 


    Knot.

    It’s a common joke that mathematicians shun things that have anything to do with the real world. You can see where the impression comes from, though. Even common mathematical constructs, such as “functions”, are otherworldly abstractions once a mathematician is done defining them precisely. It can look like mathematicians find real stuff to be too dull to study.

Knot theory goes against the stereotype. A mathematician’s knot is just about what you would imagine: threads of something that get folded and twisted back around themselves. Every now and then a knot theorist will get a bit of human-interest news going for the department by announcing a new way to tie a tie, or to tie a shoelace, or maybe something about why the Christmas tree lights get so tangled up. These are really parts of the field, and applications that almost leap off the page as one studies. It’s a bit silly, admittedly. The only way anybody needs to tie a tie is to go see my father and have him do it for you, and then just loosen and tighten the knot for the two or three times you’ll need it. And there are at most two ways of tying a shoelace anybody needs. Christmas tree lights are a bigger problem but nobody can really help with getting them untangled. But studying the field encourages a lot of sketches of knots, and they almost cry out to be done out of some real material.

    One amazing thing about knots is that they can be described as mathematical expressions. There are multiple ways to encode a description for how a knot looks as a polynomial. An expression like t + t^3 - t^4 contains enough information to draw one knot as opposed to all the others that might exist. (In this case it’s a very simple knot, one known as the right-hand trefoil knot. A trefoil knot is a knot with a trefoil-like pattern.) Indeed, it’s possible to describe knots with polynomials that let you distinguish between a knot and its mirror-image reflection.

Biology, life, is knots. The DNA molecules that carry and transmit genes tangle up on themselves, creating knots. The molecules that DNA encodes, proteins and enzymes and all the other basic tools of cells, can be represented as knots. Since at this level the field is about how molecules interact you probably would expect that much of chemistry can be seen as the ways knots interact. Statistical mechanics, the study of unspeakably large numbers of particles, does as well. A field you can be introduced to by studying your sneaker runs through the most useful arteries of science.

    That said, mathematicians do make their knots of unreal stuff. The mathematical knot is, normally, a one-dimensional thread rather than a cylinder of stuff like a string or rope or shoelace. No matter; just imagine you’ve got a very thin string. And we assume that it’s frictionless; the knot doesn’t get stuck on itself. As a result a mathematician just learning knot theory would snootily point out that however tightly wound up your extension cord is, it’s not actually knotted. You could in principle push one of the ends of the cord all the way through the knot and so loosen it into an untangled string, if you could push the cord from one end and if the cord didn’t get stuck on itself. So, yes, real-world knots are mathematically not knots. After all, something that just falls apart with a little push hardly seems worth the name “knot”.

    My point is that mathematically a knot has to be a closed loop. And it’s got to wrap around itself in some sufficiently complicated way. A simple circle of string is not a knot. If “not a knot” sounds a bit childish you might use instead the Lewis Carrollian term “unknot”.

    We can fix that, though, using a surprisingly common mathematical trick. Take the shoelace or rope or extension cord you want to study. And extend it: draw lines from either end of the cord out to the edge of your paper. (This is a great field for doodlers.) And then pretend that the lines go out and loop around, touching each other somewhere off the sheet of paper, as simply as possible. What had been an unknot is now not an unknot. Study wisely.

     
    • Lily Lau 6:09 pm on Wednesday, 17 June, 2015 Permalink | Reply

      Knots, I see! I should have studied sciences, they always sound fascinating.


      • Joseph Nebus 7:08 pm on Thursday, 18 June, 2015 Permalink | Reply

        Oh, they’re better than fascinating. They’re fun. This is a field of mathematics you actually study by imagining the cutting and splicing of threads. You can bring arts and crafts to your thesis defense and it’ll belong. I ended up in numerical mathematics and statistical mechanics; all I could bring was color transparencies of simulation results.


    • Ken Dowell 9:19 pm on Wednesday, 17 June, 2015 Permalink | Reply

That’s a lot more knot than I had ever given much thought to. But your post did make me think about knotting ties and made me wonder why we all tie our ties the same way rather than using any of dozens of different kinds of knots that would create a different look.


      • Joseph Nebus 7:22 pm on Thursday, 18 June, 2015 Permalink | Reply

        I would imagine that most people settle on one or two ways of tying their ties because there’s not much point to picking up something more exotic. It takes effort to learn and do, and the payoff is almost secret; you might get a bit “Oh, that’s neat”, but not other recognition. We just don’t see tie-knotting as an artistic endeavor worth comment.

It’s a bit of an open question how many different ways there are to tie a tie. It depends heavily on how you define “different ways”, and so that makes ties an interesting application of knot theory. Last year Dan Hirsch, Ingemar Markström, Meredith L Patterson, Anders Sandberg, and Mikael Vejdemo-Johansson got a bit of human-interest coverage by declaring there were at most 177,147 different ways to tie a tie, if you make certain assumptions about what makes a legitimate tying. They’ve since revised the estimate to 266,682 kinds of knots that seem achievable.


    • sunesiss 12:23 am on Thursday, 18 June, 2015 Permalink | Reply

      Hey Joseph thank you for stopping by my blog i really appreciate it, that was awesome of you. I nominated you for the first post challenge. dont know if you do them or have already done it, but heres the link. https://sunesiss.wordpress.com/2015/06/18/your-first-post-challenge/ i really hope you stop by!


  • Joseph Nebus 9:04 pm on Tuesday, 16 June, 2015 Permalink | Reply
Tags: knot theory, slide rules, soap operas

    Reading the Comics, June 16, 2015: The Carefully Targeted Edition 


    The past several days produced a good number of comic strips mentioning mathematical topics. Strangely, they seem to be carefully targeted to appeal to me. Here’s how.

    Mason Mastroianni, Mick Mastroianni, and Perri Hart’s B.C. (June 12) is your classic resisting-the-word-problems joke. I admit I haven’t done anything at this level of mathematics in a long while. I’m curious if actual teachers, or students, could say whether problems with ridiculous numbers of fruits actually appear in word problems, or if this is one of those motifs that’s popular despite a nearly imaginary base in the real world.

    Dan Thompson’s Brevity (June 13) is aimed very precisely at the professional knot theorist. Also, mathematics includes a thing called knot theory which is almost exactly what you imagine. For a while it looked like I might get into knot theory, although ultimately I couldn’t find a problem that both interested me and that I could prove anything about. I’m delighted a field that so many people wouldn’t imagine existed got a comic strip in this manner; I wonder if this is what dinosaur researchers felt when The Far Side was still in production.

    Steve Sicula’s Home and Away (June 14) name-drops the New Math, though the term’s taken literally. The joke feels anachronistic to me. Would a kid that age have even heard of a previous generation’s effort to make mathematics about understanding what you’re doing and why? New Math (admittedly, on the way out) was my elementary school thing.

    Mark Litzler’s Joe Vanilla (June 15) tickles me with the caption, “the clarity of the equation befuddles”. It’s a funny idea. Ideally, the point of an equation is to provide clarity and insight, maybe by solving it, maybe by forming it. A befuddling equation is usually a signal the problem needs to be thought out some more.

    Lincoln Peirce’s Big Nate: First Class (June 16, originally run June 11, 1991) is aimed at the Mathletes out there. It throws in a slide rule mention for good measure. Given Nate’s Dad’s age in the 1991 setting it’s plausible he’d have had a slide rule. (He’s still the same age in the comic strip being produced today, so he wouldn’t have had one if the strip were redrawn.) I don’t remember being on a competitive mathematics team in high school, although I did participate in some physics contests. My recollection is that I was an inconsistent performer, though. I don’t think I had the slightly obsessive competitive urge needed to really excel in high school academic competition.

    And Larry Wright’s Motley Classics (June 16, originally run June 16, 1987) is a joke about using algebra in the real world. Or at least in the world of soap operas. Back in 1987 (United States) soap operas were still a thing.

     
    • Ken Dowell 11:03 am on Wednesday, 17 June, 2015 Permalink | Reply

      Which is more time worn? Soap operas or jokes about using algebra in the real world?

      • Joseph Nebus 7:03 pm on Thursday, 18 June, 2015 Permalink | Reply

        I think the algebra jokes are the more timeworn, just because they’ve been made longer. The disappearance of daytime soap operas as a (United States) cultural phenomenon is much more recent; they only really evaporated in the 1990s and it’s only recently that it’s been noticed.

        But soap operas have a much longer history of poor jokes being made about them, mostly by people who think they’re superior to the genre and won’t be bothered learning enough about the subject to get the jokes right. I think the best example of that is how SCTV had an ongoing soap opera spoof, The Days Of The Week, which stood out from soap opera spoofs previously because it had noticed things like how soaps didn’t use heavy organ music for everything, and hadn’t for a long time. (And as happens, they made a spoof soap opera so credibly that it worked as a real soap opera.)

    • ivasallay 8:16 pm on Tuesday, 23 June, 2015 Permalink | Reply

      Those old well used slide rules deserve a place of honor, too.

      • Joseph Nebus 3:33 am on Thursday, 25 June, 2015 Permalink | Reply

        They do, yes. I suspect slide rules would still be a useful way to teach logarithms, particularly the way multiplying numbers corresponds to adding their logarithms. If nothing else, physical objects and playthings like that are so very useful.
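
        The linkage is the one identity a slide rule is built on: log(ab) = log a + log b, so physically adding two lengths proportional to logarithms multiplies the numbers. A quick sketch of the principle:

```python
import math

# A slide rule marks each number at a distance proportional to its log10.
# Sliding one scale along the other adds those lengths, and adding the
# logs is the same as multiplying the numbers.
a, b = 2.0, 3.0
length_sum = math.log10(a) + math.log10(b)  # slide the scales together
product = 10 ** length_sum                  # read the result off the scale
print(round(product, 6))  # 6.0
```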

  • Joseph Nebus 8:56 pm on Friday, 28 November, 2014 Permalink | Reply
    Tags: experiment design, hypercorrection, inventions, knot theory   

    Reading the Comics, November 28, 2014: Greatest Hits Edition? 


    I don’t ever try speaking for Comic Strip Master Command, and it almost never speaks to me, but it does seem like this week’s strips mentioning mathematical themes were trying to stick to the classic subjects: anthropomorphized numbers, word problems, ways to measure time and space, under-defined probability questions, and sudoku. It feels almost like a reunion weekend to have all these topics come together.

    Dan Thompson’s Brevity (November 23) is a return to the world-of-anthropomorphic-numbers kind of joke, and a pun on the arithmetic mean, which is after all the statistic which most lends itself to puns, just edging out the “range” and the “single-factor ANOVA F-Test”.

    Phil Frank and Joe Troise’s The Elderberries (November 23, rerun) brings out word problem humor, using train-leaves-the-station humor as a representative of the kinds of thinking academics do. Nagging slightly at me is that I think the strip had established the Professor as one of philosophy and while it’s certainly not unreasonable for a philosopher to be interested in mathematics I wouldn’t expect this kind of mathematics to strike him as very interesting. But then there is the need to get the idea across in two panels, too.

    Jonathan Lemon’s Rabbits Against Magic (November 25) brings up a way of identifying the time — “half seven” — which recalls one of my earliest essays around here, “How Many Numbers Have We Named?”, because the construction is one that I find charming and that I was glad to hear was still current. “Half seven” strikes me as similar in construction to saying a number as “five and twenty” instead of “twenty-five”, although I’m ignorant as to whether there actually is any similarity.

    Scott Hilburn’s The Argyle Sweater (November 26) brings out a joke that I thought had faded out back around, oh, 1978, when the United States decided it wasn’t going to try converting to metric after all, now that we had two-liter bottles of soda. The curious thing about this sort of hyperconversion (it’s surely a satiric cousin to the hypercorrection that makes people mangle a sentence in the misguided hope of perfecting it) — besides that the “yard” in Scotland Yard is obviously not a unit of measure — is the notion that it’d be necessary to update idiomatic references that contain old-fashioned units of measurement. Part of what makes idioms anything interesting is that they can be old-fashioned while still making as much sense as possible; “in for a penny, in for a pound” is a sensible thing to say in the United States, where the pound hasn’t been legal tender since 1857; why would (say) “an ounce of prevention is worth a pound of cure” be any different? Other than that it’s about the only joke easily found on the ground once you’ve decided to look for jokes in the “systems of measurement” field.

    Mark Heath’s Spot the Frog (November 26, rerun) I’m not sure actually counts as a mathematics joke, although it’s got me intrigued: Surly Toad claims to have a stick in his mouth to use to give the impression of a smile, or 37 (“Sorry, 38”) other facial expressions. The stick’s shown as a bundle of maple twigs, wound tightly together and designed to take shapes easily. This seems to me the kind of thing that’s grown as an application of knot theory, the study of, well, it’s almost right there in the name. Knots, the study of how strings of things can curl over and around and cross themselves (or other strings), seemed for a very long time to be a purely theoretical playground, not least because, to be addressable by theory, the knots had to be made of an imaginary material that could be stretched arbitrarily finely, and could be pushed frictionlessly through itself, which allows for good theoretical work but doesn’t act a thing like a shoelace. Then I think everyone was caught by surprise when it turned out the mathematics of these very abstract knots also describe the way proteins and other long molecules fold, and unfold; and from there it’s not too far to discovering wonderful structures that can change almost by magic with slight bits of pressure. (For my money, the most astounding thing about knots is that you can describe thermodynamics — the way heat works — on them, but I’m inclined towards thermodynamic problems.)

    Veronica was out of town for a week; Archie's test scores improved. This demonstrates that test scores aren't everything.

    Henry Scarpelli and Craig Boldman’s Archie for the 28th of November, 2014. Clearly we should subject this phenomenon to scientific inquiry!

    Henry Scarpelli and Craig Boldman’s Archie (November 28, rerun) offers an interesting problem: when Veronica was out of town for a week, Archie’s test scores improved. Is there a link? This kind of thing is awfully interesting to study, and awfully difficult to: there’s no way to run a truly controlled experiment to see whether Veronica’s presence affects Archie’s test scores. After all, he never takes the same test twice, even if he re-takes a test on the same subject (and even if the re-test were the exact same questions, he would go into it the second time with relevant experience that he didn’t have the first time). And a couple good test scores might be relevant, or might just be luck, or it might be that something else happened to change that week that we haven’t noticed yet. How can you trace down plausible causal links in a complicated system?

    One approach is an experimental design that, at least in the psychology textbooks I’ve read, gets called A-B-A, or A-B-A-B, experiment design: measure whatever it is you’re interested in during a normal time, “A”, before whatever it is whose influence you want to see has taken hold. Then measure it for a time “B” where something has changed, like, Veronica being out of town. Then go back as best as possible to the normal situation, “A” again; and, if your time and research budget allow, going back to another stretch of “B” (and, hey, maybe even “A” again) helps. If there is an influence, it ought to appear sometime after “B” starts, and fade out again after the return to “A”. The more you’re able to replicate this the sounder the evidence for a link is.
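
    The scheme above can be put in numbers. This is a toy simulation under assumed values (a baseline score of 70, a +10 effect present only during the “B” phases, Gaussian noise), just to show how the phase-by-phase means would reveal an influence:

```python
import random
import statistics

random.seed(42)  # fixed seed so the toy run is reproducible

def measure(baseline, effect, n=20, noise=2.0):
    """Simulated scores during one phase: baseline plus effect plus noise."""
    return [random.gauss(baseline + effect, noise) for _ in range(n)]

# A-B-A-B design: the hypothetical influence is present only in B phases.
phases = [("A", 0), ("B", 10), ("A", 0), ("B", 10)]
means = {}
for i, (label, effect) in enumerate(phases):
    scores = measure(baseline=70, effect=effect)
    means[f"{label}{i}"] = statistics.mean(scores)

print(means)
# If the influence is real, the B-phase means rise above the A-phase means,
# and the return-to-A means fall back toward the baseline.
```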

    (We’re actually in the midst of something like this around home: our pet rabbit was diagnosed with a touch of arthritis in his last checkup, but mildly enough and in a strange place, so we couldn’t tell whether it’s worth putting him on medication. So we got a ten-day prescription and let that run its course and have tried to evaluate whether it’s affected his behavior. This has proved difficult to say because we don’t really have a clear way of measuring his behavior, although we can say that the arthritis medicine is apparently his favorite thing in the world, based on his racing up to take the liquid and his trying to grab it if we don’t feed it to him fast enough.)

    Ralph Hagen’s The Barn (November 28) has Rory the sheep wonder about the chances he and Stan the bull should be together in the pasture, given how incredibly vast the universe is. That’s a subtly tricky question to ask, though. If you want to show that everything that ever existed is impossibly unlikely you can work out, say, how many pastures there are on Earth, multiply that by an estimate of how many Earth-like planets there likely are in the universe, and take one divided by that number and marvel at Rory’s incredible luck. But that number’s fairly meaningless: among other obvious objections, wouldn’t Rory wonder the same thing if he were in a pasture with Dan the bull instead? And Rory wouldn’t be wondering anything at all if it weren’t for the accident by which he happened to be born; how impossibly unlikely was that? And that Stan was born too? (And, obviously, that all Rory and Stan’s ancestors were born and survived to the age of reproducing?)

    Except that in this sort of question we seem to take it for granted, for instance, that all Stan’s ancestors would have done their part by existing and bringing Stan around. And we’d take it for granted that the pasture should exist, rather than be a farmhouse or an outlet mall or a rocket base. To come up with odds that mean anything we have to work out what the probability space of all possible relevant outcomes is, and what the set of all conditions that satisfy the concept of “we’re stuck here together in this pasture” is.

    Mark Pett’s Lucky Cow (November 28) brings up sudoku puzzles and the mystery of where they come from, exactly. This prompted me to wonder about the mechanics of making sudoku puzzles and while it certainly seems they could be automated pretty well, making your own amounts to just writing the digits one through nine nine times over, and then blanking out squares until the puzzle is hard. A casual search of the net suggests the most popular way of making sure you haven’t blanked out squares so that the puzzle stops being well-posed (that is, so that there are two or more solutions that fit the revealed information) is to let an automated sudoku solver tell you. That’s true enough but I don’t see any mention of any algorithms by which one could check if you’re blanking out a solution-foiling set of squares. I don’t know whether that reflects there being no algorithm for this that’s more efficient than “try out possible solutions”, or just no algorithm being more practical. It’s relatively easy to make a computer try out possible solutions, after all.
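
    That uniqueness check, letting an automated solver tell you whether a blanked-out grid still has exactly one solution, can be sketched with a small backtracking solver that counts solutions and stops as soon as it finds a second. A minimal sketch, not any particular published generator:

```python
def count_solutions(grid, limit=2):
    """Count completions of a 9x9 sudoku grid (0 = blank), stopping at `limit`.
    A puzzle is well-posed exactly when this returns 1."""
    def ok(r, c, v):
        # v must not repeat in the row, the column, or the 3x3 box
        if any(grid[r][j] == v for j in range(9)):
            return False
        if any(grid[i][c] == v for i in range(9)):
            return False
        br, bc = 3 * (r // 3), 3 * (c // 3)
        return all(grid[br + i][bc + j] != v
                   for i in range(3) for j in range(3))

    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                total = 0
                for v in range(1, 10):
                    if ok(r, c, v):
                        grid[r][c] = v
                        total += count_solutions(grid, limit - total)
                        grid[r][c] = 0
                        if total >= limit:
                            return total
                return total
    return 1  # no blanks left: exactly one completed grid

# A known-valid filled grid, built from a standard shifting pattern
full = [[(i * 3 + i // 3 + j) % 9 + 1 for j in range(9)] for i in range(9)]
puzzle = [row[:] for row in full]
puzzle[0][0] = 0  # blank one square; the column still forces its value
print(count_solutions(puzzle))  # 1: still uniquely solvable
```

    A generator in the spirit the post describes would keep blanking squares, calling this after each removal, and back up whenever the count hits 2.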

    A paper published by Mária Ercsey-Ravasz and Zoltán Toroczkai in Nature Scientific Reports in 2012 describes the recasting of the problem of solving sudoku into a deterministic, dynamical system, and matches the difficulty of a sudoku puzzle to chaotic behavior of that system. (If you’re looking at the article and despairing, don’t worry. Go to the ‘Puzzle hardness as transient chaotic dynamics’ section, and read the parts of the sentence that aren’t technical terms.) Ercsey-Ravasz and Toroczkai point out their chaos-theory-based definition of hardness matches pretty well, though not perfectly, the estimates of difficulty provided by sudoku editors and solvers. The most interesting (to me) result they report is that sudoku puzzles which give you the minimum information — 17 or 18 non-blank numbers to start — are generally not the hardest puzzles. 21 or 22 non-blank numbers seem to match the hardest of puzzles, though they point out that difficulty has got to depend on the positioning of the non-blank numbers and not just how many there are.

     
    • ivasallay 5:35 am on Saturday, 29 November, 2014 Permalink | Reply

      My favorite this time was the Elderberries strip.

      • Joseph Nebus 12:15 am on Monday, 1 December, 2014 Permalink | Reply

        You know, I was busy enough writing about them I didn’t stop to consider which might be my favorite. I’m not sure which is. Spot the Frog fired my imagination the most, at least, since the gadget seems like it could almost be real.
