So here are a couple of things I haven’t had the time to read and think about, but that I want someone to, possibly even me. First, a chain reference:
Paulos’s link in that URL was mistaken, and he posted a correction in one of the responses to it. But it’s about this:
And ultimately it’s about what seems like a ridiculously impossible condition. Suppose that you have two games, both of which you expect to lose. Or two strategies for playing a game, both of which you expect will lose. How do you apply them so that you maximize your chance of winning? Indeed, under the right circumstances, how can you have a better-than-50% chance of winning? I have actually read this, but what I haven’t had is the chance to think about it. It may come in handy for pinball league, though.
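What’s being described here is, I’m pretty sure, Parrondo’s paradox, and you can watch it happen with a few lines of code. Here’s a minimal Monte Carlo sketch in Python, using the classic textbook pair of games (my choice of parameters, not necessarily what the linked piece uses): game A is a coin flip that’s very slightly unfair, and game B has terrible odds when your bankroll is a multiple of 3 and good odds otherwise.

```python
import random

EPS = 0.005  # the small bias that makes each game a loser on its own

def play_a(capital):
    # Game A: win 1 with probability 1/2 - eps, else lose 1
    return capital + (1 if random.random() < 0.5 - EPS else -1)

def play_b(capital):
    # Game B: the odds depend on whether capital is a multiple of 3
    p = (0.1 - EPS) if capital % 3 == 0 else (0.75 - EPS)
    return capital + (1 if random.random() < p else -1)

def average_final(step, rounds=1000, trials=2000):
    # Average bankroll after `rounds` plays, starting from 0
    total = 0
    for _ in range(trials):
        capital = 0
        for _ in range(rounds):
            capital = step(capital)
        total += capital
    return total / trials

print("A alone:    ", average_final(play_a))   # comes out negative
print("B alone:    ", average_final(play_b))   # comes out negative
print("random mix: ", average_final(
    lambda c: play_a(c) if random.random() < 0.5 else play_b(c)))  # positive
```

Each game alone drifts downward, but flipping a fair coin each round to decide which losing game to play drifts upward. Roughly speaking, game A’s churn shakes your bankroll off the multiples of 3 where game B does its worst.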
Here, MikesMathPage posts A simplified version of the Banach-Tarski paradox for kids. The Banach-Tarski paradox is one of those things I’m surprised isn’t more common in pop mathematics. It offers this wondrous and absolutely anti-intuitive consequence. Take a sphere the size of a golf ball. Slice it perfectly, using mathematically precise tools that could subdivide atoms, that is, more perfectly than mere matter could ever do. Cut it into pieces and take them apart. Then reassemble the pieces. You have two spheres, each the same size as the original golf ball. (A stronger version of the theorem lets you reassemble the golf ball’s pieces into a sphere the size of a planet.) You can see why, when you get this as a mathematics major, the instinct is to say you’ve heard something wrong. There being as many rationals as whole numbers, sure. There being more irrational numbers than rationals, that’s fine. There being as many points in a finite line segment as in an infinitely large ten-dimensional volume of space? Shaky, but all right. But this? This? Still, you can kind of imagine that, well, maybe there’s some weird thing where you make infinitely many cuts into uncountably infinitely many pieces, and then you find out you just need five slices. Four, if you don’t use the point at the very center of the golf ball. Then you get cranky. Anyway, the promise of the title, forming a version of this that kids will be comfortable with, is a big one.
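As I understand it, the engine underneath the paradox is a paradoxical decomposition of the free group on two generators; two suitably chosen rotations of the sphere give you a copy of that group. The group-theory half of the trick is small enough to poke at with ordinary code. Here’s a little Python experiment of my own devising, not anything from MikesMathPage’s post: among reduced words in the letters a, b and their inverses, the words starting with a, together with a left-shifted copy of the words starting with a’s inverse, cover the whole group.

```python
from itertools import product

LETTERS = "aAbB"                      # A stands for a^-1, B for b^-1
INV = {"a": "A", "A": "a", "b": "B", "B": "b"}

def is_reduced(word):
    # Reduced: no letter sits immediately next to its own inverse
    return all(INV[x] != y for x, y in zip(word, word[1:]))

def reduce(word):
    # Cancel adjacent inverse pairs (aA, Bb, ...) until none remain
    stack = []
    for ch in word:
        if stack and stack[-1] == INV[ch]:
            stack.pop()
        else:
            stack.append(ch)
    return "".join(stack)

N = 7   # truncate the (infinite) free group at words of length N
F2 = {""} | {
    "".join(w)
    for k in range(1, N + 1)
    for w in product(LETTERS, repeat=k)
    if is_reduced("".join(w))
}

starts_with_a = {w for w in F2 if w.startswith("a")}
starts_with_A = {w for w in F2 if w.startswith("A")}

# Multiply every word in S(a^-1) on the left by a:
shifted = {reduce("a" + w) for w in starts_with_A}

# The shifted set is exactly everything that does NOT start with a
# (up to the truncation boundary at length N):
complement = {w for w in F2 if not w.startswith("a") and len(w) < N}
print(shifted == complement)   # True
```

So the words starting with a, plus one shifted piece, rebuild the whole group. Do the same with b and the leftover pieces and you’ve covered the group a second time, which is the seed of cutting one ball into pieces that reassemble into two.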
This one I’m pretty sure I ended up at by way of Analysis Fact of the Day. John D Cook’s Cover time of a graph: cliques, chains, and lollipops is about graphs. I mean graph theory graphs, which look kind of like those mass transit diagrams that resemble circuit boards. All dots and lines connecting them. Cook’s question: how long does it take to visit every point in one of these graphs if you take a random walk? That is, each time you’re at a stop, you take one of the paths out of it at random, with an equal chance of taking any of the paths connected there. There are some obviously interesting shapes, and Cook looks into how you walk over them.
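If you want to feel how different the shapes are before digging into Cook’s analysis, a rough Monte Carlo experiment does it. This is my own toy version, not code from Cook’s post, with graph sizes and trial counts picked arbitrarily:

```python
import random

def chain(n):
    # A chain: stops 0..n-1 in a line
    return {i: [j for j in (i - 1, i + 1) if 0 <= j < n] for i in range(n)}

def clique(n):
    # A clique: every stop connected to every other
    return {i: [j for j in range(n) if j != i] for i in range(n)}

def lollipop(m, n):
    # A clique on stops 0..m-1 with a chain of n more stops hung off stop m-1
    g = clique(m)
    for k in range(m, m + n):
        g[k] = [k - 1]
        g[k - 1].append(k)
    return g

def cover_time(g, start=0, trials=200):
    # Average number of steps a random walk needs to visit every stop
    total = 0
    for _ in range(trials):
        v, seen, steps = start, {start}, 0
        while len(seen) < len(g):
            v = random.choice(g[v])
            seen.add(v)
            steps += 1
        total += steps
    return total / trials

n = 30
print("chain:   ", cover_time(chain(n)))
print("clique:  ", cover_time(clique(n)))
print("lollipop:", cover_time(lollipop(20, 10)))
```

The numbers come out orders apart: the clique is covered in about n log n steps (it’s the coupon collector’s problem in disguise), the chain needs on the order of n² steps, and the lollipop, if I remember right, is the notoriously slow case, cubic in the number of stops, since a walk started in the clique keeps falling back in before it reaches the far end of the chain.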
That should do for now. I really need to get caught up on my reading. Please let me know if I’ve made a disastrous mistake with any of this.