## My 2019 Mathematics A To Z: Julia set

Today’s A To Z term is my pick again. So I choose the Julia Set. This is named for Gaston Julia, one of the pioneers in chaos theory and fractals. He was born earlier than you imagine. No, earlier than that: he was born in 1893.

The early 20th century saw amazing work done. We think of chaos theory and fractals as modern things, things that require vast computing power to understand. The computers help, yes. But the foundational work was done more than a century ago. Some of these pioneering mathematicians may have been able to get some numerical computing done. But many did not. They would have to do the hard work of thinking about things which they could not visualize. Things which surely did not look like they imagined.

# Julia set.

We think of things as moving. Even static things we consider as implying movement. Else we’d think it odd to ask, “Where does that road go?” This carries over to abstract things, like mathematical functions. A function is a domain, a range, and a rule matching things in the domain to things in the range. It “moves” things as much as a dictionary moves words.

Yet we still think of a function as expressing motion. A common way for mathematicians to write functions uses little arrows, and describes what’s done as “mapping”. We might write $f: D \rightarrow R$. This is a general idea. We’re expressing that it maps things in the set D to things in the set R. We can use the notation to write something more specific. If ‘z’ is in the set D, we might write $f : z \rightarrow z^2 + \frac{1}{2}$. This describes the rule that matches things in the domain to things in the range. $f(2)$ represents the evaluation of this rule at a specific point, the one where the independent variable has the value ‘2’. $f(z)$ represents the evaluation of this rule at a specific point without committing to what that point is. $f(D)$ represents a collection of points. It’s the set you get by evaluating the rule at every point in D.
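If you like seeing notation turned into something executable, here is the same vocabulary in a few lines of Python (a quick sketch of my own; the rule and the set D are just made-up examples):

```python
# The rule f: z -> z^2 + 1/2, as an executable object.
f = lambda z: z ** 2 + 0.5

print(f(2))               # evaluation at the specific point 2 -> 4.5
D = {0, 1, 2}
print({f(z) for z in D})  # f(D): the image of the whole set D
```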

And it’s not bad to think of motion. Many functions are models of things that move. Particles in space. Fluids in a room. Populations changing in time. Signal strengths varying with a sensor’s position. Often we’ll calculate the development of something iteratively, too. If the domain and the range of a function are the same set? There’s no reason that we can’t take our z, evaluate f(z), and then take whatever that thing is and evaluate f(f(z)). And again. And again.

My age cohort, at least, learned to do this almost instinctively when we discovered you could take the result on a calculator and hit a function key again. Calculate something and keep hitting square root; you get a string of numbers that eventually settles on 1, unless you started at zero. Calculate something and keep hitting square; you settle at 0 or at 1, or grow off to infinity. Hitting sine over and over … well, that was interesting, since you seemed to settle on 0 or some other, weird number. (In radians the iterates do creep toward zero, just very slowly.) Same with tangent. Cosine wouldn’t settle down to zero; in radians it heads instead to about 0.739, a fixed point sometimes called the Dottie number.
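You can replay the calculator game in a few lines of Python (a quick sketch of my own, nothing from the calculators of my youth):

```python
import math

def iterate(f, x, times):
    """Apply f to x repeatedly, returning the final value."""
    for _ in range(times):
        x = f(x)
    return x

# Repeated square root: any positive start settles toward 1.
print(iterate(math.sqrt, 1234.0, 60))

# Repeated squaring: magnitudes below 1 collapse to 0; above 1 they blow up.
print(iterate(lambda x: x * x, 0.5, 20))

# Repeated cosine (in radians): settles near 0.739, the Dottie number.
print(iterate(math.cos, 0.0, 100))
```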

Serious mathematicians look at this stuff too, though. Take any set ‘D’, and find what its image is, f(D). Then iterate this, figuring out what f(f(D)) is. Then f(f(f(D))). f(f(f(f(D)))). And so on. What happens if you keep doing this? Like, forever?

We can say some things, at least. Even without knowing what f is. There could be a part of D that all these many iterations of f will send out to infinity. There could be a part of D that all these many iterations will send to some fixed point. And there could be a part of D that just keeps getting shuffled around without ever finishing.

Some of these might not exist. Like, $f: z \rightarrow z + 4$ doesn’t have any fixed points or shuffled-around points. It sends everything off to infinity. $f: z \rightarrow \frac{1}{10} z$ has only a fixed point; nothing from it goes off to infinity and nothing’s shuffled back and forth. $f: z \rightarrow -z$ has a fixed point and a lot of points that shuffle back and forth.
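Here is a quick way to watch all three behaviors at once, in Python (my own toy example):

```python
def orbit(f, z, steps):
    """Return the sequence z, f(z), f(f(z)), ... of length steps + 1."""
    seq = [z]
    for _ in range(steps):
        seq.append(f(seq[-1]))
    return seq

print(orbit(lambda z: z + 4, 1, 5))     # marches off toward infinity
print(orbit(lambda z: z / 10, 1.0, 5))  # collapses toward the fixed point 0
print(orbit(lambda z: -z, 1, 5))        # shuffles back and forth forever
```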

Thinking about these fixed points and these shuffling points gets us Julia Sets. These sets are the fixed points and shuffling-around points for certain kinds of functions. These functions are ones that have domain and range of the complex-valued numbers. Complex-valued numbers are the sum of a real number plus an imaginary number. A real number is just what it says on the tin. An imaginary number is a real number multiplied by $\imath$. What is $\imath$? It’s the imaginary unit. It has the neat property that $\imath^2 = -1$. That’s all we need to know about it.

Oh, also, zero times $\imath$ is zero again. So if you really want, you can say all real numbers are complex numbers; they’re just themselves plus $0 \imath$. Complex-valued functions are worth a lot of study in their own right. Better, they’re easier to study (at the introductory level) than real-valued functions are. This is such a relief to the mathematics major.
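Conveniently, if you want to experiment, Python ships with complex numbers built in; it writes the imaginary unit as `1j`:

```python
# Python's built-in complex type; 1j plays the role of the imaginary unit.
z = 3 + 4j
print(1j ** 2)         # the defining property: the square is -1
print(z.real, z.imag)  # the 'x' and 'y' held in trust inside 'z'
print(abs(z))          # distance from zero, handy for escape tests later
```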

And now let me explain some little nagging weird thing. I’ve been using ‘z’ to represent the independent variable here. You know, using it as if it were ‘x’. This is a convention mathematicians use, when working with complex-valued numbers. An arbitrary complex-valued number tends to be called ‘z’. We haven’t forgotten x, though. We just in this context use ‘x’ to mean “the real part of z”. We also use “y” to carry information about the imaginary part of z. When we write ‘z’ we hold in trust an ‘x’ and ‘y’ for which $z = x + y\imath$. This all comes in handy.

But we still don’t have Julia Sets for every complex-valued function. We need it to be a rational function. The name evokes rational numbers, but that doesn’t seem like much guidance. $f:z \rightarrow \frac{3}{5}$ is a rational function. It seems too boring to be worth studying, though, and it is. A “rational function” is a function that’s one polynomial divided by another polynomial. This whether they’re real-valued or complex-valued polynomials.

So. Start with an ‘f’ that’s one complex-valued polynomial divided by another complex-valued polynomial. Start with the domain D, all of the complex-valued numbers. Find f(D). And f(f(D)). And f(f(f(D))). And so on. If you iterated this ‘f’ without limit, what’s the set of points that never go off to infinity? That’s the Julia Set for that function ‘f’. (Strictly, for polynomials at least, that set of never-escaping points is the filled Julia Set; the Julia Set proper is its boundary. Pop usage blurs the two, and I’ll mostly follow pop usage.)
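Nobody can actually iterate without limit, of course. In practice one settles for a crude escape test: follow a point’s orbit for some fixed number of steps and see whether it ever leaves some fixed radius. A sketch in Python (the radius and the step budget here are my own arbitrary cutoffs):

```python
def stays_bounded(f, z, radius=2.0, max_iter=100):
    """Crude escape test: does the orbit of z under f stay within
    the given radius for max_iter steps? Both limits are arbitrary
    stand-ins for 'iterate without limit'."""
    for _ in range(max_iter):
        z = f(z)
        if abs(z) > radius:
            return False
    return True

# For f(z) = z^2, the never-escaping points are exactly the unit disk.
f = lambda z: z * z
print(stays_bounded(f, 0.5))  # inside the disk: stays bounded
print(stays_bounded(f, 1.5))  # outside the disk: runs off to infinity
```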

There are some famous Julia sets, though. There are the Julia sets that we heard about during the great fractal boom of the 1980s. This was when computers got cheap enough, and their graphic abilities good enough, to automate the calculation of points in these sets. At least to approximate the points in these sets. And these are based on some nice, easy-to-understand functions. First, you have to pick a constant C. This C is drawn from the complex-valued numbers. But that can still be, like, ½, if that’s what interests you. For whatever your C is? Define this function:

$f_C: z \rightarrow z^2 + C$

And that’s it. Yes, this is a rational function. The numerator function is $z^2 + C$. The denominator function is $1$.

This produces many different patterns. If you picked C = 0, you get a circle. Good on you for starting out with something you could double-check. If you picked C = -2? You get a long skinny line, again, easy enough to check. If you picked C = -1? Well, now you have a nice interesting weird shape, several bulging ovals with peninsulas of other bulging ovals all over. Pick other numbers. Pick numbers with interesting imaginary components. You get pinwheels. You get jagged streaks of lightning. You can even get separate islands, whole clouds of disjoint threatening-looking blobs.
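If you want to see these shapes without any graphics library, a throwaway text rendering will do. This is my own quick sketch; the viewing window, escape radius, and iteration cap are all arbitrary choices:

```python
def julia_rows(C, width=60, height=24, max_iter=40):
    """Rows of a crude text picture of the (filled) Julia set for
    f(z) = z^2 + C over the window [-2, 2] x [-1.5, 1.5].
    '#' marks points whose orbit hasn't left |z| > 2 within max_iter."""
    rows = []
    for r in range(height):
        y = 1.5 - 3.0 * r / (height - 1)
        line = []
        for c in range(width):
            x = -2.0 + 4.0 * c / (width - 1)
            z = complex(x, y)
            for _ in range(max_iter):
                z = z * z + C
                if abs(z) > 2:
                    line.append(' ')
                    break
            else:
                line.append('#')
        rows.append(''.join(line))
    return rows

print('\n'.join(julia_rows(0)))   # a filled disk: the easy-to-check case
print('\n'.join(julia_rows(-1)))  # bulging ovals budding off more ovals
```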

There is some predicting what you’ll get, at least. If you work out a Julia Set for a particular C, you’ll see a similar-looking Julia Set for a different C that’s very close to it. This is a comfort.

You can create a Julia Set for any rational function. I’ve only ever seen anyone actually do it for functions that look like what we already had. $z^3 + C$. Sometimes $z^4 + C$. I suppose once, in high school, I might have tried $z^5 + C$, but I don’t remember what it looked like. If someone’s done, say, $\frac{1}{z^2 + C}$, please write in and let me know what it looks like.

The Julia Set has a famous partner. Maybe the most famous fractal of them all, the Mandelbrot Set. That’s the strange blobby sea surrounded by lightning bolts that you see on the cover of every pop mathematics book from the 80s and 90s. If a C gives us a Julia Set that’s one single, contiguous patch? Then that C is in the Mandelbrot Set. Also vice-versa.
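The usual approximate test for Mandelbrot membership iterates $z \rightarrow z^2 + C$ starting from zero and watches whether the orbit ever leaves the circle of radius 2; once it does, it’s guaranteed to run off to infinity. A sketch, with an arbitrary iteration budget of my own choosing:

```python
def in_mandelbrot(C, max_iter=500):
    """Approximate membership test: iterate z -> z^2 + C from z = 0
    and report whether the orbit stays within |z| <= 2."""
    z = 0
    for _ in range(max_iter):
        z = z * z + C
        if abs(z) > 2:
            return False
    return True

print(in_mandelbrot(0))    # True: its Julia Set is a connected circle
print(in_mandelbrot(-1))   # True: the bulging-ovals Julia Set is connected
print(in_mandelbrot(0.5))  # False: that Julia Set breaks into disjoint dust
```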

The ideas behind these sets are old. Julia’s paper about the iterations of rational functions first appeared in 1918. Julia died in 1978, the same year that the first computer rendering of the Mandelbrot set was done. I haven’t been able to find whether that rendering existed before his death. Nor have I decided which I would think the better sequence.

Thanks for reading. All of the Fall 2019 A To Z posts should be at this link. And next week I hope to get to the letters ‘K’ and ‘L’. Sunday, yes, I hope to get back to the comics.

## Reading the Comics, April 1, 2017: Connotations Edition

Last week ended with another little string of mathematically-themed comic strips. Most of them invited, to me, talk about the cultural significance of mathematics and the connotations it carries. So, this title for an artless essay.

Berkeley Breathed’s Bloom County 2017 for the 28th of March uses “two plus two equals” as the definitive, inarguable truth. It always seems to be “two plus two”, doesn’t it? Never “two plus three”, never “three plus three”. I suppose I’ve sometimes seen “one plus one” or “two times two”. It’s easy to see why it should be a simple arithmetic problem, nothing with complicated subtraction or division or numbers as big as six. Maybe the percussive alliteration of those repeated twos drives the phrase’s success. But then why doesn’t “two times two” show up nearly as often? Maybe the phrase isn’t iambic enough. “Two plus two” lets (to my ear) the “plus” sink in emphasis, while “times” stays a little too prominent. We need a wordsmith in to explore it. (I’m open to other hypotheses, including that “two times two” gets used more than my impression says.)

Christiann MacAuley’s Sticky Comics for the 28th uses mathematics as the generic “more interesting than people” thing that nerds think about. The thing being thought of there is the Mandelbrot Set. It’s built on complex-valued numbers. Pick a complex number, any you like; that’s called ‘C’. Square the number and add ‘C’ to the result. This will be some new complex-valued number. Square that new number and add the original ‘C’ to it again. Square that new number and add the original ‘C’ back once more. And keep at this. There are two things that might happen. These squared numbers might keep growing infinitely large. They might be negative, or imaginary, or (most likely) complex-valued, but their size keeps growing. Or these squared numbers might not grow arbitrarily large. The Mandelbrot Set is the collection of ‘C’ values for which the numbers don’t just keep growing in size. That’s the sort of lumpy kidney bean shape with circles and lightning bolts growing off it that you saw on every pop mathematics book during the Great Fractal Boom of the 80s and 90s. There’s almost no point working it out in your head; the great stuff about fractals almost requires a computer. They take a lot of computation. But if you’re just avoiding conversation, well, anything will do.
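You can watch that iteration directly; here is a quick sketch of my own, tracking just the sizes of the iterates:

```python
def orbit_sizes(C, steps=8):
    """Sizes of the first few iterates of z -> z^2 + C, starting from C."""
    z = C
    sizes = []
    for _ in range(steps):
        sizes.append(abs(z))
        z = z * z + C
    return sizes

print(orbit_sizes(1))   # sizes grow without bound: 1 is outside the set
print(orbit_sizes(-1))  # sizes bounce between 1 and 0: -1 is in the set
```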

Olivia Walch’s Imogen Quest for the 29th riffs on the universe-as-simulation hypothesis. It’s one of those ideas that catches the mind and is hard to refute as long as we don’t talk to the people in the philosophy department, which we’re secretly scared of. Anyway the comic shows one of the classic uses of statistical modeling: try out a number of variations of a model in the hopes of understanding real-world behavior. This is an often-useful way to balance how the real world has stuff going on that’s important and that we don’t know about, or don’t know how to handle exactly.

Mason Mastroianni’s The Wizard of Id for the 31st uses a sprawl of arithmetic as symbol of … well, of status, really. The sort of thing that marks someone a white-collar criminal. I suppose it also fits with the suggestion of magic that accompanies huge sprawls of mathematical reasoning. Bundle enough symbols together and it looks like something only the intellectual aristocracy, or at least secret cabal, could hope to read.

Bob Shannon’s Tough Town for the 1st name-drops arithmetic. And shows off the attitude that anyone we find repulsive must also be stupid, as proven by their being bad at arithmetic. I admit to having no discernible feelings about the Kardashians; but I wouldn’t be so foolish as to conflate intelligence and skill-at-arithmetic.

I’ve found a good way to procrastinate on the next essay in the Why Stuff Can Orbit series. (I’m considering explaining all of differential calculus, or as much as anyone really needs, to save myself a little work later on.) In the meanwhile, though, here’s some interesting reading that’s come to my attention the last few weeks and that you might procrastinate your own projects with. (Remember Benchley’s Principle!)

First is Jeremy Kun’s essay Habits of highly mathematical people. I think it’s right in describing some of the worldview mathematics training instills, or that encourages people to become mathematicians. It does seem to me, though, that most everything Kun describes is also true of philosophers. I’m less certain, but I strongly suspect, that it’s also true of lawyers. These concentrations all tend to encourage thinking about what we mean by things, and to test those definitions by thought experiments. If we suppose this to be true, then what implications would it have? What would we have to conclude is also true? Does it include anything that would be absurd to say? And are the results useful enough that we can accept a bit of apparent absurdity?

New York magazine had an essay: Jesse Singal’s How Researchers Discovered the Basketball “Hot Hand”. The “Hot Hand” phenomenon is one every sports enthusiast, and most casual fans, know: sometimes someone is just playing really, really well. The problem has always been figuring out whether it exists. Do anything that isn’t a sure bet long enough and there will be streaks. There’ll be a stretch where it always happens; there’ll be a stretch where it never does. That’s how randomness works.

But it’s hard to show that. The messiness of the real world interferes. A chance of making a basketball shot is not some fixed thing over the course of a career, or over a season, or even over a game. Sometimes players do seem to be hot. Certainly anyone who plays anything competitively experiences a feeling of being in the zone, during which stuff seems to just keep going right. It’s hard to disbelieve something that you witness, even experience.

So the essay describes some of the challenges of this: coming up with a definition of a “hot hand”, for one. Coming up with a way to test whether a player has a hot hand. Seeing whether they’re observed in the historical record. Singal’s essay writes about some of the history of studying hot hands. There is a lot of probability, and of psychology, and of experimental design in it.

And then there’s this intriguing question Analysis Fact Of The Day linked to: did Gaston Julia ever see a computer-generated image of a Julia Set? There are many Julia Sets; they and their relative, the Mandelbrot Set, became trendy in the fractals boom of the 1980s. If you knew a mathematics major back then, there was at least one on her wall. It typically looks like a craggly, lightning-rimmed cloud. Its shapes are not easy to imagine. It’s almost designed for the computer to render. Gaston Julia died in March of 1978. Could he have seen a depiction?

It’s not clear. The linked discussion digs up early computer renderings. It also brings up an example of a late-19th-century hand-drawn depiction of a Julia-like set, and compares it to a modern digital rendition of the thing. Numerical simulation saves a lot of tedious work; but it’s always breathtaking to see how much can be done by reason.