## The Summer 2017 Mathematics A To Z: Integration

One more mathematics term suggested by Gaurish for the A-To-Z today, and then I’ll move on to a couple of others. Today’s is a good one.

# Integration.

Stand on the edge of a plot of land. Walk along its boundary. As you walk the edge pay attention. Note how far you walk before changing direction, even in the slightest. When you return to where you started consult your notes. Contained within them is the area you circumnavigated.

If that doesn’t startle you perhaps you haven’t thought about how odd that is. You don’t ever touch the interior of the region. You never do anything like see how many standard-size tiles would fit inside. You walk a path that is as close to one-dimensional as your feet allow. And encoded in there somewhere is an area. Stare at that incongruity and you realize why integrals baffle the student so. They have a deep strangeness embedded in them.

We who do mathematics have always liked integrals. They grow, in the western tradition, out of geometry. Given a shape, what is a square that has the same area? There are shapes it’s easy to find the area for, given only straightedge and compass: a rectangle? Easy. A triangle? Just as straightforward. A polygon? If you know triangles then you know polygons. A lune, the crescent-moon shape formed by taking a circular cut out of a circle? We can do that. (If the cut is the right size.) A circle? … All right, we can’t do that, but we spent two thousand years trying before we found that out for sure. And we can do some excellent approximations.

That bit of finding-a-square-with-the-same-area was called “quadrature”. The name survives, mostly in the phrase “numerical quadrature”. We use that to mean that we computed an integral’s approximate value, instead of finding a formula that would get it exactly. The otherwise obvious choice of “numerical integration” we use already. It describes computing the solution of a differential equation. We’re not trying to be difficult about this. Solving a differential equation is a kind of integration, and we need to do that a lot. We could recast a solving-a-differential-equation problem as a find-the-area problem, and vice-versa. But that’s bother, if we don’t need to, and so we talk about numerical quadrature and numerical integration.
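To make the quadrature idea concrete, here is a minimal sketch of numerical quadrature in Python, using the trapezoid rule; the function and interval are my own example, not anything from the text above:

```python
def trapezoid(f, a, b, n):
    """Approximate the integral of f over [a, b] using n trapezoids."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for k in range(1, n):
        total += f(a + k * h)
    return total * h

# The area under f(x) = x**2 from 0 to 1 is exactly 1/3.
approx = trapezoid(lambda x: x * x, 0.0, 1.0, 1000)
```

With a thousand trapezoids the answer agrees with 1/3 to better than five decimal places, which is the "near enough right" that numerical quadrature trades exactness for.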

Integrals are built on two infinities. This is part of why it took so long to work out their logic. One is the infinity of number; we find an integral’s value, in principle, by adding together infinitely many things. The other is an infinity of smallness. The things we add together are infinitesimally small. That we need to take things, each smaller than any number yet somehow not zero, and in such quantity that they add up to something, seems paradoxical. The integral’s geometric origins had to be merged into those of arithmetic and algebra, and that was not easy. Bishop George Berkeley made a steady name for himself in calculus textbooks by pointing this out. We have worked out several logically consistent schemes for evaluating integrals. They work, mostly, by showing that we can make the error caused by approximating the integral smaller than any margin we like. This is a standard trick, or at least it is now that we know it.

That “in principle” above is important. We don’t actually work out an integral by finding the sum of infinitely many, infinitely tiny, things. It’s too hard. I remember in grad school the analysis professor working out by the proper definitions the integral of 1. This is as easy an integral as you can do without just integrating zero. He escaped with his life, but it was a close scrape. He offered the integral of x as a way to test our endurance, without actually doing it. I’ve never made it through that.
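We can at least watch the sums that the proper definition demands squeeze toward the answer on a computer. A sketch, using left-endpoint Riemann sums for the integral of x over [0, 1]; the code is my own illustration, not the professor's workings:

```python
def riemann_sum(f, a, b, n):
    """Left-endpoint Riemann sum: n rectangles of equal width over [a, b]."""
    h = (b - a) / n
    return sum(f(a + k * h) for k in range(n)) * h

# The integral of f(x) = x over [0, 1] is 1/2; the sums close in on it.
estimates = [riemann_sum(lambda x: x, 0.0, 1.0, n) for n in (10, 100, 1000)]
```

The three estimates come out 0.45, 0.495, and 0.4995: each factor of ten in rectangles shaves the error by a factor of ten, which is the "smaller than any margin we like" business happening before your eyes.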

But we do integrals anyway. We have tools on our side. We can show, for example, that if a function obeys some common rules then we can use simpler formulas. Ones that don’t demand so many symbols in such tight formation. Ones that we can use in high school. Also, ones we can adapt to numerical computing, so that we can let machines give us answers which are near enough right. We get to choose how near is “near enough”. But then the machines decide how long we’ll have to wait to get that answer.

The greatest tool we have on our side is the Fundamental Theorem of Calculus. Even the name promises it’s the greatest tool we might have. This rule tells us how to connect integrating a function to differentiating another function. If we can find a function whose derivative is the thing we want to integrate, then we have a formula for the integral. It’s that function we found. What a fantastic result.
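You can check the theorem numerically, too. Here the sum of many tiny slices of the cosine matches the difference of the sine, its antiderivative, at the endpoints; the particular function is my own example:

```python
import math

def midpoint_sum(f, a, b, n=100000):
    """Midpoint-rule approximation to the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# Fundamental Theorem: integral of cos from 0 to pi/2 is sin(pi/2) - sin(0).
numeric = midpoint_sum(math.cos, 0.0, math.pi / 2)
exact = math.sin(math.pi / 2) - math.sin(0.0)
```

The two agree to better than six decimal places, which is the theorem doing exactly what it promises: the laborious sum collapses into evaluating one function twice.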

The trouble is it’s so hard to find functions whose derivatives are the thing we wanted to integrate. There are a lot of functions we can find, mind you. If we want to integrate a polynomial it’s easy. Sine and cosine and even tangent? Yeah. Logarithms? A little tedious but all right. A constant number raised to the power x? Also tedious but doable. A constant number raised to the power x²? Hold on there, that’s madness. No, we can’t do that.

There is a weird grab-bag of functions we can find these integrals for. They’re mostly ones we can find some integration trick for. An integration trick is some way to turn the integral we’re interested in into a couple of integrals we can do and then mix back together. A lot of a Freshman Calculus course is a heap of tricks we’ve learned. They have names like “u-substitution” and “integration by parts” and “trigonometric substitution”. Some of them are really exotic, such as turning a single integral into a double integral because that leads us to something we can do. And there’s something called “differentiation under the integral sign” that I don’t know of anyone actually using. People know of it because Richard Feynman, in his fun memoir What Do You Care What Other People Think: 250 Pages Of How Awesome I Was In Every Situation Ever, mentions how awesome it made him in so many situations. Mathematics, physics, and engineering nerds are required to read this at an impressionable age, so we fall in love with a technique no textbook ever mentions. Sorry.

I’ve written about all this as if we were interested just in areas. We’re not. We like calculating lengths and volumes and, if we dare venture into more dimensions, hypervolumes and the like. That’s all right. If we understand how to calculate areas, we have the tools we need. We can adapt them to as many or as few dimensions as we need. By weighting integrals we can do calculations that tell us about centers of mass and moments of inertia, about the most and least probable values of something, about all of quantum mechanics.
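As a small illustration of what a weighted integral buys you, here is a center-of-mass calculation in Python; the rod and its density are my own example:

```python
def integrate(f, a, b, n=10000):
    """Midpoint-rule integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# A rod on [0, 1] whose density grows linearly: rho(x) = x.
rho = lambda x: x
mass = integrate(rho, 0.0, 1.0)                     # integral of rho:   1/2
moment = integrate(lambda x: x * rho(x), 0.0, 1.0)  # integral of x*rho: 1/3
center_of_mass = moment / mass                      # 2/3
```

The center of mass lands at 2/3 of the way along, closer to the heavy end, and the only new idea was weighting the integrand by x before integrating.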

As often happens, this powerful tool starts with something anyone might ponder: what size square has the same area as this other shape? All the rest comes of thinking seriously about that question.

## Reading the Comics, June 26, 2016: June 23, 2016 Plus Golden Lizards Edition

And now for the huge pile of comic strips that had some mathematics-related content on the 23rd of June. I admit some of them are just using mathematics as a stand-in for “something really smart people do”. But first, another moment with the Magic Realism Bot:

So, you know, watch the lizards and all.

Tom Batiuk’s Funky Winkerbean name-drops E = mc² as the sort of thing people respect. If the strip seems a little baffling then you should know that Mason’s last name is Jarr. He was originally introduced as a minor player in a storyline that wasn’t about him, so the name just had to exist. But since then Tom Batiuk’s decided he likes the fellow and promoted him to major-player status. And maybe Batiuk regrets having a major character with a self-consciously Funny Name, which is an odd thing considering he named his long-running comic strip for original lead character Funky Winkerbean.

Charlie Podrebarac’s CowTown depicts the harsh realities of Math Camp. I assume they’re the realities. I never went to one myself. And while I was on the Physics Team in high school I didn’t make it over to the competitive mathematics squad. Yes, I noticed that the not-a-numbers-person Jim Smith can’t come up with anything other than the null symbol, representing nothing, not even zero. I like that touch.

Ryan North’s Dinosaur Comics rerun is about Richard Feynman, the great physicist whose classic memoir What Do You Care What Other People Think? is hundreds of pages of stories about how awesome he was. Anyway, the story goes that Feynman noticed one of the sequences of digits in π and thought of the joke which T-Rex shares here.

π is believed but not proved to be a “normal” number. This means several things. One is that any finite sequence of digits you like should appear in its representation, somewhere. Feynman and T-Rex look for the sequence ‘999999’, which sure enough happens less than eight hundred digits past the decimal point. Lucky stroke there. There’s no reason to suppose the sequence should be anywhere near the decimal point. There’s no reason to suppose the sequence has to be anywhere in the finite number of digits of π that humanity will ever know. (This is why Carl Sagan’s novel Contact, which has as a plot point the discovery of a message apparently encoded in the digits of π, is not building on a stupid idea. That any finite message exists somewhere is kind-of certain. That it’s findable is not.)
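If you’d like to watch the lucky stroke happen, here is a sketch in Python. It computes the digits of π by Machin’s formula, a standard method, using only integer arithmetic; the code itself is my own illustration:

```python
def arctan_inv(x, one):
    """arctan(1/x), scaled up by `one`, via the alternating Gregory series."""
    term = one // x
    total = term
    x2 = x * x
    divisor = 1
    while term:
        term //= x2
        divisor += 2
        total -= term // divisor  # subtract the odd-numbered terms
        term //= x2
        divisor += 2
        total += term // divisor  # add the even-numbered terms
    return total

# Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239).
one = 10 ** 820  # 800 digits, plus guard digits against truncation error
pi_scaled = 16 * arctan_inv(5, one) - 4 * arctan_inv(239, one)
digits = str(pi_scaled)[1:801]        # the 800 digits after the decimal point
position = digits.find('999999') + 1  # 1-indexed decimal place
```

Run it and `position` comes out 762: the run of six 9s, sometimes called the Feynman point, starts at the 762nd decimal place.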

e, mentioned in the last panel, is similarly thought to be a normal number. It’s also not proved to be. We are able to say that nearly all numbers are normal. It’s much the way we can say nearly all numbers are irrational. But it is hard to prove that any particular number is. I believe that the only numbers humans have proved to be normal are a handful of freaks created to show normal numbers exist. I don’t know of any number that’s interesting in its own right that’s also been shown to be normal. We just know that almost all numbers are.

But it is imaginable that π or e aren’t. They look like they’re normal, based on how their digits are arranged. It’s an open question and someone might make a name for herself by answering the question. It’s not an easy question, though.

Missy Meyer’s Holiday Doodles breaks the news to me that the 23rd was SAT Math Day. I had no idea and I’m not sure what that even means. The doodle does use the classic “two trains leave Chicago” introduction, the “it was a dark and stormy night” of Boring High School Algebra word problems.

Stephan Pastis’s Pearls Before Swine is about everyone who does science and mathematics popularization, and what we worry someone’s going to reveal about us. Um. Except me, of course. I don’t do this at all.

Ashleigh Brilliant’s Pot-Shots rerun is a nice little averages joke. It does highlight something which looks paradoxical, though. Typically if you look at the distributions of values of something that can be measured you get a bell curve, like Brilliant drew here. The value most likely to turn up — the mode, mathematicians say — is also the arithmetic mean. “The average” is what everybody except mathematicians says. And even they say that most of the time. But almost nobody is at the average.

Looking at a drawing, Brilliant’s included, explains why. The exact average is a tiny slice of all the data, the “population”. Look at the area in Brilliant’s drawing underneath the curve that’s just the blocks underneath the upside-down fellow. Most of the area underneath the curve is away from that.

There are a lot of results that are close to but not exactly at the arithmetic mean. Most of the results are going to be close to the arithmetic mean. Look at how much area there is under the curve and within four vertical lines of the upside-down fellow. That’s nearly everything. So we have this apparent contradiction: the most likely result is the average. But almost nothing is average. And yet almost everything is nearly average. This is why statisticians have their own departments, or get to make the mathematics department brand itself the Department of Mathematics and Statistics.
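The arithmetic behind the apparent contradiction can be sketched with the normal distribution’s error function; the particular slice widths here are my own choices, not anything from the strip:

```python
import math

def prob_within(z):
    """P(|X - mean| < z standard deviations) for a normal distribution."""
    return math.erf(z / math.sqrt(2))

thin_slice = prob_within(0.05)  # a sliver right at the average: about 4%
wide_band = prob_within(2.0)    # a broad band around it: about 95%
```

The sliver at the exact average holds only a few percent of the population, while the band around it holds nearly everyone: almost nobody is average, almost everybody is nearly average.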

## My Math Blog Statistics, October 2014

So now let me go over the mathematics blog statistics for October. I’ll get to listing countries; people like that.

It was a good month in terms of getting people to read: total number of pages viewed was 625, up from 558, and this is the fourth-highest month on record. The number of unique visitors was up too, from 286 in September to 323 in October, and that’s the third-highest since WordPress started giving me those statistics. The views per visitor barely changed, going from 1.95 to 1.93, which I’m comfortable supposing is a statistical tie. I reached 18,507 total page views by the end of October, and maybe I’ll reach that nice round-ish 19,000 by the end of November.

The countries sending me the most visitors were the usual set: the United States with 393, the United Kingdom with 35, and Austria with 23. Curiously, Argentina sent me 20 readers, while Canada plummeted down to a mere nine. Did I say something wrong, up there? On the bright side my Indian readership has grown to nine, which is the kind of trend I like. Sending just a single reader this past month were Albania, Brazil, Denmark, Estonia, Finland, Indonesia, Japan, the Netherlands, Nicaragua, Norway, Poland, Saint Kitts and Nevis, Serbia, Spain, Sweden, Taiwan, Turkey, and the United Arab Emirates. Brazil, Estonia, Finland, the Netherlands, and Sweden were single-reader countries last month, and Finland and Sweden also the month before. I feel embarrassed by the poor growth in my Scandinavian readership, but at least it isn’t dwindling.

The most popular posts in October got a little bit away from the comics posts; the ones most often read were:

There weren’t any really great bits of search term poetry this month, but there were still some evocative queries that brought people to me, among them:

• where did negative numbers come from
• show me how to make a comic stip for rationalnumbers
• desert island logarithm
• herb jamaal math ludwig
• in the figure shown below, Δabc and Δdec are right triangles. if de = 6, ab = 20, and be = 21, what is the area of Δdec?
• origin is the gateway to your entire gaming universe.

That “origin is the gateway” thing has come up before. I still don’t know what it means. I’m a little scared by it.

## How Richard Feynman Got From The Square Root of 2 to e

I wanted to bring to greater prominence something that might have got lost in comments. Elke Stangl, author of the Theory And Practice Of Trying To Combine Just Anything blog, noticed that among the Richard Feynman Lectures on Physics, and available online, is his derivation of how to discover e — the base of natural logarithms — from playing around.

e is an important number, certainly, but it’s tricky to explain why it’s important; it hasn’t got a catchy definition like pi has, and even the description that most efficiently says why it’s interesting (“the base of the natural logarithm”) sounds perilously close to technobabble. As an explanation for why e should be interesting Feynman’s text isn’t economical — I make it out as something around two thousand words — but it’s a really good explanation since it starts from a good starting point.

That point is: it’s easy to understand what you mean by raising a number, say 10, to a positive integer: $10^4$, for example, is four tens multiplied together. And it doesn’t take much work to extend that to negative numbers: $10^{-4}$ is one divided by the product of four tens multiplied together. Fractions aren’t too bad either: $10^{1/2}$ would be the number which, multiplied by itself, gives you 10. $10^{3/2}$ would be $10^{1/2}$ times $10^{1/2}$ times $10^{1/2}$; or if you think this is easier (it might be!), the number which, multiplied by itself, gives you $10^3$. But what about the number $10^{\sqrt{2}}$? And if you can work that out, what about the number $10^{\pi}$?

There’s a pretty good, natural way to go about working that out, and as Feynman shows, doing so reveals there’s something special about a particular number pretty close to 2.71828.
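The core of the idea can be compressed into a few lines of Python; this is my own condensation of the playing-around, not Feynman’s text:

```python
import math

# For a tiny exponent eps, 10**eps is close to 1 + (something) * eps.
# That "something" is the natural logarithm of 10. e is the base for
# which the something would be exactly 1, so we can recover it:
eps = 2.0 ** -20
slope = (10.0 ** eps - 1.0) / eps   # approaches ln(10), about 2.302585
e_approx = 10.0 ** (1.0 / slope)    # approaches 2.71828...
```

Shrink `eps` further and `e_approx` settles onto 2.71828…, the special number Feynman’s longer derivation arrives at.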