The reruns of Donald Duck comics which appear at creators.com recently offered the above daily strip, featuring Ludwig von Drake and one of those computers of the kind movies and TV shows and comic strips had before anybody had computers of their own, and, of course, the classic IBM motto that maybe they still have, though I never hear anyone talk about it except as something from the distant and musty past. (Unfortunately, creators.com doesn’t note the date a strip originally ran, so all I can say is that the strip first ran sometime after September of 1961 and before whenever Disney stopped having original daily strips drawn; I haven’t been able to find any hint of when *that* was, other than not earlier than 1969, when cartoonist Al Taliaferro retired from it.)

Anyway, anyone who’s tried doing a numerical computation and gotten a surprising result would understand von Drake’s scolding of his machine. But it’s also exhilarating, often even a bit fun, to get a result on the computer which goes against what you expected. It sets off a round of groaning and a lot of double-checking, of course, as you try to figure out whether the computer program was buggy to start with (and it can take a dishearteningly long while to convince yourself you haven’t got a mistake in the coding), or whether the model you were trying to compute with was wrong (this, in my experience, is harder than writing or debugging code, because it requires thinking critically about what you want the computer to do and working out whether what you had earlier thought it should do is actually what you wanted). It also means going through quite a few rounds of working on problems which are similar enough to whatever you’re researching, yet which you understand perfectly and so can use to verify the computer’s answers.

That last can be particularly maddening because *if* you have a mistake in your model or in your code representing the model, it’s likely to be in the interesting new stuff you’re trying to research, which by definition is not in the simpler problem you understand perfectly.

However, when you do have that convergence of a model you’re certain is correct, and a coding that you’re certain implements it right, and a numerical result that confounds your intuition — well, that’s magnificent. Many people wonder whether mathematics is something people invent or which they discover, and I’m not strongly committed on the matter; but that moment of seeing the results come up in surprising ways, and challenging you to make sense of them, certainly feels like discovery to me.

I enjoy numerical mathematics for the creativity used in the methods. At a basic level, the difference methods (forward, backward, central) are motivated by three points on a grid and the differences between them. That is all that is required. We then understand the convergence of each method by use of algebra and some analysis. We use geometrical intuition to develop a method of finding numerical solutions to equations. I like that.
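The three-point idea can be sketched in a few lines. This is a minimal illustration (the choice of f = sin and the step size h are arbitrary, just to show the three formulas side by side):

```python
import numpy as np

# Approximate f'(x) for f(x) = sin(x) at x = 1.0 using the three
# difference formulas, each built from points on a grid.
f = np.sin
x, h = 1.0, 1e-5

forward  = (f(x + h) - f(x)) / h          # error O(h)
backward = (f(x) - f(x - h)) / h          # error O(h)
central  = (f(x + h) - f(x - h)) / (2*h)  # error O(h^2)

exact = np.cos(x)
for name, approx in [("forward", forward), ("backward", backward),
                     ("central", central)]:
    print(f"{name:8s} {approx:.10f}  error {abs(approx - exact):.2e}")
```

Running this shows the central difference beating the one-sided formulas by several orders of magnitude, which is exactly what the algebra-and-analysis convergence argument predicts.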

At a higher level, the methods get even more creative – the Crank-Nicolson scheme can be motivated by a random walk situated in a two-dimensional lattice (position and time). Other schemes have triangular grids, and some have parameters which give different methods. Numerical mathematics is fantastic!


Oh, now, the Crank-Nicolson method (and other methods of handling differential equations) — I like them considerably and if I were confident my audience wouldn’t desert me I might try explaining some of them around here. They can be quite fun.
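For a taste of what such an explanation might cover: here is a minimal Crank-Nicolson sketch for the heat equation u_t = u_xx on [0, 1] with u = 0 at both ends. The grid sizes, time step, and initial condition are all illustrative choices, and a dense solve is used for clarity where a production code would exploit the tridiagonal structure:

```python
import numpy as np

N, steps = 50, 100          # spatial intervals, time steps (illustrative)
dx = 1.0 / N
dt = 0.001
r = dt / (2 * dx**2)

x = np.linspace(0.0, 1.0, N + 1)
u = np.sin(np.pi * x)       # this mode decays like exp(-pi^2 t)

# Crank-Nicolson averages the explicit and implicit schemes:
# (I - r*T) u_new = (I + r*T) u_old on the interior points,
# where T is the standard tridiagonal second-difference matrix.
n = N - 1
T = (np.diag(-2.0 * np.ones(n)) +
     np.diag(np.ones(n - 1), 1) +
     np.diag(np.ones(n - 1), -1))
A = np.eye(n) - r * T
B = np.eye(n) + r * T

for _ in range(steps):
    u[1:-1] = np.linalg.solve(A, B @ u[1:-1])

# Compare against the exact solution at t = steps * dt.
t = steps * dt
exact = np.exp(-np.pi**2 * t) * np.sin(np.pi * x)
print("max error:", np.abs(u - exact).max())
```

The scheme is second-order accurate in both dx and dt and unconditionally stable, which is much of why it's a favorite for diffusion-type problems.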
