Care and treatment of singularities

My favorite numerical analysis book is Numerical Methods that Work. In the hardcover version of the book, the title was impressed into the cover and colored silver. Before the last word of the title, there was another word impressed but not colored in. You had to look at the cover carefully to see that the title actually reads “Numerical methods that usually work.”

The book was first published in 1970, and its specific references to computing power are amusingly dated. But the book’s advice is timeless.

The chapter entitled “The care and treatment of singularities” gives several approaches for integrating functions that have a singularity. This post will elaborate on the simplest example from that chapter.

Suppose you want to compute the following integral.

\int_0^1 \frac{dx}{\sqrt{\sin(x)}}

The integrand is infinite at 0. There are a couple of common hacks to get around this problem, and neither works well. One is simply to redefine the integrand so that it has some finite value at 0. That keeps the integration program from complaining about a division by zero, but it doesn’t help evaluate the integral accurately.

Another hack is to change the lower limit of integration from 0 to something a little larger. OK, but how much larger? And even if you replace 0 with something small, you still have the problem that your integrand is nearly singular near 0. Simple integration routines assume integrands are polynomial-like, and this integrand is not polynomial-like near zero.

The way to compute this integral is to subtract off the singularity. Since sin(x) is asymptotically equal to x as x goes to 0, √sin(x) is asymptotically √x. So if we subtract 1/√x, we’re left with a bounded integrand, and one that is considerably more polynomial-like than the one we started with. We then integrate by hand what we subtracted off. That is, we replace our original integral with the following pair of integrals.

\int_0^1 \left( \frac{1}{\sqrt{\sin(x)}} - \frac{1}{\sqrt{x}} \right) \,dx + \int_0^1 \frac{dx}{\sqrt{x}}

Compute the first numerically and the second analytically. Simpson’s rule with intervals of size 1/64 gives 0.03480535 for the first integral, which is correct to seven decimal places. The second integral is simply 2, so the total is 2.03480535.
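
Here is a minimal sketch of that computation in Python. This is my own code, not the author’s, using a hand-rolled composite Simpson’s rule with 64 subintervals. The regularized integrand behaves like x^(3/2)/12 near 0, so we define it to be 0 there, its limiting value.

import numpy as np

def simpson(f, a, b, n):
    # Composite Simpson's rule with n (even) subintervals.
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h / 3 * (y[0] + 4 * y[1:-1:2].sum() + 2 * y[2:-1:2].sum() + y[-1])

def regularized(x):
    # 1/sqrt(sin(x)) - 1/sqrt(x), with its limiting value 0 at x = 0.
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    positive = x > 0
    out[positive] = 1 / np.sqrt(np.sin(x[positive])) - 1 / np.sqrt(x[positive])
    return out

numerical = simpson(regularized, 0.0, 1.0, 64)  # roughly 0.034805
analytic = 2.0                                  # integral of 1/sqrt(x) over [0, 1]
print(numerical + analytic)                     # roughly 2.034805

With the singularity subtracted off, the integrand is polynomial-like enough that this handful of function evaluations is all Simpson’s rule needs.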

Now let’s look back at how our two hacks would have worked. Suppose we define our integrand f(x) to be 0 at x = 0. Then applying Simpson’s rule with step size 1/64 would give an integral of 3.8775, which is completely wrong. Bringing the step size down to 2^-20 makes things worse: the integral would then be 4.036. What if instead of defining the integrand to be 0 at 0, we define it to be some large value, say 10^6? In that case Simpson’s rule with step size 1/64 returns 5212.2 as the integral. Lowering the step size to 2^-20 improves the integral to 4.351, but every figure is still wrong since the integral is approximately 2.

What if we’d replaced 0 with 0.000001 and used Simpson’s rule on the original integral? That way we’d avoid the problem of having to define the integrand at 0. Then we’d have two sources of error. First, the error from not integrating from 0 to 0.000001. That error is 0.002, which is perhaps larger than you’d expect: the change to the integral is 2000 times larger than the change to the lower limit of integration. But this problem is insignificant compared to the next.

The more serious problem is applying Simpson’s rule to the integral even if we avoid 0 as the lower limit of integration. We still have a vertical asymptote, even though our limits of integration avoid the singularity itself, and no polynomial ever had a vertical asymptote. With such decidedly non-polynomial behavior, we’d expect Simpson’s rule and its ilk to perform poorly.

Indeed that’s what happens. With step size 1/64, Simpson’s rule gives a value of 9.086 for the integral, which is completely wrong since we know the integral is roughly 2.0348. Even with step size 2^-20, i.e. over a million function evaluations, Simpson’s rule gives 4.0328 for the integral.
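
For comparison, here is a sketch of the two hacks, reusing the simpson helper and numpy import from the code above. Again this is my own experiment rather than the code behind the numbers quoted in this post, and the exact output depends on details such as where the nodes fall, so don’t expect the digits to match; the point is that neither approach comes anywhere near the accuracy of the subtracted version.

def f(x):
    # The original integrand; only valid for x > 0.
    return 1 / np.sqrt(np.sin(x))

def patched(x):
    # Hack 1: pretend the integrand has the finite value 0 at x = 0.
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    positive = x > 0
    out[positive] = f(x[positive])
    return out

print(simpson(patched, 0.0, 1.0, 64))  # well off the true value of about 2.0348
print(simpson(f, 1e-6, 1.0, 64))       # far worse: the huge value at the first node dominates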

Incidentally, if we had simply approximated our integrand 1/√sin(x) with 1/√x, we would have estimated the integral as 2.0 and been correct to two significant figures, while Simpson’s rule applied naively gets no correct figures. On the other hand, Simpson’s rule applied cleverly (i.e. by subtracting off the singularity) gives eight correct figures with little effort. As I concluded here, a crude method cleverly applied beats a clever technique crudely applied.

More numerical posts

Thoughts on the new Windows logo

I appreciate spare design, but the new Windows logo is just boring.

Here’s the rationale for the new logo according to The Windows Blog:

But if you look back to the origins of the logo you see that it really was meant to be a window. “Windows” really is a beautiful metaphor for computing and with the new logo we wanted to celebrate the idea of a window, in perspective. Microsoft and Windows are all about putting technology in people’s hands to empower them to find their own perspectives. And that is what the new logo was meant to be. We did less of a re-design and more to return it to its original meaning and bringing Windows back to its roots – reimagining the Windows logo as just that – a window.

Greg Hewgill had a different perspective:

If you think about it, the new logo sort of looks like deck chairs on the Titanic when its stern was up in the air…

What’s your backplane?

When I hear someone I respect rave about a software tool, I’ll take a look at it. When I do, it often leaves me cold, and I wonder how they could think it’s so great. Is this person just easily impressed? I think I understand now why this happens.

Tools are used in a context, and that context is often missing from a discussion. Software developers talk more about their tools than how they string their tools together. In hardware terms, they talk about components but not the backplane that holds the components together. To understand how someone works, you need to know his or her backplane and not just a list of tools.

Tom Ryder had an interesting series of blog posts entitled Unix as IDE. He brings up some ideas that I believe would be of interest to people who don’t care about Unix. He’s talking about backplanes.

When you use Visual Studio or Eclipse as an IDE, that software is your backplane. It integrates your software development tasks. Many developers see their IDE as a godsend and can’t imagine working without it. Others disagree. It’s not that some people like an integrated development environment (IDE) and others prefer a disintegrated development environment (DDE). Some developers, like Ryder, have a different way of integrating activities. He explains that he’s not trying to condemn IDEs:

I don’t think IDEs are bad; I think they’re brilliant … In particular, I’m not going to try to convince you to scrap your hard-won Eclipse or Microsoft Visual Studio knowledge for the sometimes esoteric world of the command line. All I want to do is show you what we’re doing on the other side of the fence.

I found it interesting that although Ryder is fond of Vim, he’s not advocating Vim per se as an IDE. He sees Unix itself as his backplane and Vim as a component.

… the two grand old text editors Emacs and Vi (GNU Emacs and Vim) have such active communities developing plugins to make them support pretty much any kind of editing task. There are plugins to do pretty much anything you could really want to do in programming in both editors … the developers concerned are trying to make these text editors into IDEs in their own right. There are posts about never needing to leave Vim, or never needing to leave Emacs. But I think that trying to shoehorn Vim or Emacs into becoming something that it’s not isn’t quite thinking about the problem in the right way. [emphasis added]

I agree. I like Emacs, and I appreciate using one program where I used to use several. However, you quickly reach diminishing returns when you try to do everything in Emacs. Ryder goes on later to discuss Vim and how it fits into his ecosystem.

Part of the reason Vim is thought of as a toy or relic by a lot of programmers used to GUI-based IDEs is its being seen as just a tool for editing files on servers, rather than a very capable editing component for the shell in its own right. Its own built-in features being so composable with external tools on Unix-friendly systems makes it into a text editing powerhouse that sometimes surprises even experienced users. [emphasis added]

Composability is the key idea of backplanes. See this post on usability versus composability.

Many developers live in Visual Studio, but what happens when Visual Studio doesn’t integrate everything you need to do? Some would say that Windows is their backplane just as Unix is Ryder’s backplane. They use the file system, the clipboard, etc. But if you want to be more automated, you have the option of writing Visual Studio plugins or writing scripts to glue things together using COM or PowerShell. The transition from doing something manually to doing it automatically is steeper on Windows.

Related post: Career advice regarding tools

Would you rather serve a market or a boss?

Here’s an idea to chew on. Hayek argues that you either have to serve a market or a boss, and that the former is preferable.

Man in a complex society can have no choice but between adjusting himself to what to him must seem the blind forces of the social process and obeying the orders of a superior. So long as he knows only the hard discipline of the market, he may well think the direction by some other intelligent human brain preferable; but, when he tries it, he soon discovers that the former still leaves him at least some choice, while the latter leaves him none, and that it is better to have a choice between several unpleasant alternatives than being coerced into one.

From the essay “Individualism: True and False” in Individualism and Economic Order.

Book review: Functional Analysis

Functional Analysis (ISBN 0691113874) by Elias Stein and Rami Shakarchi is a fast-paced book on functional analysis and related topics. By page 60, you’ve had a decent course in functional analysis and you’ve got 360 pages left.

This book is the last of four volumes based on a sequence of lectures that began at Princeton in 2000. The first three volumes are devoted to

  1. Fourier series and integrals
  2. Complex analysis
  3. Measure theory, Lebesgue integration, and Hilbert spaces

The first three books are not strictly prerequisites for the fourth, though the final volume assumes familiarity with the basics of the topics covered earlier and makes fairly frequent references to its predecessors. Someone who has not read the first three volumes — I have not — can let these references go by.

Stein and Shakarchi bring in several topics that may not be considered functional analysis per se but are often included in functional analysis books, namely harmonic analysis and generalized functions. The book also goes into territory less often included in a functional analysis text: probability, Brownian motion, and an introduction to several complex variables. This broad selection of topics is in keeping with the stated aims of the lecture series:

to present, in an integrated manner, the core areas of analysis … to make plain the organic unity that exists between the various parts of the subject …

The goal of integrating various parts of analysis may be most clearly seen in the fourth chapter: Applications of the Baire Category Theorem. The material here is not organized by result but rather by proof technique.

Each chapter ends with a set of “exercises” and a set of “problems.” The former are closely related to the material in the book and include generous hints. The latter are more challenging and go beyond the scope of the book.

Related: Applied functional analysis

Using C# like a scripting language

Clift Norris wrote a clever little batch file csrun.bat several years ago. I thought I’d posted it here, but apparently not. If you have a C# program in foo.cs, you can type csrun foo.cs to compile and run the program.

The batch file doesn’t do much at all, but it might change how you think about a C# program. The C# code is still compiled, but since the compilation step is hidden, it feels more like an interpreted language.

When someone says they like interpreted languages, maybe what they really mean is that they enjoy running code quickly without the overhead of starting up an IDE, compiling the code, navigating to the directory containing the compiled executable, etc. This has nothing to do with whether a language is compiled or interpreted.

@echo off
REM : Compile and run a C# source file.
REM : The C# compiler (csc.exe) must be in your PATH.
REM : The command line argument is expected to be something like foo.cs

if "%1"=="" goto USAGE

csc.exe /nologo /out:%1.exe  %1
if ERRORLEVEL 1 goto EXIT

%1.exe	%2  %3  %4  %5
goto EXIT

:USAGE
echo You must specify an argument representing the C# file you want to run.

:EXIT

This batch file does not set references; you’d need to modify it if you want to reference an assembly.

Update: CsharpRepl is a REPL (read-eval-print loop) that lets you write C# at the command line as you would most scripting languages. CsharpRepl is part of Mono and works cross-platform. Thanks to MikeG in the comments.

Related post: Visual Studio 2010 is a pig

Teach yourself Fourier analysis in two weeks

From William Thomson (Lord Kelvin), 1840:

I had become filled with the utmost admiration for the splendor and poetry of Fourier. … I asked [John Pringle] Nichol if he thought I could read Fourier. He replied ‘perhaps.’ He thought the book a work of most transcendent merit. So on the 1st of May … I took Fourier out of the University Library; and in a fortnight I had mastered it — gone right through it.

Source

More Fourier posts