
GIANT PROBLEMS

This book does not solve giant problems, but it does solve personal-computer-sized problems in the manner of giant problems. There is big money in solving giant problems, and big money brings specialist solutions beyond the scope of this book. Still, let us take a look. Closest to me is the seismic survey industry. Model space is 3-D, a cube of roughly a thousand 2-D screenfuls, a screenful being roughly $1,000 \times 1,000$ points, a gigabyte in total. Data space is 5-dimensional: a seismogram is a thousand time points, and both the energy source and the receivers lie in two dimensions on the Earth's surface plane, so $1+2+2=5$. All this compounds to roughly 1,000 to the $5^{\rm th}$ power, a thousand terabytes, a petabyte. A fully convergent solution needing $10^{15}$ iterations of the operators is ridiculous, and even more than a handful of iterations is nearly so.

We think mainly of using only the adjoint. Theory and experimentation offer some guidance. Remember that adjoints are great when unitary (already an inverse), and they are improved if they can be made more unitary. The basic strategy for improving an adjoint is finding one good diagonal-weighting function to apply before the adjoint and another to apply after it. Recalling ``IID,'' adjoints are also made more unitary by filter matrices that have the effect of whitening the output. Simple filter matrices are the gradient and the Laplacian. More generally, a compact way to whiten spectra is multidimensional autoregression, expounded in Chapter [*].
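The following is a minimal sketch of that strategy, not taken from this book's code library: a toy 2-D smoothing operator stands in for a real modeling operator, and the names forward, adjoint, weighted_adjoint, and the two weight arrays are illustrative assumptions. It shows one diagonal weight applied before the adjoint, another after it, and an optional Laplacian roughener to whiten the estimate.

import numpy as np

def forward(model):
    """Toy forward operator: 2-D nearest-neighbor smoothing (stand-in for modeling)."""
    d = model.copy()
    d[1:, :] += model[:-1, :]
    d[:, 1:] += model[:, :-1]
    return d

def adjoint(data):
    """Exact adjoint (transpose) of the toy smoothing operator."""
    m = data.copy()
    m[:-1, :] += data[1:, :]
    m[:, :-1] += data[:, 1:]
    return m

def laplacian(x):
    """Discrete Laplacian, used here as a simple whitening (roughening) filter."""
    out = -4.0 * x
    out[1:, :]  += x[:-1, :]
    out[:-1, :] += x[1:, :]
    out[:, 1:]  += x[:, :-1]
    out[:, :-1] += x[:, 1:]
    return out

def weighted_adjoint(data, w_data, w_model, whiten=False):
    """One diagonal weight before the adjoint, another after it,
    optionally followed by a Laplacian to whiten the output spectrum."""
    m = w_model * adjoint(w_data * data)
    return laplacian(m) if whiten else m

# Usage: the adjoint alone, improved by weights, as an approximate inverse.
rng = np.random.default_rng(0)
true_model = rng.standard_normal((100, 100))
data = forward(true_model)
w_data = np.ones_like(data)          # placeholder, e.g. gain or divergence correction
w_model = np.ones_like(true_model)   # placeholder, e.g. fold or illumination compensation
estimate = weighted_adjoint(data, w_data, w_model, whiten=True)

In a giant problem the weights and the whitening filter do the work that iterations cannot afford to do; the single application of the weighted, whitened adjoint is the whole "inversion."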
