A Faster Fourier Transform


Mark Anderson, Technology Review:

In the mid-1960s, a computer-friendly algorithm called the fast Fourier transform (FFT) was developed. Anyone who’s marveled at the tiny size of an MP3 file compared with the same recording in an uncompressed form has seen the power of the FFT at work.

With the new algorithm, called the sparse Fourier transform (SFT), streams of data can be processed 10 to 100 times faster than was possible with the FFT.

I’m sure many of my meteorology colleagues would be interested in this new algorithm (if they haven’t already played with it).

In my day job, I numerically simulate atmospheric boundary layer flows. The spectral density (energy per unit wavenumber) of a variable offers an important window into how well the model reproduces the cascade of energy from large to small scales. Calculating the spectral density requires the FFT algorithm.
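To make this concrete, here is a minimal NumPy sketch of a one-dimensional spectral density computation; the function name and the Parseval-style normalization are illustrative choices of mine, not a prescription from any particular model.

```python
import numpy as np

def spectral_density(u, dx):
    """One-sided spectral density E(k) of a 1-D series u with uniform
    grid spacing dx, normalized so that sum(E) * dk recovers the
    variance of u (one of several common conventions)."""
    n = u.size
    uhat = np.fft.rfft(u - u.mean())             # non-negative wavenumbers only
    k = 2.0 * np.pi * np.fft.rfftfreq(n, d=dx)   # angular wavenumbers
    E = np.abs(uhat) ** 2 * dx / (np.pi * n)     # one-sided density
    E[0] *= 0.5                                  # DC bin is not doubled...
    if n % 2 == 0:
        E[-1] *= 0.5                             # ...nor is the Nyquist bin
    return k, E
```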

As an example, I recently conducted simulations on a numerical grid of 512 × 512 × 100 points, over a window of twelve hours, with data collected every minute. To calculate the one-dimensional spectral density, I take the FFT in the along-wind direction, that is to say, over 512 points. The resulting spectra are then averaged in the cross-wind direction. This is repeated at every vertical level and at every output time.
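A sketch of that workflow, assuming the output can be read one snapshot at a time (load_snapshot below is a hypothetical stand-in for whatever I/O routine the model provides):

```python
import numpy as np

# Dimensions from the setup above: 720 one-minute outputs,
# 100 vertical levels, 512 x 512 horizontal points.
nt, nz, ny, nx = 720, 100, 512, 512

def load_snapshot(it):
    """Hypothetical reader for output time `it`; returns (nz, ny, nx).
    Random data stands in for real model output here."""
    return np.random.randn(nz, ny, nx)

spectra = np.zeros((nt, nz, nx // 2 + 1))
for it in range(nt):
    u = load_snapshot(it)
    u -= u.mean(axis=-1, keepdims=True)           # remove the along-wind mean
    # One FFT per cross-wind row per level:
    # 512 rows x 100 levels x 720 times = 36,864,000 FFTs per variable.
    power = np.abs(np.fft.rfft(u, axis=-1)) ** 2  # normalization omitted for brevity
    spectra[it] = power.mean(axis=1)              # average over the 512 y-rows
```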

In short, to calculate the spectral density of one variable for the entire simulation requires using the FFT over 36.8 million times. I generally do these computations for six variables, meaning the FFT is needed over 221 million times! Needless to say, this can be rather time consuming. The promise of a new FFT algorithm that is 10 to 100 times faster than traditional methods is welcomed. I look forward to testing this out in the future.