
Nyquist Limit and the Environment

Yesterday I found myself in a debate over global warming at Jane Galt. One of the commenters had an interesting comment about the Nyquist limit and the dataset for global warming.


The question, of course, remains how long the data set needs to be in order to fully capture the amplitude of global climate change. This post is going to serve as a springboard on that issue. Poster M.Simon provides this background (a couple of numerical sketches follow the quote):

Re: Nyquist limit.
First, it depends on the frequency whose peak you want to capture, relative to the sampling rate and the required accuracy.
Since the peak of a sine wave is fairly flat, you can find the required accuracy from a sine table.
Say you wanted at least 99% accuracy in one cycle.
Take the arccos of 0.99 = 8.1096… degrees. Divide that into 360: you need about 45 samples per cycle to get that accuracy on the first pass. If you have to do more passes because of a lower sampling rate, the result will depend on the phase difference between the sampling frequency and the inspected frequency.
However, that assumes the signal has a constant amplitude over time, which is hardly ever true for natural processes.
All this assumes reconstruction of the signal by summing sine and cosine waves at various multiples of the fundamental, i.e., an inverse Fourier Transform. The deconstruction, of course, is done with a Fourier Transform (or one of its variations).
If you like square waves better, there is the Walsh Transform.
http://mathworld.wolfram.com/WalshFunction.html
The Fourier Transform is generally considered more useful since a square wave contains all the odd harmonics in declining amplitude.
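
M.Simon's arithmetic is easy to check. Below is a minimal sketch in Python; the 99% accuracy target and the samples-per-cycle criterion come straight from the comment, while the variable names and the rounding at the end are just illustrative choices.

import math

# M.Simon's criterion: to read a sine peak to within 99% of its true
# value, a sample must land within theta degrees of the peak, where
# cos(theta) >= accuracy. Spacing samples every theta degrees then
# gives the samples-per-cycle figure quoted in the comment.
accuracy = 0.99
theta_deg = math.degrees(math.acos(accuracy))   # ~8.1096 degrees
samples_per_cycle = 360.0 / theta_deg           # ~44.4

print(f"arccos({accuracy}) = {theta_deg:.4f} degrees")
print(f"samples per cycle: {samples_per_cycle:.1f}, round up to {math.ceil(samples_per_cycle)}")

Running this reproduces the figures in the comment: arccos(0.99) ≈ 8.11 degrees, or about 45 samples per cycle.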
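The deconstruction/reconstruction round trip described above can be sketched the same way. Nothing here is climate data; the two-tone test signal and NumPy's rfft/irfft (standing in for the Fourier Transform and its inverse) are assumptions made purely for illustration.

import numpy as np

# Build a toy signal from two sinusoids at multiples of a 1 Hz fundamental.
t = np.linspace(0.0, 1.0, 256, endpoint=False)
signal = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)

spectrum = np.fft.rfft(signal)           # deconstruction: Fourier Transform
reconstructed = np.fft.irfft(spectrum)   # reconstruction: inverse transform

# The round trip is exact (up to floating-point error) for a band-limited,
# adequately sampled signal.
print(np.allclose(signal, reconstructed))  # True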

That said, what should our sampling rate for climate data be, and how long should the record run? A rough framing of the trade-off is sketched below.
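
Two standard relationships bound the answer: the sampling rate sets the shortest period you can resolve (the Nyquist period, twice the sampling interval), and the record length sets the longest. The numbers below, monthly sampling over a 100-year record, are hypothetical, chosen for illustration rather than as a recommendation.

# Assumed, illustrative figures for a climate time series.
samples_per_year = 12        # monthly sampling (assumption)
record_years = 100           # record length (assumption)

nyquist_freq = samples_per_year / 2.0    # cycles/year
freq_resolution = 1.0 / record_years     # cycles/year, the DFT bin spacing

print(f"shortest resolvable period: {12 / nyquist_freq:.0f} months")
print(f"frequency resolution: {freq_resolution} cycles/year "
      f"(longest observable full cycle: about {record_years} years)")

On these assumptions, nothing slower than one full cycle per record length is resolvable at all, which is exactly why the length of the climate record, and not just its sampling rate, limits what can be said about slow, large-amplitude cycles.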
