<div dir="ltr"><div></div>If you took the FFT squared magnitude, perfectly noisy data should have a chi-squared distribution in each bin (I think). If you assumed that model and estimated the parameters of the distribution on each block, you'd find out how much information is in each of those peaks relative to the assumed distribution, and just add it up.<br>
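A quick numerical check of that assumption (a plain-Python sketch, nothing Pd-specific): for zero-mean Gaussian white noise, the squared magnitude of a single DFT bin should be chi-squared with 2 degrees of freedom, i.e. exponentially distributed, so its variance should equal the square of its mean.

```python
import cmath
import math
import random

random.seed(0)
n, trials, k = 64, 2000, 5  # block size, number of noise blocks, bin to inspect

samples = []
for _ in range(trials):
    x = [random.gauss(0, 1) for _ in range(n)]
    # one bin of the DFT of a fresh white-noise block
    X = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
    samples.append(abs(X) ** 2)

mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials
# for an exponential (chi-squared, 2 dof) distribution, var / mean^2 == 1
print(var / mean ** 2)
```

The ratio printed should hover near 1, which is consistent with the chi-squared model above.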
<div><br></div><div><div><div class="gmail_extra">Whatever algorithm you choose probably needs to pass some "common sense" tests like the one you mention, Martin: noise has more entropy than a sine wave. Also, if you take noise and just apply a comparison > 0, you get a signal with less entropy.<br>
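One such sanity test can be sketched in plain Python (the function name and the naive DFT are mine, just for illustration): treat the normalized power spectrum of a block as a probability distribution and take its Shannon entropy. White noise should score much higher than a pure sine.

```python
import cmath
import math
import random

def spectral_entropy(x):
    """Shannon entropy (bits) of the normalized power spectrum of a block."""
    n = len(x)
    power = []
    for k in range(n // 2):  # naive DFT; fine for short blocks
        X = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power.append(abs(X) ** 2)
    total = sum(power)
    probs = [p / total for p in power if p > 0]
    return -sum(p * math.log2(p) for p in probs)

random.seed(1)
n = 128
noise = [random.gauss(0, 1) for _ in range(n)]
sine = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]  # all power in bin 8
print(spectral_entropy(noise) > spectral_entropy(sine))
```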
</div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Feb 27, 2013 at 7:54 AM, Martin Peach <span dir="ltr"><<a href="mailto:martin.peach@sympatico.ca" target="_blank">martin.peach@sympatico.ca</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Why not do an FFT and measure the variance of the channels?<br>
For instance white noise has maximum entropy and all the bins of its FFT will be more or less the same, while a sine wave has low entropy and one bin will be much larger than the others.<span class="HOEnZb"><font color="#888888"><br>
<br>
<br>
Martin</font></span><div class="HOEnZb"><div class="h5"><br>
<br>
On 2013-02-27 08:40, ronni montoya wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Hi, why is it not possible? Instead of analysing the real-time value of<br>
the signal, maybe I can have a memory or buffer that stores a<br>
piece of the signal (a group of samples) from time to time and then<br>
analyze that group of values.<br>
<br>
Maybe it can convert that group of values into a string and then:<br>
<br>
<a href="http://www.shannonentropy.netmark.pl/calculate" target="_blank">http://www.shannonentropy.netmark.pl/calculate</a><br>
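What a calculator like that does is simple enough to inline (a sketch; the helper name is made up): count symbol frequencies and apply H = -sum(p * log2(p)).

```python
from collections import Counter
from math import log2

def shannon_entropy(s):
    """Shannon entropy of a string, in bits per symbol."""
    n = len(s)
    return -sum(c / n * log2(c / n) for c in Counter(s).values())

print(shannon_entropy("aabb"))  # two equally likely symbols -> 1.0
```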
<br>
<br>
<br>
Other idea: I've seen Shannon entropy used for calculating complexity<br>
in terms of spatial configuration.<br>
<br>
Maybe another option could be converting my signal into an image, for<br>
example using a similarity matrix, and then analyzing that image to get<br>
entropy values.<br>
<br>
cheers<br>
<br>
<br>
R<br>
<br>
<br>
2013/2/26, Charles Z Henry <<a href="mailto:czhenry@gmail.com" target="_blank">czhenry@gmail.com</a>>:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Hi Ronni<br>
<br>
How do you mean to do it?<br>
<br>
Shannon entropy is not an independent measurement--the information in an<br>
observation is relative to the distribution of all its possible values.<br>
<br>
If I just take one sample and it's evenly distributed between -0.98 and 1,<br>
quantized in 0.02 increments (to make the math easier), there are 100 equally<br>
likely values, so the information of any value observed is:<br>
-log(0.01)<br>
(and each value contributes -0.01*log(0.01) to the total entropy).<br>
<br>
Then--if I had a signal that's N samples long, I have N times as much<br>
information. Or perhaps think of it as a rate of information.<br>
<br>
But for real numbers and continuous distributions, this doesn't work. The<br>
information in a single observation diverges. So, doing that with floating<br>
point numbers is not practical.<br>
<br>
You often see Shannon entropy describing digital signals. If the signal<br>
just switches between 0 and 1, we can generate a distribution of the data<br>
and see what the probability is empirically. The entropy of each new<br>
sample is relative to the distribution. Likewise, then if you know the<br>
maximum rate of switching, you can figure out the maximum rate of<br>
information in the signal.<br>
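That idea about binary signals can be sketched in a few lines of Python (an illustration, not Pd code): build the empirical distribution and take the entropy per sample. A fair 0/1 stream approaches 1 bit per sample, a biased one carries less, and the maximum switching rate then caps the information rate.

```python
from collections import Counter
from math import log2
import random

def entropy_per_sample(samples):
    """Shannon entropy (bits/sample) of the empirical distribution."""
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in Counter(samples).values())

random.seed(0)
# fair 0/1 switching: close to 1 bit per sample
fair = [random.randint(0, 1) for _ in range(10000)]
# biased: mostly 0s, so each new sample is less surprising
biased = [1 if random.random() < 0.1 else 0 for _ in range(10000)]
print(entropy_per_sample(fair), entropy_per_sample(biased))
```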
<br>
Just a few thoughts...<br>
<br>
Chuck<br>
<br>
<br>
<br>
On Tue, Feb 26, 2013 at 6:09 AM, ronni montoya<br>
<<a href="mailto:ronni.montoya@gmail.com" target="_blank">ronni.montoya@gmail.com</a>> wrote:<br>
<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Hi , i was wondering if anybody have implemented the shannon entropy<br>
function in pd?<br>
<br>
Do anybody have tried measuring entropy of a signal?<br>
<br>
<br>
cheeers<br>
<br>
<br>
<br>
R.<br>
<br>
_______________________________________________<br>
<a href="mailto:Pd-list@iem.at" target="_blank">Pd-list@iem.at</a> mailing list<br>
UNSUBSCRIBE and account-management -><br>
<a href="http://lists.puredata.info/listinfo/pd-list" target="_blank">http://lists.puredata.info/listinfo/pd-list</a><br>
<br>
</blockquote>
<br>
</blockquote>
<br>
<br>
<br>
</blockquote>
<br>
</div></div></blockquote></div><br></div></div></div></div>