<div dir="ltr"><br><div class="gmail_extra"><br><br><div class="gmail_quote">On Wed, Feb 27, 2013 at 7:40 AM, ronni montoya <span dir="ltr"><<a href="mailto:ronni.montoya@gmail.com" target="_blank">ronni.montoya@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Hi, why is it not possible? </blockquote><div><br></div><div>What I mean is that we use floating point numbers as an approximation of real numbers. We have a finite number of samples, so it's impossible to work with continuous distributions except by approximation.<br>
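To make that concrete, here is a minimal Python sketch (not Pd) of what the approximation looks like: quantize the floats into bins and take the entropy of the empirical bin distribution. The bin width is an assumption you have to choose--halving it adds roughly one bit, which is exactly why the fully continuous case diverges.

```python
import math
import random

def binned_entropy(samples, bin_width):
    """Entropy (bits) of samples quantized into bins of width bin_width."""
    counts = {}
    for x in samples:
        b = math.floor(x / bin_width)
        counts[b] = counts.get(b, 0) + 1
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
samples = [random.uniform(-1.0, 1.0) for _ in range(100000)]

# Halving the bin width adds roughly one bit: the discrete entropy grows
# without bound as the quantization gets finer, so "the entropy of a float
# signal" only means something relative to a chosen quantization.
print(binned_entropy(samples, 0.02))  # close to log2(100) = 6.64 bits
print(binned_entropy(samples, 0.01))  # roughly one bit more
```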
</div><div>However--brainstorming a few methods of approximation is good. I'm not particularly an expert on the subject of entropy, but I enjoy it.<br></div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Instead of analysing the real-time value of<br>
the signal, maybe I can have a memory or buffer that stores a<br>
piece of the signal (a group of samples) from time to time and then<br>
analyze that group of values.<br></blockquote><div><br></div><div>If you're analyzing only pieces, you might ask whether the signal behaves the same way all the time. There are many interesting "bursting" phenomena. Those kinds of signals have long-range correlations that lower their entropy--but a small segment does not capture that behavior.<br>
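A hypothetical illustration of that point, sketched in Python rather than Pd: a bursting 0/1 signal whose windowed entropy swings well away from the whole-signal figure, so no single buffer tells you the long-term story.

```python
import math

def entropy_bits(symbols):
    """Shannon entropy (bits per symbol) of the empirical distribution."""
    counts = {}
    for s in symbols:
        counts[s] = counts.get(s, 0) + 1
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A bursting signal: long runs of 0s broken by short runs of 1s.
signal = ([0] * 900 + [1] * 100) * 10

# The whole signal has one fixed entropy (p = 0.9/0.1, about 0.47 bits),
# but individual windows see anything from all-0s to a strong mix.
whole = entropy_bits(signal)
windows = [entropy_bits(signal[i:i + 500]) for i in range(0, len(signal), 500)]
print(whole, min(windows), max(windows))
```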
</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<br>
Maybe I can convert that group of values into a string and then:<br>
<br>
<a href="http://www.shannonentropy.netmark.pl/calculate" target="_blank">http://www.shannonentropy.netmark.pl/calculate</a><br>
<br></blockquote><div><br></div><div>That would compute something, but it might be meaningless--it would be just one way of converting the signal from real numbers to a discrete set of symbols whose entropy is easier to calculate.<br>
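As a sketch of what such a calculator does (the 8-letter alphabet and the mapping here are arbitrary choices of mine, not anything from the thread): quantize the samples onto a few letters and take the entropy of the letter frequencies. Note that this first-order measure ignores temporal order--a sine wave and the same samples shuffled give the same number, which is one sense in which the result can be meaningless.

```python
import math
from collections import Counter

def string_entropy(s):
    """Bits per character of a string's empirical letter distribution."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

def signal_to_string(samples, levels="abcdefgh"):
    """Quantize floats in [-1, 1] onto a string over len(levels) symbols."""
    k = len(levels)
    out = []
    for x in samples:
        i = int((x + 1.0) / 2.0 * k)          # map [-1, 1] -> 0 .. k-1
        out.append(levels[min(max(i, 0), k - 1)])
    return "".join(out)

# A sine wave quantized to 8 letters: the entropy of the letter
# frequencies, which is what a text-entropy calculator would report.
samples = [math.sin(2 * math.pi * t / 64) for t in range(1024)]
print(string_entropy(signal_to_string(samples)))
```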
<br></div><div>Since you brought up the topic---I was reading on Wikipedia about how Shannon entropy is used to obtain lower bounds on compression ratios. There are several types of lossless audio compression--could you find a connection there?<br>
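One rough way to see the connection, again in Python rather than anything audio-specific: for an i.i.d. source, the first-order entropy is the average number of bits per symbol any lossless coder needs, so on typical data a general-purpose compressor like zlib (standing in here for a lossless audio codec) should not land below it.

```python
import math
import random
import zlib
from collections import Counter

def entropy_bits_per_byte(data):
    """First-order Shannon entropy of a byte string, in bits per byte."""
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

# An i.i.d. byte source biased toward one value: about 0.81 bits/byte.
random.seed(1)
data = bytes(random.choices([0, 1], weights=[3, 1], k=100000))

h = entropy_bits_per_byte(data)                     # the entropy lower bound
rate = 8 * len(zlib.compress(data, 9)) / len(data)  # achieved bits per byte
print(h, rate)
```

For correlated data (like most real audio) the first-order figure is no longer a bound--a coder that exploits the correlations can do better, which is exactly what predictive audio codecs do.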
<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<br>
Other idea: I've seen Shannon entropy used for calculating complexity<br>
in terms of spatial configuration.<br>
<br>
Maybe another option could be converting my signal into an image, for<br>
example using a similarity matrix, and then analyzing that image to get<br>
entropy values.<br>
<br>
cheers<br>
<br>
R<br>
<br>
2013/2/26, Charles Z Henry <<a href="mailto:czhenry@gmail.com">czhenry@gmail.com</a>>:<br>
<div class="HOEnZb"><div class="h5">> Hi Ronni<br>
><br>
> How do you mean to do it?<br>
><br>
> Shannon entropy is not an independent measurement--the information in an<br>
> observation is relative to the distribution over all its possible values.<br>
><br>
> If I just take one sample that's evenly distributed between -0.98 and 1,<br>
> quantized in 0.02 increments (100 possible values, to make the math<br>
> easier), then each value has probability 0.01, and the information in any<br>
> observed value is -log(0.01)--each value contributes -0.01*log(0.01) to<br>
> the entropy.<br>
><br>
> Then--if I had a signal that's N samples long, I have N times as much<br>
> information. Or perhaps think of it as a rate of information.<br>
><br>
> But for real numbers and continuous distributions, this doesn't work. The<br>
> information in a single observation diverges. So, doing that with floating<br>
> point numbers is not practical.<br>
><br>
> You often see Shannon entropy used to describe digital signals. If the signal<br>
> just switches between 0 and 1, we can generate a distribution of the data<br>
> and see what the probability is empirically. The entropy of each new<br>
> sample is relative to the distribution. Likewise, then if you know the<br>
> maximum rate of switching, you can figure out the maximum rate of<br>
> information in the signal.<br>
><br>
> Just a few thoughts...<br>
><br>
> Chuck<br>
><br>
><br>
><br>
> On Tue, Feb 26, 2013 at 6:09 AM, ronni montoya<br>
> <<a href="mailto:ronni.montoya@gmail.com">ronni.montoya@gmail.com</a>> wrote:<br>
><br>
>> Hi, I was wondering if anybody has implemented the Shannon entropy<br>
>> function in Pd?<br>
>><br>
>> Has anybody tried measuring the entropy of a signal?<br>
>><br>
>> cheers<br>
>><br>
>><br>
>><br>
>> R.<br>
>><br>
>> _______________________________________________<br>
>> <a href="mailto:Pd-list@iem.at">Pd-list@iem.at</a> mailing list<br>
>> UNSUBSCRIBE and account-management -><br>
>> <a href="http://lists.puredata.info/listinfo/pd-list" target="_blank">http://lists.puredata.info/listinfo/pd-list</a><br>
>><br>
><br>
</div></div></blockquote></div><br></div></div>