[PD] measuring entropy of a signal?

ronni montoya ronni.montoya at gmail.com
Sat Mar 2 19:28:14 CET 2013


Hi Charles, my idea in using Shannon entropy is to measure self-generated songs.

For example, if you have a patch that generates sound structures using generative rules, it would be nice to measure that sound structure and use that measurement to evolve the rules that generate it, in order to create more complex structures.

But how do you measure a sound structure using Shannon entropy?

I was experimenting with taking short pieces of a larger sound, converting
each piece into a string, and evaluating the Shannon entropy of each string.
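
To make this concrete, here is a rough sketch (in Python/NumPy, outside Pd) of what I mean; the block size of 4096 samples and the 256-level quantization are just arbitrary choices of mine:

import numpy as np

def block_entropy(signal, block_size=4096, levels=256):
    # signal: 1-D NumPy float array with samples in [-1, 1].
    # Returns one Shannon entropy value (in bits) per block.
    entropies = []
    for start in range(0, len(signal) - block_size + 1, block_size):
        block = signal[start:start + block_size]
        # Quantize samples in [-1, 1] to integer symbols
        # (the "characters" of the string for this piece).
        symbols = np.clip(((block + 1.0) * 0.5 * (levels - 1)).astype(int),
                          0, levels - 1)
        counts = np.bincount(symbols, minlength=levels)
        p = counts[counts > 0] / float(block_size)
        entropies.append(-np.sum(p * np.log2(p)))  # H = -sum(p * log2 p)
    return np.array(entropies)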

In this case the entropy varies with time, and what I am interested in are
the entropy trajectories.

You can plot these trajectories and compare trajectories from different songs.

More complex sound structures should have more complex trajectories: not
chaotic, not periodic, but more complex. The problem for me is that I need
to plot or visualize the entropy trajectories (values) in order to see the
complexity of a sound structure.
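
For the plotting I was thinking of something like this (a sketch with matplotlib, reusing block_entropy() from above; song_a and song_b are just placeholder names for two signals already loaded as float arrays in [-1, 1]):

import matplotlib.pyplot as plt

traj_a = block_entropy(song_a)  # entropy trajectory of the first song
traj_b = block_entropy(song_b)  # entropy trajectory of the second song

plt.plot(traj_a, label="song A")
plt.plot(traj_b, label="song B")
plt.xlabel("block index (time)")
plt.ylabel("Shannon entropy (bits)")
plt.legend()
plt.show()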

It would be nice to automate this, for example to find a way of measuring
different trajectories algorithmically so that the computer can tell
automatically which one is more complex.

Do you have an idea?

I have a question: why do you say it would be meaningless to convert the
signal into symbols?




Another way I was experimenting is using this with video and images, for
example converting an image into an array of characters by iterating over
all the pixels and getting the color of each pixel, then converting those
values into characters and evaluating the Shannon entropy of each image.
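
Roughly like this (a sketch, assuming an 8-bit single-channel image already loaded as a NumPy array):

import numpy as np

def image_entropy(img):
    # Shannon entropy (bits) of the distribution of pixel values in img.
    values = np.asarray(img, dtype=np.uint8).ravel()  # iterate over all pixels
    counts = np.bincount(values, minlength=256)
    p = counts[counts > 0] / float(values.size)
    return -np.sum(p * np.log2(p))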

I would like to expand this and use it also for self-generated 3D
structures, but I'm still thinking about this.


cheers.


R.







Can you please explain why you say it would be meaningless?

"That would do something, but may be meaningless--It would be just one
way of converting the signal from real numbers to a discrete set of
things/symbols that is easier to calculate."



2013/2/27, Charles Z Henry <czhenry at gmail.com>:
> If you took the fft squared magnitude, perfectly noisy data should have a
> chi-squared distribution in each bin (I think).  If you assumed that model
> and calculated the parameters of the distribution on each block, you'd find
> out how much information is in each of those peaks relative to the assumed
> distribution and just add it up.
>
> Whatever algorithm you choose probably needs to pass some "common sense"
> tests like the one you mention, Martin: noise has more entropy than a sine
> wave.  Also, if you take noise and just apply a comparison > 0, you get a
> signal with less entropy.
>
> On Wed, Feb 27, 2013 at 7:54 AM, Martin Peach
> <martin.peach at sympatico.ca> wrote:
>
>> Why not do an FFT and measure the variance of the channels?
>> For instance white noise has maximum entropy and all the bins of its FFT
>> will be more or less the same, while a sine wave has low entropy and one
>> bin will be much larger than the others.
>>
>>
>> Martin
>>
>>
>> On 2013-02-27 08:40, ronni montoya wrote:
>>
>>> Hi, why is it not possible? Instead of analysing the real-time value of
>>> the signal, maybe I can have a memory or buffer that stores a piece of
>>> the signal (a group of samples) from time to time and then analyze that
>>> group of values.
>>>
>>> Maybe it can convert that group of values into a string and then:
>>>
>>> http://www.shannonentropy.netmark.pl/calculate
>>>
>>>
>>>
>>> Other idea: I've seen Shannon entropy used for calculating complexity
>>> in terms of spatial configuration.
>>>
>>> Maybe another option could be converting my signal into an image, for
>>> example using a similarity matrix, and then analyzing that image to get
>>> entropy values.
>>>
>>>
>>>
>>>
>>> cheers
>>>
>>>
>>> R
>>>
>>>
>>>
>>>
>>>
>>> 2013/2/26, Charles Z Henry <czhenry at gmail.com>:
>>>
>>>> Hi Ronni
>>>>
>>>> How do you mean to do it?
>>>>
>>>> Shannon entropy is not an independent measurement--the information in
>>>> an observation is relative to the distribution of all its possible
>>>> values.
>>>>
>>>> If I just take one sample and it's evenly distributed between -0.98 and
>>>> 1, and it's quantized in 0.02 increments (to make the math easier),
>>>> then the information of any value observed is:
>>>> -0.01*log(0.01)
>>>>
>>>> Then--if I had a signal that's N samples long, I have N times as much
>>>> information.  Or perhaps think of it as a rate of information.
>>>>
>>>> But for real numbers and continuous distributions, this doesn't work.
>>>> The information in a single observation diverges.  So, doing that with
>>>> floating point numbers is not practical.
>>>>
>>>> You often see Shannon entropy describing digital signals.  If the
>>>> signal just switches between 0 and 1, we can generate a distribution
>>>> of the data and see what the probability is empirically.  The entropy
>>>> of each new sample is relative to the distribution.  Likewise, then if
>>>> you know the maximum rate of switching, you can figure out the maximum
>>>> rate of information in the signal.
>>>>
>>>> Just a few thoughts...
>>>>
>>>> Chuck
>>>>
>>>>
>>>>
>>>> On Tue, Feb 26, 2013 at 6:09 AM, ronni montoya
>>>> <ronni.montoya at gmail.com> wrote:
>>>>
>>>>> Hi, I was wondering if anybody has implemented the Shannon entropy
>>>>> function in Pd?
>>>>>
>>>>> Has anybody tried measuring the entropy of a signal?
>>>>>
>>>>>
>>>>> cheeers
>>>>>
>>>>>
>>>>>
>>>>> R.
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>>
>>
>


