# [PD] measuring entropy of a signal?

Charles Z Henry czhenry at gmail.com
Wed Nov 27 18:53:49 CET 2013

```
Hi Ronni,

I've been traveling and wasn't able to say much about the book at the
time.  The book seems to be an original work--I think it's a little
strange that most of the references are the original works on entropy,
fractal dimension, and the like.  I would have liked to see some
references to journal articles where his ideas about entropy, etc. are
published.  Bader goes into great depth about the origins and roots of
trying to apply entropy and fractal dimension.  I think you'll like the
book--and I'm glad to hear you're doing more with these interesting
topics.

Bader's entropy definitions take a basic approach based on box-counting
methods (and that's the method of calculation as well).  I think it
lacks the analytical treatment that would be necessary to show the
calculations converge to something exact that makes sense.

On Mon, Nov 25, 2013 at 10:29 AM, Ronni Montoya <ronni.montoya at gmail.com> wrote:

> Oh, thank you! I'm going to get that book.
>
>
> I've been working with entropy lately, experimenting with new ways of
> using it, and the way I discovered to use entropy in music in a
> useful way is this:
>
> First I generate strings of characters using L-systems. Then I use
> Shannon entropy to classify each string based on its level of entropy.
>

This sounds really cool.

>
> Strings with low entropy are going to be very repetitive and strings
> with high entropy are going to be very random. In the middle you can
> have complex strings (not so random and not so repetitive).
>
> Basically, what I was trying to do is use entropy as an aesthetic
> measure of self-generated sound structures. I got this idea because a
> musician once told me that the function of a musician is to make
> sonic structures that are not too repetitive and not too random. In
> other words, beauty is a point where sound structures are neither
> very repetitive nor very random; you need a little repetitiveness and
> you also need surprise, so Shannon entropy can be a useful way of
> measuring this.
>

That's also a topic here, except in the Bader book it's mostly applied
to instrumental sounds (much shorter time scales), with the same basic
trend: expressive and meaningful tones/timbres have some level of
complexity between repetitive and random.

> After classifying the strings, I sonify them at different time
> scales, using different Shannon entropy values depending on the time
> scale and also on the sound I'm using (for example, for percussion I
> use low-entropy strings) ....
>
>
> What do you think? If anybody is interested, I can send the app, but I
>
>
> cheers
>
>
> R.
>
>
>
>
> 2013/11/23, Charles Z Henry <czhenry at gmail.com>:
> > Hey Ronni, I realize you may not be interested in this topic any
> > more, but I recently came across a book with relevant sections on
> > entropy as applied to music:
> > Nonlinearities and Synchronization in Musical Acoustics and Music
> > Psychology by Rolf Bader
> >
> > Chuck
> > On Feb 26, 2013 4:11 AM, "ronni montoya" <ronni.montoya at gmail.com>
> > wrote:
> >
> >> Hi, I was wondering if anybody has implemented the Shannon entropy
> >> function in Pd?
> >>
> >> Has anybody tried measuring the entropy of a signal?
> >>
> >>
> >> cheers
> >>
> >>
> >>
> >> R.
> >>
> >> _______________________________________________
> >> Pd-list at iem.at mailing list
> >> UNSUBSCRIBE and account-management ->
> >> http://lists.puredata.info/listinfo/pd-list
> >>
> >
>
```
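The box-counting idea mentioned in connection with Bader's book can be sketched generically. This is not Bader's exact procedure (the book only describes the method informally); it is a standard box-counting dimension estimate for a sampled signal's graph, with arbitrary grid scales chosen here for illustration:

```python
import math
import random

def box_counting_dimension(samples, scales=(4, 8, 16, 32, 64)):
    """Estimate the box-counting dimension of a signal's graph: count
    grid cells the sampled curve touches at several resolutions, then
    fit the slope of log N(k) against log k."""
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1.0
    ys = [(y - lo) / span for y in samples]     # normalize amplitude to [0, 1]
    n = len(ys)
    pts = []
    for k in scales:
        boxes = set()
        for t, y in enumerate(ys):
            i = min(int(t / n * k), k - 1)      # time cell
            j = min(int(y * k), k - 1)          # amplitude cell
            boxes.add((i, j))
        pts.append((math.log(k), math.log(len(boxes))))
    # Least-squares slope of log N(k) vs. log k estimates the dimension.
    m = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    return (m * sxy - sx * sy) / (m * sxx - sx * sx)

line = [t / 999.0 for t in range(1000)]          # smooth ramp: dimension near 1
random.seed(1)
noise = [random.random() for _ in range(1000)]   # rougher graph: higher estimate

print(box_counting_dimension(line))
print(box_counting_dimension(noise))
```

Note the analytical caveat raised above applies here too: this sketch only covers the sampled points, so it undercounts boxes between samples, and the estimate depends on the chosen scales rather than provably converging to anything exact.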
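Ronni's pipeline (L-system strings classified by Shannon entropy) could be sketched roughly as below. The rewrite rules are made-up examples, not from the thread, and this is first-order entropy over single symbols; distinguishing "repetitive" from "random" ordering more faithfully would need entropy over n-gram blocks:

```python
from collections import Counter
from math import log2

def expand_lsystem(axiom, rules, iterations):
    """Rewrite the axiom repeatedly using the production rules."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def shannon_entropy(s):
    """Shannon entropy (bits per symbol) of the character distribution."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical rule sets: one very structured, one with more symbols.
repetitive = expand_lsystem("A", {"A": "AB", "B": "A"}, 8)          # Fibonacci word
varied     = expand_lsystem("A", {"A": "ABC", "B": "CA", "C": "BB"}, 6)

print(shannon_entropy(repetitive))   # below 1 bit: only two symbols
print(shannon_entropy(varied))       # at most log2(3) bits for three symbols
```

Strings could then be bucketed by entropy value, e.g. reserving the low-entropy bucket for percussion as described above.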
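For the thread's original question (measuring the entropy of a signal), one common approach is to histogram the sample amplitudes and compute Shannon entropy over the bins. A minimal sketch, assuming samples in [-1, 1] as in Pd and an arbitrary bin count (in Pd itself this would be built from signal objects or an external; this only shows the math):

```python
import math
import random

def signal_entropy(samples, bins=32):
    """Shannon entropy (bits) of a signal's amplitude histogram.
    Assumes samples lie in [-1.0, 1.0]."""
    counts = [0] * bins
    for x in samples:
        x = max(-1.0, min(1.0, x))                      # clamp to range
        i = min(int((x + 1.0) / 2.0 * bins), bins - 1)  # map to a bin index
        counts[i] += 1
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

# One second of a 440 Hz sine at 44.1 kHz vs. uniform white noise.
sine = [math.sin(2 * math.pi * 440 * t / 44100) for t in range(44100)]
random.seed(0)
noise = [random.uniform(-1.0, 1.0) for _ in range(44100)]

print(signal_entropy(sine))    # lower: amplitudes cluster near the extremes
print(signal_entropy(noise))   # near the maximum of log2(32) = 5 bits
```

Note this measures only the amplitude distribution, not temporal structure, so a sine and a shuffled sine score identically; capturing repetitiveness over time needs entropy of spectra or of sample sequences instead.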