[PD] derivative function

Charles Henry czhenry at gmail.com
Tue Jun 27 21:09:23 CEST 2006


> > I don't know such an object, but maybe this will help you a little bit:
> > Because you're dealing with finite discrete signals, the derivative of a
> > signal becomes a difference between two consecutive samples. You can
> > easily implement it using [z~] (from zexy, I think) and then subtract
> > the delayed and original sequence:
> >
> > y[n]= x[n]-x[n-1]
>
> hmm... I tried this with [z~] and [fexpr~], achieving the same result....
>
> I wonder if there are errors in this way of computing the derivative...

yes, this is THE worst derivative approximation.  No other standard
scheme has a larger error term (it is only first-order accurate).

all of these derivative approximations can be written as FIR filters
(integrators require IIR filtering).  We construct the approximate
derivatives by convolution.

simplest (and worst) is y[n] = (x[n] - x[n-1]) / delta-t
(obtained by truncating the Taylor series after the first-derivative term)
next, better, is y[n] = (x[n+1] - x[n-1]) / (2*delta-t)
(obtained by truncating the Taylor series after the second-derivative term)
These two are the obvious choices: very simple, low latency.  (Here
delta-t = 1/fs is the sampling interval.)  Even so, the second one has
to know one sample ahead before calculating the derivative, so it
carries one sample of latency.
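Purely as an illustration (Python, not Pd; the sample rate and the
evaluation point are arbitrary choices of mine), here is a sketch of how
the two schemes compare on a signal whose derivative is known exactly:

```python
import math

fs = 100.0           # assumed sample rate (arbitrary for this sketch)
h = 1.0 / fs         # sample spacing, delta-t
f = math.sin         # test signal; exact derivative is cos
t = 0.3              # arbitrary evaluation point

# forward difference: error term ~ (h/2)*f''(t), i.e. O(h)
fwd = (f(t + h) - f(t)) / h
# central difference: error term ~ (h^2/6)*f'''(t), i.e. O(h^2)
cen = (f(t + h) - f(t - h)) / (2 * h)

exact = math.cos(t)
print(abs(fwd - exact), abs(cen - exact))
```

The central difference's error shrinks with h^2 rather than h, which is
the "better error terms" point above.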

as convolution kernels (taps listed in reverse order; fs is the
sampling frequency):

first forward difference: (1 -1 0)*fs
second derivative approx.: (1 -2 1)*fs^2
central divided difference: (.5 0 -.5)*fs
third derivative (convolution of the 1st and 2nd kernels): (.5 -1 0 1 -.5)*fs^3
so an improved derivative approximation can be obtained from a Taylor
series expansion:

f'(0) = (f(t) - f(0) - f''(0)/2! * t^2 - f'''(0)/3! * t^3 - ... ) / t

(with 1/t = fs)

f'(n) ~= (f(n+1) - f(n) - (f(n+1) - 2f(n) + f(n-1))/2
          - (.5f(n+2) - f(n+1) + f(n-1) - .5f(n-2))/6) * fs

and it works out to be (-1/12  2/3  0  -2/3  1/12),
which has better error characteristics than the derivatives
mentioned before.
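To check the arithmetic, here is a small Python sketch (the names are
mine, not from the post) that convolves the kernels exactly using
Fractions and assembles the improved five-tap stencil, with taps at
offsets +2 .. -2 around n:

```python
from fractions import Fraction as Fr

def conv(a, b):
    """Plain polynomial-style convolution of two tap lists."""
    out = [Fr(0)] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

central = [Fr(1, 2), Fr(0), Fr(-1, 2)]   # central divided difference
second  = [Fr(1), Fr(-2), Fr(1)]         # second-derivative kernel

# third-derivative kernel = convolution of the two above
third = conv(central, second)
print([str(c) for c in third])           # ['1/2', '-1', '0', '1', '-1/2']

# improved first derivative: forward difference, minus 1/2 the
# second-derivative kernel, minus 1/6 the third-derivative kernel
# (all padded to 5 taps, centered on n)
fwd = [Fr(0), Fr(1), Fr(-1), Fr(0), Fr(0)]
sec = [Fr(0)] + second + [Fr(0)]
improved = [a - b / 2 - c / 6 for a, b, c in zip(fwd, sec, third)]
print([str(c) for c in improved])        # ['-1/12', '2/3', '0', '-2/3', '1/12']
```

The antisymmetric result (-1/12 2/3 0 -2/3 1/12) is the standard
five-point first-derivative stencil.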

consult a numerical analysis textbook; numerical derivatives are
different from your typical calculus definitions.

>
> I am looking at my math notebook, where I read that derivatives are the
> "limit for the incremental ratio of a function", where h is the
> increment, and f(x) is the function, and I have this formula:
>
>
> y' = lim {h -> 0}  ( f(x+h) - f(x) ) / h
>
> translating this into a fexpr~ I do:
>
> [osc~]                    [float (h)]
> |                         |
> [fexpr~ ($x1-$x1[-$f2])/$f2]
>
> recalling the rules above, for f(x)=sin(x),
> I should have f'(x)=cos(x); however, its amplitude gets lower as the
> frequency lowers... and the phase offset of [osc~]' is 180°, not 90°...
> what's the problem? is h not close enough to zero?
>
> is there an "ideal" differentiator? or am I saying something totally wrong?
Nope, there's no "ideal" differentiator.
You should be seeing the correct behavior of your differentiator:
under differentiation, amplitude goes to zero as frequency goes to
zero.  Also, an ideal differentiator shifts ALL frequencies by the
same 90 degrees (a one-sided difference like yours adds some extra
frequency-dependent phase on top of that, from its half-sample delay).
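A quick Python check of the amplitude behavior (again not Pd; the
sample rate and test frequencies here are arbitrary): feed
unit-amplitude sines through the central difference and watch the
output amplitude scale roughly in proportion to frequency:

```python
import math

fs = 1000.0   # assumed sample rate (arbitrary)
N = 1000      # number of samples to examine

def diff_amp(freq):
    """Peak output of the central difference fed a unit-amplitude sine."""
    x = [math.sin(2 * math.pi * freq * n / fs) for n in range(N)]
    y = [(x[n + 1] - x[n - 1]) * fs / 2 for n in range(1, N - 1)]
    return max(abs(v) for v in y)

a100 = diff_amp(100.0)   # gain near the ideal 2*pi*100
a50  = diff_amp(50.0)    # roughly half the gain of a100
print(a100, a50)
```

Halving the input frequency roughly halves the output amplitude, which
is exactly the behavior the original poster observed.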

Chuck



