[PD] Re:[OT] How do your performance environments looks like?

Hans-Christoph Steiner hans at eds.org
Fri Apr 18 04:06:37 CEST 2003


I am personally very interested in figuring out how to provide useful
haptic feedback for methods of synthesis where there isn't a correlation
between the physical world and the way the sound is generated, as there
is with physical modelling.

I am currently building an instrument to control a phase vocoder using a
force feedback joystick and a regular mouse.  I am working on the
mappings: first mapping the axes and buttons to control the sound, then
mapping the sound to the haptic feedback.
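
Just to make that first step concrete, the axis mapping might look
something like this sketch (Python; the parameter names and ranges are
invented for illustration, not the actual patch):

    def map_axes_to_pvoc(x, y, throttle):
        """Map joystick axes to hypothetical phase vocoder controls.
        x and y are normalized to -1..1, throttle to 0..1."""
        transpose = x * 12.0   # left/right: transposition, +/- one octave
        stretch = 4.0 ** y     # up/down: time stretch, 0.25x to 4x
        wet = throttle         # throttle: amount of processed sound
        return transpose, stretch, wet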

The most basic
idea, which I got from the U of York's Cymatic, is to have the mouse act
like the bow of a violin, with the sound fed directly into the mouse so
that as you move the mouse, it vibrates with the sound it is making
(unfortunately, the Linux drivers haven't been written yet, so this will
have to wait).  I think this will greatly enhance the experience of
playing it.  It is that direct vibration that I miss from my days of
playing the trumpet.

The mapping of the joystick forces is what I am still struggling with.  So
far I have only really thought of things like making the force get
stronger as the pitch gets higher, or making small notches of resistance
to mark the borders between changes in parameters.  But this seems too
literal, and basically just an emulation of a knob with detents.  There
must be more useful mappings; that's what I am struggling with.
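
For example, combining those two ideas might look something like this
sketch (the pitch range, notch width, and weights are all invented):

    import math

    def stick_force(pitch_hz, axis_pos, n_borders=8):
        """Resisting force in 0..1: a base term that rises with pitch,
        plus a small notch of resistance at each parameter border.
        axis_pos is normalized to 0..1."""
        # force rises with log-pitch, so each octave adds the same amount
        base = min(1.0, math.log2(max(pitch_hz, 27.5) / 27.5) / 7.0)
        # distance to the nearest of n_borders evenly spaced borders
        phase = (axis_pos * n_borders) % 1.0
        dist = min(phase, 1.0 - phase)
        notch = 0.3 * math.exp(-(dist ** 2) / 0.005)  # narrow bump
        return min(1.0, base + notch)

Writing it out makes the problem obvious: this is exactly the knob with
detents again.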

.hc


On Thu, 17 Apr 2003, vanDongen-Gilcher wrote:

> Maurizio Umberto Puxeddu said at "Re: [PD] Re:[OT] How do your performance
> environments looks like?" [2003/04/16 18:08]:
> 
> > > For an interesting example of a haptic musical interface, check out the
> > > University of York's Cymatic:  http://www-users.york.ac.uk/~smr12/main.htm
> > 
> > This gives me a better idea. It is not by chance that they focus on
> > physical modeling. There are many ways to make sounds with a computer
> > where haptic interfaces make much less sense.
> > 
> Of course. Although I think that there are many "purely synthetic"
> synthesis methods where haptic feedback can be useful as well.
> For example:
> A MIDI fader also gives some form of haptic feedback: you know the
> position by touch without having to look at the screen.
> What I am thinking of are situations where a single controller axis
> controls multiple interconnected parameters.
> One parameter can obviously be sensed from the position alone. Pitch, for
> instance.
> But an external sound source might determine the portamento or some kind of 
> modulation effect on that sound.
> Using ff I can make the response of the joystick reflect the synthesis 
> better. 
> For me it is all about designing a performance interface that matches my 
> musical concepts.
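
A quick sketch of how I picture that coupling (all of the numbers here
are invented):

    def coupled_axis(axis_pos, ext_env):
        """One axis sets pitch directly; an external envelope (ext_env,
        0..1, e.g. the level of another sound) scales both the portamento
        time and the force the stick pushes back with, so the state of
        the synthesis is felt, not just heard."""
        pitch_hz = 220.0 * 2.0 ** (axis_pos * 2.0)  # -1..1 -> 55..880 Hz
        portamento_ms = 10.0 + 490.0 * ext_env      # more energy, more glide
        force = 0.2 + 0.8 * ext_env                 # stick stiffens with it
        return pitch_hz, portamento_ms, force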
> 
> 
> 
> > During a recent improvisation workshop, some people complained that I
> > was totally inexpressive while playing, even though I was able to make
> > "gestural" sounds and use a dynamic range more extreme than that of the
> > rest of the (instrumental) players.
> > 
> > I noticed that my teacher can (on occasion) use artificial gestures
> > (that is, gestures not justified by strict interaction with the device).
> > For me this is not necessarily a problem, this split being a part of the
> > nature of playing electronic devices. 
> > 
> This split is also part of acoustic instruments, I think.
> It is also an old discussion in classical music. With a piano the only
> factors determining the sound are the speed of depressing the keys and the
> position of the pedals. All movements before and after are "artificial".
> There are a lot of bad pianists with fake expressionist movements; there
> are also very good pianists who move a lot, and very good ones who sit
> like a statue.
> And then there are conductors, solo guitarists in rock bands, singers
> ... :)
> 
> But I think movement does have an effect, even if only indirectly. 
> It is often easier to get in the rhythm if your body moves with it, for 
> example. 
> I like to move when I perform, and I always stand when I play electronics.
> 
> My dislike of laptop performances is more complex than this. I think that a
> mouse is too simple and too one-dimensional a controller for serious
> music performance and improvisation. It is for me, anyway.
> And I think that I can hear this in performances, and see it reflected in 
> the way of sitting at a desk behind a laptop.
> The attention of the performer seems focused on moving the cursor to place 
> X on the screen, and not on making sound Y in the room.
> 
> What I am more interested in is finding ways of making a computer a musical
> instrument without losing its possibilities.
> The biggest difference between a computer and a more traditional instrument 
> is in the past and future time of the performance.
> A traditional instrument acts in the now. A computer has access to what
> has gone before, and you can project into the future.
> What I am trying to do is to make an interface that gives me the 
> flexibility of a traditional instrument (instant change and reaction) 
> without losing the extended time-scale of the computer program. To do this 
> I want to have the interface reflect the state of the machine to the player 
> in an intuitive or at least learnable way. One way is graphical, another is 
> haptic. 
> In the example above, the external sound source could also be what I played
> a minute ago, or something chosen from what I played by pattern matching
> with neural nets.
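
The "what I played a minute ago" part is easy to sketch: a plain ring
buffer keeps the performance's past addressable (the control rate and
length here are made up):

    import collections

    class RecentControl:
        """Keep the last `seconds` of control values.  `rate` is a
        made-up control rate in Hz."""
        def __init__(self, seconds=120, rate=100):
            self.rate = rate
            self.buf = collections.deque(maxlen=int(seconds * rate))

        def push(self, value):
            self.buf.append(value)

        def ago(self, seconds):
            """Value recorded `seconds` ago (clamped to the oldest)."""
            idx = len(self.buf) - 1 - int(seconds * self.rate)
            return self.buf[max(idx, 0)]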
> 
> 
> So, that is a long mail.
> I hope you find it interesting and not too rambling.
> 
> Gerard
> 

	zen
	   \
	    \
	     \




