* David King <dking@xxxxxxxxx.xxx> [2006-12-11 07:15]:
> Not to diminish this obviously hateful situation, but what is
> the alternative? Make them learn an entirely new interface
> metaphor that's useless outside of the software?

Absolutely. Why do I have sliders for the Attack/Decay/Sustain/Release parameters of an FM synth, say, instead of being able to drag around control points on a curve display? Hardware interfaces are constrained by their existence as physical objects. Software interfaces are not. Why should the latter be willfully restricted to the possibilities of the former?

> Should someone that knows how to operate these devices not be
> able to sit in front of software that does the same thing and
> be able to use it?

I don't operate on the assumption that skilled people are stupid. I expect anyone who uses audio processing gear regularly to have at least a moderate understanding of how sound works, and I expect that with such a background, the directly controllable abstract visualisations that a software UI could offer would be quite easy to grasp and *much* nicer to work with.

> Or is it just that their hardware is hateful from the start?

Yep. Pro audio processing involves a lot of highly parametrisable components. That just isn't possible to model in hardware without hundreds of sliders and knobs, and it's not possible to provide intuitive immediate feedback without visualisation of some form. Using the mouse or keyboard to directly change the shape of a visualisation, e.g. dragging control points on a curve for an FM synth as I mentioned above, would be a much better metaphor for a lot of audio processing components. You just can't do that in a hardware interface.

Regards,
-- 
Aristotle Pagaltzis // <http://plasmasturm.org/>
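[To make the sliders-vs-curve point concrete, here is a minimal sketch, not from the original post and with all names my own invention: the four ADSR scalars and the draggable breakpoints of a curve editor are just two views onto the same envelope data, so nothing forces the UI to look like four hardware sliders.]

```python
from dataclasses import dataclass

@dataclass
class ADSR:
    """Classic four-parameter envelope: times in seconds, levels in 0..1."""
    attack: float   # time to rise from 0 to the peak level 1.0
    decay: float    # time to fall from the peak to the sustain level
    sustain: float  # level held while the note is down
    release: float  # time to fall from sustain to 0 after note-off

    def control_points(self, note_off: float):
        """The same envelope as (time, level) breakpoints.

        A curve editor would let the user drag these points directly;
        a bank of sliders exposes the four scalar fields instead.
        """
        return [
            (0.0, 0.0),
            (self.attack, 1.0),
            (self.attack + self.decay, self.sustain),
            (note_off, self.sustain),
            (note_off + self.release, 0.0),
        ]

    def level(self, t: float, note_off: float) -> float:
        """Envelope level at time t, linearly interpolated between points."""
        pts = self.control_points(note_off)
        if t <= pts[0][0]:
            return pts[0][1]
        for (t0, l0), (t1, l1) in zip(pts, pts[1:]):
            if t <= t1:
                return l0 + (l1 - l0) * (t - t0) / (t1 - t0)
        return 0.0  # past the end of the release segment
```

[Dragging the second breakpoint horizontally is exactly an edit to `attack`; the visual and the parametric representation can never disagree, because one is derived from the other.]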
Generated at 22:02 on 27 Dec 2006 by mariachi 0.52