octal-dev

Re: machine types as instrument abstractions, or IA-Machines


From: David O'Toole
Subject: Re: machine types as instrument abstractions, or IA-Machines
Date: Tue Mar 13 18:48:02 2001

On 13 Mar 2001 17:49:19 -0500, address@hidden wrote:
> I follow you _most_ of the way, but I'm confused about what exactly the
> distinction is between an IA machine and any other machine.  Do you
> envision the IA machine as purely a means for specifying an -- er,
> instrument usage policy (for lack of a better phrase) -- and that then
> gets used by an audio generator?

No, the machine *is* the generator. The "policy" is just how the machine
acts. There would be no single "instrument setup" window, because
instruments are defined by configuring a machine with the properties and
the mapper you want. I guess we should really use the general term
"Generator" instead of "ia-machine". 

I guess what I'm trying to say is that all machine types (other than
effects) specify a new instrument type, complete with its own parameters
for the user to configure. Imagine a machine that doesn't use wave
samples at all.... a pure synth like the one Matt is writing. That is an
instrument type. The Octal API is really an API for defining instrument
types whose user interface is created by Octal but whose behavior is
defined by your C/C++ machine implementation. Synths, samplers, physical
modelling.... all of these are instrument types with certain parameters.
Octal provides control widgets (and eventually automations) so that the
user can control and script all the parameters of an instrument
*without the instrument model being built into Octal ahead of time.*

For machines that need waves we can use exactly the same idea, with a
slight tweak. Instead of embedding one instrument model into Octal, we
let sample-based machine plugins describe their instrument model as
OX_API parameters, just like a synth machine does now. Buzz had a
very simple wavetable, and the only way you could change waves was by
selecting one in the wave column. In Octal we'd be selecting a mapper
instead of just one wave. To change sounds, we can switch mappers
instead of just switching waves (or switching "instruments" like in
trackers.) 

Instead of one wave-based machine model built into Octal, we'd have a
general mechanism for configuring any kind of sample-based machine with
the waves it needs to play sound.  The point is that Octal doesn't
hardcode how the samplers are supposed to work.  It just provides a way
to give them waves. So if people come up with new and interesting ways
to use waves in an instrument (perhaps combined with synthesis?) that we
didn't think of ahead of time, we don't have to extend the host, they
can just release their machine. 

For example, think of a machine that is primarily a synth tone
generator but vocodes or otherwise modulates each synth note with
the "texture" of an atonal sample loop that you select. This will have
its own attributes (how much of the wave should be used? how many times
does it repeat? does it decay? does the wave restart looping on every
synth note?). These attributes are unique to the instrument model that
this machine implements, and won't make sense to other machines that are
based on different ideas. So my argument is that since the Octal API is
designed to create instrument type plugins, we should put only the most
common wave attributes into the wavetable and leave the rest to
machines. There is no loss of configurability.

So "instrument == generator machine" whether it uses waves or not. The
typical instrument model in a tracker really leaves room for only one
piece of code to interpret the data in a particular way, because in
trackers that don't use plugins, all the tracks usually work the same. In
Octal the concept of an instrument and a machine type are the same
(except for effects.) Anything that you could configure about an
instrument can be made into a parameter of a machine. 

Now, all that being said.... the implementation will end up being
simple. The only two API-level consequences to Ben's idea are:

1. The wave_mapper function that selects waves based on note/velocity
etc.... this was being planned anyway
2. Only putting basic attributes on waves, and leaving machines to
define instrument models (which will be no harder than writing a machine
to implement someone else's instrument model.) 

It's a very simple thing to deal with but it will be more flexible. All
the other things I wrote were just explaining the "why" part.    

> maybe -- oo, here's an idea for you -- maybe env's should just be rolled
> up into the category of "control signal" along with LFO's and be produced
> (if at all) by machines designed specifically for that purpose.  Lots
> more flexible than having

How would this work for envelopes? Every sounding note would be at a
different point in the envelope... some notes starting, some stopping,
etc. They couldn't all sync to the same signal coming from one source
like an LFO.





