octal-dev

Re: Re[2]: Sequencers


From: Brendan Howell
Subject: Re: Re[2]: Sequencers
Date: Thu Sep 5 07:34:02 2002

It all sounds pretty sensible to me. The one comment I want to add is that we should keep the data pipes flexible. Right now the goal is to control audio, but it would be really cool to use the same core to control a sequence of, say, video, vector graphics, text, network data, etc. It also keeps it flexible to use different data formats for audio. I'm thinking of the way that Max/MSP has been extended to manipulate not only audio but also video and some other more exotic types of data. This should not add much complexity, since the core does not handle any manipulation of the data, just the routing of signals. It would require picking standards and identifying what type of data an input expects; these definitions could easily be included in the API. As a particular data standard becomes more popular, common methods can be added to make machine coding easier (maybe as separate libraries; lots of code can be taken from other projects). Is this feasible or am I just pipe dreaming?
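To make the idea concrete, here is a minimal sketch (all names here are hypothetical, not the real core API) of a core that routes opaque buffers and only checks that an output's declared data type matches what the input expects:

```python
# Hypothetical sketch: the core never inspects the data itself; it only
# verifies at patch time that both ends of a connection agree on the
# declared type. Port/Core are illustrative names, not OX_API.

class Port:
    def __init__(self, name, data_type):
        self.name = name            # e.g. "osc_out", "filter_in"
        self.data_type = data_type  # e.g. "audio", "video", "text"

class Core:
    def __init__(self):
        self.connections = []  # (source Port, sink Port) pairs

    def connect(self, src, sink):
        # Type checking is the only thing the core knows about the data.
        if src.data_type != sink.data_type:
            raise TypeError(f"cannot route {src.data_type} into {sink.data_type}")
        self.connections.append((src, sink))

core = Core()
core.connect(Port("osc_out", "audio"), Port("filter_in", "audio"))  # ok
try:
    core.connect(Port("osc_out", "audio"), Port("text_in", "text"))
except TypeError as e:
    print(e)  # mismatch is caught when the user patches, not at run time
```

Under this scheme, adding a new data type (video, text, network data) costs the core nothing; only the units producing and consuming it need to agree on a standard.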

-Brendan


On Tuesday, Sep 3, 2002, at 13:14 Europe/Berlin, Unkargherth wrote:

I suspect that all of this should (if everyone agrees) serve as the core philosophy
of the project:

1.- The Octal core should have no hardcoded input, output, UI, or event generators

2.- It should provide a generic (more on this later) API to manipulate data
from all of them without taking into account:
        .- Where the input data comes from
        .- Where the output data goes
        .- How the data is manipulated
        .- How the data is modified

3.- It should provide a generic signal routing engine, a generic timing
engine, and a basic UI for representing and manipulating the routing structure

4.- This implies that writing a unit (event generator, sound generator,
sound modifier, audio input, audio output) becomes very complex compared
with e.g. Buzz, VST instrument/effect, DX instrument/effect, etc. To solve
this I suggest the unit developers make a draft and a support library to
simplify the common questions (a default simple UI for each of the basic
unit types, etc.)

5.- Units should be developed independently of their own UI. A unit should
have all of its functionality, and then a UI to manipulate the program is
developed OUTSIDE the unit (i.e. develop two classes: one that works and
one that shows)

6.- The UI of any unit should be replaceable

.- (only an opinion) This is only an idea. DTO Attn: We should think of a way
that FORCES a unit developer's units to be public, free, open-source code
only. GPL or LGPL is not enough. We need something that explicitly prohibits
any way of making money with any (or all) parts of the project in any way, in
any future, in any situation. A kind of official Unit registry could be useful
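Points 5 and 6 above can be sketched roughly like this (a toy illustration, not actual Octal code): the "works" class carries all the functionality and has no UI dependencies, while any number of "shows" classes can wrap it, so the UI stays replaceable:

```python
# Illustrative only: GainUnit and GainSliderUI are invented names.

class GainUnit:                      # the class that works
    def __init__(self):
        self.gain = 1.0
    def process(self, buf):
        # Pure signal processing; no UI code anywhere in this class.
        return [s * self.gain for s in buf]

class GainSliderUI:                  # one possible class that shows
    def __init__(self, unit):
        self.unit = unit
    def on_slider_moved(self, value):
        # The UI only pokes the unit's parameters; it could be swapped
        # for a knob, a text field, or no UI at all.
        self.unit.gain = value

unit = GainUnit()
ui = GainSliderUI(unit)
ui.on_slider_moved(0.5)
print(unit.process([1.0, -2.0]))  # [0.5, -1.0]
```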

On the generic API point:

I see the core as follows:

The Octal core should be a kind of low-level, generic signal routing engine
optimized for audio signals. It is necessary that it knows nothing about
stereo or surround (it should apply to any number of audio signals in a
machine), nothing about whether the signals are coming in from outside or
being generated by an event/sound generator combination, nothing about where
the signal goes when it reaches an output unit, etc. It should only provide
a fast, reliable way of routing between units, and accurate timing for this
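A minimal sketch of such an engine (names and structure are mine, assumed for illustration): the core only pulls buffers through the unit graph in dependency order, with no idea what the units actually do or how many signals are involved:

```python
# Hypothetical pull-model routing: each Unit has a processing callback
# and a list of upstream units. The engine neither generates nor
# interprets signals; it only moves buffers between units.

class Unit:
    def __init__(self, fn, inputs=()):
        self.fn = fn                # the unit's processing callback
        self.inputs = list(inputs)  # upstream units feeding this one
    def pull(self, nframes):
        # Recursively pull upstream buffers, then run this unit.
        bufs = [u.pull(nframes) for u in self.inputs]
        return self.fn(nframes, bufs)

# A ramp generator, a constant generator, and a mixer with two inputs:
saw  = Unit(lambda n, _: [i / n for i in range(n)])
ones = Unit(lambda n, _: [1.0] * n)
mix  = Unit(lambda n, bufs: [sum(s) for s in zip(*bufs)], inputs=[saw, ones])

print(mix.pull(4))  # [1.0, 1.25, 1.5, 1.75]
```

The same pull works whether a source unit reads a sound card, a file, or an algorithm, which is exactly the agnosticism described above.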

We obviously have two kinds of signals: audio and control. And although I
suspect the correct way for the engine to work is to make no difference
between them, for optimization we could agree that they will be treated
differently (there's no need to process control signals 44100 times per
second). This implies that units should declare which input/output signals
are audio and which are control, but the engine should not make any
assumptions based on this.
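In code, the declaration could be nothing more than a rate tag on each port (a sketch under assumed names; the real convention would live in the middle layer discussed below):

```python
# Sketch: each port declares a rate class, so the engine may evaluate
# control ports once per processing block instead of once per sample,
# without ever knowing what the signals mean.

BLOCK = 64  # samples per processing block (an arbitrary example size)

ports = [
    {"name": "in",     "rate": "audio"},    # evaluated per sample
    {"name": "cutoff", "rate": "control"},  # evaluated once per block
]

def evaluations_per_block(port):
    # The engine uses only the declared rate class, nothing else.
    return BLOCK if port["rate"] == "audio" else 1

print([evaluations_per_block(p) for p in ports])  # [64, 1]
```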

Obviously this core is not enough to provide a common way for unit
developers to write units. So we should agree on a common middle layer
between the core and the units. Something called GearLib was suggested some
time ago, but its philosophy was completely different. Now we need something
more generic, a less "synthesizer"-oriented common lib. Something like:
"Yes, the core API takes care of sending input TO YOUR UNIT and getting the
output, but how do you know whether the data coming in is a Note On message,
raw audio data, control change data, or...?" Here the middle layer comes in:
it defines the standard ways to do the standard things, but leaves direct
access to the CORE API open, to do the strange things

That middle layer is what defines, for example:
1.- How control data should be packed to send it from an event generator to
a sound generator
2.- How a sound generator "exports" the description of its parameters to an
event generator or a UI element
3.- How to MIDI-sync an event generator to an external hardware/software
sequencer
4.- One or more unit templates
5.- One or more default UIs for every basic kind of unit
etc.
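Item 2, for instance, could be as simple as a convention like the following (a sketch with invented names, not a proposal for the actual wire format): a unit exports a self-describing parameter list, and any generic UI or event generator can consume it without special-case knowledge:

```python
# Hypothetical middle-layer convention: a unit describes its parameters
# (name, range, default) so a generic UI can build widgets and an event
# generator can emit valid control values, while the core stays ignorant
# of parameters entirely.

def describe_parameters():
    return [
        {"name": "cutoff",    "min": 20,  "max": 20000, "default": 1000},
        {"name": "resonance", "min": 0.0, "max": 1.0,   "default": 0.5},
    ]

# A generic consumer (UI builder or event generator) walks the list:
for p in describe_parameters():
    assert p["min"] <= p["default"] <= p["max"]
    print(f"{p['name']}: {p['min']}..{p['max']} (default {p['default']})")
```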


This way we are approaching the well-known philosophy of "Do one thing, but
do it well".

I'd welcome any opinions on all of this


-----Original Message-----
From: address@hidden [mailto:address@hidden] On behalf of
David O'Toole
Sent: Friday, August 30, 2002 22:38
To: address@hidden
Subject: Re: Re[2]: Sequencers


I think the important idea is to view an editor as a UI element, a way to
access a sequencer, whereas the sequencer actually does the event-generating.
You wouldn't _have_ to make the editors platform dependent, but because of
their UI-oriented nature, they're the element most prone to fall into that.
If you separate them from the sequencer conceptually, though, it doesn't
screw up the whole design if that happens.

This is pretty well different from the typical tracker concept that Octal
is an outgrowth of, where the user interface to the sequence data is closely
tied to its internal representation.

You've hit the nail right on the head: that's been the salient design
change. Allowing multiple editors/"sequence views" requires a more
generic concept of events and sequences.

OX_API plugins have different parameters, right? Well, a change in a
parameter is considered an event. A list of events is a sequence, etc.
(It's trivial to add a "discriminator" to the collection list, which could
for instance enumerate only the events matching a certain channel number,
for when the user wishes to edit by channel.)
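The event/sequence/discriminator idea above can be sketched in a few lines (names assumed for illustration, not the real OX_API types):

```python
# A parameter change is an event; a list of events is a sequence; a
# "discriminator" enumerates only the events matching a predicate,
# e.g. so the user can edit one channel at a time.

from dataclasses import dataclass

@dataclass
class Event:
    time: float    # when to dispatch
    channel: int
    param: str     # which plugin parameter changes
    value: float

sequence = [
    Event(0.0, 1, "volume", 0.8),
    Event(0.5, 2, "cutoff", 440.0),
    Event(1.0, 1, "volume", 0.2),
]

def discriminate(seq, pred):
    # The sequence itself stays whole; views are just filtered walks.
    return [e for e in seq if pred(e)]

channel1 = discriminate(sequence, lambda e: e.channel == 1)
print([e.time for e in channel1])  # [0.0, 1.0]
```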

The sequences of events and their API exist independently of any viewing
method. Basically this is a lot more like a MIDI sequencer than a tracker,
but ultimately it will be a lot more flexible. This is part of a little
research project into musical GUIs, so trust me, there'll be some
interesting developments there, and I'm always open to suggestions and
experiences people have had with a wide variety of musical gear/software.

Anyway, the sequencer is nothing more than an object that processes a
sequence (event list) and dispatches events to a plugin at the correct
time. But it's possible (as another poster pointed out) to have sequencers
that, say, algorithmically generate their data instead of reading it from a
sequence. I'm expecting this will be used more for automatic fades/sweeps of
plugin parameters than for algorithmic composition, but it should make
arpeggiators and such possible.
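A toy version of that dispatch loop (structure and names are mine, for illustration): the sequencer walks a time-sorted event list and fires each event once the clock reaches its timestamp:

```python
# Sketch: dispatch is whatever hands an event to a plugin. An
# "algorithmic" sequencer would simply compute the events instead of
# reading them from a list; the loop itself would not change.

def run_sequencer(events, dispatch, end_time, tick=0.25):
    events = sorted(events, key=lambda e: e[0])  # (time, payload) pairs
    i, t = 0, 0.0
    while t <= end_time:
        while i < len(events) and events[i][0] <= t:
            dispatch(events[i][1])  # deliver the event to the plugin
            i += 1
        t += tick

fired = []
run_sequencer([(0.0, "note_on"), (0.5, "cutoff=200")], fired.append, 1.0)
print(fired)  # ['note_on', 'cutoff=200']
```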

Not sure yet on what the GUI will be like, but the current codebase
already has the GUI builder code separate from the plugin API, so it
should be possible (trivial?) to allow "sequencer plugins" that export
OX_API parameters and allow the same GUI builder code to do the work.




_______________________________________________
Octal-dev mailing list
address@hidden
http://mail.gnu.org/mailman/listinfo/octal-dev


