
[Traverso-devel] zoom factor, related interpolation


From: Remon
Subject: [Traverso-devel] zoom factor, related interpolation
Date: Fri, 9 Feb 2007 15:15:08 +0100
User-agent: KMail/1.9.6

Hi,

Nicola has some issues with zooming on his PPC, and thought this would be
interesting to send to the list:


> Zooming is a difficult story on my PPC: It is very slow, and the
> hold-action delay drives me nuts! But on the dual core it works very well.
> One thing that could be improved, though: The zoom step size is rather big,
>  sometimes I have the choice between too small and too large. Could you
> reduce the factor a bit? Or even better: Make it an option in the config
> file?

Niklas had some good points too with respect to zooming, the playhead going
over regions (looping), and more.
This seems closely related: zooming by means of bars and beats, or based on
seconds, minutes and so on.
The logic for all of those is (I think) the same, and requires a floating point
zoom factor.

Right now, the zoom factor is a power of 2, which matches the way peaks are
calculated and stored on hard disk.

That means that if we want a zoom factor that falls between two powers of 2, we
need to do some interpolation.

Peak calculation starts by taking the highest value of the first 64 samples,
the next 64 samples, and so on.
So that's level 1/64.
Calculating level 1/128 is simple: pick the largest of the first two 1/64
values.
Level 1/256 -> largest value of the first two 1/128 values, and so on.
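
In code it would look roughly like this (just a sketch with made-up names, not
the actual peak code in Traverso):

    #include <vector>
    #include <algorithm>
    #include <cmath>

    // Level 0 holds the max of every 64 samples; each further level halves
    // the resolution by taking the max of two adjacent values of the level
    // below it. levelCount must be >= 1.
    std::vector<std::vector<float> > build_peak_levels(const std::vector<float>& samples,
                                                       int levelCount)
    {
        std::vector<std::vector<float> > levels(levelCount);

        // Level 1/64: highest absolute value of each chunk of 64 samples.
        for (size_t i = 0; i < samples.size(); i += 64) {
            float peak = 0.0f;
            for (size_t j = i; j < std::min(i + 64, samples.size()); ++j) {
                peak = std::max(peak, std::abs(samples[j]));
            }
            levels[0].push_back(peak);
        }

        // Level 1/(64 * 2^n): max of two adjacent values of the previous level.
        for (int n = 1; n < levelCount; ++n) {
            const std::vector<float>& prev = levels[n - 1];
            for (size_t i = 0; i + 1 < prev.size(); i += 2) {
                levels[n].push_back(std::max(prev[i], prev[i + 1]));
            }
        }
        return levels;
    }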

If the zoom factor is, say, 720, then the nearest lower level is 512 and the
upper one 1024; neither is sufficient for reasonable interpolation.
So, let's say, we use factor 256.
We take the first 3 values of that level and calculate the highest one, but now
we are at position 768; oops, we moved beyond 720. We record that offset, and
go on with the next 3 values of level 256 for the second interpolated value of
level 720.

We keep track of the offset, and at some sensible point we have to merge it
back into the calculation, to keep the interpolated values reasonably accurate
(see the sketch below).
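
The offset tracking could look like this (again just a sketch with made-up
names; it assumes the level-256 values are already loaded):

    #include <vector>
    #include <algorithm>

    // Build peak values for a non-power-of-2 factor (e.g. 720) from an
    // existing power-of-2 level (e.g. 256). Most output values take the max
    // of 3 source values (768 samples); the 48-sample overshoot is carried
    // as an offset, and once the accumulated offset is large enough a chunk
    // takes one value less, which keeps us in sync with the target grid.
    std::vector<float> interpolate_level(const std::vector<float>& source,
                                         int sourceFactor,   // e.g. 256
                                         int targetFactor)   // e.g. 720
    {
        std::vector<float> target;
        size_t pos = 0;    // current position in the source level
        long offset = 0;   // samples we ran ahead of the target grid so far

        while (pos < source.size()) {
            float peak = 0.0f;
            long covered = 0;
            // Take source values until the current target chunk is covered,
            // counting the overshoot carried over from the previous chunks.
            while (covered + offset < targetFactor && pos < source.size()) {
                peak = std::max(peak, source[pos++]);
                covered += sourceFactor;
            }
            offset = covered + offset - targetFactor;  // new overshoot
            target.push_back(peak);
        }
        return target;
    }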

I've been too lazy to make a formal approach for it; it's a fairly simple thing
though. The above describes it almost completely; the trick is to find a
suitable existing level to interpolate the new level from.


Another approach would be to load a level at least 2 steps below the given one
and apply a matrix calculation. That definitely works, with very accurate
interpolation I think, but might be a little expensive :-)
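
A rough sketch of what that could look like, if the matrix is taken to be the
fractional overlap between the finer chunks and the new level's chunks (made-up
names, and note this averages rather than takes the maximum, so peaks get
smoothed slightly):

    #include <vector>
    #include <algorithm>

    // Build a non-power-of-2 level (e.g. 720) from a level at least two
    // steps finer (e.g. 128). Each target value is a weighted sum of the
    // fine values; the weight of a fine chunk is the fraction of it that
    // overlaps the target chunk.
    std::vector<float> matrix_interpolate(const std::vector<float>& fine,
                                          int fineFactor,    // e.g. 128
                                          int targetFactor)  // e.g. 720
    {
        std::vector<float> target;
        const size_t count = (fine.size() * fineFactor) / targetFactor;

        for (size_t i = 0; i < count; ++i) {
            const double start = double(i) * targetFactor;
            const double end = start + targetFactor;
            double value = 0.0;

            const size_t first = size_t(start) / fineFactor;
            const size_t last = std::min(fine.size() - 1, size_t(end) / fineFactor);
            for (size_t j = first; j <= last; ++j) {
                const double chunkStart = double(j) * fineFactor;
                const double chunkEnd = chunkStart + fineFactor;
                // Fraction of fine chunk j that lies inside the target chunk.
                const double overlap = std::min(end, chunkEnd) - std::max(start, chunkStart);
                if (overlap > 0.0) {
                    value += fine[j] * (overlap / targetFactor);
                }
            }
            target.push_back(float(value));
        }
        return target;
    }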

Greetings,

Remon



