Re: [Octave Forge] Possible addition of gradient-descent framework for optim package


From: Daniel Kraft
Subject: Re: [Octave Forge] Possible addition of gradient-descent framework for optim package
Date: Fri, 23 Oct 2015 08:59:54 +0200
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101 Thunderbird/38.3.0

Hi Olaf!

On 2015-10-23 08:29, Olaf Till wrote:
> As for the algorithms: Gradient descent is not what I'd think
> desirable for 'optim'. Line search is e.g. implemented in core Octave's
> 'sqp' with a few lines of code, also Armijo with backtracking, but
> additionally constraints are handled, which would be necessary for a
> general algorithm employing line search. I wouldn't think a public
> function is needed for line search, since it is typically used as a
> component of another algorithm.
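
Indeed, in the vector-space setting an Armijo backtracking search only
needs a few lines.  A minimal sketch (my own illustration, not the
actual 'sqp' code, and ignoring constraints entirely):

  ## Backtracking line search with the Armijo sufficient-decrease
  ## condition.  f: objective, x: current point, d: descent direction,
  ## g: gradient of f at x.
  function t = armijo_backtrack (f, x, d, g)
    t = 1;        # initial step
    beta = 0.5;   # shrink factor
    c = 1e-4;     # sufficient-decrease constant
    fx = f (x);
    ## Shrink the step until f decreases by at least c * t * g' * d.
    while (f (x + t * d) > fx + c * t * (g' * d))
      t = beta * t;
    endwhile
  endfunction

  ## Example: from x = [1; 1] on f(x) = x' * x, stepping along -gradient:
  ## t = armijo_backtrack (@(x) x' * x, [1; 1], [-2; -2], [2; 2])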

I fully agree that gradient descent is not a very complicated
algorithm.  The issue is that the problem I'm working on
(shape optimisation) is much more challenging and very different from
optimisation in vector spaces where such algorithms are usually
formulated.  This is much more similar to optimisation on manifolds, as
far as I understand the general optimisation theory (which is not my
main topic of research).

For instance, the concepts of "point" and "direction" are very different
from each other.  You cannot simply do something like

  x_new = x_current + step * direction

I assume that all existing algorithm code would have to be rewritten for
that (but I may be wrong).
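
To make this concrete, here is a toy sketch of one gradient-descent
step on the unit sphere (my illustration, not my actual
shape-optimisation code): both the "direction" and the "point update"
need manifold-specific operations, namely a projection onto the
tangent space and a retraction back onto the manifold.

  ## One gradient-descent step on the unit sphere, ||x|| = 1.
  ## g is the Euclidean gradient of the objective at x.
  function x_new = sphere_step (x, g, step)
    ## "Direction": project -g onto the tangent space at x.
    d = -(g - (x' * g) * x);
    ## "Point update": a plain x + step * d leaves the sphere, so
    ## retract the result back onto the manifold by normalising.
    x_new = x + step * d;
    x_new = x_new / norm (x_new);
  endfunction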

Second-order methods and shape Hessians are also quite a complicated
topic (a colleague of mine is doing a PhD on it, but I don't think a
nice general theory exists that could serve as the basis for general
methods).  This is why I think that gradient descent is a suitable
method for my own research, and why I implemented it.

> You mention the possibility of storing and replaying the course of
> optimization. The existing frontends of 'optim' can call user-provided
> callbacks, with which it should be possible to implement this. Can't
> you interface the existing functions of 'optim' with your package by
> providing suitable callbacks, instead of writing separate optimization
> code? If some functionality for this should be lacking in 'optim', we
> could possibly add it.

Which callbacks do you mean here -- the function evaluation, gradient
computation, and so on?  I'm not sure whether these can be used to extract all
of the information I use -- which includes not just all points along the
optimisation, but also things like accepted line-search steps.  Is this
available from the callbacks?
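
What I have in mind is roughly the following (hypothetical -- I'm
assuming a hook that 'optim' calls once per iteration with the current
point; whether accepted line-search steps are also visible to such a
callback is exactly my question):

  ## Record every iterate so the optimisation run can be replayed.
  ## The hook's signature here is an assumption, not the real interface.
  global optimisation_history;
  optimisation_history = {};

  function stop = record_point (x)
    global optimisation_history;
    ## Append the current point to the shared history.
    optimisation_history{end + 1} = x;
    stop = false;  # never request early termination
  endfunction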

Summarising the discussion so far: I think the problem I'm tackling
with shape optimisation is quite different from the usual numerical
optimisation setting (mostly because it does not take place in a
vector space), and that one cannot easily fit it into the existing
optimisation backends.  Since it is still optimisation, I believe my
code could nevertheless fit into the optim package -- as a separate
category for optimisation on manifolds, for instance.  But I also
understand if you think this is too far outside the scope of the
package; in that case, maybe it fits best into level-set for shape
optimisation, unless a separate package for manifolds or shape
optimisation is created.

Yours,
Daniel

-- 
http://www.domob.eu/
OpenPGP: 1142 850E 6DFF 65BA 63D6  88A8 B249 2AC4 A733 0737
Namecoin: id/domob -> https://nameid.org/?name=domob
--
Done:  Arc-Bar-Cav-Hea-Kni-Ran-Rog-Sam-Tou-Val-Wiz
To go: Mon-Pri
