
Re: [Gneuralnetwork] Development Path


From: Jean Michel Sellier
Subject: Re: [Gneuralnetwork] Development Path
Date: Sun, 19 Jun 2016 15:25:48 +0200

Hi David,

Thanks for this very interesting email! Your idea could be very useful, and if you want to develop it we can see how to do it together. At the same time, I would say we also have to keep the level of control we have right now in Gneural Network, since many researchers are interested in this very fine level of detail.

Best,

JM



2016-06-19 6:38 GMT+02:00 David Mascharka <address@hidden>:
Hi everybody,

I've been playing around with the Gneural Network code and with some
neural net code of my own. I had the idea of using the GNU Scientific
Library for Gneural, since it includes BLAS support and some other nice
functionality. I made a small example, which you can see at
https://github.com/davidmascharka/neural-net-playing/blob/master/nn-gsl.c,
creating a fully-connected layer of 2 neurons with 3 inputs. The layer
framework is general, though: you can make the network as large or as
deep as you want.
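
To make the idea concrete, here is a minimal standalone sketch of that
layer (not the code from the repo above, just an illustration, and the
file name is made up): a random 2x3 weight matrix applied to a 3-vector
with GSL's BLAS dgemv wrapper, followed by an elementwise tanh. It
should build with something like: gcc nn-gsl-sketch.c -lgsl -lgslcblas -lm

/* Minimal sketch: a fully-connected layer as a GSL matrix-vector
 * product followed by tanh. Illustration only, not the repo code. */
#include <stdio.h>
#include <math.h>
#include <gsl/gsl_matrix.h>
#include <gsl/gsl_vector.h>
#include <gsl/gsl_blas.h>
#include <gsl/gsl_rng.h>

int main(void)
{
    const size_t n_in = 3, n_out = 2;

    gsl_rng_env_setup();
    gsl_rng *rng = gsl_rng_alloc(gsl_rng_default);

    /* Random weight matrix W (n_out x n_in), entries in [-1, 1). */
    gsl_matrix *W = gsl_matrix_alloc(n_out, n_in);
    for (size_t i = 0; i < n_out; i++)
        for (size_t j = 0; j < n_in; j++)
            gsl_matrix_set(W, i, j, 2.0 * gsl_rng_uniform(rng) - 1.0);

    /* Input vector x and output vector y. */
    gsl_vector *x = gsl_vector_alloc(n_in);
    gsl_vector *y = gsl_vector_alloc(n_out);
    gsl_vector_set(x, 0, 0.5);
    gsl_vector_set(x, 1, -1.0);
    gsl_vector_set(x, 2, 2.0);

    /* y = W x, via the BLAS dgemv wrapper. */
    gsl_blas_dgemv(CblasNoTrans, 1.0, W, x, 0.0, y);

    /* Elementwise tanh nonlinearity. */
    for (size_t i = 0; i < n_out; i++)
        gsl_vector_set(y, i, tanh(gsl_vector_get(y, i)));

    for (size_t i = 0; i < n_out; i++)
        printf("y[%zu] = %g\n", i, gsl_vector_get(y, i));

    gsl_matrix_free(W);
    gsl_vector_free(x);
    gsl_vector_free(y);
    gsl_rng_free(rng);
    return 0;
}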

I don't have any derivatives in there for backprop yet, so for now it's
just a random weight matrix applied to an input, followed by a tanh
nonlinearity. I don't mind hand-deriving gradients for operations like
addition, multiplication, and some of the nonlinearities, but I'm
planning to implement automatic differentiation for more flexibility
and easier use.
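
For what it's worth, the hand-derived backward pass for this particular
layer is short: with y = tanh(W x) and upstream gradient dy = dL/dy,
tanh'(z) = 1 - y^2 gives dz = dL/dz elementwise, the weight gradient is
the outer product dz x^T, and the input gradient is W^T dz. A sketch
continuing from the forward example above (W, x, y, n_in, n_out as
defined there; dy is a hypothetical vector holding dL/dy):

/* Backward pass for y = tanh(W x); dy holds the upstream gradient. */
gsl_vector *dz = gsl_vector_alloc(n_out);        /* dL/dz, z = W x */
gsl_vector *dx = gsl_vector_alloc(n_in);         /* dL/dx          */
gsl_matrix *dW = gsl_matrix_calloc(n_out, n_in); /* dL/dW, zeroed  */

/* tanh'(z) = 1 - tanh(z)^2 = 1 - y^2, so dz_i = dy_i * (1 - y_i^2). */
for (size_t i = 0; i < n_out; i++) {
    double yi = gsl_vector_get(y, i);
    gsl_vector_set(dz, i, gsl_vector_get(dy, i) * (1.0 - yi * yi));
}

gsl_blas_dger(1.0, dz, x, dW);                   /* dW = dz x^T    */
gsl_blas_dgemv(CblasTrans, 1.0, W, dz, 0.0, dx); /* dx = W^T dz    */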

What do you guys think about transitioning to an architecture more like
this? I think it's a lot more flexible than the current approach of
specifying a network neuron-by-neuron, especially if we want to develop
large networks like VGG that have millions of parameters.

Best,
David



