[Gneuralnetwork] Development Path

From: David Mascharka
Subject: [Gneuralnetwork] Development Path
Date: Sat, 18 Jun 2016 23:38:26 -0500
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101 Thunderbird/38.7.2

Hi everybody,

I've been playing around with the gneural network code and with some
neural net code of my own. I had the idea to use the GNU Scientific
Library for Gneural, since it includes BLAS support and some other nice
functionality. I made a small example, which you can see at [...],
creating a fully-connected layer of 2 neurons with 3 inputs. It's a
general layer framework, though: you can make it as large or deep as
you want.

I don't have any derivatives in there for backprop yet, so for now it
just applies a random weight matrix to an input, followed by a tanh
nonlinearity. I don't mind hand-deriving gradients for operations like
addition, multiplication, and some of the nonlinearities, but I'm
planning to implement automatic differentiation for more flexibility
and easier use.

What do you guys think about transitioning to an architecture more like
this? I think it's a lot more flexible than the current approach of
specifying a network neuron by neuron, especially if we want to develop
large networks like VGG, which have millions of parameters.

