Usage instructions
Neurose is used in much the same way as PyTorch. The example files contain working examples of building a neural network with Neurose.
Define a class that inherits Neurose's `Net`. The parent initializer takes the loss function and the learning rate as parameters: here the loss function is mean squared error and the learning rate is 0.02. As in PyTorch, you define the forward pass manually by transforming the input and returning it. Activation functions are applied with `call` and layers with `forward`. The network instance is passed to the activation functions and layers so that the values needed for backpropagation can be saved during the forward pass. Note that, unlike in PyTorch, if you don't want an activation function on a layer, you still have to use the `Passive` activation function and its `call` method in the forward pass (see the sketch below). See the README for a list of all activation functions.
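As a minimal sketch of the `Passive` case, the network below applies a single layer with no nonlinearity; that `Passive` is importable from `functions` is an assumption based on the imports in the example that follows:

```python
from net import Net
from layers import Linear
from functions import MeanSquaredError, Passive  # Passive's import location is an assumption

class NoActivation(Net):
    def __init__(self):
        super().__init__(MeanSquaredError, learning_rate=0.02)
        # Passive applies no nonlinearity, but its call still saves
        # the values backpropagation needs
        self.passive = Passive(self)
        self.layer1 = Linear(self, 3, 2)

    def forward_pass(self, input):
        # even without an activation, wrap the layer in Passive's call
        return self.passive.call(self.layer1.forward(input))
```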
For example, the following network
```python
from net import Net
from layers import Linear
from functions import MeanSquaredError, Sigmoid

class Example(Net):
    def __init__(self):
        # loss function and learning rate are passed to the parent initializer
        super().__init__(MeanSquaredError, learning_rate=0.02)
        self.activation1 = Sigmoid(self)
        self.layer1 = Linear(self, 3, 4)  # 3 inputs -> 4 hidden neurons
        self.layer2 = Linear(self, 4, 2)  # 4 hidden neurons -> 2 outputs

    def forward_pass(self, input):
        x = self.activation1.call(self.layer1.forward(input))
        x = self.activation1.call(self.layer2.forward(x))
        return x
```
would result in a fully connected network with three input neurons, a hidden layer of four neurons, and two output neurons.
Initialize your network:
```python
example = Example()
```
Resetting parameters between epochs. This has to be done because the inputs and outputs of each layer are saved on each forward pass.
```python
example.reset_saved_parameters()
```
Going through one forward pass. The input is expected to be a numpy array of shape `(batch_size, input_size)`; the output will be a numpy array of shape `(batch_size, output_size)`.
```python
output = example.forward(input)
```
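For instance, with the `Example` network above (three inputs, two outputs), a forward pass over a batch of five samples could look like this sketch; the random input is only for illustration:

```python
import numpy as np

batch_size = 5
input = np.random.rand(batch_size, 3)  # shape (batch_size, input_size)
output = example.forward(input)        # shape (batch_size, 2)
```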
Calculating the loss for a single batch. The loss function is the one given at initialization. Make sure that both `network_output` and `true_labels` are numpy arrays, and that the output has shape `(batch_size, output_size)`. If you're using mean squared error, the labels also have to be of shape `(batch_size, output_size)`; if you're using cross entropy loss, the labels have to be of shape `(batch_size,)`.
```python
loss = example.calculate_loss(network_output, true_labels)
```
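As a concrete sketch with the `Example` network (output size 2) and mean squared error, the label values below are made up for illustration:

```python
import numpy as np

network_output = example.forward(np.random.rand(2, 3))  # batch of 2 samples
true_labels = np.array([[0.0, 1.0],
                        [1.0, 0.0]])  # shape (batch_size, output_size) for MSE
loss = example.calculate_loss(network_output, true_labels)

# with cross entropy loss, the labels would instead be class indices:
# true_labels = np.array([1, 0])      # shape (batch_size,)
```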
Do some sweet deriving. See the notes on backpropagation for the math behind it.
```python
example.backpropagate()
```
Update the weights and biases of the network:
```python
example.update_weights()
```
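Putting the steps together, one possible training loop is sketched below; the epoch count and the `batches` iterable are placeholders for your own data pipeline, not part of Neurose:

```python
example = Example()

for epoch in range(100):                  # epoch count is arbitrary
    # saved layer inputs/outputs accumulate across forward passes,
    # so clear them between epochs as described above
    example.reset_saved_parameters()
    for inputs, labels in batches:        # `batches` yields numpy array pairs
        output = example.forward(inputs)  # (batch_size, output_size)
        loss = example.calculate_loss(output, labels)
        example.backpropagate()           # derive gradients from the saved values
        example.update_weights()          # apply the learning-rate update
```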