A lightweight neural network library for creating, training, and testing different models.
-
Model
-
Members
- layers ~ list of all Layer objects within the current model
-
Methods
- initialize(self, n, seed) ~ initializes all weights and biases across all layers. n is the number of units in the input layer; seed controls the random initialization.
- forward_propagate(self, x_in) ~ takes a training input x_in and propagates it through all layers.
- back_propagate(self, m, x_in, y_out, alpha) ~ performs backpropagation across all layers for a single epoch. y_out holds the actual y values for the current training set; m is the number of training examples and alpha is the learning rate.
- fit(self, X_train, Y_train, X_test, Y_test, alpha, epochs, seed, reg_rate) ~ trains the model across all epochs; see the usage sketch after this list. Returns training cost history, test cost history, training accuracy, and test accuracy.
- predict(self, x_test) ~ returns forward_propagate results for x_test
- summarize(self) ~ summarizes the model
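-
A minimal usage sketch of the Model API above. The module name nn, the Model and Layer constructors, and the Sigmoid activation class are assumptions for illustration; only the method names and signatures come from this list.

    import numpy as np
    from nn import Model, Layer, Sigmoid  # hypothetical module and class names

    # Toy binary-classification data; examples are assumed to be columns.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(4, 200))                               # 4 features, 200 examples
    Y_train = (X_train.sum(axis=0, keepdims=True) > 0).astype(float)
    X_test = rng.normal(size=(4, 50))
    Y_test = (X_test.sum(axis=0, keepdims=True) > 0).astype(float)

    # Assumed constructor: a Model wraps an ordered list of Layers.
    model = Model([Layer(8, Sigmoid()), Layer(1, Sigmoid())])

    # fit receives the seed, so it is assumed to call initialize(n, seed)
    # itself; it returns the four histories documented above.
    train_cost, test_cost, train_acc, test_acc = model.fit(
        X_train, Y_train, X_test, Y_test,
        alpha=0.1, epochs=100, seed=42, reg_rate=0.0)

    y_hat = model.predict(X_test)  # forward_propagate over all layers
    model.summarize()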
-
-
Layer
-
Members
- activation ~ holds activation function class for current layer
- units ~ number of neurons in current layer
- neurons ~ np.array holding activation values for each neuron
- W_l ~ weight matrix for current layer
- B_l ~ bias vector for current layer
-
Methods
- initialize(self, prev_layer_units) ~ initializes weights and biases within current layer.
- feed_forward(self, a_in) ~ forward propagates through the current layer; the weighted sum and activation calculations are performed here (see the sketch after this list).
- back_prop(self, del_J_z, prev_layer, alpha) ~ performs backprop through the current layer. Note: del_J_z is the partial derivative of the cost J with respect to the weighted sum z.
- summarize(self) ~ summarizes current layer
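-
A sketch of the forward and backward computations described above, assuming a batch laid out as columns and an activation object exposing f(z) and its derivative df(z). The Sigmoid class, constructor signature, and initialization scheme are assumptions; only the member and method names come from this document.

    import numpy as np

    class Sigmoid:
        # Assumed activation interface: f(z) and its derivative df(z).
        def f(self, z):
            return 1.0 / (1.0 + np.exp(-z))

        def df(self, z):
            a = self.f(z)
            return a * (1.0 - a)

    class Layer:
        def __init__(self, units, activation):
            self.units = units            # number of neurons in this layer
            self.activation = activation  # activation function object
            self.neurons = None           # activation values, set by feed_forward
            self.W_l = None               # weight matrix, shape (units, prev_layer_units)
            self.B_l = None               # bias vector, shape (units, 1)

        def initialize(self, prev_layer_units):
            # Small random weights and zero biases; a common default,
            # though the library's actual scheme may differ.
            self.W_l = np.random.randn(self.units, prev_layer_units) * 0.01
            self.B_l = np.zeros((self.units, 1))

        def feed_forward(self, a_in):
            # Weighted sum z = W_l a_in + B_l, then elementwise activation.
            self.z = self.W_l @ a_in + self.B_l  # cached for back_prop
            self.neurons = self.activation.f(self.z)
            return self.neurons

        def back_prop(self, del_J_z, prev_layer, alpha):
            # del_J_z is dJ/dz for this layer. Compute parameter gradients,
            # push dJ/dz back to the previous layer via the chain rule, then
            # take one gradient-descent step of size alpha.
            m = del_J_z.shape[1]                        # batch size
            dW = (del_J_z @ prev_layer.neurons.T) / m   # dJ/dW_l
            dB = del_J_z.mean(axis=1, keepdims=True)    # dJ/dB_l
            del_J_z_prev = (self.W_l.T @ del_J_z) * prev_layer.activation.df(prev_layer.z)
            self.W_l -= alpha * dW
            self.B_l -= alpha * dB
            return del_J_z_prev

    # One forward/backward pass through two stacked layers:
    hidden = Layer(4, Sigmoid()); hidden.initialize(3)
    output = Layer(2, Sigmoid()); output.initialize(4)
    a1 = hidden.feed_forward(np.random.randn(3, 5))  # 3 features, batch of 5
    a2 = output.feed_forward(a1)
    y = np.zeros((2, 5))                             # dummy targets
    dJ_dz = a2 - y                                   # dJ/dz for sigmoid + cross-entropy
    _ = output.back_prop(dJ_dz, hidden, alpha=0.1)

Note that del_J_z_prev is computed before the weight update, so the chain rule uses the same W_l that produced the forward pass.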
-