
python-biopython  1.60
Bio.NeuralNetwork.BackPropagation.Layer.InputLayer Class Reference
Inheritance diagram for Bio.NeuralNetwork.BackPropagation.Layer.InputLayer:
Collaboration diagram for Bio.NeuralNetwork.BackPropagation.Layer.InputLayer:


Public Member Functions

def __init__
def update
def backpropagate
def __str__
def set_weight

Public Attributes

 weights
 weight_changes
 values
 nodes

Private Attributes

 _next_layer

Detailed Description

Definition at line 58 of file Layer.py.


Constructor & Destructor Documentation

def Bio.NeuralNetwork.BackPropagation.Layer.InputLayer.__init__ (   self,
  num_nodes,
  next_layer 
)
Initialize the input layer.

Arguments:

o num_nodes -- The number of nodes in the input layer.

o next_layer -- The next layer in the neural network this is
connected to.

Reimplemented from Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.

Definition at line 59 of file Layer.py.

    def __init__(self, num_nodes, next_layer):
        """Initialize the input layer.

        Arguments:

        o num_nodes -- The number of nodes in the input layer.

        o next_layer -- The next layer in the neural network this is
        connected to.
        """
        AbstractLayer.__init__(self, num_nodes, 1)

        self._next_layer = next_layer

        # set up the weights -- random.uniform draws a float from the
        # range (random.randrange accepts only integer arguments)
        self.weights = {}
        for own_node in self.nodes:
            for other_node in self._next_layer.nodes:
                self.weights[(own_node, other_node)] = \
                                        random.uniform(-2.0, 2.0)

        # set up the weight changes
        self.weight_changes = {}
        for own_node in self.nodes:
            for other_node in self._next_layer.nodes:
                self.weight_changes[(own_node, other_node)] = 0.0

        # set up the calculated values for each node -- these will
        # actually just be set from inputs into the network.
        self.values = {}
        for node in self.nodes:
            # set the bias node -- always has a value of 1
            if node == 0:
                self.values[0] = 1
            else:
                self.values[node] = 0
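Stripped of the class machinery, the initialization above builds three dictionaries keyed by node indices, with node 0 reserved as the always-on bias. A minimal standalone sketch (the helper name and the plain list standing in for the next layer's nodes are hypothetical, not Biopython API):

```python
import random

def init_input_layer(num_nodes, next_layer_nodes):
    # node 0 is the bias node; the real input nodes are 1..num_nodes
    nodes = list(range(num_nodes + 1))
    # random starting weight from every node to every next-layer node
    weights = {(own, other): random.uniform(-2.0, 2.0)
               for own in nodes for other in next_layer_nodes}
    # previous weight changes (used by the momentum term) start at zero
    weight_changes = {key: 0.0 for key in weights}
    # node values: the bias is fixed at 1, inputs are filled in by update()
    values = {node: (1 if node == 0 else 0) for node in nodes}
    return nodes, weights, weight_changes, values

nodes, weights, weight_changes, values = init_input_layer(3, [0, 1])
```

With 3 input nodes plus the bias and a 2-node next layer, this yields 4 x 2 = 8 weight entries.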

Member Function Documentation

def Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.__str__ (   self ) [inherited]
Debugging output.

Definition at line 42 of file Layer.py.

    def __str__(self):
        """Debugging output.
        """
        return "weights: %s" % self.weights

def Bio.NeuralNetwork.BackPropagation.Layer.InputLayer.backpropagate (   self,
  outputs,
  learning_rate,
  momentum 
)
Recalculate all weights based on the last round of prediction.

Arguments:

o outputs -- The output info we are using to calculate error.

o learning_rate -- The learning rate of the network.

o momentum -- The amount of weight to place on the previous weight
change.

Definition at line 114 of file Layer.py.

    def backpropagate(self, outputs, learning_rate, momentum):
        """Recalculate all weights based on the last round of prediction.

        Arguments:

        o outputs -- The output info we are using to calculate error.

        o learning_rate -- The learning rate of the network.

        o momentum -- The amount of weight to place on the previous weight
        change.
        """
        # first backpropagate to the next layers
        next_errors = self._next_layer.backpropagate(outputs, learning_rate,
                                                     momentum)

        for this_node in self.nodes:
            for next_node in self._next_layer.nodes:
                error_deriv = (next_errors[next_node] *
                               self.values[this_node])

                delta = (learning_rate * error_deriv +
                        momentum * self.weight_changes[(this_node, next_node)])

                # apply the change to the weight
                self.weights[(this_node, next_node)] += delta

                # remember the weight change for next time
                self.weight_changes[(this_node, next_node)] = delta
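The inner loop above is a standard momentum update: each weight moves by the learning-rate-scaled error derivative plus a fraction of its previous change. A sketch of a single weight's update (the helper name is hypothetical, not Biopython API):

```python
def momentum_step(weight, prev_change, next_error, value,
                  learning_rate, momentum):
    # error derivative for this weight: downstream error times node value
    error_deriv = next_error * value
    # learning-rate term plus the momentum fraction of the last change
    delta = learning_rate * error_deriv + momentum * prev_change
    # return the updated weight and the change to remember for next time
    return weight + delta, delta

new_weight, delta = momentum_step(weight=1.0, prev_change=0.2,
                                  next_error=0.5, value=2.0,
                                  learning_rate=0.5, momentum=0.1)
# delta = 0.5 * (0.5 * 2.0) + 0.1 * 0.2 = 0.52, so new_weight = 1.52
```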

def Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.set_weight (   self,
  this_node,
  next_node,
  value 
) [inherited]
Set a weight value from one node to the next.

If weights are not explicitly set, they will be initialized to
random values to start with.

Reimplemented in Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer.

Definition at line 47 of file Layer.py.

    def set_weight(self, this_node, next_node, value):
        """Set a weight value from one node to the next.

        If weights are not explicitly set, they will be initialized to
        random values to start with.
        """
        if (this_node, next_node) not in self.weights:
            raise ValueError("Invalid node values passed.")

        self.weights[(this_node, next_node)] = value
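The guard above only allows overriding weight entries created at initialization. The same pattern can be exercised standalone (sketch, with a plain dict in place of the layer's weights attribute):

```python
def set_weight(weights, this_node, next_node, value):
    # only node pairs created at initialization may be overridden
    if (this_node, next_node) not in weights:
        raise ValueError("Invalid node values passed.")
    weights[(this_node, next_node)] = value

weights = {(0, 0): 0.5, (1, 0): -0.3}
set_weight(weights, 1, 0, 0.9)   # allowed: the pair exists
```

Passing a node pair that was never initialized raises ValueError rather than silently growing the weight table.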

def Bio.NeuralNetwork.BackPropagation.Layer.InputLayer.update (   self,
  inputs 
)
Update the values of the nodes using given inputs.

Arguments:

o inputs -- A list of inputs into the network -- this must be
equal to the number of nodes in the layer.

Definition at line 96 of file Layer.py.

    def update(self, inputs):
        """Update the values of the nodes using given inputs.

        Arguments:

        o inputs -- A list of inputs into the network -- this must be
        equal to the number of nodes in the layer.
        """
        if len(inputs) != len(self.values.keys()) - 1:
            raise ValueError("Inputs do not match input layer nodes.")

        # set the node values from the inputs
        for input_num in range(len(inputs)):
            self.values[input_num + 1] = inputs[input_num]

        # propagate the update to the next layer
        self._next_layer.update(self)
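Note that the length check compares against the node count minus one: the bias node 0 keeps its fixed value of 1, and each input fills nodes 1..N in order. That handling reduces to (sketch; set_inputs is a hypothetical name, not Biopython API):

```python
def set_inputs(values, inputs):
    # one input per non-bias node (node 0 is the bias, always 1)
    if len(inputs) != len(values) - 1:
        raise ValueError("Inputs do not match input layer nodes.")
    # copy inputs into nodes 1..N, leaving the bias node untouched
    for input_num, input_value in enumerate(inputs):
        values[input_num + 1] = input_value
    return values

values = set_inputs({0: 1, 1: 0, 2: 0}, [0.3, 0.7])
# values is now {0: 1, 1: 0.3, 2: 0.7}
```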

Member Data Documentation

_next_layer

Definition at line 71 of file Layer.py.

nodes [inherited]

Definition at line 38 of file Layer.py.

values

Definition at line 88 of file Layer.py.

weight_changes

Definition at line 81 of file Layer.py.

weights

Reimplemented from Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.

Definition at line 74 of file Layer.py.


The documentation for this class was generated from the following file:

Layer.py