
python-biopython  1.60
Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer Class Reference
Inheritance diagram for Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer: inherits Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.


Public Member Functions

def __init__
def update
def backpropagate
def get_error
def set_weight
def __str__

Public Attributes

 values
 nodes
 weights

Private Attributes

 _activation

Detailed Description

Definition at line 255 of file Layer.py.


Constructor & Destructor Documentation

def Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer.__init__ (   self,
  num_nodes,
  activation = logistic_function 
)
Initialize the Output Layer.

Arguments:

o num_nodes -- The number of nodes in this layer. This corresponds
to the number of outputs in the neural network.

o activation -- The transformation function used to transform
predicted values.

Reimplemented from Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.

Definition at line 256 of file Layer.py.

    def __init__(self, num_nodes, activation = logistic_function):
        """Initialize the Output Layer.

        Arguments:

        o num_nodes -- The number of nodes in this layer. This corresponds
        to the number of outputs in the neural network.

        o activation -- The transformation function used to transform
        predicted values.
        """
        AbstractLayer.__init__(self, num_nodes, 0)

        self._activation = activation

        self.values = {}
        for node in self.nodes:
            self.values[node] = 0
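
A minimal usage sketch, assuming the Bio.NeuralNetwork package from
Biopython 1.60 is importable (the package was removed from later Biopython
releases). The tanh_activation function is a hypothetical alternative to
the default logistic_function:

    import math

    from Bio.NeuralNetwork.BackPropagation.Layer import OutputLayer

    def tanh_activation(value):
        # hypothetical alternative transformation function
        return math.tanh(value)

    # Two output nodes with the default logistic activation; the output
    # layer has no bias node, so node numbers run from 1 to num_nodes.
    out_layer = OutputLayer(2)
    print(out_layer.values)  # {1: 0, 2: 0} -- all values start at zero

    # The same layer with the custom transformation function.
    tanh_layer = OutputLayer(2, activation=tanh_activation)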

Member Function Documentation

def Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer.__str__ (   self  )

Debugging output.

Definition at line 42 of file Layer.py.

    def __str__(self):
        """Debugging output.
        """
        return "weights: %s" % self.weights

def Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer.backpropagate (   self,
  outputs,
  learning_rate,
  momentum 
)
Calculate the backpropagation error terms for the output nodes.

This calculates the error term for each node using the formula:

p = (t - z) z (1 - z)

where z is the calculated value for the node, and t is the
real (target) value.

Arguments:

o outputs -- The list of target output values we use to calculate the
errors in our predictions.

o learning_rate, momentum -- Accepted for interface consistency with
the other layers; they are not used in this calculation.

Definition at line 292 of file Layer.py.

    def backpropagate(self, outputs, learning_rate, momentum):
        """Calculate the backpropagation error terms for the output nodes.

        This calculates the error term for each node using the formula:

        p = (t - z) z (1 - z)

        where z is the calculated value for the node, and t is the
        real (target) value.

        Arguments:

        o outputs -- The list of target output values we use to calculate the
        errors in our predictions.

        o learning_rate, momentum -- Accepted for interface consistency with
        the other layers; they are not used in this calculation.
        """
        errors = {}
        for node in self.nodes:
            calculated_value = self.values[node]
            real_value = outputs[node - 1]

            errors[node] = ((real_value - calculated_value) *
                            calculated_value *
                            (1 - calculated_value))

        return errors
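
A small worked example under the same import assumption as above: with a
single output node whose calculated value is z = 0.7 and a target of
t = 1.0, the error term is (1.0 - 0.7) * 0.7 * (1 - 0.7) = 0.063:

    from Bio.NeuralNetwork.BackPropagation.Layer import OutputLayer

    out_layer = OutputLayer(1)
    out_layer.values[1] = 0.7  # pretend update() produced this value
    # learning_rate and momentum are part of the signature but unused here
    errors = out_layer.backpropagate([1.0], learning_rate=0.1, momentum=0.9)
    print(errors)  # {1: 0.063} (up to floating point rounding)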

def Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer.get_error (   self,
  real_value,
  node_number 
)
Return the error value at a particular node.

Definition at line 318 of file Layer.py.

    def get_error(self, real_value, node_number):
        """Return the error value at a particular node.
        """
        predicted_value = self.values[node_number]
        return 0.5 * math.pow((real_value - predicted_value), 2)
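
This is the usual half squared-error measure, 0.5 * (t - z)^2. A quick
sketch under the same import assumption as above:

    from Bio.NeuralNetwork.BackPropagation.Layer import OutputLayer

    out_layer = OutputLayer(1)
    out_layer.values[1] = 0.7
    # 0.5 * (1.0 - 0.7) ** 2 = 0.045 (up to floating point rounding)
    print(out_layer.get_error(1.0, 1))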

def Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer.set_weight (   self,
  this_node,
  next_node,
  value 
)
Set a weight value from one node to the next.

The output layer is the last layer in the network and has no outgoing
connections, so setting weights is not supported here: calling this
method raises NotImplementedError.

Reimplemented from Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.

Definition at line 324 of file Layer.py.

    def set_weight(self, this_node, next_node, value):
        raise NotImplementedError("Can't set weights for the output layer")
def Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer.update (   self,
  previous_layer 
)
Update the value of output nodes from the previous layer.

Arguments:

o previous_layer -- The hidden layer preceding this one.

Definition at line 275 of file Layer.py.

    def update(self, previous_layer):
        """Update the value of output nodes from the previous layer.

        Arguments:

        o previous_layer -- The hidden layer preceding this one.
        """
        # update all of the nodes in this layer
        for update_node in self.nodes:
            # sum up the contribution from all of the previous inputs
            sum = 0.0
            for node in previous_layer.nodes:
                sum += (previous_layer.values[node] *
                        previous_layer.weights[(node, update_node)])

            self.values[update_node] = self._activation(sum)

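Each output node's value is the activation of the weighted sum of the
previous layer's values. A sketch using a hypothetical stand-in for the
preceding hidden layer, so that the example runs without building a full
network (same import assumption as above):

    from Bio.NeuralNetwork.BackPropagation.Layer import OutputLayer

    class FakeHiddenLayer(object):
        # hypothetical stand-in exposing the attributes update() reads
        def __init__(self):
            self.nodes = [0, 1]                        # node 0 is the bias node
            self.values = {0: 1.0, 1: 0.5}
            self.weights = {(0, 1): 0.2, (1, 1): 0.8}  # keyed (from_node, to_node)

    out_layer = OutputLayer(1)
    out_layer.update(FakeHiddenLayer())
    # logistic(1.0 * 0.2 + 0.5 * 0.8) = logistic(0.6), roughly 0.646
    print(out_layer.values[1])
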
Member Data Documentation

Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer._activation

Definition at line 269 of file Layer.py.

Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer.nodes

Definition at line 38 of file Layer.py.

Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer.values

Definition at line 271 of file Layer.py.


The documentation for this class was generated from the following file:

Layer.py