
python-biopython 1.60
Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer Class Reference
[Inheritance diagram for Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer omitted]
[Collaboration diagram for Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer omitted]


Public Member Functions

def __init__
def update
def backpropagate
def __str__
def set_weight

Public Attributes

 weights
 weight_changes
 values
 nodes

Private Attributes

 _next_layer
 _activation

Detailed Description

Holds the nodes of a hidden layer: update transforms the weighted values arriving from the previous layer with the activation function and propagates the result forward, while backpropagate adjusts the weights into the next layer and hands error terms back.

Definition at line 144 of file Layer.py.


Constructor & Destructor Documentation

def Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer.__init__(self, num_nodes, next_layer, activation = logistic_function)
Initialize a hidden layer.

Arguments:

o num_nodes -- The number of nodes in this hidden layer.

o next_layer -- The next layer in the neural network that this
layer is connected to.

o activation -- The transformation function used to transform
predicted values.

Definition at line 145 of file Layer.py.

    def __init__(self, num_nodes, next_layer, activation = logistic_function):
        """Initialize a hidden layer.

        Arguments:

        o num_nodes -- The number of nodes in this hidden layer.

        o next_layer -- The next layer in the neural network that this
        layer is connected to.

        o activation -- The transformation function used to transform
        predicted values.
        """
        AbstractLayer.__init__(self, num_nodes, 1)

        self._next_layer = next_layer
        self._activation = activation

        # set up the weights; random.uniform gives a float in [-2.0, 2.0]
        # (randrange, as in some older copies of this code, would only
        # ever yield the integers -2 through 1)
        self.weights = {}
        for own_node in self.nodes:
            for other_node in self._next_layer.nodes:
                self.weights[(own_node, other_node)] = \
                                        random.uniform(-2.0, 2.0)

        # set up the weight changes
        self.weight_changes = {}
        for own_node in self.nodes:
            for other_node in self._next_layer.nodes:
                self.weight_changes[(own_node, other_node)] = 0.0

        # set up the calculated values for each node
        self.values = {}
        for node in self.nodes:
            # bias node
            if node == 0:
                self.values[node] = 1
            else:
                self.values[node] = 0

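As a quick illustration of how these pieces fit together, here is a minimal construction sketch. It assumes the sibling OutputLayer class in this module can be built from a node count alone; that signature is an assumption based on the constructor pattern above, not something documented on this page.

    from Bio.NeuralNetwork.BackPropagation.Layer import HiddenLayer, OutputLayer

    # assumption: OutputLayer(num_nodes) mirrors the constructor pattern above
    output = OutputLayer(2)
    hidden = HiddenLayer(3, output)    # 3 hidden nodes feeding a 2-node output layer

    # weights are keyed by (own_node, other_node) tuples, and node 0 of the
    # hidden layer is the bias node whose value is pinned at 1
    print(sorted(hidden.weights))
    print(hidden.values[0])            # -> 1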

Member Function Documentation

def Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.__str__(self) [inherited]

Debugging output.

Definition at line 42 of file Layer.py.

    def __str__(self):
        """Debugging output.
        """
        return "weights: %s" % self.weights

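Since __str__ just formats the weight table, printing a layer is a cheap way to eyeball its state; continuing the hypothetical construction sketch above:

    print(hidden)    # -> weights: {(0, 1): 1.37, (0, 2): -0.64, ...} (random floats)
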
def Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer.backpropagate(self, outputs, learning_rate, momentum)
Recalculate all weights based on the last round of prediction.

Arguments:

o learning_rate -- The learning rate of the network

o momentum -- The amount of weight to place on the previous weight
change.

o outputs -- The output values we are using to see how good our
network is at predicting things.

Definition at line 205 of file Layer.py.

    def backpropagate(self, outputs, learning_rate, momentum):
        """Recalculate all weights based on the last round of prediction.

        Arguments:

        o learning_rate -- The learning rate of the network

        o momentum -- The amount of weight to place on the previous weight
        change.

        o outputs -- The output values we are using to see how good our
        network is at predicting things.
        """
        # first backpropagate to the next layers
        next_errors = self._next_layer.backpropagate(outputs, learning_rate,
                                                     momentum)

        # --- update the weights
        for this_node in self.nodes:
            for next_node in self._next_layer.nodes:
                error_deriv = (next_errors[next_node] *
                               self.values[this_node])

                delta = (learning_rate * error_deriv +
                        momentum * self.weight_changes[(this_node, next_node)])

                # apply the change to the weight
                self.weights[(this_node, next_node)] += delta

                # remember the weight change for next time
                self.weight_changes[(this_node, next_node)] = delta

        # --- calculate error terms
        errors = {}
        for error_node in self.nodes:
            # get the error info propagated from the next layer
            previous_error = 0.0
            for next_node in self._next_layer.nodes:
                previous_error += (next_errors[next_node] *
                                   self.weights[(error_node, next_node)])

            # get the correction factor
            corr_factor = (self.values[error_node] *
                           (1 - self.values[error_node]))

            # calculate the error
            errors[error_node] = previous_error * corr_factor

        return errors
                
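The inner loop above is a plain gradient step with momentum. Here is a standalone numeric sketch of a single connection's update; all the values are made up for illustration:

    learning_rate = 0.5
    momentum = 0.1

    value_this_node = 0.8     # this node's output from the last update()
    next_error = 0.05         # error term handed back by the next layer
    previous_change = 0.02    # weight change remembered from the last round

    error_deriv = next_error * value_this_node                        # 0.04
    delta = learning_rate * error_deriv + momentum * previous_change  # 0.022
    # the connection's weight grows by delta, and delta itself becomes
    # the next round's previous_change

Note that the values[error_node] * (1 - values[error_node]) correction factor in the listing is the derivative of the logistic curve, which is why logistic_function is the natural default activation here.
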
def Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.set_weight(self, this_node, next_node, value) [inherited]
Set a weight value from one node to the next.

If weights are not explicitly set, they will be initialized to
random values to start with.

Reimplemented in Bio.NeuralNetwork.BackPropagation.Layer.OutputLayer.

Definition at line 47 of file Layer.py.

    def set_weight(self, this_node, next_node, value):
        """Set a weight value from one node to the next.

        If weights are not explicitly set, they will be initialized to
        random values to start with.
        """
        if (this_node, next_node) not in self.weights:
            raise ValueError("Invalid node values passed.")

        self.weights[(this_node, next_node)] = value

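A short usage sketch, continuing the hypothetical hidden/output layers built under __init__ above; set_weight only accepts node pairs that the constructor actually created:

    hidden.set_weight(0, 1, 0.25)        # pin the weight from bias node 0 to next-layer node 1
    try:
        hidden.set_weight(99, 1, 0.25)   # connection (99, 1) was never created
    except ValueError as err:
        print(err)                       # -> Invalid node values passed.
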
def Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer.update(self, previous_layer)
Update the values of nodes from the previous layer info.

Arguments:

o previous_layer -- The previous layer in the network.

Definition at line 185 of file Layer.py.

    def update(self, previous_layer):
        """Update the values of nodes from the previous layer info.

        Arguments:

        o previous_layer -- The previous layer in the network.
        """
        # update each node in this layer (skipping the bias node at index 0)
        for update_node in self.nodes[1:]:
            # sum up the weighted inputs from the previous layer
            total = 0.0
            for node in previous_layer.nodes:
                total += (previous_layer.values[node] *
                          previous_layer.weights[(node, update_node)])

            self.values[update_node] = self._activation(total)

        # propagate the update to the next layer
        self._next_layer.update(self)

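To make the arithmetic concrete, here is a self-contained sketch of the weighted-sum-then-activate step that update performs for one node. The numbers are made up, and the hand-rolled logistic is assumed to behave like this module's logistic_function:

    import math

    def logistic(x):
        # standard logistic curve; assumed equivalent to logistic_function
        return 1.0 / (1.0 + math.exp(-x))

    # made-up previous layer: node 0 is its bias node, pinned at 1
    prev_values = {0: 1.0, 1: 0.6, 2: 0.3}
    # weights out of the previous layer into hidden node 1
    prev_weights = {(0, 1): 0.1, (1, 1): -0.4, (2, 1): 0.9}

    total = sum(prev_values[n] * prev_weights[(n, 1)] for n in prev_values)
    print(logistic(total))    # new value of hidden node 1, about 0.53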

Member Data Documentation

Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer._activation [private]

Definition at line 161 of file Layer.py.

Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer._next_layer [private]

Definition at line 160 of file Layer.py.

Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.nodes [inherited]

Definition at line 38 of file Layer.py.

Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer.values

Definition at line 177 of file Layer.py.

Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer.weight_changes

Definition at line 171 of file Layer.py.

Bio.NeuralNetwork.BackPropagation.Layer.HiddenLayer.weights

Reimplemented from Bio.NeuralNetwork.BackPropagation.Layer.AbstractLayer.

Definition at line 164 of file Layer.py.


The documentation for this class was generated from the following file:

Layer.py