
Training Algorithm for Full CPN:

Step 0: Set the initial weights and the initial learning rates.

Step 1: Perform Steps 2 to 7 when the stopping condition for phase I training is false.

Step 2: For each training input vector pair x:y presented, perform Steps 3 to 5.

Step 3: Set the X-input layer activations to vector x.

Set the Y-input layer activations to vector y.

Step 4: Find the winning cluster unit.

If the dot product method is used, find the cluster unit zj with the largest net input: for j = 1 to p,

zinj = ∑i xi·vij + ∑k yk·wkj  (i = 1 to n, k = 1 to m)

If the Euclidean distance method is used, find the cluster unit zj whose squared distance from the input vectors is smallest:

Dj = ∑i (xi - vij)^2 + ∑k (yk - wkj)^2

If a tie occurs when selecting the winning unit, the unit with the smallest index is the winner. Take the winning unit's index as J.
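
Both selection rules in Step 4 reduce to simple array operations. Below is a minimal NumPy sketch (the argument names and shapes are assumptions for illustration: v is the n×p matrix of weights from the X-input layer to the cluster layer, w the m×p matrix from the Y-input layer); np.argmax and np.argmin return the first maximizer or minimizer, which matches the smallest-index tie-breaking rule above.

import numpy as np

def winner_dot(x, y, v, w):
    # Dot product method: largest net input zinj = sum_i xi*vij + sum_k yk*wkj
    z_in = x @ v + y @ w
    return int(np.argmax(z_in))  # ties go to the smallest index

def winner_euclidean(x, y, v, w):
    # Euclidean method: smallest squared distance Dj
    d = ((x[:, None] - v) ** 2).sum(axis=0) + ((y[:, None] - w) ** 2).sum(axis=0)
    return int(np.argmin(d))     # ties go to the smallest index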

Step 5: Update the weights into the winning unit zJ.

For i = 1 to n, viJ(new) = viJ(old) + α[xi - viJ(old)]

For k = 1 to m, wkJ(new) = wkJ(old) + β[yk - wkJ(old)]

Step 6: Reduce the learning rates.

α(t+1) = 0.5α(t); β(t+1) = 0.5β(t)

Step 7: Test stopping condition for phase I training.
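
Steps 2 to 7 form one phase I epoch of Kohonen-style competitive learning: only the winner's incoming weights move toward the current training pair, and both learning rates are then halved. A sketch of one such epoch, reusing the hypothetical winner_euclidean above:

def phase1_epoch(pairs, v, w, alpha, beta):
    # pairs: iterable of (x, y) training vector pairs (Steps 2 and 3)
    for x, y in pairs:
        J = winner_euclidean(x, y, v, w)  # Step 4
        v[:, J] += alpha * (x - v[:, J])  # Step 5: viJ(new) = viJ(old) + alpha[xi - viJ(old)]
        w[:, J] += beta * (y - w[:, J])   #         wkJ(new) = wkJ(old) + beta[yk - wkJ(old)]
    return 0.5 * alpha, 0.5 * beta        # Step 6: halve both learning rates

The outer loop of Step 1 would call this repeatedly until the stopping condition holds (commonly a fixed epoch count or a negligible weight change; the text leaves the condition open).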

Step 8: Perform Steps 9 to 15 when the stopping condition for phase II training is false.

Step 9: Perform Steps 10 to 13 for each training input vector pair x:y. Here α and β are small constant values.

Step 10: Set the X-input layer activations to vector x. Set the Y-input layer activations to vector y.

Step 11: Find the winning cluster unit (using the formula from Step 4). Take the winning unit's index as J.

Step 12: Update the weights entering unit zJ.

For i = 1 to n, viJ(new) = viJ(old) + α[xi - viJ(old)]

For k = 1 to m, wkJ(new) = wkJ(old) + β[yk - wkJ(old)]

Step 13: Update the weights from unit zJ to the output layers.

For i = 1 to n, tJi(new) = tJi(old) + b[xi - tJi(old)]

For k = 1 to m, uJk(new) = uJk(old) + a[yk - uJk(old)]

Step 14: Reduce the learning rates a and b.

a(t+1) = 0.5a(t); b(t+1) = 0.5b(t)

Step 15: Test stopping condition for phase II training.
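
In code, phase II keeps nudging the instar weights v and w with the now-small constants α and β while training the outstar weights t (cluster unit to X-output layer) and u (cluster unit to Y-output layer) at rates b and a. A sketch of one phase II epoch under the same assumed layout, with t stored as a p×n array and u as a p×m array:

def phase2_epoch(pairs, v, w, t, u, alpha, beta, a, b):
    # alpha and beta are held at small constant values in this phase (Step 9)
    for x, y in pairs:
        J = winner_euclidean(x, y, v, w)  # Step 11: same rule as Step 4
        v[:, J] += alpha * (x - v[:, J])  # Step 12: weights entering zJ
        w[:, J] += beta * (y - w[:, J])
        t[J] += b * (x - t[J])            # Step 13: tJi(new) = tJi(old) + b[xi - tJi(old)]
        u[J] += a * (y - u[J])            #          uJk(new) = uJk(old) + a[yk - uJk(old)]
    return 0.5 * a, 0.5 * b               # Step 14: halve a and b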

Forward-only Counterpropagation network:

The forward-only CPN is a simplified version of the full CPN: it uses only the x vectors to form the clusters on the Kohonen
units during phase I training. The input vectors are presented to the input units; the weights between the input layer and the
cluster layer are trained first, and then the weights between the cluster layer and the output layer are trained. The result is a
competitive network in which the targets are known.

Training Algorithm for Forward-only CPN:

Step 0: Initialize the weights and learning rates.

Step 1: Perform Steps 2 to 7 when the stopping condition for phase I training is false.

Step 2: Perform Steps 3 to 5 for each training input vector x.

Step 3: Set the X-input layer activations to vector x.

Step 4: Compute the winning cluster unit J. If the dot product method is used, find the cluster unit zj with the largest net input: for j = 1 to p,

zinj = ∑i xi·vij  (i = 1 to n)

If the Euclidean distance method is used, find the cluster unit zj whose squared distance from the input pattern is smallest:

Dj = ∑i (xi - vij)^2

If a tie occurs in selecting the winner, the unit with the smallest index is chosen as the winner.

Step 5: Update the weights for unit zJ. For i = 1 to n,

viJ(new) = viJ(old) + α[xi - viJ(old)]

Step 6: Reduce the learning rate α:

α(t+1) = 0.5α(t)

Step 7: Test the stopping condition for phase I training.

Step 8: Perform Steps 9 to 15 when the stopping condition for phase II training is false.

Step 9: Perform Steps 10 to 13 for each training input pair x:y.

Step 10: Set the X-input layer activations to vector x. Set the Y-output layer activations to vector y.

Step 11: Find the winning cluster unit J (using the formula from Step 4).

Step 12: Update the weights entering unit zJ. For i = 1 to n,

viJ(new) = viJ(old) + α[xi - viJ(old)]

Step 13: Update the weights from unit zJ to the output units.

For k = 1 to m, wJk(new) = wJk(old) + β[yk - wJk(old)]

Step 14: Reduce the learning rate β:

β(t+1) = 0.5β(t)

Step 15: Test stopping condition for phase II training.
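
Putting the two phases together, the following sketch trains a forward-only CPN end to end. The fixed epoch counts standing in for the stopping conditions, the random initialization, and the function name are illustrative assumptions rather than part of the algorithm as stated:

import numpy as np

def train_forward_only_cpn(X, Y, p, alpha=0.5, beta=0.5,
                           phase1_epochs=10, phase2_epochs=10, seed=0):
    # X: (N, n) input vectors; Y: (N, m) target vectors; p: number of cluster units
    rng = np.random.default_rng(seed)
    n, m = X.shape[1], Y.shape[1]
    v = rng.random((n, p))  # input-to-cluster weights vij
    w = rng.random((p, m))  # cluster-to-output weights wjk

    def winner(x):  # Steps 4 and 11: Euclidean rule; argmin breaks ties toward the smallest index
        return int(np.argmin(((x[:, None] - v) ** 2).sum(axis=0)))

    # Phase I (Steps 1-7): cluster on the x vectors alone
    for _ in range(phase1_epochs):
        for x in X:
            J = winner(x)
            v[:, J] += alpha * (x - v[:, J])  # Step 5
        alpha *= 0.5                          # Step 6

    # Phase II (Steps 8-15): train the cluster-to-output weights toward the targets
    for _ in range(phase2_epochs):
        for x, y in zip(X, Y):
            J = winner(x)
            v[:, J] += alpha * (x - v[:, J])  # Step 12
            w[J] += beta * (y - w[J])         # Step 13
        beta *= 0.5                           # Step 14
    return v, w

After training, the response to a new input x would be the output-weight row w[J] of its winning cluster unit; recall itself is not covered in this excerpt.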
