Chap 11. DecisionMaker-PolyNet Translation Layer
11.1 Encoding and Decoding Methods
11.2 Parameters
11.3 Chapter Project: Encoding and Decoding
11.4 Wisconsin Breast Cancer Database
11.5 Test Run
Chap 11. DecisionMaker-PolyNet Translation Layer

Chapters 8 - 11 introduce the Presentation Layer component, Abm2. If you do not want to know the internal structure, please skip chapters 8, 9, 10, and 11. We will assume you are familiar with the Attrasoft DecisionMaker software.
This chapter discusses how to format the DecisionMaker data for the PolyNet. Attrasoft DecisionMaker is software that generates functions from historical data. The last chapter discussed the DecisionMaker in detail. The DecisionMaker uses the Abm algorithm, not the PolyNet algorithm. In this chapter, we replace the Abm with the PolyNet.
The Abm2 class library is an interface between the PolyApplet and the data used by the DecisionMaker software. There are two matching algorithms in the PolyApplet: Abm and PolyNet. The Abm2 class library will translate data for both algorithms. The Abm2 class library will play four roles:
Data Source          Data Consumer   Description
Predictor Data       Abm             Chap 8
Predictor Data       PolyNet         Chap 9
DecisionMaker Data   Abm             Chap 10
DecisionMaker Data   PolyNet         Chap 11

We introduced the Abm2 class library in the last chapter. In this chapter, we will only discuss topics related to the PolyNet.

11.1 Encoding and Decoding Methods
The Encoding function is:

bool decisionMakerEncodePolyLinear ();

The Decoding functions are:

bool decisionMakerDecodePolyLinearInt ();
bool decisionMakerDecodePolyLinearReal ();

11.2 Parameters

The Parameter is:

public double emptyField;

The Precision-level is fixed at 10 and is not a parameter for the PolyNet. The Empty-Field means that whenever the Predictor meets this number, it will ignore it. This parameter handles cases where there are missing entries in the historical data.
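The Empty-Field idea can be illustrated with a small sketch. This is a Python illustration, not the Abm2 code; the `EMPTY_FIELD` value and the `usable_values` helper are hypothetical:

```python
EMPTY_FIELD = -1.0  # hypothetical sentinel marking a missing entry

def usable_values(row, empty_field=EMPTY_FIELD):
    """Drop every entry equal to the Empty-Field sentinel, so missing
    data in the historical file is simply ignored, not treated as a value."""
    return [v for v in row if v != empty_field]

# A row with two missing entries keeps only the real measurements:
print(usable_values([5.0, -1.0, 3.0, -1.0]))  # [5.0, 3.0]
```

The point of making Empty-Field a settable `double` is that different data sets mark missing entries differently (0, -1, 999, ...), so the sentinel must be chosen to be a value that never occurs as real data.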
Figure 11.1 Chapter 11 Project.
11.3 Chapter Project: Encoding and Decoding
The chapter project is basically the same as the last one, except that the Abm encoding/decoding is replaced by PolyNet encoding/decoding. The file I/O and neural computing parts have already been implemented; the only remaining work is data encoding and decoding.
There are two pairs of Encoding and Decoding.

Integer Prediction:

private void button18_Click_1(object sender, System.EventArgs e)
{
    if ( y.decisionMakerEncodePolyLinear () )
        richTextBox1.AppendText ( "Encoding End!\n" );
}

private void button7_Click_2(object sender, System.EventArgs e)
{
    if ( y.decisionMakerDecodePolyLinearInt () )
        y.openOutputFile ();
}

Real Prediction:

private void button19_Click_1(object sender, System.EventArgs e)
{
    if ( y.decisionMakerEncodePolyLinear () )
        richTextBox1.AppendText ( "Encoding End!\n" );
}

private void button11_Click_2(object sender, System.EventArgs e)
{
    if ( y.decisionMakerDecodePolyLinearReal () )
        y.openOutputFile ();
}
11.4 Wisconsin Breast Cancer Database

We introduced the Wisconsin Breast Cancer database in the last chapter. Below are the last 20 rows:
ID        Question File              Answer
1368882 2 1 1 1 2 1 1 1 1 2
1369821 10 10 10 10 5 10 10 10 7 4
1371026 5 10 10 10 4 10 5 6 3 4
1371920 5 1 1 1 2 1 3 2 1 2
466906 1 1 1 1 2 1 1 1 1 2
466906 1 1 1 1 2 1 1 1 1 2
534555 1 1 1 1 2 1 1 1 1 2
536708 1 1 1 1 2 1 1 1 1 2
566346 3 1 1 1 2 1 2 3 1 2
603148 4 1 1 1 2 1 1 1 1 2
654546 1 1 1 1 2 1 1 1 8 2
654546 1 1 1 3 2 1 1 1 1 2
695091 5 10 10 5 4 5 4 4 1 4
714039 3 1 1 1 2 1 1 1 1 2
763235 3 1 1 1 2 1 2 1 2 2
776715 3 1 1 1 3 2 1 1 1 2
841769 2 1 1 1 2 1 1 1 1 2
888820 5 10 10 3 7 3 8 10 2 4
897471 4 8 6 4 3 4 10 6 1 4
897471 4 8 8 5 4 5 10 4 1 4
These 20 rows are further divided into 2 groups:

- The first group consists of columns 2 - 10; it forms the Question file for the DecisionMaker.
- The last column provides the correct diagnosis concerning whether a patient has cancer or not.

The DecisionMaker is expected to produce the Answer file, which reflects the correct answer located in the last column.
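Assuming a whitespace-separated file laid out like the rows above (ID, nine attributes, diagnosis), the split into Question and Answer data can be sketched as follows; `split_row` is a hypothetical helper, not part of Abm2:

```python
def split_row(line):
    """Split one data row into ID, the nine Question-file attributes,
    and the Answer-file diagnosis (2 = benign, 4 = malignant)."""
    fields = line.split()
    row_id = fields[0]
    question = fields[1:10]   # columns 2 - 10: the nine attributes
    answer = fields[10]       # last column: the correct diagnosis
    return row_id, question, answer

rid, q, a = split_row("1368882 2 1 1 1 2 1 1 1 1 2")
print(rid, q, a)  # 1368882 ['2', '1', '1', '1', '2', '1', '1', '1', '1'] 2
```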
Step 1. Files.

DecisionMaker Train: cancer1a.txt
DecisionMaker Recognition: cancer1b.txt

The following are done automatically:

Train: example1a.txt
Recognition: example1c.txt
Neural Output: example1d.txt
Output: example2c.txt

Step 2. Encoding.

Click the "DM PolyNet Int" encoding button.
Step 3. Neural Computing.
Click the “P Distribute” button (P = PolyNet) to complete the neural computing.
Step 4. Decoding.
Click the “DM PolyNet Int” decoding button.
Here is the output:
---------------------------------------------------
2 1 1 1 2 1 1 1 1
Possibility Confidence*Probability
2 4055
---------------------------------------------------
Weighted Average
2
Highest Probability
2 4055
Error of each number
0.11
---------------------------------------------------
10 10 10 10 5 10 10 10 7
Possibility Confidence*Probability
2 641
4 2851
---------------------------------------------------
Weighted Average
4
Highest Probability
4 2851
Error of each number
0.11
---------------------------------------------------
5 10 10 10 4 10 5 6 3
Possibility Confidence*Probability
2 640
4 2793
---------------------------------------------------
Weighted Average
4
Highest Probability
4 2793
Error of each number
0.11
---------------------------------------------------
5 1 1 1 2 1 3 2 1
Possibility Confidence*Probability
2 5431
---------------------------------------------------
Weighted Average
2
Highest Probability
2 5431
Error of each number
0.11
---------------------------------------------------
1 1 1 1 2 1 1 1 1
Possibility Confidence*Probability
2 3729
---------------------------------------------------
Weighted Average
2
Highest Probability
2 3729
Error of each number
0.11
---------------------------------------------------
1 1 1 1 2 1 1 1 1
Possibility Confidence*Probability
2 3729
---------------------------------------------------
Weighted Average
2
Highest Probability
2 3729
Error of each number
0.11
---------------------------------------------------
1 1 1 1 2 1 1 1 1
Possibility Confidence*Probability
2 3729
---------------------------------------------------
Weighted Average
2
Highest Probability
2 3729
Error of each number
0.11
---------------------------------------------------
1 1 1 1 2 1 1 1 1
Possibility Confidence*Probability
2 3729
---------------------------------------------------
Weighted Average
2
Highest Probability
2 3729
Error of each number
0.11
---------------------------------------------------
3 1 1 1 2 1 2 3 1
Possibility Confidence*Probability
2 6468
---------------------------------------------------
Weighted Average
2
Highest Probability
2 6468
Error of each number
0.11
---------------------------------------------------
4 1 1 1 2 1 1 1 1
Possibility Confidence*Probability
2 4284
---------------------------------------------------
Weighted Average
2
Highest Probability
2 4284
Error of each number
0.11
---------------------------------------------------
1 1 1 1 2 1 1 1 8
Possibility Confidence*Probability
2 1874
---------------------------------------------------
Weighted Average
2
Highest Probability
2 1874
Error of each number
0.11
---------------------------------------------------
1 1 1 3 2 1 1 1 1
Possibility Confidence*Probability
2 2056
---------------------------------------------------
Weighted Average
2
Highest Probability
2 2056
Error of each number
0.11
---------------------------------------------------
5 10 10 5 4 5 4 4 1
Possibility Confidence*Probability
2 1271
4 1638
---------------------------------------------------
Weighted Average
3
Highest Probability
4 1638
Error of each number
0.11
---------------------------------------------------
3 1 1 1 2 1 1 1 1
Possibility Confidence*Probability
2 4580
---------------------------------------------------
Weighted Average
2
Highest Probability
2 4580
Error of each number
0.11
---------------------------------------------------
3 1 1 1 2 1 2 1 2
Possibility Confidence*Probability
2 6195
---------------------------------------------------
Weighted Average
2
Highest Probability
2 6195
Error of each number
0.11
---------------------------------------------------
3 1 1 1 3 2 1 1 1
Possibility Confidence*Probability
2 1413
---------------------------------------------------
Weighted Average
2
Highest Probability
2 1413
Error of each number
0.11
---------------------------------------------------
2 1 1 1 2 1 1 1 1
Possibility Confidence*Probability
2 4055
---------------------------------------------------
Weighted Average
2
Highest Probability
2 4055
Error of each number
0.11
---------------------------------------------------
5 10 10 3 7 3 8 10 2
Possibility Confidence*Probability
2 638
4 2698
---------------------------------------------------
Weighted Average
4
Highest Probability
4 2698
Error of each number
0.11
---------------------------------------------------
4 8 6 4 3 4 10 6 1
Possibility Confidence*Probability
2 1259
4 1634
---------------------------------------------------
Weighted Average
3
Highest Probability
4 1634
Error of each number
0.11
---------------------------------------------------
4 8 8 5 4 5 10 4 1
Possibility Confidence*Probability
2 1256
4 1630
---------------------------------------------------
Weighted Average
3
Highest Probability
4 1630
Error of each number
0.11

The prediction is 100% correct.
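Each block of the output pairs a candidate answer (Possibility) with a Confidence*Probability weight, and the Weighted Average and Highest Probability lines follow from those pairs. The sketch below is a Python illustration consistent with the numbers shown, not the PolyApplet code; `summarize` is a hypothetical helper:

```python
def summarize(pairs):
    """pairs: list of (possibility, confidence*probability) rows
    as printed in one output block above."""
    total_weight = sum(w for _, w in pairs)
    # Weighted Average: weight each possibility by its score, round to an integer
    weighted_avg = round(sum(v * w for v, w in pairs) / total_weight)
    # Highest Probability: the possibility carrying the largest weight
    best = max(pairs, key=lambda p: p[1])
    return weighted_avg, best

# The block listing possibilities 2 (weight 1271) and 4 (weight 1638)
# reports Weighted Average 3 and Highest Probability 4 1638:
print(summarize([(2, 1271), (4, 1638)]))  # (3, (4, 1638))
```

Note that the two summaries can disagree: in that block the weighted average rounds to 3 while the highest-probability answer is 4, which is why the output prints both.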