Perceptron.ctxt
#BlueJ class context
comment0.target=Perceptron
comment0.text=\r\n\ The\ Perceptron\ class\ implements\ a\ feed-forward\ neural\ network\ with\ a\ configurable\ number\r\n\ of\ input\ nodes,\ output\ nodes,\ hidden\ layers,\ and\ nodes\ in\ each\ hidden\ layer.\ The\ Perceptron\r\n\ object\ asks\ for\ a\ configuration\ file\ at\ run-time\ when\ its\ main\ method\ is\ run.\ The\ configuration\r\n\ file\ specifies\ information\ such\ as\ the\ number\ of\ nodes\ in\ each\ layer,\ the\ value\ of\ lambda\ (the\ learning\ factor),\r\n\ the\ maximum\ number\ of\ iterations,\ a\ file\ for\ weights\ (or\ 'randomize'\ if\ the\ weights\ should\ be\ generated\ randomly),\r\n\ an\ inputs\ file,\ an\ outputs\ file,\ and\ a\ lower\ and\ upper\ bound\ on\ randomly\ generated\ weights.\r\n\ A\ Perceptron\ object\ can\ be\ trained\ on\ input\ cases\ via\ gradient\ descent,\ and\ will\ stop\ training\r\n\ if\ the\ error\ drops\ below\ a\ threshold\ or\ a\ maximum\ number\ of\ iterations\ is\ reached.\ The\r\n\ backpropagation\ algorithm\ is\ used\ when\ training\ the\ perceptron.\r\n\ \r\n\ @author\ Russell\ Yang\r\n\ @version\ 9/4/2019\ (creation\ date)\r\n
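
For orientation, a minimal usage sketch in Java follows. The file name "config.txt" is a placeholder, and the sketch relies only on the Perceptron(String) constructor and the gradientDescent() method documented below; the class's own main method prompts for the configuration file at run-time, as described above.

// Minimal usage sketch (hypothetical file name "config.txt"): build a
// Perceptron from a configuration file, then train it with gradient
// descent until the stopping error or the maximum iteration count is hit.
public class PerceptronDemo
{
    public static void main(String[] args)
    {
        Perceptron network = new Perceptron("config.txt"); // reads all settings from the file
        network.gradientDescent();                         // trains and prints the results
    }
}
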
comment1.params=inputNodes\ hiddenLayerNodes\ outputNodes\ lambda\ maxIterations\ stoppingError\ weightsFile\ trainingCases\ outputsFile\ lowerBound\ upperBound
comment1.target=Perceptron(int,\ int[],\ int,\ double,\ int,\ double,\ java.lang.String,\ java.lang.String,\ java.lang.String,\ double,\ double)
comment1.text=\r\n\ Constructor\ for\ the\ Perceptron\ class\ with\ parameters.\ Sets\ instance\ variables\ to\ values\ based\ on\ the\ parameters,\ using\ the\r\n\ helper\ method\ setInstanceVariables.\r\n\ \r\n\ @param\ inputNodes\ the\ number\ of\ nodes\ that\ the\ network\ uses\ to\ take\ in\ inputs\r\n\ @param\ hiddenLayerNodes\ an\ array\ where\ each\ element\ is\ the\ number\ of\ nodes\ in\ a\ hidden\ layer\ of\ the\ network,\r\n\ \ \ \ \ \ \ \ and\ the\ length\ of\ the\ array\ is\ the\ number\ of\ hidden\ layers\r\n\ @param\ outputNodes\ the\ number\ of\ nodes\ in\ the\ output\ layer\r\n\ @param\ lambda\ a\ value\ of\ lambda,\ the\ learning\ factor\r\n\ @param\ maxIterations\ the\ maximum\ number\ of\ iterations\ the\ network\ will\ be\ trained\ for\r\n\ @param\ stoppingError\ a\ threshold;\ the\ network\ will\ stop\ if\ the\ total\ error\ drops\ below\ it\r\n\ @param\ weightsFile\ a\ path\ to\ a\ file\ of\ weights\ or\ the\ word\ "randomize".\ If\ weightsFile\ is\ "randomize",\ weights\ will\r\n\ \ \ \ \ \ \ \ be\ generated\ according\ to\ a\ specified\ lower\ and\ upper\ bound.\ If\ weightsFile\ is\ a\ path\ to\ a\ file\ of\ weights,\r\n\ \ \ \ \ \ \ \ the\ weights\ are\ whitespace\ delimited\ and\ each\ new\ line\ represents\ a\ different\ value\ for\ the\ connectivity\ layer\ index\ (m).\r\n\ \ \ \ \ \ \ \ For\ example,\ for\ a\ 2-2-1\ network,\ the\ text\ file\ will\ be\ structured\ as\ follows\:\ \r\n\ \ \ \ \ \ \ \ w000\ w001\ w010\ w011\r\n\ \ \ \ \ \ \ \ w100\ w110\r\n\ @param\ trainingCases\ a\ filename\ of\ the\ inputs\ file,\ where\ the\ first\ line\ consists\ of\ 2\ space\ separated\ integers.\r\n\ \ \ \ \ \ \ \ The\ first\ is\ the\ number\ of\ cases\ and\ the\ second\ is\ the\ number\ of\ inputs\ per\ case.\r\n\ @param\ outputsFile\ a\ filename\ where\ the\ file\ contains\ the\ theoretical\ outputs\ to\ be\ read.\ The\r\n\ \ \ \ \ \ \ \ first\ line\ consists\ of\ 2\ space\ separated\ integers\:\ the\ first\ is\ the\ number\ of\ values\ in\r\n\ \ \ \ \ \ \ \ each\ of\ the\ rows\ that\ follow,\ and\ the\ second\ is\ the\ number\ of\ rows.\ Each\ row\ in\ the\ file\ after\r\n\ \ \ \ \ \ \ \ the\ first\ line\ corresponds\ to\ one\ input\ case.\ Within\ each\ row,\ the\ elements\ are\ space-separated\r\n\ \ \ \ \ \ \ \ and\ the\ first\ element\ is\ the\ first\ output\ of\ the\ network,\ the\ second\ element\ is\ the\ second\r\n\ \ \ \ \ \ \ \ output\ of\ the\ network,\ etc.\ For\ example,\ for\ a\ neural\ network\ that\ is\ producing\ multiple\ outputs\r\n\ \ \ \ \ \ \ \ and\ is\ supposed\ to\ output\ OR,\ AND,\ and\ XOR\ in\ the\ first,\ second,\ and\ third\ outputs,\ the\ input\r\n\ \ \ \ \ \ \ \ cases\ would\ be\ all\ the\ different\ combinations\ of\ two\ boolean\ inputs\:\ (0,0);\ (0,1);\ (1,0);\ and\ (1,1).\r\n\ \ \ \ \ \ \ \ Thus,\ taking\ the\ first\ column\ to\ be\ the\ OR\ outputs,\ the\ second\ column\ to\ be\ the\ AND\ outputs,\ and\r\n\ \ \ \ \ \ \ \ the\ third\ column\ to\ be\ the\ XOR\ outputs,\ the\ outputsFile\ would\ look\ like\ this\:\r\n\ \ \ \ \ \ \ \ 3\ 4\r\n\ \ \ \ \ \ \ \ 0\ 0\ 0\r\n\ \ \ \ \ \ \ \ 1\ 0\ 1\r\n\ \ \ \ \ \ \ \ 1\ 0\ 1\r\n\ \ \ \ \ \ \ \ 1\ 1\ 0\r\n\ @param\ lowerBound\ a\ lower\ bound\ (inclusive)\ on\ the\ values\ of\ the\ randomly\ generated\ initial\ weights\r\n\ @param\ upperBound\ an\ upper\ bound\ (exclusive)\ on\ the\ values\ of\ the\ randomly\ generated\ initial\ weights\r\n
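
As an illustration of the parameter order above, the sketch below constructs a 2-2-1 network directly. The numeric values and file names are placeholders, not values taken from the project.

// Hypothetical values: 2 inputs, one hidden layer of 2 nodes, 1 output,
// lambda = 0.3, at most 100000 iterations, stop below a total error of
// 0.001, random weights drawn from [-1.0, 1.5).
int[] hidden = { 2 };
Perceptron network = new Perceptron(
        2,             // inputNodes
        hidden,        // hiddenLayerNodes
        1,             // outputNodes
        0.3,           // lambda
        100000,        // maxIterations
        0.001,         // stoppingError
        "randomize",   // weightsFile ("randomize" or a path to a weights file)
        "inputs.txt",  // trainingCases (placeholder inputs file)
        "outputs.txt", // outputsFile (placeholder outputs file)
        -1.0,          // lowerBound (inclusive)
        1.5);          // upperBound (exclusive)
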
comment10.params=x
comment10.target=double\ activationFunction(double)
comment10.text=\r\n\ This\ static\ method\ applies\ an\ activation\ function\ to\ a\ given\ double.\ It\ can\ be\ changed\ to\ different\r\n\ activation\ functions\ as\ the\ user\ wishes.\ The\ activation\ function\ takes\ a\ large\ input\ and\ "scales"\ it\r\n\ down\ to\ a\ value\ with\ a\ much\ smaller\ magnitude.\r\n\ \r\n\ @param\ x\ a\ double\ value\ which\ the\ activation\ function\ will\ be\ applied\ to\r\n
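
The documentation does not fix a particular activation function; a common choice, shown here purely as an assumption, is the sigmoid, which maps any real input into the interval (0, 1).

// One plausible implementation (an assumption, not necessarily the
// function used in this class): the sigmoid.
static double activationFunction(double x)
{
    return 1.0 / (1.0 + Math.exp(-x));
}
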
comment11.params=x
comment11.target=double\ activationFunctionDerivative(double)
comment11.text=\r\n\ This\ static\ method\ applies\ the\ derivative\ of\ an\ activation\ function\ to\ a\ given\ double.\ It\ can\ be\ changed\r\n\ as\ the\ user\ wishes.\r\n\ \r\n\ @param\ x\ a\ double\ value\ which\ the\ activation\ function\ will\ be\ applied\ to\r\n
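
If the sigmoid sketched above is the activation function, its derivative has the convenient closed form f'(x) = f(x) * (1 - f(x)); the sketch below assumes that choice.

// Derivative under the sigmoid assumption; pairs with the
// activationFunction sketch above.
static double activationFunctionDerivative(double x)
{
    double f = activationFunction(x);
    return f * (1.0 - f);
}
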
comment12.params=theoretical\ calculated
comment12.target=void\ updateWeightsBackprop(double[],\ double[])
comment12.text=\r\n\ Updates\ weights\ using\ the\ backpropagation\ algorithm.\ Only\ works\ for\ two\ connectivity\ layer\ networks.\ Uses\ the\r\n\ mathematical\ results\ found\ in\ Dr.\ Nelson's\ notes\:\ "3-Minimizing\ and\ Optimizing\ the\ Error\ Function."\ Note\ that\r\n\ Greek\ letters\ used\ in\ relation\ to\ j\ are\ uppercase\ while\ the\ ones\ related\ to\ i\ are\ lowercase.\ Thus\ (for\ example),\r\n\ omegaJ\ refers\ to\ capital\ omega\ J\ and\ omegaI\ refers\ to\ lowercase\ omega\ I.\ By\ taking\ activation\ states\ (without\r\n\ the\ activation\ function\ applied)\ from\ the\ rawActivations\ 2D\ array,\ we\ avoid\ doing\ more\ computation.\r\n\r\n\ @param\ theoretical\ an\ array\ of\ theoretical\ outputs\r\n\ @param\ calculated\ an\ array\ of\ calculated\ outputs\r\n
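
The mathematical results themselves live in the referenced notes; purely as an illustration of the standard update for a two-connectivity-layer (input-hidden-output) network, and not as this class's actual code, a self-contained sketch follows. Index conventions: k ranges over input nodes, j over hidden nodes, i over output nodes; thetaJ and thetaI hold the raw (pre-activation) sums saved during the forward pass, and the activationFunctionDerivative sketch above is reused.

// a  : input activations          h  : hidden activations
// F  : calculated outputs         T  : theoretical outputs
// w1 : input-to-hidden weights w1[k][j]
// w2 : hidden-to-output weights w2[j][i]
static void backpropUpdateSketch(double[] a, double[] h, double[] F,
                                 double[] thetaJ, double[] thetaI, double[] T,
                                 double[][] w1, double[][] w2, double lambda)
{
    // psi_i = (T_i - F_i) * f'(theta_i) for each output node i
    double[] psi = new double[F.length];
    for (int i = 0; i < F.length; i++)
    {
        psi[i] = (T[i] - F[i]) * activationFunctionDerivative(thetaI[i]);
    }

    for (int j = 0; j < h.length; j++)
    {
        // Omega_j = sum over i of psi_i * w2[j][i], computed before the
        // weights out of hidden node j are changed
        double omegaJ = 0.0;
        for (int i = 0; i < F.length; i++)
        {
            omegaJ += psi[i] * w2[j][i];
        }
        double psiJ = omegaJ * activationFunctionDerivative(thetaJ[j]);

        // hidden-to-output update: deltaW_ji = lambda * h_j * psi_i
        for (int i = 0; i < F.length; i++)
        {
            w2[j][i] += lambda * h[j] * psi[i];
        }

        // input-to-hidden update: deltaW_kj = lambda * a_k * Psi_j
        for (int k = 0; k < a.length; k++)
        {
            w1[k][j] += lambda * a[k] * psiJ;
        }
    }
}
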
comment13.params=
comment13.target=void\ gradientDescent()
comment13.text=\r\n\ The\ gradientDescent\ method\ minimizes\ the\ total\ error\ function\ by\ stepping\ the\ weights\ in\ the\ opposite\ direction\r\n\ of\ the\ gradient\ (the\ direction\ of\ steepest\ ascent).\ Backpropagation\ is\ used\ to\ speed\ up\ training\ and\ use\r\n\ fewer\ resources.\ See\ Dr.\ Nelson's\ course\ notes\ for\ the\ mathematical\ results\ used\ here.\r\n\ The\ network\ will\ stop\ running\ if\ the\ total\ error\ drops\ below\ the\ stoppingError\ threshold,\ or\ if\ the\ number\ of\ iterations\r\n\ reaches\ maxIterations.\ The\ method\ will\ also\ print\ out\ the\ input\ cases,\ the\ results\r\n\ (inputs,\ theoretical\ outputs,\ and\ actual\ outputs),\ and\ network\ configuration\ and\ information\r\n\ (number\ of\ iterations,\ stopping\ error\ threshold,\ reason\ for\ stopping,\ value\ of\ lambda,\r\n\ and\ final\ total\ error).\r\n
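
Structurally, the loop described above can be pictured as in the sketch below, written as if it sat inside the Perceptron class. The field names trainingInputs, theoreticalOutputs, stoppingError, and maxIterations are assumptions for illustration; only the called methods are taken from this documentation.

// Sketch of the training loop: run and update on every case, then check
// the two stopping conditions (total error threshold and iteration cap).
void gradientDescentSketch()
{
    double totalError = Double.MAX_VALUE;
    int iterations = 0;
    while (totalError >= stoppingError && iterations < maxIterations)
    {
        double[] caseErrors = new double[trainingInputs.length];
        for (int c = 0; c < trainingInputs.length; c++)
        {
            double[] calculated = runNetwork(trainingInputs[c], false);  // forward pass
            caseErrors[c] = calculateError(theoreticalOutputs[c], calculated);
            updateWeightsBackprop(theoreticalOutputs[c], calculated);    // backpropagation step
        }
        totalError = calculateTotalError(caseErrors);
        iterations++;
    }
}
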
comment14.params=theoreticalOutputs\ actualOutputs
comment14.target=double\ calculateError(double[],\ double[])
comment14.text=\r\n\ Calculates\ the\ error\ between\ a\ theoretical\ outputs\ array\ and\ actual\ outputs\ array\ according\ to\ the\ formulas\r\n\ in\ the\ design\ document.\r\n\ \r\n\ @param\ theoreticalOutputs\ the\ theoretical\ value\ of\ outputs\r\n\ @param\ actualOutputs\ the\ actual\ values\ of\ the\ outputs\r\n
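
The exact formula lives in the design document and is not restated here; one common convention, shown only as an assumption, is half the sum of squared differences between theoretical and actual outputs.

// Assumed per-case error: E = 1/2 * sum over i of (T_i - F_i)^2.
static double caseErrorSketch(double[] theoreticalOutputs, double[] actualOutputs)
{
    double sum = 0.0;
    for (int i = 0; i < theoreticalOutputs.length; i++)
    {
        double diff = theoreticalOutputs[i] - actualOutputs[i];
        sum += diff * diff;
    }
    return 0.5 * sum;
}
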
comment15.params=errorArr
comment15.target=double\ calculateTotalError(double[])
comment15.text=\r\n\ Calculates\ the\ total\ error\ in\ an\ array\ of\ case\ errors.\ Each\ element\ in\ the\ array\ is\ squared,\ and\ the\r\n\ squares\ are\ added\ together.\ Then,\ the\ square\ root\ of\ the\ total\ is\ returned.\r\n\ \r\n\ @param\ errorArr\ an\ array\ where\ each\ element\ is\ a\ case\ error\r\n
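
The rule above translates directly into a few lines of Java (shown as a standalone sketch):

// Square each case error, sum the squares, and return the square root.
static double totalErrorSketch(double[] errorArr)
{
    double sumOfSquares = 0.0;
    for (double e : errorArr)
    {
        sumOfSquares += e * e;
    }
    return Math.sqrt(sumOfSquares);
}
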
comment2.params=filename
comment2.target=Perceptron(java.lang.String)
comment2.text=\r\n\ Constructor\ for\ the\ Perceptron\ class\ that\ reads\ from\ a\ given\ configuration\ file.\ Sets\ instance\ variables\ to\ values\r\n\ based\ on\ the\ read\ values,\ using\ the\ helper\ method\ setInstanceVariables.\r\n\ \r\n\ @param\ filename\ the\ name\ of\ the\ configuration\ file\r\n\ \r\n\ Special\ considerations\:\ this\ constructor\ performs\ exception\ catching\ to\ catch\ a\ NumberFormatException,\ FileNotFoundException,\r\n\ or\ ArrayIndexOutOfBoundsException\ that\ may\ be\ thrown.\ It\ will\ throw\ a\ RuntimeException\ with\ a\ relevant\ message\r\n\ if\ any\ of\ those\ occurs\r\n
comment3.params=inputNodes\ hiddenLayerNodes\ outputNodes\ lambda\ maxIterations\ stoppingError\ weightsFile\ inputsFile\ outputsFile\ lowerBound\ upperBound
comment3.target=void\ setInstanceVariables(int,\ int[],\ int,\ double,\ int,\ double,\ java.lang.String,\ java.lang.String,\ java.lang.String,\ double,\ double)
comment3.text=\r\n\ A\ helper\ method\ that\ sets\ instance\ variable\ values\ based\ on\ values\ passed\ from\ either\ of\ the\ constructors.\r\n\ \r\n\ @param\ inputNodes\ the\ number\ of\ nodes\ that\ the\ network\ uses\ to\ take\ in\ inputs\r\n\ @param\ hiddenLayerNodes\ an\ array\ where\ each\ element\ is\ the\ number\ of\ nodes\ in\ a\ hidden\ layer\ of\ the\ network,\r\n\ \ \ \ \ \ \ \ and\ the\ length\ of\ the\ array\ is\ the\ number\ of\ hidden\ layers\r\n\ @param\ outputNodes\ the\ number\ of\ nodes\ in\ the\ output\ layer\r\n\ @param\ lambda\ a\ value\ of\ lambda,\ the\ learning\ factor\r\n\ @param\ maxIterations\ the\ maximum\ number\ of\ iterations\ the\ network\ will\ be\ trained\ for\r\n\ @param\ stoppingError\ a\ threshold;\ the\ network\ will\ stop\ if\ the\ total\ error\ drops\ below\ it\r\n\ @param\ weightsFile\ a\ path\ to\ a\ file\ of\ weights\ or\ the\ word\ "randomize".\ If\ weightsFile\ is\ "randomize",\ weights\ will\r\n\ \ \ \ \ \ \ \ be\ generated\ according\ to\ a\ specified\ lower\ and\ upper\ bound.\ If\ weightsFile\ is\ a\ path\ to\ a\ file\ of\ weights,\r\n\ \ \ \ \ \ \ \ the\ weights\ are\ whitespace\ delimited\ and\ each\ new\ line\ represents\ a\ different\ value\ for\ the\ connectivity\ layer\ index\ (m).\r\n\ \ \ \ \ \ \ \ For\ example,\ for\ a\ 2-2-1\ network,\ the\ text\ file\ will\ be\ structured\ as\ follows\:\ \r\n\ \ \ \ \ \ \ \ w000\ w001\ w010\ w011\r\n\ \ \ \ \ \ \ \ w100\ w110\r\n\ @param\ inputsFile\ a\ filename\ of\ the\ inputs\ file,\ where\ the\ first\ line\ consists\ of\ 2\ space\ separated\ integers.\r\n\ \ \ \ \ \ \ \ The\ first\ is\ the\ number\ of\ cases\ and\ the\ second\ is\ the\ number\ of\ inputs\ per\ case.\r\n\ @param\ outputsFile\ a\ filename\ where\ the\ file\ contains\ the\ theoretical\ outputs\ to\ be\ read.\ The\r\n\ \ \ \ \ \ \ \ first\ line\ consists\ of\ 2\ space\ separated\ integers\:\ the\ first\ is\ the\ number\ of\ values\ in\r\n\ \ \ \ \ \ \ \ each\ of\ the\ rows\ that\ follow,\ and\ the\ second\ is\ the\ number\ of\ rows.\ Each\ row\ in\ the\ file\ after\r\n\ \ \ \ \ \ \ \ the\ first\ line\ corresponds\ to\ one\ input\ case.\ Within\ each\ row,\ the\ elements\ are\ space-separated\r\n\ \ \ \ \ \ \ \ and\ the\ first\ element\ is\ the\ first\ output\ of\ the\ network,\ the\ second\ element\ is\ the\ second\r\n\ \ \ \ \ \ \ \ output\ of\ the\ network,\ etc.\ For\ example,\ for\ a\ neural\ network\ that\ is\ producing\ multiple\ outputs\r\n\ \ \ \ \ \ \ \ and\ is\ supposed\ to\ output\ OR,\ AND,\ and\ XOR\ in\ the\ first,\ second,\ and\ third\ outputs,\ the\ input\r\n\ \ \ \ \ \ \ \ cases\ would\ be\ all\ the\ different\ combinations\ of\ two\ boolean\ inputs\:\ (0,0);\ (0,1);\ (1,0);\ and\ (1,1).\r\n\ \ \ \ \ \ \ \ Thus,\ taking\ the\ first\ column\ to\ be\ the\ OR\ outputs,\ the\ second\ column\ to\ be\ the\ AND\ outputs,\ and\r\n\ \ \ \ \ \ \ \ the\ third\ column\ to\ be\ the\ XOR\ outputs,\ the\ outputsFile\ would\ look\ like\ this\:\r\n\ \ \ \ \ \ \ \ 3\ 4\r\n\ \ \ \ \ \ \ \ 0\ 0\ 0\r\n\ \ \ \ \ \ \ \ 1\ 0\ 1\r\n\ \ \ \ \ \ \ \ 1\ 0\ 1\r\n\ \ \ \ \ \ \ \ 1\ 1\ 0\r\n\ @param\ lowerBound\ a\ lower\ bound\ (inclusive)\ on\ the\ values\ of\ the\ randomly\ generated\ initial\ weights\r\n\ @param\ upperBound\ an\ upper\ bound\ (exclusive)\ on\ the\ values\ of\ the\ randomly\ generated\ initial\ weights\r\n
comment4.params=outputsFile
comment4.target=double[][]\ readOutputs(java.lang.String)
comment4.text=\r\n\ A\ helper\ method\ that\ reads\ theoretical\ outputs\ from\ a\ specified\ file\ name.\ The\ method\ is\ capable\ of\ reading\ the\ outputs\ for\ each\ combination\r\n\ of\ case\ and\ node.\r\n\ \r\n\ @param\ outputsFile\ a\ file\ name\ of\ the\ text\ file\ that\ specifies\ the\ theoretical\ outputs.\ The\ first\ line\ in\ the\ outputsFile\ should\ consist\r\n\ \ \ \ \ \ \ \ of\ two\ space-separated\ natural\ numbers.\ The\ first\ number\ specifies\ the\ number\ of\ outputs\ (ex\:\ 3\ if\ OR,\ AND,\ and\ XOR\ are\ the\ different\r\n\ \ \ \ \ \ \ \ output\ nodes).\ The\ second\ number\ specifies\ the\ number\ of\ cases\ per\ output\ (ex\:\ 4\ if\ the\ pairs\ 0,0;\ 0,1;\ 1,0;\ and\ 1,1\ are\ being\ used\r\n\ \ \ \ \ \ \ \ as\ boolean\ logic\ input\ cases).\ For\ example,\ if\ the\ user\ wants\ to\ use\ OR,\ AND,\ and\ XOR\ as\ the\ three\ outputs\ on\ all\ 4\ input\ pairs,\ the\r\n\ \ \ \ \ \ \ \ outputsFile\ should\ look\ like\ this\:\r\n\ \ \ \ \ \ \ \ 3\ 4\r\n\ \ \ \ \ \ \ \ 0\ 0\ 0\r\n\ \ \ \ \ \ \ \ 1\ 0\ 1\r\n\ \ \ \ \ \ \ \ 1\ 0\ 1\r\n\ \ \ \ \ \ \ \ 1\ 1\ 0\r\n\ @precondition\ the\ theoretical\ outputs\ file\ accounts\ for\ at\ least\ one\ output\ node\ and\ at\ least\ one\ case.\ If\ this\ is\ not\ satisfied,\ a\ relevant\r\n\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ RuntimeException\ with\ a\ descriptive\ error\ message\ will\ be\ thrown\r\n\ \r\n\ Special\ considerations\:\ this\ method\ performs\ exception\ catching\ to\ catch\ a\ NumberFormatException,\ FileNotFoundException,\r\n\ or\ ArrayIndexOutOfBoundsException\ that\ may\ be\ thrown.\ It\ will\ throw\ a\ RuntimeException\ with\ a\ relevant\ message\r\n\ if\ any\ of\ those\ occurs\r\n\ \r\n\ @return\ a\ 2D\ array\ outputs,\ which\ represents\ the\ theoretical\ outputs\ for\ each\ output\ and\ case\r\n
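
One way to parse this format is with java.util.Scanner, as sketched below; the [case][output] orientation of the returned array is an assumption, since the documentation above does not fix which dimension comes first.

import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;

public class ReadOutputsSketch
{
    // Reads the header (number of outputs, number of cases) and then one
    // row of theoretical outputs per case.
    public static double[][] readOutputs(String outputsFile) throws FileNotFoundException
    {
        Scanner in = new Scanner(new File(outputsFile));
        int numOutputs = in.nextInt();   // e.g. 3 for OR, AND, XOR
        int numCases = in.nextInt();     // e.g. 4 for the four boolean input pairs
        double[][] outputs = new double[numCases][numOutputs];
        for (int c = 0; c < numCases; c++)
        {
            for (int i = 0; i < numOutputs; i++)
            {
                outputs[c][i] = in.nextDouble();
            }
        }
        in.close();
        return outputs;
    }
}
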
comment5.params=arr
comment5.target=int\ arrayMax(int[])
comment5.text=\r\n\ A\ static\ helper\ method\ that\ finds\ the\ maximum\ value\ of\ an\ integer\ array.\ This\ is\ used\ in\ the\ code\ to\ find\r\n\ the\ maximum\ number\ of\ nodes\ in\ the\ hidden\ layer\ array.\r\n\ \r\n\ @param\ arr\ an\ array\ where\ the\ maximum\ value\ will\ be\ determined\r\n\ @return\ the\ maximum\ value\ in\ the\ array\ arr\r\n
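
A straightforward way to realize the behavior described above:

// Returns the largest element of arr; assumes arr has at least one element.
static int arrayMaxSketch(int[] arr)
{
    int max = arr[0];
    for (int i = 1; i < arr.length; i++)
    {
        if (arr[i] > max)
        {
            max = arr[i];
        }
    }
    return max;
}
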
comment6.params=filename
comment6.target=void\ readWeights(java.lang.String)
comment6.text=\r\n\ Reads\ in\ the\ weights\ from\ a\ text\ file\ OR\ generates\ random\ weights,\ depending\ on\ whether\ filename\ is\ a\ path\ to\ a\ file\ of\ weights\r\n\ or\ the\ word\ "randomize".\ If\ the\ word\ "randomize"\ is\ used,\ then\ weights\ will\ be\ generated\ randomly\ using\ the\ setRandomWeights\r\n\ helper\ method.\ If\ the\ filename\ is\ a\ path\ to\ a\ weights\ file,\ the\ weights\ will\ be\ read\ from\ that\ text\ file.\ In\ the\ text\ file,\r\n\ the\ weights\ are\ whitespace\ delimited\ and\ each\ new\ line\ represents\ a\ different\ value\ for\ the\ connectivity\ layer\ index\ (m).\r\n\ For\ example,\ for\ a\ 2-2-1\ network,\ the\ text\ file\ will\ be\ structured\ as\ follows\:\ \r\n\ w000\ w001\ w010\ w011\r\n\ w100\ w110\r\n\ \r\n\ @param\ filename\ the\ path\ to\ a\ file\ that\ will\ be\ read\ OR\ the\ word\ "randomize"\r\n\ @precondition\ filename\ is\ either\ a\ file\ name\ or\ the\ word\ "randomize"\r\n\ \r\n\ Special\ considerations\:\ this\ method\ performs\ exception\ catching\ to\ catch\ an\ InputMismatchException\r\n\ or\ FileNotFoundException\ that\ may\ be\ thrown.\ It\ will\ throw\ a\ RuntimeException\ with\ a\ relevant\ message\r\n\ if\ either\ of\ those\ occurs\r\n
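
Because Scanner treats newlines as ordinary whitespace, the layout above can be read with nested loops over the connectivity layers, as sketched below for an arbitrary layer structure. The layerSizes parameter (number of nodes in each layer, e.g. {2, 2, 1}) is a hypothetical argument introduced for illustration, and the same java.io and java.util.Scanner imports as in the readOutputs sketch apply.

// w[m][j][i] is the weight in connectivity layer m from node j to node i.
static double[][][] readWeightsSketch(String filename, int[] layerSizes) throws FileNotFoundException
{
    Scanner in = new Scanner(new File(filename));
    int connectivityLayers = layerSizes.length - 1;
    double[][][] w = new double[connectivityLayers][][];
    for (int m = 0; m < connectivityLayers; m++)
    {
        w[m] = new double[layerSizes[m]][layerSizes[m + 1]];
        for (int j = 0; j < layerSizes[m]; j++)
        {
            for (int i = 0; i < layerSizes[m + 1]; i++)
            {
                w[m][j][i] = in.nextDouble();   // matches the w000 w001 w010 w011 ordering
            }
        }
    }
    in.close();
    return w;
}
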
comment7.params=
comment7.target=void\ setRandomWeights()
comment7.text=\r\n\ Sets\ the\ weights\ to\ random\ values\ between\ a\ lower\ and\ upper\ bound\ (instance\ variables).\r\n
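
The bounds follow the inclusive/exclusive convention documented for lowerBound and upperBound; a single weight can then be drawn as shown below (the method name is illustrative only).

// Draws one value in [lowerBound, upperBound) using Math.random().
static double randomWeight(double lowerBound, double upperBound)
{
    return lowerBound + Math.random() * (upperBound - lowerBound);
}
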
comment8.params=inputs\ raw
comment8.target=double[]\ runNetwork(double[],\ boolean)
comment8.text=\r\n\ Runs\ the\ network\ on\ data.\ Takes\ in\ a\ double[]\ of\ inputs\ and\ sets\ the\ values\ of\ the\ input\ nodes\ to\ be\ the\ values\ in\r\n\ the\ inputs\ array.\ Runs\ the\ initial\ values\ of\ the\ input\ nodes\ through\ the\ network.\ This\ is\ done\ by\ looking\ at\ each\r\n\ node\ in\ the\ hidden\ layers\ and\ output\ layer,\ and\ multiplying\ the\ previous\ activations\ by\ the\ weights\ running\ from\r\n\ each\ previous\ activation\ to\ the\ "current"\ node\ (dot\ product).\ Returns\ an\ array\ of\ doubles.\r\n\ \r\n\ @param\ inputs\ an\ array\ of\ doubles\ where\ each\ item\ is\ an\ activation\ state\ of\ an\ input\ node\r\n\ @param\ raw\ true\ if\ the\ raw\ values\ of\ the\ output\ layer\ should\ be\ returned\ (no\ activation\ function)\r\n\ @return\ an\ array\ of\ doubles\ where\ each\ item\ is\ an\ output\ value\r\n
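
The per-layer step described above (a dot product followed, where requested, by the activation function) is sketched below for a single connectivity layer; the array orientation weights[previousNode][nextNode] is an assumption, and the activationFunction sketch from earlier is reused.

// Computes the activations of the next layer from the previous layer's
// activations and the weights running between them.
static double[] forwardOneLayerSketch(double[] previous, double[][] weights, boolean applyActivation)
{
    int nextSize = weights[0].length;
    double[] next = new double[nextSize];
    for (int j = 0; j < nextSize; j++)
    {
        double dot = 0.0;
        for (int k = 0; k < previous.length; k++)
        {
            dot += previous[k] * weights[k][j];   // weight from previous node k to next node j
        }
        next[j] = applyActivation ? activationFunction(dot) : dot;
    }
    return next;
}
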
comment9.params=filename
comment9.target=void\ readInputs(java.lang.String)
comment9.text=\r\n\ The\ readInputs\ method\ reads\ the\ user\ inputs\ from\ a\ file\ and\ stores\ them\ in\ an\ instance\ variable.\r\n\ The\ file\ must\ follow\ a\ specific\ format.\ The\ first\ line\ in\ the\ file\ consists\ of\ two\ whitespace\ delimited\r\n\ positive\ integers.\ The\ first\ number\ is\ the\ number\ of\ cases\ and\ the\ second\ is\ the\ number\ of\ inputs\r\n\ per\ case.\ Each\ line\ in\ the\ file\ after\ the\ first\ consists\ of\ whitespace\ delimited\ inputs.\r\n\ Different\ sets\ of\ inputs\ occur\ on\ different\ lines.\ For\ example,\ a\ file\ with\ n+1\ lines\ would\ have\ n\ sets\r\n\ of\ inputs\ to\ be\ run\ through\ the\ network.\r\n\ \r\n\ @param\ filename\ the\ path\ of\ the\ file\ to\ be\ read\r\n\ \r\n\ Special\ considerations\:\ this\ method\ performs\ exception\ catching\ to\ catch\ a\ NumberFormatException,\ FileNotFoundException,\r\n\ or\ ArrayIndexOutOfBoundsException\ that\ may\ be\ thrown.\ It\ will\ throw\ a\ RuntimeException\ with\ a\ relevant\ message\r\n\ if\ any\ of\ those\ occurs\r\n
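
Parsing this format mirrors the readOutputs sketch earlier: read the two header integers, then one line of inputs per case. The [case][input] orientation is again an assumption, and the same Scanner/File imports apply.

static double[][] readInputsSketch(String filename) throws FileNotFoundException
{
    Scanner in = new Scanner(new File(filename));
    int numCases = in.nextInt();        // first header integer: number of cases
    int inputsPerCase = in.nextInt();   // second header integer: inputs per case
    double[][] inputs = new double[numCases][inputsPerCase];
    for (int c = 0; c < numCases; c++)
    {
        for (int k = 0; k < inputsPerCase; k++)
        {
            inputs[c][k] = in.nextDouble();
        }
    }
    in.close();
    return inputs;
}
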
numComments=16