Neural Net Overview

 

Overview

Lambda Information Server supports NeuralNet objects with continuous real sigmoid input, self-organizing hidden layers, and real number output, user selectable as continuous, sigmoid, bipolar, or binary. The Analytic Information Server NeuralNet object learns any continuous mapping of the form f(x): I^n => R^m, where I is the closed unit interval [0, 1] (modeled after the requirements of the Hecht-Nielsen theorem). Translated into programmer's English, there may be from 1 to N inputs. These inputs must each be real numbers within the closed interval [0, 1] (yes, that includes 0 and 1 and all the real numbers in between). There may be from 1 to M outputs. These outputs will each be real numbers, which the programmer may select to appear as continuous (unrestricted real number output), sigmoid (real numbers in the closed interval [0, 1]), binary (either 0 or 1), or bipolar (either -1 or 1).
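
The four output forms can be thought of as different read-outs of the same underlying cell activation. The sketch below is plain Python, not the NeuralNet object's API; in particular, the 0.5 and 0.0 cutoffs used for the binary and bipolar forms are assumptions made only for illustration.

    import math

    def sigmoid(x):
        # Squash an unrestricted real number into the interval (0, 1).
        return 1.0 / (1.0 + math.exp(-x))

    def read_output(activation, mode):
        # 'activation' stands for a raw (continuous) output cell value.
        if mode == "continuous":
            return activation                              # unrestricted real number
        if mode == "sigmoid":
            return sigmoid(activation)                     # real number in [0, 1]
        if mode == "binary":
            return 1 if sigmoid(activation) >= 0.5 else 0  # 0 or 1 (0.5 cutoff is an assumption)
        if mode == "bipolar":
            return 1 if activation >= 0.0 else -1          # -1 or 1 (0.0 cutoff is an assumption)
        raise ValueError("unknown output mode: " + mode)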

The Lambda Information Server NeuralNet object supports incremental learning with built-in methods for both forward propagation and back propagation. It is the programmer's responsibility to convert raw input into the [0, 1] range required by the NeuralNet object and to convert the output back from the numeric range produced by the NeuralNet object.
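
A typical way to meet that responsibility is min-max scaling: map raw data onto [0, 1] before presenting it to the network, and undo the scaling on the way out. The helpers below are a plain-Python illustration only; the names scale_to_unit and scale_from_unit are hypothetical and are not part of the NeuralNet object.

    def scale_to_unit(value, lo, hi):
        # Map a raw value from the known range [lo, hi] onto [0, 1],
        # the range required for every NeuralNet input.
        return (value - lo) / (hi - lo)

    def scale_from_unit(value, lo, hi):
        # Map a network output in [0, 1] back onto the original range.
        return lo + value * (hi - lo)

    # Example: temperatures known to fall between -40 and 120 degrees.
    x = scale_to_unit(72.0, -40.0, 120.0)    # 0.7
    y = scale_from_unit(0.7, -40.0, 120.0)   # 72.0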

NeuralNet objects are multilayer collections of neural cells molded into a network of interconnected cells and weights. Each Analytic Information Server NeuralNet object contains a layer of input cells, a self-organizing layer of hidden cells, and a layer of output cells. Analytic Information Server NeuralNet objects are modeled after the organization of neural cells in biology.

Each of the neural cells in an Analytic Information Server NeuralNet object is connected to every cell in its neighboring layers. Each Analytic Information Server NeuralNet cell is assigned both a current value and a connection weight with respect to each of its neighboring cells. Analytic Information Server NeuralNet objects can be used for advanced pattern recognition applications, data compression and decompression, speech recognition, and similar tasks.
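
To make the connectivity description concrete, the sketch below computes one fully connected layer: every cell in the layer sums the weighted values of all cells in the previous layer, adds its bias, and passes the sum through a sigmoid. This is a generic plain-Python illustration of the technique, not the NeuralNet object's internal code, and the weights and layer sizes are made up for the example.

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def forward_layer(inputs, weights, biases):
        # inputs  : cell values from the previous layer
        # weights : weights[j][i] connects input cell i to output cell j
        # biases  : one bias per cell in this layer
        outputs = []
        for j in range(len(biases)):
            total = biases[j]
            for i in range(len(inputs)):
                total += weights[j][i] * inputs[i]
            outputs.append(sigmoid(total))
        return outputs

    # Two input cells feeding three hidden cells.
    hidden = forward_layer([0.2, 0.9],
                           [[0.5, -0.3], [0.1, 0.8], [-0.6, 0.4]],
                           [0.0, 0.1, -0.2])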

The parent Structure of an Analytic Information Server NeuralNet object is the NeuralNet Structure, as shown below. It is a normal user-defined Structure and can be created directly with the (new NeuralNet: ) function call; however, a special makeNeuralNet function has been provided to make the creation of complex NeuralNet objects much easier.

 

 (defineStructure NeuralNet:
      input:
      hidden:
      output:
      momentum:
      learning:
      filter:
      theta:
      notes:
      error:)
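
The momentum and learning slots hold the usual back propagation training parameters: the learning rate scales each weight change, and the momentum term carries a fraction of the previous change forward to smooth training. A conventional formulation of that update rule is sketched below in plain Python; the NeuralNet object's exact rule, and the roles of the filter, theta, and notes slots, are not documented here, so the function name, the rule, and the sample values are assumptions for illustration only.

    def update_weight(weight, gradient, previous_delta, learning, momentum):
        # Classic back-propagation update with a momentum term:
        #   delta = -learning * gradient + momentum * previous_delta
        delta = -learning * gradient + momentum * previous_delta
        return weight + delta, delta     # new weight, and the delta to remember

    w, d = 0.25, 0.0
    for gradient in [0.4, 0.1, -0.2]:
        w, d = update_weight(w, gradient, d, learning=0.3, momentum=0.9)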

 

The internal Structures of an Analytic Information Server NeuralNet object, namely the input layer, the hidden layer, and the output layer, are composed of NeuralLayer child Structures, as shown below. NeuralLayer is a normal user-defined Structure and can be created directly with the (new NeuralLayer: ) function call; however, the special makeNeuralNet function has been provided to make the creation of complex NeuralNet objects much easier.

 

 (defineStructure NeuralLayer:
      inputs:
      outputs:
      weights:
      biases:
      deltas:
      errors:)
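
One plausible reading of these slots, for a fully connected layer with n incoming cells and m cells of its own, is sketched below as a plain-Python record. The shapes and the interpretation of the deltas and errors slots are assumptions drawn from the standard back-propagation formulation, not a statement of the NeuralNet object's internal storage.

    def make_layer(n_inputs, n_cells):
        # A plain-dictionary stand-in for the NeuralLayer child Structure.
        return {
            "inputs":  [0.0] * n_inputs,                            # values arriving from the previous layer
            "outputs": [0.0] * n_cells,                             # this layer's cell values
            "weights": [[0.0] * n_inputs for _ in range(n_cells)],  # one weight per connection
            "biases":  [0.0] * n_cells,                             # one bias per cell
            "deltas":  [[0.0] * n_inputs for _ in range(n_cells)],  # previous weight changes (for momentum)
            "errors":  [0.0] * n_cells,                             # back-propagated error terms
        }

    hidden_layer = make_layer(n_inputs=3, n_cells=5)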

 

 

Reference: Laurene Fausett (Florida Institute of Technology), Fundamentals of Neural Networks, Prentice-Hall, 1994. ISBN 0-13-334186-0.