The key to designing a successful neural network is to have a lot of good data.
We've illustrated this design process with NetMaker and BrainMaker.
First, you read your data into NetMaker, a spreadsheet that comes free with every BrainMaker. NetMaker reads ASCII, Excel, Lotus, dBase, Metastock, CSI, and binary format data. Then you label your various columns as inputs to the network or outputs from the network. NetMaker then builds all required training and testing files for you.
Then, you use BrainMaker to train your neural network. Training proceeds until either the network has memorized the entire training file, or its performance on the testing file reaches its optimum, whichever you choose. The thermometer displays (bar graphs) shown above can be slid back and forth with the mouse to test your network on varying inputs.
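The two stopping criteria above can be sketched in a few lines. This is an illustrative loop, not BrainMaker's actual code: `model_step`, `train_error`, and `test_error` are hypothetical helpers standing in for one training pass and the two error measurements.

```python
# Sketch of the two stopping rules: stop when the training file has been
# memorized, or when error on the held-out testing file stops improving.
# model_step(), train_error(), test_error() are assumed helpers.

def train(model_step, train_error, test_error, max_epochs=1000, patience=10):
    best_test, stale = float("inf"), 0
    for epoch in range(max_epochs):
        model_step()                      # one pass over the training file
        if train_error() == 0.0:          # training file fully memorized
            return epoch, "memorized"
        e = test_error()
        if e < best_test:                 # testing performance still improving
            best_test, stale = e, 0
        else:
            stale += 1
            if stale >= patience:         # testing performance is at its optimum
                return epoch, "early-stop"
    return max_epochs, "max-epochs"
```

Watching the testing file rather than the training file is what guards against memorization at the expense of generalization.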
Above, BrainMaker has detected the presence of MMX processing and is using the MMX parallel instructions to speed up its computation. In this example, BrainMaker has run through the training set 666 times (105229 presentations of facts) in 1 minute and 8 seconds. No other neural network system can come close to BrainMaker's training speed.
After your network is trained, you can switch the display over to show numeric values, allowing you to edit the exact value of the inputs directly on the screen. The network instantly runs your new numbers and predicts an index price - in this case, four days in advance.
BrainMaker has extremely flexible input and display formats. Here, we see BrainMaker learning optical character recognition. The input is a graphic picture of the number nine; the network is indicating with its output that the probability is very high that this is a picture of a nine, and very low that this is a picture of any other digit.
Also shown above are two of BrainMaker Professional's graphs: the network progress display, which shows graphically how the average error has declined over the course of training, and the connection histograms, which indicate how much of the network's capacity is being used. The histogram display shows whether the network has too many neurons (good memorization, poor generalization), too few neurons (not enough capacity to learn the problem), or just the right number (optimum generalization).
As an example we'll consider a network which determines the value of residential property in a suburban area. The data you gather might include the age of the home, square footage, number of bedrooms, bathrooms, garage area, land area, etc.
A simple collection of home sales for the year, like the one below, is all you need to get started.
NetMaker will create the necessary neural network training and testing files for you from this data. Then BrainMaker will create, train, and test your network. In this case, a real estate assessment network, the house price is the target output, and age, square feet, bedrooms, bathrooms, and land size are the variables used to predict the selling price. After this network is trained, you enter the five input variables for a particular house - age through land size - and BrainMaker predicts a selling price for that house.
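What "labeling columns as inputs or outputs" amounts to can be sketched as follows. This is an illustrative example, not NetMaker's actual file format: the column names and sample sales figures are invented, and the 0-1 scaling shown is the kind of normalization sigmoid networks typically train on best.

```python
# Split each home-sales record into an input vector (age through land size)
# and a target output (price), with every column scaled to the range [0, 1].

COLUMNS = ["age", "sq_ft", "bedrooms", "bathrooms", "land_sq_ft", "price"]
INPUTS, OUTPUT = COLUMNS[:-1], COLUMNS[-1]

# Hypothetical sales records for illustration only.
sales = [
    {"age": 12, "sq_ft": 1850, "bedrooms": 3, "bathrooms": 2, "land_sq_ft": 7000, "price": 240000},
    {"age": 35, "sq_ft": 1200, "bedrooms": 2, "bathrooms": 1, "land_sq_ft": 5500, "price": 155000},
    {"age": 4,  "sq_ft": 2400, "bedrooms": 4, "bathrooms": 3, "land_sq_ft": 9000, "price": 310000},
]

def make_training_file(rows, input_cols, output_col):
    """Scale every column to [0, 1] and return (inputs, target) pairs."""
    cols = input_cols + [output_col]
    lo = {c: min(r[c] for r in rows) for c in cols}
    hi = {c: max(r[c] for r in rows) for c in cols}
    scale = lambda c, v: (v - lo[c]) / (hi[c] - lo[c]) if hi[c] > lo[c] else 0.0
    return [([scale(c, r[c]) for c in input_cols],
             scale(output_col, r[output_col])) for r in rows]
```

A trained network's 0-1 output is then scaled back through the same price range to produce a dollar figure.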
Your network consists of many neurons, grouped into layers. The first layer is the input neurons, which take in your input data. The hidden neurons learn how the inputs combine to produce the desired results; these neurons do the real work of the network. The output neurons translate the network results for you.
The connections, which can be thought of as lines between the layers, are what get adjusted during training. BrainMaker strengthens some connections and weakens others, so that the next time example data is presented, the network outputs a more correct answer.
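The strengthening and weakening of connections can be illustrated with a single sigmoid neuron trained by the delta rule. This is a minimal sketch of the idea, not BrainMaker's implementation, which backpropagates corrections through all layers of the network.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(weights, inputs, target, rate=0.5):
    """One correction: strengthen or weaken each connection so the next
    presentation of these inputs yields an output closer to the target."""
    out = sigmoid(sum(w * x for w, x in zip(weights, inputs)))
    err = target - out                   # positive means output was too low
    grad = err * out * (1.0 - out)       # error scaled by sigmoid slope
    new_weights = [w + rate * grad * x for w, x in zip(weights, inputs)]
    return new_weights, out
```

Repeating the step moves the output steadily toward the target, which is exactly what a training run does across thousands of fact presentations.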
When the network is trained to your satisfaction, you can give the network current input data and it will make a prediction, assess value, or recognize patterns -- whatever you have trained it to do. You can also use one of Professional's analysis features to determine which input variables influence the output the most, play what-if scenarios, and learn the cause-and-effect relationships embedded in the data.
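The idea behind an influence report can be sketched simply: nudge each input in turn and see how far the network's output moves. This is an illustrative technique, not Professional's actual analysis code; `predict` stands in for any trained network's forward pass.

```python
def input_influence(predict, base_inputs, delta=0.05):
    """Return, per input, the absolute change in the network's output
    when that input alone is nudged by delta (a simple what-if probe)."""
    base = predict(base_inputs)
    influence = []
    for i in range(len(base_inputs)):
        nudged = list(base_inputs)
        nudged[i] += delta
        influence.append(abs(predict(nudged) - base))
    return influence
```

Ranking the inputs by these values shows which variables dominate the prediction; rerunning `predict` with hand-edited inputs is the same what-if play described above.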