Most programs work with floating point arithmetic. Floating point arithmetic represents numbers like this: 2.718281828*10^4. For typical single-precision floating point, the largest possible number is about 10^38, and the smallest positive number is about 10^-38.
BrainMaker performs its internal calculations with fixed point arithmetic. Fixed point arithmetic represents numbers like this: 2.71828. In fixed point arithmetic, there is no exponent.
Because of this, all the numbers BrainMaker reads in from your training or testing files must be converted from floating point to fixed point; then, when BrainMaker is done performing its neural computations, the results must be converted from fixed point back to floating point. These conversions introduce very small changes in your data values: for example, the number 2.71828 might be represented by BrainMaker as 2.7183.
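A minimal sketch of this round trip, in C, might look like the following. The scale factor of 10,000 (four fractional decimal digits) is purely illustrative, chosen only to reproduce the 2.71828 -> 2.7183 example above; it is not BrainMaker's actual internal format.

    /* Sketch of a float -> fixed -> float round trip with a hypothetical
     * decimal scale of 10,000 (four fractional digits). */
    #include <stdio.h>
    #include <math.h>

    #define SCALE 10000.0   /* illustrative only, not BrainMaker's format */

    long to_fixed(double x)          /* float -> fixed, rounding to nearest */
    {
        return (long)floor(x * SCALE + 0.5);
    }

    double to_float(long f)          /* fixed -> float */
    {
        return (double)f / SCALE;
    }

    int main(void)
    {
        double original = 2.71828;
        long   fixed    = to_fixed(original);   /* 27183 */
        double restored = to_float(fixed);      /* 2.7183 */

        printf("%.5f -> %ld -> %.4f\n", original, fixed, restored);
        return 0;
    }

Running this prints "2.71828 -> 27183 -> 2.7183", which is exactly the kind of tiny change the conversions introduce.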
We use fixed point arithmetic because it is much faster than floating point arithmetic; in fact, the entire MMX instruction set is dedicated to fixed point (integer) arithmetic. The price we pay for this speed is very small numeric differences.
Since artificial neural networks, like biological neural networks, are inherently low-precision processes, this makes no noticeable difference in the quality of classification or prediction; however, it does mean that BrainMaker's display may not perfectly match the numbers in your training file.
Neural networks are not inherently precise things. The reason we humans invented computers and Lotus 123 is that we can't add up a long list of numbers accurately. Computers are much better at arithmetic than humans. People, on the other hand, can drive cars in traffic, walk up stairs, recognize speech, and choose good people for government positions (ok, maybe we're not so good at that last one...). Artificial neural networks share the strengths and weaknesses of people.
Typically, a neural network can achieve precision of about 5%. This means that if you ask BrainMaker to learn to multiply numbers up to 33x33 (results up to about 1000), BrainMaker will learn to multiply to within about +/- 50. This is not so good: you'd be better off using a calculator. If you're trying to predict the price of, say, GM stock, which varies from maybe $50 to $150, BrainMaker can learn to predict this within about 5%; since the range is 150 - 50 = 100, BrainMaker's error will be about +/- $5. Since this is a bigger price swing than GM typically has in a day, this approach is hopeless.
However, GM's stock price changes, from one day to the next, by anywhere from about down $5 to about up $5. If you have BrainMaker learn to predict GM's day-to-day price change instead, it will learn this within about 5%; since the range is now about $10, BrainMaker's precision will be about +/- $0.50.
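To make the arithmetic of the last two paragraphs explicit, here is a small C sketch that works out the absolute error implied by an assumed 5% precision over each output range; the 5% figure is the rough precision quoted above, not a measured value.

    /* Absolute error implied by a given precision over a given output range. */
    #include <stdio.h>

    double abs_error(double range, double precision)
    {
        return range * precision;
    }

    int main(void)
    {
        double p = 0.05;    /* ~5% precision, as quoted above */

        printf("multiplication up to ~1000:  +/- %.0f\n",  abs_error(1000.0, p));
        printf("GM price, range $50-$150:    +/- $%.0f\n", abs_error(100.0,  p));
        printf("GM daily change, -$5 to +$5: +/- $%.2f\n", abs_error(10.0,   p));
        return 0;
    }

The three lines print +/- 50, +/- $5, and +/- $0.50: the same 5% precision gives a useful answer only when the output range is narrow.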
People are a lot like this: a market expert, upon hearing some news about GM, will say something like "This will drive the price up about $5". If he happens to know that yesterday GM was trading at $110, then he'll predict that today GM will go for about $115.