LuaFann
These are a set of Lua bindings for the Fast Artificial Neural Network (FANN) library. The FANN project has an excellent introduction to the theory and operation of artificial neural networks on its homepage.
The source code in this module is released under the GNU Lesser General Public License (LGPL), the same license under which the FANN library is released.
You can access the README or the manual from this page.
Info
Current Team Lead (since 2012): Vadim A. Misbakh-Soloviov (@msva, mva@mva.name)
Original Author (2009): Werner Stoop (wstoop@gmail.com)
Feel free to contact me (@msva) if you have any problems, questions or suggestions. Alternatively, you can post them on the issue tracker.
Building
The LuaFann module is written in ANSI C and should work on any platform supported by Lua and FANN. To build LuaFann, you need Lua and FANN already installed on your system.
A Makefile is provided (written for GNU make, though it should also work on BSD and other POSIX systems, but not Windows) which you can use to build LuaFann on your system. See the README for help on the available targets.
Unfortunately, I don't have access to the other operating systems supported by Lua and FANN, so I cannot provide specific makefiles or project files for them. Contributions would be greatly appreciated. If you need to build the module on another operating system, you can follow the instructions on the Building Modules page of the Lua users wiki.
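On a typical POSIX system the build boils down to something like the following sketch. The target names and the smoke test below are assumptions, not guaranteed by this document; check the README for the targets your platform actually provides.

```shell
# Build LuaFann from its source directory (target names are illustrative;
# see the README for the actual targets).
make                # compiles the module against the installed Lua and FANN headers
sudo make install   # copies the module into Lua's C module search path

# Quick smoke test: the require() should succeed without errors.
lua -e 'require("fann"); print("LuaFann loaded")'
```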
Example
Here is an example of a Lua script that imitates the FANN XOR example: module.lua (in the test directory). The example trains a neural network to mimic the exclusive-OR (XOR) function. In the example, -1 is used for boolean false and +1 is used for boolean true.
Load the LuaFann module into Lua through the require statement:
require("fann")
Create a neural network object through the FANN create_standard() function. The example creates a neural network with three layers: 2 input neurons, 2 neurons in the hidden layer, and 1 output neuron:
ann = fann.create_standard(3, 2, 2, 1)
Next, load some training data from a file through the read_train_from_file() function:
train = fann.read_train_from_file("xor.data")
The xor.data training file comes from the FANN examples and is described in more detail on the FANN homepage:
4 2 1
-1 -1
-1
-1 1
1
1 -1
1
1 1
-1
It is a simple text file whose first row describes the training set: there are four entries, each with two inputs and one output. The remaining rows alternate, with the even rows giving an entry's inputs and the odd rows giving its output(s).
The second and third rows describe the first training pair: the second row has -1 -1 as the two input values and the third row contains -1 as the output value (more sophisticated networks may have more outputs). This means that when both inputs are false, the neural network should be trained to output false.
Likewise, the fourth and fifth rows describe the second training pair, the sixth and seventh rows the third, and the eighth and ninth rows the fourth. You'll notice that this training set teaches the network to output -1 when its inputs are the same and 1 when its inputs differ (thus an XOR function).
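To make the file format concrete, here is a small helper that writes a training file in the layout described above. This helper is not part of LuaFann; it is just a sketch using Lua's standard io library, and the function name is made up for illustration:

```lua
-- Hypothetical helper (not part of LuaFann) that writes a FANN training
-- file: a header line "pairs inputs outputs", then alternating lines of
-- inputs and outputs for each training pair.
local function write_train_file(filename, data)
  local f = assert(io.open(filename, "w"))
  f:write(#data, " ", #data[1][1], " ", #data[1][2], "\n")
  for _, pair in ipairs(data) do
    f:write(table.concat(pair[1], " "), "\n")  -- the pair's inputs
    f:write(table.concat(pair[2], " "), "\n")  -- the pair's output(s)
  end
  f:close()
end

-- Recreate the XOR training set shown above:
write_train_file("xor.data", {
  { {-1, -1}, {-1} },
  { {-1,  1}, { 1} },
  { { 1, -1}, { 1} },
  { { 1,  1}, {-1} },
})
```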
Next, some other parameters of the network are set. The FANN documentation provides more detail on these:
ann:set_activation_steepness_hidden(1)
ann:set_activation_steepness_output(1)
ann:set_activation_function_hidden(fann.FANN_SIGMOID_SYMMETRIC)
ann:set_activation_function_output(fann.FANN_SIGMOID_SYMMETRIC)
ann:set_train_stop_function(fann.FANN_STOPFUNC_BIT)
ann:set_bit_fail_limit(0.01)
Now the weights within the neural network are initialized according to the training data:
ann:init_weights(train)
You are now ready to call the train_on_data() function to train the neural network. The arguments are the training data, the maximum number of epochs, the number of epochs between progress reports, and the desired error:
ann:train_on_data(train, 500000, 1000, 0.001)
You can use the test_data() function to calculate the mean squared error, which gives an indication of how well the neural network performs. The closer this value is to zero, the smaller the error in the trained neural network:
mse = ann:test_data(train)
print("MSE: " .. mse)
You can use the save() function to save a neural network to a file:
ann:save("myxor.net")
You can reload the neural network at a later stage through the create_from_file() function:
ann = fann.create_from_file("myxor.net")
You are now ready to use the trained neural network to classify inputs through the run() function. In this example, run() takes two inputs and returns a single output (which corresponds to the training set; other networks may have more inputs or outputs):
xor = ann:run(1, 1)
print("Result: " .. xor)
xor = ann:run(1, -1)
print("Result: " .. xor)
xor = ann:run(-1, -1)
print("Result: " .. xor)
xor = ann:run(-1, 1)
print("Result: " .. xor)
For fun, try changing the inputs to the network slightly, using, say, xor = ann:run(1.1, -0.75), to see that the network still outputs a valid result.
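Because the network outputs a continuous value near -1 or +1 rather than an exact boolean, you may want to threshold the result. The helper below is not part of LuaFann, just a sketch of one way to interpret the symmetric output:

```lua
-- Interpret a symmetric sigmoid output as a boolean: anything above
-- zero counts as true, anything at or below zero as false.
local function to_bool(v)
  return v > 0
end

print(to_bool(0.93))   --> true
print(to_bool(-0.87))  --> false
```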
Future
Many of the more advanced FANN functions are still outstanding. I hope to get around to them eventually.
If there are functions you require urgently, you can file a feature request on the GitHub issue tracker.
Naturally, contributions from the community are always welcome.