Luafann

Lua wrapper for the FANN neural network functions.

Load Luafann as a Lua module with require("fann").
In the examples below, the variable ann refers to a neural network object created by fann.create_standard() or fann.create_from_file(), and the variable train refers to a training set object created by fann.read_train_from_file().
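
A minimal sketch of loading the module and creating these two objects (the file names are the ones used in the examples in this document; whether require() returns the module table or installs a global fann depends on the build, so the local assignment below is an assumption):

  local fann = require("fann")

  -- a fully connected 2-3-1 network, as in the fann.create_standard() example below
  local ann = fann.create_standard(3, 2, 3, 1)

  -- a training set read from a FANN-format data file
  local train = fann.read_train_from_file("xor.data")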

fann.create_standard(num_layers, neurons_1, neurons_2, ..., neurons_n)

Creates a fully connected neural network with num_layers layers.
The i'th layer has neurons_i neurons, so the function takes num_layers+1 arguments in total.
Example: ann = fann.create_standard(3, 2, 3, 1)

fann.create_sparse(connection_rate, num_layers, neurons_1, neurons_2, ..., neurons_n)

Creates a neural network with num_layers layers that is not fully connected; connection_rate gives the fraction of connections to create (1 would be fully connected).
The i'th layer has neurons_i neurons, so the function takes num_layers+2 arguments in total.
Example: ann = fann.create_sparse(0.5, 3, 2, 3, 1)

fann.create_from_file(filename)

Creates a neural network from a file.
Example: ann = fann.create_from_file("xor_float.net")
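
A brief sketch contrasting the three constructors, mirroring the examples above:

  local ann1 = fann.create_standard(3, 2, 3, 1)       -- fully connected 2-3-1 network
  local ann2 = fann.create_sparse(0.5, 3, 2, 3, 1)     -- same layout, roughly half the connections
  local ann3 = fann.create_from_file("xor_float.net")  -- restore a previously saved network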

ann:__gc()

Garbage collects the neural network.

ann:__tostring()

Converts the neural network to a string for Lua's virtual machine.
Example: print(ann)

ann:print_connections()

Prints the connections in the neural network.
Example: ann:print_connections()

ann:print_parameters()

Prints the neural network's parameters.
Example: ann:print_parameters()

ann:set_training_algorithm(algorithm)

Sets the training algorithm for the neural network.
Valid algorithms are fann.FANN_TRAIN_INCREMENTAL, fann.FANN_TRAIN_BATCH, fann.FANN_TRAIN_RPROP or fann.FANN_TRAIN_QUICKPROP.
Example: ann:set_training_algorithm(fann.FANN_TRAIN_QUICKPROP)

ann:get_training_algorithm()

Retrieves the training algorithm.
The returned value is one of fann.FANN_TRAIN_INCREMENTAL, fann.FANN_TRAIN_BATCH, fann.FANN_TRAIN_RPROP or fann.FANN_TRAIN_QUICKPROP.
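
A set/get round trip, assuming the getter returns one of the constants above so it can be compared directly:

  ann:set_training_algorithm(fann.FANN_TRAIN_RPROP)
  if ann:get_training_algorithm() == fann.FANN_TRAIN_RPROP then
    print("training algorithm is RPROP")
  end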

ann:set_learning_rate(learning_rate)

Sets the learning rate used by the various training algorithms.
Example: ann:set_learning_rate(0.7)

ann:get_learning_rate()

Retrieves the learning rate of the training algorithm.

ann:set_activation_function_hidden(function)

Sets the activation function for the hidden layer neurons.
Example: ann:set_activation_function_hidden(fann.FANN_SIGMOID_SYMMETRIC)

ann:set_activation_function_output(function)

Sets the activation function for the output neurons.
Example: ann:set_activation_function_output(fann.FANN_SIGMOID_SYMMETRIC)

ann:set_activation_steepness_hidden(steepness)

Sets the steepness of the activation function for the hidden neurons.
Example: ann:set_activation_steepness_hidden(1)

ann:set_activation_steepness_output(steepness)

Sets the steepness of the activation function for the output neurons.
Example: ann:set_activation_steepness_output(1)

ann:set_train_stop_function(stop_function)

Sets the stop criterion used during training.
Valid values are fann.FANN_STOPFUNC_BIT and fann.FANN_STOPFUNC_MSE.
Example: ann:set_train_stop_function(fann.FANN_STOPFUNC_BIT)

ann:set_bit_fail_limit(limit)

Sets the bit fail limit for training the neural net.
Example: ann:set_bit_fail_limit(0.01)
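
The setters above are typically called together before training; a sketch of one possible configuration (the specific values are only illustrative):

  ann:set_training_algorithm(fann.FANN_TRAIN_QUICKPROP)
  ann:set_learning_rate(0.7)
  ann:set_activation_function_hidden(fann.FANN_SIGMOID_SYMMETRIC)
  ann:set_activation_function_output(fann.FANN_SIGMOID_SYMMETRIC)
  ann:set_activation_steepness_hidden(1)
  ann:set_activation_steepness_output(1)
  ann:set_train_stop_function(fann.FANN_STOPFUNC_BIT)
  ann:set_bit_fail_limit(0.01)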

ann:init_weights(train)

Initializes the weights using Widrow and Nguyen's algorithm based on the given training data train.
Example: ann:init_weights(train)

ann:test_data(train)

Runs the network through the training data in train and returns the MSE.
Example: mse = ann:test_data(train)
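
A short sketch combining the two calls above to check the untrained error on a data set:

  ann:init_weights(train)           -- data-dependent weight initialization
  local mse = ann:test_data(train)  -- MSE before any training
  print("initial MSE:", mse)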

ann:run(input1, input2, ..., inputn)

Evaluates the neural network for the given inputs.
Example: xor = ann:run(-1, 1)
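
A sketch evaluating an XOR network over all four input pairs, assuming the -1/1 input encoding used in the example above and a single output value per call:

  local inputs = { {-1, -1}, {-1, 1}, {1, -1}, {1, 1} }
  for _, pair in ipairs(inputs) do
    local out = ann:run(pair[1], pair[2])
    print(pair[1], pair[2], "->", out)
  end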

ann:save(file)

Saves the neural network to a file named file.
Example: ann:save("xor_float.net")
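
Saving forms a round trip with fann.create_from_file(); a minimal sketch:

  ann:save("xor_float.net")
  local restored = fann.create_from_file("xor_float.net")
  print(restored:run(-1, 1))  -- should match ann:run(-1, 1)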

fann.read_train_from_file(filename)

Creates a training object by reading a training data file.
Example: train = fann.read_train_from_file("xor.data")

train:__gc()

Garbage collects training data.

train:__tostring()

Converts the training data to a string for Lua's virtual machine.
Example: print(train)

ann:train_on_file(file, max_epochs, epochs_between_reports, desired_error)

Trains the neural network on the data in the file file for up to max_epochs epochs, reporting every epochs_between_reports epochs. Training stops early once the error reaches desired_error.
Example: ann:train_on_file("xor.data", 500000, 1000, 0.001)

ann:train_on_data(train, max_epochs, epochs_between_reports, desired_error)

Trains the neural network on the data in train for up to max_epochs epochs, reporting every epochs_between_reports epochs. Training stops early once the error reaches desired_error.
Example: ann:train_on_data(train, 500000, 1000, 0.001)
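
Putting the pieces together, a sketch of a complete training run using only the calls documented here (network size, file names and tolerances are the ones from the examples above):

  local fann = require("fann")

  local ann   = fann.create_standard(3, 2, 3, 1)
  local train = fann.read_train_from_file("xor.data")

  ann:set_activation_function_hidden(fann.FANN_SIGMOID_SYMMETRIC)
  ann:set_activation_function_output(fann.FANN_SIGMOID_SYMMETRIC)
  ann:init_weights(train)

  ann:train_on_data(train, 500000, 1000, 0.001)

  print("MSE after training:", ann:test_data(train))
  ann:save("xor_float.net")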

train:save(filename)

Saves the training data to a file named filename.
Example: train:save("train.data")

train:scale_input(min, max)

Scales the inputs of the training data to the new range [min, max].
Example: train:scale_input(-1, 1)

train:scale_output(min, max)

Scales the outputs of the training data to the new range [min, max].
Example: train:scale_output(-1, 1)

train:scale(min, max)

Scales both the inputs and outputs of the training data to the new range [min, max].
Example: train:scale(-1, 1)
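
A sketch of scaling a training set before use (the ranges are illustrative, and it is assumed here that the training data is modified in place, as in the underlying FANN scaling functions):

  local train = fann.read_train_from_file("xor.data")
  train:scale_input(-1, 1)    -- inputs into [-1, 1]
  train:scale_output(-1, 1)   -- outputs into [-1, 1]
  -- or scale both at once:
  -- train:scale(-1, 1)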

Constants

The fann module also contains variables that reflect the constants defined for FANN in fann_data.h, grouped as follows:

Activation functions, e.g. fann.FANN_SIGMOID_SYMMETRIC (see fann_data.h for the complete list).
Training algorithms: fann.FANN_TRAIN_INCREMENTAL, fann.FANN_TRAIN_BATCH, fann.FANN_TRAIN_RPROP and fann.FANN_TRAIN_QUICKPROP.
Stop criteria used during training: fann.FANN_STOPFUNC_BIT and fann.FANN_STOPFUNC_MSE.
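
These constants are ordinary fields of the fann module table (presumably the enum values from fann_data.h), so they can be compared and passed around like any other Lua value; only the constants documented above are shown:

  print(fann.FANN_TRAIN_INCREMENTAL, fann.FANN_TRAIN_BATCH,
        fann.FANN_TRAIN_RPROP, fann.FANN_TRAIN_QUICKPROP)
  print(fann.FANN_STOPFUNC_BIT, fann.FANN_STOPFUNC_MSE)
  print(fann.FANN_SIGMOID_SYMMETRIC)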