Artificial Neural Networks


ANNs are an architectural structure – a network – composed of many interconnected units called artificial neurons. Each such unit is characterised by its inputs and outputs and performs a very simple local calculation. Each connection between two neurons carries a "weight" value, and the network's information, its "knowledge" and capability, is stored in the values of these weights. Each neuron's output is determined by its type, its connections to other neurons and possibly by some external inputs. Some networks may have a degree of capability as constructed, but in general they achieve specific goals only after training (Haykin, 1999; Bishop, 1995; Ham et al., 2001; Perlovsky, 2001).
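As an illustration of this simple local calculation, the sketch below computes one neuron's output as the weighted sum of its inputs passed through an activation function. The logistic (sigmoid) activation, the bias term and the numerical values are illustrative assumptions, not something prescribed by the text above.

    import numpy as np

    def neuron_output(inputs, weights, bias):
        # Weighted sum of the inputs plus a bias, passed through a
        # logistic (sigmoid) activation -- one possible choice of unit.
        a = np.dot(weights, inputs) + bias
        return 1.0 / (1.0 + np.exp(-a))

    # A neuron with three inputs; the connection weights hold the "knowledge".
    x = np.array([0.5, -1.2, 3.0])
    w = np.array([0.8, 0.1, -0.4])
    print(neuron_output(x, w, bias=0.2))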

An Artificial Neural Network can have one or many layers of neurons. Networks with only one layer, called the output layer, are known as "single-layer networks" (the inputs are not considered a layer because no calculations are carried out within them). Other networks have one or more layers interposed between the inputs and the output layer; these are called hidden layers, and they are what gives ANNs the ability to solve complex problems. Regarding connectivity, each neuron can only be connected to neurons in the previous and the next layer (if such layers exist), and the flow of information is almost always restricted to the direction from the inputs towards the outputs (feedback is possible, but it is accomplished in more advanced architectures). An ANN is fully connected when all possible connections between adjacent layers are present, and it is called feedforward when there is no feedback between layers.
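To make the layered, feedforward arrangement concrete, here is a minimal sketch of an input vector propagating through a fully connected network with one hidden layer. The tanh activation and the randomly initialised weights are assumptions made purely for illustration.

    import numpy as np

    def forward_pass(x, layers):
        # `layers` is a list of (weight_matrix, bias_vector) pairs;
        # information flows strictly from the inputs towards the output.
        for W, b in layers:
            x = np.tanh(W @ x + b)   # weighted sums, then the activation
        return x

    rng = np.random.default_rng(0)
    layers = [
        (rng.standard_normal((4, 3)), np.zeros(4)),  # 3 inputs -> 4 hidden neurons
        (rng.standard_normal((1, 4)), np.zeros(1)),  # 4 hidden -> 1 output neuron
    ]
    print(forward_pass(np.array([0.5, -1.2, 3.0]), layers))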

The overall performance of an ANN depends on the neurons' characteristics, the training method and (naturally) the data on which the training takes place. Finally, because artificial neurons do not operate serially and their numbers can be large, ANNs are a characteristic example of massively parallel computation. These characteristics are summed up in the following table:

ANN characteristics       General example
-------------------       ---------------
Input size                30-point "window"
Number of layers          4 in total
Layer sizes               30×20×10×1
Weight connectivity       Full & feedforward
Output size               One
Training algorithm        Levenberg-Marquardt
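As a rough illustration of how a configuration like the one in the table might be set up, the sketch below builds a 30-20-10-1 feedforward network and fits its weights by Levenberg-Marquardt least squares, here via SciPy's MINPACK-based least_squares routine. The synthetic data, tanh activations, linear output and optimiser settings are illustrative assumptions, not the actual setup used in the research.

    import numpy as np
    from scipy.optimize import least_squares

    sizes = [30, 20, 10, 1]          # layer sizes from the table above

    def unpack(theta):
        # Rebuild per-layer weight matrices and bias vectors from one flat vector.
        layers, i = [], 0
        for n_in, n_out in zip(sizes[:-1], sizes[1:]):
            W = theta[i:i + n_in * n_out].reshape(n_out, n_in)
            i += n_in * n_out
            b = theta[i:i + n_out]
            i += n_out
            layers.append((W, b))
        return layers

    def predict(theta, X):
        # Feedforward pass: tanh hidden layers, linear output neuron.
        A = X.T
        layers = unpack(theta)
        for W, b in layers[:-1]:
            A = np.tanh(W @ A + b[:, None])
        W, b = layers[-1]
        return (W @ A + b[:, None]).ravel()

    def residuals(theta, X, y):
        # Levenberg-Marquardt minimises the sum of these squared residuals.
        return predict(theta, X) - y

    # Synthetic data: 1000 random 30-point "windows" with a toy target.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 30))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(1000)

    n_params = sum(a * b + b for a, b in zip(sizes[:-1], sizes[1:]))
    theta0 = 0.1 * rng.standard_normal(n_params)
    fit = least_squares(residuals, theta0, method="lm", max_nfev=5000, args=(X, y))
    print("sum of squared errors:", np.sum(fit.fun ** 2))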


Related images of ERPs, most of them samples from my own research, can be found in the ERP section.


Figure: Fully connected ANN with a hidden layer