network

Principle

See:
the neural networks manual
the course about neural networks
the course about connectionism
A neural network is defined by its type:
network dynamic: dynamic network
network fac: multilayer perceptron
network texture: Kohonen network
fac network: defines the layers of neurons by their numbers of neurons
mass network: assigns weights to the synapses (the connections between neurons)
matrix network: defines the weight matrix
motif network: defines the inputs on the first layer
law network: defines the outputs on the last layer
motif network and law network are used to define learning pairs.
validate network: performs learning on the (motif, law) pairs
generate network(id)fac(h): changes the number of hidden layers to h
validate motif network: returns the output of the network for a given input
neuron network: accesses a neuron of the network
limit law network: filters the outputs
It is advisable to normalize motifs and laws (between 0.0 and 1.0), and masses (between -1.0 and 1.0).

Learning

validate network(id) trains the neural network id according to its type (fac, texture, near, dynamic).
Properties are used to define adaptive learning:
meta alea network(id): periodically resets the matrix when the error remains below 50% for too long (with a frequency depending on the average).
meta coe network(id): periodically decreases the learning constants when the learning error remains between 10% and 50% for too long.
displ network(id): interactively changes these coefficients.
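As a minimal training sketch (the network number 1 and the iteration count 1000 are illustrative assumptions, not values from the manual):

```
meta alea network(1);
meta coe network(1);
validate(1000)network(1);
```

Here meta alea and meta coe enable the periodic reset and the periodic decrease described above before training; displ network(1); can then be used to adjust the coefficients interactively.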

network brush

network brush(id)

Returns the identifier of the neural network associated with the brush id.

network brush(id)=idr

Assigns this identifier.
Note:
You must have defined a network whose input motifs are images and whose output laws are other images. When the brush is active, the luminance of the image it sees in the window (within its radius) is passed as input to the network, which gives, as output, the luminance of the image displayed under the brush.
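As a hedged sketch (the brush and network numbers are illustrative assumptions), the association could be written:

```
network brush(1)=2;
```

Network 2 is assumed to have been built and trained with image motifs as inputs and image laws as outputs, as required by the note above.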

network brush(id)=numr,numg,numb

Assigns these three identifiers.
Notes:
1) When the brush is activated, the RGB image it sees in the window (within its radius) is passed as input to the 3 networks: number numr for the red component, number numg for the green component and number numb for the blue component, which give, as output, the R, G and B components of the image displayed by the brush.
2) The three networks numr, numg, numb must be isomorphic (motifs and laws of the same size).
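A hedged sketch with illustrative numbers, assuming networks 2, 3 and 4 have already been built and are isomorphic:

```
network brush(1)=2,3,4;
```

Here network 2 processes the red component, network 3 the green and network 4 the blue.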

matrix network

matrix network(id)

Returns the weight matrix of synaptic connections among neurons.
mass(n1,n2)network(id) returns the weight of the connection between the neurons n1 and n2 of the matrix of network id.

matrix network(id)=m1,m2,...

Changes this matrix.
mass(n1,n2)network(id)=m changes the weight of the connection between the neurons n1 and n2 of the matrix of network id.
This matrix is generated automatically by the command generate network. Active connections are described in the facets of the network. Note that for an adaptive network all connections are active (fully connected network). The weights are initialized randomly and can be modified by generate mass network(id).
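As an illustrative sketch (the network and neuron numbers, and the weight value, are assumptions):

```
mass(1,2)network(1);
mass(1,2)network(1)=0.5;
generate mass network(1);
```

The first line reads the weight of the connection between neurons 1 and 2 of network 1, the second sets it to 0.5, and the last re-randomizes the weights.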

Multi-network environment

validate motif(M) network(r1,r2,...)

Determines the network ri whose motifs are the closest to m and does:
validate motif(m) network(ri)
Determines the network ri whose laws are the closest to m and returns the corresponding motif (this function simulates an "intentionality").
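As a hedged sketch (the network numbers are illustrative assumptions):

```
validate motif(m) network(1,2,3);
```

This picks, among networks 1, 2 and 3, the one whose motifs are closest to m, and validates m on that network.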

network propr1 propr2 vol

network(r)propr1 propr2[fac(nf)]vol(id)

Creates the local neural network r of volume id, with the propr1 property as inputs and the bounds of the propr2 property as outputs.

network dynamic

network(id)dynamic

Builds the neural network id of type dynamic.
The set of motifs is the input flux.
The set of laws is the output flux, dynamically computed by the coherent flows algorithm.
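A minimal sketch, under the assumption that network number 1 is free and that the input flux is supplied through motifs as in the other types:

```
network(1)dynamic;
motif(0)network(1)=...;
validate(1)network(1);
```

The laws (output flux) are then computed dynamically by the coherent flows algorithm.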

network fac

network(id)

Builds the neural network with identifier id and type fac (multilayer perceptron, the default).

network(id)fac(nh)motif(nm,dm)law(nl,dl)NP(np)rand

Builds the network with identifier id, np random motifs of size (nm,dm) (dm=1 by default), np random laws of size (nl,dl) (dl=1 by default) and nh hidden layers (0 by default).
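For instance, a hedged sketch (all numbers are illustrative assumptions):

```
network(1)fac(2)motif(4)law(3)NP(10)rand;
```

This would build network 1 with 2 hidden layers, 10 random motifs of size 4 and 10 random laws of size 3.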

network(id)dim(nm,nl,nh)

Builds the neural network id of type fac with nm = motif size, nl = law size and nh = number of hidden layers (6,6,1 by default).
To define neurons layers (input, output):
fac(0)network(id)=n1,n2,...;
To define learning pairs:
To define input motifs:
motif(0)network(id)=...;
To define output laws:
law(0)network(id)=...;
To generate the neurons and the synaptic weights matrix:
generate network(id);
Facets describe the non-null connections of the synaptic weight matrix, which are randomly initialized between -1 and 1.
The transfer function is initialized to (1,1) (sigmoid).
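The steps above can be gathered into one sketch (the network number, the layer sizes and the values are illustrative assumptions):

```
network(1);
fac(0)network(1)=3,4,2;
motif(0)network(1)=0.1,0.5,0.9;
law(0)network(1)=0.0,1.0;
generate network(1);
validate(1000)network(1);
```

The motif size (3) matches the input layer and the law size (2) matches the output layer; the values are normalized between 0.0 and 1.0, as advised above.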

network texture

network(id)texture

Builds the neural network with identifier id and type texture (Kohonen).
To define neurons layers (inputs, outputs):
fac(0)network(id)=n1,n2,...;
To define input motifs:
motif(0)network(id)=...;
To generate the neurons and the synaptic weights matrix:
generate network(id);
The matrix is randomly initialized between -1 and 1.
The transfer function is initialized to (1,1) (sigmoid).
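A minimal sketch of the steps above (the network number, layer sizes and values are illustrative assumptions); note that this section defines only motifs, since no output laws are listed for the texture type:

```
network(1)texture;
fac(0)network(1)=3,2;
motif(0)network(1)=0.2,0.7,0.4;
generate network(1);
validate(500)network(1);
```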

network near

network(id)near

Sets the network id to the completely connected type. Such a network is automatically configured so that the output flux (laws) is in phase with the input flux (motifs).
motif(0)network(id)=m; builds an input motif.
generate network(id); generates the neurons and the synaptic weights matrix.
validate(nb,cpt)error(err)coe(c1,c2)network(id); trains the network.
exec network(id); computes the outputs of network id.

speed law(num)network(id); returns the speed of the law numbered num of network id.
speed motif(num)network(id); returns the speed of the motif numbered num of network id.
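Putting the commands of this section together as a hedged sketch (the network number and every parameter value are illustrative assumptions):

```
network(1)near;
motif(0)network(1)=0.3,0.6;
generate network(1);
validate(100,10)error(0.01)coe(0.5,0.9)network(1);
exec network(1);
speed law(1)network(1);
```

The last line returns the speed of law number 1 of network 1, after exec has computed the outputs.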

network vol

Some complex problems attached to a volume idv, for which no resolution algorithm is known, can be treated by connectionist methods. For that:
1) Create the property:        network propr1 propr2 vol(idv)=idr;
2) Create a neural network idr of type perceptron.
3) Define learning pairs (M=input,L=output) by:
motif(0)network(idr)=M;
law(0)network(idr)=L;
4) Train the network n times by:
validate(n)network(idr);.
or dynamically, at each image, by:
`validate(1)network(idr);`
The network then recognizes the L based on M, and when a non-learned M is given as input, it produces a coherent output.
5) During an animation, at each frame (typically in a function func(4,"F4")) do:
validate propr1 propr2 vol(idv)
which gives the propr1 property value as input; the propr2 property value is then the output.
For example, the problem of adjusting the rotations of volume 1 according to its axis is solved as follows:
1) Create the network:
`network(1)axis rota fac(3)vol(1);`
Builds the network numbered 1 with 7 neurons:
(1,2,3) on the input layer, receiving the axis (x,y,z).
(4,5) on the hidden layer.
(6,7) on the output layer, giving the bounds (a1,a2).
2) Create the motifs (values of the axes):
```
motif(0)network(1)=1,0,0;
motif(0)network(1)=0,0,1;
motif(0)network(1)=-1,0,0;
```
3) Create the laws (values of the rotations):
```
law(0)network(1)=0;
law(0)network(1)=.5*PI;
law(0)network(1)=PI;
```
4) Train the network:
`validate(1000)network(1);`
or dynamically, doing at each image: `validate(1)network(1);`
5) At each image do:
`validate axis rota vol(1);`
The (x,y,z) axis of volume 1 is given as input to the network, which gives as output (a1,a2); the rotation angle of volume 1 is then forced between a1 and a2.