...
Neural Networks (NNs) can be classified according to the type of neuron interconnections and the flow of information.
Feed Forward Networks
A feedforward NN is a neural network where connections between the nodes do not form a cycle. In a feedforward network, information always moves in one direction, from input to output, and never goes backward. A feedforward NN can therefore be viewed as a mathematical model of a function mapping inputs to outputs.
Recurrent Neural Network
A Recurrent Neural Network (RNN) is one that allows connections between nodes within the same layer or with nodes in previous layers.
Unlike feedforward neural networks, RNNs can use their internal state (memory) to process sequential input data.
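The role of the internal state can be sketched with a minimal Elman-style recurrent cell in NumPy; the dimensions and weight initialization here are arbitrary choices for illustration, not part of any specific library API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for the example.
n_in, n_hidden = 3, 4

# Parameters of a single recurrent cell.
W_x = rng.normal(size=(n_hidden, n_in))      # input-to-hidden weights
W_h = rng.normal(size=(n_hidden, n_hidden))  # hidden-to-hidden (recurrent) weights
b = np.zeros(n_hidden)

def rnn_step(x_t, h_prev):
    """One time step: the new state depends on the current input
    AND on the previous state -- this is the network's memory."""
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

# Process a short sequence, carrying the state forward between steps.
h = np.zeros(n_hidden)
for x_t in rng.normal(size=(5, n_in)):  # a sequence of 5 input vectors
    h = rnn_step(x_t, h)

print(h.shape)  # (4,)
```

Because `h` is fed back into `rnn_step`, the final state summarizes the whole sequence, which is exactly what a feedforward network cannot do.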
...
Such a structure is also called a Feedforward Multilayer Perceptron (MLP, see the picture).
The output of each node in a layer is computed as a weighted sum of its inputs, typically passed through a nonlinear activation function; the weights are subject to optimization via training.
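This per-node computation can be written out explicitly; the following NumPy sketch of a toy two-layer MLP forward pass uses made-up layer sizes and random (untrained) weights purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def dense(x, W, b, activation=np.tanh):
    # Weighted sum of the inputs plus a bias, then a nonlinearity.
    return activation(x @ W + b)

# A toy 2-layer MLP: 3 inputs -> 4 hidden units -> 1 output.
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

x = rng.normal(size=(2, 3))   # a batch of 2 input vectors
hidden = dense(x, W1, b1)     # hidden layer with tanh activation
output = dense(hidden, W2, b2, activation=lambda z: z)  # linear output layer
print(output.shape)  # (2, 1)
```

Training would then adjust `W1, b1, W2, b2` to minimize a loss; here they stay at their random initial values.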
...
Layers are the basic building blocks of neural networks in Keras. A layer consists of a tensor-in tensor-out computation function (the layer's call method) and some state, held in TensorFlow variables (the layer's weights).
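The layer contract described above (state held on the object, computation in a call method) can be imitated in plain NumPy; this is a hypothetical stand-in for a Keras `Dense` layer, not the real implementation:

```python
import numpy as np

class DenseLayer:
    """Minimal imitation of the Keras layer contract:
    state (the weights) is held on the object,
    the tensor-in tensor-out computation lives in call()."""

    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        # State: the layer's trainable weights and bias.
        self.W = rng.normal(scale=0.1, size=(n_in, n_out))
        self.b = np.zeros(n_out)

    def call(self, inputs):
        # Computation: a tensor goes in, a tensor comes out.
        return inputs @ self.W + self.b

layer = DenseLayer(3, 2)
y = layer.call(np.ones((4, 3)))  # batch of 4 identical inputs
print(y.shape)  # (4, 2)
```

In real Keras the weights are TensorFlow variables so gradients can flow through them; the structure of the object is the same.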
Callbacks API
A callback is an object that can perform actions at various stages of training (e.g. at the start or end of an epoch, before or after a single batch, etc).
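The mechanism behind this can be sketched without TensorFlow: a toy training loop that fires hook methods at the same stages Keras would. The class and method names mirror the Keras interface but this is a simplified stand-in, not the library itself:

```python
class Callback:
    """Minimal imitation of the Keras Callback interface."""
    def on_epoch_begin(self, epoch):
        pass
    def on_epoch_end(self, epoch):
        pass

class EpochLogger(Callback):
    """Example callback: records which epochs have finished."""
    def __init__(self):
        self.seen = []
    def on_epoch_end(self, epoch):
        # Action performed at the end of every epoch.
        self.seen.append(epoch)

def fit(n_epochs, callbacks):
    """Toy training loop that invokes the callback hooks."""
    for epoch in range(n_epochs):
        for cb in callbacks:
            cb.on_epoch_begin(epoch)
        # ... one epoch of actual training would happen here ...
        for cb in callbacks:
            cb.on_epoch_end(epoch)

logger = EpochLogger()
fit(3, [logger])
print(logger.seen)  # [0, 1, 2]
```

Real Keras callbacks additionally receive a `logs` dictionary with metrics and expose batch-level hooks, but the inversion of control is exactly this.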
...
Regularization layers: the Dropout layer
The Dropout layer randomly sets input units to 0 with frequency rate at each step during training time, which helps prevent overfitting. Inputs not set to 0 are scaled up by 1/(1 - rate) such that the expected sum over all inputs is unchanged.
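The zeroing-and-rescaling behavior (often called inverted dropout) is easy to verify directly; this NumPy sketch is an illustration of the idea, not the Keras implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, rate, training=True):
    """Inverted dropout: zero each unit with probability `rate`,
    scale the survivors by 1/(1 - rate) so the expected value
    of each unit (and hence the sum) is unchanged."""
    if not training:
        return x  # dropout is inactive at inference time
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

x = np.ones(10_000)
y = dropout(x, rate=0.5)
# About half the units are zeroed and the rest are doubled,
# so the mean stays close to the original value of 1.
print(y.mean())
```

Without the 1/(1 - rate) rescaling, activations at inference time (when dropout is off) would be systematically larger than during training.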
...