ELEMENTS OF THE STATISTICAL LEARNING CONCEPT
FOR A NEURAL NETWORK AND ACCURATE
PREDICTION OF ITS OPERATION

G. F. Malychina, A. V. Merkusheva

Saint Petersburg

    The learning of neural networks (NN) for many problems (pattern recognition, nonlinear multi-parameter regression, probability distribution identification) is considered in generalized form on the basis of a concept that includes a probabilistic interpretation of the NN input-output transfer function and basic notions having a mathematically formalized foundation: the diversity (set) of mappings realized by the NN (and the set of loss functions isomorphic to it); characteristics of that diversity based on entropy and the Vapnik-Chervonenkis dimension; the risk functional (RF) and a condition allowing the RF to be approximated by an empirical risk functional (ERF); and bounds on the departure of the actual RF from the ERF. The elements of statistical learning theory described here provide prediction and correction ("control") of the NN performance index after learning, i.e., at the stage of testing the NN on data that did not participate in learning.
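The relation between the risk functional and its empirical counterpart that the abstract refers to can be sketched numerically. The following is a minimal illustration, not the authors' method: the quadratic loss, the synthetic linear data source, and the fixed mapping are all assumptions chosen for clarity. The ERF is the average loss over a finite learning sample; the RF (expected loss over the data distribution) is approximated here by averaging over a large independent test sample.

```python
import random

def loss(y_pred, y_true):
    # Quadratic loss: an illustrative choice of loss function
    return (y_pred - y_true) ** 2

def empirical_risk(model, sample):
    # Average loss over a finite sample of (x, y) pairs
    return sum(loss(model(x), y) for x, y in sample) / len(sample)

random.seed(0)

def draw(n):
    # Hypothetical data source for illustration: y = 2x + Gaussian noise
    return [(x, 2 * x + random.gauss(0, 0.1))
            for x in (random.uniform(-1, 1) for _ in range(n))]

model = lambda x: 2 * x              # one fixed mapping from the NN's set

train = draw(20)                     # small learning sample
test = draw(10000)                   # large sample: proxy for the distribution

erf = empirical_risk(model, train)   # empirical risk functional (ERF)
rf = empirical_risk(model, test)     # approximates the risk functional (RF)
gap = abs(rf - erf)                  # departure of the actual RF from the ERF
```

Statistical learning theory bounds `gap` in terms of the sample size and the capacity (e.g., VC dimension) of the set of mappings, which is what allows the ERF to stand in for the RF during learning.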