Full text for this publication is not currently held within this repository. Alternative links are provided below where available.
Multi-layer perceptrons (MLPs), a common type of artificial neural network (ANN), are widely used in computer science and engineering for object recognition, discrimination and classification, and have more recently found use in process monitoring and control. "Training" such networks is not a straightforward optimisation problem, and we examine features of these networks which contribute to the optimisation difficulty. Although the original "perceptron", developed in the late 1950s (Rosenblatt 1958, Widrow and Hoff 1960), had a binary output from each "node", this was not compatible with backpropagation and similar training methods for the MLP. Hence the output of each node (and the final network output) was made a differentiable function of the network inputs. We reformulate the MLP model with the original perceptron in mind so that each node in the "hidden layers" can be considered as a latent (that is, unobserved) Bernoulli random variable. This maintains the property of binary output from the nodes, and with an imposed logistic regression of the hidden layer nodes on the inputs, the expected output of our model is identical to the MLP output with a logistic sigmoid activation function (for the case of one hidden layer). We examine the usual MLP objective function - the sum of squares - and show its multi-modal form and the corresponding optimisation difficulty. We also construct the likelihood for the reformulated latent variable model and maximise it by standard finite mixture ML methods using an EM algorithm, which provides stable ML estimates from random starting positions without the need for regularisation or cross-validation. Over-fitting of the number of nodes does not affect this stability. This algorithm is closely related to the EM algorithm of Jordan and Jacobs (1994) for the Mixture of Experts model. We conclude with some general comments on the relation between the MLP and latent variable models.
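The expectation identity described in the abstract follows from linearity of expectation: if each hidden node is a Bernoulli variable whose success probability is a logistic regression on the inputs, and the network output is linear in the hidden nodes, then the expected output coincides with a deterministic one-hidden-layer MLP using logistic sigmoid activations. The following sketch checks this numerically by Monte Carlo; all weights and sizes are hypothetical, chosen only for illustration, and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical network: 2 inputs, 3 hidden nodes, linear output.
x = np.array([0.5, -1.0])       # a single input vector
W = rng.normal(size=(3, 2))     # hidden-layer weights
b = rng.normal(size=3)          # hidden-layer biases
v = rng.normal(size=3)          # output weights
c = 0.1                         # output bias

# Standard MLP: deterministic sigmoid hidden layer, linear output.
mlp_out = v @ sigmoid(W @ x + b) + c

# Latent-variable reformulation: each hidden node is a Bernoulli
# draw with the same logistic-regression success probability.
p = sigmoid(W @ x + b)
h = (rng.random((500_000, 3)) < p).astype(float)  # Monte Carlo draws
latent_out = (h @ v + c).mean()

# By linearity of expectation the two agree up to Monte Carlo error.
assert abs(mlp_out - latent_out) < 0.01
```

Note the identity as stated holds for a linear output layer; with a nonlinear output node the expectation no longer passes through exactly, which is part of what makes the one-hidden-layer case special.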
Author(s): Aitkin M, Foxall R
Publication type: Article
Publication status: Published
Journal: Statistics and Computing
Year: 2003
Volume: 13
Issue: 3
Pages: 227-239
Print publication date: 01/08/2003
ISSN (print): 0960-3174
ISSN (electronic): 1573-1375
Publisher: Springer
URL: http://dx.doi.org/10.1023/A:1024218716736
DOI: 10.1023/A:1024218716736