Lookup NU author(s): Professor Elisabetta Cherchi
Full text for this publication is not currently held within this repository. Alternative links are provided below where available.
When the dimension of the vector of estimated parameters increases, simulation-based methods become impractical because the number of draws required for estimation grows exponentially with the number of parameters. This lack of empirical identification as the number of parameters increases is usually known as the “curse of dimensionality” in simulation methods. We investigate this problem in the case of the random coefficients Logit model. We compare the traditional Maximum Simulated Likelihood (MSL) method with two alternative estimation methods that do not require simulation: the Expectation–Maximization (EM) method and the Laplace Approximation (HH) method. We use Monte Carlo experiments to investigate systematically the performance of the methods under different circumstances, including different numbers of variables, sample sizes and structures of the variance–covariance matrix. Results show that MSL indeed suffers from a lack of empirical identification as the dimensionality grows, while EM deals much better with this estimation problem. The HH method, although not simulation-based, shows poor performance with large dimensions, principally because of the need to invert large matrices. The results also show that, when MSL is empirically identified, it seems superior to EM and HH both in its ability to recover the true parameters and in estimation time.
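The MSL objective discussed in the abstract can be illustrated with a minimal simulated log-likelihood for a random coefficients Logit. This is not the authors' code: the dataset, the number of draws, and the assumption of independent normal coefficients (diagonal covariance) are illustrative choices made here for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small dataset: N individuals, J alternatives, K attributes.
N, J, K, R = 200, 3, 2, 500          # R = number of simulation draws

X = rng.normal(size=(N, J, K))       # alternative attributes
true_mu = np.array([1.0, -0.5])      # mean of random coefficients
true_sd = np.array([0.8, 0.4])       # std. dev. (diagonal covariance)

# Simulate choices from the true random coefficients Logit model.
beta_n = true_mu + true_sd * rng.normal(size=(N, K))
util = np.einsum('njk,nk->nj', X, beta_n) + rng.gumbel(size=(N, J))
y = util.argmax(axis=1)

def simulated_loglik(params, X, y, draws):
    """Simulated log-likelihood for a random coefficients Logit with
    independent normal coefficients; `draws` are fixed N(0,1) draws
    of shape (R, K), so the objective is smooth in `params`."""
    K = X.shape[2]
    mu, sd = params[:K], np.abs(params[K:])
    betas = mu + sd * draws                      # (R, K) coefficient draws
    v = np.einsum('njk,rk->nrj', X, betas)       # utilities per draw
    v -= v.max(axis=2, keepdims=True)            # numerical stability
    p = np.exp(v)
    p /= p.sum(axis=2, keepdims=True)            # Logit probabilities per draw
    p_chosen = p[np.arange(len(y)), :, y]        # (N, R) chosen-alt. probs
    # Average over draws approximates the mixed Logit probability.
    return np.log(p_chosen.mean(axis=1) + 1e-300).sum()

draws = rng.normal(size=(R, K))
ll_true = simulated_loglik(np.concatenate([true_mu, true_sd]), X, y, draws)
ll_zero = simulated_loglik(np.zeros(2 * K), X, y, draws)
```

Maximizing this objective over `params` is the MSL estimator; the curse of dimensionality arises because, as K grows, the fixed set of R draws covers the K-dimensional coefficient space ever more sparsely, so R must grow rapidly to keep the simulation error under control.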
Author(s): Cherchi E, Guevara CA
Publication type: Article
Publication status: Published
Journal: Transportation Research Part B: Methodological
Year: 2012
Volume: 46
Issue: 2
Pages: 321-332
Print publication date: 01/02/2012
Online publication date: 15/12/2011
ISSN (print): 0191-2615
ISSN (electronic): 1879-2367
Publisher: Elsevier
URL: https://doi.org/10.1016/j.trb.2011.10.006
DOI: 10.1016/j.trb.2011.10.006