ICE 2016

Numerical Methods for Structural Estimation

ICE2016 Lectures on Numerical Methods for Structural Estimation
Thursday, July 28, 2016
Lou Henry Hoover, Room 105
You are invited to attend three presentations at ICE2016 that focus on numerical methods for structural estimation in economic models. Please circulate this to students and colleagues who may be interested.

10:30am – 12:00noon: Ken Judd, Hoover Institution
Title: Numerical Methods for Structural Estimation

Abstract: Structural estimation usually requires combining statistical theory with computational methods. The computational methods provide the mapping from parameter space to equilibrium behavior, and the statistical methods use those results to determine the values of structural parameters that lead to equilibrium processes that fit the data. The choice of computational methods changes over time, reflecting advances in both hardware and software. I will review how these choices have evolved over the past 30 years, focusing on the Nested Fixed Point method (NFXP), Mathematical Programming with Equilibrium Constraints (MPEC) (a.k.a. inverse optimum methods), the development of optimization algorithms, and related advances in differentiation and sparse linear algebra techniques. The conclusion will be that current structural estimation practice rests on ad hoc and unreliable computational procedures, but that modern tools from scientific computing could significantly improve the quality, reliability, and applicability of structural estimation.
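To make the NFXP/MPEC contrast in the abstract concrete, here is a minimal Python sketch on a toy model (not from the lecture; the fixed-point condition, data, and parameter values are invented for illustration). NFXP re-solves the model's equilibrium inside the estimation objective, while MPEC hands the equilibrium condition to the optimizer as an explicit constraint:

    # Toy example: estimate theta when the model's "equilibrium" x solves x = theta * exp(-x).
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    theta_true = 1.5

    def solve_equilibrium(theta, tol=1e-12):
        # Inner loop used by NFXP: damped fixed-point iteration on x = theta * exp(-x).
        x = 0.5
        for _ in range(10_000):
            x_new = 0.5 * x + 0.5 * theta * np.exp(-x)
            if abs(x_new - x) < tol:
                break
            x = x_new
        return x

    # Synthetic data: noisy observations of the equilibrium outcome.
    y = solve_equilibrium(theta_true) + 0.05 * rng.standard_normal(200)

    # NFXP: outer optimization over theta; the equilibrium is re-solved at every trial value.
    nfxp = minimize(lambda t: np.sum((y - solve_equilibrium(t[0])) ** 2), x0=[1.0])

    # MPEC: optimize over (theta, x) jointly; the equilibrium condition is a constraint.
    mpec = minimize(lambda z: np.sum((y - z[1]) ** 2), x0=[1.0, 0.5],
                    constraints=[{"type": "eq",
                                  "fun": lambda z: z[1] - z[0] * np.exp(-z[1])}],
                    method="SLSQP")

    print("NFXP estimate:", nfxp.x[0], "MPEC estimate:", mpec.x[0])

Both formulations recover essentially the same estimate on this toy problem; the difference is where the equilibrium computation lives.
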
1:30pm – 2:30pm: Benjamin S. Skrainka

Title: A Large Scale Study of the Small Sample Performance of Random Coefficient Models of Demand

Abstract: Despite the importance of the Berry et al. [1995] model of demand for differentiated products (BLP hereafter), there are few results about its finite sample behavior. In theory, simulation experiments provide a tool to answer such questions, but computational and numerical difficulties have prevented researchers from performing any realistic studies. The Monte Carlo studies which do exist focus on only one market and often take computational short-cuts. Nevertheless, by utilizing recent advances in optimization [Su and Judd, 2010, Dubé et al., 2011] and multi-dimensional numerical integration [Skrainka and Judd, 2011], I develop a fast, robust implementation of BLP and show that a large-scale simulation approach is now feasible. I compute the finite sample behavior under both the traditional BLP instruments (characteristics of rival goods) and exogenous cost shifters, using synthetic data generated from a structural model for realistic numbers of markets and products. This paper, then, has two objectives: to demonstrate the power of modern computational technology for solving previously intractable problems in economics via massive parallelization, and to characterize the finite sample behavior of the BLP estimator.
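As background on the objects the abstract refers to, here is a minimal Python/NumPy sketch of a random-coefficients logit share simulator and the standard BLP contraction mapping that inverts observed market shares for the mean utilities; the dimensions, draws, and parameter values below are made up, and the sketch omits prices, instruments, and the GMM layer of a full BLP estimator:

    import numpy as np

    rng = np.random.default_rng(0)
    J, K, R = 5, 3, 500                    # products, characteristics, simulation draws (illustrative)
    X = rng.standard_normal((J, K))        # product characteristics
    sigma = np.array([0.5, 1.0, 0.2])      # random-coefficient standard deviations
    nu = rng.standard_normal((R, K))       # simulated consumer taste shocks

    def shares(delta):
        # Market shares of the J inside goods, averaged over simulated consumers,
        # with an outside good whose utility is normalized to zero.
        u = delta[None, :] + (nu * sigma) @ X.T        # R x J utilities
        expu = np.exp(u)
        return (expu / (1.0 + expu.sum(axis=1, keepdims=True))).mean(axis=0)

    def invert_shares(s_obs, tol=1e-12):
        # Berry/BLP contraction: delta <- delta + log(s_obs) - log(s(delta)).
        delta = np.zeros(J)
        for _ in range(1000):
            delta_new = delta + np.log(s_obs) - np.log(shares(delta))
            if np.max(np.abs(delta_new - delta)) < tol:
                break
            delta = delta_new
        return delta

    delta_true = rng.standard_normal(J)
    s_obs = shares(delta_true)
    print(np.max(np.abs(invert_shares(s_obs) - delta_true)))   # recovers delta to ~1e-12
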
2:45pm – 3:45pm: Harry J. Paarsch, University of Central Florida

Title: Hui: A Case Study of a Sequential Double Auction of Capital

Abstract: For many immigrants, raising capital through conventional financial institutions (such as banks) is difficult, even impossible. In such circumstances, alternative institutions are often employed to facilitate borrowing and lending within the immigrant community. Using the theory of non-cooperative games under incomplete information, we analyze one such institution, the hui, which is essentially a sequential double auction among the participants in a cooperative. Within the symmetric independent private-values paradigm, we construct the Bayes-Nash equilibrium of a sequential, first-price, sealed-bid auction game, and then use this structure to interpret field data gathered from a sample of hui held in Melbourne, Australia, during the early 2000s.
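As a small illustrative check of the kind of equilibrium object the abstract mentions (not the hui model itself, which is a sequential double auction), the following Python snippet verifies the symmetric Bayes-Nash equilibrium bid in the simplest related setting: a one-shot first-price, sealed-bid auction with independent private values drawn uniformly on [0, 1], where the equilibrium bid is b(v) = (n - 1)/n * v. The bidder count and value below are arbitrary:

    import numpy as np

    n = 4        # number of bidders (arbitrary)
    v = 0.7      # this bidder's private value, a Uniform[0, 1] draw

    def expected_payoff(b):
        # Rivals follow the candidate equilibrium b(w) = (n - 1)/n * w, so the highest
        # rival bid is below b exactly when every rival value w is below b * n / (n - 1).
        win_prob = min(b * n / (n - 1), 1.0) ** (n - 1)
        return (v - b) * win_prob

    grid = np.linspace(0.0, v, 100_001)
    best_bid = grid[np.argmax([expected_payoff(b) for b in grid])]
    print(best_bid, (n - 1) / n * v)   # both are approximately 0.525

The grid search confirms that bidding (n - 1)/n * v is a best response when rivals bid the same way, which is the fixed-point logic behind the Bayes-Nash construction that the talk carries over to a sequential setting.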
