Economists vs Computation

In this section, I relate some examples of conversations and correspondence I and others have had with economists that reveal their approaches and attitudes towards computation.


Code Fishing

The following exchange happened at a seminar:

Judd: I am worried about your algorithm. [List of concerns.] In particular, the convergence rate is probably very slow.

Seminar speaker: Yes, those are good points but the program works.

Judd: What do you mean by “works”?

Seminar speaker: The results are what my economic intuition says they should be.

Judd: What if the results were not compatible with your intuition?

Seminar speaker: Then I would conclude that the program was wrong.

This exchange illustrates an abuse of computational modeling that I suspect is all too common. For this economist, the only objective of computational modeling is to support the conclusions he has already reached. He trusts his intuition more than the validity of his computational approach. He may be correct in that evaluation of his computational skills, but that raises the question of why he is using computational methods at all. In the end, he is essentially fishing for code that gives him what he wants. Of course, his papers do not admit that; instead, they wrap his “intuitions” in a mathematical veneer that will likely mislead readers into thinking that his conclusions are based on careful computations.

Sounds a lot like “regression fishing.” Perhaps he should write a paper with the title “I Just Ran Four Million Programs”.


Caveat Emptor (Particularly when the price is zero!)

At a seminar, I pointed out that the X method for analyzing the X model had been shown to be seriously flawed. X offered no defense of his method, and appeared to accept the points I made. In fact, he said he would be happy if people found better ways to analyze X models, since then more people would use them. I later e-mailed him suggesting that he add a note to the website hosting his code, saying that flaws in the code’s method had been found. Nothing changed. X knows that his code is being downloaded by people interested in adapting it to help solve their estimation problems, but he chooses to keep them in the dark about the method’s flaws.

The lesson is clear. Do not trust the code you download. Its sole purpose is to document the computations in a paper. It would be nice if authors also felt some responsibility to inform potential users about serious flaws discovered after publication, particularly in cases where the code and its underlying method are used frequently. Unfortunately, many economists act in the same manner as the actors in their economic models.


It’s the RA’s Fault — NOT!!

X: Ken, I want to ask you a computational question. First, let me tell you about the economic model.

Judd: I don’t want to hear anything about the model. The economic details are irrelevant. All I need to know are the mathematical features of the problem. First, what kind of problem is it? Constrained optimization?

X: Yes.

Judd: Second, what is the objective like? Is it continuous? differentiable? concave or convex?

X: It is linear.

Judd: Third, what are the constraints like? continuous? differentiable?

X: They are linear.

Judd: You have a linear programming problem. What software are you using?

X: We are using Matlab. We are able to solve problems with 100 variables and constraints but we can’t handle problems with 200 variables and constraints.

Judd: The Matlab Optimization Toolbox has a linear programming module. I am surprised that Matlab can’t even do a decent job of linear programming.

X: Well, my RA wants to use fmincon.

Judd: [Usual rant about fmincon].

In this case, the professor knows little about computation, and so relies on his RA. Of course, the professor has made no effort to have the RA educated in numerical methods or in the relative quality of available software. The blind professor relies on one of the many students he and his colleagues keep blind.

This also illustrates the ignorance that pervades economics about current computational capabilities. Even if one uses only a laptop, linear programming problems with a few hundred variables and constraints are not large problems. They are not small problems. They are infinitesimal problems.
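To put the anecdote’s numbers in perspective, here is a minimal sketch, assuming SciPy is available. It builds an arbitrary random linear program with 300 variables and 300 inequality constraints (constructed so that x = 0 is feasible, with box bounds keeping it bounded) and solves it with `scipy.optimize.linprog`. On an ordinary laptop this typically takes a fraction of a second.

```python
# Sketch: a 300-variable, 300-constraint LP is a tiny problem.
# Assumes NumPy and SciPy are installed; linprog wraps the HiGHS solver.
import time
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 300, 300                      # 300 variables, 300 inequality constraints
c = rng.standard_normal(n)           # arbitrary linear objective
A = rng.standard_normal((m, n))      # constraint matrix A x <= b
b = rng.uniform(1.0, 2.0, size=m)    # b > 0, so x = 0 is feasible

t0 = time.perf_counter()
res = linprog(c, A_ub=A, b_ub=b, bounds=(0.0, 1.0))  # box bounds keep it bounded
elapsed = time.perf_counter() - t0

print(res.status, round(elapsed, 3))  # status 0 means an optimum was found
```

Since x = 0 is feasible with objective value 0, the minimum is at most 0; any modern LP solver dispatches a problem of this size essentially instantly.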


Nobody Told Me This is a Tea Party

At a presentation at the World Congress, I pointed out some of the mathematical errors made by an earlier attempt to solve a problem numerically. After the presentation, the authors of those errors jumped out of their chairs and approached me. One said,

“Ken, why do you always criticize our work? We never criticize your work!”

This comment illustrates the current aversion to serious discussions of facts. I understand that when one goes to a social event, various conventions are to be followed. For example, if I meet a man at a party and he introduces me to his wife, I am supposed to refer to her as his “lovely wife”. However, I don’t recall getting a memo saying that these conventions are also in force when discussing research at conferences, seminars, or in published work. Of course, if I did, I tossed it in the trash, where it belonged.

I have frequently observed that criticism is considered impolite, even in factual discussions of research. When I was an editor, an author stated in his paper that he had a “different” way of solving a problem, when in fact he had shown that the previous computational attempts were wrong. At least one referee seized on the ambiguity of the word “different” to assert that there was nothing interesting in the paper. It took considerable effort on my part (even given my role as the editor) to convince the author to state clearly the key facts: the previous paper had made serious mathematical errors, and his paper presented a mathematically sound way to proceed.


It was only the computational section; who cares if it is correct?

This is the text of an e-mail sent to me in reply to some comments I made on the sender’s paper. My reactions are in brackets and italics.

When you told us about the [problems] you found we were doing the galleys of our [name of journal] paper and we did make some changes to the text, but we did not signal problems to the reader.

[I am sure the readers are so grateful that you did not burden them with this kind of information.]

That we didn’t is just a screw-up, and it was in no way motivated by hiding anything or making the method we used look better than it is.

[Ever heard of “observational equivalence”?]

Our main concern was whether to actually replicate your results then and there and report on it in detail in the published paper, but we decided not to simply because there was very little time;

[Little time? Did the editor say “Send it in now, right or wrong!”]

also, we thought that the computational section was not emphasized much and for that reason we didn’t want to halt publication.

[So, lack of emphasis means that errors are not important? Perhaps you should have put a warning on the section indicating that. What if there was a grad student that was relying on this method for his thesis? Would you say to him “the section was not emphasized, so you should not have trusted it”?]

Clearly, however, we should have added some comments about how you … had found [problems] and that alternative methods therefore might work better.

[Something we can agree on. By the way, I keep checking for a corrigendum in the journal. Apparently I am not good at computer searches. Can you tell me when it appeared?]


It’s only an Econometrica paper; who cares if it is correct?

An econometrician once told me it did not matter if the computational aspects of an econometrics paper in Econometrica were bad. He asked, “Who is getting hurt?”

My reply was the following:

“Suppose that your friend read that paper and used it in some litigation consulting he was doing. Suppose also that the other side in the lawsuit hired someone who could track down computational sloppiness in such work. Suppose that your friend’s computations were shown to be seriously flawed, and moreover the errors were favorable to your friend’s employer. What do you suppose the lawyers working with someone who knows a little numerical analysis would do with those facts? How would it affect your friend’s reputation in the litigation consulting market?”

Of course, he might be right that nothing in Econometrica has any real social usefulness.



Computational Amnesia

Seminar speaker: I would like to estimate a flexible version of my model, but computational limitations make it impossible to estimate all (n-1)n/2 parameters. So, I will set all but n equal to zero.

[Twenty minutes of model description.]

Speaker: I use the X method to compute my estimates.

Judd: It would be much faster if you used methods like Y and Z.

Speaker: That may be true, but I have so much computing power that speed does not matter.

Judd: Twenty minutes ago you told us that you had to set all but n parameters to zero due to computational limitations.

Speaker: (Pause.) Oh. That hurt.

Economists often forget the decisions they made in the past because of their perception of what the computational limits were. Given the continuing and rapid drop in the cost of computing, and the development of new algorithms, many problems that were impractical just ten years ago are now feasible. The fact that the academic economics community discourages the development of computational expertise creates a large gap between what economists think is feasible and what really is feasible.


Who is this Newton dude?

Conference speaker: I have a difficult problem. It is nonlinear and represents a dynamic equilibrium. I iterated this way and that way, and I got it to converge. (Figuratively pats himself on the back.)

After his presentation, I had the following conversation with the speaker:

Judd: I want to understand the details. If I understood you, you are solving a linear approximation for the equilibrium decision rule.

Speaker: Yes

Judd: There are three unknown coefficients, right?

Speaker: Yes

Judd: The three unknowns are pinned down by three equations, right?

Speaker: Yes

Judd: Those equations are smooth with respect to the unknowns, right?

Speaker: Yes

Judd: Why didn’t you use Newton’s method to solve your smooth three-variable, three-equation system?

Speaker: (Silence)
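For a smooth square system F(x) = 0, Newton’s method updates a guess x by solving the linear system J(x) Δ = −F(x), where J is the Jacobian, and setting x ← x + Δ. A minimal sketch in Python, using an arbitrary illustrative three-equation system (not the speaker’s model), with the 3×3 linear solve written out by hand:

```python
# Newton's method on an illustrative smooth 3-variable, 3-equation system.
# The system below is made up for demonstration; its root is (1, 2, 1).

def F(v):
    x, y, z = v
    return [x * x + y - 3.0, y * y + z - 5.0, z * z + x - 2.0]

def J(v):
    # Analytic Jacobian of F
    x, y, z = v
    return [[2 * x, 1.0, 0.0],
            [0.0, 2 * y, 1.0],
            [1.0, 0.0, 2 * z]]

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system
    A = [row[:] for row in A]
    b = b[:]
    n = 3
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

def newton(v, tol=1e-12, max_iter=50):
    for it in range(max_iter):
        f = F(v)
        if max(abs(c) for c in f) < tol:
            return v, it
        step = solve3(J(v), [-c for c in f])   # solve J(v) step = -F(v)
        v = [vi + si for vi, si in zip(v, step)]
    raise RuntimeError("Newton did not converge")

root, iters = newton([1.2, 1.8, 1.2])
print(root, iters)
```

For a smooth, well-conditioned system like this, Newton converges quadratically: a handful of iterations reach machine precision, which is exactly why ad hoc “iterate this way and that way” schemes are hard to justify for a three-equation problem.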


Economists aren’t interested in the right answer

X: I have a friend on the job market you should look at.

Y: He only does computational work.

X: Yes, but he shows how to get the correct answer whereas previous methods were unreliable and gave wrong answers.

Y: Getting the right answer is only of second-order importance in economics.