Hall and Rabushka introduced their Flat Tax proposal in a Wall Street Journal op-ed on December 10, 1981. This was followed by their book The Flat Tax, which has stimulated much discussion about tax reform around the world. This document proposes FlatTax+40, a project to update the ideas and arguments in The Flat Tax and to analyze tax reforms in general.

Prior to WWII, the US Federal Government relied primarily on a variety of excise taxes and tariffs plus a small income tax. The fiscal demands of WWII forced the US to implement a significant income tax. The leading concept of a good income tax was provided by Haig and Simons, who argued for a comprehensive definition of income, leading to a broad base and low rates. This focus on low rates and a broad base was overturned by the Diamond-Mirrlees analysis of 1971, which argued that only consumption should be taxed. 1971 also brought the famous Mirrlees paper on the optimal taxation of income, which gave economists ways to study efficient implementations of social preferences over the distribution of consumption.
The Flat Tax incorporates these insights into a simple approach to taxation. The tax base is consumption, but taxes are collected using the existing institutional structure of the income tax. This can be done with modest changes to standard features of the income tax. While the Diamond-Mirrlees theory advocated different tax rates for each good, the Flat Tax imposes a single flat rate on consumption. The Flat Tax also implements progressivity, and it does so far more simply than any VAT or NST could, and far more simply than the schedules implied by Mirrlees' analysis.
In another WSJ op-ed, Rabushka proposed a merger of the Social Security and tax systems. The lack of any serious study of proposals like this further demonstrates the need for FlatTax+40.
The Flat Tax is a simple approach to progressive consumption taxation. Critics will argue correctly that the Flat Tax, as well as any merger with Social Security, will violate the theoretical conditions for being “optimal”. FlatTax+40 will address the practical policy question of whether the benefits of a more “optimal” tax system justify the cost of extra complexity.
FlatTax+40 will use the past forty years of advances in economics and quantitative tools to do a deeper study of the Flat Tax and its performance relative to alternatives. There have also been many changes in the US and global economies that require a reexamination of some arguments. The aim of this project is to take existing tools from computational science and applied mathematics, build models that capture more of the complexity of the real world, and apply them to questions about the Flat Tax and tax reforms in general.
Most economists are not aware of the kind of computational power that is now available. In December 2019, the JPE published a paper I wrote. The most complex example was computed in under six hours using 80,000 cores. From 2013 to 2019, the NSF Blue Waters supercomputer gave my postdocs and me over 20M core hours to support that research. That work was all done over six years ago. Blue Waters is now considered obsolete and has been replaced by far more powerful computer systems.
The Flat Tax has been a valuable reference point in many discussions of tax reform. This project will also use it as a focal point, but will have no agenda other than sound analysis. The initial task is to build a foundation and create the tools for our research, and then to invite all to participate in the general discussion. Past analyses of the Flat Tax and other tax reforms typically use highly stylized models that focus on a few aspects of the economy and the tax system, because that is all standard tools in economics can handle. We will build our framework from the start so that it can handle far more complexity and realism than earlier work, and we will use it for basic tax issues from the beginning. When the framework has been tested and found reliable, we will reach out to other researchers and offer to help them use it to investigate the issues and questions that concern them.
Focus: Life-cycle decisions and tax policy
This project will initially have a focus on the effects of taxation on life-cycle decisions, examining the cost of tax distortions relative to the revenue generated. This is a natural focus because there can be no coherent analysis of any tax policy without an understanding of how taxation affects individual decisionmaking.
The life-cycle focus means that we will initially ignore general equilibrium effects. Flat Tax arguments sometimes involve general equilibrium reasoning, but such analysis depends on modeling supply and demand curves, all of which ultimately arise from life-cycle decisionmaking. As we assemble the software and hardware tools and the databases used in this project, we will also discover ways to apply them to general equilibrium analysis. The proposed title is broader than the short-run focus, but it does reflect a longer-term goal of this line of research.
Life-cycle decisionmaking is distorted in many ways by the current US tax system. Any tax system will affect decisions related to savings, consumption, labor supply, and investment in education. Many analyses have pointed out that these distortions can be substantial.
The US tax system creates more problems. The natural course of saving would be to first save for a car down payment, pay off the car while saving for a house down payment, buy a house, pay down the mortgage while saving for the children's college education, and then focus on saving for retirement. The tax system interferes with this in many ways. The limits on retirement contributions push people into retirement saving when young, reducing their ability to pay for a house and the kids' education. This is partly alleviated by the special treatment of home mortgage payments and by special programs for college savings, together with rules about how the money can be used if the kids do not go to college. After the kids are on their own, it may be time to move to a smaller house, but that will trigger capital gains taxes. In California, such moves can also cause a jump in property taxes. The tax costs of housing transactions lead to inefficiently low turnover, and may even lead to excessive consumption of housing services.
These tax rules imply that a large share of one's assets is tied up in illiquid forms, which makes it difficult to use wealth to deal with unexpected financial shocks, such as those arising from unemployment and health problems.
FlatTax+40 will examine the total cost of these distortions. Previous analyses have focused on stylized simplifications because they have used only standard tools. FlatTax+40 will use modern quantitative tools, and look at more realistic models.
We understand that no model can represent the full complexity of life-cycle decisionmaking and taxation. The goal of FlatTax+40 is not to create the most complex model of taxation possible. We understand that most economists and policy analysts will not have access to high-power computing resources, and must rely instead on simplified models. However, no one knows which simplifications are valid for which purposes. FlatTax+40 will examine complex models and then determine which simplified versions are reliable for particular purposes. This "model reduction" approach is a proven tool in applied computational science, and we will bring it to economics.
Forty years of change and progress
The past 40 years have seen vast changes in the US economy. For example, the initial Flat Tax book took the perspective that the US was best viewed as a closed economy. In particular, it assumed that the real rate of interest was determined within the US and would change in response to changes in the tax system. That view was appropriate in 1981 because the US accounted for a very large share of the world economy. In the past 40 years, all developed countries have experienced significant economic growth and have liberalized capital flows across international borders. Furthermore, India, the former Soviet Union, and China have increased their participation in the world economy. The US is still a significant player in world asset markets, but prices such as the real interest rate are now determined largely by the world economy.
The past 40 years have also seen great changes in the US work force, with fewer blue-collar jobs and far more jobs requiring significant educational investment. There have also been major social changes, such as the increase in labor force participation by women and the rise of single-parent households. All of these changes affect how tax policy shapes people's lives and income processes.
New models of consumer decisionmaking
The typical life-cycle model assumes that individuals maximize expected utility, with utility separable across time. Many alternatives and generalizations have been developed, including nonseparable utility such as habit formation. Expected utility has been empirically rejected because it implies that individuals would hold portfolios far riskier than we observe. This has led to the use of Kreps-Porteus preferences, in particular the Epstein-Zin case.
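In one widely used normalization (the notation here is generic, not tied to any particular paper cited in this proposal), the Epstein-Zin recursion defines current utility as

```latex
V_t = \left[ (1-\beta)\, c_t^{1-\rho}
      + \beta \left( \mathbb{E}_t\!\left[ V_{t+1}^{1-\gamma} \right] \right)^{\frac{1-\rho}{1-\gamma}}
      \right]^{\frac{1}{1-\rho}}
```

where \(\gamma\) governs risk aversion and \(1/\rho\) is the intertemporal elasticity of substitution. Time-separable expected utility is the special case \(\gamma = \rho\); letting the two parameters differ is what allows the model to match observed portfolios without implausible elasticities.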
One simplifying but unrealistic assumption is that individuals know the probabilities of the risks they face. Robust decisionmaking models, particularly those advanced by Hansen and Sargent, relax that assumption.
Ambiguity aversion and rational inattention are other bounded rationality concepts used in economics.
Behavioral economics and withholding
As an economist at the US Treasury Department, Milton Friedman made the case for income tax withholding in 1942. Withholding reduced the transparency of the tax and arguably made it easy to raise the revenue needed to finance WWII. Economists know that this goes against his rational expectations approach to economics, leading some to think that Friedman was just doing the job he was hired to do. It turns out that he (being a pragmatic economist) believed this argument, said withholding was necessary in WWII to raise the needed revenue, but wished that it had been repealed after the war. This is an example of what is now called "behavioral economics," a field that has ignored Friedman's contributions and arguments. One way to formulate this idea is that people will not save enough money today to pay tax bills several months in the future, an idea modeled by hyperbolic discounting. We will use our tools to examine the implications of hyperbolic discounting, along with other forms of "bounded rationality" proposed in behavioral economics, to analyze taxpayers' responses to withholding and other features of the tax system.
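One standard way to formalize this idea is quasi-hyperbolic ("beta-delta") discounting. The sketch below uses invented parameter values and a hypothetical tax bill purely for illustration:

```python
# Quasi-hyperbolic ("beta-delta") discounting: every future period is
# scaled down by an extra factor beta, creating present bias.
# Parameter values are illustrative, not estimates.

def quasi_hyperbolic_weight(t_months, beta=0.7, delta=0.99):
    """Weight a present-biased agent puts on a payoff t months away."""
    return 1.0 if t_months == 0 else beta * delta ** t_months

def exponential_weight(t_months, delta=0.99):
    """Weight a standard exponential discounter puts on the same payoff."""
    return delta ** t_months

# A hypothetical $1,000 tax bill due in 9 months feels much smaller to
# the present-biased agent, so too little is set aside absent withholding.
bill, t = 1000.0, 9
felt_cost_biased = quasi_hyperbolic_weight(t) * bill
felt_cost_exponential = exponential_weight(t) * bill
```

Because beta applies uniformly to all future dates, the agent undervalues the distant bill today but feels its full weight when it arrives; withholding removes the need for the saving that present bias undermines.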
Intrafamily decisionmaking
Families are not individuals, but economists like to assume that they act as if they were, by assuming that differences in preferences are resolved in a rational fashion and lead to Pareto-efficient outcomes. The movie "The War of the Roses" makes one skeptical. Game theory produces few precise predictions, but it can describe the large variety of behaviors compatible with the rules of a game. Recently developed computational tools can determine the set of possible Nash equilibria and study how changes in the rules and payoffs affect that set. They could be applied to study how tax policy details, such as the option to file separately or jointly, might affect intrafamily decisionmaking on joint labor supply.
Forty years of data gathering and empirical analysis have produced information about the demand and supply curves in the US economy, but have also made us aware of the limits of our knowledge. Economics is not like physics, where, for example, the fine-structure constant is known with ten-digit accuracy. This project will use the findings of the best econometric analysis but will also incorporate the remaining uncertainty into evaluations of policy.
We will rely on results from the "structural estimation" approach to empirical work. This approach uses data to determine the underlying demand and supply curves in the economy. Some empirical tax research looks at historical data on tax rates and economic output and uses regressions to make predictions about changes in tax policy. That approach cannot work for analyzing novel tax reform ideas because there is no data on new and unused tax policies. Structural estimation is the only possible empirical foundation for analyzing fundamental tax reform ideas.
The computational power available to economists 40 years ago was infinitesimal compared to what we can use today. Computer scientists point to Moore's law, which tells us that the past 40 years have increased computer chip speed by a factor of 1,000,000. Economists focus on cost, where the advances have been even more dramatic, with the cost of a gigaflop falling from $40M in 1984 to $0.03 in 2017. Equally impressive have been advances in the algorithms used to solve problems similar to those in economics. Furthermore, technology now allows us to connect tens of thousands of small computers to work together to solve enormous problems in a short time.
Forty years ago, economists had to use very simplified descriptions of the world when analyzing economic data and tax policies. Economists who focused on different aspects of reality often came to different conclusions. Today economists can merge far more features of economic life into their models and use those models as a silicon test tube to determine which features are most important.
Investments in human capital, housing capital and financial assets
The tax system’s distortions of household investment and savings decisions will be one of the first issues examined. This topic has been studied extensively but using highly stylized models. FlatTax+40 will do a far more extensive study, incorporating housing investments, portfolio investments, tax-favored retirement accounts, and uncertainty in both wages and asset returns. It will also examine the implications of alternative models of decisionmaking and preferences.
Flat Tax versus optimal commodity-specific taxes
The Diamond-Mirrlees analysis advocates different tax rates for each consumption good. The Flat Tax uses a single rate, implying that it is inferior in theory. The policy question is whether this deviation from theory is sufficiently important to justify more complex forms of consumption taxation.
Flat Tax plus carbon tax
Conservatives like the Flat Tax, and Liberals want a carbon tax, which is a particular kind of consumption tax. Perhaps there is a grand bargain to be struck.
Unemployment insurance design
The unemployment insurance system is part of the overall tax and subsidy system in the US. Labor supply decisions are affected by fluctuations in the labor market, tax policy, and the unemployment insurance system. Retirement savings incentives in the tax code encourage saving but often in illiquid forms. Any comprehensive analysis of life-cycle decision making and taxation should include studies on how the labor market matching process is affected by all social programs.
After we have constructed a database of the preferences of individuals over alternative tax systems, we can use current models of social choice, such as legislative bargaining models, to study what those theories say about likely outcomes of the political process.
The purpose of this project is to apply modern tools of analysis to tax policy, not find ways to confirm prejudices. Many disagreements arise from different assumptions about how individuals make decisions based on their preferences and beliefs, and how societies aggregate those individual differences to arrive at tax policies. This project will focus on what economic analysis can say about the implications of various alternative policies but stay away from telling society what its objectives should be. I expand on those ideas in two points below.
Pareto efficiency, not optimality
The optimal tax system depends on the social objective. Many use representative agent models which imply no conflict between individual preferences and the social objective. Representative agent models were the only tractable models 40 years ago, but obviously have limited value in thinking about the distributional consequences of any tax question. In his work on taxation and income distribution, Mirrlees used a utilitarian objective in his static model and assumed people differed only in their productivities.
The past 40 years have seen advances in overlapping generations models, dynamic stochastic general equilibrium models, and methods for applying Mirrlees' ideas to far more complex and diverse societies. We now have the ability to examine how taxes affect life-cycle decisions in societies with several dimensions of heterogeneity.
However, we still have no scientific tools for telling societies what their objectives should be. One theme of this project is to tell policymakers the consequences of alternative tax systems, point to the policies that are most effective for different objectives, and then let them make a choice based on their preferences. The key concept is Pareto optimality. Multiobjective optimization is the branch of the mathematical programming literature focused on these issues, and it has developed general-purpose tools often used by consultants when they help large organizations decide among their options.
For example, suppose that society is choosing wage and interest income taxes and a level of expenditures. Assume a typical wage-earnings profile for individuals in an overlapping generations model, but that young people cannot borrow against future higher income. This is a capital market imperfection that renders basic optimal tax theory irrelevant, and it has led some to argue for interest income taxation. One way to examine tax policy choices is to compute the implications of alternative tax systems and then display the ones that are Pareto efficient in terms of producing the most utility given the revenue target. The figures below display the results of one example, comparing the life-cycle utility of an individual (who lives 80 years) to the present value of the taxes paid. Each dot represents the utility-revenue combination achieved by some choice of the wage and interest tax rates, with the color representing the magnitude of the tax. As you can see, the best choice always has a low rate of taxation on interest income.
These graphs show that interest income taxation should be low no matter what level of government expenditure is chosen. In this case, interest income taxation creates distortions sufficiently costly that they wipe out any benefit from a revenue-neutral reduction in the labor income tax rate. One objective of this project is to evaluate the Pareto-efficient frontier of tax policy alternatives alongside the Flat Tax. If some policy is superior to the Flat Tax, these tools will allow us to quantify the magnitude of that difference and provide some information on whether the benefits of a more complex policy justify the extra administrative costs relative to the Flat Tax.
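The grid-then-filter procedure behind such figures can be sketched in a few lines. The utility and revenue functions below are hypothetical stand-ins for the life-cycle model, chosen only to make the example self-contained; the structure, not the functional forms, is the point:

```python
# A minimal sketch of the procedure: evaluate every (wage tax, interest
# tax) combination on a grid, then keep the Pareto-efficient
# utility-revenue pairs. evaluate() is a hypothetical stand-in for
# solving the full life-cycle model.
import itertools

def evaluate(tau_w, tau_r):
    """Return (lifetime utility, present value of taxes) at given rates.
    Stylized functional forms, for illustration only."""
    utility = -(tau_w ** 2) - 2.0 * (tau_r ** 2)  # distortions lower utility
    revenue = tau_w + 0.5 * tau_r                 # stylized revenue function
    return utility, revenue

grid = [i / 20 for i in range(21)]  # tax rates from 0.00 to 1.00
points = [evaluate(tw, tr) for tw, tr in itertools.product(grid, grid)]

def pareto_frontier(points):
    """Keep points not dominated in (utility, revenue): p is dominated
    if some other point is at least as good in both dimensions and not
    identical in value."""
    return [p for p in points
            if not any(q[0] >= p[0] and q[1] >= p[1] and q != p
                       for q in points)]

frontier = pareto_frontier(points)
```

With these stylized forms, the interest tax lowers utility twice as fast as the wage tax while raising only half the revenue, so the efficient combinations concentrate on low interest tax rates, mirroring the qualitative pattern described above.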
Technical note: It took 19 seconds for Mathematica on my obsolete iMac to compute the results for the 6561 tax rate combinations. This glacial speed is justified by the ease with which I could create these plots. Serious research will use the best numerical algorithms and software on the best available hardware, improving performance by orders of magnitude.
Robustness, not calibration
A common procedure today is to create a model, collect some data, find a collection of parameter values (elasticity of labor supply, intertemporal elasticity of substitution, risk aversion, ...) that match the data, and then use that calibrated model to evaluate tax policies. When I took econometrics over 40 years ago, we were taught to compute standard errors to measure the precision of our estimates. I also learned about Bayesian methods, where data is used to update beliefs about parameter values. These methods imply that one should not assume that point estimates are adequate for tax policy studies. Instead, we should accept the fact that our information about the critical parameters is imprecise, and recognize that any policy analysis should examine how sensitive its results are to empirically plausible alternatives. All my papers on policy questions take this approach and focus on results that are robust to the range of empirical estimates of the critical parameters.
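The approach can be made concrete with a small sketch. The deadweight loss formula below is a stylized Harberger-style approximation, and the elasticity range is invented for illustration, not taken from the literature; the point is that the policy is evaluated across the whole plausible range rather than at one calibrated value:

```python
# Robustness sketch: evaluate a policy over the empirically plausible
# range of a key parameter and report the range of conclusions, rather
# than a single number from one calibrated point. Formula and parameter
# range are stylized, for illustration only.

def deadweight_loss(tax_rate, labor_elasticity):
    """Harberger-style approximation: loss proportional to the
    elasticity times the square of the tax rate (stylized)."""
    return 0.5 * labor_elasticity * tax_rate ** 2

tax_rate = 0.3
# Hypothetical plausible range for the labor supply elasticity
elasticities = [0.1 + 0.05 * i for i in range(11)]  # 0.10 .. 0.60

losses = [deadweight_loss(tax_rate, e) for e in elasticities]
print(f"DWL range: {min(losses):.4f} to {max(losses):.4f}")
```

Reporting the interval rather than its midpoint is the point of the exercise: a policy conclusion that holds across the entire interval is robust, while one that reverses somewhere inside it is not.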
Policy makers often dismiss these ideas. According to a famous story, when his economic advisors gave him a range of values for a question he asked, LBJ responded, "Ranges are for cattle. Give me a number." This is also a common attitude among academic economists. The recent book Radical Uncertainty by Kay and King argues forcefully against this use of economic models.
Computational engineers face the same issues. General concerns about the reliability of computational models at US nuclear weapons labs have led to a set of methods called Verification, Validation, and Uncertainty Quantification (VVUQ). Uncertainty quantification methods address important sensitivity issues in computational engineering and can surely be used in economics.
As usual, Alvin Rabushka’s description of this proposal is perfect:
“What you propose, when fully articulated, provides a platform for assessing different tax proposals as they enter the political realm for discussion, debate, and analysis.”