In Angus Deaton’s view, “theory is a complement to measurement, and generalizable insights arise only when the underlying economic mechanisms are elucidated and tested.” This appears in an article in today’s New York Times
(http://mobile.nytimes.com/2015/10/13/upshot/why-angus-deaton-deserved-the-economics-nobel-prize.html).
But that’s hardly even half the story. Beyond this, theory and data should together be embodied in portable instruments that measure in a common language and are distributed throughout interconnected stakeholder networks. Then economics ceases to be merely the study of economic behaviors and phenomena and becomes a living instantiation and ontological management of those behaviors and phenomena. For instance, theory-informed data on the economics of education may support robustly generalizable policies, but it can do nothing to support the formation of common currencies for the exchange of literacy capital. For that, the rules, roles and responsibilities of market institutions must be structured with instruments facilitating the exchange of intangible assets.
That is, to inform teachers’ instructional decisions for individual students as well as their pedagogical methods, and so to make teachers into effective cultivators of literacy and other forms of living capital, the theory-data-instrument package has to be systematically incorporated into the curriculum, assessment, and teachers’ continuing education and professional development. A start in this direction can be found in, for instance, the uses made of the Lexile Framework for Reading by various educational publishers and assessment agencies.
Deaton’s “method of careful analysis of data from household surveys has transformed four large swaths of the dismal science: microeconomics, econometrics, macroeconomics and development economics.”
“This focus on empirics has been a boon for the field of econometrics, which is the application of statistical methods to economic problems. Mr. Deaton’s signature achievement in this area has been in forcing empirical researchers to pay closer attention to questions of measurement.”
“As the Nobel committee put it, Mr. Deaton’s ‘work covers a wide spectrum, from the deepest implications of theory to the grittiest detail of measurement.’”
Here we see a common but shortsighted reflex equating statistical data analysis with measurement. Statistical analyses usually do nothing to investigate, establish or deploy the three key features of measurement: a) the existence of a meaningful, invariant unit for expressing theoretically explained and empirically reproducible comparisons, b) the calibration, universal distribution and maintenance of instruments measuring in that unit, and c) the systematic incorporation of that unit in research and practice as the legally required expression of quantities exchanged in accord with financial, accounting, and regulatory standards.
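The invariance claimed in point (a) has a concrete formal expression in the Rasch model, the measurement model underlying instruments such as the Lexile Framework mentioned above. A minimal sketch (the function and variable names here are illustrative, not drawn from any particular software package): because the model’s log-odds are simply ability minus difficulty, the comparison of two persons comes out the same no matter which item is used.

```python
# Illustrative sketch only: names and numbers are hypothetical, chosen to
# show the Rasch model's item-free comparisons, i.e. an invariant unit.
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """Probability of success given ability and item difficulty, in logits."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def log_odds(p: float) -> float:
    """Log-odds (logit) of a probability."""
    return math.log(p / (1.0 - p))

# Two readers located on a common logit scale (hypothetical values).
reader_a, reader_b = 1.5, 0.2

# However easy or hard the item, the log-odds comparison of the two
# readers recovers the same difference: a reproducible, item-free
# comparison expressed in a common unit.
for item_difficulty in (-2.0, 0.0, 3.0):
    diff = (log_odds(rasch_probability(reader_a, item_difficulty))
            - log_odds(rasch_probability(reader_b, item_difficulty)))
    print(round(diff, 6))  # 1.3 every time, regardless of the item
```

This separability of person and item parameters is exactly what ordinary sample-dependent statistics lack, and it is what makes calibration and distribution of instruments in a maintained common unit (points b and c) possible at all.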
Had Deaton actually covered the full econometric spectrum the way the Nobel committee said, “from the deepest implications of theory to the grittiest detail of measurement,” his Nobel prize would celebrate the accomplishment of a whole new science of economics. That new science would not be, as today’s economics remains, primarily focused on centralized statistical analyses of data incorporating uncontrolled, unexamined, instrument- and sample-dependent variations in unit size. Hayek’s critique of socialism’s “fatal conceit” of central planning suggests that a more thoroughly capitalist economics would instead seek to form efficient markets for low-friction exchanges of high-quality information owned and controlled by individuals.
Most readers’ first reaction to the preceding statements will be that these expectations are patently unrealistic and impossible for measurement in the economics of a wide range of sectors, from education to health care to social services to environmental resource management. But that reaction is based in ignorance of the decades of research and practice that have already put measurement in the hands of teachers, clinicians and others on the front lines of these fields.
It is encouraging that “Deaton has turned his attention to measures of subjective well-being, including happiness,” and that he has “highlighted the problems in constructing coherent measures of global poverty.” But again, examination of his work shows no engagement with the three key features of measurement, despite their well-established viability and value, and despite the large and readily available literature on the relevant theories, data, instruments, models, methods, software, and studies.
How long will we continue rewarding and celebrating only ideas that fit our preconceptions concerning what’s possible? What will it take for truly new ideas to break out of our cultural blind spots and start informing explorations of alternative methods and possibilities? When will we start systematically testing more of our assumptions about what’s possible, instead of blithely chaining ourselves to perspectives that prevent us from fulfilling our dreams of a fairer, more equitable world?
Deaton deserved a Nobel prize, as others have said, because the prize rewards particular ways of satisfying unspoken norms more than it encourages truly breakout disruptions of our expectations.