By Atanu Biswas
Recently, I got interested in a quartet of detective novels jointly written by two American economics professors between 1978 and 2016 under the pseudonym Marshall Jevons. Henry Spearman, their amateur detective, is a Harvard professor and a Nobel laureate economist. In the novels, Spearman, clearly modelled after legendary economist Milton Friedman, uses microeconomic frameworks to solve the mysteries by assuming that humans act rationally and respond predictably to incentives.
Can an economist (or statistician) who works with data be likened to a detective in her pursuit of the truth hidden therein? I turned to Goldin's 1998 essay to learn how she relates to the role of a "detective." In fact, her Nobel-winning research involved collecting and processing tonnes of data in an effort to discover the "truth" about the female labour force from 1790 to the present. "I heard the clarion call for the economist detective," she said.
The difference between the actual "economist detective" and the fictional Henry Spearman is clear. While Spearman applies conventional theory to unravel his mysteries, Goldin builds theory from data insight, a far more challenging task.
In her 1998 essay, Goldin wrote: "I recall the precise moment when I switched my attention to the evolution of the female labor force. I had few ideas about the location of evidence; fewer still on what the evolution was. But I knew it would be a story of importance, relevant to the current period, and a project for which my detective work would pay off. I also knew that I was the one to do it." But does an "economist detective" simply adopt Joe Friday's style, which was to always seek the "facts," or Sherlock Holmes', which was to theorise from the "facts"? Actually, no. "They may have been great detectives, but they would have made lousy economists," Goldin observed. An "economist detective" requires math and statistics, and she should have both theory and evidence to support her findings. "Go back and forth among theory, empirics, and stories until you iterate on the very best truth you can tell," Goldin advises.
Since Adam Smith's time, economic theories have not always been developed by sifting through vast amounts of data. Earlier, there was a dearth of data as well. However, some outstanding contemporary economists might be regarded as "detectives" in Goldin's sense. Amartya Sen used data on famines in Bengal, Ethiopia, the Sahel, and Bangladesh to conclude that, in many cases, social and economic factors such as declining wages, unemployment, rising food prices, and poor food distribution, rather than a shortage of food supplies, caused starvation among certain groups in society. Milton Friedman, who won the 1976 Nobel Prize, was unquestionably an expert in empirical modelling. The 2015 economics Nobel laureate Angus Deaton measured poverty using tonnes of data, including India's NSSO data. Data insights were frequently a source of inspiration for 2002 economics Nobel laureate Daniel Kahneman during his research career. Robert Lucas Jr. (1995 Nobel laureate in economics), Daniel McFadden (2000 Nobel laureate in economics), Richard H. Thaler (2017 Nobel prize-winner in economics), and Joshua Angrist (2021 economics Nobel-winner), to name a few, are other renowned "economist detectives" of our time. In fact, labour economists and behavioural economists are more inclined to base their theories on empirical insights.
What about statisticians? Incidentally, the American Statistical Association defines "statistics" as the "science of learning from data." A statistician should, therefore, be a data-detective by definition. In fact, data insights were instrumental to many early findings in statistics. Francis Galton developed the crucial ideas of regression and the correlation coefficient in the late nineteenth century while analysing data on more than 10,000 individuals. Under the alias "Student," William Gosset used data from the Guinness brewery to introduce the Student's t-distribution in the early 1900s. A thorough examination of anthropometric data helped PC Mahalanobis develop his renowned D-squared statistic. There are many more such examples. Legendary statisticians like Karl Pearson and Ronald Fisher also used data insights to develop their ideas. Well, were they "statistician detectives"?
Statistics, the subject, has undergone significant change over time. The discipline of data science has emerged within the last few decades, ostensibly to comprehend the pulse of data. The demand for data scientists has undoubtedly been boosted by the ever-expanding data universe and the digital and AI revolutions. However, data science frequently appears to be little more than a brute-force tool for slicing and dicing vast amounts of data with already available statistical and computational techniques.
What about science in general? We know that scientists like Charles Darwin and Gregor Mendel meticulously collected data and developed their theories and laws from empirical evidence. So, historically, science has been data-driven whenever a "scientist detective" has used relevant scientific expertise to extract knowledge from data. Overall, there are some Sherlock Holmeses in the fields of statistics, economics, and science in general. Claudia Goldin has helped us comprehend that.
Atanu Biswas, professor of statistics, Indian Statistical Institute, Kolkata. Views are personal.