Squeezed middle class, “statistically significant,” and the printing press
Smorgasbord
The OECD published Under Pressure: The Squeezed Middle Class:
On average across OECD countries, the share of people in middle-income households, defined as households earning between 75% and 200% of the median national income, fell from 64% to 61% between the mid-1980s and mid-2010s. The economic influence of the middle class and its role as ‘centre of economic gravity’ has also weakened. The aggregate income of all middle-income households was four times the aggregate income of high-income households three decades ago; today, this ratio is less than three. . . . More than one-in-five middle-income households spend more than they earn. Over-indebtedness is higher for middle-income than for both low- and high-income households. . . . Middle-class lifestyle is typically associated with certain goods and services and certain living conditions, such as decent housing, good education and good and accessible health services. However, the prices of core consumption goods and services such as health, education and housing have risen well above inflation, while middle incomes have been lagging behind. In particular, ageing and new medical technologies have driven up the cost of health services; the race for diplomas is pressing parents to invest more and more in education while, at the same time, education services became more costly in a number of countries; the geographical polarisation of jobs is pushing up housing prices in large urban areas, precisely where most rewarding jobs are available.
OECD (2019)
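The OECD definition above is mechanical enough to compute directly. As a minimal illustration, here is how a middle-income share could be tallied under the 75-to-200-percent-of-median rule, using made-up household incomes (the OECD's actual figures are based on equivalized disposable income, which this toy version skips):

```python
import statistics

def income_class_shares(incomes, low=0.75, high=2.0):
    """Share of households below, within, and above the middle-income
    band, defined as 75% to 200% of the median income (OECD definition)."""
    median = statistics.median(incomes)
    n = len(incomes)
    middle = sum(1 for y in incomes if low * median <= y <= high * median)
    below = sum(1 for y in incomes if y < low * median)
    above = sum(1 for y in incomes if y > high * median)
    return {"low": below / n, "middle": middle / n, "high": above / n}

# Hypothetical incomes in thousands, not OECD data:
print(income_class_shares([18, 25, 32, 40, 48, 55, 70, 95, 130, 210]))
# -> {'low': 0.3, 'middle': 0.5, 'high': 0.2}
```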
¤
Steven A. Altman, Pankaj Ghemawat, and Phillip Bastian have written the “DHL Global Connectedness Index 2018: The State of Globalization in a Fragile World”:
Surprisingly, one commonality between globalization’s supporters and its critics is that both tend to believe the world is already far more globalized than it really is. . . . The world is both more globalized than ever before and less globalized than most people perceive it to be. The intriguing possibility embodied in that conclusion is that companies and countries have far larger opportunities to benefit from global connectedness and more tools to manage its challenges than many decision-makers recognize.
Altman, et al. (2019)
¤
The National Academies of Sciences, Engineering, and Medicine have published “A Roadmap to Reducing Child Poverty,” edited by Greg Duncan and Suzanne Le Menestrel:
[M]any studies show significant associations between poverty and poor child outcomes, such as harmful childhood experiences, including maltreatment, material hardship, impaired physical health, low birthweight, structural changes in brain development, and mental health problems. Studies also show significant associations between child poverty and lower educational attainment, difficulty obtaining steady, well-paying employment in adulthood, and a greater likelihood of risky behaviors, delinquency, and criminal behavior in adolescence and adulthood. Because these correlations do not in themselves prove that low income is the active ingredient producing worse outcomes for children, the committee focused its attention on the literature addressing the causal impacts of childhood poverty on children. The committee concludes from this review that the weight of the causal evidence does indeed indicate that income poverty itself causes negative child outcomes, especially when poverty occurs in early childhood or persists throughout a large portion of childhood.
Duncan and Le Menestrel (2019)
¤
Kevin L. Kliesen, Brian Levine, and Christopher J. Waller discuss “Gauging Market Responses to Monetary Policy Communication.” The authors point out that a century ago, an unofficial motto attributed to the Bank of England was “Never explain, never apologize.” From 1967 to 1992, the main method of communication for the Federal Open Market Committee (FOMC) was to release a public statement 90 days after its meetings—not right after meetings. In contrast,
[T]he modern model of central bank communication suggests that central bankers prefer to err on the side of saying too much rather than too little. The reason is that most central bankers believe that clear and concise communication of monetary policy helps achieve their goals. . . . We find that Fed communication is associated with changes in prices of financial market instruments such as Treasury securities and equity prices. However, this effect varies by type of communication, by type of instrument, and by who is doing the speaking. . . . Perhaps not surprisingly, we find that the largest financial market reactions tend to be associated with communication by Fed Chairs rather than by other Fed governors and Reserve Bank presidents and with FOMC meeting statements rather than FOMC minutes.
Federal Reserve Bank of St. Louis (2019)
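At heart, the finding rests on event-study comparisons: do asset prices move more around communication events than on ordinary days? A toy sketch of that comparison, with a hypothetical series of daily 10-year Treasury yields and a single FOMC statement day (the paper's data, event windows, and instruments are far richer):

```python
import datetime as dt

# Hypothetical daily 10-year Treasury yields (percent); not the paper's data.
daily_yield = {
    dt.date(2019, 1, 28): 2.74,
    dt.date(2019, 1, 29): 2.71,
    dt.date(2019, 1, 30): 2.63,  # hypothetical FOMC statement day
    dt.date(2019, 1, 31): 2.64,
    dt.date(2019, 2, 1): 2.69,
}
statement_days = {dt.date(2019, 1, 30)}

days = sorted(daily_yield)
changes = {d2: abs(daily_yield[d2] - daily_yield[d1])
           for d1, d2 in zip(days, days[1:])}

on_event = [c for d, c in changes.items() if d in statement_days]
off_event = [c for d, c in changes.items() if d not in statement_days]
print(f"mean |change| on statement days: {sum(on_event) / len(on_event):.3f}")
print(f"mean |change| on other days:     {sum(off_event) / len(off_event):.3f}")
```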
¤
Andreas Schrimpf and Vladyslav Sushko present “Beyond LIBOR: A Primer on the New Benchmark Rates”:
As of mid-2018, about $400 trillion worth of financial contracts referenced London interbank offered rates (LIBORs) in one of the major currencies. . . . A major impetus for reform comes from the need to strengthen market integrity following cases of misconduct involving banks’ LIBOR submissions. To protect them against manipulation, the new (or reformed) benchmark rates would ideally be grounded in actual transactions and liquid markets rather than be derived from a poll of selected banks. . . . The reform process constitutes a major intervention for both industry and regulators, as it is akin to surgery on the pumping heart of the financial system. . . . The new risk-free rates (RFRs) provide for robust and credible overnight reference rates, well suited for many purposes and market needs. In the future, cash and derivatives markets are expected to migrate to the RFRs as the main set of benchmarks. . . . It is possible that, ultimately, a number of different benchmark formats will coexist, fulfilling a variety of purposes and market needs. The jury is still out on whether any resulting market segmentation would lead to material inefficiencies or could even be optimal under the new normal.
Schrimpf and Sushko (2019)
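The contrast between polled and transaction-based benchmarks is concrete. SOFR, the new overnight RFR for the US dollar, is published as a volume-weighted median of repo transactions. A minimal sketch of that calculation on hypothetical trades (the official methodology includes more detail about the eligible transaction set):

```python
def volume_weighted_median(transactions):
    """Lowest rate at which at least half of total volume has traded.

    `transactions` is a list of (rate, volume) pairs, mimicking the
    transaction-grounded benchmarks (such as SOFR) described above.
    """
    total = sum(volume for _, volume in transactions)
    cumulative = 0.0
    for rate, volume in sorted(transactions):
        cumulative += volume
        if cumulative >= total / 2:
            return rate

# Hypothetical overnight repo trades: (rate in percent, volume in $ billions)
trades = [(2.38, 120), (2.40, 300), (2.41, 250), (2.45, 80), (2.55, 20)]
print(volume_weighted_median(trades))  # -> 2.4
```

Unlike a poll of selected banks, a single submitter cannot move this number without transacting real volume at an off-market rate.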
Collections of Essays
Meredith A. Crowley has edited Trade War: The Clash of Economic Systems Threatening Global Prosperity, a readable e-book of 11 essays. From Crowley’s introduction:
A trade war of unprecedented scope and magnitude currently engulfs the world’s two largest economies—the US and China. . . . Multiple factors—the unprecedented economic growth of an economy operating outside the traditional Western capitalist model; new structures of production with supply chains spanning the globe; geographically concentrated job losses within the US; and a multilateral trading system that has stagnated and failed to keep pace with changes in the world economy—have all contributed to the current mess. The current problems extend well beyond the highly visible US–China conflict to the wider community of countries struggling with the interface between Chinese state capitalism and their own capitalist systems, the failure of the WTO to make progress with multilateral negotiations over almost anything, and a dispute resolution system that has veered off track. From our current vantage point, the prospects for the future of the multilateral trading system look grim. . . . Yet, in the middle of ongoing negotiations to resolve the US–China conflict, it is important to remember that the open, liberal multilateral trading system has delivered enormous benefits in its 75-year history—Ralph Ossa estimates the gains from trade amount to one-quarter of world income.
Crowley (2019)
¤
The American Statistical Association has devoted a special supplemental issue of its journal The American Statistician to the theme “Statistical Inference in the 21st Century: A World Beyond p < 0.05.” Ronald L. Wasserstein, Allen L. Schirm, and Nicole A. Lazar contribute a useful overview essay, “Moving to a World beyond ‘p < 0.05’”:
We conclude, based on our review of the articles in this special issue and the broader literature, that it is time to stop using the term ‘statistically significant’ entirely. Nor should variants such as ‘significantly different,’ ‘p < 0.05,’ and ‘nonsignificant’ survive, whether expressed in words, by asterisks in a table, or in some other way. Regardless of whether it was ever useful, a declaration of ‘statistical significance’ has today become meaningless. . . . In sum, ‘statistically significant’—don’t say it and don’t use it.
The American Statistical Association (2019)
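One way to see the authors' complaint: the 0.05 line turns nearly identical evidence into opposite verdicts. A small sketch with hypothetical estimates, reporting effect sizes, intervals, and p-values rather than a significance label:

```python
from math import erf, sqrt

def two_sided_p(estimate, std_error):
    """Two-sided p-value for a normal test of 'true effect = 0'."""
    z = abs(estimate / std_error)
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

# Two hypothetical studies with essentially the same evidence:
for name, est, se in [("Study A", 2.0, 1.01), ("Study B", 2.0, 1.03)]:
    low, high = est - 1.96 * se, est + 1.96 * se
    print(f"{name}: estimate={est}, 95% CI=({low:.2f}, {high:.2f}), "
          f"p={two_sided_p(est, se):.3f}")
# Study A: p ~ 0.048; Study B: p ~ 0.052. A bright-line rule calls one
# "significant" and the other not, though the evidence barely differs.
```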
¤
Heather Boushey, Ryan Nunn, and Jay Shambaugh have edited a collection of eight essays on the subject “Recession Ready: Fiscal Policies to Stabilize the American Economy.” From their introduction:
[I]ncreasing the automatic nature of fiscal policy would be helpful. Increasing spending quickly could lead to a shallower and shorter recession. Using evidence-based automatic ‘triggers’ to alter the course of spending would be a more-effective way to deliver stimulus to the economy than waiting for policymakers to act. Such well-crafted automatic stabilizers are the best way to deliver fiscal stimulus in a timely, targeted, and temporary way. There will likely still be a need for discretionary policy; but by automating certain parts of the response, the United States can improve its macroeconomic outcomes.
Boushey, et al. (2019)
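A concrete example of such a trigger appears in the volume: Claudia Sahm's chapter proposes starting direct stimulus payments when the three-month average of the national unemployment rate rises 0.50 percentage points or more above its low over the previous twelve months. A minimal sketch of that rule on a hypothetical unemployment series:

```python
def three_month_avg(rates):
    """Trailing 3-month averages of a monthly series (oldest first)."""
    return [sum(rates[i - 2:i + 1]) / 3 for i in range(2, len(rates))]

def recession_trigger(rates, threshold=0.50):
    """True in months where the 3-month average unemployment rate sits
    at least `threshold` points above its low over the prior 12 months."""
    avg = three_month_avg(rates)
    return [avg[i] - min(avg[i - 12:i]) >= threshold
            for i in range(12, len(avg))]

# Hypothetical monthly unemployment rates (percent), oldest first:
rates = [3.6] * 14 + [3.7, 3.9, 4.2, 4.6]
print(recession_trigger(rates))  # -> [False, False, False, True]
```

Because the rule keys off a published statistic, the stimulus can switch on within months of a downturn rather than waiting for new legislation.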
¤
The annual Conference on Research in Income and Wealth focuses on improved measurement of economic statistics. Katharine G. Abraham, Ron S. Jarmin, Brian Moyer, and Matthew D. Shapiro organized this year’s conference, held March 15–16, 2019, in Bethesda, Maryland, on the theme of “Big Data for 21st Century Economic Statistics.” Sixteen of the papers (and their presentation slides) are available at the website of the conference organizer, the National Bureau of Economic Research. A selection of some titles gives a flavor of the proceedings:
- “Re-engineering Key National Economic Indicators,” by Gabriel Ehrlich, John C. Haltiwanger, Ron S. Jarmin, David Johnson, and Matthew D. Shapiro
- “Nowcasting the Local Economy: Using Yelp Data to Measure Economic Activity,” by Edward L. Glaeser, Hyunjin Kim, and Michael Luca
- “Transforming Naturally Occurring Text Data into Economic Statistics: The Case of Online Job Vacancy Postings,” by David Copple, Bradley J. Speigner, and Arthur Turrell
- “From Transactions Data to Economic Statistics: Constructing Real-Time, High-Frequency, Geographic Measures of Consumer Spending,” by Aditya Aladangady, Shifrah Aron-Dine, Wendy Dunn, Laura Feiveson, Paul Lengermann, and Claudia R. Sahm
- “Valuing Housing Services in the Era of Big Data: A User Cost Approach Leveraging Zillow Microdata,” by Marina Gindelsky, Jeremy Moulton, and Scott A. Wentland
NBER (2019)
¤
The Harvard Project on Climate Agreements and Harvard’s Solar Geoengineering Research Program have collaborated to publish “Governance of the Deployment of Solar Geoengineering,” an introduction followed by 26 short essays. In “The Implications of Uncertainty and Ignorance for Solar Geoengineering,” Richard J. Zeckhauser and Gernot Wagner write:
Risk, uncertainty, and ignorance are often greeted with the precautionary principle: ‘do not proceed.’ Such inertia helps politicians and bureaucrats avoid blame. However, the future of the planet is too important a consequence to leave to knee-jerk caution and strategic blame avoidance. Rational decision requires the equal weighting of errors of commission and omission. . . . That also implies that the dangers of SG [solar geoengineering]—and they are real—should be weighed objectively and dispassionately on an equal basis against the dangers of an unmitigated climate path for planet Earth. The precautionary principle, however tempting to invoke, makes little sense in this context. It would be akin to suffering chronic kidney disease, and being on the path to renal failure, yet refusing a new treatment that has had short-run success, because it could have long-term serious side effects that tests to date have been unable to discover. Failure to assiduously research geoengineering and, positing no red-light findings, to experiment with it would be to allow rising temperatures to go unchecked, despite great uncertainties about their destinations and dangers. That is hardly a path of caution.
Zeckhauser and Wagner (2019)
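The “equal weighting of errors of commission and omission” is, in effect, an expected-loss comparison across actions. A toy version with purely hypothetical probabilities and losses (none of these numbers come from the essay):

```python
# Purely hypothetical numbers, for illustration only.
# Error of commission: proceed with SG research/deployment and it backfires.
# Error of omission: hold back and warming continues unchecked.
p_backfire, loss_backfire = 0.10, 40.0      # if we proceed
p_unchecked, loss_unchecked = 0.60, 50.0    # if we hold back

expected_loss_proceed = p_backfire * loss_backfire        # 4.0
expected_loss_hold_back = p_unchecked * loss_unchecked    # 30.0

# Weighing both errors symmetrically can favor proceeding; the
# precautionary principle counts only the first kind of error.
print(expected_loss_proceed, expected_loss_hold_back)
```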
Economists Speak
Christopher J. Ruhm delivered the “Presidential Address: Shackling the Identification Police?” to the Southern Economic Association:
To summarize, clean identification strategies will frequently be extremely useful for examining the partial equilibrium effects of specific policies or outcomes—such as the effects of reducing class sizes from 30 to 20 students or the consequences of extreme deprivation in-utero—but will often be less successful at examining the big ‘what if’ questions related to root causes or effects of major changes in institutions or policies. . . . Have the identification police become too powerful? The answer to this question is subjective and open to debate. However, I believe that it is becoming increasingly difficult to publish research on significant questions that lack sufficiently clean identification and, conversely, that research using quasi-experimental and (particularly) experimental strategies yielding high confidence but on questions of limited importance is more often being published. In talking with PhD students, I hear about training that emphasizes the search for discontinuities and policy variations, rather than seeking to answer questions of fundamental importance. At professional presentations, experienced economists sometimes mention ‘correlational’ or ‘reduced-form’ approaches with disdain, suggesting that such research has nothing to add to the canon of applied economics.
Ruhm (2019)
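For readers outside applied economics, the “clean identification strategies” Ruhm has in mind are designs like regression discontinuity, which compare observations just on either side of a known policy cutoff. A minimal sketch on simulated data with a built-in true effect (illustrative only, not from the address):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: a policy applies only when `running` crosses zero
# (e.g., distance from an eligibility cutoff). True effect is 2.0.
n = 2000
running = rng.uniform(-1, 1, n)
treated = (running >= 0).astype(float)
outcome = 1.0 + 0.5 * running + 2.0 * treated + rng.normal(0, 1, n)

# Compare mean outcomes just above vs. just below the cutoff:
bandwidth = 0.1
near = np.abs(running) < bandwidth
jump = (outcome[near & (treated == 1)].mean()
        - outcome[near & (treated == 0)].mean())
print(f"estimated jump at cutoff: {jump:.2f} (true effect: 2.0)")
```

The design recovers the local effect cleanly, which is exactly Ruhm's point: the answer is credible, but only for units near the cutoff, not for the big “what if” questions.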
¤
David A. Price interviews R. Preston McAfee:
First, let’s be clear about what Facebook and Google monopolize: digital advertising. The accurate phrase is ‘exercise market power,’ rather than monopolize, but life is short. Both companies give away their consumer product; the product they sell is advertising. While digital advertising is probably a market for antitrust purposes, it is not in the top 10 social issues we face and possibly not in the top thousand. Indeed, insofar as advertising is bad for consumers, monopolization, by increasing the price of advertising, does a social good. . . . That leaves . . . two places where I think we have a serious tech antitrust problem. . . . My concern is that phones, on which we are incredibly dependent, are dominated by two firms that don’t compete very strongly. While Android is clearly much more open than Apple, and has competing handset suppliers, consumers face switching costs that render them effectively monopolized. . . . The second place I’m worried about significant monopolization is Internet service. In many places, broadband service is effectively monopolized. . . . I’m worried about that because I think broadband is a utility. You can’t be an informed voter, you can’t shop online, and you probably can’t get through high school without decent Internet service today. So that’s become a utility in the same way that electricity was in the 1950s. Our response to electricity was we either did municipal electricity or we did regulation of private provision. Either one of those works. That’s what we need to do for broadband.
Federal Reserve Bank of Richmond (2018)
Discussion Starters
Peter Cappelli thinks “Your Approach to Hiring Is All Wrong”:
Businesses have never done as much hiring as they do today. They’ve never spent as much money doing it. And they’ve never done a worse job of it. . . . The recruiting and hiring function has been eviscerated. Many U.S. companies—about 40%, according to research by Korn Ferry—have outsourced much if not all of the hiring process to ‘recruitment process outsourcers,’ which in turn often use subcontractors, typically in India and the Philippines. . . . Survey after survey finds employers complaining about how difficult hiring is. . . . But clearly they are hiring much more than at any other time in modern history, for two reasons. The first is that openings are now filled more often by hiring from the outside than by promoting from within. In the era of lifetime employment, from the end of World War II through the 1970s, corporations filled roughly 90% of their vacancies through promotions and lateral assignments. Today the figure is a third or less. When they hire from outside, organizations don’t have to pay to train and develop their employees. . . . The second reason hiring is so difficult is that retention has become tough: Companies hire from their competitors and vice versa, so they have to keep replacing people who leave. Census and Bureau of Labor Statistics data shows that 95% of hiring is done to fill existing positions. Most of those vacancies are caused by voluntary turnover. . . . The root cause of most hiring, therefore, is drastically poor retention.
Harvard Business Review (2019)
¤
Jeremiah Dittmar and Skipper Seabold explain how “Gutenberg’s Moving Type Propelled Europe towards the Scientific Revolution”:
Printing was not only a new technology: it also introduced new forms of competition into European society. Most directly, printing was one of the first industries in which production was organised by for-profit capitalist firms. These firms incurred large fixed costs and competed in highly concentrated local markets. Equally fundamentally—and reflecting this industrial organisation—printing transformed competition in the ‘market for ideas.’ Famously, printing was at the heart of the Protestant Reformation, which breached the religious monopoly of the Catholic Church. But printing’s influence on competition among ideas and producers of ideas also propelled Europe towards the scientific revolution. . . . Following the introduction of printing, book prices fell steadily. The raw price of books fell by 2.4 per cent a year for over a hundred years after Gutenberg. Taking account of differences in content and the physical characteristics of books, such as formatting, illustrations and the use of multiple ink colours, prices fell by 1.7 per cent a year. . . . Printing provided a new channel for the diffusion of knowledge about business practices. The first mathematics texts printed in Europe were ‘commercial arithmetics,’ which provided instruction for merchants. With printing, a business education literature emerged that lowered the costs of knowledge for merchants. The key innovations involved applied mathematics, accounting techniques and cashless payments systems.
LSE Business Review (2019)
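Taken at face value, those annual rates compound into a dramatic cumulative decline over the century:

\[
(1 - 0.024)^{100} \approx 0.09, \qquad (1 - 0.017)^{100} \approx 0.18,
\]

that is, raw book prices fell to roughly a tenth of their pre-Gutenberg level, and quality-adjusted prices to under a fifth.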
¤
Michael Manville offers a discussable counterfactual in “Longer View: The Fairness of Congestion Pricing: The Choice Between Congestion Pricing Fairness and Efficiency Is a False One”:
Suppose we had a world where all freeways were priced, and where we used the revenue to ease pricing’s burden on the poor. Now suppose someone wanted to change this state of affairs, and make all roads free. Would we consider this proposal fair? The poorest people, who don’t drive, would gain nothing. The poor who drive would save some money, but affluent drivers would save more. Congestion would increase, and so would pollution. The pollution would disproportionately burden low-income people. With priced roads, poor drivers were protected by payments from the toll revenue. With pricing gone, the revenue would disappear as well, and so would compensation for people who suffered congestion’s costs. This proposal, in short, would reduce both efficiency and equity. It would harm the vulnerable, reward the affluent, damage the environment, and make a functioning public service faulty and unreliable. . . . We have so normalized the current condition of our transportation system that we unthinkingly consider it fair and functional. It is neither. Our system is an embarrassment to efficiency and an affront to equity.
Transfers (2019)