February 6, 2023

Falling wages and discretionary pay-setting practices

A decline in the use of standardized pay rates contributed to stagnating wages from the 1970s to the mid-1990s.


In the 1970s, hourly compensation for blue-collar workers began to flatline, in stark contrast to rising incomes at the top. To the surprise of economists, however, productivity continued to grow. 

Numerous explanations for this divergence have been put forward, such as a fall in the real minimum wage, declining union power, and technological change, but economists are still filling in the details.

One part of the story is a shift in the system used to determine worker compensation, according to a paper in the American Economic Journal: Applied Economics.

Authors Maxim Massenkoff and Nathan Wilmers show that employers moved from standardized pay rates to more flexible pay-setting practices in the 1970s and 1980s, which coincided with stagnating wages. Using new microdata, they found that when employers abandoned standardized pay rates, wages fell, particularly for the lowest-paid workers in a job.


Digging through the National Archives, the authors found a unique dataset known as the Wage Fixing Authority Survey (WFAS), which has been gathered annually since 1974 from establishments in 130 local labor markets across the United States. The survey asks employers to report pay levels and pay-setting practices for a number of key blue-collar occupations and provides the only establishment-by-occupation-level microdata for multiple US industries going back to the onset of wage stagnation in the 1970s.

“From most datasets, you know something about how much workers are getting paid, but you know very little about what's actually happening inside these workplaces to determine what that wage number is,” Wilmers told the AEA in an interview. “We were super excited to see questions about the basis on which employers were setting pay in this survey data.”

Historical research suggests that before the 1910s, wages were largely set by factory foremen through a flexible, informal process. Starting in the 1910s and continuing into the 1970s, employers sought to reduce worker turnover by standardizing pay-rate schedules—a practice unions also encouraged as a way to increase fairness and decrease competition among workers. These standardized rates determined wages largely by job title and seniority. But after the economic shocks of the 1970s, pay setting became more individualized; companies began returning discretion to their managers by strengthening merit-raise and pay-for-performance programs.

The WFAS dataset allowed the authors to take a detailed look at this broad trend in payment practices and wages beginning in the 1970s.

The data showed that in 1974, around three quarters of jobs in the WFAS were covered by standardized pay rates. By 1991, only about half were, with the remainder allowing some managerial discretion in pay setting.

 

The decline of standardized pay rates
The chart below shows the decline from 1974 to 1991 in the share of jobs with no variation in pay and in the share with pay determined solely by seniority. It also shows an increase in the share of “flexible” jobs that set pay based on merit or other methods.

Using an event study, the authors linked these changes to a simultaneous fall in wages. They found an immediate and sustained reduction in real wages of around 1 percent. 
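In broad terms, an event study of this kind compares wages in the years before and after a job switches away from standardized pay rates, netting out job-specific and year-specific effects. As a rough illustration (a generic specification under assumed notation, not necessarily the authors' exact model), one might estimate

\[
\ln w_{jt} = \sum_{k \neq -1} \beta_k \, \mathbf{1}\{t - T_j = k\} + \alpha_j + \gamma_t + \varepsilon_{jt},
\]

where \(w_{jt}\) is the real wage in establishment-occupation cell \(j\) in year \(t\), \(T_j\) is the year cell \(j\) drops standardized rates, \(\alpha_j\) and \(\gamma_t\) are cell and year fixed effects, and the coefficients \(\beta_k\) trace out wage changes relative to the year before the switch.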

The group hardest hit appeared to be nontrades workers, such as janitors and food service workers. They experienced wage declines of around 30 percent from the 1978 peak, while the wages of workers in trades occupations, such as mechanics and electricians, fell only 10 percent. Depending on the modeling assumptions chosen, the shift in pay-setting practices accounted for as little as 1 percent or as much as 20 percent of the real wage decline for nontrades blue-collar workers during this period.
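To put those shares in perspective (a back-of-the-envelope reading, assuming the roughly 30 percent decline is the relevant baseline), the bounds imply

\[
0.01 \times 30\% \approx 0.3 \text{ percentage points} \quad\text{and}\quad 0.20 \times 30\% \approx 6 \text{ percentage points}
\]

of the nontrades wage decline attributable to the shift in pay-setting practices.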

The researchers argued that more flexible pay-setting practices made it easier for employers to skip annual raises, issue smaller pay increases, and pay less to new workers. These pay-policy changes were likely one channel through which workers' declining bargaining power translated into lower wages, but many other factors may have contributed to this shift.

“Inflation could be a big part of the story too, because these changes were coming off of the heels of a lot of inflation in the early eighties,” Massenkoff said. “It seems like this inflation episode might have really scared firms away from getting locked into much higher salaries.” 

Even technological advances, such as new human resources monitoring tools that made performance assessment easier and more reliable, could have played a role.

Overall, standardized pay rates likely acted only as a bulwark against larger forces pushing wages down. But their erosion is still crucial to understanding the divergence between wage and productivity growth.

“These decisions about human resources and how managers are setting pay really matter for workers,” Wilmers said. “Often when we study sources of labor market inequality, we really just focus on the big macro changes and we don't think as much about how those big macro societal trends play out inside workplaces and impact workers.”

“Wage Stagnation and the Decline of Standardized Pay Rates, 1974–1991” appears in the January 2023 issue of the American Economic Journal: Applied Economics.