Revisiting the Minimum Wage and a questionable “seminal study”

Yes, readers, I have gone back to school and am studying economics.  And I’m killing two birds with one stone by writing my commentary on a class-assigned paper in blog format.

The paper in question is, in fact, the 1994 paper which shifted economists’ thinking on the minimum wage, because of its claim that minimum wage boosts had no ill effects on employment and were, basically, “free money.”

Here’s how Vox characterized it:

[F]or years many economists assumed, almost without questioning, that minimum wages destroyed jobs. They might be worthwhile, sure, but you have to weigh the harm they do to the demand for labor against their benefits for workers who remain employed.

In a paper first published by the National Bureau of Economic Research in 1993, Krueger and his co-author Card exploded that conventional wisdom. They sought to evaluate the effects of an increase in New Jersey’s minimum wage, from $4.25 to $5.05 an hour, that took effect on April 1, 1992. (At 2019 prices, that’s equivalent to a hike from $7.70 to $9.15.)

Card and Krueger surveyed more than 400 fast-food restaurants in New Jersey and eastern Pennsylvania to see if employment growth was slower in New Jersey following the minimum wage increase. They found no evidence that it was. “Despite the increase in wages, full-time-equivalent employment increased in New Jersey relative to Pennsylvania,” they concluded. That increase wasn’t statistically significant, but they certainly found no reason to think that the minimum wage was hurting job growth in New Jersey relative to Pennsylvania.

Card and Krueger’s was not the first paper to estimate the empirical effects of the minimum wage. But its compelling methodology, and the fact that it came from two highly respected professors at Princeton, forced orthodox economists to take the conclusion seriously.

And with that in mind, join me as I dig through the meat of the study:  “Minimum Wages and Employment: A Case Study of the Fast-Food Industry in New Jersey and Pennsylvania.”  (This is not actually the “class assignment”; I will need to distill my thoughts even further into 250 – 300 words, which will be harder!)

The core concept of the study was this:  generally speaking, it’s hard to measure the effects of a change in the minimum wage, because there’s so much else happening at the same time.  For example, the latest change in the US federal minimum wage occurred at the same time as the “Great Recession.”  But in 1992, New Jersey increased its minimum wage to a level above the federal minimum, $5.05 rather than $4.25, and next-door Pennsylvania did not.  The authors believe that looking at changes in employment patterns, wages, costs, etc., at fast-food chains in those two states provides a means of analyzing the impact of the minimum wage hike.

In order to do so, they (or rather their employees) conducted phone surveys of fast-food restaurants in those two states in late February/early March of 1992, just before the minimum wage hike was implemented, and in November-December 1992, after the April 1992 change had had some time for effects to be seen.  They had, all things considered, reasonable response rates to their surveys (72.5% in PA and 91% in NJ, with different numbers of attempts made in the two states) for the first wave, and bolstered their response rate for the second wave with in-person visits as needed.

Their core findings:

In the New Jersey restaurants, the number of employees per store actually increased during this time frame, even as it decreased in Pennsylvania due to the recession at the time.  At the same time, within New Jersey, the number of employees increased among stores which had previously had a starting wage equal to the old minimum, as well as among those with a starting wage above the old minimum but below the new one; but in those stores where starting wages were already at or above the new minimum, employment decreased.

The authors then get mathier.  They perform two regressions, one to estimate the effect on employment of a store being in New Jersey, and another to estimate the effect of a store having previously paid less than the new minimum wage.  This is where my interpretation is a bit marginal, but here goes:

The change in the number of FTE employees per fast-food restaurant in NJ, compared to the change in PA, was 2.51.  The regression model calculates, stripping out other impacts, that New Jersey-ness accounted for 2.3 new employees per store.  The second regression replaces the New Jersey dummy with a “wage gap” variable (the proportional raise each store needed to reach the new minimum) and, controlling for differences among regions within NJ as well as among the different large chains surveyed, produces a coefficient of 11.91 on that gap.  The coefficient looks large only because it is multiplied by the gap itself, which averaged just .11, so the implied effect for the average affected store is on the order of 11.91 × .11 ≈ 1.3 additional employees.
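
The underlying difference-in-differences logic can be sketched in a few lines; the FTE figures below are hypothetical round numbers for illustration, not the paper’s actual survey means:

```python
# Difference-in-differences: the treated group's change minus the control
# group's change.  The FTE figures below are hypothetical round numbers,
# not the paper's actual survey means.

def diff_in_diff(treat_before, treat_after, ctrl_before, ctrl_after):
    """Treatment-group change net of the control-group change."""
    return (treat_after - treat_before) - (ctrl_after - ctrl_before)

nj_before, nj_after = 20.4, 21.0  # hypothetical NJ FTEs per store
pa_before, pa_after = 23.3, 21.2  # hypothetical PA FTEs per store

effect = diff_in_diff(nj_before, nj_after, pa_before, pa_after)
print(round(effect, 2))  # prints 2.7 -- NJ gained relative to PA
```

The point of the design is visible in the sketch: even if NJ employment had fallen, the estimate would still be positive as long as it fell by less than PA’s.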

They also perform additional statistical tests by fine-tuning their calculations, for example, excluding New Jersey shore area stores because of their tourist economy, adjusting the weightings of part-time employees in calculating full-time equivalents, etc.  These produce different employment impacts but still the same conclusion, that the minimum wage increase actually increased employment.

The authors also assess whether the minimum wage hike affected other aspects of the restaurants’ operations.  There was an increase in full-time workers in New Jersey, but no significant difference between the NJ restaurants that had previously paid less and those that had paid more.  There was no statistically-significant change in the restaurants’ opening hours, the free/reduced-price meal benefit, the amount of the first raise, or the time until that first raise was given.

They did find that prices in New Jersey increased by 4%, a slightly greater increase than would be needed to make up for the higher wages (taking into account the wage increase and the proportion of the restaurants’ costs due to labor), but they discard this as a relevant consideration because prices increased at the same rate regardless of whether an individual restaurant was impacted by the wage hike (based on whether its starting wage had been below the new minimum or not).
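
As a rough check on the pass-through arithmetic: the price increase needed to fully offset a wage hike is approximately the labor share of costs times the average wage increase.  The two inputs below are assumed, illustrative round numbers, not figures from the paper:

```python
# Back-of-the-envelope: price increase needed to fully pass through a wage hike.
# Both inputs are assumed, illustrative round numbers (not from the paper).
labor_share = 0.30        # assumed: share of a restaurant's costs that is labor
avg_wage_increase = 0.10  # assumed: average wage rise across all staff, which
                          # is smaller than the 18.8% jump in the minimum
                          # itself, since many workers already earned above $4.25

required_price_increase = labor_share * avg_wage_increase
print(f"{required_price_increase:.1%}")  # prints 3.0%
```

On assumptions in this neighborhood, a 4% observed increase does sit slightly above the full-offset benchmark, which is what the authors report.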

Finally, they assessed whether the wage increase prevented new stores from opening, looking at broader data, and found no statistically-significant evidence.

After presenting their statistical tests, they propose various explanations.  They consider alternate labor-market theories (“monopsonistic and job-search models”), but discard them.  They theorize that employers obliged to pay higher wages may decrease their quality (longer lines, reduced cleanliness) or may shift pricing of some products relative to others, but ultimately conclude with the simple statement that “these findings are difficult to explain.”

So what’s to be made of this?

Their analysis is certainly more useful than one without any “control group,” and it’s the new “one weird trick” of economists to find and exploit what they consider to be “natural experiments” (though I suppose “new” is all relative).  It also has, I think, particular merit in looking at employment at specific businesses, rather than at unemployment rates across a region, so as to drill down to the question of “how do employers manage an increased labor cost?”

But there are plenty of deficiencies:

One common gripe of Krueger’s critics (e.g., at the Foundation for Economic Education) is that the time frame of Krueger’s analysis is simply too narrow.  By late February, employers already knew they would need to offer a much higher minimum wage, and would likely have been taking that into account by avoiding hiring and reducing staff through attrition.  It could even be that the increase in employees was an indicator that they found, on average, that they had been too cautious in the period leading up to the hike.  It also seems unlikely that employers were sitting on some labor-saving innovation ready to implement the moment wages rose; labor-reduction initiatives take time, so the long-term effect of the wage hike would take some time to materialize.  (For example, the free refill was introduced by Taco Bell in 1988 but became commonplace in the ’90s.  Was it merely a coincidence that this marketing tactic spread at roughly the same time as a significant nationwide minimum wage increase, with a phase-in driven by the time and effort needed to remodel locations, or did stores find it advantageous to reduce worker time in this fashion once labor increased in cost?  Other changes, such as the self-service ordering kiosk, required advances in technology that would presumably be motivated by higher labor costs but were not simply “waiting in the wings.”)

It also seems too simplistic to discard the increase in prices just because prices increased at all New Jersey restaurants, including those which had already been paying higher wages.  It would seem fairly reasonable that once the previously-lower-paying restaurants had increased their prices, the rest would follow, or that, if certain franchise owners had a mix of higher- and lower-wage restaurants, they might have raised prices in parallel.  Consequently, this consistent price hike across stores is not the counter-evidence Krueger claims it to be.

In fact, it would seem to merit a closer review, to identify the characteristics of those restaurants previously paying higher wages, especially because they did not boost their wages to remain a “higher wage employer.”  Were these particularly-profitable restaurants?  Restaurants which had difficulty recruiting employees due to locally-tight labor markets?  For instance, restaurants in wealthier suburbs tend to recruit workers from further away and offer higher wages to make the additional travel time worthwhile.  Would they, in the longer term, have difficulty finding workers without boosting that wage differential?

Lastly, they measure the impact of the wage increase on overall work hours by asking whether the opening hours have changed, whether the number of cash registers has changed, and whether there is a change in the number of cash registers typically open at 11:00 AM.  But it seems likely that a key way employers will seek to mitigate the effect of a wage hike is by lowering staffing at slower times of day, either by scheduling employees for fewer hours or by being readier to send employees home.  And they ask whether employees work on a full- or part-time basis but do not actually ask in their survey what the total or average work hours are at each surveyed store.  Perhaps this is a piece of information that they considered too difficult for store managers to provide, and they omitted it to ensure a response to the rest of the survey; but without it, we simply cannot know whether the study’s data captures what it claims to.

Now, I’ve said that this is considered to be a key study that shifted the debate about the minimum wage, and, it turns out, it wasn’t without pushback.  Richard Berman of the Employment Policies Institute criticized the study in a 1996 report, “The Crippling Flaws in the New Jersey Fast Food Study,” and in 2000 Card and Krueger responded with their own criticism of Berman’s criticism, as well as of a further study by economists David Neumark and William Wascher (not available without a paywall).  Card and Krueger find fault with the attempts by Berman and by Neumark and Wascher to re-do the analysis using better or alternate data sources, but do not directly address Berman’s “crippling flaws” (or if they do, it is so brief that I missed it).

What were these flaws?  First, that there were significant numbers of stores with clear data errors, such as implausible shifts in the number of part-time and full-time employees, as well as a failure to specify, in the price-increase portion of the survey, what defines a “regular hamburger” (is it a Big Mac?  A Quarter Pounder?  A dollar-menu basic hamburger?).  EPI researchers went back to many of the surveyed restaurants and could not match the employment numbers, and Berman believes this is simply because of inconsistencies in the definition of part- vs. full-time work and the basic fact that the manager or assistant manager answering the survey would have been juggling multiple duties and relying on memory for these numbers.

In any event, Card and Krueger dial back their claims, from 1994’s statement that “we find that the increase in the minimum wage increased employment” to a more cautious “the increase in New Jersey’s minimum wage probably had no effect on total employment in New Jersey’s fast-food industry, and possibly had a small positive effect.”


Forbes post, “A ‘Living Wage’ Of $34,000? Bad Data, Or Bad Math, Will Stand In The Way Of Social Security Reform”

Originally published at Forbes.com on February 1, 2021


Yes, I have been calling for a comprehensive Social Security reform ever since I began writing at this platform. And, yes, that plan calls for a change from the current formula to a flat benefit for all, or, as I’ve also called it, a “basic retirement income.”

The catch, of course, is this: how do you decide what that right income level is?

The federal government gives us a number that seems reasonable enough by its name: the “poverty guideline.” This works out to $12,880 for a single person, or $17,420 for a household of two. True, you’d have to decide whether a household of two gets twice the single person’s benefit or only the two-person-household benefit, and you’d have to decide whether two people cohabitating count as a “single household” or not, but then the hard work of deciding what an “anti-poverty” benefit should be is finished.

Now, there’s a small wrinkle here: the federal government has two different calculations, the “guideline” and the “threshold”: the former is used for benefits eligibilities and the latter for counting how many people live in poverty — and, for what it’s worth, the threshold for an individual over 65 is $1,000 less than someone younger.

But the “poverty guideline” is ultimately just a metric used for other calculations — eligibility for Food Stamps is not “below the poverty line” but “below 130% of the poverty line.” And the calculation itself is based on the rather arbitrary assumption that people spend 1/3 of their income on food, so it’s not a particularly “scientific” measure of the amount of money needed to keep someone out of material deprivation.

But the promise President Biden has made with respect to expanding Social Security is to boost benefits to a minimum of 125% of the poverty level — and, it appears, to do so on an individual basis, so that households-of-two would get what works out to 180% of the poverty level. Is this enough?

Let’s do some more math: the current federal minimum wage is $7.25 per hour, which works out to $15,080 per year, based on a 40-hour week. Biden wants to boost this to $15.00 per hour, or $31,200, because, he says, that’s the level needed to prevent people from living in poverty (see his speech on his spending plan). Is it necessary to keep Social Security benefits in line with the minimum wage?
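
The annualization arithmetic here is just hours per week times weeks per year; a quick sketch:

```python
# Annualizing an hourly wage: 40 hours/week for 52 weeks (2,080 hours/year).
HOURS_PER_YEAR = 40 * 52

def annualize(hourly_wage):
    """Full-time annual earnings at a given hourly wage."""
    return hourly_wage * HOURS_PER_YEAR

print(annualize(7.25))   # 15080.0 -- current federal minimum
print(annualize(15.00))  # 31200.0 -- proposed $15 minimum
```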

Lastly, there are (at least) two “Living Wage” calculators that purport to calculate the wage truly needed to cover the “basic needs” of families or a “subsistence living” income level.

The first of these is at MIT. To take some representative numbers:

In Peoria, Illinois, a single adult working 40 hours a week would need a wage of $10.70, or $22,256 per year. Two adults sharing expenses would need a total of $35,734.

In Chicago, those wages/incomes increase to $28,288 and $43,513, respectively.

(The “living wage” climbs even higher for parents; a single adult supporting 3 children would need to earn $39.31 per hour or $81,756 annually, according to their calculations, but that’s not really relevant when it comes to old-age/retirement benefits.)

The second of these was produced at the Economic Policy Institute. It does appear somewhat outdated, using 2017 data, but it produces considerably higher calculations.

Here, in Peoria, they calculate annual expenses of $33,994 for a single adult and $47,785 for a couple.

And in Chicago, they calculate a single adult needs to earn $36,917 and a couple, between the two of them, needs $50,006. Again, the numbers are even higher with children — $101,140 with three children.

But, it turns out, the basis for their calculations is questionable, at best.

According to the MIT documentation, the calculations assume that families prepare all their food at home (no eating out) according to the government’s “low cost food plan.” They calculate average health expenses based on typical premiums for employer health insurance and out-of-pocket costs from national government surveys. For families with children, they assume families elect the lower-cost option between family and center child care (but use average costs for each type).

But they base housing costs on the HUD Fair Market Rent estimates, that is, on HUD data for the 40th-percentile rent for “standard quality units.” Why would a family living at a basic, subsistence level rent an apartment at nearly the average rent for the area? They calculate transportation expenses based on average spending across all consumers, adjusting only to reflect purchasing used rather than new cars. They (appear to) calculate “other” expenses, again, by using average Americans’ spending on such items as clothing or personal care products.

The EPI documentation indicates other ways in which numbers they claim to be “basic expenses” are really just “average survey expenses.” For all metro areas, they assume parents choose daycare centers, despite their higher cost, and, again, spend the average amount on daycare. For transportation, they again use average American transportation spending, adjusting only to reduce vehicle miles travelled assuming less discretionary travelling. For “other” expenses, they again use survey data on actual spending rather than calculating necessary spending, with the primary adjustment being the assumption that families don’t spend any money on “entertainment” or the survey’s “other” category.

In addition, this calculation uses ACA/Obamacare exchanges to calculate health insurance costs but doesn’t take into account the Obamacare premium subsidies. And the tax calculations don’t appear to include Earned Income Tax Credits or Child Tax Credits (though this could be a result of calculating such high costs that the hypothetical family wouldn’t qualify).

Did MIT and EPI intentionally seek to inflate the living wage calculations? It stands to reason that groups advocating for boosts in the minimum wage would construct these calculators in a way to produce results that are invariably higher than minimum wage, but when they produce values that are so much in excess of what is reasonable, they weaken their case instead of strengthening it. After all, consider that the median individual income is $36,000; does it really make sense to say that the majority of Americans are living at a below-subsistence level?

Or is this a result of data limitations? A typical exercise in a high school Personal Finance course is to collect information on food costs, rent costs, and so on, from various sources, and construct a budget on this basis, but that’s not easy to replicate for families nationwide. One also imagines that there’s a certain fear that calculating such a spending budget might be misunderstood as casting moral judgement on the poor.

And, of course, these calculations are all based on spending for working families, not retirees, who are, as a practical matter, likely to spend less on clothing and other discretionary items, who have the large majority of their medical spending covered by Medicare, and who, if below federal income thresholds, have the remainder covered by Medicaid.

At the end of the day, this rabbit hole discussion shows that it is by no means easy to figure this question out — neither for those affected by the minimum wage nor for those affected by discussions of what the right level of Social Security benefit is.

December 2024 Author’s note: the terms of my affiliation with Forbes enable me to republish materials on other sites, so I am updating my personal website by duplicating a selected portion of my Forbes writing here.

Minimum wage, median wage: some data and thoughts

So, readers, I would love to pound out an article about Illinois’ recent minimum wage hike and its future effect on the state economy.  For reference, here’s the Chicago Tribune’s summary:

Under the law, on Jan. 1 the statewide minimum wage increases from $8.25 to $9.25 per hour. The minimum wage again will increase to $10 per hour on July 1, 2020, and will then go up $1 per hour each year on Jan. 1 until hitting $15 per hour in 2025.

That’s a big jump.

Minimum wage supporters cite all manner of beneficial effects — a New York Times article floating around twitter today claimed that

A $15 minimum wage is an antidepressant. It is a sleep aid. A diet. A stress reliever. It is a contraceptive, preventing teenage pregnancy. It prevents premature death. It shields children from neglect. But why? Poverty can be unrelenting, shame-inducing and exhausting.

Its supporters also marshal studies to claim that it will have only beneficial effects on the economy — but skeptics point out that studies finding this are unsatisfactory for a variety of reasons: for instance, a study of a boost in the minimum wage in a single locality, where wider economic effects might be offset by lower minimum wages in surrounding areas.  And I’m not going to try to produce an analysis of the literature, nor to make any particular claims of expertise as an (armchair) economist.

But I do want to use my platform, however small it is, to point to the magnitude of the increase.  To be sure, in the event that there is significant inflation, some of that increase will be moderated, but at today’s low inflation rates, $15 per hour in 2025 dollars won’t be that much different than $15 per hour in 2019 dollars.

So consider this:

The median wage in the Chicago metro area is $19.67.

In Springfield, Illinois:  $18.35.

In Peoria:  $18.14.

Decatur:  $16.80

Rockford:  $16.55

Carbondale:  $15.77

West Central Illinois nonmetropolitan area:  $15.41

(You can use the main BLS link to view all metro area median wage data.)

In other words, once you leave metro Chicago and the midsized cities of Illinois, median wages drop to very nearly the level of the future minimum wage.
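
To make that compression concrete, here is the coming $15 floor expressed as a share of each area’s current median wage (a quick sketch using the BLS figures quoted above):

```python
# The new $15 floor as a share of each area's current median wage
# (median-wage figures copied from the BLS numbers listed above).
NEW_MINIMUM = 15.00

medians = {
    "Chicago metro": 19.67,
    "Springfield": 18.35,
    "Peoria": 18.14,
    "Decatur": 16.80,
    "Rockford": 16.55,
    "Carbondale": 15.77,
    "West Central IL nonmetro": 15.41,
}

for area, median in medians.items():
    # e.g. Chicago metro: 76%; West Central IL nonmetro: 97%
    print(f"{area}: new minimum = {NEW_MINIMUM / median:.0%} of median wage")
```

In the West Central nonmetro area, the mandated floor would sit at 97% of today’s median wage, versus 76% in metro Chicago.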

The BLS link also provides median wages for particular occupations — and the occupations with median wages below the new minimum extend far beyond fast food and retail workers.

In the West Central Illinois nonmetropolitan area:

Ushers, Lobby Attendants, and Ticket Takers earn a median wage of $9.10.

Childcare workers, $9.38.

Hairdressers, hairstylists, and cosmetologists:  $9.74.

Court, Municipal, and License Clerks:  $10.43.

Tax preparers:  $11.12

Nursing assistants:  $11.54.

Pharmacy aides:  $11.77.

Tellers:  $12.72.

Butchers and Meat Cutters:  $13.50.

Emergency Medical Technicians and Paramedics, $14.30.

Phlebotomists, $14.91.

and so forth.

What happens when the state mandates a minimum wage in excess of the wages that each of these occupations, at median, actually pays in this part of the state?  I simply lack the imagination to foresee the impact, but it’s surely not as simple as each of these occupations in fact paying $15.00.  These are in many cases jobs requiring specialized training; I find it difficult to imagine that EMTs would accept a wage equal to what a McDonald’s worker, with no specialized training, earns on the first day on the job.  And I likewise can’t fathom a situation in which every wage-earner’s wages are simply boosted by $6.75, across the board, and prices simply reset at the level necessary for businesses to cover their costs.

Now, looking at this list of occupations, I seem to have selected service occupations which are tied to the local economy, rather than the sorts of jobs associated with manufacturing or other industries which stretch beyond the local area.  And I am limited in my understanding of the nature of the economy in these sorts of small towns and rural areas.  But to the extent that it depends on the sorts of small manufacturing facilities scattered throughout middle America, those manufacturers will have to cope with a changed dynamic that could well lead to them leaving or automating.  And to the extent that those facilities provide a support structure that ultimately works its way down to the family farm — well, farmers are self-employed, aren’t they?  Their earnings won’t increase as a result of a minimum wage law, only their costs.

And, yes, I have selected the lowest-wage region from which to list by-occupation median wages.  Of course those numbers are higher in Springfield and Peoria, for example.   Childcare workers in Springfield, for example, earn $10.75 at median, pharmacy technicians, $13.77, and phlebotomists, $16.66.

So I don’t have an answer.  I’m not going to and I’m not able to build out a model of precisely which bad things will happen.  But I do think that looking at these sorts of BLS listings is a useful way of, even as a non-expert, getting a sense of the magnitude of the increase, and the potential for far-reaching unintended effects.


Image:  Marseilles, Illinois, population 5,094.  https://commons.wikimedia.org/wiki/File:Marseilles_IL_downtown1.jpg; IvoShandor [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0)].