The poverty rate in the United States fell to 11.8 percent in 2018, according to data released last week by the Census Bureau — the lowest it’s been since 2001. But this estimate significantly understates the extent of economic deprivation in the United States today. Our official poverty line hasn’t kept up with economic change. Nor has it been modified to take into account widely held views among Americans about what counts as “poor.”

A better, more modern measure of poverty would set the threshold at half of median disposable income — that is, median income after taxes and transfers, adjusted for household size, a standard commonly used in other wealthy nations. According to the Organization for Economic Cooperation and Development — which includes 34 wealthy democracies — 17.8 percent of Americans were poor by this standard in 2017, the most recent year available for the United States.
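To make the measure concrete, here is a minimal sketch of how a relative poverty rate like this can be computed. The household data are invented for illustration, and the square-root equivalence scale used to adjust for household size is an assumption (it is one common OECD convention, not necessarily the exact method behind the 17.8 percent figure):

```python
# Illustrative sketch only: share of people below half the median of
# size-adjusted ("equivalized") disposable income.
import statistics

# Made-up sample: (household disposable income, household size)
households = [
    (12_000, 1), (35_000, 2), (58_000, 4), (22_000, 3),
    (90_000, 2), (15_000, 1), (47_000, 4), (70_000, 3),
]

def equivalized(income, size):
    """Adjust income for household size via the square-root scale (assumed)."""
    return income / size ** 0.5

# Each *person* is assigned their household's equivalized income,
# so larger households count more in the distribution.
person_incomes = []
for income, size in households:
    person_incomes.extend([equivalized(income, size)] * size)

# Poverty line = half the median of the person-level distribution.
threshold = 0.5 * statistics.median(person_incomes)
poor = sum(1 for x in person_incomes if x < threshold)
rate = poor / len(person_incomes)
print(f"poverty line: {threshold:,.0f}; poverty rate: {rate:.1%}")
```

Because the threshold moves with the median, this is a relative measure: it rises as mainstream living standards rise, which is exactly the property the authors argue the official line lacks.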

To be sure, there is no such thing as a purely scientific measure of poverty. Poverty is a social and political concept, not merely a technical one. At its core, it is about not having enough income to afford what’s needed to live at a minimally decent level. But there’s no purely scientific way to determine what goods and services are “necessary” or what it means to live at a “minimally decent level.” Both depend in part on shared social understandings and evolve over time as mainstream living standards evolve.

At a minimum, we should set the poverty line in a way that is both transparent and roughly consistent with the public’s evolving understanding of what is necessary for a minimally decent life. The official poverty line used by the Census Bureau fails that test. It was set in the early 1960s at three times the value of an “economy food plan” developed by the Agriculture Department.

Read the rest of the article at CounterPunch