The stability of the federal government’s system for producing statistics, which the U.S. relies on to understand its population and economy, is under threat because of budget concerns, officials and data users warn.
And that’s before the new Trump administration and Republican lawmakers follow through on pledges to slash government spending, which could further affect data production.
In recent months, budget shortfalls and the restrictions of short-term funding have led the Bureau of Economic Analysis, known for its tracking of the gross domestic product, to discontinue some datasets, and prompted the Bureau of Labor Statistics to propose reducing the number of participants surveyed for the monthly jobs report. A “lack of multiyear funding” has also hurt efforts to modernize the software and other technology the BLS needs to put out its data properly, concluded a report by an expert panel tasked with examining multiple botched data releases last year.
Long-term funding questions are also dogging the Census Bureau, which carries out many of the federal government’s surveys and is preparing for the 2030 head count that’s set to be used to redistribute political representation and trillions in public funding across the country. Some census watchers are concerned budget issues may force the bureau to cancel some of its field tests for the upcoming tally, as it did with 2020 census tests for improving the counts in Spanish-speaking communities, rural areas and on Indigenous reservations.
While the statistical agencies have not been named specifically, some advocates are worried that calls by President Trump and the new Republican-controlled Congress to reduce the federal government’s workforce could put the integrity of the country’s data at greater risk.
“We’re getting close to the bone now,” says Erica Groshen, a former commissioner of BLS who was appointed by former President Barack Obama. “So even if [the funding situation is] exactly the same, the impact is going to be worse” because of ongoing challenges with producing reliable data.
Why today’s government data is like “crumbling infrastructure”
Like roads and bridges, the federal statistical system is indispensable but usually overlooked, its supporters say. Groshen compares its current state to “crumbling infrastructure” that is still doing its job but with “visible cracks.”
“You’re still filling the potholes on the top, but you’re not repaving,” explains Groshen, now a senior economic adviser at the Cornell University School of Industrial and Labor Relations. “You’re not shoring up the undergirding of the bridge. You’re not developing the new bridge that has to replace the old bridge when you discovered that its life is about to end.”
While advocates and officials say government data remains reliable for now, they point to troubling conditions ahead.
“The economy is not becoming any simpler to measure, right? Things are getting more complex. You know, there’s lots of new things we have to learn how to measure,” said Vipin Arora, the BEA director, at a meeting last month of the Council of Professional Associations on Federal Statistics.
The statistical agencies also face the crisis confronting the broader survey and polling industry — a shrinking share of people willing to answer questions.
To counter plummeting survey response rates, statistical agencies have been experimenting with using more existing government datasets and other administrative records to help take stock of the country’s population and economy. But researching those methods and making sure the quality of the government’s statistics is not compromised takes time and money, says Nancy Potok, a former chief statistician within the White House’s Office of Management and Budget, who previously served as a deputy director at the Census Bureau.
“Without the money, they’re kind of stuck in the old model, which is getting more and more expensive and less viable. And that’s going to affect the quality of the statistics eventually,” Potok warns.
Potok says she’s currently working on an update to an American Statistical Association report released last year to sound the alarm on the risks facing the country’s data. That report concluded that the main threats to the statistical agencies include declining public participation in surveys, a lack of laws protecting the data’s integrity from political interference, and neglect from congressional appropriators.
“What we found was very worrisome because the agencies on the whole had lost about 14% of their purchasing power over the last 15 years. And the rest of what’s called discretionary non-defense spending increased 16% at the same time,” Potok says. “And yet the mandates and the workload and the challenges for the federal statistical agencies increased significantly over that same period.”
Why advocates see a “wise investment” in funding government data
With the next government shutdown deadline in March, Potok says she sees an opportunity to make a pitch for more support for the statistical system.
“If you’re really looking to cut the federal budget, you don’t want to cut the things that are working. You want to cut the stuff that’s not working,” Potok says. “And it’s not a huge investment relative to the size of the federal government to put some money into these agencies to be able to provide that information. It’s actually a wise investment.”
William Beach, a former commissioner of labor statistics who was appointed during the first Trump administration, agrees.
“The statistical system doesn’t need just more money. It needs modernization, the surveys part particularly. And if we did that, over the years, we would probably spend less money on the statistical system and get a better product,” says Beach, who is now a senior economics fellow at the Economic Policy Innovation Center, a conservative think tank.
How the 2030 census and the monthly jobs report could be affected
For now, many statistical agency heads are still faced with making hard choices.
Some census watchers are wary of how the temporary hiring freeze Trump has ordered may affect the next phase of work for a major 2030 census field test coming up next year, which will involve thousands of temporary workers.
Terri Ann Lowenthal, a census consultant who served as the staff director of the former House oversight subcommittee for the head count, says the hiring freeze “could significantly disrupt” preparations for the test, which is designed in part to help the bureau improve its tallies of people of color, young children, renters and other historically undercounted populations.
“A census test, like the census itself, must be carried out according to a strict timetable,” Lowenthal says in a statement. “Failure to test new methods and operations that could contain costs and improve accuracy could put a successful census — one that counts all communities equally well — at risk.”
The bureau’s public information office declined NPR’s interview request and did not respond to a written question about the hiring freeze’s impact.
Economic data users like Algernon Austin, director for race and economic justice at the Center for Economic and Policy Research, a left-leaning think tank, are worried about what changes may be coming to the sample size for the Current Population Survey, which produces the monthly employment data.
“If we really want to address issues of racial equity, we really need larger samples, not smaller samples,” Austin says, noting that having fewer people participating in the survey makes it difficult, if not impossible, to release detailed statistics broken down by race and geography.
If the government were to scale back the already-limited demographic breakdowns of employment figures, Austin says researchers like him would have to scramble.
“We may, with a considerable effort, be able to do just a tiny piece of the work that needs to be done, but have to just throw up our hands and say, ‘We don’t know what’s going on in that state or that metropolitan area because we don’t have reliable data,’ ” Austin adds.
Edited by Benjamin Swasey