US Data Crisis: How Shutdown Exposed Flaws in Economic Statistics

The recent 43-day government shutdown in the United States exposed a critical vulnerability in the nation's economic infrastructure: its deep dependence on federal statistical agencies for reliable economic data. When these agencies stopped publishing crucial information about unemployment, inflation, and retail sales, policymakers and businesses faced an unprecedented information blackout.

The Shutdown That Revealed the Cracks

During the lengthy government closure, everyone from White House officials to Federal Reserve policymakers and small business owners struggled without access to reliable federal data. The shutdown made it painfully clear that private sector data cannot fully replicate the comprehensive information provided by government statistical agencies.

The Federal Reserve and other institutions were forced to rely on alternative data from private companies, but these substitutes proved inadequate for making informed economic decisions. This situation highlighted that data from statistical agencies remains the "gold standard," though that standard is increasingly threatened by budget constraints, falling response rates, and staffing challenges.

Why America's Data Quality Is Slipping

The erosion of data quality poses significant risks for economic policymaking. Former Cleveland Fed President Loretta Mester describes this as a "big problem" for Federal Reserve officials who depend on accurate economic benchmarks to guide the economy through both stable and turbulent periods.

"Maybe policy today isn't much affected, but policy tomorrow and years from now will be very much affected because the models won't be as robust," Mester warns.

The core issue lies in how economic data is collected. Much of the information comes from surveys of businesses and consumers, but participation rates have declined dramatically. The initial response rate to the August 2025 payroll survey was approximately 57%, compared to an average rate of around 70% over the past decade.

This shrinking sample leads to bigger revisions and greater reliance on estimates. The problem became evident this summer when the Bureau of Labor Statistics lowered previously reported job gains for May and June by a combined 258,000 positions, a revision four to five times larger than the median absolute revision since 1979.
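A rough way to see why falling response rates make the initial estimates noisier: the sampling error of a survey-based figure grows as the number of respondents shrinks, roughly in proportion to one over the square root of the sample size. The sketch below is purely illustrative, with a hypothetical sample size, and is not the BLS's actual survey design or revision process.

    # Illustrative only: how sampling error grows when survey response falls.
    # Assumes a simple random sample and a hypothetical sample size; the actual
    # payroll survey design and revision process are more complex.
    import math

    contacted = 120_000  # hypothetical number of establishments in the sample
    for response_rate in (0.70, 0.57):
        n = contacted * response_rate
        relative_se = 1 / math.sqrt(n)  # standard error of a mean scales with 1/sqrt(n)
        print(f"response {response_rate:.0%}: relative standard error ~ {relative_se:.5f}")

    # Dropping from about 70% to 57% response raises sampling error by a factor of
    # sqrt(0.70 / 0.57), roughly 1.11, so the first estimate is about 11% noisier.

On top of that, responses that arrive after the first release are what drive later revisions, so a weaker initial sample also leaves more ground for the revised figures to make up.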

The Path Forward: Blended Data Approach

Experts suggest that US statistical agencies should adopt a blended-data approach, combining traditional measures with alternative data sources. The Bureau of Economic Analysis already uses this method to calculate GDP, and the BLS has incorporated various sources to help estimate the Consumer Price Index.

The Chicago Federal Reserve has pioneered this approach, using approximately eight private data sources including ADP, Indeed, and Lightcast alongside BLS data to create more timely labor indicators. The Chicago Fed Advance Retail Trade Summary (CARTS) series demonstrates the potential of this method.
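To make the blended approach concrete, here is a minimal sketch of one way such an indicator could be assembled: each private source is standardized so differently scaled series are comparable, then combined with weights reflecting how closely each has tracked the official benchmark. The source names, numbers, and weights are illustrative assumptions, not the Chicago Fed's actual methodology.

    # Minimal sketch of a blended labor-market indicator.
    # Sources, values, and weights are hypothetical, not any agency's method.
    import statistics

    def standardize(series):
        """Convert a series to z-scores so differently scaled sources are comparable."""
        mean = statistics.mean(series)
        sd = statistics.stdev(series)
        return [(x - mean) / sd for x in series]

    # Hypothetical monthly payroll-growth signals from three private sources (thousands).
    sources = {
        "payroll_processor": [150, 140, 120, 90, 80],
        "job_postings":      [160, 150, 130, 100, 70],
        "labor_analytics":   [145, 135, 125, 95, 85],
    }
    # Weights could be chosen by how well each source has tracked the official series.
    weights = {"payroll_processor": 0.5, "job_postings": 0.3, "labor_analytics": 0.2}

    standardized = {name: standardize(vals) for name, vals in sources.items()}
    months = len(next(iter(sources.values())))
    blended = [
        sum(weights[name] * standardized[name][t] for name in sources)
        for t in range(months)
    ]
    print([round(x, 2) for x in blended])

In practice the weights, and the mapping back to the official series, would need to be re-estimated as new benchmark data arrive, which is part of what makes the approach technically demanding.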

However, implementing these changes faces significant hurdles. "One of the biggest hurdles that we face—and the statistical agencies face it too—is the data is expensive," says Scott Brave, senior economist at the Federal Reserve Bank of Chicago. The technological requirements and customization needs make the transition challenging and costly.

Another proposed solution involves creating a centralized national statistics agency, similar to those in other G-7 nations. This would allow better data-sharing and a consolidation of expertise, and could reduce long-term costs. Currently, the US remains the only G-7 country without such a centralized system.

While improving federal data quality will require significant investment in technology, staffing, and backtesting, the consequences of inaction could be severe. As the US economy undergoes rapid changes and demographic shifts, reliable and up-to-date data becomes increasingly crucial for informed decision-making at all levels.