The History of the United States' Nearly-Universal Child Care Program
Jan. 24, 2014
People who write about early education these days often mention that the United States once came within a Nixon veto of having a comprehensive, universal system of child care and pre-kindergarten (I’ve done it too). But only a few have noticed the Lanham Act, which established the United States’ sole experiment with a universal, comprehensive birth-to-12 child care system (for an exception, see Christina Samuels’ EdWeek analysis).
That’s right. Between 1943 and 1946, the United States ran a child care system designed to increase maternal employment as part of the war effort. According to a recent working paper from Arizona State University professor Chris Herbst, over half a million children passed through the program during those years at a cost of $1 billion (in 2012 dollars). The federal government provided the bulk of the funding (half to two-thirds); local communities covered the gap by charging parents “copays” of no more than $9.50 per day (also in 2012 dollars).
Herbst’s paper examined the program’s short- and long-term effects by comparing states with large Lanham Act expenditures against those that spent comparatively less. For instance, a state like California, which was full of industries critical to military mobilization, spent nearly fifty times as much (per child) as North Dakota. This variation allowed Herbst to measure how differences in Lanham Act investments changed outcomes for mothers and their children.
And when it came down to the Lanham Act’s express purpose—getting that riveter into “Rosie’s” hands—Herbst found that the program was a considerable success. States that spent more per child each week (presumably to offer more and better coverage) had significant increases in maternal employment:
[I]ncreasing states’ Lanham Act spending from the bottom to the top quartile of the distribution...would produce a 4.3 percentage point increase in the employment rate for treated women.
In other words, moving a state’s Lanham Act spending from the bottom group to the top group would raise maternal employment by more than four percentage points.
In addition, Herbst found that the program contributed to a cultural shift in how American families viewed child care. In general, mothers loved the program. As Herbst puts it, “Fully 100 percent of mothers reported that ‘the child enjoyed nursery school,’ and 81 percent had a ‘generally favorable’ opinion of ‘early childhood education.’”
No surprise, then, that Herbst’s statistical models were able to detect a lasting increase in maternal employment even five years after the program was abruptly eliminated at the end of World War II.
This squares with research on how early education programs affect parents’ employment decisions. As I argued in a Daily Beast post last December,
[P]ublic programs that offer only a few hours of education for children of a particular age won’t free parents to work full-time jobs in an office for eight (or more) hours each day. At best, part-time early childhood programs free parents to pursue part-time jobs with limited security, benefits, and pay.
By contrast, the Lanham Act got impressive results because it was a full-time program. Many centers offered 12 hours of coverage each day, and some even provided 24-hour care (for mothers working night shifts).
Best of all, the program was also great for kids. Herbst uses U.S. Census data to calculate that its effects “are large and comparable in magnitude to several prominent early childhood education interventions [such as the Perry and Abecedarian programs].” Specifically, the program increased students’ future earnings, reduced the percentage who needed welfare services as adults, and decreased the high school dropout rate.
The pace of progress on early education policy can be frustrating. But the evidence on the Lanham Act shows that it shouldn’t take another World War to get the United States to build its next comprehensive early childhood system.