Work from home (WFH) has been around for decades. The share of people working from home three or more days per week was under 1% in 1980, growing modestly with the rise of the Internet to around 2.4% in 2010 and 4% in 2018. Then came COVID-19, forcing tens of millions of people around the world to work from home and triggering a mass workplace experiment that broke through the technological and cultural barriers that had long held back WFH adoption.
Since May of 2020, economists Jose Maria Barrero (Instituto Tecnológico Autónomo de México), Nicholas Bloom (Stanford University), and Stephen J. Davis (University of Chicago) have been conducting monthly surveys to track the evolution of WFH. Their surveys ask about working arrangements during the pandemic, as well as worker preferences and employer plans post-pandemic. The typical respondent is 40 to 50 years old, has one to three years of college, and earned $40,000 to $50,000 in 2019.
I’ve been tracking the evolution of WFH by following their monthly surveys. One of their first surveys found that in April of 2020, once COVID-19 hit, 61.4% of paid full days were worked from home, a huge jump from their estimate of 4.8% just before the pandemic. The share then declined in subsequent months; one year later it stood at around 45%.
In an April 2021 working paper, “Why Working from Home Will Stick,” Barrero, Bloom, and Davis predicted that the shift to work from home would be one of the biggest legacies of the pandemic. They projected that “American workers will supply about 20 percent of full workdays from home in the post-pandemic economy, four times the pre-COVID level. Desires to work from home part of the week are pervasive across groups defined by age, education, gender, earnings, and family circumstances.” They estimated that higher levels of WFH would boost productivity by about 4.6%, with over half of that gain coming from the time workers save by not commuting.
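To see how commute savings can show up as a productivity gain, here is a minimal back-of-envelope sketch in Python. All the numbers in it (workweek length, commute time, WFH days) are illustrative assumptions of mine, not figures from the paper; the point is only to show the logic of counting saved commute time as extra effective labor input.

```python
# Back-of-envelope sketch: how commute savings can register as a
# productivity gain. All numbers are illustrative assumptions,
# not figures from the Barrero-Bloom-Davis paper.

workweek_hours = 40          # assumed hours of paid work per week
commute_hours_per_day = 0.9  # assumed round-trip commute (~54 minutes)
wfh_days_per_week = 1.0      # ~20% of a five-day week worked from home

# Hours of commuting avoided each week thanks to WFH days.
saved_commute = commute_hours_per_day * wfh_days_per_week

# If saved commute time counts as extra effective labor input, the
# implied gain is the saved time relative to the total hours
# (work plus commuting) previously spent supplying a workweek.
total_input = workweek_hours + commute_hours_per_day * 5
gain = saved_commute / total_input
print(f"Implied gain from commute savings alone: {gain:.1%}")  # ~2.0%
```

Under these assumptions, commute savings alone deliver roughly a 2% gain, in the same ballpark as "over half" of the paper's 4.6% estimate, which also folds in other channels such as better-optimized working arrangements.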