In 1990, the waning days of the Cold War cast a chilly uncertainty over California’s economy. The Pentagon had made drastic budget cuts, causing aerospace manufacturers and missile plants in Southern California to shutter. Suddenly, the military-industrial buildup that had been a major source of economic growth in the state looked fragile. But after crunching the numbers, most economic forecasters agreed that the ultimate price of peace would be minimal. The Los Angeles Times reassured readers in May 1990 that “California’s economy would be healthy enough and growing fast enough to make the defense transition without widespread hardship.” After all, California had been outperforming the country’s economy for the better part of two decades — how bad could it be?
Bad. Over the next three years, California would lose 520,000 jobs. Unemployment would climb to 9.8%. Vast changes in the business landscape — a decline in manufacturing and the downsizing of aerospace, to name just two — would profoundly alter the state’s economy and batter its middle class. A wave of people and businesses left the state.
Only one major economic think tank had sounded the alarm: the UCLA Anderson Forecast. Throughout early 1990, David Hensley M.A. ’84, Ph.D. ’90, a freshly minted Ph.D. who had become director of the Forecast, made the unpopular case that the state was in more trouble than anyone wanted to admit. “We believe there is a better than 50% chance that the U.S. is either in, or will soon be in, economic recession,” Hensley told the L.A. Times in August 1990. One month later, Hensley and his small team of UCLA forecasters removed all doubt: According to their September report that year, the U.S., like California, was not entering a recession — it was already in one.

“There was a big uproar. Across the state and across the country, people didn’t want to hear that the economy was slowing,” says former associate director Patricia Nomura ’83, who spent the better part of four decades working at the Forecast, beginning in the mid-1980s. “And in the end, it turned out it was such a severe recession, especially in California.”
The Forecast had been able to identify a reality that was difficult for others to see. Hensley and the team linked together disparate data points and geopolitical storylines — besides the military drawdown, things like a statewide housing slump, the physical and psychological toll of the 1989 World Series earthquake, declining consumer confidence and the war in Kuwait — to create a more revealing narrative about the direction of the economy than their peers did.
It was a pivotal moment in the history of the Forecast, helping to burnish its reputation as a hub of storytelling as well as data analysis. As the reality of a recession settled in, the quarterly reports became more sought after — a pattern that Nomura has seen play out time and time again. “This business is sort of countercyclical,” she says. “If the economy is really booming, then people don’t really need to hear from the Forecast so much.”
Even as California rebounded following the recession of the early ’90s, Hensley became a sought-after expert in the national press before leaving to work on Wall Street. Meanwhile, the Forecast earned newfound respect as an independent, nonpartisan source of analysis on the state’s economic ebbs and flows.
“A lot of economic modeling suffers from groupthink,” says Wayne Winegarden, a senior fellow at the right-leaning Pacific Research Institute. “Forecasts like UCLA’s play a very important role because they help us get different views on what’s possible in the future.”
******
In 1952, the Forecast began under the direction of economics professor Robert M. Williams M.A. ’42. While known today for its use of advanced analytics, in its early days the UCLA Business Forecasting Project, as it was originally named, resembled a math-heavy edition of Twelve Angry Men. Williams and a group of colleagues would meet, swap theories about a changing economy, then fill out a questionnaire on the topics they had covered. The consensus results were spun into the first Forecast report, containing detailed predictions about the direction of the U.S. economy.
Believe it or not, that methodology was cutting edge for the time. “It was a bunch of guys sitting around in a conference room and prognosticating,” says Daniel J.B. Mitchell, who first joined the group in 1968 and directed the Forecast from 1997 to 1999. “This was pre-modeling, pre-people having computers and, well, pre pretty much everything.”
Within a few years of Mitchell’s joining the team, the Forecast had done away with such rudimentary methods at the direction of then–research director Donald Ratajczak, a former classmate of Mitchell’s at MIT. “Coming from MIT, [Don] was a little more familiar with computers, which at that point were big mainframe kinds of things,” Mitchell says. “But they did exist. And Ratajczak started working with Bob Williams to see if we could, in effect, modernize the forecast and start to do formal modeling.”

Within a decade, the Forecast had settled into a system for supplying data-driven economic predictions to the public. It started producing its California outlooks in 1968, the same year it hosted its first conference. By the late 1970s, the team had transitioned away from mainframe computers to doing the analysis on personal computers. Regardless of the technology, the content remained largely unchanged: quarterly reports foretelling the impact of such variables as trade embargoes, wars, assassinations, droughts, government investments and new policies, as well as key indicators like gross domestic product.
The ability to crunch numbers faster and better, while improving the accuracy of the Forecast, did not necessarily make its reports easier to understand. “By the early-to-mid ’70s, there was a little bit more of a humble understanding that you can print out a lot of pretty charts and have all these detailed numbers, but you need to be telling some kind of a story,” says Mitchell.
What was often missing was narrative. Over the next several decades, the calling card of the Forecast — and, to a large extent, the reason the conferences remained popular — became the artful ways that directors and staff linked their findings to world events, analyzing the numbers with a storyteller’s flair. “I often used to say the audience doesn’t need the what of the forecasts as much as the why,” says Ed Leamer, former director of the Forecast (2000–2017) and now UCLA Distinguished Professor Emeritus of Global Economics and Management.
“That’s one of the reasons our conferences stay relevant,” says current director Jerry Nickelsburg, who’s been at the helm since 2017. Nickelsburg first arrived at the Forecast in 2006 as a senior economist and a lecturer in UCLA Anderson’s M.B.A. program, where he continues to teach. “And for the next 70 years, our objective is not just to keep the Forecast relevant, but to further the mission of the Anderson School and UCLA.”
******
Since its inception 70 years ago, the Forecast has evolved from a tiny shop on the periphery of the business school to an essential component of Anderson’s brand. Along the way, it has enlightened policymakers, researchers and the public with potential scenarios to watch for in the economy. Every quarter, it releases two separate reports — one covering the nation, the other focused on California — that give updates on anticipated economic outcomes. Before the results are disseminated to the world, they are released to audiences at the Forecast’s quarterly conferences and to its subscribers.

“I lean on the Anderson Forecast in my work to understand where California’s dynamic, complex economy is headed,” says Sarah Bohn, vice president and senior fellow at the nonprofit Public Policy Institute of California. “The Forecast is rigorous and digestible, which also makes it an invaluable resource for the state’s decision-makers as they plan in an economically volatile time.”
The Forecast’s emphasis on contextual storytelling has earned it more than respect within the field of economics. “The Forecast is the main source of media attention for the Anderson School by far, and that’s important,” says Leamer. “It gets the name of the school in the prominent newspapers — The Wall Street Journal, The Washington Post and, of course, the Los Angeles Times.”
Nailing predictions — for instance, the recession of the early 1990s — has buoyed the credibility of the Forecast. Another calling card has been its transparency. In an age when the vast majority of economic reports peering into the future originate from banks (which can be self-interested) or governments (which tend to be dull and overly conservative to avoid panics), the Forecast offers reliable, unfettered analysis. “As an independent organization, we have no ax to grind,” says Nickelsburg. “Our mission, as an educational organization, makes us uniquely capable of elevating the discourse around policy, especially in the state of California.”
The Forecast has endured with a tiny staff and found new ways to extend its expertise while sustaining itself financially. In addition to regular reports and conferences, it has branched into special projects and ad hoc forecasts with corporate partners, such as a U.S.-China Economic Report (in partnership with Cathay Bank) and an evaluation of online microbusinesses (in partnership with GoDaddy). These reports are done with the same level of integrity, modeling, and analytics found in the Forecast’s quarterly reports, but applied to more niche topics.

As the pandemic subsides, the Forecast is also honing new ways of storytelling. In 2020, it launched the “Forecast Direct” podcast, featuring senior economist Leo Feler in conversation with some of the world’s leading economics and business minds, such as Emily Oster and Robert Gordon.
It’s also incorporating new lenses, such as climate science, into its projections: “We want to be able to say something based on the research that economists and climatologists have done, and to be able to say something about how that impacts the U.S. and California economies — but in particular the California economy,” says Nickelsburg. “It’s a very hard problem, but we’re going to be expanding our staff to do that.”
Over the decades, the work of the Forecast has resonated beyond the walls of Leon and Toby Gold Hall, where its offices are housed at the Anderson School. Leamer, among others, has tried to use his insights to elevate the field of economic forecasting writ large. This past summer, he submitted his latest working paper, titled “A New Way of Forecasting Recessions,” to the National Bureau of Economic Research.
So, with so many prognosticators predicting that another recession is on the horizon for the U.S. economy, what does the Forecast have to say about it? Its latest report sees a high likelihood of ongoing inflation and below-trend growth, but — as of this writing, at least — no recession. “We are, today, in a world that is more uncertain than I think most of us remember,” says Nickelsburg. “Right now, people are talking a lot about the potential for a recession, about inflation, about the impact of the geopolitical environment, and so on. What we’re trying to do [at the Forecast], increasingly, is speak to the risks.”
Given such uncertainty, only one thing, he says, is certain. “We will continue to discuss potential scenarios, and why those scenarios might happen,” he says. “But what we really want to do is let the data speak.”
