Rapid, sometimes dramatic change has become “the new normal,” but what we know now can help us improve predictive model accuracy.
Businesses have had to grapple with a lot of change caused by the COVID-19 pandemic. One of the obvious side effects has been compromised predictive model accuracy: models that worked well in 2019 won’t work as well, or at all, in 2020 if their training data is out of sync with what’s happening now.
In the beginning
COVID-19’s effects are truly novel. While there have been other pandemics and financial crises in recent history, none of them exactly mirror what’s happened in 2020. The Spanish Flu pandemic may be the closest parallel, but there’s little data available about it compared to the 2008 financial crisis, for example.
Unlike in the early days of the COVID-19 pandemic, there’s now more information about its effects on organizational and customer behavior. However, the situation could still change at any moment, such as with a second round of shutdowns.
“We need to remind ourselves to be incredibly agile when it comes to building models,” said Drew Farris, director of AI research at Booz Allen Hamilton. “I’ve encountered some environments in the past where they roll out one new model every six months, and that’s just not tenable. I think increasing modeling agility through automation is more relevant now than any other time, just simply because the data is changing so quickly.”
Continued uncertainty
By now, it’s obvious that the pandemic and its effects won’t disappear anytime soon, so organizations and data scientists need to be able to adapt as necessary.
“As a data scientist, you need to be willing to challenge your assumptions, toss out the theories that you had yesterday and formulate new ones, but then also run the experiments with the data to be able to prove or disprove those hypotheses,” said Farris. “To the extent that you can use automated tooling to do that, it’s very important.”
The companies in the best position to adapt to sudden change have been modernizing their tech stacks to become more agile. Nevertheless, they still need a way to identify signals that indicate future trends. Booz Allen Hamilton had recently been doing some work involving linear regressions but switched to agent-based modeling.
“It’s basically setting up a dynamic system where you have individual actors in that system that you’re modeling out, and you’re using the data about these actors to figure out what steps will happen next,” said Farris. “It’s really nothing new, but the bottom line is it allows us to look more forward into the future by analyzing system dynamics, as opposed to just sort of measuring the data that we’re seeing from past history.”
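For readers who want a feel for the shape of such a model, here is a minimal agent-based sketch in Python. The agent rules, parameters, and the one-off shutdown “shock” are illustrative assumptions made for this article, not Booz Allen Hamilton’s actual model.

```python
import random

# Minimal agent-based sketch: each "consumer" agent holds a confidence level
# and adjusts its spending each step based on the average behavior of the
# agents around it. All rules and numbers here are illustrative assumptions.

class Consumer:
    def __init__(self, rng):
        self.confidence = rng.uniform(0.3, 0.9)  # 0 = fearful, 1 = confident
        self.spending = 100.0 * self.confidence  # spend per step

    def step(self, avg_neighbor_spending, shock, rng):
        # Herding: confidence drifts toward the level implied by neighbors'
        # average spending; an external shock (e.g., a shutdown) cuts it directly.
        neighbor_confidence = avg_neighbor_spending / 100.0
        drift = 0.1 * (neighbor_confidence - self.confidence)
        self.confidence = min(1.0, max(0.0,
            self.confidence + drift - shock + rng.gauss(0, 0.02)))
        self.spending = 100.0 * self.confidence

def simulate(n_agents=500, n_steps=52, shock_week=20, seed=42):
    rng = random.Random(seed)
    agents = [Consumer(rng) for _ in range(n_agents)]
    totals = []
    for week in range(n_steps):
        shock = 0.15 if week == shock_week else 0.0  # one-off shutdown shock
        avg_spend = sum(a.spending for a in agents) / n_agents
        for a in agents:
            a.step(avg_spend, shock, rng)
        totals.append(sum(a.spending for a in agents))
    return totals

if __name__ == "__main__":
    demand = simulate()
    print(f"Total demand before shock (week 19): {demand[19]:,.0f}")
    print(f"Total demand after shock  (week 21): {demand[21]:,.0f}")
```

The point of the exercise is exactly what Farris describes: rather than fitting a curve to past data, you let the system’s dynamics play out forward and observe what emerges.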
Given the constant state of change, organizations need to respond and adapt by identifying multiple sources of data that together provide a complete picture of what’s taking place.
“It’s gotten to the point, or we’re rapidly getting to a point, where it is considerably less expensive to run a plethora of other models to understand different outcomes,” said Farris. “I think if there’s any takeaway that I have in this particular situation is [that] we have that ability to generate so much scale, do some really oddball stuff like run a model that expects the unexpected. Don’t be afraid to introduce complete and total randomness and look for wacky outcomes. Really don’t discount them for potentially what they might be showing you or telling you because that ultimately might prepare you for the next crisis.”
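One way to act on that advice is to rerun a forecast thousands of times under randomly perturbed inputs, deliberately injecting occasional extreme shocks, and then study the tail outcomes instead of only the average. The sketch below assumes a toy revenue forecast; the distributions and shock probability are stand-ins for whatever model an organization actually runs.

```python
import random
import statistics

# "Expect the unexpected": rerun a simple forecast many times under randomly
# perturbed inputs, then inspect the extreme outcomes, not just the median.
# The forecast function and all parameters are illustrative stand-ins.

def forecast_revenue(base, growth, shock):
    # Toy one-year forecast: baseline revenue, a growth rate, and a
    # multiplicative shock representing an unforeseen disruption.
    return base * (1 + growth) * shock

def stress_test(runs=10_000, seed=7):
    rng = random.Random(seed)
    outcomes = []
    for _ in range(runs):
        growth = rng.gauss(0.03, 0.05)  # uncertain growth rate
        # Occasionally inject an extreme, "wacky" shock on purpose.
        if rng.random() < 0.05:
            shock = rng.uniform(0.2, 0.8)   # severe disruption
        else:
            shock = rng.gauss(1.0, 0.05)    # ordinary noise
        outcomes.append(forecast_revenue(1_000_000, growth, shock))
    outcomes.sort()
    return {
        "median": statistics.median(outcomes),
        "p05": outcomes[int(0.05 * runs)],  # bad-but-plausible tail
        "p01": outcomes[int(0.01 * runs)],  # the outcomes not to discount
    }

if __name__ == "__main__":
    for name, value in stress_test().items():
        print(f"{name}: {value:,.0f}")
```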
Scenario modeling helps prepare organizations for change
The future has always been uncertain. However, the global and systemic impacts of the COVID-19 pandemic have resulted in an unprecedented level of uncertainty in the modern business environment.
“We have seen from the business world increased requests [for] and usage of analytics and AI machine learning models and more importantly, simulation models, which can simulate different scenarios,” said Anand Rao, global & US artificial intelligence and US data & analytics leader at PwC. “[W]e’re also seeing various new techniques being used in AI, so the old techniques and new techniques being combined.”
Business leaders have been seeking advice about what they should be doing these last few months because their past experience has not prepared them for recent events.
“[E]xecutives basically start to say, I don’t know what to do. I don’t know where this is going,” said Rao. “Is there any way that you guys can come up with anything more than just tossing a coin because any technique is better than my random choice.”
The beauty of scenario modeling is that it provides opportunities to plan for different possible futures, such as understanding the impact of future government intervention on supply and demand, or how different scenarios might affect business operations, staffing requirements, or customer concerns. That way, should one of the scenarios become reality, business leaders know in advance what they should be doing.
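As a concrete illustration, a scenario model can be as simple as enumerating a few named futures and computing the operational impact of each. The scenario names, multipliers, and staffing math below are invented for illustration; real simulations, such as the ones Rao describes, are far richer.

```python
from dataclasses import dataclass

# Illustrative scenario model: enumerate a few named futures up front, then
# compute demand and staffing impacts for each so leaders know the playbook
# before one of them materializes. All figures here are assumptions.

@dataclass
class Scenario:
    name: str
    demand_multiplier: float   # expected change in customer demand
    supply_multiplier: float   # expected change in supplier capacity

SCENARIOS = [
    Scenario("baseline recovery",   demand_multiplier=1.00, supply_multiplier=1.00),
    Scenario("second shutdown",     demand_multiplier=0.60, supply_multiplier=0.70),
    Scenario("government stimulus", demand_multiplier=1.15, supply_multiplier=0.95),
]

def plan(scenario, baseline_orders=10_000, orders_per_employee=50):
    # Fulfillable orders are capped by whichever is lower: demand or supply.
    demand = baseline_orders * scenario.demand_multiplier
    capacity = baseline_orders * scenario.supply_multiplier
    fulfilled = min(demand, capacity)
    staff_needed = round(fulfilled / orders_per_employee)
    return fulfilled, staff_needed

if __name__ == "__main__":
    for s in SCENARIOS:
        fulfilled, staff = plan(s)
        print(f"{s.name:>20}: {fulfilled:>8,.0f} orders, ~{staff} staff")
```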
Rao also said that data scientists need to develop their own version of agile so they can build and deploy models faster than they have before.
“This is something we should have adopted before the pandemic,” said Rao. “Now people are looking more at how [to] develop models in a much faster cycle because you don’t have six to eight weeks.”
For more on predictive analytics, read these articles:
How IT Can Get Predictive Analytics Right
IoT and Predictive Analytics: What We’re Driving Toward
Why Everyone’s Data and Analytics Strategy Just Blew Up
Lisa Morgan is a freelance writer who covers big data and BI for InformationWeek. She has contributed articles, reports, and other types of content to various publications and sites ranging from SD Times to the Economist Intelligence Unit.