A key to enterprise success with predictive analytics is to get the IT group involved, and to do it early in the process.
Understanding predictive analytics has become a prerequisite for discovering customer needs and business opportunities in data. But the first steps in establishing a good predictive model lie in outlining the data, its sources, and the crucial relationships among them. Those first steps require knowledgeable business resources to organize the details, and IT can be that vital resource, quickly highlighting which opportunities the available data can actually support.
When tasked with preparing a predictive model, an analyst must ask which approach best serves the initiative. When IT personnel are included in the initial stages, the answers can address different facets of the project, revealing ideas that may not be obvious from the start.
Take a digital inventory
The best way to stage a predictive analytics model is to assess, in the project's initial stages, whether a few key resources are in place.
It is generally understood that an analyst must first determine what question the predictive model must answer with the data. After all, the model should show how the data relates statistically to the stated objectives.
The analyst should first ask an IT professional to join the initial discussions on whether the needed data is readily available.
Let's say an analyst wishes to measure airline passengers' survey responses to a change in services offered on flights. The analyst can create a list of the data and metadata generated by customer-related activity, which the IT team can then verify against how that data is stored in a database or exposed through an API.
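To make that list concrete, the analyst could hand IT something as simple as the sketch below (Python, with hypothetical field names and a placeholder export file that IT would swap for the real database view or API feed), showing which expected fields are already on hand and which still need sourcing.

```python
import pandas as pd

# Hypothetical inventory of fields the passenger-survey model will need.
EXPECTED_FIELDS = {
    "passenger_id", "flight_number", "service_change_flag",
    "satisfaction_score", "survey_date",
}

# Placeholder export; IT would point this at the real database view or API feed.
sample = pd.read_csv("passenger_survey_export.csv", nrows=100)

available = set(sample.columns)
print("Already available:", sorted(EXPECTED_FIELDS & available))
print("Still needs sourcing by IT:", sorted(EXPECTED_FIELDS - available))
```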
That list can become a talking point that reveals what the project demands of IT resources. Any solution must allow the data integration to scale in ways that fit the objectives and resources, and while plenty of options exist in the technology market, the insights of an IT team member often show how to leverage those options best.
Identify the data’s quality
Discussions with IT can next guide concrete efforts to spell out how everyone will address quality concerns for the predictive model. The analyst will likely understand how outliers, anomalies, and other irregularities affect quality; IT professionals can help by tying those concerns to the technical issues that can arise from the data sources being requested.
For example, documenting anomalies promotes a shared understanding of which errors are technical, such as sensor issues, and which ones reflect real-world activity that the analysts will immediately recognize. The team can then act collectively to address the issues and maintain model accuracy.
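As a rough illustration, assuming a hypothetical satisfaction_score column on a 0-to-10 scale, a quality check along these lines can flag outliers and split them into likely technical errors versus plausible real-world values, so IT and the analysts review the same documented list:

```python
import pandas as pd

# Hypothetical survey export; IT would point this at the real source.
df = pd.read_csv("passenger_survey_export.csv")

# Flag outliers in the satisfaction score with a simple IQR rule.
q1, q3 = df["satisfaction_score"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["satisfaction_score"] < q1 - 1.5 * iqr) |
              (df["satisfaction_score"] > q3 + 1.5 * iqr)]

# Values outside the survey's 0-10 scale look like technical errors (a bad feed
# or logging issue); the rest are unusual but real responses.
technical = outliers[(outliers["satisfaction_score"] < 0) |
                     (outliers["satisfaction_score"] > 10)]
real_world = outliers.drop(technical.index)

# Write both lists out so IT and the analysts work from the same record.
technical.to_csv("anomalies_technical.csv", index=False)
real_world.to_csv("anomalies_real_world.csv", index=False)
```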
Select the best features at the lowest cost
IT involvement can also shape feature engineering, the selection of variables suspected to be the best predictors of the output we want from a model. The advanced analysis for a predictive model, whether a dataset examined in SPSS or code written in R or Python, produces metrics that let users compare the predictive influence of the selected variables. Variables with less statistical influence can be removed. The analysis can be iterative when variables are statistically too close to separate from a decision standpoint.
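A minimal sketch of that comparison in Python, assuming hypothetical survey features and a binary response column, could fit a model and rank the variables by predictive influence so the weakest candidates can be dropped:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical dataset: candidate features plus a binary response column.
df = pd.read_csv("passenger_survey_export.csv")
features = ["seat_class", "flight_delay_minutes", "loyalty_tier", "route_distance"]
X = pd.get_dummies(df[features])          # encode any categorical variables
y = df["responded_positively"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Rank variables by importance; low-ranking ones are candidates for removal.
importances = pd.Series(model.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False))
print("Holdout accuracy:", model.score(X_test, y_test))
```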
IT can offer alternative ways to conduct feature engineering when the data is complex to access. An enterprise may maintain its own databases with IT in charge of maintenance, so IT personnel can set up access rights for the data that will serve as variables. That arrangement is especially important if the model will be used for machine learning training. The better the preliminary effort to shorten exploratory data analysis, the better IT can prioritize the resources that support the data for that analysis and, in turn, highlight the business operations affected.
For example, you may want to examine metrics that highlight which customer segments lead to increased category spend, a useful analysis for retailers deciding which products to promote or for firms looking to upsell services to their best customers. Industry knowledge can determine whether second- or third-party data would inform a model for retaining customers longer, and IT can help identify the cost of obtaining that data and which operations any ongoing analysis would affect.
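As one hedged example of that kind of metric, assuming a retail transactions table with hypothetical customer_segment, category, and spend columns, a short pandas aggregation can surface where segment spend concentrates:

```python
import pandas as pd

# Hypothetical transactions table; segment labels would come from an earlier model.
tx = pd.read_csv("transactions.csv")  # columns: customer_segment, category, spend

# Total and average spend per category within each segment, to show where
# promotions or upsell offers are most likely to pay off.
segment_spend = (tx.groupby(["customer_segment", "category"])["spend"]
                   .agg(["sum", "mean", "count"])
                   .sort_values("sum", ascending=False))
print(segment_spend.head(10))
```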
Predictive analytics is meant to simulate business decisions and show their likely outcomes. For a predictive analytics study, IT can help identify the operational tradeoffs that affect the decisions being simulated. That can reduce technical debt, as well as other financial costs associated with the decision. The end result is an advanced analysis process that truly puts a firm on the fast track to understanding its customers.
Pierre DeBois is the founder of Zimana, a small business analytics consultancy that reviews data from Web analytics and social media dashboard solutions, then provides recommendations and Web development actions that improve marketing strategy and business profitability.