A brief overview of considerations

When it comes to deploying predictive analytics, there are several key considerations at different stages of the process:

  1. Deciding Whether to Deploy Predictive Analytics After a Proof of Concept
    • Revisit ethical issues: We have discussed the importance of ethical considerations at the start of a predictive analytics project and the need to return to those considerations, and to our code of ethics, throughout the life of a project. This is also critical when deciding whether and how to deploy predictive analytics. Ask: What may have changed since embarking on the predictive analytics project? What communication with stakeholders has taken place? Have new issues or concerns emerged? Do plans for how results will be used still fit your goals and current context? What findings from the proof of concept should prompt an update to the ethical considerations?

    • Predictive performance and bias: We have taken a deep dive into defining and measuring predictive performance and bias, and into the importance of doing so with held-out test data. The final template walks through this key step in the predictive analytics workflow.

    • Cost-Benefit Analysis: In a real-world application, it may also be helpful to determine whether the benefits of deploying the model outweigh the costs in terms of resources, time, and potential risks.

  2. Deploying Predictive Analytics
    • Organizational Capacity: While this should be an important consideration before embarking on a predictive analytics proof of concept, it is worth revisiting at this juncture. Assess whether the organization has the necessary infrastructure, technical expertise, and resources to implement and support predictive analytics.

    • Integration with Existing Systems: Determine how the predictive analytics will integrate with existing systems and workflows. This could be as simple as taking coefficients from a regression model and applying them to data in a spreadsheet (a minimal scoring sketch appears after this list), or it could be a much more complex data engineering endeavor.

    • Frequency of Predictions: Decide how often predictions will be made. This depends on the nature of the data, the purpose of the model, how frequently updated information is needed, how often new service recipients enter the picture, and operational capabilities.

    • User Training and Support: In a real-world application, careful thought would be needed about how to provide adequate training and support to users of the predictive analytics to ensure effective use.

  3. Ongoing Maintenance
    • Data Monitoring: After deployment, regularly check the data being fed into the model to ensure its quality and consistency; changes in data sources, formats, or distributions can affect model performance (a simple drift-check sketch appears after this list).
    • Monitoring Performance and Bias Metrics: It is also essential to continuously monitor the model for changes in performance and potential biases. This is crucial for maintaining the accuracy and fairness of the predictions, and it is very similar to rerunning the steps in 03_Learner_Testing with new data once the outcomes have been realized (a sketch of such a check appears after this list). A challenge in some applications is that outcomes may not be realized until long after predictions are made. For example, if we predict whether students will graduate high school at the time they enter high school (the prediction time point), then we would not know anything about the reliability or bias of our predictions until four years later. This is one reason why it is also key to look at changes in the data and to think through the implications of changes in context. Another strategy is to use additional test data as it becomes available; these data may not be part of the actual deployment but can provide another opportunity to check whether model performance is sufficient.
    • Model Refreshing: Based on the findings above, we may find that we need to update the model periodically in response to changes in data patterns or the external context. This involves retraining and revalidating, essentially iterating the same steps as in the proof of concept.
    • Response to Feedback and Changes: In a real-world application, it will also be critical to be responsive to feedback from users and stakeholders, as well as to any changes in the operational environment or objectives. The model might need adjustments to stay aligned with evolving business goals or market conditions, and communication and training might need adjustments as well.
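
To make the spreadsheet-style integration mentioned under "Integration with Existing Systems" concrete, here is a minimal Python sketch of applying stored regression coefficients to new records. The coefficient values, column names, and file names are hypothetical placeholders; a real deployment would pull these from the fitted model and the operational system.

```python
import math

import pandas as pd

# Hypothetical coefficients from a fitted logistic regression
# (intercept plus one weight per predictor); names and values are
# illustrative, not taken from any real model.
coefficients = {"intercept": -1.20, "attendance_rate": 2.50, "prior_gpa": 0.80}

def predict_probability(row):
    """Apply the stored coefficients to one record and return a predicted probability."""
    linear = (coefficients["intercept"]
              + coefficients["attendance_rate"] * row["attendance_rate"]
              + coefficients["prior_gpa"] * row["prior_gpa"])
    return 1 / (1 + math.exp(-linear))  # logistic (inverse logit) link

# New records exported from an operational system, e.g. a spreadsheet saved as CSV
# (file names are hypothetical).
new_data = pd.read_csv("new_service_recipients.csv")
new_data["predicted_probability"] = new_data.apply(predict_probability, axis=1)
new_data.to_csv("scored_service_recipients.csv", index=False)
```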
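
For the data monitoring bullet, a very simple drift check might compare summary statistics of incoming data against the training data. The sketch below assumes hypothetical file and column names and uses a crude mean-shift rule; a real deployment would likely add richer checks (missingness, category frequencies, distribution tests).

```python
import pandas as pd

def flag_shifted_columns(train, current, columns, threshold=0.25):
    """Flag numeric columns whose current mean has shifted by more than
    `threshold` training-data standard deviations from the training mean."""
    flags = {}
    for col in columns:
        train_mean, train_sd = train[col].mean(), train[col].std()
        shift = abs(current[col].mean() - train_mean) / train_sd
        flags[col] = bool(shift > threshold)
    return flags

# File and column names are placeholders for whatever the deployment
# pipeline actually produces.
train = pd.read_csv("training_data.csv")
current = pd.read_csv("current_month_data.csv")
print(flag_shifted_columns(train, current, ["attendance_rate", "prior_gpa"]))
```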
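
For monitoring performance and bias once outcomes are realized, a check in the spirit of 03_Learner_Testing might look like the following sketch. File and column names are hypothetical, and the metrics shown (AUC overall, true positive rate by subgroup) are just one reasonable choice; in practice the metrics should match those used in the proof of concept.

```python
import pandas as pd
from sklearn.metrics import recall_score, roc_auc_score

# Hypothetical file joining earlier predictions to outcomes that have since been
# realized; the column names (graduated, predicted_probability, student_group)
# are illustrative.
df = pd.read_csv("predictions_with_realized_outcomes.csv")
df["predicted_class"] = (df["predicted_probability"] >= 0.5).astype(int)

# Overall discrimination on the newly realized outcomes.
print("AUC:", roc_auc_score(df["graduated"], df["predicted_probability"]))

# One simple bias check: compare true positive rates across subgroups.
for group, subset in df.groupby("student_group"):
    tpr = recall_score(subset["graduated"], subset["predicted_class"])
    print(f"{group}: true positive rate = {tpr:.2f}")
```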

Each of these stages requires careful planning and consideration to ensure the successful deployment and ongoing effectiveness of predictive analytics in an organization. The goal is not only to deploy a technically sound model but also to ensure it remains relevant, fair, and valuable in its operational context.
