How to solve AI's reproducibility crisis
Reproducibility is often trampled underfoot in AI's rush to results, and the shift toward agile methodologies may only make the problem worse. Without reproducibility, you cannot really know what your AI system is doing or will do, and that is a serious risk when AI performs critical work: diagnosing medical conditions, driving trucks, screening for security threats, or managing just-in-time production flows.

Data scientists' natural inclination is to skimp on documentation in the interest of speed when developing, training, and iterating on machine learning, deep learning, and other AI models. But reproducibility depends on knowing the exact sequence of steps that produced a specific data-driven model, process, or decision.
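In practice, recording that sequence of steps can start with something as simple as a run manifest: fix the random seed and fingerprint the training data and hyperparameters, so any run can be replayed bit-for-bit. Here is a minimal sketch using only the Python standard library; the `run_manifest` and `train` helpers are hypothetical stand-ins, not any particular ML library's API:

```python
import hashlib
import json
import random

def run_manifest(data, params, seed):
    """Build a reproducibility record: data fingerprint, hyperparameters, seed."""
    data_hash = hashlib.sha256(json.dumps(data, sort_keys=True).encode()).hexdigest()
    return {"data_sha256": data_hash, "params": params, "seed": seed}

def train(data, params, seed):
    """Stand-in for model training: fully seeded, so the same inputs
    always yield the same 'model'."""
    random.seed(seed)
    # Toy "model": a shuffled copy of the data scaled by a hyperparameter.
    return [x * params["lr"] for x in random.sample(data, len(data))]

data = [1.0, 2.0, 3.0, 4.0]
params = {"lr": 0.1}
seed = 42

manifest = run_manifest(data, params, seed)

# Train once, then replay the run from the manifest's recorded inputs.
model_a = train(data, manifest["params"], manifest["seed"])
model_b = train(data, manifest["params"], manifest["seed"])
assert model_a == model_b  # identical manifests reproduce identical models
```

The same idea scales up: real pipelines swap the toy trainer for an actual model and store the manifest alongside the model artifact, so an audit can verify which data, parameters, and seed produced any given decision.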