One word: underspecification. https://arxiv.org/abs/2011.03395
Perspective: draw an analogy between geophysical data-processing methods (Wiener-Levinson, homomorphic, and minimum-entropy deconvolution) and ML models (logistic regression, neural networks, LSTMs, GANs): in both cases, the ability to generalize depends on the data. Both rely on statistics, and both are straightforward to execute once implemented on a computer. But getting consistent results is anything but straightforward. Heard the phrase "it is more an art than a science"?
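To make the "art" concrete, here is a minimal sketch of frequency-domain Wiener deconvolution (one of the methods named above). Everything here is illustrative: the synthetic reflectivity, the wavelet, and the `nsr` noise-to-signal parameter are all assumptions, not from the source. The point is that a single tuning choice like `nsr` can visibly change the recovered result.

```python
import numpy as np

def wiener_deconvolve(trace, wavelet, nsr=0.01):
    """Frequency-domain Wiener deconvolution.

    nsr is an assumed noise-to-signal ratio; the recovered
    reflectivity changes markedly with this one choice, which
    is part of the 'art' in getting consistent results.
    """
    n = len(trace)
    H = np.fft.rfft(wavelet, n)          # wavelet spectrum
    Y = np.fft.rfft(trace, n)            # recorded-trace spectrum
    # Wiener filter: conj(H) / (|H|^2 + nsr)
    X = Y * np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.fft.irfft(X, n)

# toy example: spiky reflectivity convolved with a decaying wavelet
rng = np.random.default_rng(0)
reflectivity = np.zeros(256)
reflectivity[[40, 100, 180]] = [1.0, -0.7, 0.5]
t = np.arange(32)
wavelet = np.exp(-0.3 * t) * np.cos(0.6 * t)
trace = np.convolve(reflectivity, wavelet)[:256]
trace += 0.01 * rng.standard_normal(256)

est = wiener_deconvolve(trace, wavelet, nsr=0.01)
```

Rerunning with a different `nsr` (say 0.5) over-smooths the spikes; too small an `nsr` amplifies noise. The same sensitivity to one regularization knob shows up in ML models as sensitivity to hyperparameters and data distribution.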
"Art" is attributable to how well a deconvolution algorithm or ML model works on a certain subset of data, a type of distribution (shape of underlying variance), or on slightly adulterated data (often for convenience, e.g., auto-scaling to minimize influence of noise or undersampling).
By attending focused workshops and sampling across sessions, we can pick up actionable, practical tools for diagnosing before-and-after results and narrowing the avenues for further investigation. So instead of tackling 3 different techniques, 5 independent variations, and 10 hyperparameters, a daunting 150 possible combinations, we become wiser at adopting a fail-fast approach to home in on the most promising selections. EiD 2021 can help us develop a framework that delivers consistent ML results and a closed loop for rapid cross-discipline engagement to tackle the energy storage challenge (highlighted by the Texas freeze). Compress cycle time and deliver cost savings.
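The fail-fast idea above can be sketched as a successive-halving style screen: evaluate all 150 combinations cheaply, keep the best third, and re-evaluate the survivors with a bigger budget. The search space, the `cheap_score` function, and the budget schedule below are all hypothetical stand-ins for a real evaluation pipeline.

```python
import itertools

# hypothetical search space:
# 3 techniques x 5 variations x 10 hyperparameters = 150 combinations
techniques = ["wiener", "homomorphic", "min_entropy"]
variations = [f"v{i}" for i in range(5)]
hparams = [0.1 * i for i in range(1, 11)]
combos = list(itertools.product(techniques, variations, hparams))
assert len(combos) == 150

def cheap_score(combo, budget):
    """Stand-in for a quick evaluation on a small data subset.
    A real pipeline would run the technique with the given budget
    (e.g., fewer training epochs or a smaller trace window)."""
    tech, var, hp = combo
    # deterministic toy score in [0, 1); NOT a real quality metric
    return (len(tech) * 7 + int(var[1:]) * 3 + hp * 10) % 1.0

# fail-fast screen: score everything with a small budget,
# keep the top third, then re-score survivors with 3x the budget
survivors, budget = combos, 1
while len(survivors) > 5:
    ranked = sorted(survivors, key=lambda c: cheap_score(c, budget),
                    reverse=True)
    survivors = ranked[: max(5, len(ranked) // 3)]
    budget *= 3

print(len(survivors))  # → 5 promising configurations out of 150
```

This is the same winnowing logic behind successive-halving hyperparameter search: spend almost nothing on the obvious losers and reserve the expensive, full-budget runs for the handful of promising selections.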