People keep talking about the fear that inflated expectations will eventually yield another winter for Artificial Intelligence. If anything, this year saw the Fall of Everything Else: Winter is coming to these fields, and it takes the shape of the Great Convergence.
The Great Convergence is the moment when several engineering fields and Machine Learning begin to use very similar, if not identical, algorithms.
Well, for one, we have witnessed several reconstruction solvers that can be built from data (also here, here, here, here). It's not just that: the same goes for matrix factorization (cocktail party source separation, or here), and, as Tomasz pointed out recently, it is also the case in more traditional computer vision techniques (learning optical flow, or deblurring) and even in the investigation of new imaging modalities using ML algorithms (hence this recent entry: Hamming's Time: Making Hyperspectral Imaging Mainstream).
We have also seen engineering physics getting closer to traditional signal processing (see Physics-driven inverse problems made tractable with cosparse regularization; Sunday Morning Insight: Physics Driven Sensor Design?; and the Sunday Morning Insight entry from two years ago entitled "The Linear Boltzmann Equation and Co-Sparsity"), and convex formulations being used in more traditional machine learning settings (3D Shape Reconstruction from 2D Landmarks: A Convex Formulation; Sparse Reinforcement Learning via Convex Optimization).
We should rejoice in the Great Convergence in the same way we rejoice in the winter solstice. The Fall of Everything Else is a good thing.
h/t to Tomasz Malisiewicz, Laurent Duval, and Laurent Daudet for insightful discussions.
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization, and calibration issues on LinkedIn.