In the past week, we have had two public reviews of preprints mentioned here on Nuit Blanche. Is this how post-publication peer review should work? I don't know. Let's examine what happened:
Peyman Milanfar provided some thoughtful comments on the entry Correcting Camera Shake by Incremental Sparse Approximation - implementation -. As a result, Paul Shearer updated his paper based on Peyman's Google+ feedback. Here is the new version: Correcting Camera Shake by Incremental Sparse Approximation by Paul Shearer, Anna C. Gilbert, Alfred O. Hero III.
Zeno Gantner provided a thoughtful comment explaining why the paper featured in That Netflix RMSE is way too low or is it ? ( Clustering-Based Matrix Factorization - implementation -) was not giving the right result:
....Found your bug in the evaluation. First of all, that rounding method. No. So wrong in so many ways.

Independently of that, you use long for the variable totalErr. long is an integral type, not a floating point type. You must use double, otherwise you can lose precision. And you do, by rounding to zero, I guess, which makes your RMSEs significantly lower. If you had written

totalErr = totalErr + errortemp;

instead of

totalErr += errortemp;

there would have been a warning about it:

possible loss of precision
found   : float
required: long
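To see why this matters, here is a minimal Java sketch of the class of bug Zeno describes. It is not the paper's actual code: the variable names totalErr and errortemp come from the quote, everything else (the method names, the sample errors) is assumed for illustration. Java's compound assignment silently inserts a narrowing cast, so each squared error is truncated toward zero before being added to a long accumulator.

```java
public class RmseBug {
    // Buggy version: a long accumulator truncates each squared error.
    // totalErr += errortemp compiles because the compound assignment
    // hides an implicit (long) cast; totalErr = totalErr + errortemp
    // would fail with "possible loss of precision".
    static double rmseWithLong(float[] errors) {
        long totalErr = 0;
        for (float errortemp : errors) {
            totalErr += errortemp * errortemp; // each 0.81 truncates to 0
        }
        return Math.sqrt((double) totalErr / errors.length);
    }

    // Fixed version: a double accumulator keeps the fractional parts.
    static double rmseWithDouble(float[] errors) {
        double totalErr = 0;
        for (float errortemp : errors) {
            totalErr += errortemp * errortemp;
        }
        return Math.sqrt(totalErr / errors.length);
    }

    public static void main(String[] args) {
        float[] errors = {0.9f, 0.9f, 0.9f, 0.9f};
        System.out.println("long accumulator:   " + rmseWithLong(errors));   // 0.0
        System.out.println("double accumulator: " + rmseWithDouble(errors)); // ~0.9
    }
}
```

With every per-rating squared error below 1.0, the long accumulator stays at zero and the reported RMSE collapses to 0.0 instead of roughly 0.9, which is exactly how a "too good to be true" RMSE can appear.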
After thanking Zeno Gantner several times, the original author did something I half expected. He removed the preprint from Arxiv (a good thing) and also removed the entire conversation from LinkedIn. I am OK with both actions, since I was able to salvage the part pointing to the error (see above). With post-publication peer review, we need some mechanism by which a lesson like this one remains available to future researchers, so they don't make the same mistake.
The second blog entry received a sizable 10 G+ recommendations, a score reflecting readers' interest in the paper's potentially groundbreaking result.
I like how the discussion groups on the Google+ Community (212 members), the CompressiveSensing subreddit (79), the LinkedIn Compressive Sensing group (2043), and the Matrix Factorization group (539) helped get this conversation going between people who would otherwise likely never talk to each other.
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.