Wednesday, February 11, 2015

Paris Machine Learning #6 Season 2: Vowpal Wabbit, RL, Inmoov, libFM and more

The video of the event is below; except for John Langford's presentation, the video is in French. All the presentation slides, however, are in English and are listed below. Thank you to Ecole 42 for hosting us and providing much help during the course of the event (Thank you Charles and Matthieu!).


We now have a Twitter account: ParisMLgroup and our hashtag is #MLParis.

The program and presentation slides:
Abstract: libFM: Factorization Machines
Factorization approaches provide high accuracy in several important prediction problems, for example, recommender systems. However, applying factorization approaches to a new prediction problem is a nontrivial task and requires a lot of expert knowledge. Typically, a new model is developed, a learning algorithm is derived, and the approach has to be implemented. Factorization machines (FM) are a generic approach since they can mimic most factorization models just by feature engineering. This way, factorization machines combine the generality of feature engineering with the superiority of factorization models in estimating interactions between categorical variables over large domains.
I will present how different generic approaches can be modeled with libFM, and show how some people use libFM to win Kaggle competitions.
libFM is a software implementation of factorization machines that features stochastic gradient descent (SGD) and alternating least-squares (ALS) optimization, as well as Bayesian inference using Markov Chain Monte Carlo (MCMC).
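To make the abstract concrete: a degree-2 factorization machine scores an input x as a bias, a linear term, and a sum of pairwise interactions whose weights are inner products of learned factor vectors. The pairwise sum can be computed in O(kn) rather than O(n²) via a well-known reformulation. This is a minimal NumPy sketch of the prediction function only (not libFM's actual code or its SGD/ALS/MCMC learning); the parameter names `w0`, `w`, and `V` are illustrative.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Degree-2 factorization machine prediction:
        y(x) = w0 + <w, x> + sum_{i<j} <V[i], V[j]> * x_i * x_j
    The pairwise term is computed in O(k*n) using the identity
        sum_{i<j} <V[i],V[j]> x_i x_j
          = 0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ].
    x: feature vector of shape (n,); w: linear weights (n,);
    V: factor matrix of shape (n, k), one k-dim factor vector per feature."""
    linear = w0 + x @ w
    s = V.T @ x                      # per-factor sums, shape (k,)
    s_sq = (V ** 2).T @ (x ** 2)     # per-factor sums of squares, shape (k,)
    pairwise = 0.5 * np.sum(s ** 2 - s_sq)
    return linear + pairwise
```

Because interactions are factorized through V, the model can estimate a weight for a feature pair never observed together in training, which is the property the abstract refers to as superiority on large categorical domains.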
