Monday, February 10, 2014

Paris Machine Learning Meetup #8: Finding a Needle in a Haystack, Beyond SGD, Analyzing Wikipedia, Kolibree, Winning Kaggle "Dogs vs Cats"




On Wednesday evening, we will have five speakers for the 8th Paris Machine Learning Meetup: Lenka Zdeborova, Francis Bach, Guillaume Pitel, Pierre Sermanet and Loïc Cessot. The meetup will be hosted by DojoCrea/DojoEvents. You can register here. Please change your RSVP if you are not coming; it helps us in the long run.

Lenka will talk to us about interesting results coming up in clustering and other problems, and Francis will review some of the latest developments in convex optimization for robust and efficient large-scale supervised machine learning. Guillaume will show his project on making sense of Wikipedia. Loïc will present a Wi-Fi-connected toothbrush that involves some machine learning, and finally, live from New York, Pierre will talk to us about winning the latest Kaggle competition on recognizing dogs vs. cats. Currently, we expect most of these presentations to be in French, but the slides will be in English and will be available from the archives. The meetup will be streamed live on Google+ Hangout starting at 7:30 PM Paris time.

For any job announcement, please post it directly to the Paris Machine Learning LinkedIn group.

Here are the summaries: 

Lenka Zdeborova, IPT, CEA

Title: How hard is it to find a needle in the haystack?
Everybody has noticed that some problems in life are easy, some are hard and some are impossible. In machine learning this fact can be nicely formalized on simple examples. One example of interest in many applications is the clustering of data. Let us take a set of people, split them into several groups, and consider that a given person is friends with another person from the same group with probability p_in, and with a person from another group with probability p_out. Then we give the list of who is friends with whom to a clever student and ask him to infer how the people were placed into the groups in the first place. Depending on the number of groups and the parameters p_in and p_out, one can analyze whether the task we gave the student is impossible, easy, or possible but hard. We will describe these (sometimes counter-intuitive) results and their implications for the more generic task of clustering data. If time permits, or during the discussion, we will talk about other problems of interest in machine learning and signal processing where such situations have been analyzed.
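
For readers who want to play with the model Lenka describes, here is a minimal Python sketch (not her analysis, and with purely illustrative parameters): it generates a two-group friendship graph with within-group probability p_in and between-group probability p_out, then tries to recover the hidden groups with a naive spectral split.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p_in, p_out = 200, 0.10, 0.02
    labels = np.repeat([0, 1], n // 2)                  # hidden group assignment

    # Edge probability depends on whether the two people share a group
    prob = np.where(labels[:, None] == labels[None, :], p_in, p_out)
    upper = np.triu(rng.random((n, n)) < prob, k=1)     # sample the upper triangle only
    A = (upper | upper.T).astype(float)                 # symmetric "who is friends with whom" matrix

    # Naive spectral guess: split on the sign of the second leading eigenvector
    _, eigvecs = np.linalg.eigh(A)
    guess = (eigvecs[:, -2] > 0).astype(int)

    # Group labels are only defined up to a swap, so keep the better matching
    accuracy = max(np.mean(guess == labels), np.mean(guess != labels))
    print(f"fraction of correctly recovered people: {accuracy:.2f}")

With these parameters the groups are easy to recover; pushing p_in and p_out closer together gives a feel for the harder regimes she will discuss.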

Francis Bach, INRIA

Title: Beyond stochastic gradient descent for large-scale machine learning.
In this talk, I review some of the latest developments in convex optimization for robust and efficient large-scale supervised machine learning problems.
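
As a rough illustration of the kind of idea that goes "beyond" plain stochastic gradient descent (a toy sketch only, not necessarily the material of the talk), here is a small Python example running SGD on a made-up least-squares problem while also maintaining a Polyak-Ruppert average of the iterates; the dimensions, noise level and step sizes are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    d, n = 10, 10000
    w_true = rng.normal(size=d)
    X = rng.normal(size=(n, d))
    y = X @ w_true + 0.5 * rng.normal(size=n)       # noisy linear observations

    w = np.zeros(d)        # current SGD iterate
    w_avg = np.zeros(d)    # running Polyak-Ruppert average of the iterates
    for t in range(n):
        i = rng.integers(n)                         # pick one observation at random
        grad = (X[i] @ w - y[i]) * X[i]             # stochastic gradient of 0.5 * (x_i . w - y_i)^2
        w -= 0.1 / np.sqrt(t + 1) * grad            # decaying step size
        w_avg += (w - w_avg) / (t + 1)              # online average of all iterates so far

    print("last-iterate error:", np.linalg.norm(w - w_true))
    print("averaged error    :", np.linalg.norm(w_avg - w_true))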

Guillaume Pitel, eXenSa

Title: Analyzing Wikipedia from every angle with NCISC
I will present the use of an algorithm of my own design (NCISC) to analyze Wikipedia along several axes: categories, textual content and links between pages. I will quickly describe the complete pipeline of such a project, which we have entirely rewritten in Spark+Scala since October/November. I will give a few examples of results obtained with our approach on standard datasets, comparing it, for instance, to deep-learning-oriented methods.
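
NCISC itself is eXenSa's own algorithm, so the sketch below is only a hypothetical stand-in (written with PySpark rather than Scala, and assuming a pre-parsed dump with a made-up file name and record schema) showing the kind of Spark pipeline that extracts the three views mentioned above: categories, textual content and the link graph.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("wiki-views").getOrCreate()

    # Hypothetical input: one JSON record per page, e.g.
    # {"title": ..., "text": ..., "categories": [...], "links": [...]}
    pages = spark.read.json("wikipedia_pages.jsonl")

    # View 1: page -> category pairs
    categories = pages.select("title", F.explode("categories").alias("category"))

    # View 2: bag-of-words representation of the textual content
    words = pages.select("title", F.explode(F.split(F.lower("text"), r"\W+")).alias("word"))
    word_counts = words.groupBy("title", "word").count()

    # View 3: the link graph between pages
    links = pages.select(F.col("title").alias("src"), F.explode("links").alias("dst"))

    categories.show(5)
    word_counts.show(5)
    links.show(5)
    spark.stop()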

Loïc Cessot, Kolibree

Title: Kolibree, the world's first connected toothbrush.
It is a connected device, presented at CES 2014 in Las Vegas, that will incorporate machine learning algorithms.

Pierre Sermanet, NYU, http://cs.nyu.edu/~sermanet/

Pierre has just won the Kaggle Dogs vs. Cats recognition competition (http://www.kaggle.com/c/dogs-vs-cats/leaderboard). He will tell us how he reached first place using deep learning algorithms.
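
As a purely illustrative aside (this is not Pierre's model, which he will describe at the meetup), here is a minimal, hypothetical PyTorch sketch of a tiny convolutional network for a two-class dog-vs-cat image classifier; it only shows the general shape of the deep learning models involved.

    import torch
    import torch.nn as nn

    class TinyConvNet(nn.Module):
        # A deliberately small convolutional classifier for 64x64 RGB images
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 16 * 16, 2)   # two outputs: dog, cat

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    model = TinyConvNet()
    fake_batch = torch.randn(4, 3, 64, 64)    # stand-in for a batch of real images
    print(model(fake_batch).shape)            # torch.Size([4, 2])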

The organizers: Franck Bardol, Frederic Dembak, Igor Carron.

Please note:

* If you cannot make it, thank you in advance for changing your RSVP.


