Wednesday, February 13, 2019

The 100th Paris Machine Learning meetup tonight: La revanche des neurones, DL on Health Records, Search-Oriented Conversational AI, Nanotechnology and Electricity Consumption





To attend the meetup, you MUST register on Eventbrite and bring an identity card in order to enter.
Link : eventbrite ticket (https://bit.ly/2I1XXL4)

Thanks to Scaleway for hosting us and providing the catering!
The video streaming is below. 




As usual, there is no waiting list and no reserved seats: first come, first served (the room has a capacity of 120 seats).

Schedule :
6:45 PM : door opening
7:00 PM : intro + speakers
9:00 PM : networking - cocktail
10:30 PM : door closing

Speakers


La controverse entre IA symbolique et connexioniste (the controversy between symbolic and connectionist AI)

Recent advances in artificial intelligence present a high energy cost, which poses a problem both for the environment and for their integration into connected objects.
This energy cost is due to the fact that AI algorithms are implemented on conventional computers, which are poorly adapted to them.
A promising approach is to use the brain as a model for developing new types of computers that use less energy. In this presentation I explore some key ideas, including the proximity of computation and memory as well as the management of errors and randomness.
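
As a hedged aside (not taken from the talk itself), stochastic computing is one classical illustration of "managing errors and randomness": a value in [0, 1] is encoded as the density of 1s in a random bit stream, a single AND gate then multiplies two such values, and occasional bit flips only degrade precision slightly instead of corrupting the result. A minimal NumPy sketch:

```python
# Illustrative sketch (not from the talk): stochastic computing.
# A number p in [0, 1] is encoded as a random bit stream whose density of 1s
# equals p; multiplying two numbers then only needs a bitwise AND.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000          # stream length: longer streams give more precise results

def encode(p, n=N):
    """Encode probability p as a random bit stream with mean p."""
    return rng.random(n) < p

a, b = 0.6, 0.3
product_stream = encode(a) & encode(b)   # an AND gate multiplies the two values
print(product_stream.mean())             # ~0.18, i.e. a * b, up to sampling noise
```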
Fajwel Fogel (Sancare), Deep Learning on Health Records, www.sancare.fr
Data scientists from Sancare will provide an overview of some of the challenges faced when training deep learning models on electronic health records (EHR), such as robustness and the ability to provide explanations.
Training deep learning models on electronic health records (EHR) can be prohibitively expensive in terms of computational cost.
Datasets typically include millions of records, each containing several thousand words.
Moreover, due to the sensitive nature of EHR, all computations must be performed on-premise, i.e., on the campus of the hospital, where GPU resources are usually rare or non-existent.
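
To make the setting concrete, here is a minimal, hedged sketch (assumed architecture, vocabulary size and label count, not Sancare's actual model) of the kind of lightweight text classifier one might train over long clinical notes when on-premise GPU resources are scarce:

```python
# Minimal sketch (assumptions, not Sancare's model): a bag-of-embeddings
# classifier over clinical notes, whose cost grows gently with record length.
import torch
import torch.nn as nn

VOCAB_SIZE = 50_000   # assumed vocabulary size after tokenisation
EMBED_DIM = 128
NUM_CODES = 300       # assumed number of target diagnosis/billing codes

class NoteClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # EmbeddingBag averages token embeddings, avoiding a costly
        # sequence model over several-thousand-word records.
        self.embed = nn.EmbeddingBag(VOCAB_SIZE, EMBED_DIM, mode="mean")
        self.fc = nn.Linear(EMBED_DIM, NUM_CODES)

    def forward(self, token_ids, offsets):
        return self.fc(self.embed(token_ids, offsets))

model = NoteClassifier()
# Two toy "records" packed into one flat tensor with offsets, as EmbeddingBag expects.
tokens = torch.randint(0, VOCAB_SIZE, (5000,))
offsets = torch.tensor([0, 3000])       # record 1: tokens[0:3000], record 2: the rest
logits = model(tokens, offsets)         # shape: (2, NUM_CODES)
print(logits.shape)
```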

Chatbots and intelligent personal assistants (such as Siri, Cortana, the Google Assistant, and Amazon Alexa) are increasingly being used for a variety of purposes, including information access and retrieval.
These dialog systems enable naturalistic human-like interactions where the information needs are expressed in natural language.
Unlike in traditional search engines, where a user-issued query is answered with a search result page, conversational agents can respond in a variety of ways, for example, asking questions back to the user for clarification.
In this talk, I will present our paper "A Reinforcement Learning-driven Translation Model for Search-Oriented Conversational Systems", where we focus on understanding natural language expressions in order to build keyword-based queries. We propose a reinforcement learning-driven translation model framework able to
1) learn the translation from NL expressions to queries in a supervised way, and
2) overcome the lack of large-scale datasets by framing the translation model as a word-selection approach and injecting relevance feedback into the learning process.
Experiments are carried out on two TREC datasets and show the effectiveness of our approach.
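
As a rough, hedged sketch of the word-selection framing (toy dimensions and a placeholder reward, not the authors' code), a per-word keep/drop policy over the user's utterance can be trained with REINFORCE, with relevance feedback standing in as the reward signal:

```python
# Hedged sketch of the general idea (not the paper's implementation):
# a policy scores each word of the NL request, samples a keep/drop decision
# per word, and is trained with REINFORCE on a relevance-feedback reward.
import torch
import torch.nn as nn

class WordSelector(nn.Module):
    def __init__(self, vocab_size=10_000, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.score = nn.Linear(dim, 1)   # per-word keep probability

    def forward(self, token_ids):
        return torch.sigmoid(self.score(self.embed(token_ids))).squeeze(-1)

def relevance_reward(kept_mask):
    # Placeholder: in the paper's setting this would come from the
    # retrieval system's relevance feedback on the generated keyword query.
    return kept_mask.float().mean()

policy = WordSelector()
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

utterance = torch.randint(0, 10_000, (12,))     # toy tokenised NL request
probs = policy(utterance)
keep = torch.bernoulli(probs)                   # sample which words form the query
log_prob = (keep * probs.clamp_min(1e-8).log()
            + (1 - keep) * (1 - probs).clamp_min(1e-8).log()).sum()
loss = -relevance_reward(keep) * log_prob       # REINFORCE: maximise expected reward
opt.zero_grad(); loss.backward(); opt.step()
```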

We use open data and machine learning to compute and forecast where and how the electricity you consume has been produced.
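
As a hedged illustration of this kind of pipeline (entirely synthetic data, not the speakers' actual system), one can fit a regressor that maps open-data signals such as weather and demand to the share of a given source in the electricity mix:

```python
# Rough sketch on synthetic data: predict the share of wind in the mix
# from hourly open-data signals (wind speed, demand, hour of day).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
hours = np.arange(24 * 60)                          # 60 days of hourly data
wind_speed = 5 + 3 * np.sin(hours / 17) + rng.normal(0, 1, hours.size)
demand = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)
wind_share = np.clip(0.04 * wind_speed - 0.002 * demand + 0.1, 0, 1)

X = np.column_stack([wind_speed, demand, hours % 24])
y = wind_share

# Train on the first 50 days, forecast the remaining 10.
split = 24 * 50
model = GradientBoostingRegressor().fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("mean absolute error:", np.abs(pred - y[split:]).mean())
```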



Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
