tag:blogger.com,1999:blog-61419802024-03-10T22:23:04.917-05:00Nuit Blanche"Defeating the data tsunami one algorithm at a time". Nuit Blanche covers compressive sensing, advanced matrix factorization, random numerical linear algebra and all their applications. Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.comBlogger4854125tag:blogger.com,1999:blog-6141980.post-45380577600786218492023-08-17T05:00:00.004-05:002023-08-17T05:00:00.146-05:00Large Language Models and Transformers (Videos, Simons Institute for the Theory of Computing)<div style="text-align: justify;">As some of you may know, <a href="https://lighton.ai" target="_blank">LightOn</a> has built a few Large Language Models, and we are now making them usable to Enterprise customers. In the meantime and on the theoretical side of things, the <a href="https://simons.berkeley.edu/workshops/large-language-models-transformers/schedule" target="_blank">Simons Institute for the Theory of Computing has organized a workshop on the topic of Large Language Models and Transformers</a>. The program is listed below, every link links to the video of the talk (that includes streaming this week).</div><div><br /></div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="289" src="https://www.youtube.com/embed/AKMuA_TVz3A" width="419" youtube-src-id="AKMuA_TVz3A"></iframe></div><br /><div><br /></div><div style="text-align: justify;">Monday, Aug. 14, 2023</div><div style="text-align: justify;"><ul><li>9:15 – 10:15 a.m. <a href="https://simons.berkeley.edu/talks/yin-tat-lee-microsoft-research-2023-08-14">Sparks of Artificial General Intelligence</a>, <a href="https://yintat.com/">Yin Tat Lee (Microsoft Research)</a></li><li>11 a.m. – 12 p.m. 
<a href="https://simons.berkeley.edu/talks/yejin-choi-university-washington-2023-08-14">Possible Impossibilities and Impossible Possibilities</a>, <a href="https://homes.cs.washington.edu/~yejin/">Yejin Choi (University of Washington)</a></li><li>1:30 – 2:30 p.m. <a href="https://simons.berkeley.edu/talks/christopher-d-manning-stanford-university-2023-08-14">Towards Reliable Use of Large Language Models: Better Detection, Consistency, and Instruction-Tuning</a>, <a href="https://nlp.stanford.edu/~manning/">Christopher D. Manning (Stanford University)</a></li><li>3 – 4 p.m. <a href="https://simons.berkeley.edu/talks/ilya-sutskever-openai-2023-08-14">An observation on Generalization</a>, <a href="https://www.cs.toronto.edu/~ilya/">Ilya Sutskever (OpenAI)</a></li><li>4 – 4:45 p.m. <a href="https://simons.berkeley.edu/talks/2023-08-14">Panel Discussion (moderated by Alexei Efros)</a></li></ul></div><div style="text-align: justify;">Tuesday, Aug. 15, 2023</div><div style="text-align: justify;"><ul><li>9 – 10 a.m. <a href="https://simons.berkeley.edu/talks/yasaman-bahri-google-deepmind-2023-08-15">Understanding the Origins and Taxonomy of Neural Scaling Laws</a>, <a href="https://simons.berkeley.edu/workshops/large-language-models-transformers/schedule">Yasaman Bahri (Google DeepMind)</a></li><li>10 – 11 a.m. <a href="https://simons.berkeley.edu/talks/sasha-rush-cornell-university-hugging-face-2023-08-15">Scaling Data-Constrained Language Models</a>, <a href="https://rush-nlp.com/">Sasha Rush (Cornell University & Hugging Face)</a></li><li>11:30 a.m. – 12:30 p.m. <a href="https://simons.berkeley.edu/talks/sanjeev-arora-princeton-university-2023-08-15">A Theory for Emergence of Complex Skills in Language Models</a>, <a href="https://www.cs.princeton.edu/~arora/">Sanjeev Arora (Princeton University)</a></li><li>2 – 3 p.m. 
<a href="https://simons.berkeley.edu/talks/miles-cranmer-flatiron-institute-2023-08-15">Interpretability via Symbolic Distillation</a>, <a href="https://astroautomata.com/">Miles Cranmer (Flatiron Institute)</a></li><li>3:30 – 4:30 p.m. <a href="https://simons.berkeley.edu/talks/colin-raffel-university-north-carolina-hugging-face-2023-08-15">Build an Ecosystem, Not a Monolith</a>, <a href="https://colinraffel.com/">Colin Raffel (University of North Carolina & Hugging Face)</a></li><li>4:30 – 5:30 p.m. <a href="https://simons.berkeley.edu/talks/adam-tauman-kalai-microsoft-2023-08-15">How to Use Self-Play for Language Models to Improve at Solving Programming Puzzles</a>, <a href="https://www.microsoft.com/en-us/research/people/adum/">Adam Tauman Kalai (Microsoft)</a></li></ul></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Wednesday, Aug. 16, 2023</div><div style="text-align: justify;"><ul><li>9 – 10 a.m. <a href="https://simons.berkeley.edu/talks/pamela-samuelson-uc-berkeley-2023-08-16">Large Language Models Meet Copyright Law</a>, <a href="https://www.law.berkeley.edu/our-faculty/faculty-profiles/pamela-samuelson/#tab_profile">Pamela Samuelson (UC Berkeley)</a></li><li>10 – 10:45 a.m. <a href="https://simons.berkeley.edu/talks/2023-08-16">Panel Discussion (moderated by Shafi Goldwasser)</a></li><li>11:15 a.m. – 12:15 p.m. <a href="https://simons.berkeley.edu/talks/yonatan-belinkov-technion-israel-institute-technology-2023-08-16">On Localization in Language Models</a>, <a href="https://belinkov.com/">Yonatan Belinkov (Technion - Israel Institute of Technology)</a></li><li>2 – 3 p.m. <a href="https://simons.berkeley.edu/talks/jacob-steinhardt-uc-berkeley-2023-08-16">Language Models as Statisticians, and as Adapted Organisms</a>, <a href="https://jsteinhardt.stat.berkeley.edu/">Jacob Steinhardt (UC Berkeley)</a></li><li>3:30 – 4:30 p.m. 
<a href="https://simons.berkeley.edu/talks/nicholas-carlini-google-deepmind-2023-08-16">Are Aligned Language Models “Adversarially Aligned”?</a>, <a href="https://nicholas.carlini.com/">Nicholas Carlini (Google DeepMind)</a></li><li>4:30 – 5:30 p.m. <a href="https://simons.berkeley.edu/talks/paul-christiano-alignment-research-center-2023-08-16">Formalizing Explanations of Neural Network Behaviors</a>, <a href="https://paulfchristiano.com/">Paul Christiano (Alignment Research Center)</a></li></ul></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Thursday, Aug. 17, 2023</div><div style="text-align: justify;"><ul><li>9 – 10 a.m. <a href="https://simons.berkeley.edu/talks/steven-piantadosi-uc-berkeley-2023-08-17">Meaning in the age of large language models</a>, <a href="http://colala.berkeley.edu/people/piantadosi/">Steven Piantadosi (UC Berkeley)</a></li><li>10 – 11 a.m. <a href="https://simons.berkeley.edu/talks/josh-tenenbaum-mit-2023-08-17">Word Models to World Models</a>, <a href="http://web.mit.edu/cocosci/josh.html">Josh Tenenbaum (MIT)</a></li><li>11:30 a.m. – 12:30 p.m. <a href="https://simons.berkeley.edu/talks/jitendra-malik-uc-berkeley-2023-08-17">Beyond Language: Scaling up Robot Ontogeny</a>, <a href="http://people.eecs.berkeley.edu/~malik/">Jitendra Malik (UC Berkeley)</a></li><li>2 – 3 p.m. <a href="https://simons.berkeley.edu/talks/dan-klein-uc-berkeley-2023-08-17">Are LLMs the Beginning or End of NLP?</a>, <a href="https://www2.eecs.berkeley.edu/Faculty/Homepages/klein.html">Dan Klein (UC Berkeley)</a></li><li>3:30 – 4:30 p.m. <a href="https://simons.berkeley.edu/talks/diyi-yang-stanford-university-2023-08-17">Human-AI Interaction in the Age of Large Language Models</a>, <a href="https://cs.stanford.edu/~diyiy/">Diyi Yang (Stanford University)</a></li><li>4:30 – 5:30 p.m. 
<a href="https://simons.berkeley.edu/talks/scott-aaronson-ut-austin-openai-2023-08-17">Watermarking of Large Language Models</a>, <a href="https://www.scottaaronson.com/">Scott Aaronson (UT Austin & OpenAI)</a></li></ul></div><div style="text-align: justify;">Friday, Aug. 18, 2023</div><div style="text-align: justify;"><ul><li>9 – 10 a.m. <a href="https://simons.berkeley.edu/talks/gregory-valiant-stanford-university-2023-08-18">In-Context Learning: A Case Study of Simple Function Classes</a>, <a href="https://theory.stanford.edu/~valiant/">Gregory Valiant (Stanford University)</a></li><li>10 – 11 a.m. <a href="https://simons.berkeley.edu/talks/surya-ganguli-stanford-university-2023-08-18">Pretraining Task Diversity and the Emergence of Non-Bayesian In-Context Learning for Regression</a>, <a href="https://profiles.stanford.edu/surya-ganguli">Surya Ganguli (Stanford University)</a></li><li>11:30 a.m. – 12:30 p.m. <a href="https://simons.berkeley.edu/talks/ludwig-schmidt-university-washington-2023-08-18">A data-centric view on reliable generalization: From ImageNet to LAION-5B</a>, <a href="https://www.engr.washington.edu/facresearch/newfaculty/2021/schmidt">Ludwig Schmidt (University of Washington)</a></li><li>2 – 3:30 p.m. <a href="https://simons.berkeley.edu/talks/2023-08-18">Short Talks</a></li><li>4 – 5 p.m. <a href="https://simons.berkeley.edu/talks/2023-08-18-0">Short Talks</a></li></ul></div><div style="text-align: justify;"><br /></div><div><br /></div><div><br /></div><div><br /></div>** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **<div>
<br />Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a><br />
<br />
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0px;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.<br />
<br />
Other links:<br />
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a>
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a><br />
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-48041259013734959932021-12-31T11:30:00.001-06:002021-12-31T11:30:24.691-06:002021, the year AI ate HPC … and more<div style="text-align: justify;">Back in 2011, Marc Andreessen announced that <a href="https://d12xxm04.na1.hubspotlinksstarter.com/Btc/2M+113/d12xxm04/VW9DbS4VckmXW5HYBXn2b7fZSW22vCbp4CJVj8MhPPS73lSc3V1-WJV7CgT5gW554m-p5-WH0_W276jv42hkNsHW4F9JXq5FKJC-W5g3Q5X38BYxsW3cYDr15Sx93VW9fsb7n3CGvpVW1-V4KL5dH0NJMkHvrymK3_XW4nPXFG1snCSKW4rS2rt7c97DbW2mLCzb7ZY5ghW7q6gcX5TbTGCW9hszGs7Bp7chW2P8qcW7gPXlqW6wLD8R22x9F8W6qYd3S2mJmmRW2XRP9n3cKhz1MP3XDxmGqnyW8NnPP88W3lhhW6BL1PW329Ln1W3XNW3p7DJzCrN1ByCJCnVNtPV-4G3M257y0GF4HvPkXGL1t38_B1">Software was eating the world</a> while everyone was trying to make sense of the realities of the cloud versus brick-and-mortar businesses. 
Eight years later, Tarry Singh articulated how <a href="https://d12xxm04.na1.hubspotlinksstarter.com/Btc/2M+113/d12xxm04/VW9DbS4VckmXW5HYBXn2b7fZSW22vCbp4CJVj8MhPPQZ5knJ3V3Zsc37CgYyGW7_0T8Q2XsLKvV_PHYR3PHzxgW7DTR2c5-KrS-W7SBXxL2n5J-yW5CjjjT3X7y2TVcfj0m4f81McW3Q_6N_744Yf1W8J7lB86zzV58W6MGz8G5gtV8gW18-7G12fYLd-VZ6QXZ28DXm_W5174_S7jKtcYN3GRZTq-XxZ5W5LG30v5T6W5gW6Q8CP48QYZ4XW7CdP7P68vn8fW88_D8K35s59mN4yBXtfy-jrdW2tW1Js4_vhWrV19GRG6kTYxXW157XM48VqyBJW5hZqJ52g12x4VDwZyQ20NCN3W3_TwmN3q9lCLW3Y2KdG17qndTW7rQX4F4qxJZYW718Ljr6QkM3DW6RmF9P8XT9fxW4131hL7tcjyKW2f_bh76rldBmW8xNVp-944r3hW3lzwfh3h-bvs3c2W1">AI was eating software</a>; a year before GPT-3 and Codex would give solid ground to this prediction. Fast forward two years later, we just witnessed how AI ate HPC and we believe those are the first steps towards how AI is eating Learning, Creative and Office work.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Let me explain.</div><div style="text-align: justify;"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEgCP-A0MJUui-eIZ5CYP3YSTZdQ-jWHF-kX-qOUT5aLMiB9epmPksa_5i5J3voCNI88bYuCWeFfMYytsL-oQN-gfKzsUt3KF0QajzDacl5xt6dCBiF0r_NuNzBFpVkAZTjmktyhrzKM1_AS5_E2NZgsBiewLFHS5s7clbcbvrr1niHMMJ5Kf6k=s1200" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="417" data-original-width="1200" height="139" src="https://blogger.googleusercontent.com/img/a/AVvXsEgCP-A0MJUui-eIZ5CYP3YSTZdQ-jWHF-kX-qOUT5aLMiB9epmPksa_5i5J3voCNI88bYuCWeFfMYytsL-oQN-gfKzsUt3KF0QajzDacl5xt6dCBiF0r_NuNzBFpVkAZTjmktyhrzKM1_AS5_E2NZgsBiewLFHS5s7clbcbvrr1niHMMJ5Kf6k=w400-h139" width="400" /></a></div><br /><div style="text-align: justify;"><br /></div><div style="text-align: justify;">At <a href="https://lighton.ai/">LightOn</a>, we have been working on getting AI to be transformative for everyone. 
For that to happen, we used the Jean Zay French national supercomputer for two different yet somehow related reasons this past year. First, our <a href="https://d12xxm04.na1.hubspotlinksstarter.com/Btc/2M+113/d12xxm04/VW9DbS4VckmXW5HYBXn2b7fZSW22vCbp4CJVj8MhPPRS3lSbNV1-WJV7CgP-0W1Bt24G6bpyH8W3xxMJC1MDJXdN5X9p47fLy4tW6Hrny22W87QWW6KjY-16VYDhtW8zQ_HR875CczW7TMvVP6YZYGwW5xw-fq3q1ZsMVbqFMS53NpyHW2tzr1n1pWKRJN1LYSbvbT443W2Htky43HMmhQW42HQv_3B3dSYW3BpGqL1vvzTXW3q6lxs550rYMN3YN9fYM_gYjW2fDMPZ94MJLmMKYQM0qGClwW6m_Rpr4_GVF0N2PLW8dpppsqW1sCZzd31nC19W4CC07S4jGHtR3lrF1">LightOn’s Optical Processing Unit hardware</a> was integrated into this top105 supercomputer. Even though LightOn’s hardware is analog and uses a technology currently unknown to supercomputing, there are several good reasons the future of computing will use this technology. Relatedly, in a co-design fashion, we also used the Jean Zay facility to implement and run code for the building of Large Language/Foundation Models that we believe are key to Transformative AI. 
In March, we trained Auriga, the largest French language model to date, and made it available to everyone through our PAGnol demo.</div><div style="text-align: justify;"><br /></div><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen='allowfullscreen' webkitallowfullscreen='webkitallowfullscreen' mozallowfullscreen='mozallowfullscreen' width='416' height='346' src='https://www.blogger.com/video.g?token=AD6v5dzBHSmdjTsVmKUZ1eCz7epRBwjA1CKBIqMfRvTqUvmC3Wvw9i2l3J7UfFHo0m6axa1eM7xWU7LZEv8' class='b-hbp-video b-uploaded' frameborder='0'></iframe></div><br /><div style="text-align: justify;"><br /></div><div><div style="text-align: justify;">In July, we launched the <a href="https://d12xxm04.na1.hubspotlinksstarter.com/Btc/2M+113/d12xxm04/VW9DbS4VckmXW5HYBXn2b7fZSW22vCbp4CJVj8MhPPRy3lSbtV1-WJV7CgL7wW5LGWrl4Hd1WCW69Sh-4146fFBVdwwz711Yy2qW1f9WNr4DZmTMW4S18ns7p4p6bW4k0cqS76vM1gW5_szny46NbXfW4KMfzW3HlQ_9W6xmMBv4GGqrNW3blVVb7DyYnPN7__QFVjK-pVW8KyPK82FkvMMW58rk3c7dZFN0W8L2PjR66wdrRW7362c18dxD4MW2vk_rr8JspNyW7T5qhN5VSQVkW8cktW44kgW3zN2SFLDqJ05HCW8Sp-Dr4Vm7qG3l3F1">Muse API</a>, making our language models available for business use. Initially released in private beta, <a href="https://d12xxm04.na1.hubspotlinksstarter.com/Btc/2M+113/d12xxm04/VW9DbS4VckmXW5HYBXn2b7fZSW22vCbp4CJVj8MhPPRS3lSbNV1-WJV7CgNmpW8qvbkH1z6zVwV7Jdyj1LtMksVcDlg-9dYH5WW2fCvyj26m77JW11rXq38ghnJTN7v1cgDK-4xjVM4YkV4pwC7RW1gJwJj6q0hm3N9hb_SNTtFVNW1dd4s41rCjB0W1gBsNd7h-Ry3W1B9Jck7gVmZ6W8Jnx0Z8W7Hy2W37xxkz2TdX7xN7xJ7ZhfTwCFW5tb-qx70Jp1KW2HB1Xy7h6NDFW4ZFl0P4tJ-Q0W1SnZF52ryPfcW5zLzkm27kl3kW4DKbl_8yfCRbW59760p7p1_v92p-1">Muse</a> has quickly gained its first customers, and a public commercial version with five languages will be released in early 2022. 
Some of these early customers are using this new AI to redefine SEO or the experience for website creation.</div></div><blockquote style="border: none; margin: 0 0 0 40px; padding: 0px;"><div><blockquote class="jp jq jr" style="background-color: white; box-shadow: rgb(41, 41, 41) 3px 0px 0px 0px inset; box-sizing: inherit; color: rgba(0, 0, 0, 0.8); font-family: medium-content-sans-serif-font, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen, Ubuntu, Cantarell, "Open Sans", "Helvetica Neue", sans-serif; margin: 0px 0px 0px -20px; padding-left: 23px;"><p class="hx hy js hz b ia ib ic id ie if ig ih ii ij ik il im in io ip iq ir is it iu dn gv" data-selectable-paragraph="" id="32db" style="box-sizing: inherit; color: #292929; font-family: charter, Georgia, Cambria, "Times New Roman", Times, serif; font-size: 21px; font-style: italic; letter-spacing: -0.003em; line-height: 32px; margin: 2em 0px -0.46em; text-align: left; word-break: break-word;"><span class="fy" style="box-sizing: inherit; font-style: normal;">“True happiness comes from the joy of deeds well done, the zest of creating things new” </span><span class="hz fz" style="box-sizing: inherit; font-weight: 700;">Antoine de Saint-Exupéry</span></p></blockquote></div></blockquote><div><br /></div><div><br /></div><div>Eventually, a major impact of these Large Language Models trained on HPC infrastructures will be the ability for everyone to personally learn faster and for office workers worldwide to get the job done in a fashion never seen before.</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEidULn1QjgvYk-WnnCSHj7RuXOh_8TuIYlnBVaPMJwvpnKWwtaNzWIACpL-KHHLWHsNoE2YbA7BggvXSXVVN08e3GIo2veguCFdvCGh_vfQvY8GJTeIqXm7xWin1S4D6KkLji2MREHxqUFqn6bqGg2bYDt9uxSftUVnkkZyaZkybPHG2x8_DtI=s890" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="154" 
data-original-width="890" height="69" src="https://blogger.googleusercontent.com/img/a/AVvXsEidULn1QjgvYk-WnnCSHj7RuXOh_8TuIYlnBVaPMJwvpnKWwtaNzWIACpL-KHHLWHsNoE2YbA7BggvXSXVVN08e3GIo2veguCFdvCGh_vfQvY8GJTeIqXm7xWin1S4D6KkLji2MREHxqUFqn6bqGg2bYDt9uxSftUVnkkZyaZkybPHG2x8_DtI=w400-h69" width="400" /></a></div><br /><br /><div style="text-align: justify;">If you are a start-up company or an individual starting a business around this promise, don’t hesitate to join the <a href="https://d12xxm04.na1.hubspotlinksstarter.com/Btc/2M+113/d12xxm04/VW9DbS4VckmXW5HYBXn2b7fZSW22vCbp4CJVj8MhPPRS3lSbNV1-WJV7CgDL2W6Bc77b3WsYwZW5hGj_45lGxMDW67rBwL1Xs8wkW71QqSp2FDS2TW2jLDb3835nkjW60r8Dc4zF3_vW7zsHrF46q2NPVCpGk64WtcqcW2P99wl8Bb3hcW8dcgZ21SRY76W2rVb8X7tckh4V8vrjB3YkYQzW59kbD92_CB2mW8c91SM4nV6S5W3bR41p2dZx26W3qk9Fp7Zy9KTW5k89Gn960z4gW83FZBr8VMNBdW4l4WQs9072QYW1qkbC08ZZML6W8psnN05hdDPFW1S9clV2kxq4M3d2x1">Muse Partnership program</a>, and let’s start a discussion around how <a href="https://d12xxm04.na1.hubspotlinksstarter.com/Btc/2M+113/d12xxm04/VW9DbS4VckmXW5HYBXn2b7fZSW22vCbp4CJVj8MhPPRS3lSbNV1-WJV7CgJb0W3B5Kkl2V31BfW7lG4NX1PDck-W4WWC_r67y4s5W5Yx4rp2298GpW6DCHXP6kDf2fW72cK3q55T1W7W509Btr6j8KBjW7BYS5t705fpyW6Sb-Pv492Gr5W3wfRTc94kmjgN82RN7Z6H5JjW6mz2Zn4PN7XJW1zF9bj13189HN8_gfWfB7DvNW1cRyf74LldPvMVYq1B8_pTYW4hQCw85F0jDgW3f64ln3xk5_LN3jLrw7HdG7LW50Rp4597Rh-ZW4_T0hR6p9H1MW3lpvTm6n1fSt3d5b1">Muse</a> can help you.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">These models will also have the same effect in creative work and in the discovery process.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Stay tuned, the true AI revolution is really coming!</div><div><br /></div><div><br /></div><div>
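Concretely, hosted language-model services of this kind are typically driven by a small JSON-over-HTTPS request. The sketch below shows the general shape of such a call using only Python's standard library; the endpoint URL, field names, and headers are illustrative placeholders, not Muse's documented interface.

```python
import json
from urllib import request

# Placeholder endpoint -- NOT the real Muse API URL.
API_URL = "https://api.example.com/v1/completions"

def build_completion_request(prompt, api_key, n_tokens=32, temperature=0.8):
    """Build (but do not send) a POST request for a text-completion service.

    All field names below are hypothetical; consult the provider's
    documentation for the real schema.
    """
    payload = {"text": prompt, "n_tokens": n_tokens, "temperature": temperature}
    return request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {api_key}"},
        method="POST",
    )

req = build_completion_request("Il était une fois", api_key="demo-key")
print(req.get_method())  # POST
```

Sending the request would be a single `request.urlopen(req)` call, with the generated continuation carried in the JSON response.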
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-56821200390513859692021-12-21T13:30:00.000-06:002021-12-21T13:30:04.289-06:00LightOn Photonic coprocessor integrated into European AI Supercomputer** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **<div><br /></div>This is history of computing in the making stuff!<div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEghMce7opR2TELEXEGyXiJBOOEvonE-6aBYtrydiKmwKzzwXBhnbAdJHZQaYp1zj6DVStHJOPo8Jl59usvSAjAWVQYHsYq5-mhdV-XtSgbq09y5G-xZneFAlWnY_baDFwy3mfrNeaH3zO2K9KdcuvzMTmEKwXzxFn0XBx40UbkSPOj7XkcxV4U=s1200" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="417" data-original-width="1200" height="141" src="https://blogger.googleusercontent.com/img/a/AVvXsEghMce7opR2TELEXEGyXiJBOOEvonE-6aBYtrydiKmwKzzwXBhnbAdJHZQaYp1zj6DVStHJOPo8Jl59usvSAjAWVQYHsYq5-mhdV-XtSgbq09y5G-xZneFAlWnY_baDFwy3mfrNeaH3zO2K9KdcuvzMTmEKwXzxFn0XBx40UbkSPOj7XkcxV4U=w406-h141" width="406" /></a></div><div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Four years ago to the day, <a href="https://lighton.ai/">LightOn</a>’s <a href="https://www.blogger.com/#">first Optical Processing Unit (OPU)</a> had its first light in a Data Center showing that our technology was data center 
ready.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">It is with immense pride and pleasure to announce that <a href="https://lighton.ai">LightOn</a>’s OPU has been installed in one of the world’s <a href="https://top500.org/">Top500 supercomputer</a> as part of a pilot program with GENCI and IDRIS/CNRS.</div><div style="text-align: justify;"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEjndNHILvDpTfuvHNwd173nQSrWTZDtLT8pxd4Kt21f1aWwpmR986Vu09033wQ2KUF--2RapEYHpQxEr5yFWbWHF_RIdFUt5jricYus1w8X4EbNYhf12zxFtLvmaWED8gFMRGgxq_b0M_KCxdReEWgbEpkPP0KesCnuXitaP0M1l73Rxq9wFVI=s1532" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="316" data-original-width="1532" height="96" src="https://blogger.googleusercontent.com/img/a/AVvXsEjndNHILvDpTfuvHNwd173nQSrWTZDtLT8pxd4Kt21f1aWwpmR986Vu09033wQ2KUF--2RapEYHpQxEr5yFWbWHF_RIdFUt5jricYus1w8X4EbNYhf12zxFtLvmaWED8gFMRGgxq_b0M_KCxdReEWgbEpkPP0KesCnuXitaP0M1l73Rxq9wFVI=w465-h96" width="465" /></a></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div style="text-align: justify;">The team at <a href="https://lighton.ai">LightOn </a>is immensely proud to write the future of computing in this world-first integration of a computing photonic device into an HPC infrastructure.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">The press release can be found <a href="https://lighton.ai/wp-content/uploads/2021/12/Press-Release-LightOn-Photonic-coprocessor-integrated-into-European-AI-Supercomputer-Dec-21-2021.pdf">here</a>.</div><div style="text-align: justify;"><br /></div><div><div style="text-align: justify;">Thank you GENCI and IDRIS/CNRS for making this happen!</div><div style="text-align: justify;"><span style="background-color: white; color: #292929; font-family: charter, Georgia, Cambria, "Times New 
Roman", Times, serif; font-size: 21px; letter-spacing: -0.063px;"><br /></span></div><div>
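For readers unfamiliar with the device: an OPU performs very large random projections optically, computing (in the commonly described model) y = |Ax|² for a fixed random complex matrix A realized by light scattering through a diffusive medium. Below is a minimal NumPy stand-in for that operation, an illustrative sketch of the math only, not LightOn's actual interface.

```python
import numpy as np

def opu_transform(x, n_components=200, seed=42):
    """Numerical stand-in for an OPU-style random projection: y = |Ax|^2.

    A is a fixed complex Gaussian random matrix; the physical device
    realizes an equivalent (much larger) transform with scattered light.
    """
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(n_components, x.size)) \
        + 1j * rng.normal(size=(n_components, x.size))
    # Light intensity is measured, so only the squared modulus is observed.
    return np.abs(A @ x) ** 2

x = np.random.default_rng(0).normal(size=1_000)  # input vector (DMD pattern)
y = opu_transform(x)                             # nonlinear random features
print(y.shape)  # (200,)
```

Nonlinear random features of this kind underpin random-feature kernel methods and reservoir computing; the appeal of the optical implementation is that A can have millions of rows and columns while drawing little power.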
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div></div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-76779866880834632372021-05-21T13:06:00.010-05:002021-05-21T13:08:39.873-05:00The Akronomicon: an Extreme-Scale Leaderboard** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **<div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiysEOU-iQyZp9yUXle46nZBqEJ-GDi2LWoRdzIxg6QSSrpMCcEGAyY6HBSE-5A3wDQYobt2pVXR9llUWD0NKQaWF001JxYnAUguzJ9h2VHUdfGbyloX5kaU4snm-hvRUdqcCi15g/s1881/The+akronomicon.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="889" data-original-width="1881" height="189" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiysEOU-iQyZp9yUXle46nZBqEJ-GDi2LWoRdzIxg6QSSrpMCcEGAyY6HBSE-5A3wDQYobt2pVXR9llUWD0NKQaWF001JxYnAUguzJ9h2VHUdfGbyloX5kaU4snm-hvRUdqcCi15g/w400-h189/The+akronomicon.png" width="400" /></a></div><br /><div><div style="text-align: justify;">As larger models seem to be providing more context and more ability for zero-shot learning, <a href="https://lolo.science/">Julien</a> just created <a href="https://lair.lighton.ai/akronomicon/">the Akronomicon: an Extreme-Scale Leaderboard</a> featuring the world's largest Machine Learning Models. 
And yes, <a href="https://lighton.ai">LightOn</a> is on that board for the moment!</div></div><div style="text-align: justify;"><br /></div><div><div style="text-align: justify;"> Want to contribute? <a href="https://github.com/lightonai/akronomicon">https://github.com/lightonai/akronomicon</a> </div><div style="text-align: justify;"><br /></div><div><br /></div><div>
<br />Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a><br />
<br />
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0px;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.<br />
<br />
Other links:<br />
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a>
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a><br />
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-50006553694725130392021-04-28T00:00:00.028-05:002021-04-28T00:00:00.322-05:00 Virtual Workshop: Conceptual Understanding of Deep Learning (May 17th 9am-4pm PST)** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **<div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhyjHTzXgI9ipN8KjpbT8xPDLL7HRWgq74Y6Gw9kcvHIWnlMhR05VgKt4PRnt4J2Qu1Qr4K3dYc_JNdNMSwnTplbP5ePYMbZEO7yQYpZwh5FxanwSpqjahfvgRlzzaeXQk201RAyQ/s1414/cudl.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="304" data-original-width="1414" height="86" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhyjHTzXgI9ipN8KjpbT8xPDLL7HRWgq74Y6Gw9kcvHIWnlMhR05VgKt4PRnt4J2Qu1Qr4K3dYc_JNdNMSwnTplbP5ePYMbZEO7yQYpZwh5FxanwSpqjahfvgRlzzaeXQk201RAyQ/w400-h86/cudl.png" width="400" /></a></div><div><br /></div><div>Just got an email from <a href="https://www.blogger.com/#">Rina Panigrahy</a><br /><br /><blockquote><div style="text-align: justify;">Hi Igor,</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">I am an algorithms researcher at Google (<a href="http://theory.stanford.edu/~rinap">http://theory.stanford.edu/~rinap</a>) and I am organizing this workshop on "<a 
href="https://sites.google.com/view/conceptualdlworkshop/home">Conceptual Understanding of Deep Learning</a>" (details below). It's trying to understand the Brain/Mind as an algorithm from a mathematical/theoretical perspective. I believe that a mathematical/algorithmic approach for understanding the Mind is crucial and very much missing. I'd appreciate any help I can get with advertising this on your blog/mailing-lists/<a href="https://www.blogger.com/#">twitter</a>.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Best,</div><div style="text-align: justify;">Rina</div></blockquote><br />Here is the invite:<br /><br /><div style="text-align: justify;"></div><blockquote><div style="text-align: justify;">Please join us for a virtual Google workshop on “<a href="https://sites.google.com/view/conceptualdlworkshop/home">Conceptual Understanding of Deep Learning</a>”</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">When: May 17th 9am-4pm PST.</div><div style="text-align: justify;">Where: <a href="https://www.youtube.com/watch?v=g5DGBWjiULQ">Live over Youtube</a>,</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Goal: How does the Brain/Mind (perhaps even an artificial one) work at an algorithmic level? While deep learning has produced tremendous technological strides in recent decades, there is an unsettling feeling of a lack of “conceptual” understanding of why it works and to what extent it will work in the current form. 
The goal of the workshop is to bring together theorists and practitioners to develop an understanding of the right algorithmic view of deep learning, characterizing the class of functions that can be learned, coming up with the right learning architecture that may (provably) learn multiple functions, concepts and remember them over time as humans do, theoretical understanding of language, logic, RL, meta learning and lifelong learning.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">The speakers and panelists include Turing award winners Geoffrey Hinton, Leslie Valiant, and Godel Prize winner Christos Papadimitriou (<a href="https://sites.google.com/view/conceptualdlworkshop/home">full-details</a>).</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Panel Discussion: There will also be a panel discussion on the fundamental question of “Is there a mathematical model for the Mind?”. We will explore basic questions such as “Is there a provable algorithm that captures the essential capabilities of the mind?”, “How do we remember complex phenomena?”, “How is a knowledge graph created automatically?”, “How do we learn new concepts, function and action hierarchies over time?” and “Why do human decisions seem so interpretable?”</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Twitter: #ConceptualDLWorkshop.</div><div style="text-align: justify;">Please help advertise on mailing-lists/blog-posts and Retweet.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Hope to see you there!</div><div style="text-align: justify;"><a href="https://www.blogger.com/#">Rina Panigrahy</a></div><div style="text-align: justify;"></div></blockquote><div style="text-align: justify;"><br /></div><a href="https://www.blogger.com/#"></a><br /></div><div><br /></div><div><br /></div><div>
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-84084261057228119722021-04-27T00:00:00.021-05:002021-04-27T00:00:00.283-05:00Randomized Algorithms for Scientific Computing (RASC)** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **<div><br /></div><div style="text-align: justify;">At <a href="https://lighton.ai" target="_blank">LightOn</a>, we build photonic hardware that performs random projections and it is nice to find a source of materials on the subject in one document. 
<span style="text-align: left;">Here is a report comprehensively presenting how randomized algorithms are key to the future of computing:</span></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvrNZm0GyfAmwVqm1khp71QKjASRPKFBqedhfM1TPznAii7Yg1kUVEmtt4ZCYbWJw0jyMS1M0RvlQneZMRPSiH-YzPrI1K7yxOV8N6ZoVKkDaWzyt1jN3hniyGCZ1c1p8u25UnXg/s804/randomized+projection+tokamaks.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="525" data-original-width="804" height="261" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvrNZm0GyfAmwVqm1khp71QKjASRPKFBqedhfM1TPznAii7Yg1kUVEmtt4ZCYbWJw0jyMS1M0RvlQneZMRPSiH-YzPrI1K7yxOV8N6ZoVKkDaWzyt1jN3hniyGCZ1c1p8u25UnXg/w400-h261/randomized+projection+tokamaks.png" width="400" /></a></div><br /><div><div><br /></div><a href="https://arxiv.org/pdf/2104.11079.pdf" target="_blank">Randomized Algorithms for Scientific Computing (RASC)</a> by <a href="https://arxiv.org/search/cs?searchtype=author&query=Buluc%2C+A">Aydin Buluc</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Kolda%2C+T+G">Tamara G. Kolda</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Wild%2C+S+M">Stefan M. Wild</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Anitescu%2C+M">Mihai Anitescu</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=DeGennaro%2C+A">Anthony DeGennaro</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Jakeman%2C+J">John Jakeman</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Kamath%2C+C">Chandrika Kamath</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Ramakrishnan">Ramakrishnan</a> (Ramki)<a href="https://arxiv.org/search/cs?searchtype=author&query=Kannan">Kannan</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Lopes%2C+M+E">Miles E. 
Lopes</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Martinsson%2C+P">Per-Gunnar Martinsson</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Myers%2C+K">Kary Myers</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Nelson%2C+J">Jelani Nelson</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Restrepo%2C+J+M">Juan M. Restrepo</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Seshadhri%2C+C">C. Seshadhri</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Vrabie%2C+D">Draguna Vrabie</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Wohlberg%2C+B">Brendt Wohlberg</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Wright%2C+S+J">Stephen J. Wright</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Yang%2C+C">Chao Yang</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Zwart%2C+P">Peter Zwart</a><div><br /></div><div><blockquote style="text-align: justify;">Randomized algorithms have propelled advances in artificial intelligence and represent a foundational research area in advancing AI for Science. Future advancements in DOE Office of Science priority areas such as climate science, astrophysics, fusion, advanced materials, combustion, and quantum computing all require randomized algorithms for surmounting challenges of complexity, robustness, and scalability. This report summarizes the outcomes of that workshop, "Randomized Algorithms for Scientific Computing (RASC)," held virtually across four days in December 2020 and January 2021.</blockquote><br /><div>
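Since the report is all about randomized projections in numerical linear algebra, here is a minimal NumPy sketch of the classic randomized SVD (the range-finder approach popularized by Halko, Martinsson, and Tropp; Martinsson is among the authors above). The function name and the oversampling parameter <code>p</code> are illustrative choices for this sketch, not taken from the report:

```python
import numpy as np

def randomized_svd(A, k, p=10, seed=0):
    """Rank-k SVD of A via a Gaussian random projection (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Random test matrix: multiplying by a Gaussian sketch captures the range of A.
    Omega = rng.standard_normal((n, k + p))
    Y = A @ Omega                       # (m, k+p) sample of the column space of A
    Q, _ = np.linalg.qr(Y)              # orthonormal basis for that sample
    B = Q.T @ A                         # small (k+p, n) projected problem
    U_hat, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ U_hat[:, :k], s[:k], Vt[:k]

# Sanity check on an exactly rank-8 matrix: the approximation is near-exact.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 8)) @ rng.standard_normal((8, 100))
U, s, Vt = randomized_svd(A, k=8)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
print(err)  # tiny relative error, since range(A) is captured by the sketch
```

The point of the construction is that the expensive factorization runs on the small projected matrix <code>B</code> rather than on <code>A</code> itself, which is also why a physical random-projection device is an interesting accelerator for this family of algorithms.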
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div></div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-66187229846169332742021-04-06T10:23:00.002-05:002021-04-06T11:08:27.949-05:00The $1,000 GPT-3** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **<div><br /></div><div><p class="graf graf--p" name="6dac" style="text-align: justify;">Progress usually comes from a steady technology bootstrap…until it doesn’t.</p><p class="graf graf--p" name="df6d" style="text-align: justify;">Take for instance the race for the $1,000 genome that started in the early 2000s. Initially, <a class="markup--anchor markup--p-anchor" data-href="https://www.genome.gov/about-genomics/fact-sheets/Sequencing-Human-Genome-cost" href="https://www.genome.gov/about-genomics/fact-sheets/Sequencing-Human-Genome-cost" rel="noopener" target="_blank">sequencing the human genome</a> meant a race between the well-funded public and private sectors but more importantly, the resources for the first breakthrough ended up costing upwards of $450M. Yet despite all the economic promise of genome sequencing, had Moore’s law been applied, sequencing one full genome would still cost $100,000 today. However, once the goal became clearer to everyone, a diversity of technologies and challengers emerged. This intense competition eventually yielded a growth faster than Moore’s Law. 
The main takeaway is that one cannot rely on the steady progress of one specific technology alone to commoditize tools.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihah19epDKmVBZZ22Abl4Zxn5qI1VFxlRQq5Rd094tMHWn0TkrYU0jEKzJrEcH4SQ-J7GEXIhfQvno_r-YpgNrM21y9J4Fbvsjqi1abtZoeH9milSEFfeITqmDZTeGWitUq2mAKg/s1000/cost+of+human+genome.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="562" data-original-width="1000" height="225" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihah19epDKmVBZZ22Abl4Zxn5qI1VFxlRQq5Rd094tMHWn0TkrYU0jEKzJrEcH4SQ-J7GEXIhfQvno_r-YpgNrM21y9J4Fbvsjqi1abtZoeH9milSEFfeITqmDZTeGWitUq2mAKg/w400-h225/cost+of+human+genome.png" width="400" /></a></div><br /><figure class="graf graf--figure" name="04a6"><br /><figcaption class="imageCaption">Figure from NIH <a class="markup--anchor markup--figure-anchor" data-href="https://www.genome.gov/about-genomics/fact-sheets/Sequencing-Human-Genome-cost" href="https://www.genome.gov/about-genomics/fact-sheets/Sequencing-Human-Genome-cost" rel="noopener" target="_blank">“Facts sheets about genomics: The cost of Sequencing a Human Genome”</a>, Dec 7th, 2020.</figcaption></figure><p class="graf graf--p" name="443c" style="text-align: justify;">What does this have to do with the current state of silicon computing and the new demand for Large Language Models (LLMs)? Everything if you ask us and here is how.</p><p class="graf graf--p" name="6cf0" style="text-align: justify;">Less than a year into existence, Large Language Models like GPT-3 have already <a class="markup--anchor markup--p-anchor" data-href="https://openai.com/blog/gpt-3-apps/" href="https://openai.com/blog/gpt-3-apps/" rel="noopener" target="_blank">spawned a new generation of startups</a> built on the ability of the model to respond to requests for which it was not trained. 
More importantly for us, hardware manufacturers are positing that <a class="markup--anchor markup--p-anchor" data-href="https://www.nextplatform.com/2021/02/11/the-billion-dollar-ai-problem-that-just-keeps-scaling/" href="https://www.nextplatform.com/2021/02/11/the-billion-dollar-ai-problem-that-just-keeps-scaling/" rel="noopener" target="_blank">one or several customers will be willing to put a billion dollars</a> on the table to train an even larger model in the coming years.</p><p class="graf graf--p" name="e331" style="text-align: justify;">Interestingly, much like the mass industrialization in the 1930s, the good folks at OpenAI are sketching new <a class="markup--anchor markup--p-anchor" data-href="https://arxiv.org/abs/2001.08361" href="https://arxiv.org/abs/2001.08361" rel="noopener" target="_blank">scaling laws</a> for the industrialization of these larger models.</p><p class="graf graf--p" name="51fc" style="text-align: justify;">The sad truth is that extrapolating their findings to the training of a 10 Trillion parameters model involves a supercomputer <em class="markup--em markup--p-em">running</em> <em class="markup--em markup--p-em">continuously for</em> <em class="markup--em markup--p-em">two decades</em>. 
The minimum capital expenditure of this adventure is estimated in the realm of several hundreds of million dollars.</p><p class="graf graf--p" name="6dab" style="text-align: justify;">Much like what happened in sequencing, while silicon improvement and architecture may achieve speedups in the following years, it is fair to say that, even with Moore’s law, no foreseeable technology can reasonably train a fully scaled-up GPT-4 and grab the economic value associated with it<strong class="markup--strong markup--p-strong">.</strong></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgT7pFK1DJdxUUr4ppnySQGFJoccxBgZC1BC-8QIYv2k16rKiXi0ze1yPVZ9P8rqC90zczcmM1FUuaKMbjD0A15Y15WdqOTgeXSyBziyTcGcFs2yo7glrf3XHdv8_-Vmg56aKzn0A/s1365/lighton+more+compute+less+hardware.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="761" data-original-width="1365" height="223" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgT7pFK1DJdxUUr4ppnySQGFJoccxBgZC1BC-8QIYv2k16rKiXi0ze1yPVZ9P8rqC90zczcmM1FUuaKMbjD0A15Y15WdqOTgeXSyBziyTcGcFs2yo7glrf3XHdv8_-Vmg56aKzn0A/w400-h223/lighton+more+compute+less+hardware.png" width="400" /></a></div><br /><figure class="graf graf--figure" name="b0b0"><br /></figure><p class="graf graf--p" name="202c" style="text-align: justify;"><strong class="markup--strong markup--p-strong">Rebooting silicon with a different physics, light, and NvNs</strong></p><p class="graf graf--p" name="5cb0" style="text-align: justify;">For a real breakthrough to occur, much like what happened in the sequencing story, different technologies need to be jointly optimized. 
In our case, this means performing co-design with new hardware and physics but also going rogue on full programmability.</p><p class="graf graf--p" name="4ad3" style="text-align: justify;"><a class="markup--anchor markup--p-anchor" data-href="https://lighton.ai/lighton-appliance/" href="https://lighton.ai/lighton-appliance/" rel="noopener" target="_blank">LightOn’s photonic hardware</a> can produce massively parallel matrix-vector multiplications with an equivalent of 2 trillion parameters “for free”: this is about one-fifth of the number of parameters needed for GPT-4. Next comes revisiting the programmability. Current LightOn’s technology keeps these weights fixed <em class="markup--em markup--p-em">by design</em>. Co-design means finding the algorithms for which CPUs and GPUs can perform some of the most intelligent computations and how LightOn’s massive Non-von Neumann (NvN) hardware can do the heavy lifting. We <a class="markup--anchor markup--p-anchor" data-href="https://papers.nips.cc/paper/2020/file/69d1fc78dbda242c43ad6590368912d4-Paper.pdf" href="https://papers.nips.cc/paper/2020/file/69d1fc78dbda242c43ad6590368912d4-Paper.pdf" rel="noopener" target="_blank">already published</a> how we are replacing backpropagation, the workhorse of Deep Learning, with an <a class="markup--anchor markup--p-anchor" data-href="https://venturebeat.com/2020/06/03/lighton-researchers-explain-how-they-trained-an-ai-model-on-an-optical-co-processor/" href="https://venturebeat.com/2020/06/03/lighton-researchers-explain-how-they-trained-an-ai-model-on-an-optical-co-processor/" rel="noopener" target="_blank">algorithm that unleashes</a> the full potential of our hardware in distributed training. We are also working similarly on an inference step that will take full advantage of the massive number of parameters at our disposal. 
This involved effort relies in a heavy part thanks to our access to ½ million GPU hours on some of <a class="markup--anchor markup--p-anchor" data-href="http://www.idris.fr/eng/jean-zay/jean-zay-presentation-eng.html" href="http://www.idris.fr/eng/jean-zay/jean-zay-presentation-eng.html" rel="noopener" target="_blank">France</a>’s and Europe’s largest supercomputers.</p><p class="graf graf--p" name="7537" style="text-align: justify;">And this is just the beginning. There is a vast untapped potential for repurposing large swaths of optical technologies directed primarily for entertainment and telecommunication into computing.</p><p class="graf graf--p" name="b829" style="text-align: justify;"><strong class="markup--strong markup--p-strong">The road towards a $1,000 GPT-3</strong></p><p class="graf graf--p" name="71d0" style="text-align: justify;">Based on the GPT-3 <a class="markup--anchor markup--p-anchor" data-href="https://lambdalabs.com/blog/demystifying-gpt-3/" href="https://lambdalabs.com/blog/demystifying-gpt-3/" rel="noopener" target="_blank">training cost estimates</a>, achieving a $1,000 GPT-3 requires four orders of magnitude improvements. Much like what occurred in 2007 with the genome sequencing revolution, Moore’s law may take care of the first two orders of magnitude in the coming decade but the next two rely on an outburst of new efficient technologies — hardware <em class="markup--em markup--p-em">and </em>algorithms. It just so happens that GPT-3 has close to 100 layers, so achieving two orders of magnitude savings may arise faster than you can imagine. Stay tuned!</p><p class="graf graf--p" name="d1a8" style="text-align: justify;">Igor Carron is the CEO and co-founder at <a class="markup--anchor markup--p-anchor" data-href="https://lighton.ai" href="https://lighton.ai" rel="noopener" target="_blank">LightOn</a></p></div><div style="text-align: justify;"><br /></div><div>
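The orders-of-magnitude argument above can be made explicit. In this back-of-the-envelope sketch, the $10M GPT-3 training-cost figure is a rough published estimate (in the spirit of the Lambda Labs post linked above) and the Moore's-law doubling rate is the usual heuristic; both are assumptions used only for counting orders of magnitude:

```python
import math

# Rough assumptions, not measurements:
current_cost = 10_000_000   # USD, approximate GPT-3 training cost
target_cost = 1_000         # USD, the "$1,000 GPT-3" goal

orders_needed = math.log10(current_cost / target_cost)
print(orders_needed)        # 4.0 -> the "four orders of magnitude" in the text

# Moore's law (~2x every 2 years) compounded over a decade:
silicon_gain = 2 ** (10 / 2)                  # = 32x
print(round(math.log10(silicon_gain), 2))     # ~1.51 orders from silicon alone

# Crediting silicon and architecture with roughly two orders, the remaining
# two must come from new hardware and algorithms -- e.g. a ~100x saving
# spread across GPT-3's roughly 100 layers.
remaining = orders_needed - 2
print(remaining)            # 2.0
```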
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-91482683351298119122021-03-24T13:36:00.004-05:002021-03-24T13:36:41.612-05:00Computing with Light: How LightOn intends to unlock Transformative AI<div><span style="text-align: justify;">I gave a talk at </span><a href="https://www.linkedin.com/feed/hashtag/?keywords=mathia2021&highlightedUpdateUrns=urn%3Ali%3Aactivity%3A6780551699456167936" style="text-align: justify;">#mathia2021</a><span style="text-align: justify;"> conference on March 9th, 2021 where I drew a parallel between the scaling laws that enabled industrialization in the 1920's and the new scaling laws in AI of the 2020's. AI is at its infancy and it needs to have guiding principles (as embedded in these empirical laws) and it also needs to develop new hardware. I showed how, in this context, </span><a href="https://www.linkedin.com/company/lighton/" style="text-align: justify;">LightOn</a><span style="text-align: justify;"> can help unlock Transformative AI. 
Enjoy!</span></div><div style="text-align: justify;"><br /></div><div><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="266" src="https://www.youtube.com/embed/0QtY4_UJF0w" width="320" youtube-src-id="0QtY4_UJF0w"></iframe></div><div style="text-align: justify;"><br /></div><span style="background-color: white; color: rgba(0, 0, 0, 0.75); font-family: "Source Serif Pro", serif; font-size: 20px; white-space: pre-wrap;"><div style="text-align: justify;"><br /></div></span></div><div style="text-align: justify;">All these other presentations by <a href="http://yann.lecun.com/">Yann LeCun</a>, <a href="https://people.epfl.ch/kathryn.hess">Kathryn Hess,</a> <a href="https://people.eecs.berkeley.edu/~jordan/">Michael Jordan</a>, <a href="https://statweb.stanford.edu/~candes/">Emmanuel Candès</a> and others can be found in this <a href="https://vimeo.com/showcase/8236351">collection of videos on Vimeo</a>. Let me note that Michael made a similar argument as mine where we think of current stage of AI at its infancy in terms of industrialization. 
</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div><div style="text-align: justify;"> </div><div style="text-align: justify;">Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a></div>
<div style="text-align: justify;"><br /></div>
<div style="text-align: justify;"><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0px;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.</div>
<div style="text-align: justify;"><br /></div><div style="text-align: justify;">Other links:</div>
<b><div style="text-align: justify;"><b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a>
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a></div></b><div style="text-align: justify;"><u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>
</div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-1354722182467214842021-03-08T12:59:00.002-06:002021-03-08T12:59:58.833-06:00Unveiling LightOn Appliance<div style="text-align: justify;">Today is a big day at <a href="http://lighton.ai" target="_blank">LightOn</a> as we unveil a hardware product, the Appliance, the world's first commercially available photonic co-processor for AI and HPC.</div><div style="text-align: justify;"><br /></div><div><div style="text-align: justify;">If interested, pre-ordering information is here: <a href="http://lighton.ai/lighton-appliance">http://lighton.ai/lighton-appliance</a> </div><div class="separator" style="clear: both; text-align: justify;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvOnEEt3H7UtNAoRv9GVhJFjWPnjpykFHbRZvEUxkXoBJZb259_KMuQMWuBIXLDkDqgKyOWI6BOSSwt6XO_oDAgT1T23eg69tS51LzCFqiMeutmVQDU2zHCFJXDwPtUsvwjpOAWA/s2048/LightOn+Appliance+%2528release+March+8+2021%2529.png" imageanchor="1" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;"><img border="0" data-original-height="1046" data-original-width="2048" height="204" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvOnEEt3H7UtNAoRv9GVhJFjWPnjpykFHbRZvEUxkXoBJZb259_KMuQMWuBIXLDkDqgKyOWI6BOSSwt6XO_oDAgT1T23eg69tS51LzCFqiMeutmVQDU2zHCFJXDwPtUsvwjpOAWA/w400-h204/LightOn+Appliance+%2528release+March+8+2021%2529.png" width="400" /></a></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">We have had a few of these optical processing units in our own LightOn Cloud for the past two years and just retired one after more than 800 days of working full time. 
</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Here is the <a href="http://lighton.ai/lighton-appliance-press-release/">press release</a>: </div><div style="text-align: justify;"><br /></div><div><div style="text-align: justify;">The future is now! </div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Leasing starts at 1900€/month or about US$2250/month </div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div><div style="text-align: justify;"> </div><div style="text-align: justify;">Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a></div>
<div style="text-align: justify;"><br /></div>
<div style="text-align: justify;"><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0px;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.</div>
<div style="text-align: justify;"><br /></div><div style="text-align: justify;">Other links:</div>
<b><div style="text-align: justify;"><b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a>
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a></div></b><div style="text-align: justify;"><u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>
</div></div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-46698234007102868962021-03-04T08:49:00.009-06:002021-03-04T10:06:04.794-06:00Video: LightOn unlocks Transformative AI<div style="text-align: justify;"><span class="css-901oao css-16my406 r-poiln3 r-bcqeeo r-qvutc0" style="background-color: rgba(0, 0, 0, 0.03); border: 0px solid black; box-sizing: border-box; color: #0f1419; display: inline; font-stretch: inherit; font-variant-east-asian: inherit; font-variant-numeric: inherit; line-height: inherit; margin: 0px; min-width: 0px; overflow-wrap: break-word; padding: 0px; white-space: pre-wrap;"><span style="font-family: inherit;"></span></span></div>In the coming days, we'll be making another announcement, but I wanted to first share a video we did recently. At <a href="https://lighton.ai">LightOn</a>, we don't build photonic computing hardware because it's fancy or cool (even though <b>it is cool</b>) but because computing hardware is hitting its limits. I know what some say about Moore's law not being dead, but the recent focus on <a href="https://arxiv.org/abs/1706.03762">Transformers</a> and their <a href="https://arxiv.org/abs/2001.08361">attendant scaling laws</a> makes it obvious that, in order for more people to have access to these models, we need a new computing paradigm. Indeed, not everyone can afford to spend a <a href="https://www.nextplatform.com/2021/02/11/the-billion-dollar-ai-problem-that-just-keeps-scaling/">billion dollars training these models</a>. 
As <a href="https://www.exponentialview.co/">Azeem was recently pointing out in one of his newsletters</a>, this is how bad things will become:<blockquote><div style="text-align: justify;"><span class="css-901oao css-16my406 r-poiln3 r-bcqeeo r-qvutc0" style="background-color: rgba(0, 0, 0, 0.03); border: 0px solid black; box-sizing: border-box; color: #0f1419; display: inline; font-stretch: inherit; font-variant-east-asian: inherit; font-variant-numeric: inherit; line-height: inherit; margin: 0px; min-width: 0px; overflow-wrap: break-word; padding: 0px; white-space: pre-wrap;"><i><span face="Roboto, RobotoDraft, Helvetica, Arial, sans-serif" style="background-color: white; color: #1a1a1a; font-size: 16px; text-align: -webkit-left; white-space: normal;">The amazing thing is that we can start to compare the cost of training single AI models with the cost of building the physical fabs that make chips. TSMC’s state-of-the-art 3nm </span><span class="il" face="Roboto, RobotoDraft, Helvetica, Arial, sans-serif" style="background-color: white; color: #1a1a1a; font-size: 16px; text-align: -webkit-left; white-space: normal;">fab</span><span face="Roboto, RobotoDraft, Helvetica, Arial, sans-serif" style="background-color: white; color: #1a1a1a; font-size: 16px; text-align: -webkit-left; white-space: normal;"> </span><a data-saferedirecturl="https://www.google.com/url?q=http://email.substack1.exponentialview.co/c/eJxdkEuOwyAMQE9TlhHmk8-CxWx6jYiA06JJISJOM7n9ONPdSGCQZcvPL3jCR6mnW8tG4gojnSu6jMe2IBFWsW9YxxQd6GGw0oroOpiMUSJt41wRXz4tTqz7tKTgKZV8FStllXi62XoN0SgvY5zbHoNpoetkN0n-BSk_E_0eE-aADt9Yz5JRLO5JtG43_XVTdz7HcTTk0-HzBdaE8mro4DxmDleKH91LKzVjOSUVSMXX2M4ODTR-bmWL4I22OliYZwhyitDrSWmFergZue3TRj58Q4M_KyNkSn55Jzx4mKgusSSuWtLjSSU3qVwLj8zx2nOic8TspwWjo7qjoI_NPzPjAzNWthxHTw5a0H2vzaAkdJ_lWZZWnYLWasEQsXBXdv8gfgF5wI0m&source=gmail&ust=1614953733705000&usg=AFQjCNERZoaS_uhl1jBshC2GE2GCTL_yRw" 
href="http://email.substack1.exponentialview.co/c/eJxdkEuOwyAMQE9TlhHmk8-CxWx6jYiA06JJISJOM7n9ONPdSGCQZcvPL3jCR6mnW8tG4gojnSu6jMe2IBFWsW9YxxQd6GGw0oroOpiMUSJt41wRXz4tTqz7tKTgKZV8FStllXi62XoN0SgvY5zbHoNpoetkN0n-BSk_E_0eE-aADt9Yz5JRLO5JtG43_XVTdz7HcTTk0-HzBdaE8mro4DxmDleKH91LKzVjOSUVSMXX2M4ODTR-bmWL4I22OliYZwhyitDrSWmFergZue3TRj58Q4M_KyNkSn55Jzx4mKgusSSuWtLjSSU3qVwLj8zx2nOic8TspwWjo7qjoI_NPzPjAzNWthxHTw5a0H2vzaAkdJ_lWZZWnYLWasEQsXBXdv8gfgF5wI0m" style="background-color: white; color: #1a1a1a; font-family: Roboto, RobotoDraft, Helvetica, Arial, sans-serif; font-size: 16px; text-align: -webkit-left; white-space: normal;" target="_blank">will run to around $20bn</a><span face="Roboto, RobotoDraft, Helvetica, Arial, sans-serif" style="background-color: white; color: #1a1a1a; font-size: 16px; text-align: -webkit-left; white-space: normal;"> when it is completed in two years. A </span><span class="il" face="Roboto, RobotoDraft, Helvetica, Arial, sans-serif" style="background-color: white; color: #1a1a1a; font-size: 16px; text-align: -webkit-left; white-space: normal;">fab</span><span face="Roboto, RobotoDraft, Helvetica, Arial, sans-serif" style="background-color: white; color: #1a1a1a; font-size: 16px; text-align: -webkit-left; white-space: normal;"> like this may be competitive for 5-7 years, which means it’ll need to churn out $7-8m worth of chips every day before it pays back.</span></i></span></div></blockquote><div style="text-align: justify;"><span class="css-901oao css-16my406 r-poiln3 r-bcqeeo r-qvutc0" style="background-color: rgba(0, 0, 0, 0.03); border: 0px solid black; box-sizing: border-box; color: #0f1419; display: inline; font-stretch: inherit; font-variant-east-asian: inherit; font-variant-numeric: inherit; line-height: inherit; margin: 0px; min-width: 0px; overflow-wrap: break-word; padding: 0px; white-space: pre-wrap;"><span face="Roboto, RobotoDraft, Helvetica, Arial, sans-serif" style="background-color: white; color: #1a1a1a; font-size: 16px; 
text-align: -webkit-left; white-space: normal;"><i></i></span></span></div><div style="text-align: justify;"><span class="css-901oao css-16my406 r-poiln3 r-bcqeeo r-qvutc0" style="background-color: rgba(0, 0, 0, 0.03); border: 0px solid black; box-sizing: border-box; color: #0f1419; display: inline; font-stretch: inherit; font-variant-east-asian: inherit; font-variant-numeric: inherit; line-height: inherit; margin: 0px; min-width: 0px; overflow-wrap: break-word; padding: 0px; white-space: pre-wrap;"><span style="font-family: inherit;"><br /></span></span></div><div style="text-align: justify;"><span class="css-901oao css-16my406 r-poiln3 r-bcqeeo r-qvutc0" style="background-color: rgba(0, 0, 0, 0.03); border: 0px solid black; box-sizing: border-box; color: #0f1419; display: inline; font-stretch: inherit; font-variant-east-asian: inherit; font-variant-numeric: inherit; line-height: inherit; margin: 0px; min-width: 0px; overflow-wrap: break-word; padding: 0px; white-space: pre-wrap;"><span style="font-family: inherit;">And so at </span><a href="https://lighton.ai" style="font-family: inherit;">LightOn</a><span style="font-family: inherit;">, we see a <a href="https://github.com/lightonai">combination of</a> <a href="https://lighton.ai/publications/">algorithms</a> and (cool) hardware as the only pathway forward for computing large-scale AI. The video is right <a href="https://youtu.be/f2jwgcziECQ">here</a>, enjoy!</span></span></div><div style="text-align: justify;"><br /></div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="266" src="https://www.youtube.com/embed/f2jwgcziECQ" width="320" youtube-src-id="f2jwgcziECQ"></iframe></div><br /><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div>
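As a quick back-of-the-envelope check (my own arithmetic, not Azeem's), the quoted daily figure corresponds to the longer end of that 5-7 year window:

```python
# Sanity-check the quoted fab payback arithmetic: a $20bn fab that stays
# competitive for 5-7 years must ship this much in chips per day to pay back.
fab_cost = 20e9  # $20bn construction cost

for years in (5, 7):
    per_day = fab_cost / (years * 365)
    print(f"{years}-year payback -> ${per_day / 1e6:.1f}m of chips per day")
# 5 years gives about $11.0m/day, 7 years about $7.8m/day,
# so the quoted $7-8m/day matches the 7-year end of the window.
```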
<br />Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a><br />
<br />
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0px;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.<br />
<br />
Other links:<br />
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a>
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a><br />
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-16708763902131345932020-12-29T10:42:00.002-06:002020-12-29T10:42:10.358-06:00The Awesome Implicit Neural Representations Highly Technical Reference Page<div style="text-align: center;">** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> ** </div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Here is a <a href="https://nuit-blanche.blogspot.com/p/reference-page.html">new curated page</a> on the topic of Implicit Neural Representations aptly called <a href="https://github.com/vsitzmann/awesome-implicit-representations" target="_blank">Awesome Implicit Neural Representations</a>. 
It is curated by <a href="https://vsitzmann.github.io/">Vincent Sitzmann</a> (@vincesitzmann) and has been added to the <a href="https://nuit-blanche.blogspot.com/p/reference-page.html" target="_blank">Highly Technical Reference Page</a>:</div><div style="text-align: justify;"><br /></div><div class="separator" style="clear: both; text-align: justify;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrTAShmZaq2SP3-LPtAH7rx0GJArAMKOtuRyjIr8EbNtrV75khNohO8X0is8P2LrPH39y9o4_GJFhN7kWOE_M8-xQqbiG5d-JyWzCqi1yexRLw4PmRPx5B4CAC3XjHDGT3mcJ4PA/s1171/awesomenerf.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="586" data-original-width="1171" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrTAShmZaq2SP3-LPtAH7rx0GJArAMKOtuRyjIr8EbNtrV75khNohO8X0is8P2LrPH39y9o4_GJFhN7kWOE_M8-xQqbiG5d-JyWzCqi1yexRLw4PmRPx5B4CAC3XjHDGT3mcJ4PA/w400-h200/awesomenerf.png" width="400" /></a></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">From <a href="https://github.com/vsitzmann/awesome-implicit-representations" target="_blank">the page</a>:</div><div><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px;"></p><blockquote><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px; text-align: justify;">A curated list of resources on implicit neural representations, inspired by <a 
href="https://github.com/jbhuang0604/awesome-computer-vision" style="background-color: initial; box-sizing: border-box; text-decoration-line: none;">awesome-computer-vision</a>. Work-in-progress.</p><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px; text-align: justify;">This list does not aim to be exhaustive, as implicit neural representations are a rapidly evolving & growing research field with hundreds of papers to date.</p><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px; text-align: justify;">Instead, this list aims to list papers introducing key concepts & foundations of implicit neural representations across applications. 
It's a great reading list if you want to get started in this area!</p><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px; text-align: justify;">For most papers, there is a short summary of the most important contributions.</p><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px; text-align: justify;">Disclosure: I am an author on the following papers:</p><ul style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px; padding-left: 2em;"><li style="box-sizing: border-box; text-align: justify;"><a href="https://vsitzmann.github.io/srns/" rel="nofollow" style="background-color: initial; box-sizing: border-box; text-decoration-line: none;">Scene Representation Networks: Continuous 3D-Structure-Aware Neural Scene Representations</a></li><li style="box-sizing: border-box; margin-top: 0.25em; text-align: justify;"><a href="https://vsitzmann.github.io/metasdf/" rel="nofollow" style="background-color: initial; box-sizing: border-box; text-decoration-line: none;">MetaSDF: Meta-Learning Signed Distance Functions</a></li><li style="box-sizing: border-box; margin-top: 0.25em; text-align: justify;"><a href="https://vsitzmann.github.io/siren/" rel="nofollow" style="background-color: initial; box-sizing: border-box; text-decoration-line: none;">Implicit Neural Representations with Periodic Activation Functions</a></li><li style="box-sizing: border-box; margin-top: 0.25em; 
text-align: justify;"><a href="https://www.computationalimaging.org/publications/semantic-srn/" rel="nofollow" style="background-color: initial; box-sizing: border-box; text-decoration-line: none;">Inferring Semantic Information with 3D Neural Scene Representations</a></li></ul><h2 style="background-color: white; border-bottom: 1px solid var(--color-border-secondary); box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; line-height: 1.25; margin-bottom: 16px; margin-top: 24px; padding-bottom: 0.3em;"><a aria-hidden="true" class="anchor" href="https://github.com/vsitzmann/awesome-implicit-representations#what-are-implicit-neural-representations" id="user-content-what-are-implicit-neural-representations" style="background-color: initial; box-sizing: border-box; float: left; line-height: 1; margin-left: -20px; padding-right: 4px; text-decoration-line: none;"><svg aria-hidden="true" class="octicon octicon-link" height="16" version="1.1" viewbox="0 0 16 16" width="16"><path d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z" fill-rule="evenodd"></path></svg></a><div style="text-align: justify;">What are implicit neural representations?</div></h2><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px; text-align: justify;">Implicit Neural Representations (sometimes also referred to coordinate-based representations) are a novel way to parameterize signals of 
all kinds. Conventional signal representations are usually discrete - for instance, images are discrete grids of pixels, audio signals are discrete samples of amplitudes, and 3D shapes are usually parameterized as grids of voxels, point clouds, or meshes. In contrast, Implicit Neural Representations parameterize a signal as a <em style="box-sizing: border-box;">continuous function</em> that maps the domain of the signal (i.e., a coordinate, such as a pixel coordinate for an image) to whatever is at that coordinate (for an image, an R,G,B color). Of course, these functions are usually not analytically tractable - it is impossible to "write down" the function that parameterizes a natural image as a mathematical formula. Implicit Neural Representations thus approximate that function via a neural network.</p><h2 style="background-color: white; border-bottom: 1px solid var(--color-border-secondary); box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; line-height: 1.25; margin-bottom: 16px; margin-top: 24px; padding-bottom: 0.3em;"><a aria-hidden="true" class="anchor" href="https://github.com/vsitzmann/awesome-implicit-representations#why-are-they-interesting" id="user-content-why-are-they-interesting" style="background-color: initial; box-sizing: border-box; float: left; line-height: 1; margin-left: -20px; padding-right: 4px; text-decoration-line: none;"><svg aria-hidden="true" class="octicon octicon-link" height="16" version="1.1" viewbox="0 0 16 16" width="16"><path d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z" 
fill-rule="evenodd"></path></svg></a><div style="text-align: justify;">Why are they interesting?</div></h2><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px; text-align: justify;">Implicit Neural Representations have several benefits: First, they are not coupled to spatial resolution anymore, the way, for instance, an image is coupled to the number of pixels. This is because they are continuous functions! Thus, the memory required to parameterize the signal is <em style="box-sizing: border-box;">independent</em> of spatial resolution, and only scales with the complexity of the underyling signal. Another corollary of this is that implicit representations have "infinite resolution" - they can be sampled at arbitrary spatial resolutions.</p><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px; text-align: justify;">This is immediately useful for a number of applications, such as super-resolution, or in parameterizing signals in 3D and higher dimensions, where memory requirements grow intractably fast with spatial resolution.</p><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px; text-align: justify;">However, in the future, the key promise of implicit neural representations lie in algorithms that directly operate in the space of these representations. 
In other words: What's the "convolutional neural network" equivalent of a neural network operating on images represented by implicit representations? Questions like these offer a path towards a class of algorithms that are independent of spatial resolution! [...]</p></blockquote><p style="background-color: white; box-sizing: border-box; color: #24292e; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji"; font-size: 16px; margin-bottom: 16px; margin-top: 0px;"></p><div style="text-align: justify;"><br /></div><div style="text-align: justify;">h/t <a href="https://ttic.uchicago.edu/~shubhendu/">Shubhendu Trivedi</a> (<a href="http://twitter.com/@_onionesque">@_onionesque</a>)</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a></div>
<div style="text-align: justify;"><br /></div>
<div style="text-align: justify;"><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0px;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.</div>
<div style="text-align: justify;"><br /></div><div style="text-align: justify;">Other links:</div>
<b><div style="text-align: justify;"><b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a>
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a></div></b><div style="text-align: justify;"><u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>
</div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-1447234342003180292020-12-21T16:56:00.003-06:002020-12-21T16:56:40.370-06:00Hardware Beyond Backpropagation: a Photonic Co-Processor for Direct Feedback Alignment ** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> ** <div><br /></div><div style="text-align: justify;">We presented this work at the <a href="https://beyondbackprop.github.io/" target="_blank">Beyond Backpropagation workshop at NeurIPS</a>. A great conjunction between computational hardware and algorithm! </div><div><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3o711itbtH7n_OwemdQBxvDFOWBQ_xn4tP8P_xd3R5tIXVrWhOWsdav9TQQ9QYXSUDPqliMZEIUShL3-liTK-QDCMdao49nSZ88oTyuBXu4O4tDnv_R9MxXXIPRWKh88D7r53GA/s1014/greatconjunction2020.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="451" data-original-width="1014" height="178" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3o711itbtH7n_OwemdQBxvDFOWBQ_xn4tP8P_xd3R5tIXVrWhOWsdav9TQQ9QYXSUDPqliMZEIUShL3-liTK-QDCMdao49nSZ88oTyuBXu4O4tDnv_R9MxXXIPRWKh88D7r53GA/w400-h178/greatconjunction2020.png" width="400" /></a></div><br /><div><br /></div><div style="text-align: justify;"><a href="https://arxiv.org/pdf/2012.06373.pdf" target="_blank">Hardware Beyond Backpropagation: a Photonic Co-Processor for Direct Feedback Alignment</a> by <a href="https://arxiv.org/search/cs?searchtype=author&query=Launay%2C+J">Julien Launay</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Poli%2C+I">Iacopo Poli</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=M%C3%BCller%2C+K">Kilian Müller</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Pariente%2C+G">Gustave Pariente</a>, 
<a href="https://arxiv.org/search/cs?searchtype=author&query=Carron%2C+I">Igor Carron</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Daudet%2C+L">Laurent Daudet</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Krzakala%2C+F">Florent Krzakala</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Gigan%2C+S">Sylvain Gigan</a></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"></div><blockquote><div style="text-align: justify;">The scaling hypothesis motivates the expansion of models past trillions of parameters as a path towards better performance. Recent significant developments, such as GPT-3, have been driven by this conjecture. However, as models scale up, training them efficiently with backpropagation becomes difficult. Because model, pipeline, and data parallelism distribute parameters and gradients over compute nodes, communication is challenging to orchestrate: this is a bottleneck to further scaling. In this work, we argue that alternative training methods can mitigate these issues, and can inform the design of extreme-scale training hardware. Indeed, using a synaptically asymmetric method with a parallelizable backward pass, such as Direct Feedback Alignment, communication needs are drastically reduced. We present a photonic accelerator for Direct Feedback Alignment, able to compute random projections with trillions of parameters. We demonstrate our system on benchmark tasks, using both fully-connected and graph convolutional networks. Our hardware is the first architecture-agnostic photonic co-processor for training neural networks. 
This is a significant step towards building scalable hardware, able to go beyond backpropagation, and opening new avenues for deep learning.</div><div style="text-align: justify;"></div></blockquote><div style="text-align: justify;"><br /></div><div>
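For readers new to Direct Feedback Alignment (DFA), the idea behind the "synaptically asymmetric method with a parallelizable backward pass" can be sketched in a few lines of NumPy. This is only a toy illustration of the algorithm, not LightOn's photonic implementation; all sizes, learning rates, and variable names are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fully-connected net: x -> h -> y_hat (sizes invented for the demo).
n_in, n_hid, n_out = 8, 16, 4
W1 = rng.normal(0.0, 0.1, (n_hid, n_in))
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))
# DFA's key object: a FIXED random feedback matrix that projects the output
# error straight to the hidden layer -- no W2.T, no sequential backward pass,
# which is why the backward step parallelizes (and suits a random-projection
# accelerator).
B1 = rng.normal(0.0, 0.1, (n_hid, n_out))

x = rng.normal(size=n_in)
target = rng.normal(size=n_out)
lr = 0.05
for _ in range(200):
    a1 = W1 @ x
    h = np.tanh(a1)
    y_hat = W2 @ h
    e = y_hat - target                  # global error at the output
    # Backprop would compute: delta1 = (W2.T @ e) * (1 - h**2)
    delta1 = (B1 @ e) * (1.0 - h ** 2)  # DFA: fixed random projection instead
    W2 -= lr * np.outer(e, h)
    W1 -= lr * np.outer(delta1, x)

loss = float(np.mean((W2 @ np.tanh(W1 @ x) - target) ** 2))
print(loss)  # the loss still decreases despite the random feedback
```

Note that the only signal sent "backwards" is the output error `e`, broadcast through fixed random matrices: this is the communication pattern the paper argues is cheap to implement optically.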
</div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-54454010027087547472020-12-19T17:00:00.007-06:002020-12-19T17:00:02.931-06:00Diffraction-unlimited imaging based on conventional optical devices<div style="text-align: justify;">** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> ** </div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: left;"><a href="https://scholar.google.com/citations?hl=fr&user=Js_zdpYAAAAJ&view_op=list_works&sortby=pubdate" style="text-align: justify;" target="_blank">Aurélien</a> sent me an email back in October and we are now in December! Time flies.</div><div style="text-align: justify;"></div><blockquote><div style="text-align: justify;">Dear Igor,</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">I hope things are well.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">I have been following your NuitBlanche blog for quite a few years. It would thus be great for us if you consider a recent paper of ours to appear in your blog, entitled “Diffraction-unlimited imaging based on conventional optical devices”. This paper has been published in Optics Express this year and its link is: <a href="https://www.osapublishing.org/oe/abstract.cfm?uri=oe-28-8-11243">https://www.osapublishing.org/oe/abstract.cfm?uri=oe-28-8-11243</a></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">This manuscript proposes a new imaging paradigm for objects that are too far away to be illuminated or accessed, which allows them to be resolved beyond the limit of diffraction---which is thus distinct from the microscopy setting. 
Our concept involves an easy-to-implement acquisition procedure where a spatial light modulator (SLM) is placed some distance from a conventional optical device. After acquisition of a sequence of images for different SLM patterns, the object is reconstructed numerically. The key novelty of our acquisition approach is to ensure that the SLM modulates light before information is lost due to diffraction.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Feel free to let us know what you think, and happy to provide more information/pictures if needed. Thanks a lot for your time and consideration!</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Best regards,</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Aurélien Bourquard</div></blockquote><div style="text-align: justify;"></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Thank you <a href="https://scholar.google.com/citations?hl=fr&user=Js_zdpYAAAAJ&view_op=list_works&sortby=pubdate" target="_blank">Aurélien</a>! 
</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"> Here is the paper's abstract:</div><div style="text-align: justify;"><br /></div><div class="separator" style="clear: both; text-align: justify;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgfATfdGqepeWVnvVJc2HqR5XPoAd53M5Hxa4IFtNv5G21piAPcVQVMqxxWHUdUKwQsL6ecNtbPbOHzHwuIVhIpyvVrba_ORFkyhyAO-n12HUyRpieiohYsaNVYoJb6peWPjLL6aA/s757/acquisition+reconstruction+pipeline.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="467" data-original-width="757" height="246" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgfATfdGqepeWVnvVJc2HqR5XPoAd53M5Hxa4IFtNv5G21piAPcVQVMqxxWHUdUKwQsL6ecNtbPbOHzHwuIVhIpyvVrba_ORFkyhyAO-n12HUyRpieiohYsaNVYoJb6peWPjLL6aA/w400-h246/acquisition+reconstruction+pipeline.png" width="400" /></a></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><a href="https://www.osapublishing.org/oe/fulltext.cfm?uri=oe-28-8-11243&id=429676" target="_blank">Diffraction-unlimited imaging based on conventional optical devices</a> by <a href="https://www.creatis.insa-lyon.fr/~ducros/WebPage/index.html" target="_blank">Nicolas Ducros</a> and <a href="https://scholar.google.com/citations?hl=fr&user=Js_zdpYAAAAJ&view_op=list_works&sortby=pubdate" target="_blank">Aurélien Bourquard</a></div><div style="text-align: justify;"><blockquote style="text-align: justify;">We propose a computational paradigm where off-the-shelf optical devices can be used to image objects in a scene well beyond their native optical resolution. By design, our approach is generic, does not require active illumination, and is applicable to several types of optical devices. 
It only requires the placement of a spatial light modulator some distance from the optical system. In this paper, we first introduce the acquisition strategy together with the reconstruction framework. We then conduct practical experiments with a webcam that confirm that this approach can image objects with substantially enhanced spatial resolution compared to the performance of the native optical device. We finally discuss potential applications, current limitations, and future research directions.</blockquote></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">I note that <a href="https://scholar.google.com/citations?hl=fr&user=Js_zdpYAAAAJ&view_op=list_works&sortby=pubdate">Aurélien</a> has also published some exciting research on Differential Imaging Forensics. His co-author <a href="https://www.creatis.insa-lyon.fr/~ducros/WebPage/index.html">Nicolas</a> also has some interesting work on Single Pixel cameras.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div><div style="text-align: center;"><br /></div><div><div style="text-align: center;"><br /></div>
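To make the acquisition idea concrete, here is a deliberately simplified 1D numerical analogue (my own toy model, not the authors' actual optical forward model): a fixed low-pass operator stands in for the diffraction-limited device, random masks stand in for the SLM patterns applied before the information is lost, and plain least squares recovers the fine-scale object:

```python
import numpy as np

rng = np.random.default_rng(1)

n, m, K = 32, 8, 24   # object pixels, sensor pixels, number of SLM patterns

# Stand-in for the diffraction-limited device: a low-pass operator D that
# averages blocks of 4 object pixels into one sensor pixel. D has rank m < n,
# so the device alone cannot resolve the object.
D = np.zeros((m, n))
for i in range(m):
    D[i, 4 * i:4 * (i + 1)] = 0.25

x = rng.random(n)              # unknown "high-resolution" object

# Acquisition: each SLM pattern modulates the light BEFORE the lossy operator,
# i.e. y_k = D @ diag(mask_k) @ x, so fine-scale content survives into y_k.
masks = rng.random((K, n))
A = np.vstack([D * masks[k] for k in range(K)])   # column-scaled D = D @ diag(mask_k)
y = A @ x                                         # K low-resolution images

# Reconstruction: least squares over all modulated acquisitions.
assert np.linalg.matrix_rank(A) == n              # modulation restored full rank
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.max(np.abs(x_hat - x)))                  # tiny in this noiseless toy
```

The point of the sketch is the ordering: with a single unmodulated shot only `m = 8` numbers survive, while modulating before the low-pass step makes the stacked system full rank, so the `n = 32`-pixel object becomes recoverable.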
</div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-83565057013629238112020-12-09T00:00:00.001-06:002020-12-09T00:00:07.339-06:00LightOn at #NeurIPS2020** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> ** <div><br /></div><div><br /><div class="separator" style="clear: both; text-align: left;">I posted the following on <a href="https://lighton.ai/">LightOn</a>'s <a href="https://medium.com/@LightOnIO/lighton-at-neurips2020-236182417a08">Blog</a>.</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjM1wyyuPVmGHlVpbDuBjhcV0wQXxK3y62ncllvUgWnl-88lq4bXwPe2IiTn0RnEelTQk8xbuog5zUhbZutKCzyB_vGf42d4r4s5bHAuqGKJEM6WkctEqOMCAyJSOZqfqfvOmRc4w/s1204/LightOn+Neurips.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="232" data-original-width="1204" height="78" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjM1wyyuPVmGHlVpbDuBjhcV0wQXxK3y62ncllvUgWnl-88lq4bXwPe2IiTn0RnEelTQk8xbuog5zUhbZutKCzyB_vGf42d4r4s5bHAuqGKJEM6WkctEqOMCAyJSOZqfqfvOmRc4w/w400-h78/LightOn+Neurips.png" width="400" /></a></div><div style="text-align: justify;"><br /></div><div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><span style="background-color: white; color: #292929; font-family: charter, Georgia, Cambria, "Times New Roman", Times, serif; letter-spacing: -0.003em;"><br /></span></div><div style="text-align: justify;">We live in interesting times!</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">A combination of post-Moore’s law era and the advent of very large ML models require all of us to think up new approaches to computing hardware and AI 
algorithms at the same time. <a href="https://lighton.ai/">LightOn</a> is one of the few (20) companies in the world publishing in both AI and hardware venues to engage both communities into thinking how theories and workflows may eventually be transformed by the photonic technology we develop.</div><div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">This year, thanks to the awesome Machine Learning team at <a href="https://lighton.ai/">LightOn</a>, we have two accepted papers at <a href="https://neurips.cc/">NeurIPS</a>, the AI flagship conference, and have five papers in its “<a href="https://beyondbackprop.github.io/">Beyond Backpropagation</a>” satellite workshop that will take place on Saturday. This is significant on many levels, not the least being that these papers have been nurtured and spearheaded by two Ph.D. students (<a href="https://rubenohana.github.io/">Ruben Ohana</a> and <a href="https://lolo.science/">Julien Launay</a>) who are doing their theses as LightOn engineers.</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Here is the list of the different papers accepted at NeurIPS this year that involved LightOn members:</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><ul><li><b>Reservoir Computing meets Recurrent Kernels and Structured Transforms</b>, <a href="https://neurips.cc/virtual/2020/protected/papers.html?filter=authors&search=Jonathan%20Dong">Jonathan Dong</a>, <a href="https://neurips.cc/virtual/2020/protected/papers.html?filter=authors&search=Ruben%20Ohana">Ruben Ohana</a>, <a href="https://neurips.cc/virtual/2020/protected/papers.html?filter=authors&search=Mushegh%20Rafayelyan">Mushegh Rafayelyan</a>, <a href="https://neurips.cc/virtual/2020/protected/papers.html?filter=authors&search=Florent%20Krzakala">Florent Krzakala</a>. 
Links: <a href="https://neurips.cc/virtual/2020/protected/session_oral_21072.html">Oral</a>, <a href="https://neurips.cc/virtual/2020/protected/poster_c348616cd8a86ee661c7c98800678fad.html">poster</a>, <a href="https://proceedings.neurips.cc//paper_files/paper/2020/hash/c348616cd8a86ee661c7c98800678fad-Abstract.html">paper</a> (Presenter: <a href="https://rubenohana.github.io/">Ruben Ohana</a>). Poster Session 4 on Wed, Dec 9th, 2020 @ 18:00–20:00 CET. GatherTown: Deep learning ( <a href="https://neurips.gather.town/app/zkzLGtGEWnRhM4J8/posterRoomE1?spawnx=7&spawny=35&map=neuripscustom-entrance">Town E1 — Spot C0 </a>) <a href="https://neurips.gather.town/app/zkzLGtGEWnRhM4J8/posterRoomE1?spawnx=7&spawny=35&map=neuripscustom-entrance">Join GatherTown</a>. Only if the poster is crowded, <a href="https://us02web.zoom.us/j/89407788437?pwd=OFMrd0EvZExaL25LcXUrRTJ4Q24zdz09">join Zoom</a></li><li><b>Direct Feedback Alignment Scales to Modern Deep Learning Tasks and Architectures</b>, Julien Launay, François Boniface, Iacopo Poli, Florent Krzakala. Links: <a href="https://neurips.cc/virtual/2020/protected/poster_69d1fc78dbda242c43ad6590368912d4.html">Poster</a>, <a href="https://proceedings.neurips.cc//paper_files/paper/2020/hash/69d1fc78dbda242c43ad6590368912d4-Abstract.html">paper</a> (Presenter: <a href="https://lolo.science/">Julien Launay</a>). Poster Session 6 on Thu, Dec 10th, 2020 @ 18:00–20:00 CET. 
GatherTown: Neuroscience and Cognitive Science ( <a href="https://neurips.gather.town/app/UBoZsXcMD6omtfsI/posterRoomA3?spawnx=7&spawny=22&map=neuripscustom-entrance">Town A3 — Spot B0 </a>)</li></ul></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">And at the <a href="https://neurips.cc/virtual/2020/protected/workshop_16108.html">NeurIPS Beyond Backpropagation workshop</a> taking place on Saturday, December 12:</div><div style="text-align: justify;"><br /></div><div class="separator" style="clear: both; text-align: left;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMSDnra32NeYQkdJIhuOoYyxU98SnSAODl_-q_akTFgv6GrWyvT_8fcI1VPLun5R2BxbXZEHV03WyWJwcJ7VFrSledbiHNaQZD_96TS35NIgW1ym0Wnsr46ae-yXt9YxdrqpjhGw/s1276/beyond+backprop.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="459" data-original-width="1276" height="144" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMSDnra32NeYQkdJIhuOoYyxU98SnSAODl_-q_akTFgv6GrWyvT_8fcI1VPLun5R2BxbXZEHV03WyWJwcJ7VFrSledbiHNaQZD_96TS35NIgW1ym0Wnsr46ae-yXt9YxdrqpjhGw/w400-h144/beyond+backprop.png" width="400" /></a></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><ul><li><b>Hardware Beyond Backpropagation: a Photonic Co-Processor for Direct Feedback Alignment</b>, <a href="https://slideslive.com/s/julien-launay-43082">Julien Launay</a>, <a href="https://slideslive.com/s/iacopo-poli-43083">Iacopo Poli</a>, <a href="https://slideslive.com/s/kilian-muller-54599">Kilian Muller</a>, <a href="https://slideslive.com/s/igor-carron-54600">Igor Carron</a>, <a href="https://slideslive.com/s/laurent-daudet-51766">Laurent Daudet</a>, <a href="https://slideslive.com/s/florent-krzakala-17753">Florent Krzakala</a>, <a href="https://slideslive.com/s/sylvain-gigan-54601">Sylvain Gigan</a></li><li><b>Direct Feedback Alignment Scales to Modern Deep Learning Tasks 
and Architectures</b>, Julien Launay, François Boniface, Iacopo Poli, Florent Krzakala (Presenter: Julien Launay).</li><li><b>Ignorance is Bliss: Adversarial Robustness by Design through Analog Computing and Synaptic Asymmetry</b>, Alessandro Cappelli, Ruben Ohana, Julien Launay, Iacopo Poli, Florent Krzakala (Presenter: Alessandro Cappelli). We had a <a href="https://medium.com/@LightOnIO/ignorance-is-bliss-adversarial-robustness-by-design-with-lighton-opus-4f143fa629b">blog post on this recently</a>.</li><li><b>Align, then Select: Analysing the Learning Dynamics of Feedback Alignment</b>, Maria Refinetti, Stéphane d’Ascoli, Ruben Ohana, Sebastian Goldt <a href="https://arxiv.org/abs/2011.12428">paper</a> (Presenter: Ruben Ohana).</li><li><b>How and When does Feedback Alignment Work</b>, Stéphane d’Ascoli, Maria Refinetti, Ruben Ohana, Sebastian Goldt. <a href="https://arxiv.org/abs/2011.12428">paper</a> (Presenter: Ruben Ohana)</li></ul></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Some of these presentations are given in French at the <a href="https://medium.com/@ParisMLgroup/le-programme-des-d%C3%A9jeuners-virtuels-de-neurips2020-7c10f94609fd">“Déjeuners virtuels de NeurIPS”</a></div><br /><br /></div><div>
</div></div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-35483812890861395792020-10-14T04:55:00.005-05:002020-10-14T05:03:11.251-05:00 Weight Agnostic Neural Networks, a virtual presentation by Adam Gaier, Thursday October 15th, LightOn AI meetup #7<div style="text-align: center;">** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> ** </div><div style="text-align: center;"><br /></div><div style="text-align: justify;">Ever since we started LightOn, we have been putting some emphasis on having great minds think how new algorithms are possible and how they can be enabled with our photonic chips. We also have a regular meetup where we see how other great minds are devising new algorithms. 
</div><div style="text-align: justify;"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgD5fkctzKkugNlneXAkzPfxblPUkH-bYZHjAkaQlyrHnvwg70AV29MVtCMG5qPj_BEHoEat0YsT0cXM91l7d9vhF_oBmQ2moWzAaekpStdTWzbT1rpADUYXoSbW3shxTR7AIuOBQ/s1089/LightOn_meetup7.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="565" data-original-width="1089" height="166" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgD5fkctzKkugNlneXAkzPfxblPUkH-bYZHjAkaQlyrHnvwg70AV29MVtCMG5qPj_BEHoEat0YsT0cXM91l7d9vhF_oBmQ2moWzAaekpStdTWzbT1rpADUYXoSbW3shxTR7AIuOBQ/w320-h166/LightOn_meetup7.png" width="320" /></a></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Tomorrow, Thursday (October 15th), we are continuing this journey by having <a href="https://www.linkedin.com/in/adamgaier/">Adam Gaier</a>, who will talk to us about Weight Agnostic Neural Networks. The virtual meetup will start at:</div><div><ul style="text-align: left;"><li>16:00 (UTC+2) Paris time but also </li><li>7AM PST, </li><li>10PM CST, </li><li>11PM JST. 
</li></ul></div><div><div style="text-align: justify;">To have more information about connecting to the meetup, please register here: <a href="https://t.co/3mEZqX18Z5?amp=1">https://meetup.com/LightOn-meetup/events/273660363/</a></div><div><br /></div><div style="text-align: justify;"><br /></div><div><div style="text-align: justify;"><br /></div>
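As a quick sanity check on the listed start times (reading CST as China Standard Time), Python's standard zoneinfo module can convert the 16:00 Paris start for the meetup date:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

start = datetime(2020, 10, 15, 16, 0, tzinfo=ZoneInfo("Europe/Paris"))
for city in ["America/Los_Angeles", "Asia/Shanghai", "Asia/Tokyo"]:
    # DST is handled automatically (Paris is still on CEST on this date).
    print(city, start.astimezone(ZoneInfo(city)).strftime("%I:%M %p"))
```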
</div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-81161614926448136662020-10-10T08:36:00.002-05:002020-10-10T08:36:30.786-05:00As The World Turns: Implementations now on ArXiv thanks to Paper with Code** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> ** <div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyiAyEtRhZZZgx0_ImpNyzEKD2MKqFSYTCFF_vHqqTwisaeCcZ6u-3wPy-W_yfOUKthzCuMVEdtR1enR4VJ9tacU8ROBpPm-8T8mFC0MIBxDyDGxd_3msjjhYRLSFD_Y8m0txMNA/s669/rainbow.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="666" data-original-width="669" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyiAyEtRhZZZgx0_ImpNyzEKD2MKqFSYTCFF_vHqqTwisaeCcZ6u-3wPy-W_yfOUKthzCuMVEdtR1enR4VJ9tacU8ROBpPm-8T8mFC0MIBxDyDGxd_3msjjhYRLSFD_Y8m0txMNA/s320/rainbow.png" width="320" /></a></div><br /><div><div><br /></div><div style="text-align: justify;"><i>It's the little things. </i></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">In the 2000s, after featuring good work on <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a>, I was usually following through by asking authors where their codes were. This is how the <a href="https://nuit-blanche.blogspot.com/search/label/implementation">implementation tag</a> was born. Some of the answers were along the lines of: "I didn't make it available because I thought it was not worthy". 
But what I usually responded was that, in effect, releasing one's code had a compounding effect on the community: </div><div style="text-align: justify;"><i><blockquote>"You may not think it's worthy of release, but somehow, someone somewhere needs your code for reasons you cannot fathom"</blockquote></i></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"> As a result, I made a conscious choice to feature those papers that made their implementations available. The earliest post with featured implementations was <a href="https://nuit-blanche.blogspot.com/2007/02/there-are-now-three-compressed-sensing.html">February 28th, 2007 with a blog post featuring three different implementations of reconstruction solvers for compressed sensing</a>. Yes, implementations were already available before that, but within the compressive sensing community, it was a point in time with a collective realization that releasing one's code would bring others to reuse one's work and advance the field as a result. At some point, I started making a <a href="https://nuit-blanche.blogspot.com/p/blog-page_4.html">long list of implementations available</a> but got swamped after a while because it became, most of the time, the default behavior (a good thing).</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Five years ago, <a href="https://github.com/samim23">Samim Winiger</a> started <a href="https://github.com/samim23/GitXiv">GitXiv</a> around Machine Learning papers. <a href="https://nuit-blanche.blogspot.com/2016/02/gitxiv-most-awesomest-page-on-interweb.html">I was ecstatic</a>, but the site eventually stopped working. Two years ago, the <a href="https://www.blogger.com/#">Papers with Code</a> site started around the same need and flourished. 
Congratulations to <a href="https://twitter.com/rbstojnic">Robert</a>, <a href="https://twitter.com/rosstaylor90">Ross</a>, <a href="https://twitter.com/misterkardas">Marcin</a>, <a href="https://twitter.com/ViktorKerkez">Viktor</a>, and <a href="https://twitter.com/LudovicViaud">Ludovic</a> on starting a vibrant community around this need for listing papers with their attendant code. Two days ago, the next logical step occurred with the <a href="https://medium.com/paperswithcode/papers-with-code-partners-with-arxiv-ecc362883167">featuring of codes within ArXiv, a fantastic advance for Science. Woohoo!</a></div><div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Congratulations again to the whole team on making this happen! 
</div><div style="text-align: justify;"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNznm4leq6DboLGZ8VzUaXoem_2cArkquTWOZnR655J9ZNs_xEex0kFVce5JMxqPCHEIuaFP9EBLCHywUMjEACdlh1KvjmuRiFOhoHGCDxZIBwi8AqVagVJUWayMG3AtQJ6xjM9w/s875/papers+with+code+arxiv.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="729" data-original-width="875" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNznm4leq6DboLGZ8VzUaXoem_2cArkquTWOZnR655J9ZNs_xEex0kFVce5JMxqPCHEIuaFP9EBLCHywUMjEACdlh1KvjmuRiFOhoHGCDxZIBwi8AqVagVJUWayMG3AtQJ6xjM9w/s320/papers+with+code+arxiv.png" width="320" /></a></div><br /><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">My next question is: </div><div style="text-align: center;"><i>When are they going to get a prize for this?</i></div><div style="text-align: justify;"><br /></div><div><br />
Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a><br />
<br />
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0px;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.<br />
<br />
Other links:<br />
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a><br />
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a><br />
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div></div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-47753632046186067772020-05-29T09:56:00.001-05:002020-05-29T09:57:09.107-05:00Photonic Computing for Massively Parallel AI is out and it is spectacular!<div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEisJExkX6SzBIE-aLHk2PNnAaAYTdmE1goGaoSNoeQwZHZz5XzxEYQgb0u567W4D0Q5h4UEre0ZyqvC5Vpy4g8JeNw0D5f3k2n-Ems6DZshLuZuh0VpAC0ej6AZ54bBPbSJl6WMcg/" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="555" data-original-width="878" height="253" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEisJExkX6SzBIE-aLHk2PNnAaAYTdmE1goGaoSNoeQwZHZz5XzxEYQgb0u567W4D0Q5h4UEre0ZyqvC5Vpy4g8JeNw0D5f3k2n-Ems6DZshLuZuh0VpAC0ej6AZ54bBPbSJl6WMcg/w400-h253/white+paper+Lighton+logo.png" width="400" /></a></div><div><br /></div><br /><div style="text-align: justify;">It’s been a long time brewing, but we just released our first <a href="https://lighton.ai/wp-content/uploads/2020/05/LightOn-White-Paper-v1.0-S.pdf">white paper on Photonic Computing for Massively Parallel AI</a>. The document features the technology we develop at <a href="http://lighton.ai">LightOn</a>, some of its uses, some testimonials, and how we see the future of computing. 
It is downloadable <a href="https://lighton.ai/wp-content/uploads/2020/05/LightOn-White-Paper-v1.0-S.pdf">here</a> or from our website: <a href="https://lighton.ai/">LightOn.ai</a></div><div style="text-align: justify;"><br /></div><div style="text-align: justify;">Enjoy!</div><div style="text-align: justify;"><br /></div><div style="text-align: justify;"><br /></div><div><br />
Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a><br />
<br />
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0px;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.<br />
<br />
Other links:<br />
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a><div><b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a><div>
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div></div></div>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-71555876770802550562020-05-15T00:00:00.000-05:002020-05-15T00:00:02.831-05:00Tackling Reinforcement Learning with the Aurora OPU** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh5dvvehLIX3PxgVckPWg3L0_sz9xtFqc51BwmpHkhnwo5_C8kB1fFpruXnS9OaJeWCbMINWLPtDLF2w9EV3-HvKtV0fHzKyb5rbmeNwm9Eq-i2qbjMYENTuRKc2NtE5LRxmfANIA/s1600/episodic+control+LightOn+OPU.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="688" data-original-width="1250" height="220" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh5dvvehLIX3PxgVckPWg3L0_sz9xtFqc51BwmpHkhnwo5_C8kB1fFpruXnS9OaJeWCbMINWLPtDLF2w9EV3-HvKtV0fHzKyb5rbmeNwm9Eq-i2qbjMYENTuRKc2NtE5LRxmfANIA/s400/episodic+control+LightOn+OPU.png" width="400" /></a></div>
<br />
<br />
<div style="text-align: justify;">
<a href="https://www.linkedin.com/in/martin-graive/?originalSubdomain=fr">Martin Graive</a> did an internship at <a href="https://lighton.ai/">LightOn</a> and decided to investigate how to use Random Projections in the context of Reinforcement Learning. He just wrote a blog post on the matter entitled "<a href="https://medium.com/@LightOnIO/tackling-reinforcement-learning-with-the-aurora-opu-88f3ffff137a">Tackling Reinforcement Learning with the Aurora OPU</a>". The attendant <a href="https://github.com/lightonai/reinforcement-learning-opu">GitHub repo is located here</a>. Enjoy!</div>
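<div style="text-align: justify;">For readers curious about the underlying trick the post builds on: a fixed random projection compresses a high-dimensional observation into a short key while approximately preserving distances, so nearest-neighbor lookups in the projected space (as in episodic-control memory tables) remain meaningful. Below is a minimal NumPy sketch with illustrative dimensions; the Gaussian matrix stands in for the kind of random transform an OPU computes optically, and none of this is LightOn's actual code.</div>

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative sizes: flattened 84x84 grayscale frames -> 256-dim keys.
obs_dim, key_dim = 84 * 84, 256

# Fixed Gaussian random matrix, scaled so norms are preserved on average.
P = rng.normal(size=(key_dim, obs_dim)) / np.sqrt(key_dim)

def embed(obs):
    """Project a flattened observation to a compact key, e.g. for use
    as an index into an episodic-control memory table."""
    return P @ np.ravel(obs)

# Johnson-Lindenstrauss in action: pairwise distances survive projection.
a, b = rng.normal(size=obs_dim), rng.normal(size=obs_dim)
ratio = np.linalg.norm(embed(a) - embed(b)) / np.linalg.norm(a - b)
print(f"distance ratio after projection: {ratio:.3f}")  # close to 1
```

<div style="text-align: justify;">Because the projection is fixed and data-independent, it needs no training, which is what makes it a natural fit for fixed optical hardware.</div>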
<div>
<div style="text-align: center;">
<br /></div>
</div>
<div>
<div style="text-align: center;">
<iframe allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="" frameborder="0" height="315" src="https://www.youtube.com/embed/2K7p4PTYxXw" width="440"></iframe>
</div>
<div>
<div style="text-align: center;">
<br /></div>
</div>
<div>
<br />
<br />
Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a><br />
<br />
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.<br />
<br />
Other links:<br />
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a><br />
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a><br />
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>
</div>
Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-63239801946497632992020-04-29T07:49:00.001-05:002020-05-10T11:43:37.798-05:003-year PhD studentship in Inverse Problems and Optical Computing, LightOn, Paris, France<div style="text-align: center;">
** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **
</div>
<br />
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Come and join us at <a href="https://lighton.ai/">LightOn</a>: we have a 3-year PhD fellowship available for someone who can help us build our future photonic cores. Here is the announcement: </div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCZFr8vjEhCJVuuUoOqhoH4rNYYBWF3R66cp38AaDWFoFC4eyQ0uhIdpXz6xGll8reLMGYK8NH3EOgZpi1Nni1g_8vm-PVwNfKrnpm8v2uRLLdDtwWq82jj0i3fND7AHAAmAOJfg/s1600/LightOn+Logo+with+Baseline.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="686" data-original-width="1600" height="171" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCZFr8vjEhCJVuuUoOqhoH4rNYYBWF3R66cp38AaDWFoFC4eyQ0uhIdpXz6xGll8reLMGYK8NH3EOgZpi1Nni1g_8vm-PVwNfKrnpm8v2uRLLdDtwWq82jj0i3fND7AHAAmAOJfg/s400/LightOn+Logo+with+Baseline.jpg" width="400" /></a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
As part of the newly EU-funded ITN project “Post-Digital”, <a href="https://lighton.ai/">LightOn</a> has an opening for a fully funded 3-year Ph.D. studentship to join its R&D team, at the crossroads between Computer Science and Physics. </div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
The goal of this 3-year Ph.D. position is to theoretically, numerically, and experimentally investigate how optimization techniques can be used in the design of hybrid computing pipelines, including a number of photonic building blocks (“photonic cores”). In particular, the optimized networks will be used to solve large-scale physics-based inverse problems in science and engineering - for instance in medical imaging (e.g. ultrasound) or in simulation problems. The candidate will first investigate how LightOn’s current range of photonic co-processors can be integrated within task-specific networks. The candidate will then develop a computational framework for the optimization of electro-optical systems. Finally, optimized systems will be built and evaluated on experimental data. This project will be part of LightOn’s internal THEIA project, aiming at automating the design of hybrid computing architectures, including combinations of LightOn’s photonic cores and traditional silicon chips.</div>
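<div style="text-align: justify;">To make the "physics-based inverse problem" framing concrete, here is a toy sketch, entirely illustrative: a random matrix A stands in for a linear forward model (in the imaging applications mentioned above, A would encode actual physics, e.g. ultrasound propagation), and the signal is recovered by gradient descent on the least-squares objective. This is a textbook exercise, not the project's actual pipeline.</div>

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward model: 200 measurements of a 50-dim signal.
m, n = 200, 50
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = rng.normal(size=n)
y = A @ x_true + 0.01 * rng.normal(size=m)  # noisy measurements

# Least-squares recovery by gradient descent on ||Ax - y||^2 / 2.
x = np.zeros(n)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / sigma_max(A)^2, safe step size
for _ in range(2000):
    x -= step * A.T @ (A @ x - y)

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.3f}")
```

<div style="text-align: justify;">The optimization question in the project description is then one level up: not only recovering x for a fixed A, but co-designing the (electro-optical) forward operator and the reconstruction network together.</div>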
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
In the framework of the EU funded ITN Post-Digital network, this project involves collaborations and 3-month secondments with two research groups led by:</div>
<div style="text-align: justify;">
<ul>
<li><a href="https://members.femto-st.fr/daniel-brunner/">Daniel Brunner</a> (Université Bourgogne Franche-Comté / FEMTO-ST Besançon), who will be the academic supervisor - The candidate will be registered as a Ph.D. student at UBFC.</li>
<li><a href="https://www.photonics.intec.ugent.be/contact/people.asp?ID=5">Pieter Bienstman</a> (IMEC, Leuven, Belgium).</li>
</ul>
</div>
<div style="text-align: justify;">
The supervisor at <a href="https://lighton.ai/">LightOn</a> will be Laurent Daudet, CTO - currently on leave from his position of professor of physics at Université de Paris.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Due to the EU funding source, please make sure you comply with the mobility and eligibility rules before applying. Application: the position is to be filled no later than Sept. 1st, 2020.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Send your application with a CV to <a href="mailto:jobs@lighton.io">jobs@lighton.io</a> with [Post-Digital PhD] in the subject line. Shortlisted applicants will be asked to provide references. This project has received funding from the European Union’s Horizon 2020 research and innovation program under the Marie Skłodowska-Curie grant agreement No 860830.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
For more information: <a href="https://slack-redir.net/link?url=https%3A%2F%2Flighton.ai%2Fcareers%2F">https://lighton.ai/careers/</a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a><br />
<br />
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.<br />
<br />
Other links:<br />
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a><br />
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a><br />
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-37625281550015072012020-04-07T10:53:00.000-05:002020-04-07T10:53:47.132-05:00LightOn Cloud 2.0 featuring LightOn Aurora OPUs** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjO-2T8_H8g3wC7ttNF3dZmuD4CdyBbJFjINE5uZS-WTmSgFSTJJe4YLCoKGcKcv2bhafawCzNnqRKdSRVQLTeOBNpz_Wu9I6d48gnf2ovu6ye8iOrBHw-IO9ag408I9ln5hhTayA/s1600/welcome+to+new+lighton+cloud.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="641" data-original-width="1400" height="182" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjO-2T8_H8g3wC7ttNF3dZmuD4CdyBbJFjINE5uZS-WTmSgFSTJJe4YLCoKGcKcv2bhafawCzNnqRKdSRVQLTeOBNpz_Wu9I6d48gnf2ovu6ye8iOrBHw-IO9ag408I9ln5hhTayA/s400/welcome+to+new+lighton+cloud.png" width="400" /></a></div>
<br />
<br />
<div style="text-align: justify;">
At <a href="http://lighton.ai/">LightOn</a>, we just launched LightOn Cloud 2.0, which features several Aurora Optical Processing Units for use by the Machine Learning community. The blog post about it <a href="https://medium.com/@LightOnIO/welcome-to-lighton-cloud-2-0-featuring-lighton-aurora-opus-f2f77a89f196">can be found here</a>. You can request access to the Cloud at <a href="https://cloud.lighton.ai/">https://cloud.lighton.ai/</a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
We also have a LightOn Cloud for Research program: <a href="https://cloud.lighton.ai/lighton-research/">https://cloud.lighton.ai/lighton-research/</a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgsjEtiZDZA1Fjo6FP1He-V78EkJvfI4SnT4OPc5n_y5CcinVQixhfBZXIV5CwQZ4tfFMBTb8JIU-YULs8wulfEcPLHSV-gkA_kGPGN-aSboOjFCmiF5P_6u3R5TL82sM1zn0VyjQ/s1600/lighton+cloud+2+5+auroras+v3.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="483" data-original-width="429" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgsjEtiZDZA1Fjo6FP1He-V78EkJvfI4SnT4OPc5n_y5CcinVQixhfBZXIV5CwQZ4tfFMBTb8JIU-YULs8wulfEcPLHSV-gkA_kGPGN-aSboOjFCmiF5P_6u3R5TL82sM1zn0VyjQ/s320/lighton+cloud+2+5+auroras+v3.png" width="284" /></a></div>
<br />
<div style="text-align: justify;">
<ul>
<li>[En] Press Release: <a href="https://www.lighton.ai/wp-content/uploads/2020/04/LightOnPressRelease_April2020_EN.pdf">LightOn launches LightOn Cloud 2.0 featuring Aurora OPUs, </a>April 7th, 2020 </li>
<li>[Fr] Communiqué de presse: <a href="https://www.lighton.ai/wp-content/uploads/2020/04/LightOnPressRelease_April2020_FR.pdf">LightOn lance le LightOn Cloud 2.0 avec des OPUs Aurora, </a>7 Avril 2020 </li>
</ul>
</div>
<div style="text-align: justify;">
<br /></div>
<br /><br />
Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a><br />
<br />
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.<br />
<br />
Other links:<br />
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a><br />
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a><br />
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-52182583153044573352020-03-26T00:00:00.000-05:002020-03-26T00:00:05.965-05:00Accelerating SARS-COv2 Molecular Dynamics Studies with Optical Random Features** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **
<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiU5OOwjzQpOX4CVxAIaZ2znHv55YHkD7mRud9jLGhXaLtYWy9zwiL9iehyNYBRexjr5OZi0LYR0Y2C3gsZK-aoSu9uTwYoTES27kYBREwng07syO5QRdEpsa8C_uw-YTjI1MlOQQ/s1600/covid19map.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="574" data-original-width="1000" height="228" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiU5OOwjzQpOX4CVxAIaZ2znHv55YHkD7mRud9jLGhXaLtYWy9zwiL9iehyNYBRexjr5OZi0LYR0Y2C3gsZK-aoSu9uTwYoTES27kYBREwng07syO5QRdEpsa8C_uw-YTjI1MlOQQ/s400/covid19map.png" width="400" /></a></div>
<br />
<div style="text-align: justify;">
We just published a new <a href="https://www.lighton.ai/blog/">blog post</a> at <a href="http://www.lighton.ai/">LightOn</a>. This time, we used <a href="http://lighton.ai/">LightOn</a>'s Optical Processing Unit to show how our hardware can speed up global sampling studies that rely on <a href="https://en.wikipedia.org/wiki/Molecular_dynamics">Molecular Dynamics simulations</a>, such as <a href="https://en.wikipedia.org/wiki/Metadynamics">metadynamics</a>. Our engineer, <a href="https://www.linkedin.com/in/amelie-chatelain/">Amélie Chatelain</a>, wrote about it here: <a href="https://medium.com/@LightOnIO/accelerating-sars-cov2-molecular-dynamics-studies-with-optical-random-features-b8cffdb99b01">Accelerating SARS-COv2 Molecular Dynamics Studies with Optical Random Features</a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<div class="graf graf--p" name="a9c1">
We showed that LightOn's OPU, in tandem with the <a href="https://arxiv.org/abs/1805.08061">NEWMA algorithm</a>, outperforms CPU implementations of <a href="https://people.eecs.berkeley.edu/~brecht/papers/07.rah.rec.nips.pdf">Random Fourier Features</a> and <a href="http://proceedings.mlr.press/v28/le13.pdf">FastFood</a> for simulations featuring more than 4,000 atoms.</div>
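For readers unfamiliar with the CPU baseline, Random Fourier Features approximate a shift-invariant kernel with an explicit, finite-dimensional random feature map. Here is a minimal NumPy sketch (illustrative parameter values and names; this is not LightOn's or the post's code):

```python
import numpy as np

def random_fourier_features(X, n_features=1024, gamma=1.0, seed=0):
    """Feature map Z such that Z(x) @ Z(y) ~ exp(-gamma * ||x - y||^2)
    (Rahimi & Recht, 2007)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density, phases uniform.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Inner products of the random features approximate the Gaussian kernel.
X = np.random.default_rng(1).normal(size=(100, 10))
Z = random_fourier_features(X, n_features=4096, gamma=0.1)
K_approx = Z @ Z.T
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K_exact = np.exp(-0.1 * sq_dists)
max_err = np.abs(K_approx - K_exact).max()
```

The OPU performs an analogous random projection optically, which is why the comparison in the post becomes interesting at large dimensions, where computing `X @ W` on a CPU dominates the cost.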
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhGrCalwB5T3RKfsq_bX7x5Guz7T-IxC2x9z9vwi1gGg828hDKCmWV8wR3eIfP4bUNbadQMprvuT9DCdGX4Za4pNfeLLFHxo1rNmH9b2818Z7NDwYCpcljYcPIaexIZmXmtIRxofA/s1600/comparison+rff+ff+opu.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="600" data-original-width="800" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhGrCalwB5T3RKfsq_bX7x5Guz7T-IxC2x9z9vwi1gGg828hDKCmWV8wR3eIfP4bUNbadQMprvuT9DCdGX4Za4pNfeLLFHxo1rNmH9b2818Z7NDwYCpcljYcPIaexIZmXmtIRxofA/s400/comparison+rff+ff+opu.png" width="400" /></a></div>
<div class="graf graf--p" name="a9c1">
</div>
<div class="graf graf--p" name="a9c1">
<br /></div>
</div>
<div style="text-align: justify;">
Because building computational hardware makes no sense if we don't have a community that lifts us, the code used to generate the plots in that blog post is publicly available at the following link: <a href="https://github.com/lightonai/newma-md">https://github.com/lightonai/newma-md</a>.</div>
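The idea behind NEWMA, which the newma-md code applies to Molecular Dynamics trajectories, is compact enough to sketch: keep two exponentially weighted moving averages of random features of the incoming samples, with different forgetting factors, and monitor their distance, which spikes when the underlying distribution changes. A hedged toy illustration (parameter values and the synthetic stream are made up here; see the repository for the real implementation):

```python
import numpy as np

def newma_statistic(stream, n_features=200, lam_fast=0.05, lam_slow=0.01, seed=0):
    """NEWMA-style change detection: distance between a fast and a slow
    EWMA of random Fourier features of the stream."""
    rng = np.random.default_rng(seed)
    d = stream.shape[1]
    W = rng.normal(scale=1.0 / np.sqrt(d), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    m_fast = np.zeros(n_features)
    m_slow = np.zeros(n_features)
    stats = []
    for x in stream:
        z = np.cos(x @ W + b)                        # random features of one sample
        m_fast = (1.0 - lam_fast) * m_fast + lam_fast * z
        m_slow = (1.0 - lam_slow) * m_slow + lam_slow * z
        stats.append(np.linalg.norm(m_fast - m_slow))
    return np.array(stats)

# Synthetic stream whose distribution shifts at t = 500.
rng = np.random.default_rng(1)
stream = np.concatenate([rng.normal(0.0, 1.0, size=(500, 5)),
                         rng.normal(3.0, 1.0, size=(500, 5))])
stats = newma_statistic(stream)
# The statistic should be markedly larger shortly after the change point.
```

Because the only per-sample work is a random projection followed by a pointwise nonlinearity, the projection can be delegated to the OPU, which is the speed-up the post measures.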
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Other links:</div>
<div style="text-align: justify;">
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a><br />
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a></div>
<div style="text-align: justify;">
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>
Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-66020994986531921682020-03-14T08:12:00.000-05:002020-03-14T08:12:08.910-05:00Au Revoir Backprop ! Bonjour Optical Transfer Learning !** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhuLR2C1lOptlPWtCgcutI5skSwucVMhc2VBQ9P0124TWwOSpipV9YLbqlvey618IH7Vp18yhyU47xDGSSn_IRau8AOnlZo4rvBetDo5RCWZI-rbmrKEFUtGb-am8NG_DsFXRwmrg/s1600/optical+transfer+learning+meme.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="500" data-original-width="750" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhuLR2C1lOptlPWtCgcutI5skSwucVMhc2VBQ9P0124TWwOSpipV9YLbqlvey618IH7Vp18yhyU47xDGSSn_IRau8AOnlZo4rvBetDo5RCWZI-rbmrKEFUtGb-am8NG_DsFXRwmrg/s320/optical+transfer+learning+meme.png" width="320" /></a></div>
<br />
<div style="text-align: justify;">
We recently used <a href="http://lighton.ai/">LightOn</a>'s Optical Processing Unit to show how our hardware fares in the context of transfer learning. Our engineer, <a href="https://www.linkedin.com/in/giuseppelucatommasone/">Luca Tommasone</a>, wrote about it here: <a href="https://medium.com/@LightOnIO/au-revoir-backprop-bonjour-optical-transfer-learning-5f5ae18e4719">Au Revoir Backprop! Bonjour Optical Transfer Learning!</a></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Because building computational hardware makes no sense if we don't have a community that lifts us, the code used to generate the plots in that blog post is publicly available at the following link: <a href="https://github.com/lightonai/transfer-learning-opu">https://github.com/lightonai/transfer-learning-opu</a>.</div>
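For intuition, the pipeline in the post (fixed pretrained CNN features, then a nonlinear random projection, which the OPU computes optically as the squared modulus of a complex random projection, then a linear classifier) can be sketched in NumPy. Everything below is an illustrative stand-in with synthetic "features" and made-up dimensions, not the transfer-learning-opu code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for fixed pretrained CNN features (e.g. DenseNet activations).
n_classes, d, n_per_class = 5, 256, 60
means = rng.normal(size=(n_classes, d))
X = np.vstack([m + 0.3 * rng.normal(size=(n_per_class, d)) for m in means])
y = np.repeat(np.arange(n_classes), n_per_class)
idx = rng.permutation(len(y))
train, test = idx[:200], idx[200:]

# OPU-style random feature map: squared modulus of a complex Gaussian projection.
D = 1024
R = (rng.normal(size=(d, D)) + 1j * rng.normal(size=(d, D))) / np.sqrt(2.0)
def opu_features(M):
    return np.abs(M @ R) ** 2 / d

# Linear classifier on one-hot targets via minimum-norm least squares.
Phi_train = opu_features(X[train])
Y_onehot = np.eye(n_classes)[y[train]]
coef, *_ = np.linalg.lstsq(Phi_train, Y_onehot, rcond=None)
pred = (opu_features(X[test]) @ coef).argmax(axis=1)
accuracy = (pred == y[test]).mean()
```

No backpropagation is involved anywhere: the projection is fixed and only the final linear layer is fitted, which is exactly what makes the scheme a good match for fixed optical hardware.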
<div style="text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhvuTgesh0y714IrjOvGswT9muWtM9BCd1n3WGdxsrwc-ySpUM0YmDZNq_GuzWC45tJiihp0waojr2cotT6jcbj7NWy7f4p0syQVlfK8oJHrdC2lqSEhs-qZgjjXm3kDVwQVS_z5g/s1600/animals+densenet169.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="699" data-original-width="1047" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhvuTgesh0y714IrjOvGswT9muWtM9BCd1n3WGdxsrwc-ySpUM0YmDZNq_GuzWC45tJiihp0waojr2cotT6jcbj7NWy7f4p0syQVlfK8oJHrdC2lqSEhs-qZgjjXm3kDVwQVS_z5g/s400/animals+densenet169.png" width="400" /></a></div>
<div>
<br /><div>
<br /></div>
<div>
Enjoy and, most importantly, stay safe!</div>
<div style="text-align: justify;">
<br /></div>
<br />
<br />
Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a><a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr"> </a> or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a><br />
<br />
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.<br />
<br />
Other links:<br />
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a><br />
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a><br />
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a></div>
Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-21081194932229832092020-01-15T05:11:00.000-06:002020-01-15T05:11:22.721-06:00Beyond Overfitting and Beyond Silicon: The double descent curve** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **
<br />
<br />
<div style="text-align: justify;">
We recently tried a small experiment with <a href="http://lighton.ai/">LightOn</a>'s Optical Processing Unit on the issue of generalization. Our engineer, <a href="https://www.linkedin.com/in/alessandro-cappelli-aa8060172/">Alessandro Cappelli</a>, ran it and wrote about it here: <a href="https://medium.com/@LightOnIO/beyond-overfitting-and-beyond-silicon-the-double-descent-curve-18b6d9810e1b">Beyond Overfitting and Beyond Silicon: The double descent curve</a></div>
<div style="text-align: justify;">
<span style="text-align: justify;"><br /></span></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivX2k3p_wRpFI6rb8zcZ0_VSA4lcJ0w09GOIneQkWTYKabXmtv2ZHJmtipAB2faUJ8s2nDha2h3_SF6ACrkdiIBqI1VyRcAvAWnEsTLad-rQlHzd5eMbOmuB9veIcxfTUUALxYYQ/s1600/mnist-doubledescent-lighton.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="720" data-original-width="720" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivX2k3p_wRpFI6rb8zcZ0_VSA4lcJ0w09GOIneQkWTYKabXmtv2ZHJmtipAB2faUJ8s2nDha2h3_SF6ACrkdiIBqI1VyRcAvAWnEsTLad-rQlHzd5eMbOmuB9veIcxfTUUALxYYQ/s320/mnist-doubledescent-lighton.png" width="320" /></a></div>
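The curve in the figure can be reproduced in miniature with random ReLU features and minimum-norm least squares: the train error reaches zero at the interpolation threshold (width roughly equal to the number of samples), where the test error typically peaks before descending again. A toy sketch with made-up dimensions, not the post's MNIST experiment:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 20
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.5 * rng.normal(size=n)           # noisy linear targets
X_test = rng.normal(size=(500, d))
y_test = X_test @ w_true

def errors(width, seed):
    """Train/test MSE of min-norm least squares on random ReLU features."""
    W = np.random.default_rng(seed).normal(size=(d, width))
    Phi, Phi_test = np.maximum(X @ W, 0.0), np.maximum(X_test @ W, 0.0)
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # minimum-norm solution
    return np.mean((Phi @ coef - y) ** 2), np.mean((Phi_test @ coef - y_test) ** 2)

# Average the test error over a few feature draws at each width.
widths = [5, 50, 1000]                 # under-, critically, over-parameterized
test_err = {w: np.mean([errors(w, s)[1] for s in range(5)]) for w in widths}
train_err_wide = errors(1000, 0)[0]
```

Past the threshold the model interpolates the training data, yet the test error keeps improving with width, which is the "benign overfitting" regime the post explores on optical hardware.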
Two days ago, <a href="https://voices.uchicago.edu/willett/" style="text-align: center;">Becca Willett</a> spoke on the same subject at the <a href="https://www.turing.ac.uk/">Turing Institute in London</a>.<br />
<br />
<div style="text-align: center;">
A function space view of overparameterized neural networks, by <a href="https://voices.uchicago.edu/willett/">Rebecca Willett</a>.</div>
<br />
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<iframe allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="" frameborder="0" height="315" src="https://www.youtube.com/embed/g-rTv8CzEYk?start=9641" width="440"></iframe>
</div>
<div style="text-align: center;">
<br /></div>
The attendant preprint is here:<br />
<br /><a href="https://arxiv.org/pdf/1910.01635.pdf">A Function Space View of Bounded Norm Infinite Width ReLU Nets: The Multivariate Case</a> by <a href="https://arxiv.org/search/cs?searchtype=author&query=Ongie%2C+G">Greg Ongie</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Willett%2C+R">Rebecca Willett</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Soudry%2C+D">Daniel Soudry</a>, <a href="https://arxiv.org/search/cs?searchtype=author&query=Srebro%2C+N">Nathan Srebro</a><br /><blockquote class="abstract mathjax" style="background-color: white; font-family: "Lucida Grande", helvetica, arial, verdana, sans-serif; font-size: 1.05em; line-height: 1.55; margin-bottom: 1.5em;">
A key element of understanding the efficacy of overparameterized neural networks is characterizing how they represent functions as the number of weights in the network approaches infinity. In this paper, we characterize the norm required to realize a function <i>f</i> : ℝ<sup><i>d</i></sup> → ℝ as a single hidden-layer ReLU network with an unbounded number of units (infinite width), but where the Euclidean norm of the weights is bounded, including precisely characterizing which functions can be realized with finite norm. This was settled for univariate functions in Savarese et al. (2019), where it was shown that the required norm is determined by the L1-norm of the second derivative of the function. We extend the characterization to multivariate functions (i.e., networks with <i>d</i> input units), relating the required norm to the L1-norm of the Radon transform of a (<i>d</i>+1)/2-power Laplacian of the function. This characterization allows us to show that all functions in Sobolev spaces W<sup><i>s</i>,1</sup>(ℝ), <i>s</i> ≥ <i>d</i>+1, can be represented with bounded norm, to calculate the required norm for several specific functions, and to obtain a depth separation result. These results have important implications for understanding generalization performance and the distinction between neural networks and more traditional kernel learning.</blockquote>
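Informally, the two norm characterizations in the abstract can be written as follows (a sketch up to constants and boundary/linear-part corrections; see the paper for the precise statements):

```latex
% Univariate case (Savarese et al., 2019): the minimal weight norm needed to
% realize f with an infinite-width ReLU net is governed by its total curvature
R(f) \;\asymp\; \int_{\mathbb{R}} |f''(x)| \, dx

% Multivariate case (Ongie et al.): the L1-norm of the Radon transform
% \mathcal{R} of a (d+1)/2-power Laplacian of f
R(f) \;\asymp\; \big\| \mathcal{R}\{(-\Delta)^{(d+1)/2} f\} \big\|_{L^1}
```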
<br />
<br />
Follow <a href="https://twitter.com/NuitBlog">@NuitBlog</a> or join the <a href="http://www.reddit.com/r/CompressiveSensing/">CompressiveSensing Reddit</a>, the <a href="https://www.facebook.com/pages/Nuit-Blanche/166441866740790">Facebook page</a>, the Compressive Sensing group on <a href="http://www.linkedin.com/groups?gid=683737&trk=myg_ugrp_ovr">LinkedIn</a>, or the Advanced Matrix Factorization group on <a href="http://www.linkedin.com/groups?gid=4084620&trk=myg_ugrp_ovr">LinkedIn</a><br />
<br />
<a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml"><img alt="" src="http://www.feedburner.com/fb/images/pub/feed-icon32x32.png" style="border: 0;" /></a><a href="http://feeds.feedburner.com/blogspot/wCeDd" rel="alternate" title="Subscribe to my feed" type="application/rss+xml">Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from</a>. You can also <a href="http://feedburner.google.com/fb/a/mailverify?uri=blogspot/wCeDd&loc=en_US">subscribe to Nuit Blanche by Email</a>.<br />
<br />
Other links:<br />
<b><u><i>Paris Machine Learning</i></u></b>: <a href="http://www.meetup.com/Paris-Machine-learning-applications-group/">Meetup.com</a>||<a href="http://nuit-blanche.blogspot.dk/p/paris-based-meetups-on-machine-learning.html">@Archives</a>||<a href="https://www.linkedin.com/groups/6400776/">LinkedIn</a>||<a href="https://www.facebook.com/ParisMachineLearning">Facebook</a>|| <a href="https://twitter.com/ParisMLgroup">@ParisMLGroup</a><br />
<b><u><i>About <a href="http://www.lighton.io/">LightOn</a></i></u></b>: <a href="http://us14.campaign-archive1.com/home/?u=701605c9443ad5e332f87331f&id=85e0ce1094">Newsletter</a> ||<a href="https://twitter.com/LightOnIO">@LightOnIO</a>|| on <a href="https://www.linkedin.com/company/lighton/">LinkedIn </a>|| on <a href="https://www.crunchbase.com/organization/lighton">CrunchBase</a> || our <a href="https://medium.com/@LightOnIO/">Blog</a><br />
<u><i><b>About myself</b></i></u>: <a href="http://www.lighton.io/">LightOn</a> || <a href="https://scholar.google.fr/citations?user=Cjrs0lAAAAAJ&hl=fr&oi=sra">Google Scholar</a> || <a href="http://www.linkedin.com/in/igorcarron">LinkedIn</a> ||<a href="http://www.twitter.com/igorcarron">@IgorCarron</a> ||<a href="https://sites.google.com/site/igorcarron2/home">Homepage</a>||<a href="https://arxiv.org/search/?query=igor+carron&searchtype=all">ArXiv</a>Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-68961904813466956642019-12-18T06:25:00.000-06:002019-12-19T04:47:08.724-06:00LightOn’s AI Research Workshop — FoRM #4: The Future of Random Matrices. Thursday, December 19th** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2udsoXZWma15MLVLPlKq2GviRm-LScE4ufvwR3YQdEKPo7ZIlSYq70XkJ7OxlC04siBtEHvC4qW2H9938iJucEy2_Hij-xbqTI1Vm564QDeSUGIt1Dxdtk7gpasuY43nGXWB6TA/s1600/LightOn+FORM+4.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="675" data-original-width="1200" height="225" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2udsoXZWma15MLVLPlKq2GviRm-LScE4ufvwR3YQdEKPo7ZIlSYq70XkJ7OxlC04siBtEHvC4qW2H9938iJucEy2_Hij-xbqTI1Vm564QDeSUGIt1Dxdtk7gpasuY43nGXWB6TA/s400/LightOn+FORM+4.jpg" width="400" /></a></div>
<br />
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Tomorrow we will feature LightOn’s 4th AI Research workshop on the Future of Random Matrices (FoRM). It starts at <a href="https://www.timeanddate.com/worldclock/personal.html?cities=195,179,136,248,224,1232">2pm on Thursday, December 19th</a> (that’s 2pm CET/Paris, 1pm GMT/UTC/London, 8am EST/NY-Montreal, 5am PST/California, 9pm UTC+8/Shenzhen). We have an exciting and diverse line-up, with talks on compressive learning, binarized neural networks, particle physics, and matrix factorization.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Feel free to <a href="https://www.meetup.com/fr-FR/LightOn-meetup/events/266805517/">join us</a>, or to catch the event livestream — link to be available on this page on the day of the event.<br />
<div style="text-align: center;">
<br /></div>
<div style="text-align: center;">
<iframe allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="" frameborder="0" height="240" src="https://www.youtube.com/embed/xpDNDKMHu1g" width="424"></iframe>
</div>
<div style="text-align: center;">
<br /></div>
Without further ado, here is the program:</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Program</div>
<ul>
<li>1:45pm — Welcome coffee and opening. A short introduction about <a href="http://lighton.ai/">LightOn</a>, <a href="https://www.linkedin.com/in/IgorCarron">Igor Carron</a></li>
<li>2:00pm — Compressive Learning with Random Projections, <a href="https://www.cs.bham.ac.uk/~axk/">Ata Kaban</a></li>
<li>2:45pm — Medical Applications of Low Precision Neuromorphic Systems, <a href="http://penkovsky.com/">Bogdan Penkovsky</a></li>
<li>3:30pm — Comparing Low Complexity Linear Transforms, <a href="https://gra.ygav.in/">Gavin Gray</a></li>
<li>4:00pm — Coffee break and discussions</li>
<li>4:20pm — LightOn’s OPU+Particle Physics, <a href="https://scholar.google.com/citations?user=Xzdm_9IAAAAJ&hl=fr">David Rousseau</a>, <a href="https://twitter.com/Aishik_Ghosh_">Aishik Ghosh</a>, <a href="https://www.lri.fr/membre.php?mb=2590">Laurent Basara</a>, Biswajit Biswas</li>
<li>5:00pm — Accelerated Weighted (Nonnegative) Matrix Factorization with Random Projections, <a href="http://www-lisic.univ-littoral.fr/~puigt/">Matthieu Puigt</a></li>
<li>5:45pm — Wrapping-up and beers on our rooftop</li>
</ul>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
Talks and abstracts</div>
<br />
<div style="box-sizing: inherit; line-height: 1.18; margin: 1.72em 0px -0.31em;">
<a href="https://www.cs.bham.ac.uk/~axk/" style="text-align: justify;">Ata Kaban</a><span style="text-align: justify;">, University of Birmingham.</span><br />
Compressive Learning with Random Projections</div>
<blockquote class="tr_bq">
<div style="text-align: justify;">
By direct analogy to compressive sensing, compressive learning was originally coined to mean learning efficiently from random projections of high-dimensional massive data sets that have a sparse representation. In this talk we discuss compressive learning without the sparse representation requirement, exploiting instead the natural structure of learning problems.</div>
</blockquote>
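To make the setting concrete, here is a minimal numerical sketch of the generic recipe (illustrative only, not Kaban's specific constructions or guarantees): project the data with a random Gaussian matrix, then train an ordinary learner in the compressed space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy high-dimensional two-class data (no sparsity assumed).
n, d, k = 500, 1000, 100          # samples, ambient dim, projected dim
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true > 0).astype(int)

# Random projection: a dense Gaussian matrix R maps R^d down to R^k.
R = rng.normal(size=(d, k)) / np.sqrt(k)
X_proj = X @ R                    # learn on k-dim sketches, not d-dim data

# Any off-the-shelf learner can now be trained in the compressed space;
# here, a least-squares linear classifier fit on the projected data.
w_proj, *_ = np.linalg.lstsq(X_proj, 2 * y - 1, rcond=None)
acc = np.mean((X_proj @ w_proj > 0).astype(int) == y)
```

The compressed learner works with k-dimensional inputs throughout, which is the source of the efficiency gains the talk discusses.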
<br />
<div style="box-sizing: inherit; line-height: 1.18; margin: 1.72em 0px -0.31em;">
<a href="http://penkovsky.com/" style="text-align: justify;">Bogdan Penkovsky</a><span style="text-align: justify;">, Paris-Sud University.</span><br />
Medical Applications of Low Precision Neuromorphic Systems</div>
<div style="box-sizing: inherit; line-height: 1.18; margin: 1.72em 0px -0.31em;">
</div>
<blockquote class="tr_bq" style="text-align: justify;">
The advent of deep learning has considerably accelerated machine learning, but its deployment at the edge is limited by its high energy cost and memory requirements. With new memory technologies available, emerging Binarized Neural Networks (BNNs) promise to reduce the energy footprint of the forthcoming generation of machine learning hardware, enabling machine learning on edge devices and avoiding data transfer over the network. In this talk we will discuss strategies for applying BNNs to biomedical signals such as electrocardiography and electroencephalography, without sacrificing accuracy while improving energy use. The ultimate goal of this research is to enable smart autonomous healthcare devices.</blockquote>
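For readers unfamiliar with BNNs, the core mechanism can be sketched in a few lines (a generic illustration, not the speaker's exact networks): weights and activations are constrained to ±1, so a dense layer reduces to sign operations, while a straight-through estimator lets gradients reach the real-valued latent weights during training.

```python
import numpy as np

def binarize(w):
    """Deterministic binarization: real-valued latent values -> {-1, +1}."""
    return np.where(w >= 0, 1.0, -1.0)

def bnn_dense_forward(x, w_latent):
    """One binarized dense layer: binarize inputs and weights, then matmul.

    In hardware the +/-1 matmul becomes XNOR + popcount, which is where
    the energy savings for edge/biomedical devices come from.
    """
    return binarize(x) @ binarize(w_latent)

def ste_grad(grad_out, w_latent):
    """Straight-through estimator: pass the gradient through sign(),
    zeroed where the latent weight has saturated (|w| > 1)."""
    return grad_out * (np.abs(w_latent) <= 1.0)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 16))      # e.g. a small window of ECG features
w = rng.normal(size=(16, 8))
y = bnn_dense_forward(x, w)       # entries are sums of 16 terms in {-1, +1}
```

Each output entry is an even integer in [-16, 16], reflecting the popcount-style arithmetic that makes BNNs cheap in hardware.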
<br />
<a href="https://gra.ygav.in/">Gavin Gray</a>, Edinburgh University.<br />
Comparing Low Complexity Linear Transforms<br />
<div>
<blockquote class="tr_bq" style="text-align: justify;">
In response to the development of recent efficient dense layers, this talk discusses replacing the linear component of pointwise convolutions with structured linear decompositions, for substantial gains in the efficiency/accuracy tradeoff. A pointwise convolution is a fully connected layer applied at every spatial location, which makes it a natural candidate for replacement by a structured transform. Networks using such layers can learn the same tasks as those using standard convolutions, and provide Pareto-optimal benefits in efficiency/accuracy, both in computation (mult-adds) and in parameter count (and hence memory).</blockquote>
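As a minimal illustration of why pointwise convolutions are natural targets (a low-rank factorization stands in here for the structured transforms actually compared in the talk): a 1x1 convolution with C_in input and C_out output channels is a C_in x C_out matrix applied at every pixel, so a rank-r decomposition cuts parameters and per-pixel mult-adds from C_in·C_out to r·(C_in + C_out).

```python
import numpy as np

rng = np.random.default_rng(0)
c_in, c_out, r = 256, 256, 32
h, w = 8, 8

# A pointwise (1x1) convolution is a dense matrix applied per pixel.
W = rng.normal(size=(c_in, c_out))
x = rng.normal(size=(h * w, c_in))      # feature map flattened over space
dense_out = x @ W

# Structured replacement: rank-r factorization W ~ U_r @ V_r.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
U_r = U[:, :r] * s[:r]                  # absorb singular values into U
V_r = Vt[:r, :]
lowrank_out = (x @ U_r) @ V_r           # two thin matmuls instead of one fat one

dense_params = c_in * c_out             # 256 * 256 = 65536
lowrank_params = r * (c_in + c_out)     # 32 * 512  = 16384
```

Here the factorization is obtained post hoc by SVD; in practice the factored layer would be trained directly, but the parameter accounting is the same.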
<div style="box-sizing: inherit; line-height: 1.18; margin: 1.72em 0px -0.31em;">
<br /></div>
<div style="text-align: justify;">
<a href="https://scholar.google.com/citations?user=Xzdm_9IAAAAJ&hl=fr">David Rousseau</a>, <a href="https://twitter.com/Aishik_Ghosh_">Aishik Ghosh</a>, <a href="https://www.lri.fr/membre.php?mb=2590">Laurent Basara</a>, Biswajit Biswas. LAL Orsay, LRI Orsay, BITS University.</div>
<div style="text-align: justify;">
OPU+Particle Physics</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
LightOn’s OPU opens up a new machine learning paradigm. Two use cases have been selected to investigate the potential of the OPU for particle physics:</div>
<div style="text-align: justify;">
<ul>
<li><b>End-to-End learning</b>: high-energy proton collisions at the Large Hadron Collider have been simulated, each collision being recorded as an image representing the energy flux in the detector. Two classes of events have been simulated: signal events are created by a hypothetical supersymmetric particle, and background events by known processes. The task is to train a classifier to separate the signal from the background. Several techniques using the OPU will be presented and compared with more classical particle physics approaches.</li>
<li><b>Tracking</b>: high-energy proton collisions at the LHC yield billions of records, each with typically 100,000 3D points corresponding to the trajectories of 10,000 particles. Various investigations of the potential of the OPU to digest this high-dimensional data will be reported.</li>
</ul>
</div>
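For context, the OPU computes random features of the form y = |Rx|² with a fixed random transmission matrix R. The sketch below emulates that operation numerically on toy "detector images"; the matrix, sizes, and data are illustrative stand-ins, not the actual device or datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

def opu_like_features(X, k, rng):
    """Simulate OPU-style random features: y = |X R|^2, with R complex Gaussian.

    The physical OPU performs this multiplication optically at very large
    scale; here we just emulate it numerically on a small example.
    """
    d = X.shape[1]
    R = (rng.normal(size=(d, k)) + 1j * rng.normal(size=(d, k))) / np.sqrt(2 * d)
    return np.abs(X @ R) ** 2

# Toy "detector images": flattened energy-flux maps, one row per collision event.
X = rng.normal(size=(10, 64))
features = opu_like_features(X, k=128, rng=rng)
# A simple linear classifier on these nonlinear random features would then
# be trained to separate signal from background.
```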
<div style="text-align: justify;">
<br /></div>
<br />
<div style="text-align: justify;">
<a href="http://www-lisic.univ-littoral.fr/~puigt/">Matthieu Puigt</a>, Université du Littoral Côte d’Opale.</div>
<div style="text-align: justify;">
Accelerated Weighted (Nonnegative) Matrix Factorization with Random Projections</div>
<div style="box-sizing: inherit; line-height: 1.18; margin: 1.72em 0px -0.31em;">
</div>
<blockquote class="tr_bq" style="text-align: justify;">
Random projections are among the major techniques used to process big data. They have been successfully applied to, e.g., (Nonnegative) Matrix Factorization ((N)MF). However, missing entries in the matrix to factorize (or, more generally, weights that model the confidence in the entries of the data matrix) prevent their use. In this talk, I will present the framework we recently proposed to solve this issue, i.e., to apply random projections to weighted (N)MF. We experimentally show that the proposed framework significantly speeds up state-of-the-art weighted NMF methods under mild conditions.</blockquote>
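To fix ideas on how random projections enter (a schematic of the unweighted case only; handling weights and missing entries is precisely the talk's contribution): with W fixed, the least-squares update for H in X ≈ WH can be carried out on a Gaussian sketch of the data, shrinking the row dimension from m down to k ≪ m.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, rank, k = 400, 300, 5, 40

# Nonnegative ground truth and data matrix X = W_true H_true.
W_true = rng.random((m, rank))
H_true = rng.random((rank, n))
X = W_true @ H_true

# Gaussian sketch: compress the m rows of X down to k << m.
L = rng.normal(size=(k, m)) / np.sqrt(k)
LX = L @ X                                  # (k x n) instead of (m x n)

# With W fixed, the H-update min ||X - W H||_F is approximated on the
# sketch: min ||L X - (L W) H||_F, a much smaller least-squares problem
# (nonnegativity enforced here by a crude clip, for illustration only).
W = rng.random((m, rank))
LW = L @ W
H = np.clip(np.linalg.lstsq(LW, LX, rcond=None)[0], 0, None)
```

Alternating such sketched updates for W and H is the basic mechanism the random-projection acceleration relies on.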
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
The workshop will take place at <a href="https://www.google.com/maps/place/6+Rue+Jean+Calvin,+75005+Paris/@48.841676,2.346878,17z/data=!3m1!4b1!4m5!3m4!1s0x47e671eead849419:0xd554b5d8ece400aa!8m2!3d48.841676!4d2.3490667">IPGG, 6 Rue Jean Calvin, 75005 Paris</a>. The location is close to both the Place Monge and Censier-Daubenton subway stations on Line 7, and to the Luxembourg station on the RER B line. It is also close to bus stops on the 21, 24, 27, 47, and 89 routes. Note that <a href="https://www.ratp.fr/presse-mouvementsocial">strikes are still ongoing</a>, and some of these options may not be available.</div>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
We will be in the main amphitheater, downstairs on your right when you enter the building. Please register in advance on our <a href="https://www.meetup.com/fr-FR/LightOn-meetup/events/266805517/">meetup group</a> so as to help us in the organization of the workshop.</div>
<div style="text-align: justify;">
<br /></div>
<br />
<br />
<br />
</div>
Igorhttp://www.blogger.com/profile/17474880327699002140noreply@blogger.com0tag:blogger.com,1999:blog-6141980.post-2338037694773741552019-12-11T06:16:00.000-06:002019-12-11T12:38:39.970-06:00Ce Soir: Paris Machine Learning Meetup #2 Season 7: Symbolic maths, Data Generation thru GAN, "Prevision Retards" @SNCF, Retail and AI, Rapids.ai Leveraging GPUs** <a href="https://nuit-blanche.blogspot.com/">Nuit Blanche</a> is now on Twitter: <a href="https://twitter.com/NuitBlog">@NuitBlog</a> **
<br />
<br />
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
A big thank you to <a href="https://www.scaleway.com/en/">Scaleway</a> for hosting us in their inspiring office and sponsoring the networking event afterwards.</div>
<div style="text-align: justify;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhWVjTaHPagbSJTOIgRs3ss0g8DMzST0HokW0EE9D8T4UfqXvX8fc60RV5Y1TQzsngtLhQWuoWwD-NZWvCEMSm5xjiJXheydftJsw_ACK4h2H22Symej0ui2vi-qttsoiTdR1Z-OQ/s1600/paris+machine+learning+meetup+2+Season+7.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="900" data-original-width="1600" height="225" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhWVjTaHPagbSJTOIgRs3ss0g8DMzST0HokW0EE9D8T4UfqXvX8fc60RV5Y1TQzsngtLhQWuoWwD-NZWvCEMSm5xjiJXheydftJsw_ACK4h2H22Symej0ui2vi-qttsoiTdR1Z-OQ/s400/paris+machine+learning+meetup+2+Season+7.png" width="400" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
So this is quite exciting: our meetup group has 7,999 members, and we are about to organize a meetup in a town paralyzed by strikes. Over the course of this meetup's existence, we have seen worse.</div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
For those of you who will not be able to make it, all information, slides, and the streaming link are below:</div>
<div style="text-align: justify;">
<div style="text-align: center;">
<br /></div>
</div>
<div style="text-align: center;">
<a href="https://youtu.be/bvKzKfj-8uE">https://youtu.be/bvKzKfj-8uE</a></div>
<div style="text-align: justify;">
<div style="text-align: center;">
<br /></div>
</div>
<div style="text-align: justify;">
0. <a href="https://drive.google.com/open?id=0Bzn5T3uRgAXCemtRTGwzdEltbE9lOHE4bFBUem1LNHNxMVg4">Presentation Scaleway</a>, <a href="https://www.linkedin.com/in/melodiemorice/">Mélodie Morice</a></div>
<div style="text-align: justify;">
1. <a href="https://www.linkedin.com/in/aurelia-negre/">Aurélia Negre</a>, <a href="https://www.linkedin.com/in/micha%C3%ABl-sok-656514a7/?originalSubdomain=fr">Michael Sok</a>, Quantmetry, <a href="https://drive.google.com/open?id=0Bzn5T3uRgAXCd3VUWTdJRFZDV3Y2anBiNUpvRElXUzQ3SGNR">"Data generation through GANs"</a></div>
<blockquote class="tr_bq" style="text-align: justify;">
Tabular data are the most common within companies. Generating synthetic data that respects the statistical properties of the original data has several applications: machine learning that respects data privacy, improving the robustness of a model against data drift, etc. Since 2018, a growing number of academic publications have presented the use of GANs on this type of data, particularly on patient medical data. We have performed a proof of concept on real data, and present the results of several models from the literature, namely the Wasserstein GAN, the Wasserstein GAN with Gradient Penalty, and the Cramér-GAN, with the objective of "model compatibility", i.e., the possibility of using synthetic data in place of real data to train a classifier.</blockquote>
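The "model compatibility" objective can be summarized as train-on-synthetic, test-on-real. The sketch below illustrates that evaluation protocol with toy Gaussian "tabular" data standing in for both the real data and the GAN output; all names and data are placeholders, not the speakers' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_tabular(n, shift, rng):
    """Two-class toy tabular data: class means at +/- shift in 10 dims."""
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, 10)) + shift * (2 * y[:, None] - 1)
    return X, y

# "Real" data, and "synthetic" data (here just a fresh sample standing in
# for GAN output; in the talk this would come from WGAN / WGAN-GP / Cramér-GAN).
X_real, y_real = make_tabular(1000, 0.8, rng)
X_syn, y_syn = make_tabular(1000, 0.8, rng)

def fit_linear(X, y):
    """Least-squares linear classifier with a bias column."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Xb, 2 * y - 1, rcond=None)
    return w

def accuracy(w, X, y):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.mean((Xb @ w > 0).astype(int) == y)

# Model compatibility: a classifier trained on synthetic data should score
# close to one trained on real data, when both are evaluated on real data.
tstr = accuracy(fit_linear(X_syn, y_syn), X_real, y_real)
trtr = accuracy(fit_linear(X_real, y_real), X_real, y_real)
```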
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
2. <a href="https://www.linkedin.com/in/h%C3%A9lo%C3%AFse-nonne-3907b166/?originalSubdomain=fr">Héloïse Nonne</a>, <a href="https://www.linkedin.com/in/soumaya-ihihi-564a8b159/">Soumaya Ihihi</a>, <a href="https://drive.google.com/open?id=0Bzn5T3uRgAXCLUdYWUl0dU5EZU9TT2ZFcFViaGotaXJZdjE0">"Prévisions Retards"</a>, a Machine Learning project led by e.SNCF's Data IoT team.</div>
<blockquote class="tr_bq" style="text-align: justify;">
Its goal is to integrate predictions of train delays into the SNCF mobile application. Every day, our model predicts delays for the next 7 days, at each stop, for every train in the Paris-area network. The challenge of this project is to improve the reliability of passenger information and to provide more relevant routes for the application's users. We will present the project, from the definition of needs and exploratory data analysis to its industrialization in the cloud and the reliability of its predictions.</blockquote>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
3. <a href="https://www.linkedin.com/in/l%C3%A9a-dalle-lucche/">Léa Dalle Lucche</a> , <a href="https://www.linkedin.com/in/elina-ashkinazi-ildis-754660/?originalSubdomain=fr">Elina Ashkinazi-Ildis</a>, <a href="https://www.linkedin.com/in/kasra-mansouri-0478733b/">Kasra Mansouri</a>, <a href="https://drive.google.com/open?id=0Bzn5T3uRgAXCd0hCbTJXNlJzRGdfcGFJWlNHRnlCUnFfR2dJ">Retail and AI</a>, Carrefour Data Lab, Artefact</div>
<blockquote class="tr_bq" style="text-align: justify;">
This talk focuses on AI and ML applications in retail. Discover how Carrefour is transforming itself through the introduction of the Google - Carrefour Lab, presented by Elina Ashkinazi-Ildis, Director of the Lab. Then go further with the "shelf out detection" use case presented by Kasra Mansouri, Data Scientist at Artefact.</blockquote>
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
<div style="text-align: justify;">
4. <a href="https://www.linkedin.com/in/arnaudwald/">Arnaud Wald</a>, <a href="https://www.scaleway.com/en/">Scaleway</a>, <a href="https://drive.google.com/open?id=0Bzn5T3uRgAXCUEJFLUtDZlN1bVBLZWVCY21LUWtDc0MxelBV">"RAPIDS.AI, Leveraging GPUs for accelerated data science and data analytics"</a></div>
<blockquote class="tr_bq" style="text-align: justify;">
RAPIDS makes it possible to have end-to-end data science pipelines run entirely on GPU architecture. It capitalizes on the parallelization capabilities of GPUs to accelerate data preprocessing pipelines, with a pandas-like dataframe syntax. GPU-optimized versions of scikit-learn algorithms are available, and RAPIDS also integrates with major deep learning frameworks.<br />
This talk will present RAPIDS and its capabilities, and how to integrate it in your pipelines.</blockquote>
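The pandas-like syntax means a CPU pipeline often ports with a one-line change. The sketch below runs on plain pandas so it works anywhere; on a machine with RAPIDS installed, swapping the import for `import cudf as pd` is the advertised route to GPU execution (assuming the operations used are supported by cuDF).

```python
import pandas as pd  # on a RAPIDS install: `import cudf as pd` for GPU execution

# A small preprocessing pipeline written once against the shared API.
df = pd.DataFrame({
    "sensor": ["a", "a", "b", "b", "b"],
    "value": [1.0, 3.0, 2.0, 4.0, 6.0],
})
df["value_sq"] = df["value"] ** 2

# Groupby-aggregate, the kind of step RAPIDS parallelizes on GPU.
summary = df.groupby("sensor", as_index=False).agg(
    mean_value=("value", "mean"),
    max_sq=("value_sq", "max"),
)
```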
<div style="text-align: justify;">
<br /></div>
<div style="text-align: justify;">
5. <a href="https://www.linkedin.com/in/fran%C3%A7ois-charton-214187120/?locale=en_US">François Charton</a>, <a href="https://drive.google.com/open?id=0Bzn5T3uRgAXCTnhZVk1pTTY3Rko2Mk12Z0x6ekR2TVdCN1NN">"Deep Learning for Symbolic Mathematics"</a></div>
<div style="text-align: justify;">
<br /></div>
<blockquote class="tr_bq" style="text-align: justify;">
Neural networks have a reputation for being better at solving statistical or approximate problems than at performing calculations or working with symbolic data. In this paper, we show that they can be <b>surprisingly good</b> at more elaborate tasks in mathematics, such as symbolic integration and solving differential equations. We propose a syntax for representing mathematical problems, and methods for generating large datasets that can be used to train sequence-to-sequence models. We achieve results that outperform commercial Computer Algebra Systems such as Matlab or Mathematica.<br />
<a href="https://arxiv.org/abs/1912.01412">https://arxiv.org/abs/1912.01412</a></blockquote>
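A key ingredient of the paper is serializing expression trees into token sequences that sequence-to-sequence models can consume. Here is a minimal sketch of prefix (Polish) notation serialization; the tuple encoding and token names are illustrative, not the paper's exact vocabulary.

```python
# Serialize a small expression tree into prefix (Polish) notation, the kind
# of flat token sequence a seq2seq model can be trained on.
def to_prefix(node):
    if isinstance(node, tuple):          # (operator, child1, child2, ...)
        op, *children = node
        tokens = [op]
        for child in children:
            tokens.extend(to_prefix(child))
        return tokens
    return [str(node)]                   # leaf: a variable or a constant

# 3*x^2 + cos(x)
expr = ("+", ("*", 3, ("pow", "x", 2)), ("cos", "x"))
tokens = to_prefix(expr)
```

Prefix notation needs no parentheses, so the sequence is unambiguous and can be deterministically parsed back into a tree.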
<div style="text-align: justify;">
<br /></div>
</div>
<br />
<br />