Thursday, June 20, 2019

Efficient Forward Architecture Search - implementation -

** Nuit Blanche is now on Twitter: @NuitBlog **



In this work, we propose a neural architecture search (NAS) algorithm that iteratively augments existing networks by adding shortcut connections and layers. At each iteration, we greedily select a parent model from among the most cost-efficient models and insert a number of candidate layers into it. To learn which combination of additional layers to keep, we simultaneously train their parameters and use feature selection techniques to extract the most promising candidates, which are then jointly trained with the parent model. The result of this process is excellent statistical performance at relatively low computational cost. Furthermore, unlike recent NAS studies that focus almost exclusively on the small search space of repeatable network modules (cells), this approach also allows direct search over more general (macro) network structures, finding cost-effective models when macro search starts from the same initial models as cell search. Source code is available at https://github.com/microsoft/petridishnn
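
The abstract above describes an iterative grow-and-select loop. Below is a minimal, hypothetical Python sketch of that loop; all names (Model, propose_candidates, score_candidates, grow) and the random scoring stubs are illustrative placeholders, not the petridishnn API.

# Hypothetical sketch of the grow-and-select loop described in the abstract.
# The random "training" stubs are placeholders, not the actual implementation.
import random
from dataclasses import dataclass, field

@dataclass
class Model:
    layers: list = field(default_factory=list)  # layer / shortcut descriptors
    cost: float = 1.0                            # e.g. FLOPs or parameter count
    accuracy: float = 0.5                        # validation accuracy

def pareto_frontier(models):
    """Keep models not dominated in both cost (lower) and accuracy (higher)."""
    return [m for m in models
            if not any(o.cost <= m.cost and o.accuracy > m.accuracy
                       for o in models if o is not m)]

def propose_candidates(parent, k):
    """Stand-in for proposing candidate layers / shortcut connections."""
    return [f"candidate_op_{i}" for i in range(k)]

def score_candidates(parent, candidates):
    """Stand-in for training candidate parameters and feature-selecting the
    most promising ones; here we just draw random scores as placeholders."""
    return {c: random.random() for c in candidates}

def grow(parent, chosen):
    """Stand-in for inserting the chosen layers and jointly training the child."""
    return Model(layers=parent.layers + chosen,
                 cost=parent.cost + 0.1 * len(chosen),
                 accuracy=min(1.0, parent.accuracy + random.uniform(0.0, 0.05)))

def search(initial_models, iterations=10, k=8, keep=2):
    gallery = list(initial_models)
    for _ in range(iterations):
        # Greedily pick a parent among the most cost-efficient (Pareto) models.
        parent = random.choice(pareto_frontier(gallery))
        # Propose candidate layers, score them, keep the most promising ones.
        scores = score_candidates(parent, propose_candidates(parent, k))
        chosen = sorted(scores, key=scores.get, reverse=True)[:keep]
        # Insert the selected layers and jointly train the grown child model.
        gallery.append(grow(parent, chosen))
    return pareto_frontier(gallery)

if __name__ == "__main__":
    for m in search([Model()]):
        print(f"cost={m.cost:.2f}  accuracy={m.accuracy:.3f}  layers={len(m.layers)}")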

Follow @NuitBlog or join the CompressiveSensing Reddit, the Facebook page, the Compressive Sensing group on LinkedIn, or the Advanced Matrix Factorization group on LinkedIn.

Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email.

Other links:
Paris Machine Learning: Meetup.com || @Archives || LinkedIn || Facebook || @ParisMLGroup
About LightOn: Newsletter || @LightOnIO || on LinkedIn || on CrunchBase || our Blog
About myself: LightOn || Google Scholar || LinkedIn || @IgorCarron || Homepage || ArXiv
