Toward Instance-aware Neural Architecture Search by An-Chieh Cheng, Chieh Hubert Lin, Da-Cheng Juan, Wei Wei and Min Sun
Recent advancements in Neural Architecture Search (NAS) have achieved significant improvements in both single- and multi-objective settings. However, current lines of research only consider searching for a single best architecture within a search space. Such an assumption restricts the model from capturing the high diversity and variety of real-world data. With this observation, we propose InstaNAS, an instance-aware NAS framework that aims to search for a distribution of architectures. Intuitively, we assume that real-world data consists of many domains (e.g., different difficulties or structural characteristics), and each domain can have one or multiple experts with relatively better performance. The controller of InstaNAS is not only responsible for sampling architectures during the search phase, but also needs to identify which downstream expert architecture to use for each input instance during the inference phase. We demonstrate the effectiveness of InstaNAS in a multi-objective NAS setting that considers the trade-offs between accuracy and latency. Within a search space inspired by MobileNetV2, experiments on a series of datasets show that InstaNAS can achieve either higher accuracy at the same latency, or a significant latency reduction without compromising accuracy, compared to MobileNetV2.
The attendant implementation is here: https://github.com/AnjieZheng/InstaNAS
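To make the instance-aware idea concrete, here is a minimal sketch (not the authors' code) of the inference-time behavior described in the abstract: a small controller looks at each input instance and decides which downstream "expert" architecture should process it. The names `Controller` and `make_experts`, and the toy expert networks, are illustrative assumptions rather than the actual InstaNAS search space.

```python
import torch
import torch.nn as nn

class Controller(nn.Module):
    """Maps an input image to logits over candidate expert architectures."""
    def __init__(self, num_experts: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.head = nn.Linear(16, num_experts)

    def forward(self, x):
        return self.head(self.features(x))

def make_experts(num_experts: int, num_classes: int = 10):
    """Stand-ins for architectures sampled from a MobileNetV2-like space,
    with deliberately different widths (and hence different latencies)."""
    return nn.ModuleList(
        nn.Sequential(
            nn.Conv2d(3, 8 * (i + 1), kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(8 * (i + 1), num_classes),
        )
        for i in range(num_experts)
    )

controller = Controller(num_experts=3)
experts = make_experts(num_experts=3)

x = torch.randn(4, 3, 32, 32)          # a batch of input instances
choices = controller(x).argmax(dim=1)  # pick one expert per instance
logits = torch.stack([
    experts[c](x[i:i + 1]).squeeze(0)
    for i, c in enumerate(choices.tolist())
])
print(choices, logits.shape)
```

During the search phase, the paper's controller also samples architectures and is trained against both accuracy and latency objectives; the sketch above only illustrates the per-instance routing at inference time.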
Follow @NuitBlog or join the CompressiveSensing Reddit, the Facebook page, the Compressive Sensing group on LinkedIn or the Advanced Matrix Factorization group on LinkedIn
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email.
Other links:
Paris Machine Learning: Meetup.com || @Archives || LinkedIn || Facebook || @ParisMLGroup
About LightOn: Newsletter || @LightOnIO || on LinkedIn || on CrunchBase || our Blog
About myself: LightOn || Google Scholar || LinkedIn ||@IgorCarron ||Homepage||ArXiv