Improving Weight-Sharing NAS with Better Search Space and Better Supernet Training

Abstract

Weight-sharing neural architecture search (NAS) is an effective technique for automating efficient neural architecture design. Weight-sharing NAS builds a supernet that assembles all candidate architectures as its sub-networks and trains the sub-networks jointly through shared weights. The success of weight-sharing NAS relies heavily on 1) the search space design and 2) the supernet training strategy. In this talk, we discuss our recent progress on improving weight-sharing NAS by designing better search spaces and better supernet training algorithms to achieve state-of-the-art performance on various computer vision tasks.
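To make the supernet idea concrete, below is a minimal sketch (not the speaker's actual method) of single-path weight sharing: each layer holds weights for several candidate operations, and a sub-network is sampled by picking one candidate per layer, so all architectures share the same underlying weight pool. The class and function names (`MixedLayer`, `Supernet`, `sample_subnet`) are illustrative assumptions, and the candidate ops are toy linear maps.

```python
import random
import numpy as np

class MixedLayer:
    """One supernet layer holding weights for several candidate ops.

    Each candidate here is a toy linear map; a forward pass uses only the
    weights of the sampled candidate, which is the weight-sharing idea.
    """
    def __init__(self, dim, num_choices, rng):
        self.weights = [rng.standard_normal((dim, dim)) * 0.1
                        for _ in range(num_choices)]

    def forward(self, x, choice):
        return x @ self.weights[choice]

class Supernet:
    """Stack of mixed layers; a sub-network is one op choice per layer."""
    def __init__(self, dim=4, depth=3, num_choices=2, seed=0):
        rng = np.random.default_rng(seed)
        self.layers = [MixedLayer(dim, num_choices, rng)
                       for _ in range(depth)]
        self.num_choices = num_choices

    def sample_subnet(self):
        # Uniformly sample one candidate op per layer (single-path style).
        return [random.randrange(self.num_choices) for _ in self.layers]

    def forward(self, x, arch):
        # Run the sampled sub-network; untouched candidates stay idle.
        for layer, choice in zip(self.layers, arch):
            x = layer.forward(x, choice)
        return x
```

In an actual weight-sharing NAS loop, each training step would sample a sub-network, compute its loss on a mini-batch, and update only the sampled candidates' weights; after training, architectures are ranked by evaluating sub-networks with the weights they inherit from the supernet.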

Date
Apr 20, 2021 11:00 AM — 12:00 PM
Meng Li
Staff Research Scientist

I am currently a staff research scientist and tech lead on the Meta On-Device AI team, focusing on researching and productizing efficient AI algorithms and hardware for next-generation AR/VR devices. I received my Ph.D. degree from the Department of Electrical and Computer Engineering at the University of Texas at Austin under the supervision of Prof. David Z. Pan, and my bachelor's degree from Peking University under the supervision of Prof. Ru Huang and Prof. Runsheng Wang. My research interests include efficient and secure AI algorithms and systems.
