Improving Weight-Sharing NAS with Better Search Space and Better Supernet Training

Abstract

Weight-sharing neural architecture search (NAS) is an effective technique for automating efficient neural architecture design. Weight-sharing NAS builds a supernet that assembles all candidate architectures as its sub-networks and trains the supernet jointly with the sub-networks. The success of weight-sharing NAS relies heavily on 1) the search space design and 2) the supernet training strategy. In this talk, we discuss our recent progress on improving weight-sharing NAS by designing better search spaces and better supernet training algorithms, achieving state-of-the-art performance on various computer vision tasks.
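To make the weight-sharing idea concrete, below is a hypothetical, heavily simplified sketch (not code from the talk) of single-path supernet training: each layer holds several candidate operations with shared weights, each training step samples one sub-network, and only the weights on the sampled path are updated. The scalar "ops" and candidate names (`k3`, `k5`) stand in for real layers such as convolutions with different kernel sizes.

```python
import random

random.seed(0)  # deterministic sampling for this toy example

class SupernetLayer:
    """One supernet layer: several candidate ops sharing the same slot.

    Each candidate's 'weight' is a scalar multiplier standing in for a
    real parameterized op (e.g. a 3x3 or 5x5 convolution)."""
    def __init__(self, candidates):
        self.weights = {name: 1.0 for name in candidates}

    def forward(self, x, choice):
        return x * self.weights[choice]

def sample_subnet(layers):
    # Single-path sampling: pick one candidate op per layer.
    return [random.choice(sorted(layer.weights)) for layer in layers]

def train_step(layers, x, target, lr=0.05):
    choices = sample_subnet(layers)
    out = x
    for layer, c in zip(layers, choices):
        out = layer.forward(out, c)
    err = out - target  # squared-error loss: err ** 2
    # Update only the weights on the sampled path; since the weights are
    # shared, every sub-network containing these ops benefits.
    for layer, c in zip(layers, choices):
        grad = 2.0 * err * (out / layer.weights[c])  # d(err^2)/dw for this op
        layer.weights[c] -= lr * grad
    return err ** 2

# Two layers, two candidate ops each -> four sub-networks sharing weights.
layers = [SupernetLayer(["k3", "k5"]), SupernetLayer(["k3", "k5"])]
losses = [train_step(layers, x=1.0, target=2.0) for _ in range(500)]
```

After training, every sampled sub-network fits the target reasonably well, illustrating why shared weights let a single supernet rank many architectures without training each from scratch.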

Date
April 20, 2021, 11:00 AM — 12:00 PM
Meng Li (李萌)
Assistant Professor, Researcher, Boya Young Fellow

Meng Li is an Assistant Professor and Researcher jointly appointed at the Institute for Artificial Intelligence and the School of Integrated Circuits at Peking University, and a Boya Young Fellow. His research focuses on efficient and secure multimodal AI acceleration algorithms and chips, aiming to build an energy-efficient, reliable, and secure computing foundation for artificial intelligence through cross-layer co-design and optimization from algorithms to chips.
