Meng Li's Homepage
Deep Learning
NASViT: Neural Architecture Search for Efficient Vision Transformer with Gradient Conflict-Aware Supernet Training
Propose gradient conflict-aware supernet training to improve supernet-based NAS and develop a family of optimized hybrid CNN/ViT networks that achieve a state-of-the-art accuracy-efficiency Pareto front.
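A minimal sketch of the gradient-projection idea behind conflict-aware supernet training, assuming a PCGrad-style projection over flattened gradients; the function name and exact projection rule here are illustrative, not necessarily NASViT's published recipe:

```python
import torch

def resolve_gradient_conflict(grad_subnet: torch.Tensor,
                              grad_supernet: torch.Tensor) -> torch.Tensor:
    """If the sampled subnetwork's gradient points against the supernet's
    gradient (negative inner product), drop its component along the supernet
    direction so the shared weights receive non-conflicting updates."""
    dot = torch.dot(grad_subnet, grad_supernet)
    if dot < 0:
        grad_subnet = grad_subnet - (dot / grad_supernet.pow(2).sum()) * grad_supernet
    return grad_subnet
```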
AlphaNet: Improved Training of Supernet with Alpha-Divergence
Develop AlphaNet, which improves supernet-based NAS with a more general alpha-divergence-based knowledge distillation and achieves a state-of-the-art accuracy-efficiency Pareto front.
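A minimal sketch of alpha-divergence-based knowledge distillation, assuming Amari's alpha-divergence over softmax outputs; the specific alpha values and the max-over-two-divergences combination are illustrative choices rather than AlphaNet's exact loss:

```python
import torch
import torch.nn.functional as F

def alpha_divergence(teacher_logits, student_logits, alpha):
    """Amari alpha-divergence D_alpha(p || q) between teacher p and student q.
    Recovers KL(q || p) as alpha -> 0 and KL(p || q) as alpha -> 1
    (those two limits need special-casing and are avoided here)."""
    p = F.softmax(teacher_logits, dim=-1)
    q = F.softmax(student_logits, dim=-1)
    ratio = (p ** alpha) * (q ** (1.0 - alpha))
    return (ratio.sum(dim=-1) - 1.0) / (alpha * (alpha - 1.0))

def distillation_loss(teacher_logits, student_logits,
                      alpha_minus=-1.0, alpha_plus=0.5):
    """Penalize both under- and over-estimation of the teacher distribution
    by taking the larger of two alpha-divergences with different alphas."""
    d_minus = alpha_divergence(teacher_logits, student_logits, alpha_minus)
    d_plus = alpha_divergence(teacher_logits, student_logits, alpha_plus)
    return torch.maximum(d_minus, d_plus).mean()
```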
AttentiveNAS: Improving Neural Architecture Search via Attentive Sampling
Develop AttentiveNAS, which improves the architecture sampling strategy for supernet-based NAS to achieve a state-of-the-art accuracy-efficiency Pareto front.
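A minimal sketch of attentive (Pareto-aware) sampling during supernet training, assuming a hypothetical `search_space.sample()` method and a cheap `accuracy_predictor`; drawing several candidates and training the predicted-best (or worst) one stands in for uniform subnetwork sampling:

```python
def attentive_sample(search_space, accuracy_predictor, k=50, mode="best"):
    """Draw k random architectures and return the one the predictor ranks
    best (or worst), biasing supernet training toward the performance
    Pareto front instead of a uniformly sampled subnetwork."""
    candidates = [search_space.sample() for _ in range(k)]
    if mode == "best":
        return max(candidates, key=accuracy_predictor)
    return min(candidates, key=accuracy_predictor)
```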