Personal Homepage of 李萌
Efficient AI
AttentiveNAS: Improving Neural Architecture Search via Attentive Sampling
Neural architecture search (NAS) has shown great promise in designing state-of-the-art (SOTA) models that are both accurate and …
Dilin Wang, 李萌, Chengyue Gong, Vikas Chandra
Improving efficiency in neural network accelerators using operands Hamming distance optimization
Neural network accelerators are a key enabler for on-device AI inference, for which energy efficiency is an important metric. The …
李萌, Yilei Li, Vikas Chandra
Co-exploration of neural architectures and heterogeneous ASIC accelerator designs targeting multiple tasks
Neural architecture search (NAS) has shown great promise in designing state-of-the-art (SOTA) models that are both accurate and …
Lei Yang, Zheyu Yan, 李萌, Hyoukjun Kwon, Liangzhen Lai, Tushar Krishna, Vikas Chandra, Weiwen Jiang, Yiyu Shi