Shien Zhu
School of Computer Science and Engineering, Nanyang Technological University, Singapore
Verified email at e.ntu.edu.sg
Title
Cited by
Year
XOR-Net: An Efficient Computation Pipeline for Binary Neural Network Inference on Edge Devices
S Zhu, LHK Duong, W Liu
2020 IEEE 26th International Conference on Parallel and Distributed Systems …, 2020
Cited by 20, 2020
EDLAB: A benchmark for edge deep learning accelerators
H Kong, S Huai, D Liu, L Zhang, H Chen, S Zhu, S Li, W Liu, M Rastogi, ...
IEEE Design and Test, 2021
Cited by 14, 2021
TAB: Unified and optimized ternary, binary, and mixed-precision neural network inference on the edge
S Zhu, LHK Duong, W Liu
ACM Transactions on Embedded Computing Systems (TECS) 21 (5), 1-26, 2022
Cited by 7, 2022
FAT: An in-memory accelerator with fast addition for ternary weight neural networks
S Zhu, LHK Duong, H Chen, D Liu, W Liu
IEEE Transactions on Computer-Aided Design of Integrated Circuits and …, 2022
Cited by 5, 2022
iMAD: An In-Memory Accelerator for AdderNet with Efficient 8-Bit Addition and Subtraction Operations
S Zhu, S Li, W Liu
Proceedings of the Great Lakes Symposium on VLSI 2022, 65-70, 2022
Cited by 4, 2022
Parallel multipath transmission for burst traffic optimization in point-to-point NoCs
H Chen, Z Zhang, P Chen, S Zhu, W Liu
Proceedings of the 2021 Great Lakes Symposium on VLSI, 289-294, 2021
Cited by 2, 2021
An Efficient Sparse LSTM Accelerator on Embedded FPGAs with Bandwidth-Oriented Pruning
S Li, S Zhu, X Luo, T Luo, W Liu
2023 33rd International Conference on Field-Programmable Logic and …, 2023
2023
iMAT: Energy-Efficient In-Memory Acceleration for Ternary Neural Networks With Sparse Dot Product
S Zhu, S Huai, G Xiong, W Liu
2023 IEEE/ACM International Symposium on Low Power Electronics and Design …, 2023
2023
Deep learning acceleration: from quantization to in-memory computing
S Zhu
Nanyang Technological University, 2022
2022
Cross-filter compression for CNN inference acceleration
F Lyu, S Zhu, W Liu
arXiv preprint arXiv:2005.09034, 2020
2020