Ruizhe Li
Lecturer (Assistant Professor) in Computing Science, University of Aberdeen
Verified email at abdn.ac.uk
Title · Cited by · Year
A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification
R Li, C Lin, M Collinson, X Li, G Chen
The SIGNLL Conference on Computational Natural Language Learning (CoNLL), 2019
42 · 2019
A Stable Variational Autoencoder for Text Modelling
R Li, X Li, C Lin, M Collinson, R Mao
The 12th International Conference on Natural Language Generation (INLG), 2019
28 · 2019
Latent Space Factorisation and Manipulation via Matrix Subspace Projection
X Li, C Lin, R Li, C Wang, F Guerin
The 37th International Conference on Machine Learning (ICML), 2020
21 · 2020
DGST: a Dual-Generator Network for Text Style Transfer
X Li, G Chen, C Lin, R Li
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
13 · 2020
Improving Variational Autoencoder for Text Modelling with Timestep-Wise Regularisation
R Li, X Li, G Chen, C Lin
The 28th International Conference on Computational Linguistics (COLING), 2020
12 · 2020
ABDN at SemEval-2018 Task 10: Recognising Discriminative Attributes using Context Embeddings and WordNet
R Mao, G Chen, R Li, C Lin
Proceedings of the 12th International Workshop on Semantic Evaluation, 1017-1021, 2018
3 · 2018
On the Latent Holes 🧀 of VAEs for Text Generation
R Li, X Peng, C Lin
arXiv:2110.03318, 2021
2021
Deep Latent Variable Models for Text Modelling
R Li
University of Sheffield, 2021
2021
Affective Decoding for Empathetic Response Generation
C Zeng, G Chen, C Lin, R Li, Z Chen
The 14th International Conference on Natural Language Generation (INLG), 2021
2021
On the Low-density Latent Regions of VAE-based Language Models
R Li, X Peng, C Lin, W Rong, Z Chen
NeurIPS 2020 Workshop on Pre-registration in Machine Learning, PMLR 148, 343-357, 2021
2021
On the Low-density Latent Regions of VAE-based Language Models
R Li, X Peng, C Lin, F Guerin, W Rong
NeurIPS Pre-registration Workshop, 2020
2020