Marjan Ghazvininejad
Research Scientist, FAIR (Facebook AI Research)
BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension
M Lewis, Y Liu, N Goyal, M Ghazvininejad, A Mohamed, O Levy, ...
arXiv preprint arXiv:1910.13461, 2019
Multilingual denoising pre-training for neural machine translation
Y Liu, J Gu, N Goyal, X Li, S Edunov, M Ghazvininejad, M Lewis, ...
Transactions of the Association for Computational Linguistics 8, 726-742, 2020
A knowledge-grounded neural conversation model
M Ghazvininejad, C Brockett, MW Chang, B Dolan, J Gao, W Yih, ...
Proceedings of the AAAI Conference on Artificial Intelligence 32 (1), 2018
Mask-predict: Parallel decoding of conditional masked language models
M Ghazvininejad, O Levy, Y Liu, L Zettlemoyer
arXiv preprint arXiv:1904.09324, 2019
Generating Topical Poetry
M Ghazvininejad, X Shi, Y Choi, K Knight
Proceedings of Empirical Methods in Natural Language Processing, 2016
Hafez: an interactive poetry generation system
M Ghazvininejad, X Shi, J Priyadarshi, K Knight
Proceedings of the ACL Demo Track, 2017
Towards controllable story generation
N Peng, M Ghazvininejad, J May, K Knight
Proceedings of the First Workshop on Storytelling, 43-49, 2018
Pre-training via paraphrasing
M Lewis, M Ghazvininejad, G Ghosh, A Aghajanyan, S Wang, ...
Advances in Neural Information Processing Systems 33, 18470-18481, 2020
Non-autoregressive machine translation with disentangled context transformer
J Kasai, J Cross, M Ghazvininejad, J Gu
International Conference on Machine Learning, 5144-5155, 2020
Aligned cross entropy for non-autoregressive machine translation
M Ghazvininejad, V Karpukhin, L Zettlemoyer, O Levy
International Conference on Machine Learning, 3515-3523, 2020
DeLighT: Deep and light-weight transformer
S Mehta, M Ghazvininejad, S Iyer, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2008.00623, 2020
Training on synthetic noise improves robustness to natural noise in machine translation
V Karpukhin, O Levy, J Eisenstein, M Ghazvininejad
arXiv preprint arXiv:1902.01509, 2019
Detecting hallucinated content in conditional neural sequence generation
C Zhou, G Neubig, J Gu, M Diab, P Guzman, L Zettlemoyer, ...
arXiv preprint arXiv:2011.02593, 2020
Improving zero and few-shot abstractive summarization with intermediate fine-tuning and data augmentation
AR Fabbri, S Han, H Li, H Li, M Ghazvininejad, S Joty, D Radev, ...
arXiv preprint arXiv:2010.12836, 2020
Semi-autoregressive training improves mask-predict decoding
M Ghazvininejad, O Levy, L Zettlemoyer
arXiv preprint arXiv:2001.08785, 2020
From local similarity to global coding: An application to image classification
A Shaban, HR Rabiee, M Farajtabar, M Ghazvininejad
Proceedings of the IEEE Conference on Computer Vision and Pattern …, 2013
Prompting contrastive explanations for commonsense reasoning tasks
B Paranjape, J Michael, M Ghazvininejad, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2106.06823, 2021
A review on language models as knowledge bases
B AlKhamissi, M Li, A Celikyilmaz, M Diab, M Ghazvininejad
arXiv preprint arXiv:2204.06031, 2022
Simple and effective retrieve-edit-rerank text generation
N Hossain, M Ghazvininejad, L Zettlemoyer
Proceedings of the 58th Annual Meeting of the Association for Computational …, 2020
Recipes for adapting pre-trained monolingual and multilingual models to machine translation
AC Stickland, X Li, M Ghazvininejad
arXiv preprint arXiv:2004.14911, 2020