Katharina von der Wense
Other names: Katharina Kann
Johannes Gutenberg University Mainz & University of Colorado Boulder
Verified email at colorado.edu
Title · Cited by · Year
Comparative study of CNN and RNN for natural language processing
W Yin, K Kann, M Yu, H Schütze
arXiv preprint arXiv:1702.01923, 2017
1357 · 2017
Intermediate-task transfer learning with pretrained models for natural language understanding: When and why does it work?
Y Pruksachatkun, J Phang, H Liu, PM Htut, X Zhang, RY Pang, C Vania, ...
arXiv preprint arXiv:2005.00628, 2020
279 · 2020
The CoNLL–SIGMORPHON 2018 Shared Task: Universal Morphological Reinflection
R Cotterell, C Kirov, J Sylak-Glassman, G Walther, E Vylomova, ...
arXiv preprint arXiv:1810.07125, 2018
146 · 2018
MED: The LMU system for the SIGMORPHON 2016 shared task on morphological reinflection
K Kann, H Schütze
Proceedings of the 14th SIGMORPHON Workshop on Computational Research in …, 2016
113 · 2016
Single-model encoder-decoder with explicit morphological representation for reinflection
K Kann, H Schütze
arXiv preprint arXiv:1606.00589, 2016
95 · 2016
Sentence-level fluency evaluation: References help, but can be spared!
K Kann, S Rothe, K Filippova
arXiv preprint arXiv:1809.08731, 2018
76 · 2018
English Intermediate-Task Training Improves Zero-Shot Cross-Lingual Transfer Too
J Phang, PM Htut, Y Pruksachatkun, H Liu, C Vania, K Kann, I Calixto, ...
AACL 2020, 2020
71 · 2020
Findings of the AmericasNLP 2021 shared task on open machine translation for indigenous languages of the Americas
M Mager, A Oncevay, A Ebrahimi, J Ortega, AR Gonzales, A Fan, ...
Proceedings of the First Workshop on Natural Language Processing for …, 2021
70 · 2021
AmericasNLI: Evaluating zero-shot natural language understanding of pretrained multilingual models in truly low-resource languages
A Ebrahimi, M Mager, A Oncevay, V Chaudhary, L Chiruzzo, A Fan, ...
arXiv preprint arXiv:2104.08726, 2021
70 · 2021
Fortification of neural morphological segmentation models for polysynthetic minimal-resource languages
K Kann, M Mager, I Meza-Ruiz, H Schütze
arXiv preprint arXiv:1804.06024, 2018
68 · 2018
Training data augmentation for low-resource morphological inflection
T Bergmanis, K Kann, H Schütze, S Goldwater
Proceedings of the CoNLL SIGMORPHON 2017 Shared Task: Universal …, 2017
62 · 2017
jiant 1.2: A software toolkit for research on general-purpose text understanding models
A Wang, IF Tenney, Y Pruksachatkun, K Yu, J Hula, P Xia, R Pappagari, ...
http://jiant.info/, 2019
52 · 2019
Neural morphological analysis: Encoding-decoding canonical segments
K Kann, R Cotterell, H Schütze
Proceedings of the 2016 conference on empirical methods in natural language …, 2016
51 · 2016
Towards realistic practices in low-resource natural language processing: The development set
K Kann, K Cho, SR Bowman
arXiv preprint arXiv:1909.01522, 2019
50 · 2019
How to adapt your pretrained multilingual model to 1600 languages
A Ebrahimi, K Kann
arXiv preprint arXiv:2106.02124, 2021
48 · 2021
Verb argument structure alternations in word and sentence embeddings
K Kann, A Warstadt, A Williams, SR Bowman
arXiv preprint arXiv:1811.10773, 2018
48 · 2018
One-shot neural cross-lingual transfer for paradigm completion
K Kann, R Cotterell, H Schütze
arXiv preprint arXiv:1704.00052, 2017
43 · 2017
Probing for semantic classes: Diagnosing the meaning content of word embeddings
Y Yaghoobzadeh, K Kann, TJ Hazen, E Agirre, H Schütze
arXiv preprint arXiv:1906.03608, 2019
39 · 2019
Neural multi-source morphological reinflection
K Kann, R Cotterell, H Schütze
arXiv preprint arXiv:1612.06027, 2016
37 · 2016
Articles 1–20