Can word vectors help corpus linguists?

Abstract: Two recent methods based on distributional semantic models (DSMs) have proved very successful at learning high-quality vector representations of words from large corpora: word2vec (Mikolov, Chen, et al. 2013; Mikolov, Yih, et al. 2013) and GloVe (Pennington et al. 2014). Once trained on a very large corpus, these algorithms produce distributed representations of words in the form of vectors, and such DSMs based on deep learning and neural networks have proved efficient at representing the meaning of individual words. In this paper, I assess to what extent state-of-the-art word-vector semantics can help corpus linguists annotate large datasets for semantic classes. Although word vectors offer decisive opportunities for resolving semantic annotation issues, they have yet to improve in their representation of polysemy, homonymy, and multiword expressions.
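The abstract's premise — that trained models represent each word as a dense vector whose geometric closeness tracks semantic relatedness — can be illustrated with a minimal sketch. The vectors below are hypothetical toy values, not output from word2vec or GloVe, and the words chosen are purely illustrative; cosine similarity is the standard closeness measure used with such models.

```python
import math

# Toy word vectors (illustrative values, not trained embeddings).
vectors = {
    "linguist": [0.9, 0.1, 0.3],
    "corpus":   [0.8, 0.2, 0.4],
    "banana":   [0.1, 0.9, 0.2],
}

def cosine(u, v):
    """Cosine similarity: dot product over the product of vector norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# In this toy space, "linguist" sits closer to "corpus" than to "banana",
# which is the property a semantic annotator would exploit.
print(cosine(vectors["linguist"], vectors["corpus"]))  # high similarity
print(cosine(vectors["linguist"], vectors["banana"]))  # low similarity
```

In practice the vectors would come from a model trained on a large corpus (e.g. via a library such as gensim), and a linguist could annotate a dataset by assigning each word the semantic class of its nearest labelled neighbours.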

Contributor: Guillaume Desagulier
Submitted on: Wednesday, October 3, 2018 - 12:56:23 PM
Last modification on: Tuesday, March 2, 2021 - 10:14:32 AM
Guillaume Desagulier. Can word vectors help corpus linguists? Studia Neophilologica, Taylor & Francis (Routledge), 2019. ⟨10.1080/00393274.2019.1616220⟩. ⟨halshs-01657591v2⟩