Sparse Coding of Neural Word Embeddings for Multilingual Sequence Labeling
Márton Makrai
June 15, 2017, 8:15
MTA SZTAKI (Lágymányosi u. 11, Budapest) Room 306 or 506
Márton Makrai presents a paper by Gábor Berend (TACL 2016), Sparse Coding of Neural Word Embeddings for Multilingual Sequence Labeling.
From the abstract:
- (near) state-of-the-art performance for both part-of-speech tagging and named entity recognition for a variety of languages
- reasonable results for POS tagging on more than 40 treebanks
- the model relies only on a few thousand sparse coding-derived features, without applying any modification of the word representations employed for the different tasks (a sketch of this idea follows the list)
- the proposed model has favorable generalization properties
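Below is a minimal, hypothetical sketch of the sparse-coding step, not the paper's actual implementation: it uses scikit-learn's DictionaryLearning on a random matrix standing in for pre-trained embeddings, and turns the indices of nonzero coefficients into discrete features that a linear sequence labeler (e.g. a CRF) could consume. All names and parameter values are illustrative assumptions.

```python
# Minimal sketch (illustrative only, not the paper's implementation):
# learn a sparse code for dense word embeddings and use the indices of
# nonzero coefficients as indicator features for a sequence labeler.
import numpy as np
from sklearn.decomposition import DictionaryLearning

# Stand-in for pre-trained embeddings: 1,000 "words", 100 dimensions.
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((1000, 100))

# Learn an overcomplete dictionary; alpha controls how sparse the codes are.
learner = DictionaryLearning(n_components=256, alpha=0.5,
                             transform_algorithm="lasso_lars", random_state=0)
codes = learner.fit_transform(embeddings)  # shape (1000, 256), mostly zeros

def indicator_features(word_index):
    """Return sparse-coding-derived indicator features for one word."""
    return [f"sparse={k}" for k in np.flatnonzero(codes[word_index])]

# These string features could be fed to a linear tagger such as a CRF.
print(indicator_features(0))
```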