
Publication

A Brief Overview of Universal Sentence Representation Methods: A Linguistic View

Journal contribution - Journal article

How to transfer the semantic information in a sentence into a computable numerical embedding is a fundamental problem in natural language processing. An informative universal sentence embedding can greatly benefit subsequent natural language processing tasks. However, unlike universal word embeddings, a widely accepted general-purpose sentence-embedding technique has not yet been developed. This survey summarizes current universal sentence-embedding methods, categorizes them into four groups from a linguistic view, and analyzes their reported performance. Sentence embeddings trained from words in a bottom-up manner are observed to show different, nearly opposite, performance patterns on downstream tasks compared with those trained from logical relationships between sentences. By comparing differences in training schemes within and between groups, we analyze possible essential reasons for these different performance patterns. We additionally collect strategies for handling sentences used in other models and propose potentially inspiring future research directions.
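As an illustration of the bottom-up scheme contrasted in the abstract, the sketch below composes a sentence embedding by mean-pooling word vectors. The word vectors, tokens, and helper names are hypothetical and chosen only for illustration; they stand in for pretrained embeddings such as word2vec or GloVe. Sentence-level approaches, by contrast, train an encoder directly on inter-sentence objectives such as natural language inference, rather than composing word vectors.

# Minimal sketch of a bottom-up sentence embedding: average (mean-pool) the
# word vectors of a sentence's tokens. All vector values are toy numbers.
import numpy as np

# Hypothetical 4-dimensional word vectors, for illustration only.
word_vectors = {
    "the":    np.array([0.1, 0.0, 0.2, 0.1]),
    "cat":    np.array([0.8, 0.3, 0.1, 0.5]),
    "dog":    np.array([0.7, 0.4, 0.2, 0.5]),
    "sleeps": np.array([0.2, 0.9, 0.6, 0.1]),
}

def sentence_embedding(tokens):
    """Bottom-up composition: average the word vectors of known tokens."""
    vectors = [word_vectors[t] for t in tokens if t in word_vectors]
    return np.mean(vectors, axis=0)

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

s1 = sentence_embedding(["the", "cat", "sleeps"])
s2 = sentence_embedding(["the", "dog", "sleeps"])
print(cosine_similarity(s1, s2))  # high: the averaged word vectors are close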
Journal: ACM Computing Surveys
ISSN: 0360-0300
Issue: 3
Volume: 55
Pages: 1 - 42
Year of publication: 2022
Accessibility: Closed