Sep 20, 2023 · Abstract: We present an approach for assessing how multilingual large language models (LLMs) learn syntax in terms of multi-formalism ...
2023. Assessment of Pre-Trained Models Across Languages and Grammars. In Proceedings of the 13th International Joint Conference on Natural Language Processing ...
Mar 14, 2024 · Assessment of Pre-Trained Models Across Languages and Grammars ... Different Languages: Probing Morphosyntax in Multilingual Pre-trained Models.
InfoXLM: an information-theoretic framework for cross-lingual language model pre-training. ... Scaling language models: methods, analysis & insights from training ...
Jul 6, 2024 · In the first approach, we built correlational models looking at the relationship between LLMs-surprisals and aphasia severity, and LLM- ...
Mar 16, 2024 · Large language models (LLMs) aim to provide a single set of representations that captures both linguistic knowledge and world knowledge across a ...
Sep 1, 2023 · This paper extensively observes word embeddings, contextual embeddings, and transformer-based pre-trained models, exploring various techniques, ...
Nov 11, 2023 · (2022) discuss work that pre-trains models on abstract reasoning templates to improve their performance in NLI. It is not clear to what extent ...