Sun / Liu / Che | Chinese Computational Linguistics | Book | 978-3-031-18314-0 | sack.de

Book, English, 351 pages, format (W × H): 155 mm × 235 mm, weight: 563 g

Series: Lecture Notes in Computer Science

Sun / Liu / Che

Chinese Computational Linguistics

21st China National Conference, CCL 2022, Nanchang, China, October 14-16, 2022, Proceedings
1st edition, 2022
ISBN: 978-3-031-18314-0
Publisher: Springer International Publishing



This book constitutes the proceedings of the 21st China National Conference on Computational Linguistics, CCL 2022, held in Nanchang, China, in October 2022.
The 22 full English-language papers in this volume were carefully reviewed and selected from 293 Chinese and English submissions.

The conference papers are organized under the following topical headings: Linguistics and Cognitive Science; Fundamental Theory and Methods of Computational Linguistics; Information Retrieval, Dialogue and Question Answering; Text Generation and Summarization; Knowledge Graph and Information Extraction; Machine Translation and Multilingual Information Processing; Minority Language Information Processing; Language Resource and Evaluation; NLP Applications.


Target audience

Research

Further information & material


Linguistics and Cognitive Science
- Discourse Markers as the Classificatory Factors of Speech Acts

Fundamental Theory and Methods of Computational Linguistics
- DIFM: An effective deep interaction and fusion model for sentence matching
- ConIsI: A Contrastive Framework with Inter-sentence Interaction for Self-supervised Sentence Representation

Information Retrieval, Dialogue and Question Answering
- Data Synthesis and Iterative Refinement for Neural Semantic Parsing without Annotated Logical Forms
- EventBERT: Incorporating Event-based Semantics for Natural Language Understanding
- An Exploration of Prompt-Based Zero-Shot Relation Extraction Method
- Abstains from Prediction: Towards Robust Relation Extraction in Real World
- Using Extracted Emotion Cause to Improve Content-Relevance for Empathetic Conversation Generation

Text Generation and Summarization
- To Adapt or to Fine-tune: A Case Study on Abstractive Summarization

Knowledge Graph and Information Extraction
- MRC-based Medical NER with Multi-task Learning and Multi-strategies
- A Multi-Gate Encoder for Joint Entity and Relation Extraction
- Improving Event Temporal Relation Classification via Auxiliary Label-Aware Contrastive Learning

Machine Translation and Multilingual Information Processing
- Towards Making the Most of Pre-trained Translation Model for Quality Estimation
- Supervised Contrastive Learning for Cross-lingual Transfer Learning

Minority Language Information Processing
- Interactive Mongolian Question Answer Matching Model Based on Attention Mechanism in the Law Domain

Language Resource and Evaluation
- TCM-SD: A Benchmark for Probing Syndrome Differentiation via Natural Language Processing
- COMPILING: A Benchmark Dataset for Chinese Complexity Controllable Definition Generation

NLP Applications
- Can We Really Trust Explanations? Evaluating the Stability of Feature Attribution Explanation Methods via Adversarial Attack
- Dynamic Negative Example Construction for Grammatical Error Correction using Contrastive Learning
- SPACL: Shared-Private Architecture based on Contrastive Learning for Multi-domain Text Classification
- Low-Resource Named Entity Recognition Based on Multi-hop Dependency Trigger
- Fundamental Analysis based Neural Network for Stock Movement Prediction


