
E-Book, English, Volume 2229, 507 pages

Series: Communications in Computer and Information Science

Xu / Qin / Mountantonakis China Conference on Knowledge Graph and Semantic Computing and International Joint Conference on Knowledge Graphs

International Joint Conference, CCKS-IJCKG 2024, Chongqing, China, September 20–22, 2024, Proceedings


ISBN: 978-981-96-1809-5
Publisher: Springer Singapore
Format: PDF
Copy protection: PDF watermark



This book constitutes the joint refereed proceedings of the 18th China Conference on Knowledge Graph and Semantic Computing and the 13th International Joint Conference on Knowledge Graphs, CCKS-IJCKG 2024, held in Chongqing, China, during September 20–22, 2024. The 30 full papers and 11 other papers presented in this volume were carefully reviewed and selected from 168 submissions. They are organized in the following topical sections: Knowledge representation and reasoning; Knowledge graph construction and knowledge integration; Graph database and knowledge management; Machine learning on graphs; Knowledge retrieval and information retrieval; Knowledge graph and large language model applications; Knowledge graph open resources; Poster and demo; Evaluations.

Target audience


Research

Further information & material


.- Knowledge Representation and Reasoning.
.- KG-diffusion: an Improved Knowledge Graph Completion with Diffusion.
.- Cardiovascular Disease Knowledge Graph Reasoning Method Based on ConvKB Link Predication.
.- Evolutionary Graph Network with Time-aware Attention for Temporal Knowledge Graph Reasoning.
.- The Framework Design of a Semantic Role-Based Knowledge Graph for Natural Disaster Emergency Response.
.- Research on Automatic Extraction of Emergency Response Standards Concept Hierarchy Based on LDA.
.- Knowledge Graph Construction and Knowledge Integration.
.- Beyond Isolation: Multi-Agent Synergy for Improving Knowledge Graph Construction.
.- SCM-Net: Semantic-Contrastive Multimodal Framework for Enhanced Chinese NER.
.- Biomedical Document Relation Extraction Via Mention-Entity Double Fusion and Contrast Enhanced Inference.
.- GVDExtractor: Document-level Ternary Relation Extraction of Gene-Variant-Disease from Medical Literature.
.- Alias Extraction Enhanced by Automatically Generated Long-Tail Instances.
.- Taxonomy Induction Using LLMs: An Enhanced Framework by Integrating Doubly-Checked Mechanism and Self-Evaluation Strategy.
.- Graph Database and Knowledge Management.
.- Relation Inquiry: A Novel Synchronous Joint Extractor for Entities and Relations.
.- Machine Learning on Graphs.
.- Flexible Multi-view Subspace Clustering with Anchor Structure Alignment.
.- Knowledge Retrieval and Information Retrieval.
.- Reliable Academic Conference Question Answering: A Study Based on Large Language Model.
.- Knowledge Graph and Large Language Model Applications.
.- Benchmarking Knowledge Graph-grounded Factual Verification.
.- An LLM-SPARQL Hybrid Framework for Named Entity Linking and Disambiguation to Wikidata.
.- Mitigating Multi-Hop Hallucination in Large Language Models with Non-Authoritative Knowledge Sources.
.- LLM-AR: Large Language Model Augmented Retrieval for Few-shot Knowledge Graph Completion.
.- Hierarchical Knowledge Graph Attention Network for Recommendation Systems.
.- Few-shot Fine-grained Ship Detection.
.- Adaptive Factual Decoding for Hallucination Mitigation with Part-Of-Speech based Critics.
.- Knowledge Graph Open Resources.
.- EduChat: A Large Language Model-Based Conversational Agent for Intelligent Education.
.- Manu-Eval: A Chinese Language Understanding Benchmark for Manufacturing Industry.
.- A Comprehensive Ontology Knowledge Evaluation System for Large Language Models.
.- Poster and Demo.
.- Enhancing Traditional Chinese Medicine Information Extraction Using Instruction-Tuned Large Models.
.- Development of an Intelligent Chinese Medicine Q&A System Based on Traditional Chinese Medicine Knowledge Graph and Large Language Models.
.- KAOS: Large Model Multi-Agent Operating System.
.- Integrating Large Language Models with Knowledge Graphs in Traditional Chinese Medicine Consultation: A Case Study.
.- Local Index File-based Tool for Extracting Class Hierarchies from Wikidata.
.- A Study on the Metadata System and the Construction of Knowledge Graph of the Classic of Mountains and Rivers - Taking the Classic of the Southern Mountains as an Example.
.- Evaluations.
.- A Two-Stage Approach for Knowledge Editing in LLM.
.- LLM-based Functional Query Generation with Multi-relation Alignment for Complex Knowledge Based Question Answering.
.- A Person Attribute Knowledge-Based Question Answering Method Leveraging Large Language Models.
.- Instruction Fine-Tuning of Large Language Models for Traditional Chinese Medicine.
.- Enhancing Traditional Chinese Medicine Question Answering and Semantic Reasoning via Historical Exam Retrieval and Sentence Similarity.
.- Chinese Knowledge Base Question Answering System with Retrieval Augmented Generation.
.- Fast Assortativity Coefficient Calculation in Large-scale Social Networks.
.- MQATG: An Automatic Military Equipment Question-Answer Test Case Generation Framework using Large Language Models.
.- Boosting Q&A Generation for Military Equipment via Example Selection and Automated Prompt Engineering.
.- Improving SQL Generation with Schema Retrieval and Reaction Mechanism.
.- HIT-SCIR at CCKS-IJCKG2024: Enhancing Text-to-SQL with Multi-Step Pipeline.

