Wang / Yin / Aich | AI for Research and Scalable, Efficient Systems | Book | 978-981-968911-8 | sack.de

Book, English, 302 pages, format (W × H): 155 mm × 235 mm, weight: 482 g

Series: Communications in Computer and Information Science

Wang / Yin / Aich

AI for Research and Scalable, Efficient Systems

Second International Workshop, AI4Research 2025, and First International Workshop, SEAS 2025, Held in Conjunction with AAAI 2025, Philadelphia, PA, USA, February 25-March 4, 2025, Proceedings
Year of publication: 2025
ISBN: 978-981-968911-8
Publisher: Springer



This book constitutes the proceedings of the Second International Workshop, AI4Research 2025, and First International Workshop, SEAS 2025, which were held in conjunction with AAAI 2025, Philadelphia, PA, USA, during February 25–March 4, 2025.

AI4Research 2025 presented 8 full papers from 35 submissions. The papers covered diverse areas such as agent debate evaluation, taxonomy expansion, hypothesis generation, AI4Research benchmarks, caption generation, drug discovery, and financial auditing. 

SEAS 2025 accepted 7 full papers from 17 submissions. These papers explore the efficiency and scalability of AI models.


Target audience


Research

Further information & material


AI4Research 2025:

- ResearchCodeAgent: An LLM Multi-Agent System for Automated Codification of Research Methodologies
- LLMs Tackle Meta-Analysis: Automating Scientific Hypothesis Generation with Statistical Rigor
- AuditBench: A Benchmark for Large Language Models in Financial Statement Auditing
- Clustering Time Series Data with Gaussian Mixture Embeddings in a Graph Autoencoder Framework
- Empowering AI as Autonomous Researchers: Evaluating LLMs in Generating Novel Research Ideas through Automated Metrics
- Multi-LLM Collaborative Caption Generation in Scientific Documents
- CypEGAT: A Deep Learning Framework Integrating Protein Language Model and Graph Attention Networks for Enhanced CYP450s Substrate Prediction
- Understanding How Paper Writers Use AI-Generated Captions in Figure Caption Writing

SEAS 2025:

- ssProp: Energy-Efficient Training for Convolutional Neural Networks with Scheduled Sparse Back Propagation
- Knowledge Distillation with Training Wheels
- PickLLM: Context-Aware RL-Assisted Large Language Model Routing
- ZNorm: Z-Score Gradient Normalization Accelerating Skip-Connected Network Training without Architectural Modification
- The Impact of Multilingual Model Scaling on Seen and Unseen Language Performance
- Information Consistent Pruning: How to Efficiently Search for Sparse Networks?
- Efficient Image Similarity Search with Quadtrees


