AAAI 2026

January 22, 2026

Singapore, Singapore


With the growing demand for decentralized collaborative analysis of privacy-sensitive data, federated multi-view clustering (FMVC) has attracted widespread attention for its ability to balance privacy protection with collaborative modeling. However, current methods still face the following challenges: (1) clients must frequently upload high-dimensional data such as model parameters or graph structures, incurring high communication costs; (2) the uploaded structured data often carries semantic features and is at high risk of being inverted; (3) the server usually merges data from all clients with a fixed fusion rule, which can yield suboptimal clustering results when low-quality clients are present. To address these issues, we propose a new trusted federated multi-view clustering framework (EvoFMVC) that introduces three key innovations. First, lightweight trusted evidence serves as a compact communication medium, significantly reducing overhead compared with transmitting model parameters or graph structures. Second, trusted evidence expresses clustering results as probability distributions, avoiding the risk of structured information being easily inverted. Third, we formalize the server-side aggregation process as a neural architecture search (NAS) task in which the server flexibly applies different fusion operators to filter and fuse the necessary views via evolutionary algorithms, significantly improving fusion quality and model performance. Experimental results on multiple datasets show that our method outperforms existing FMVC methods in both clustering accuracy and communication efficiency. (The source code will be published.)
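The server-side evolutionary search over fusion operators can be illustrated with a minimal, self-contained sketch. Everything here is an assumption for illustration only: the operator set (a simple include/drop filter), the averaging `fuse` step, the confidence-based `fitness` proxy, and the `evolve` loop are not the paper's actual design, which is not detailed in this abstract.

```python
import random

# Hypothetical per-view fusion operators; the paper's actual operator
# set is not given here, so we use a simple include/drop filter.
OPERATORS = ["include", "drop"]

def fuse(evidences, genome):
    """Average the evidence distributions of the views the genome keeps."""
    kept = [e for e, g in zip(evidences, genome) if g == "include"]
    if not kept:
        return None
    k = len(kept[0])
    return [sum(e[i] for e in kept) / len(kept) for i in range(k)]

def fitness(evidences, genome):
    """Toy proxy for fusion quality: confidence (max prob.) of the fused
    distribution. A real system would use a clustering-quality criterion."""
    fused = fuse(evidences, genome)
    return max(fused) if fused else 0.0

def evolve(evidences, pop_size=20, generations=30, seed=0):
    """Elitist evolutionary search over per-view operator assignments."""
    rng = random.Random(seed)
    n = len(evidences)
    pop = [[rng.choice(OPERATORS) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(evidences, g), reverse=True)
        parents = pop[: pop_size // 2]          # keep the fitter half
        children = []
        for p in parents:
            child = p[:]
            child[rng.randrange(n)] = rng.choice(OPERATORS)  # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda g: fitness(evidences, g))

# Three clients' evidence (probability distributions over 3 clusters);
# the third client is deliberately low-quality (near-uniform).
evidences = [
    [0.80, 0.10, 0.10],
    [0.70, 0.20, 0.10],
    [0.34, 0.33, 0.33],
]
best = evolve(evidences)
```

On this toy input the search learns to drop the near-uniform third view, mirroring the abstract's point that a learned fusion rule can filter out low-quality clients where a fixed rule cannot.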
