Contrast-Attention Hypergraph Neural Network for Image Classification: A Multi-View Representation Learning Framework

Abstract
To address the challenge of representing complex relations in image data, we propose the Contrast-Attention Hypergraph Neural Network (CAHG), a framework that integrates hypergraph modeling, contrastive learning, and attention mechanisms to capture rich semantic structure from multiple views of an image. Applied to a remote sensing cloud image dataset, CAHG achieves superior classification performance compared to baseline models. The results validate the effectiveness of multi-view embedding and hypergraph construction for learning discriminative features, and they suggest generalization potential to domains such as autonomous driving and medical imaging.
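
As a rough illustration of how these three ingredients can fit together, the sketch below is a minimal NumPy mock-up, not the authors' implementation: the function names, the HGNN-style hypergraph convolution, the view-attention fusion, and the InfoNCE-style contrastive loss are all assumptions made for exposition. It applies one hypergraph convolution per image view, fuses the resulting view embeddings with an attention weighting, and scores cross-view agreement with a contrastive objective.

import numpy as np

def hypergraph_conv(X, H, Theta):
    # One HGNN-style hypergraph convolution (assumed formulation):
    #   X' = ReLU( Dv^{-1/2} H De^{-1} H^T Dv^{-1/2} X Theta )
    # X: (n_nodes, d_in) node features, H: (n_nodes, n_edges) incidence matrix,
    # Theta: (d_in, d_out) learnable weights; hyperedge weights are identity here.
    Dv = np.diag(1.0 / np.sqrt(H.sum(axis=1) + 1e-9))   # node degree^(-1/2)
    De = np.diag(1.0 / (H.sum(axis=0) + 1e-9))          # hyperedge degree^(-1)
    return np.maximum(Dv @ H @ De @ H.T @ Dv @ X @ Theta, 0.0)

def attention_fuse(views, a):
    # Fuse per-view embeddings (list of (n, d) arrays) with an attention
    # vector a of shape (d,); softmax over views, computed per node.
    scores = np.stack([V @ a for V in views], axis=1)    # (n, n_views)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return sum(w[:, i:i + 1] * views[i] for i in range(len(views)))

def infonce(Z1, Z2, tau=0.5):
    # Contrastive (InfoNCE-style) loss: the same node in two views is a
    # positive pair, all other nodes serve as negatives.
    Z1 = Z1 / (np.linalg.norm(Z1, axis=1, keepdims=True) + 1e-9)
    Z2 =2 * 0 + Z2 / (np.linalg.norm(Z2, axis=1, keepdims=True) + 1e-9)
    sim = Z1 @ Z2.T / tau                                # cross-view similarities
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))                   # positives on the diagonal

# Toy usage with random data (shapes only; no real images).
rng = np.random.default_rng(0)
n, d, e = 8, 16, 5
X1, X2 = rng.normal(size=(n, d)), rng.normal(size=(n, d))  # two image views
H = (rng.random((n, e)) > 0.6).astype(float)               # random incidence matrix
H[H.sum(axis=1) == 0, 0] = 1.0                             # avoid isolated nodes
Theta = 0.1 * rng.normal(size=(d, d))
Z1, Z2 = hypergraph_conv(X1, H, Theta), hypergraph_conv(X2, H, Theta)
Z = attention_fuse([Z1, Z2], a=rng.normal(size=d))
print(Z.shape, infonce(Z1, Z2))

In a full framework, Theta and the attention vector would be learned jointly with the contrastive and classification objectives; the toy example above only demonstrates the data flow among the three components.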