Abstract: Mixture of experts (MoE) has recently emerged as an effective framework for deploying machine learning models in a scalable and efficient way by softly dividing complex tasks among multiple ...
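The "soft division" of work among experts mentioned in this abstract is usually realized with a learned gating network that produces a softmax weighting over expert outputs. A minimal NumPy sketch of that pattern follows; the shapes, names, and linear experts are illustrative assumptions, not details from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def moe_forward(x, gate_w, expert_ws):
    """Soft mixture of experts: every expert processes the input,
    and a gating softmax weights their outputs per example."""
    gates = softmax(x @ gate_w)                           # (batch, n_experts)
    outs = np.stack([x @ w for w in expert_ws], axis=1)   # (batch, n_experts, d_out)
    return (gates[..., None] * outs).sum(axis=1)          # (batch, d_out)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))                 # hypothetical batch of inputs
gate_w = rng.standard_normal((8, 3))            # gating projection
experts = [rng.standard_normal((8, 16)) for _ in range(3)]
y = moe_forward(x, gate_w, experts)
print(y.shape)  # (4, 16)
```

Because the gate weights sum to one, a single-expert mixture reduces to that expert alone, which makes the routing easy to sanity-check.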
DSFormer is a novel Dual Selective Fusion Transformer Network for HSI classification. It adaptively selects and fuses features from diverse receptive fields to achieve joint spatial-spectral context ...
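Adaptive selection and fusion of features from diverse receptive fields is commonly implemented with branch-wise softmax attention over the candidate feature maps (as in selective-kernel designs). The sketch below assumes that pattern; DSFormer's exact fusion module is not shown in this snippet, so all names and shapes here are hypothetical:

```python
import numpy as np

def softmax(x, axis=0):
    x = x - x.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def selective_fuse(branches, proj_w):
    """Fuse feature maps computed at different receptive fields.

    branches: list of (C, H, W) arrays, one per receptive field.
    proj_w:   (n_branches, C) projection producing branch logits.
    """
    stacked = np.stack(branches)                # (n_branches, C, H, W)
    pooled = stacked.sum(0).mean(axis=(1, 2))   # global channel descriptor (C,)
    weights = softmax(proj_w @ pooled)          # attention over branches, sums to 1
    return np.tensordot(weights, stacked, axes=1)  # weighted sum -> (C, H, W)
```

A useful property for testing: if all branches are identical, the fused output equals any single branch regardless of the learned weights, since the weights sum to one.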
Abstract: Transformer neural networks have emerged as the state-of-the-art in AI across text, audio, image, and video processing tasks. However, the attention mechanism that is core to Transformers ...
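The attention mechanism this abstract refers to is, in the standard Transformer, scaled dot-product attention: a pairwise score matrix between queries and keys, softmax-normalized and applied to the values. That all-pairs score matrix is the usual source of attention's cost concerns. A minimal NumPy sketch (a generic textbook form, not this paper's variant):

```python
import numpy as np

def attention(q, k, v):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.

    q: (n_q, d), k: (n_k, d), v: (n_k, d_v)  ->  (n_q, d_v)
    """
    d = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d)  # (n_q, n_k) pairwise scores
    scores = scores - scores.max(axis=-1, keepdims=True)  # stability
    w = np.exp(scores)
    w = w / w.sum(axis=-1, keepdims=True)         # each row sums to 1
    return w @ v                                  # weighted average of values
```

Note that `scores` has one entry per query-key pair, so for sequence length n the time and memory cost grows as O(n^2), which is what efficient-attention work typically targets.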