"Medformer: A Multi-Granularity Patching Transformer for Medical Time-Series Classification" is accepted in NeurIPS 2024
Authored by Yihe Wang, Nan Huang, Taida Li, Yujun Yan, and Xiang Zhang. The paper is available at https://neurips.cc/virtual/2024/poster/93940.
Medformer is a newly developed Transformer-based model for medical time-series classification that incorporates multi-granularity feature processing. It introduces the following key innovations:
- Cross-Channel Patching: Exploits inter-channel correlations to capture complex relationships in multi-channel data.
- Multi-Granularity Embedding: Extracts features at multiple temporal granularities to capture patterns across different time scales (see the sketch after this list).
- Two-Stage Multi-Granularity Self-Attention: Utilizes intra-granularity and inter-granularity self-attention mechanisms to learn features within each granularity and their relationships across granularities.
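To make the first two ideas concrete, below is a minimal, illustrative PyTorch sketch of cross-channel patching combined with multi-granularity embedding. It is not the authors' implementation; the class name, the `patch_lens` values, and `d_model` are assumptions chosen for illustration. Each patch spans all channels and a separate linear projection maps each granularity's patches into a shared embedding space.

```python
import torch
import torch.nn as nn


class MultiGranularityPatchEmbedding(nn.Module):
    """Illustrative sketch of cross-channel, multi-granularity patch embedding.

    Each patch covers `patch_len` consecutive time steps across *all* channels
    (cross-channel patching); one linear projection per granularity maps its
    patches into a shared d_model-dimensional token space.
    """

    def __init__(self, n_channels, patch_lens=(2, 4, 8), d_model=128):
        super().__init__()
        self.patch_lens = patch_lens
        self.projections = nn.ModuleList(
            [nn.Linear(n_channels * p, d_model) for p in patch_lens]
        )

    def forward(self, x):
        # x: (batch, seq_len, n_channels), e.g. a multi-channel EEG/ECG segment
        B, T, C = x.shape
        tokens_per_granularity = []
        for patch_len, proj in zip(self.patch_lens, self.projections):
            n_patches = T // patch_len
            # Drop trailing time steps that don't fill a whole patch, then
            # flatten each (patch_len x C) block into a single token.
            patches = x[:, : n_patches * patch_len, :].reshape(
                B, n_patches, patch_len * C
            )
            tokens_per_granularity.append(proj(patches))  # (B, n_patches, d_model)
        return tokens_per_granularity


if __name__ == "__main__":
    x = torch.randn(8, 128, 19)  # batch of 8, 128 time steps, 19 channels
    embedder = MultiGranularityPatchEmbedding(n_channels=19)
    for g, tok in zip(embedder.patch_lens, embedder(x)):
        print(f"patch length {g}: token tensor shape {tuple(tok.shape)}")
```

In the full model, the token lists produced per granularity would then be processed by the two-stage self-attention described above: attention within each granularity's tokens, followed by attention across granularities.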