📝 Publications

🩸 Diabetes Glucose Monitoring

CAI 2025

GluTANN: Transformer-Based Continuous Glucose Monitoring Model with ANN Attention
Sijie Xiong, Youhao Xu, Cheng Tang, Jianing Wang, Shuqing Liu, Atsushi Shimada

  • Abstract: In this work, we propose GluTANN, an innovative Transformer-based model in which specially designed ANNs act as the self-attention mechanism, while pairwise correlations are preserved by the encoder-decoder structure. Extensive experiments across five recognized datasets demonstrate that GluTANN is highly competitive at reducing uncertainty while preserving satisfactory accuracy, offering a feasible approach to effective glucose management and diabetes-related medical decisions.
  • Core Idea: Reduce the redundant tokens of Transformer-based models as much as possible (see the sketch after this list).
  • Domain: Time Series Forecasting, Diabetes, Glucose Monitoring
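
Below is a minimal, hypothetical sketch of the mechanism described in the abstract: a Transformer-style encoder block whose softmax self-attention is replaced by a small feed-forward ANN that mixes information across time steps. The module names, sizes, and exact mixing scheme are illustrative assumptions, not the paper's actual design.

```python
import torch
import torch.nn as nn

class ANNAttention(nn.Module):
    """Hypothetical ANN stand-in for softmax self-attention:
    a learned feed-forward mix over the sequence (token) dimension."""
    def __init__(self, seq_len: int, d_model: int):
        super().__init__()
        self.token_mix = nn.Sequential(          # mixes information across time steps
            nn.Linear(seq_len, seq_len),
            nn.GELU(),
            nn.Linear(seq_len, seq_len),
        )
        self.proj = nn.Linear(d_model, d_model)  # per-token output projection

    def forward(self, x):                        # x: (batch, seq_len, d_model)
        mixed = self.token_mix(x.transpose(1, 2)).transpose(1, 2)
        return self.proj(mixed)

class EncoderBlock(nn.Module):
    """Pre-norm Transformer block with the ANN module in place of attention."""
    def __init__(self, seq_len: int, d_model: int):
        super().__init__()
        self.attn = ANNAttention(seq_len, d_model)
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model),
                                 nn.GELU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norm1, self.norm2 = nn.LayerNorm(d_model), nn.LayerNorm(d_model)

    def forward(self, x):
        x = x + self.attn(self.norm1(x))         # residual around the ANN "attention"
        return x + self.ffn(self.norm2(x))       # residual around the feed-forward

# Toy usage on a CGM-like window: batch of 8, 96 past readings, 64 channels.
x = torch.randn(8, 96, 64)
print(EncoderBlock(seq_len=96, d_model=64)(x).shape)  # torch.Size([8, 96, 64])
```

Because the ANN mixes tokens with fixed learned weights rather than computing a full attention matrix per input, it avoids the quadratic attention cost, which is one plausible reading of "reducing redundant tokens."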

🚥🚦 Control & Time Series Forecasting

ASOC

Enhancing Nonlinear Dependencies of Mamba via Negative Feedback for Time Series Forecasting
Sijie Xiong, Cheng Tang, Yuanyuan Zhang, Haoling Xiong, Youhao Xu, Atsushi Shimada

  • Abstract: In this work, inspired by curvature from financial domains and negative feedback from control systems, we propose CME-Mamba. Its effectiveness, stability, and robustness are discussed. Extensive experiments demonstrate that CME-Mamba excels at uncovering complex patterns and predicting future states in various domains, especially improving performance in periodic and high-variate settings.
  • Core Idea: Leverage a negative feedback loop to enhance the non-linearity of TSF models (see the sketch after this list).
  • Domain: Time Series Forecasting, Control
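
Below is a minimal, hypothetical sketch of the negative-feedback idea named in the title: the block's output is scaled and subtracted from its input, control-system style, and the resulting error signal drives a second pass. A plain GRU stands in for Mamba (the actual Mamba layer is not reproduced here), and the learnable gain is an assumption.

```python
import torch
import torch.nn as nn

class NegativeFeedbackBlock(nn.Module):
    """Sequence block wrapped in a negative feedback loop (GRU as a Mamba stand-in)."""
    def __init__(self, d_model: int, gain: float = 0.5):
        super().__init__()
        self.seq_model = nn.GRU(d_model, d_model, batch_first=True)
        self.gain = nn.Parameter(torch.tensor(gain))   # learnable feedback gain
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x):                          # x: (batch, seq_len, d_model)
        y, _ = self.seq_model(x)                   # forward pass
        error = x - self.gain * y                  # negative feedback: subtract scaled output
        y2, _ = self.seq_model(self.norm(error))   # second pass driven by the error signal
        return x + y2                              # residual output

# Toy usage: batch of 4, 96 time steps, 32 channels.
x = torch.randn(4, 96, 32)
print(NegativeFeedbackBlock(d_model=32)(x).shape)  # torch.Size([4, 96, 32])
```

Passing the error signal, rather than the raw input, back through the non-linear sequence model is what, on this reading, deepens the non-linear dependencies the model can capture.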

🏫 Education

ICLEA 2025 (Best Poster)

Fine-tuned T5 Models on FairytaleQA Chinese Dataset
Sijie Xiong, Haoling Xiong, Tao Sun, Haiqiao Liu, Fumiya Okubo, Cheng Tang, Atsushi Shimada

  • Abstract: Question Answering (QA) is essential for comprehension learning, and FairytaleQA is widely employed in this domain. However, its availability in only a limited number of alphabetic languages restricts its application, and current translators exhibit five fatal errors. In our study, we manually translate FairytaleQA into Chinese and test its effectiveness via five fine-tuned T5 models.
  • Core Idea: Extend current educational QA datasets for pre-trained models (see the sketch after this list).
  • Domain: Question-Answering, Education, Pre-trained Models
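
Below is a minimal sketch of the setup described in the abstract: a T5-family model reads a story passage plus a question and generates the answer. The checkpoint (google/mt5-small, a multilingual T5 that covers Chinese) and the prompt format are illustrative assumptions; the paper's five fine-tuned models and their exact configuration are not reproduced here.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")

# Hypothetical toy passage/question in the style of a translated FairytaleQA item.
story = "从前，有一只小狐狸住在森林里。"   # "Once upon a time, a little fox lived in the forest."
question = "小狐狸住在哪里？"              # "Where did the little fox live?"

inputs = tokenizer(f"question: {question} context: {story}",
                   return_tensors="pt", truncation=True, max_length=512)

# Fine-tuning would minimize cross-entropy against gold answers; the untuned
# base checkpoint below only illustrates the input/output interface.
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```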

Others