Traditional statistical methods show limited accuracy and predictive power when analyzing large amounts of data, and they cannot capture non-linear relationships. Moreover, methods that analyze data from a single time point perform worse than those that use data from multiple time points, and this performance gap widens as the amount of data grows. Deep learning makes it possible to build a model that incorporates all the information contained in repeated measures. A recurrent neural network (RNN) can be built to develop a predictive model from repeated measures; however, RNNs suffer from long-term dependency and vanishing gradient problems. The long short-term memory (LSTM) method can solve the long-term dependency and vanishing gradient problems by assigning a fixed weight inside the cell state. Unlike traditional statistical methods, deep learning methods allow researchers to build non-linear models with high accuracy and predictive power using information from multiple time points. A remaining limitation of deep learning models is that they are difficult to interpret; recently, however, many methods have been developed to interpret them by weighting time points and variables with attention mechanisms, such as REverse Time AttentIoN (RETAIN). In the future, deep learning methods, alongside traditional statistical methods, will become essential tools for big data analysis.
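To make the LSTM mechanism concrete, the following is a minimal sketch of a single LSTM cell applied to a toy repeated-measures sequence. It is written in plain NumPy under assumed dimensions (3 variables measured at 5 time points, hidden size 4, random weights); the gate layout and variable names are illustrative, not a specific library's API. The key line is the additive cell-state update `c = f * c_prev + i * g`, which is what lets gradients flow across many time points.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM forward step.

    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias.
    Gate blocks are stacked in the order: input, forget, output, candidate.
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate: how much new information to write
    f = sigmoid(z[H:2*H])      # forget gate: how much old cell state to keep
    o = sigmoid(z[2*H:3*H])    # output gate: how much cell state to expose
    g = np.tanh(z[3*H:4*H])    # candidate cell update
    c = f * c_prev + i * g     # additive update mitigates vanishing gradients
    h = o * np.tanh(c)
    return h, c

# Toy repeated-measures data: 5 visits, 3 variables per visit (assumed sizes)
rng = np.random.default_rng(0)
D, H, T = 3, 4, 5
W = rng.normal(scale=0.1, size=(4 * H, D))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)

h = np.zeros(H)
c = np.zeros(H)
for t in range(T):
    x_t = rng.normal(size=D)       # measurements at visit t
    h, c = lstm_step(x_t, h, c, W, U, b)

# The final hidden state h summarizes all T time points and
# could feed a downstream prediction layer.
print(h.shape)
```

In a real analysis one would use a framework implementation (e.g., a stacked LSTM with a prediction head) rather than hand-rolled gates; the sketch only shows how the cell state carries information across all measurement occasions.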