Predicting Wavelet-Transformed Stock Prices Using a Vanishing Gradient Resilient Optimized Gated Recurrent Unit with a Time Lag

Authors: Luyandza Sindi Mamba; Antony Ngunyi; Lawrence Nderu

Affiliations: [1] Department of Mathematics, Institute for Basic Sciences, Technology and Innovation, The Pan African University, Nairobi, Kenya; [2] Department of Statistics and Actuarial Sciences, Dedan Kimathi University of Technology, Nyeri, Kenya; [3] Department of Computing, School of Computing and Information Technology, Jomo Kenyatta University of Agriculture and Technology, Nairobi, Kenya

Source: Journal of Data Analysis and Information Processing, 2023, Issue 1, pp. 49-68 (20 pages)

Abstract: The development of accurate prediction models continues to be highly beneficial in myriad disciplines. Deep learning models have performed well in stock price prediction, giving high accuracy. However, these models are largely affected by the vanishing gradient problem, which some activation functions exacerbate. This study proposes the Vanishing Gradient Resilient Optimized Gated Recurrent Unit (OGRU) model with a scaled mean Approximation Coefficient (AC) time lag, intended to counter slow convergence, vanishing gradients and large error metrics. The study employed the Rectified Linear Unit (ReLU), Hyperbolic Tangent (Tanh), Sigmoid and Exponential Linear Unit (ELU) activation functions. Real-life datasets, including the daily Apple and 5-minute Netflix closing stock prices, were used and decomposed using the Stationary Wavelet Transform (SWT). The decomposed series formed a decomposed data model, which was compared to an undecomposed data model with similar hyperparameters and different default lags. The Apple daily dataset performed well with a Default_1 lag, using the undecomposed data model and ReLU, attaining 0.01312, 0.00854 and 3.67 minutes for RMSE, MAE and runtime respectively. The Netflix data performed best with the MeanAC_42 lag, using the decomposed data model and ELU, achieving 0.00620, 0.00487 and 3.01 minutes for the same metrics.
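
As a rough sketch of the pipeline the abstract describes (not the authors' published code), the Python snippet below decomposes a scaled closing-price series with the Stationary Wavelet Transform via PyWavelets, builds lagged input windows, and fits a small Keras GRU using the ELU activation. The wavelet family, decomposition level, layer width and the lag of 42 are illustrative assumptions; the paper's OGRU variant and its MeanAC lag selection procedure are not reproduced here.

# Minimal illustrative sketch; wavelet, level, lag and layer sizes are assumed.
import numpy as np
import pywt
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import GRU, Dense

def swt_approximation(prices, wavelet="haar", level=1):
    # SWT requires a length divisible by 2**level, so trim the tail first.
    n = len(prices) - len(prices) % (2 ** level)
    coeffs = pywt.swt(prices[:n], wavelet, level=level)
    approx, _detail = coeffs[0]    # level-1 approximation coefficients
    return approx

def make_windows(series, lag):
    # Slice a 1-D series into (samples, lag, 1) inputs and next-step targets.
    X = np.stack([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X[..., np.newaxis], y

prices = np.random.rand(1024).astype("float32")      # stand-in for scaled closes
approx = swt_approximation(prices)                   # "decomposed data model" input
X, y = make_windows(approx, lag=42)                  # a MeanAC_42-style lag, assumed

model = Sequential([
    GRU(32, activation="elu", input_shape=(42, 1)),  # ELU to ease vanishing gradients
    Dense(1),                                        # next-step price estimate
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

A real experiment would split the series chronologically into train and test sets and inverse-scale the predictions before computing RMSE and MAE.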

Keywords: Optimized Gated Recurrent Unit; Approximation Coefficient; Stationary Wavelet Transform; Activation Function; Time Lag

Classification: G63 [Culture and Science: Education]
