American Journal of Applied Mathematics and Statistics. 2024, 12(1), 15-23
DOI: 10.12691/AJAMS-12-1-3
Original Research

Optimized Investment Strategy Based on Long Short-Term Memory Networks (LSTMs)

Qingyun Wang1 and Yayuan Xiao2

1College of Mathematics and Computer Science, Gannan Normal University, Ganzhou, China, 341000

2Department of Mathematical Sciences, Ball State University, Muncie, IN, USA, 47396

Pub. Date: February 22, 2024

Cite this paper

Qingyun Wang and Yayuan Xiao. Optimized Investment Strategy Based on Long Short-Term Memory Networks (LSTMs). American Journal of Applied Mathematics and Statistics. 2024; 12(1):15-23. doi: 10.12691/AJAMS-12-1-3

Abstract

In recent decades, Long Short-Term Memory networks (LSTMs), an enhanced variant of Recurrent Neural Networks (RNNs), have made significant contributions across various domains. In the study of time series data in particular, they have shown strong capability for capturing temporal dependencies and patterns. This paper examines the application of LSTMs to market forecasting, using historical price data to construct predictive models and to optimize investment allocations for improved portfolio performance. The investigation includes a detailed examination of hyperparameters tailored to LSTM models for the Invesco QQQ Trust (QQQ), SPDR Gold Trust (GLD), and Bitcoin (BTC), which are then employed for price prediction and the development of high-return trading strategies. This is followed by an analysis of portfolio holdings, return rates, and risk improvements for each investment asset over the test set under the proposed trading strategy.
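The paper's implementation is not included on this page; the sketch below only illustrates, in broad strokes, the kind of LSTM price-forecasting pipeline the abstract describes. It is a minimal example under assumed settings: the 60-day lookback window, the 50-unit LSTM layer, the dropout rate, the max-scaling, and the use of yfinance for daily closes are illustrative assumptions, not the authors' actual hyperparameters or data pipeline.

```python
import numpy as np
import yfinance as yf  # stand-in data source; the paper's data pipeline is not specified here
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout

LOOKBACK = 60  # assumed 60-day lookback window

def make_windows(prices, lookback=LOOKBACK):
    """Turn a 1-D price series into (samples, lookback, 1) inputs and next-day targets."""
    X, y = [], []
    for i in range(lookback, len(prices)):
        X.append(prices[i - lookback:i])
        y.append(prices[i])
    return np.array(X)[..., None], np.array(y)

# Daily closes for QQQ; GLD and BTC-USD would be handled the same way.
closes = yf.download("QQQ", start="2015-01-01")["Close"].to_numpy().ravel()
scale = closes.max()
X, y = make_windows(closes / scale)          # simple max-scaling, for illustration only

split = int(0.8 * len(X))                    # chronological train/test split
X_train, y_train, X_test, y_test = X[:split], y[:split], X[split:], y[split:]

model = Sequential([
    LSTM(50, input_shape=(LOOKBACK, 1)),     # 50 units is an assumed hyperparameter
    Dropout(0.2),                            # dropout to limit overfitting
    Dense(1),                                # next-day (scaled) price
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=20, batch_size=32, verbose=0)

pred = model.predict(X_test).ravel() * scale                 # back to price units
rmse = np.sqrt(np.mean((pred - y_test * scale) ** 2))        # simple out-of-sample error check
```

In such a setup, the GLD and BTC series would be windowed and modeled in the same way, with each model's predicted next-day prices feeding whatever allocation rule the trading strategy applies.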

Keywords

RNN, LSTM, QQQ, GLD, BTC

Copyright

This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
