This project implements a Long Short-Term Memory (LSTM) neural network for stock market sequence prediction. LSTMs, introduced by Hochreiter and Schmidhuber in 1997, were among the first architectures to effectively address the long-term dependency (vanishing gradient) problem in sequential data.
Before the Transformer architecture and modern Large Language Models (LLMs) such as GPT and BERT, LSTMs were the state of the art in sequence modeling. Their "gate" mechanism showed that neural networks could selectively retain important information over long sequences, an idea that influenced later sequence models.
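The gate mechanism described above can be sketched as a single LSTM time step in plain NumPy. This is an illustrative toy, not the project's actual implementation; the stacked-gate weight layout and shapes are assumptions for clarity:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step for a single sample.

    x: input vector (n_in,); h_prev, c_prev: previous hidden/cell states (n_hid,).
    W (4*n_hid, n_in), U (4*n_hid, n_hid), b (4*n_hid,) stack the parameters for
    the forget, input, and output gates plus the candidate update.
    """
    n_hid = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    f = sigmoid(z[0 * n_hid:1 * n_hid])  # forget gate: what to drop from memory
    i = sigmoid(z[1 * n_hid:2 * n_hid])  # input gate: what new info to store
    o = sigmoid(z[2 * n_hid:3 * n_hid])  # output gate: what to expose as output
    g = np.tanh(z[3 * n_hid:4 * n_hid])  # candidate cell update
    c = f * c_prev + i * g               # new cell state ("long-term memory")
    h = o * np.tanh(c)                   # new hidden state
    return h, c

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
h, c = np.zeros(n_hid), np.zeros(n_hid)
W = rng.normal(size=(4 * n_hid, n_in))
U = rng.normal(size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = lstm_step(rng.normal(size=n_in), h, c, W, U, b)
```

The cell state `c` is the "memory" that the forget and input gates edit at each step, which is what lets information persist across long sequences.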
- Sequential stock market data processing
- Multiple time-step prediction
- Memory-based pattern recognition
- Customizable prediction windows
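The windowing idea behind the features above can be sketched as a small helper that slices a price series into (input window, future target) pairs. The function name `make_windows` and its parameters are illustrative assumptions, not the project's API:

```python
import numpy as np

def make_windows(series, window, horizon=1):
    """Slice a 1-D series into (past window, future target) training pairs.

    window: number of past time steps fed to the LSTM.
    horizon: how many steps ahead the target lies (the prediction window).
    """
    X, y = [], []
    for start in range(len(series) - window - horizon + 1):
        X.append(series[start:start + window])          # model input
        y.append(series[start + window + horizon - 1])  # value to predict
    return np.array(X), np.array(y)

prices = np.arange(10, dtype=float)  # toy stand-in for closing prices
X, y = make_windows(prices, window=3, horizon=2)
# X[0] = [0, 1, 2] is paired with y[0] = 4.0, two steps after the window
```

Varying `window` and `horizon` is what makes the prediction windows customizable.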
```text
numpy>=1.19.2
pandas>=1.2.3
tensorflow>=2.4.1
scikit-learn>=0.24.1
matplotlib>=3.3.4
```

Keras is bundled with TensorFlow 2.x as `tf.keras`, so a separate `keras` entry is not required.
This example uses historical Google (GOOG) stock price data.
This project is licensed under the MIT License - see the LICENSE file for details.
Note: This project is for educational purposes only. Stock market prediction is inherently risky, and past performance does not guarantee future results.