# LSTM Stock Market Sequence Prediction

*The Ancestors of Modern Language Models*

## Overview

This project implements a Long Short-Term Memory (LSTM) neural network for stock market sequence prediction. LSTMs, predecessors of modern Large Language Models (LLMs), were among the first architectures to effectively mitigate the long-term dependency problem in sequential data.

## Historical Significance

Before the Transformer architecture and modern LLMs such as GPT and BERT, LSTMs were the state of the art in sequence modeling. Their gating mechanism inspired many concepts used in today's language models and demonstrated that neural networks could "remember" important information over long sequences.

## Features

- Sequential stock market data processing
- Multiple time-step prediction
- Memory-based pattern recognition
- Customizable prediction windows
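The "customizable prediction windows" and "multiple time-step prediction" features boil down to slicing the price series into overlapping (input window, future targets) pairs before it ever reaches the LSTM. A minimal sketch of that preprocessing step, using a hypothetical `make_windows` helper name (not from this repository):

```python
import numpy as np

def make_windows(series, window, horizon=1):
    """Slice a 1-D series into (input window, future targets) pairs.

    Each sample's input is `window` consecutive values; its target is
    the next `horizon` values, so horizon > 1 gives multi-step prediction.
    """
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i:i + window])
        y.append(series[i + window:i + window + horizon])
    return np.array(X), np.array(y)

prices = np.arange(10, dtype=float)  # stand-in for a closing-price column
X, y = make_windows(prices, window=4, horizon=2)
print(X.shape, y.shape)  # (5, 4) (5, 2)
```

Changing `window` adjusts how much history the model sees per sample; changing `horizon` switches between single- and multi-step prediction.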

## Requirements

```text
numpy>=1.19.2
pandas>=1.2.3
tensorflow>=2.4.1
scikit-learn>=0.24.1
matplotlib>=3.3.4
```

Keras ships inside TensorFlow 2.x as `tf.keras`, so a separate `keras` pin is not required.

## Details

This example is built on Google (GOOG) historical price data.
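A stacked-LSTM regressor of the kind typically used on price windows like these can be sketched as follows; the layer sizes and the `window`/`horizon` values are illustrative assumptions, not the repository's actual configuration:

```python
import numpy as np
import tensorflow as tf

window, horizon, n_features = 60, 1, 1  # hypothetical choices

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, n_features)),
    tf.keras.layers.LSTM(50, return_sequences=True),  # pass full sequence to next LSTM
    tf.keras.layers.LSTM(50),                         # keep only the final hidden state
    tf.keras.layers.Dense(horizon),                   # one output per predicted step
])
model.compile(optimizer="adam", loss="mse")

# One untrained forward pass just to confirm the shapes line up.
dummy = np.zeros((2, window, n_features), dtype="float32")
print(model(dummy).shape)  # (2, 1)
```

Training would call `model.fit(X, y, ...)` on windows built from the scaled price series; prices are usually normalized (e.g. with scikit-learn's `MinMaxScaler`) before windowing.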

## License

This project is licensed under the MIT License - see the LICENSE file for details.

**Note:** This project is for educational purposes only. Stock market prediction is inherently risky, and past performance does not guarantee future results.