Stock closing price prediction: Long Short-Term Memory (LSTM) models
In the world of finance, accurate stock price predictions can be a game changer, and machine learning models, particularly Long Short-Term Memory (LSTM) networks, are showing promising results. LSTM models excel at capturing long-term dependencies in sequential data, which is particularly valuable for modeling trends in stock markets. In this article, we'll explore how to implement an LSTM model to forecast closing prices, along with a performance comparison to other models.
You can find the source code and notebook for this article on my GitHub repository [here](https://github.com/karelbecerra/ai-ml-dl-samples/blob/main/stocks/ltms-long-term-memory.ipynb).
Step 1: Cloning the Repository
To begin, clone the repository to access the required files and code. In your terminal, run:
> git clone https://github.com/karelbecerra/ai-ml-dl-samples.git
> cd ai-ml-dl-samples/stocks
Open the ltms-long-term-memory.ipynb Jupyter notebook to follow along with the steps and code samples.
Step 2: Importing Libraries
The code requires several Python libraries. To install them, execute:
> pip install numpy pandas tensorflow scikit-learn
In the notebook, the required libraries are imported as follows:
import numpy as np
import pandas as pd
import tensorflow as tf
from sklearn.metrics import mean_squared_error, mean_absolute_error
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout
Step 3: Preparing the Data
Using historical stock price data, we preprocess it by normalizing the values and creating sliding-window sequences for LSTM input. This step is crucial, as LSTM models expect their input shaped as sequences of past observations.
def prepare_data(df, target_column, time_steps=60):
    # Build sliding windows: each sample is `time_steps` consecutive values,
    # and the label is the value that immediately follows the window
    data = df[target_column].values
    sequence_data = []
    labels = []
    for i in range(len(data) - time_steps):
        sequence_data.append(data[i:i + time_steps])
        labels.append(data[i + time_steps])
    return np.array(sequence_data), np.array(labels)
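The snippets here assume your historical prices are already loaded into a pandas DataFrame df with a 'Close' column. A minimal sketch of loading them from a CSV file (the file name and column names below are assumptions, not taken from the notebook):

# Hypothetical CSV export with Date and Close columns; adjust to your data source
df = pd.read_csv('stock_prices.csv', parse_dates=['Date'], index_col='Date')
df = df.sort_index()  # keep the rows in chronological order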
After loading your stock data into a DataFrame df, specify your target column (e.g., 'Close') and call prepare_data():
X, y = prepare_data(df, target_column='Close')
X = np.reshape(X, (X.shape[0], X.shape[1], 1))  # reshape to (samples, time_steps, features) for LSTM input
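The preprocessing step above mentions normalization, which the snippet itself doesn't show. One common option, sketched here under the assumption that df holds raw prices, is to scale the 'Close' column with scikit-learn's MinMaxScaler before building the sequences; remember to invert the scaling on any predictions you want back in price units.

from sklearn.preprocessing import MinMaxScaler

scaler = MinMaxScaler(feature_range=(0, 1))
df['Close'] = scaler.fit_transform(df[['Close']])  # scale prices into [0, 1]
# Later: scaler.inverse_transform(predictions.reshape(-1, 1)) maps results back to price units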
Step 4: Building the LSTM Model
An LSTM model can be built with TensorFlow's Keras API. The following code defines a model with two LSTM layers, a dropout layer to reduce overfitting, and a dense output layer.
model = Sequential([
    LSTM(50, return_sequences=True, input_shape=(X.shape[1], 1)),
    Dropout(0.2),
    LSTM(50),
    Dense(1)
])
model.compile(optimizer='adam', loss='mean_squared_error')
model.summary()
Train the model on your prepared data:
history = model.fit(X, y, epochs=50, batch_size=32, validation_split=0.2)  # validation_split holds out the last 20% of the sequences
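The returned history object records the training and validation loss per epoch. A quick, optional sketch of plotting it (matplotlib is not part of the install step above, so add it with pip install matplotlib if you want this):

import matplotlib.pyplot as plt

plt.plot(history.history['loss'], label='training loss')
plt.plot(history.history['val_loss'], label='validation loss')
plt.xlabel('epoch')
plt.ylabel('MSE loss')
plt.legend()
plt.show()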
Step 5: Comparing the LSTM with Other Models
To evaluate the LSTM model's performance, we compare it with simpler machine learning models, Linear Regression and Decision Trees, trained on the same windowed data.
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

# Split the same sequence data used for the LSTM; shuffle=False preserves chronological
# order, so the test set is the final 20% of the series, the stretch Keras kept out of training
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, shuffle=False)

# scikit-learn models expect 2D inputs, so flatten each window into a feature vector
X_train_flat = X_train.reshape(X_train.shape[0], -1)
X_test_flat = X_test.reshape(X_test.shape[0], -1)

# Linear Regression model
linear_model = LinearRegression()
linear_model.fit(X_train_flat, y_train)
linear_pred = linear_model.predict(X_test_flat)

# Decision Tree model
tree_model = DecisionTreeRegressor()
tree_model.fit(X_train_flat, y_train)
tree_pred = tree_model.predict(X_test_flat)
Step 6: Accuracy and Performance Comparison
Using Mean Squared Error (MSE) and Mean Absolute Error (MAE) as evaluation metrics, we compare the models:
# LSTM model performance (X_test keeps the 3D sequence shape the network expects)
lstm_pred = model.predict(X_test).flatten()
lstm_mse = mean_squared_error(y_test, lstm_pred)
lstm_mae = mean_absolute_error(y_test, lstm_pred)
# Linear Regression performance
linear_mse = mean_squared_error(y_test, linear_pred)
linear_mae = mean_absolute_error(y_test, linear_pred)
# Decision Tree performance
tree_mse = mean_squared_error(y_test, tree_pred)
tree_mae = mean_absolute_error(y_test, tree_pred)
print(f"LTMS - MSE: {ltms_mse}, MAE: {ltms_mae}")
print(f"Linear Regression - MSE: {linear_mse}, MAE: {linear_mae}")
print(f"Decision Tree - MSE: {tree_mse}, MAE: {tree_mae}")
Conclusion
The LSTM model tends to outperform traditional machine learning models like Linear Regression and Decision Trees in terms of accuracy, as reflected in the lower MSE and MAE values. While LSTM models may require more computational resources, they can capture temporal dependencies in stock price data more effectively, leading to more accurate predictions.
For further experimentation, try tweaking the LSTM hyperparameters (e.g., adding more LSTM layers or adjusting the dropout rate) or training on a larger dataset to observe how the accuracy changes.
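As a hedged starting point for that experimentation, here is one possible deeper variant; the layer sizes and dropout rate below are arbitrary illustrative choices, not values from the notebook.

deeper_model = Sequential([
    LSTM(64, return_sequences=True, input_shape=(X.shape[1], 1)),
    Dropout(0.3),
    LSTM(64, return_sequences=True),
    Dropout(0.3),
    LSTM(32),
    Dense(1)
])
deeper_model.compile(optimizer='adam', loss='mean_squared_error')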
Feel free to explore the full notebook [here](https://github.com/karelbecerra/ai-ml-dl-samples/blob/main/stocks/ltms-long-term-memory.ipynb), and reach out with any questions or insights on LSTM models in financial forecasting!