Time-Series

Forecasting Rideshare Demand with Transformers

Applying Transformers to forecast daily rail ridership with Keras. The approach is based on the Transformer architecture, whose attention mechanism lets the model learn long-range dependencies in the data, making it well suited to time-series tasks and potentially able to outperform RNNs. The Transformer encoder consists of stacked encoder layers, each containing self-attention and a feed-forward network. Self-attention lets the model attend to different parts of the input sequence, capturing dependencies and patterns across time. The input sequence for the encoder includes the historical ridership data along with additional features, each data point represented as a vector of those feature values. Positional encoding is added to the input sequence to give the model information about the temporal order of the data.
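The two ingredients described above can be sketched in a few lines of NumPy: sinusoidal positional encoding added to the feature vectors, followed by one scaled dot-product self-attention step over the time axis. This is a minimal illustration, not the project's actual Keras model; the sequence length, feature count, and random input are placeholder assumptions, and learned Q/K/V projections are omitted for brevity.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Sinusoidal positional encoding: even dimensions use sin,
    # odd dimensions use cos of position-dependent angles.
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(d_model)[None, :]            # (1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])
    pe[:, 1::2] = np.cos(angle[:, 1::2])
    return pe

def self_attention(x):
    # Scaled dot-product self-attention over the time axis.
    # For brevity Q = K = V = x (no learned projections).
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)              # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ x

# Hypothetical input: 30 days of history, each day an 8-feature vector
# (ridership plus calendar/weather-style covariates).
seq_len, d_model = 30, 8
x = np.random.default_rng(0).normal(size=(seq_len, d_model))
out = self_attention(x + positional_encoding(seq_len, d_model))
print(out.shape)  # (30, 8): one attended vector per time step
```

Each output row is a weighted mix of every input time step, which is how the encoder captures dependencies across the whole history rather than only adjacent days.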

Gibran Hasan

Stock Price Prediction with RNNs

Trained and tested multiple TensorFlow time-series models on the S&P 500’s historical closing prices. The data, obtained from Yahoo Finance, covers five years of daily closes. Because prices during the test period were higher than those in the training period, I applied MinMaxScaler to the data and then sliced it into sequences with a 5-day window and a 1-day step.
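The preprocessing step can be sketched as follows. This is an illustrative NumPy version, not the project's actual code: the synthetic price series stands in for the Yahoo Finance data, the min-max scaling is done by hand rather than with sklearn's MinMaxScaler, and the 45-day train split is an assumed cutoff.

```python
import numpy as np

# Synthetic stand-in for five years of daily closing prices.
prices = np.linspace(100.0, 150.0, 60) + np.random.default_rng(1).normal(0, 2, 60)

# Min-max scale to roughly [0, 1], fitting only on the training portion
# so the (higher) test-period prices do not leak into the scaling.
train = prices[:45]
lo, hi = train.min(), train.max()
scaled = (prices - lo) / (hi - lo)

def make_windows(series, window=5, step=1):
    # Slide a fixed-length window over the series: each run of
    # `window` past values predicts the value one step ahead.
    X, y = [], []
    for start in range(0, len(series) - window, step):
        X.append(series[start:start + window])
        y.append(series[start + window])
    return np.array(X), np.array(y)

X, y = make_windows(scaled, window=5, step=1)
print(X.shape, y.shape)  # (55, 5) (55,)
```

With a step of 1, consecutive windows overlap by four days, so every day (after the first five) appears as a prediction target exactly once.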

Gibran Hasan

Made with REPL Notes