Recurrent Deep Architectures for Modeling Time Series Data

Deep learning is currently the dominant machine learning technique as a result of state-of-the-art performance in vision (Russakovsky et al., 2015), speech (Amodei et al., 2015), and natural language processing (Vinyals et al., 2015). The improvement in performance of these models is attributed to the availability of large datasets for training as well as software and hardware advances that accelerate the training process. Recurrent Neural Networks (RNNs) are among the most powerful and popular frameworks for modeling sequential data such as speech and text. We propose to create an open-source implementation of scalable, “industrial-strength” RNN models. These models can then be fine-tuned and trained to perform specific prediction tasks on time series datasets from the finance and insurance sectors.
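The proposal does not specify a particular architecture, so as a rough illustration of the technique, here is a minimal sketch of an Elman-style RNN forward pass over a toy one-dimensional time series (all weight shapes and the sine-wave input are assumptions for the example; the network is untrained):

```python
import numpy as np

def rnn_forward(xs, Wxh, Whh, Why, bh, by):
    """Run a simple Elman RNN over a 1-D input sequence.

    xs: array of shape (T,) -- one scalar observation per time step.
    Returns hidden states of shape (T, H) and one-step-ahead
    predictions of shape (T,).
    """
    H = Whh.shape[0]
    h = np.zeros(H)
    hs, ys = [], []
    for x in xs:
        # The hidden state mixes the current input with the previous state.
        h = np.tanh(Wxh * x + Whh @ h + bh)
        hs.append(h)
        # A linear readout predicts the next value of the series.
        ys.append(Why @ h + by)
    return np.stack(hs), np.array(ys).ravel()

# Hypothetical small network: 8 hidden units, random weights.
rng = np.random.default_rng(0)
H = 8
Wxh = rng.normal(scale=0.1, size=H)
Whh = rng.normal(scale=0.1, size=(H, H))
Why = rng.normal(scale=0.1, size=(1, H))
bh = np.zeros(H)
by = np.zeros(1)

# Toy "time series": a sampled sine wave standing in for financial data.
series = np.sin(np.linspace(0, 4 * np.pi, 50))
hs, preds = rnn_forward(series, Wxh, Whh, Why, bh, by)
print(hs.shape, preds.shape)  # (50, 8) (50,)
```

In practice the weights would be learned by backpropagation through time, and a production system of the kind proposed would use an optimized library implementation rather than a loop like this.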

Faculty Supervisor: Graham Taylor

Nikhil Sapru

RBC Financial Group

Information and communications technologies

University of Guelph
