Unsupervised language modeling with tensor networks

Modern machine learning is powered by deep neural networks composed of many interconnected layers of artificial neurons, whose tunable connections learn from data to solve important problems. While this approach has achieved remarkable success in many domains, in practice neural networks act as "black boxes" whose high-level insights are hard to access. Our project will study the use of a promising new family of tensor network models, originally developed for learning high-level structure in complex quantum systems, for capturing the structure of natural languages like French or English. We believe the convenient mathematical properties of tensor networks can make our language models valuable sources of insight into the deeper structure of language, usable and modifiable in ways that neural networks cannot reproduce.
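To make the idea concrete, here is a minimal, hypothetical sketch of the kind of tensor network model the abstract alludes to: a matrix product state (MPS) that scores a token sequence by contracting one small matrix per token between two boundary vectors. All names, dimensions, and the random initialization are illustrative assumptions, not the project's actual model.

```python
# Hypothetical minimal matrix-product-state (MPS) sequence model.
# Each token selects one core matrix; the sequence's unnormalized score
# is the boundary-vector contraction of the product of those matrices.
import numpy as np

rng = np.random.default_rng(0)
vocab_size, bond_dim = 5, 4  # illustrative sizes, not from the source

# One bond_dim x bond_dim core matrix per vocabulary symbol.
cores = rng.normal(size=(vocab_size, bond_dim, bond_dim)) / np.sqrt(bond_dim)
left = rng.normal(size=bond_dim)   # left boundary vector
right = rng.normal(size=bond_dim)  # right boundary vector

def score(sequence):
    """Unnormalized real-valued score of a token sequence under the MPS."""
    v = left
    for token in sequence:
        v = v @ cores[token]  # contract the next core along the bond index
    return float(v @ right)

# A Born-rule-style non-negative score, as used in quantum-inspired models.
p_unnorm = score([0, 3, 1, 2]) ** 2
```

Because the whole model is a chain of linear maps, quantities like marginals and conditionals reduce to tensor contractions, which is the kind of mathematical transparency the project hopes to exploit.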

Intern: 
Jacob Miller
Academic supervisor: 
Guillaume Rabusseau
Province: 
Quebec