Related projects
Discover more projects across a range of sectors and disciplines — from AI to cleantech to social innovation.
Deep neural networks (DNNs) are a class of machine learning algorithms inspired by biological neural networks. Learning with deep neural networks has enjoyed huge empirical success in recent years across a wide variety of tasks. Lately, many researchers in the machine learning community have become interested in the generalization mystery: why do overparameterized DNNs perform well on previously unseen data, even though they have far more parameters than training samples? The information-theoretic approach to studying generalization is one of the frameworks for answering this question. Although this approach has proven its applicability for several machine learning methods, it suffers from shortcomings that have hindered progress toward understanding generalization in DNNs. In this project, we aim to improve information-theoretic methods for generalization, which may lead to a promising answer to the question of why DNNs generalize well in practice.
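As one concrete illustration of the information-theoretic framework mentioned above (the summary itself does not spell out a bound; this is the well-known Xu–Raginsky mutual-information bound, given here only as background), suppose a learning algorithm maps a training sample S of n i.i.d. data points to a hypothesis W, and the loss function is sigma-sub-Gaussian. Then the expected generalization gap satisfies

\[
\bigl| \mathbb{E}\left[ L_{\mu}(W) - L_{S}(W) \right] \bigr| \;\le\; \sqrt{\frac{2\sigma^{2}}{n}\, I(W; S)},
\]

where \(L_{\mu}(W)\) is the population risk, \(L_{S}(W)\) is the empirical risk on the training sample, and \(I(W; S)\) is the mutual information between the learned hypothesis and the training data. For heavily overparameterized DNNs this mutual-information term can be very large or even infinite, making the bound vacuous — one example of the kind of shortcoming such a project would aim to address.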
Ashish Khisti
Mahdi Haghifam
Element AI
Engineering - computer / electrical
Professional, scientific and technical services
University of Toronto
Accelerate