Related projects
Discover more projects across a range of sectors and disciplines — from AI to cleantech to social innovation.
The pre-trained Bidirectional Encoder Representations from Transformers (BERT) model has proven to be a milestone in Natural Language Processing (NLP), achieving new state-of-the-art performance on many tasks in the field. Despite its success, there is still considerable room for improvement, both in terms of training efficiency and structural design. The proposed research project will explore the detailed design decisions of BERT at many levels and optimize them wherever possible. The expected result is an improved language model that achieves higher performance on NLP tasks while using fewer computational resources.
Faculty supervisor: Jimmy Ba
Intern: Xiaoshi Huang
Partner organization: Layer 6 AI
Discipline: Computer science
Program: Accelerate
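As context for the pre-training and fine-tuning workflow described in the abstract above, the sketch below fine-tunes a pre-trained BERT encoder on a toy sentence-classification task. It is a minimal illustration only, assuming the Hugging Face transformers library and PyTorch; the project itself does not specify an implementation, and the model name, example data, and hyperparameters are placeholders.

# Minimal sketch: fine-tuning a pre-trained BERT encoder on a toy
# sentence-classification task. Assumes the Hugging Face `transformers`
# library; model name, data, and hyperparameters are illustrative only.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Tiny hypothetical dataset with binary labels.
texts = ["The model converged quickly.", "Training was unstable and slow."]
labels = torch.tensor([1, 0])

# Tokenize into padded tensors in the format the BERT encoder expects.
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# Standard fine-tuning loop: small learning rate, a few epochs.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):
    optimizer.zero_grad()
    outputs = model(**inputs, labels=labels)  # returns loss and logits
    outputs.loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss = {outputs.loss.item():.4f}")

In practice the same pre-trained encoder is reused across tasks, which is why improvements to BERT's training efficiency and structural design, as targeted by this project, can reduce computational cost for many downstream applications at once.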