News
Mark Schmidt, Amii Canada CIFAR AI chair and Associate Professor of Computing Science at the University of British Columbia, has been awarded an Arthur B. McDonald Fellowship.
The fellowship, granted by the Natural Sciences and Engineering Research Council of Canada (NSERC), offers two-year grants to support early-career researchers "so that they can become leaders in their field and inspire others."
The $250,000 fellowship includes funding to relieve the recipients from teaching and administrative duties.
"It's amazing. It's not so easy to get teaching relief, so having time to focus on research is really special," Schmidt says.
Much of that research has focused on optimization in machine learning, computer vision and other applications to improve the speed and efficiency of machine learning models. With the support offered by the McDonald grant, Schmidt says he and his students plan to "double down" on that work.
"We're trying to develop methods that let you train machine learning models a lot cheaper, in terms of both time and money."
Schmidt's proposal to NSERC had a significant focus on optimizing hyperparameters in machine learning training. Hyperparameters are the variables that control how an ML model actually learns; they are typically set before training begins, and finding good values involves trial and error. Researchers spend a lot of time training, tweaking hyperparameters, and then training again, often many times over, before they find a set of hyperparameters they are happy with.
Schmidt says this process of constantly having to start from scratch wastes a lot of time, computing power and money. Some of the recent work Schmidt and his students have done involves one particular hyperparameter, the learning rate. This controls the pace at which a machine-learning model updates its parameters as it is being trained.
Their work has shown that instead of restarting the training process over and over with different learning rates, it can be possible to adjust the learning rate during training and still end up with results similar to those achieved when the ideal rate is set from the beginning. A paper detailing their work is set to be presented at this year's Neural Information Processing Systems conference in December.
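The idea can be illustrated with a toy sketch (this is an illustrative example, not the method from the paper): the learning rate is read from a schedule at each step, so it can change mid-training rather than being fixed once and requiring a restart. The function `train` and both schedules below are hypothetical.

```python
def train(schedule, steps=200):
    """Minimize the toy objective f(w) = (w - 3)^2 with gradient
    descent, taking the learning rate for each step from
    schedule(step) instead of fixing it once up front."""
    w = 0.0
    for t in range(steps):
        grad = 2 * (w - 3)        # gradient of (w - 3)^2
        w -= schedule(t) * grad   # step size may change during training
    return w

# Fixed learning rate: the traditional "pick one, restart if it
# was wrong" approach.
w_fixed = train(lambda t: 0.1)

# Learning rate adjusted during training: start larger, decay as
# training proceeds -- no restarts needed.
w_adaptive = train(lambda t: 0.5 / (1 + t))

# Both runs approach the same minimizer, w = 3.
```

On this toy problem both schedules land near the optimum; the point of the research is to get that kind of robustness on real models, where each restart with a new fixed rate is expensive.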
Making more efficient, optimized models is an important step in advancing artificial intelligence. It allows the use of larger datasets to solve more complex problems than would otherwise be possible. Creating more efficient models is also vital to real-world applications of AI, and to making AI more accessible.
"It's a matter of: what did it used to take to train a model? If you had to do it in a [computer] cluster, maybe now you can do it on a workstation. What you used to do on a workstation, you can maybe do on a laptop. And what you did on a laptop, you can maybe do on your phone. And what used to be impossible, well, maybe now you can solve that if you have better algorithms."
Nov 7th 2024
News
Amii partners with pipikwan pêhtâkwan and its startup company wâsikan kisewâtisiwin to harness AI in efforts to challenge misinformation about Indigenous People and to include Indigenous People in the development of AI. The project is supported by the PrairiesCan commitment to accelerate AI adoption among SMEs in the Prairie region.
Nov 7th 2024
News
Amii Fellow and Canada CIFAR AI Chair Russ Greiner and University of Alberta researcher and collaborator David Wishart were awarded the Brockhouse Canada Prize for Interdisciplinary Research in Science and Engineering from the Natural Sciences and Engineering Research Council of Canada (NSERC).
Nov 6th 2024
News
Amii founding member Jonathan Schaeffer has spent 40 years making huge impacts in game theory and AI. Now he’s retiring from academia and sharing some of the insights he’s gained over his impressive career.