Abstract
Meta-learning is a powerful learning paradigm in which solving a new task can benefit from similar tasks for faster adaptation (few-shot learning). Stochastic gradient descent (SGD) based meta-learning has emerged as an attractive solution for few-shot learning. However, this approach suffers from significant computational complexity due to its double-loop and matrix-inversion operations, which incur substantial uncertainty and poor generalization. To achieve lower complexity and better generalization, in this paper we propose MetaBayes, a novel framework that views the original meta-learning problem from a Bayesian perspective, where the meta-model is cast as the prior distribution and the task-specific models are viewed as task-specific posterior distributions. The objective amounts to jointly optimizing the prior and the posterior distributions. With this formulation, we obtain closed-form expressions to update the distributions at every iteration, avoiding the high computational cost of SGD-based meta-learning and producing a more robust, better-generalizing meta-model. Our simulations show that tasks with few training samples achieve higher accuracy when the MetaBayes prior distribution is used as an initializer compared to the commonly used Gaussian prior.
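To make the abstract's core idea concrete (a shared prior as the meta-model, per-task posteriors, and closed-form updates alternating between the two), the sketch below illustrates one way such a scheme can look. It is a minimal illustration and not the paper's algorithm: it assumes Gaussian priors and posteriors over Bayesian linear-regression weights, so the per-task posterior is available in closed form, and it refits the prior by moment-matching the task posteriors. All names and parameters (`task_posterior`, `metabayes_prior`, `sigma2`, `tau2`, `n_rounds`) are hypothetical.

```python
# Minimal sketch of a MetaBayes-style scheme, NOT the paper's exact algorithm.
# Assumption: the meta-model is a Gaussian prior N(mu, tau2 * I) over
# linear-regression weights; each task posterior has a closed-form conjugate
# update, and the prior is refit by moment-matching the task posteriors.
import numpy as np

def task_posterior(X, y, mu, tau2, sigma2):
    """Closed-form Gaussian posterior for one task (Bayesian linear regression)."""
    d = X.shape[1]
    precision = X.T @ X / sigma2 + np.eye(d) / tau2   # posterior precision
    cov = np.linalg.inv(precision)
    mean = cov @ (X.T @ y / sigma2 + mu / tau2)
    return mean, cov

def metabayes_prior(tasks, d, sigma2=0.1, n_rounds=20):
    """Alternate between closed-form task posteriors and a prior update."""
    mu, tau2 = np.zeros(d), 1.0                       # initial prior N(0, I)
    for _ in range(n_rounds):
        stats = [task_posterior(X, y, mu, tau2, sigma2) for X, y in tasks]
        means = np.stack([m for m, _ in stats])
        # Prior update: match the first two moments of the task posteriors.
        mu = means.mean(axis=0)
        tau2 = np.mean([np.trace(C) / d + ((m - mu) ** 2).mean()
                        for m, C in stats])
    return mu, tau2

# Usage: a few synthetic few-shot regression tasks sharing a common weight vector.
rng = np.random.default_rng(0)
w_true = rng.normal(size=3)
tasks = []
for _ in range(3):
    X = rng.normal(size=(5, 3))                       # only 5 samples per task
    y = X @ (w_true + 0.1 * rng.normal(size=3)) + 0.05 * rng.normal(size=5)
    tasks.append((X, y))
mu, tau2 = metabayes_prior(tasks, d=3)
print("learned prior mean:", mu, " prior variance:", tau2)
```

The learned prior can then initialize the posterior update for a new task with few samples, which is the setting in which the paper reports accuracy gains over a generic Gaussian prior.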
Original language | English |
---|---|
Title of host publication | 55th Asilomar Conference on Signals, Systems and Computers, ACSSC 2021 |
Editors | Michael B. Matthews |
Publisher | IEEE Computer Society |
Pages | 351-355 |
Number of pages | 5 |
ISBN (Electronic) | 9781665458283 |
DOIs | |
State | Published - 2021 |
Externally published | Yes |
Publication series
Name | Conference Record - Asilomar Conference on Signals, Systems and Computers |
---|---|
Volume | 2021-October |
ISSN (Print) | 1058-6393 |
Bibliographical note
Publisher Copyright: © 2021 IEEE.
ASJC Scopus subject areas
- Signal Processing
- Computer Networks and Communications