
Deep Meta-Learning: Learning to Learn in the Concept Space

Posted on 26/04/2019, in Paper.
  • Overview: This paper substantially improves few-shot learning results on benchmark datasets by jointly training a concept generator (i.e. a hidden representation) and a concept discriminator (a softmax layer) on an external dataset D together with the meta-learner.

  • Concept-space: The author argues that few-shot learning performs so poorly compared with the classic image classification task because it learns in the instance space and has not generalized any concepts yet. If we can let the meta-learner start from some concept space instead, it should improve considerably.

  • Jointly training: The proposed method, DEML, can be applied on top of MAML, Meta-SGD, or Matching Nets. In each episode, apart from the sampled tasks, it also samples a batch of data from the external dataset. This batch (x_i, y_i) is not used by the meta-learner; instead it goes through the image classifier that branches out from the concept generator, i.e. the concept discriminator (see the sketch after this list).
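
To make the joint objective concrete, here is a minimal PyTorch sketch of the idea described above: a shared concept generator feeds both a concept discriminator trained with ordinary cross-entropy on the external dataset D and a few-shot meta-learner operating in the concept space. The module names, the loss weight `lam`, and the prototype-style stand-in for the MAML/Meta-SGD/Matching Nets meta-learner are my own illustrative assumptions, not the paper's code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConceptGenerator(nn.Module):
    """Backbone that maps images to concept-space embeddings (assumed architecture)."""
    def __init__(self, emb_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(), nn.Linear(32, emb_dim))
    def forward(self, x):
        return self.net(x)

gen = ConceptGenerator()
disc = nn.Linear(64, 100)            # concept discriminator over D's classes (100 is assumed)
opt = torch.optim.Adam(list(gen.parameters()) + list(disc.parameters()), lr=1e-3)
lam = 1.0                            # weight on the concept-discrimination term (assumed)

def few_shot_loss(support_x, support_y, query_x, query_y, n_way):
    """Prototype-style meta-learner in concept space (stand-in for MAML/Meta-SGD)."""
    z_s, z_q = gen(support_x), gen(query_x)
    protos = torch.stack([z_s[support_y == c].mean(0) for c in range(n_way)])
    logits = -torch.cdist(z_q, protos)           # closer prototype -> higher score
    return F.cross_entropy(logits, query_y)

def train_episode(task, external_batch, n_way=5):
    support_x, support_y, query_x, query_y = task
    ext_x, ext_y = external_batch                # batch sampled from the external dataset D
    meta_loss = few_shot_loss(support_x, support_y, query_x, query_y, n_way)
    concept_loss = F.cross_entropy(disc(gen(ext_x)), ext_y)
    loss = meta_loss + lam * concept_loss        # joint objective: both terms update the generator
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Tiny smoke test with random data (5-way, 1-shot support, 15 query images).
task = (torch.randn(5, 3, 32, 32), torch.arange(5),
        torch.randn(15, 3, 32, 32), torch.randint(0, 5, (15,)))
ext = (torch.randn(16, 3, 32, 32), torch.randint(0, 100, (16,)))
print(train_episode(task, ext))
```

The key design point is that both loss terms backpropagate into the same concept generator, so the external supervised task keeps shaping the concept space while the meta-learner adapts within it.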


My criticism is that this actually gives the model an unfair advantage over standard few-shot learning problems, since it simultaneously learns from an exclusive but similar dataset. Although I am not sure how this benefits the few-shot learning task by itself, I believe it is a big step from theoretical few-shot learning toward more practical applications, i.e. semi-few-shot learning with a gold-labeled dataset.

  • Life-long learning system: Silver, Daniel L., Yang, Qiang, and Li, Lianghao. Lifelong machine learning systems: Beyond learning algorithms. In AAAI Spring Symposium: Lifelong Machine Learning, volume 13, pp. 05, 2013.
  • Matching Nets: Vinyals, Oriol, Blundell, Charles, Lillicrap, Tim, and Wierstra, Daan. Matching networks for one shot learning. In NIPS, 2016.
  • Meta-SGD: Li, Zhenguo, Zhou, Fengwei, Chen, Fei, and Li, Hang. Meta-SGD: Learning to Learn Quickly for Few Shot Learning. arXiv preprint arXiv:1707.09835, 2017.