
Andrew Ng's Letter: Large Pretrained Models That Need Less Data

source link: https://www.6aiq.com/article/1685334801117

AIWeekly


Dear friends,

It's time to move beyond the stereotype that machine learning systems need a lot of data. While having more data is helpful, large pretrained models make it practical to build viable systems using a very small labeled training set.
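A minimal sketch of the pattern Ng is describing: keep a large pretrained model frozen as a feature extractor and fit only a small classifier head on a handful of labeled examples. The "encoder" below is a stand-in (a fixed random projection) so the snippet runs anywhere; in practice you would swap in embeddings from a real pretrained model.

```python
# Few-label learning on top of a frozen encoder (stand-in sketch).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained encoder: maps raw inputs to embeddings.
# Its weights are fixed; only the small head below is trained.
W = rng.normal(size=(20, 64))

def encode(x):
    return np.tanh(x @ W)  # frozen: never updated during training

# Tiny labeled training set: just 10 examples.
X_raw = rng.normal(size=(10, 20))
y = (X_raw[:, 0] > 0).astype(int)

# Train only a lightweight classifier head on the frozen embeddings.
clf = LogisticRegression().fit(encode(X_raw), y)

# Evaluate on fresh data drawn from the same distribution.
X_test = rng.normal(size=(200, 20))
y_test = (X_test[:, 0] > 0).astype(int)
acc = clf.score(encode(X_test), y_test)
print(f"accuracy with 10 labels: {acc:.2f}")
```

The design choice to note: all the capacity lives in the (pretrained, frozen) encoder, so the trainable part has few parameters and can be fit from very few labels without severe overfitting.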


