
Few-shot learning refers to a model's ability to classify new data after seeing only a limited number of training instances (e.g., 10 to 100) per task. In other words, the model improves its performance after being exposed to only a small amount of prior information.
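As a minimal sketch of this idea, the snippet below classifies new points from only five labelled examples per class using a nearest-centroid rule (the approach behind prototypical networks). The data, class count, and function names are illustrative, not from the original page.

```python
import numpy as np

def fit_prototypes(X, y):
    """Compute one mean vector ("prototype") per class from a handful
    of labelled examples -- the few-shot support set."""
    classes = np.unique(y)
    protos = np.stack([X[y == c].mean(axis=0) for c in classes])
    return classes, protos

def predict(X, classes, protos):
    """Assign each point to the class of its nearest prototype."""
    d = np.linalg.norm(X[:, None, :] - protos[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

# Toy demo: 5 labelled points per class stand in for "10 to 100" examples.
rng = np.random.default_rng(0)
X_support = np.vstack([rng.normal(0, 0.3, (5, 2)), rng.normal(3, 0.3, (5, 2))])
y_support = np.array([0] * 5 + [1] * 5)
classes, protos = fit_prototypes(X_support, y_support)
print(predict(np.array([[0.1, 0.0], [2.9, 3.1]]), classes, protos))  # → [0 1]
```

With so few examples, averaging into a single prototype per class is a deliberately simple model choice: it cannot overfit the way a large classifier would.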

Active learning is a machine-learning approach in which the model is trained on the data it considers most relevant: it prioritises the instances it expects to be most informative, thereby reducing the amount of labelled data it needs.
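One common way to decide which instance is "most informative" is uncertainty sampling: pick the unlabelled example whose predicted class probabilities have the highest entropy. The probabilities and function name below are hypothetical, chosen only to illustrate the selection step.

```python
import numpy as np

def most_uncertain(probs):
    """Rank unlabelled instances by prediction entropy and return the
    index of the one the current model is least sure about -- the
    example whose label is expected to be most useful."""
    probs = np.asarray(probs)
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    return int(entropy.argmax())

# Hypothetical class probabilities for four unlabelled examples.
pool_probs = [[0.95, 0.05], [0.60, 0.40], [0.50, 0.50], [0.10, 0.90]]
print(most_uncertain(pool_probs))  # → 2, the 50/50 prediction
```

The confidently predicted examples (indices 0 and 3) would add little information, so the annotation budget is spent on the borderline case instead.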

Combining few-shot learning with active learning is known as active few-shot learning (FASL). FASL aims to tackle the challenge of quickly developing and deploying new models for real-world problems with minimal effort.
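The combination can be sketched as a loop: start from a few labelled examples, train a small model, ask an annotator to label the instance the model is least certain about, and retrain. The sketch below assumes a nearest-centroid model and a distance-margin uncertainty measure on synthetic data; all names and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic pool: two Gaussian blobs; true labels are hidden until "annotated".
X_pool = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(4, 0.5, (50, 2))])
y_true = np.array([0] * 50 + [1] * 50)  # stands in for a human annotator

labelled = [0, 50]          # start from one example per class (few-shot seed)
budget = 8                  # extra annotations we are willing to pay for

def centroids(idx):
    """Class means computed from the currently labelled examples."""
    return np.stack([X_pool[[i for i in idx if y_true[i] == c]].mean(axis=0)
                     for c in (0, 1)])

for _ in range(budget):
    protos = centroids(labelled)
    d = np.linalg.norm(X_pool[:, None] - protos[None], axis=2)
    margin = np.abs(d[:, 0] - d[:, 1])   # small margin = uncertain
    margin[labelled] = np.inf            # never re-query a labelled point
    query = int(margin.argmin())         # active step: ask for this label
    labelled.append(query)               # annotator reveals y_true[query]

protos = centroids(labelled)
pred = np.linalg.norm(X_pool[:, None] - protos[None], axis=2).argmin(axis=1)
print((pred == y_true).mean())  # pool accuracy after 10 labels in total
```

Each round spends one annotation on the point closest to the current decision boundary, which is where an extra label moves the model most.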

 

FASL models offer multiple advantages, including:

No need for large annotated datasets. FASL models learn from a few examples and therefore require far less training data.

Reduced costs. Because FASL models are trained on small datasets, the costs of data collection and of the subsequent manual annotation, whether in-house or by third parties, drop drastically.

Increased model quality. Thanks to the combination of the two techniques, FASL surpasses zero-shot performance in many use cases and lets you reach a very high-quality model.

