Online Learning
- Updates and adjusts a model continuously as new data becomes available, rather than training once on a fixed dataset.
- Involves incremental learning, processing each new instance sequentially to refine the model.
- Often used in real-time scenarios like streaming analytics, IoT, or online transactions.
- Examples: Stochastic gradient descent, online gradient descent, incremental learning.
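The per-sample update behind stochastic/online gradient descent can be sketched with a toy linear model; the simulated stream, learning rate, and target function below are illustrative assumptions, not a production setup:

```python
import numpy as np

rng = np.random.default_rng(0)
w, b = 0.0, 0.0          # model parameters, refined one sample at a time
lr = 0.05                # learning rate

# Simulated stream: y = 3x + 1 plus noise, revealed one point at a time.
for _ in range(2000):
    x = rng.uniform(-1, 1)
    y = 3 * x + 1 + rng.normal(scale=0.1)
    err = (w * x + b) - y
    # One stochastic gradient descent step on this sample's squared error.
    w -= lr * err * x
    b -= lr * err
```

Because each sample is processed and then discarded, the same loop works whether the data is a file or an unbounded real-time stream.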
Transfer Learning
- Adapts a model trained on one task/domain to another related task/domain.
- Leverages knowledge from a source domain to improve performance in a target domain.
- Useful for overcoming limited training data or domain shift.
- Examples: Domain adaptation, few-shot learning, meta-learning.
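The warm-start idea behind transfer learning can be sketched with two toy linear-regression "domains"; the `train` helper and both synthetic datasets are invented for illustration. The target model is initialized from the source weights instead of from scratch:

```python
import numpy as np

def train(X, y, w_init, lr=0.1, steps=200):
    """Full-batch gradient descent on squared error for a linear model."""
    w = w_init.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

rng = np.random.default_rng(1)
w_true = np.array([2.0, -1.0])

# Source domain: plenty of labeled data.
Xs = rng.normal(size=(500, 2))
ys = Xs @ w_true
w_source = train(Xs, ys, np.zeros(2))

# Target domain: a related task (slightly shifted weights), very little data.
Xt = rng.normal(size=(10, 2))
yt = Xt @ (w_true + 0.2)
# Transfer: start from the source weights rather than from zero.
w_target = train(Xt, yt, w_source)
```

With deep networks the same pattern appears as fine-tuning: reuse pretrained weights, then continue training on the small target dataset.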
Offline Learning
- Learning from a static dataset, where all data is available upfront.
- Involves batch processing and is often used in scenarios like image classification or NLP.
- Examples: Batch gradient descent, k-means clustering, decision trees.
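Batch k-means, one of the examples above, illustrates the offline setting: every iteration scans the full, fixed dataset. The two synthetic blobs and the deterministic initialization are assumptions made to keep the demo reproducible:

```python
import numpy as np

def kmeans(X, k, init, iters=20):
    """Plain batch k-means over a static dataset available upfront."""
    centers = X[init].astype(float)
    for _ in range(iters):
        # Assign each point to its nearest center (a full-batch step).
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # Recompute each center from all points currently assigned to it.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

rng = np.random.default_rng(1)
# Two well-separated blobs; all data is present before training starts.
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
# Initialize from one point in each blob, purely for a deterministic demo.
centers, labels = kmeans(X, k=2, init=[0, -1])
```

Contrast this with the online loop above: here nothing is learned until the whole dataset has been collected.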
One-Shot Learning
- Trains a model to recognize or classify new categories from just one (or a few) labeled examples.
- Useful where collecting large labeled datasets is impractical.
- Applications: Few-shot classification, image recognition, robotics.
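A minimal one-shot classifier can be sketched as nearest-neighbor matching against a single support example per class. In practice the vectors would be embeddings from a pretrained network; the raw 2-D features here are a stand-in for that assumption:

```python
import numpy as np

def one_shot_classify(support, query):
    """Classify a query by its nearest support example (1-NN).

    support: dict mapping class label -> one example vector (the "one shot").
    """
    labels = list(support)
    protos = np.stack([support[l] for l in labels])
    dists = np.linalg.norm(protos - query, axis=1)
    return labels[int(np.argmin(dists))]

# Exactly one labeled example per class.
support = {"cat": np.array([0.9, 0.1]), "dog": np.array([0.1, 0.9])}
print(one_shot_classify(support, np.array([0.8, 0.2])))  # -> cat
```

Prototypical networks generalize this idea: with a few shots per class, each prototype is the mean of that class's embeddings.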
Meta-Learning
- "Learning to learn": trains a model across many tasks so it can adapt to a new task quickly from only a few examples.
- Useful in scenarios with frequent task changes or limited data.
- Applications: Computer vision, robotics, natural language processing.
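A Reptile-style meta-learning loop (a simpler cousin of MAML) can be sketched on toy 1-D linear tasks; the task distribution, learning rates, and loop lengths below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
meta_w = 0.0                        # meta-parameter shared across tasks
inner_lr, meta_lr = 0.05, 0.5

def adapt(w, slope, steps=10):
    """Inner loop: a few SGD steps on one task (y = slope * x)."""
    for _ in range(steps):
        x = rng.uniform(-1, 1)
        grad = (w * x - slope * x) * x   # squared-error gradient
        w -= inner_lr * grad
    return w

# Outer loop (Reptile-style): nudge meta_w toward each task's adapted weights.
for _ in range(500):
    slope = rng.uniform(1.0, 3.0)       # tasks differ in their true slope
    w_task = adapt(meta_w, slope)
    meta_w += meta_lr * (w_task - meta_w)
```

After meta-training, `meta_w` sits near the center of the task distribution, so a handful of inner-loop steps is enough to specialize it to any single task.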
Challenges of Meta and One-Shot Learning
- Overfitting: Models may overfit to the small example set.
- Lack of diversity: Homogeneous support sets may hinder generalization.
- Computational efficiency: Meta-learning can be resource-intensive.
Key Differences
- Online vs Offline Learning:
- Online learning is incremental and adaptive, suitable for real-time data.
- Offline learning is batch-based, suited to fixed datasets that are fully available upfront.
- Transfer vs Offline Learning:
- Transfer learning adapts a pre-trained model, while offline learning trains from scratch.
- Transfer learning overcomes domain shift, while offline learning focuses on a single dataset.