Weight Freezing

Weight freezing is a technique used when training neural networks in which certain layers or parameters are kept fixed while the rest of the model continues to train: the frozen weights receive no gradient updates from the learning process. It is most often used when reusing parts of a pre-trained model, preserving the features those layers have already learned while newly added layers adapt to the task at hand.
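The basic mechanic is simple to express in code. As a minimal sketch, assuming PyTorch and torchvision are available (neither library is named in the post, and the ResNet-18 backbone and 10-class head are purely illustrative), freezing a pre-trained backbone and training only a new classification head might look like this:

```python
import torch
import torchvision

# Load a pre-trained model (torchvision's ResNet-18, chosen here only for illustration).
model = torchvision.models.resnet18(weights="IMAGENET1K_V1")

# Freeze every pre-trained weight: parameters with requires_grad=False
# receive no gradients, so the optimiser never updates them.
for param in model.parameters():
    param.requires_grad = False

# Replace the final classifier with a fresh, trainable layer for the new task
# (10 output classes assumed for this sketch).
model.fc = torch.nn.Linear(model.fc.in_features, 10)

# Hand the optimiser only the parameters that remain trainable.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-3,
)
```

Filtering the optimiser's parameter list is not strictly required, since frozen weights receive no gradients anyway, but it avoids allocating optimiser state for values that will never change.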