
Add support for specifying multiple frozen layers #6017

Closed
wants to merge 5 commits

Conversation


@youyuxiansen youyuxiansen commented Dec 17, 2021

Related to #6001

@glenn-jocher Hi, I'm here. The usage is now:

python train.py --freeze 10  # freeze up to 10
python train.py --freeze 5,6,7,8,9,10  # freeze layers 5-10
python train.py --freeze 5,7,8,10  # freeze layers 5,7,8,10
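The three invocations above imply one parsing rule: a single integer means "freeze the first N layers", while a comma-separated list names exact layer indices. A minimal sketch of that expansion (the function name `parse_freeze` is an illustration, not the actual YOLOv5 code):

```python
# Hedged sketch: expand a --freeze value into a list of layer indices,
# matching the CLI examples above. Not the real train.py implementation.
def parse_freeze(value: str) -> list[int]:
    idx = [int(v) for v in value.split(',')]
    # One value -> freeze layers 0..N-1; several values -> freeze exactly those.
    return list(range(idx[0])) if len(idx) == 1 else idx

print(parse_freeze('10'))        # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
print(parse_freeze('5,7,8,10'))  # [5, 7, 8, 10]
```

Keeping the single-integer form backward compatible means existing `--freeze 10` commands behave as before while the new list form adds the finer control.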

🛠️ PR Summary

Made with ❤️ by Ultralytics Actions

🌟 Summary

Enhanced layer freezing functionality during model training in YOLOv5.

📊 Key Changes

  • Removed an extraneous comma in the unpacking of training options.
  • Updated layer freezing logic to support freezing specific ranges of layers.
  • Changed --freeze argument type to accept a list of strings for more granular control.
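To illustrate the updated freezing logic: YOLOv5 parameter names follow the pattern `model.<layer>.<...>`, so frozen layers can be matched by name prefix. In the real training loop each matched torch parameter would get `requires_grad = False`; the sketch below (a pure-Python stand-in, with the helper name `frozen_param_names` assumed for illustration) just returns the matches:

```python
# Hedged sketch of the freezing step: select parameters whose names start
# with 'model.<i>.' for any frozen layer index i. In actual training code,
# each matched torch parameter would be set to requires_grad = False.
def frozen_param_names(param_names, freeze_idx):
    prefixes = [f'model.{i}.' for i in freeze_idx]
    return [n for n in param_names if any(n.startswith(p) for p in prefixes)]

names = ['model.0.conv.weight', 'model.5.bn.bias',
         'model.7.conv.weight', 'model.12.conv.weight']
print(frozen_param_names(names, [5, 7]))  # ['model.5.bn.bias', 'model.7.conv.weight']
```

Prefix matching (rather than substring matching) avoids accidentally freezing layer 12 when layer 1 is requested.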

🎯 Purpose & Impact

  • 🛠 Flexibility: Users can now specify precise ranges of layers to freeze during training, giving them finer control over the model's learning process.
  • 🔧 Usability: The new freezing option syntax is intended to be clearer and more intuitive, streamlining the training setup process.
  • 📈 Performance: Freezing the right layers can potentially improve training speed and model stability by preventing overfitting and allowing for more focus on training the unfrozen layers.

These changes could make it easier for developers and users to customize their model training to suit specific needs, leading to more efficient and effective machine learning workflows.
