YOLO-World to YOLOv8 Knowledge Distillation #16905
Replies: 2 comments
-
👋 Hello @legendof-selda, thank you for bringing your query to the Ultralytics community 🚀! We suggest checking out our comprehensive Docs for insights into utilizing YOLO models. You can find valuable guidance on Model Training and other training techniques that might help. For knowledge distillation or transfer learning inquiries, providing detailed information, such as your current model setup and a few dataset examples, will be beneficial. We also recommend reviewing our minimum reproducible example guidelines to help us assist you better. Although this is an automated response, rest assured that an Ultralytics engineer will be with you shortly to provide further assistance.

Join the conversation and connect with others! For real-time interaction, check out our Discord 🎧. Prefer detailed discussions? Our Discourse is perfect for that. Or join discussions on our Subreddit.

Upgrade: to ensure any ongoing issue is not due to an outdated version, upgrade to the latest release with `pip install -U ultralytics`.

Environments: YOLO models can be utilized in a range of verified environments.

Status: if the CI badge is green, all Ultralytics CI tests are passing, ensuring correct operation on multiple platforms.

Looking forward to helping you further! 😊
-
@legendof-selda knowledge distillation from a YOLO-World large model to a YOLOv8 small or nano model is possible and can be effective for improving performance with limited data. For transfer learning, even a few hundred well-labeled samples can be beneficial. Distillation might be more efficient if computational resources are limited, while transfer learning could yield better accuracy with a sufficiently diverse dataset. Consider experimenting with both to see which approach suits your specific needs.
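The Ultralytics package does not ship a built-in distillation trainer, so a common workaround is response-based distillation via pseudo-labeling: run the large teacher over unlabeled images, save its predictions as YOLO-format labels, and fine-tune the small student on them. Below is a minimal sketch of that workflow using the Ultralytics Python API; the prompt strings, image directory, and `pseudo_labeled.yaml` dataset config are placeholders you would replace with your own:

```python
from ultralytics import YOLO, YOLOWorld

# Teacher: large YOLO-World v2 model with custom text prompts.
teacher = YOLOWorld("yolov8l-worldv2.pt")
teacher.set_classes(["class_a", "class_b"])  # placeholder prompts

# Pseudo-label an unlabeled image folder; save_txt=True writes
# YOLO-format .txt label files under runs/detect/predict/labels.
teacher.predict(source="datasets/unlabeled/images", save_txt=True, conf=0.5)

# Student: fine-tune YOLOv8n on the pseudo-labeled data.
# "pseudo_labeled.yaml" is an assumed dataset config that points at the
# images together with the generated labels.
student = YOLO("yolov8n.pt")
student.train(data="pseudo_labeled.yaml", epochs=100, imgsz=640)
```

The generated labels need to be moved or symlinked next to the images referenced by the dataset YAML before training. Raising the `conf` threshold trades pseudo-label recall for precision, which is usually the knob worth tuning first.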
-
Currently, with YOLO-World v2, I am able to create a custom model with custom prompts. However, the results are only good for the large models; the smaller YOLO-World models give really bad results. Since I don't have a rich dataset, what I would like to do is perform knowledge distillation from a YOLO-World large model to a YOLOv8 small or nano model. Is this even possible, and if so, what is the best way to do it? If I were to do typical transfer learning instead, how many samples would I need in my dataset to get good results? Which would be more efficient and accurate: knowledge distillation to a smaller YOLOv8 model, or building a small dataset and doing transfer learning?
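For context, the custom-prompt setup described above looks roughly like this with the Ultralytics YOLOWorld API; the weights file, prompt strings, and output filename here are placeholders:

```python
from ultralytics import YOLOWorld

# Load a YOLO-World v2 model (the large variant gives the strongest zero-shot results).
model = YOLOWorld("yolov8l-worldv2.pt")

# Restrict the open vocabulary to custom classes via text prompts.
model.set_classes(["hard hat", "safety vest"])  # placeholder prompts

# Run detection limited to those classes.
results = model.predict("image.jpg", conf=0.25)

# Optionally persist a custom model with the prompt vocabulary baked in.
model.save("custom_yolo_world.pt")
```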