integrations/weights-biases/ #8482
Replies: 7 comments 16 replies
-
Is it possible to disable W&B logging and only log progress in the command line? For my training process, YOLOv8 seems to automatically connect to my wandb account and start logging. This process is very time-consuming, so I'd like to know whether there's a flag to turn it off.
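One way to switch the automatic W&B logging off is the Ultralytics settings manager. A minimal sketch, assuming the standard settings API; the model and dataset names below are just placeholders:

# Sketch: disable the built-in W&B integration so training only prints
# progress to the command line. "coco8.yaml" is just an example dataset.
from ultralytics import YOLO, settings

settings.update({"wandb": False})  # turn off automatic W&B logging globally

model = YOLO("yolov8n.pt")
model.train(data="coco8.yaml", epochs=3)  # console progress only, no W&B run

The same switch can also be flipped from the command line with yolo settings wandb=False.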
-
Hey there, I am trying to run a sweep using W&B but can't quite work out where to pass what into the YOLO call so that different hyperparameters are used and tracked. I found a site where someone has done this, but they don't describe what they had to do either. I am not sure if it is more complicated than I think, whether I really have to modify the training function, or if there is an easy way to achieve this.
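One possible wiring is to let a W&B sweep pick the values and pass them straight into model.train(). This is a rough sketch, not an official recipe: the parameter names (lr0, momentum), the metric key, the dataset, and the project name are all examples/assumptions.

import wandb
from ultralytics import YOLO

# Sketch of a W&B sweep around YOLO training. Parameter names (lr0, momentum),
# the metric key, dataset, and project name are examples, not requirements.
sweep_config = {
    "method": "random",
    "metric": {"name": "metrics/mAP50(B)", "goal": "maximize"},
    "parameters": {
        "lr0": {"values": [0.001, 0.01]},
        "momentum": {"min": 0.85, "max": 0.95},
    },
}

def train():
    # Each agent run reads its hyperparameters from wandb.config.
    with wandb.init() as run:
        model = YOLO("yolov8n.pt")
        model.train(
            data="coco8.yaml",
            epochs=10,
            lr0=run.config.lr0,
            momentum=run.config.momentum,
            project="yolo-sweep-demo",  # placeholder project name
        )

sweep_id = wandb.sweep(sweep_config, project="yolo-sweep-demo")
wandb.agent(sweep_id, function=train, count=5)

With this pattern the sweep-chosen values go in as ordinary train() arguments, so editing the training internals should not be necessary.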
-
from ultralytics import YOLO

if __name__ == "__main__":
-
Is it possible to train an RT-DETR model with wandb? In the add_wandb_callback() function, there is nothing about RT-DETR.
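add_wandb_callback() is written around the YOLO class, but Ultralytics also ships its own W&B callback that is not tied to a specific model type. An untested sketch, assuming that built-in callback behaves the same for RT-DETR as it does for YOLOv8:

# Sketch of one possible route: enable Ultralytics' native W&B integration and
# train RT-DETR normally. Whether this covers every metric is an assumption.
from ultralytics import RTDETR, settings

settings.update({"wandb": True})  # turn on the built-in W&B integration

model = RTDETR("rtdetr-l.pt")  # official RT-DETR weights from Ultralytics
model.train(data="coco8.yaml", epochs=10, imgsz=640)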
-
Hi! I am using wandb to do hyperparameter tuning. Is it possible to maximise validation mAP50? This seems to be maximising the training mAP50, if I am not mistaken?

metric:

Otherwise, should I do the hyperparameter tuning using something like this instead:

metric:

Thanks!
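If this is a W&B sweep, the metric block can point at whatever key gets logged for validation mAP. A hedged example, assuming validation box mAP50 is logged under the key metrics/mAP50(B) (check the keys actually logged in your run to confirm; training losses appear under train/...):

# Direct the sweep at the validation box mAP50 rather than a training metric.
# The key name "metrics/mAP50(B)" is an assumption; verify it against the
# logged keys in your W&B run before relying on it.
metric_config = {"name": "metrics/mAP50(B)", "goal": "maximize"}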
-
I used this code, but it only logs system metrics; no training metrics, media, or validation metrics are being logged as shown in the video. How can I log all the training metrics and media as demonstrated in the video? Can anyone help me with this?

import wandb
from ultralytics import YOLO
from wandb.integration.ultralytics import add_wandb_callback

wandb.login()
wandb.init(project="Hand-Gestures-Detection", name="yolov8n_5", job_type="training")

model = YOLO("yolov8n.pt")
add_wandb_callback(model, enable_model_checkpointing=True)  # attach the W&B Ultralytics callback

model.train(
    data="HAND-GESTURE-DETECTION-1/data.yaml",
    epochs=5,
    imgsz=640,
    project="Hand-Gestures-Detection",
    name="yolov8n_5",
)
wandb.finish()  # close the run
-
Integration of Weights & Biases with YOLO11 in Google Colab

Is it currently possible to integrate Weights & Biases (W&B) with Ultralytics YOLO11 in Google Colab? If yes, where can I find the correct and up-to-date information to set this up properly?

My issue: With YOLOv10, I was able to track all my training metrics in W&B seamlessly. However, when trying to replicate this with YOLO11, only GPU and system-related data are logged to the W&B platform. None of the training metrics or results appear.

Summary of things I've tried: This is because the branch https://github.com/wandb/wandb@feat/ultralytics does not exist. To resolve this, I tried various fixes, including following older tutorials and experimenting with suggestions from forums and community discussions. I'm particularly keen to switch to YOLO11 because it shows significant improvements in wildlife detection compared to YOLOv10. Any guidance, updates, or resources on enabling this integration would be greatly appreciated!
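Since the feat/ultralytics branch install fails, one alternative worth trying (an assumption rather than a verified fix) is to skip wandb.integration.ultralytics entirely and rely on the native W&B callback that ships with Ultralytics:

# Colab-oriented sketch, assuming the released packages are sufficient and that
# the built-in W&B callback supports YOLO11. Project/run names are placeholders.
#   !pip install -U ultralytics wandb
import wandb
from ultralytics import YOLO, settings

wandb.login()
settings.update({"wandb": True})  # enable the native W&B integration

model = YOLO("yolo11n.pt")
model.train(data="coco8.yaml", epochs=10, imgsz=640,
            project="yolo11-wandb-test", name="colab-run")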
-
integrations/weights-biases/
Discover how to train your YOLOv8 models efficiently with Weights & Biases. This guide walks through integrating Weights & Biases with YOLOv8 to enable seamless experiment tracking, result visualization, and model explainability.
https://docs.ultralytics.com/integrations/weights-biases/