How to load an unlabeled dataset #3
It looks like you are training on your custom dataset instead of PascalVOC2Clipart or CityScapes2CityScapesFoggy. If so, you should organize your dataset labels following the YOLOv5 format. Please refer to the scripts under https://github.com/hnuzhy/SSDA-YOLO/tree/master/data/formats/. Also, if you want to reproduce my DAOD method, remember to prepare five different sub-datasets as described in https://github.com/hnuzhy/SSDA-YOLO#yamls
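For reference, the YOLOv5 label format uses one `.txt` file per image, with one line per object: a class index followed by the box center and size, all normalized to [0, 1]. Below is a minimal sketch of the conversion from Pascal VOC pixel coordinates; the helper name and the example values are illustrative, not taken from this repository:

```python
# Minimal sketch: convert a Pascal VOC box (xmin, ymin, xmax, ymax, in pixels)
# into the YOLOv5 label line "class x_center y_center width height",
# where all four coordinates are normalized by the image size.
def voc_to_yolo(box, img_w, img_h):
    xmin, ymin, xmax, ymax = box
    x_center = (xmin + xmax) / 2.0 / img_w
    y_center = (ymin + ymax) / 2.0 / img_h
    width = (xmax - xmin) / img_w
    height = (ymax - ymin) / img_h
    return x_center, y_center, width, height

# Example: class id 0, a 100x200-pixel box at (50, 80) in a 640x480 image
print("0 " + " ".join(f"{v:.6f}" for v in voc_to_yolo((50, 80, 150, 280), 640, 480)))
```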
Thanks for your feedback and guidance. I read your paper, and the training configuration describes that the training images consist of (Is, Isf) with labels and (It, Itf) without labels. But when I load the images of (It, Itf), an error appears (see the screenshot above) at the red-dot mark in the code position shown in the screenshot below. For the (It, Itf) datasets, do I need to generate pseudo-labels ahead of time?
Yes. Although we do not use the GT labels of the target-domain datasets for training, we still need to provide corresponding labels for Oracle training (direct transfer without domain adaptation) and Adaptive Validation on the test set of the target domain. Please refer to code lines 441~534 in
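In other words, label files must exist on disk for the target-domain images even though they are not used as supervision. A quick sanity check along these lines may help confirm your layout; the directory paths below are hypothetical placeholders, so adjust them to your own dataset:

```python
# Hedged sketch: verify that every target-domain test image has a matching
# YOLOv5 label file, so the data loader can build its cache without errors.
from pathlib import Path

images_dir = Path("datasets/target/images/test")   # hypothetical path
labels_dir = Path("datasets/target/labels/test")   # hypothetical path

missing = [p.name for p in images_dir.glob("*.jpg")
           if not (labels_dir / (p.stem + ".txt")).exists()]
print(f"{len(missing)} images without labels:", missing[:10])
```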
Thanks again for your guidance. I successfully got it working on a custom dataset. |
I encountered the same problem. I have the test set labels and have placed them in the appropriate folder, but the program still prompts me to provide the train.cache file. How can I solve this issue? |
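(A hedged note, based on standard YOLOv5 behavior rather than anything stated in this thread: the data loader caches label metadata in `*.cache` files, and a cache built before the labels were added can keep triggering this prompt. Deleting the stale cache files forces the loader to rescan the labels on the next run.)

```python
# Hedged sketch: remove stale YOLOv5 *.cache files so label metadata is
# rebuilt from the current label files on the next training run.
from pathlib import Path

dataset_root = Path("datasets")  # hypothetical root; adjust to your layout
for cache in dataset_root.rglob("*.cache"):
    print("removing", cache)
    cache.unlink()
```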
I would like to ask about the same problem. Why is there an error during training saying that the target dataset does not have labels?