
Question #1

Open
Divadi opened this issue Oct 19, 2023 · 3 comments

Comments

Divadi commented Oct 19, 2023

Hello, I just had a short question. For the 100% setting, how is Efficient-CLS applied? That is, what gets pseudo-labeled, given that there are no unlabeled frames?

zhangjiewu (Collaborator) commented

Hi @Divadi, sorry for the delay. For the 100% setting, we use only the EMA branch and do not utilize pseudo labels since it's fully supervised.
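
For illustration, a minimal PyTorch-style sketch of what an EMA-only branch with no pseudo-labeling could look like in the fully supervised setting. The detector call, the decay value, and the helper names are placeholders, not taken from the Efficient-CLS code.

```python
import torch

@torch.no_grad()
def update_ema(student: torch.nn.Module, teacher: torch.nn.Module, decay: float = 0.999):
    """Exponential-moving-average update of the teacher weights from the student."""
    for p_t, p_s in zip(teacher.parameters(), student.parameters()):
        p_t.mul_(decay).add_(p_s, alpha=1.0 - decay)

def train_step(student, teacher, optimizer, images, targets):
    # 100% setting: every frame is labeled, so the loss uses ground-truth targets only
    # and the teacher is never queried for pseudo labels.
    loss = student(images, targets)   # hypothetical detector API returning a scalar loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    update_ema(student, teacher)      # EMA branch tracks the student after each step
    return loss.item()

# The teacher can be initialized as a deep copy of the student, then only EMA-updated.
```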

Divadi commented Oct 25, 2023

@zhangjiewu Thank you for your prompt response!
One follow-up: is "offline training" also done with just one epoch in total? I've been trying to train Faster R-CNN on Wanderlust, but I cannot get anywhere close to 48 AP50 with only one epoch, whether the data is shuffled or not.

Edit with details:
I have tried Faster R-CNN + R50 with COCO pre-training, an AdamW optimizer, EMA, and a single pass over the data, which gets 36.4 AP50; VOC pre-training is much worse.
I have also tried the same setup except sampling 2 batches per "step", which gets 45.2 AP50 and is much closer, so I am wondering whether more than one batch was used per step (I see that the original Wanderlust used 10 batches per step on average).
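
For illustration, a minimal sketch of the ">1 batch per step" setup described above, where each incoming labeled batch triggers several gradient updates drawn from a small replay buffer. All names (`ReplayBuffer`, `stream_loader`, `model.training_step`) and all sizes are placeholders, not from the Wanderlust or Efficient-CLS code.

```python
import random

class ReplayBuffer:
    """Keeps the most recent samples seen so far (capacity is an arbitrary choice)."""
    def __init__(self, capacity: int = 10000):
        self.capacity = capacity
        self.data = []

    def add(self, samples):
        self.data.extend(samples)
        self.data = self.data[-self.capacity:]

    def sample(self, batch_size: int):
        return random.sample(self.data, min(batch_size, len(self.data)))

def run_stream(stream_loader, model, optimizer, batches_per_step: int = 2, batch_size: int = 8):
    buffer = ReplayBuffer()
    for incoming in stream_loader:              # one labeled batch arrives per streaming step
        buffer.add(incoming)
        for _ in range(batches_per_step):       # >1 gradient update per incoming batch
            batch = buffer.sample(batch_size)
            loss = model.training_step(batch)   # hypothetical API returning a scalar loss
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```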

zhangjiewu (Collaborator) commented

Hi @Divadi, sorry I missed this message.

No, the offline training allows unlimited access to previous data and multiple-epoch training.
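
For contrast with the one-pass streaming protocol discussed above, a minimal sketch of what such offline training looks like: the full dataset is revisited for multiple epochs. The epoch count and the `training_step` API are placeholders, as in the sketches above.

```python
def train_offline(full_loader, model, optimizer, num_epochs: int = 12):  # epoch count is arbitrary
    for epoch in range(num_epochs):
        for batch in full_loader:                  # unlimited access to all previously seen data
            loss = model.training_step(batch)      # hypothetical API returning a scalar loss
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```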
