
What does "point cloud distillation loss" mean for CroCo V2 initialization? #42

Open
lizhiqi49 opened this issue Dec 5, 2024 · 2 comments

Comments

@lizhiqi49
Copy link

Nice work! I have a question about model training. The appendix mentions that "for CroCoV2 initialization, we warm up the training by adding point cloud distillation loss from the DUSt3R model for 1,000 steps". However, I checked the DUSt3R paper and did not find a "point cloud distillation loss" term. What exactly does this loss mean?

@botaoye (Collaborator)

botaoye commented Dec 10, 2024

Hi, thanks for your interest in our work. The distillation loss is introduced by us; it is not part of DUSt3R. You can find it here; it is similar to the point loss used by DUSt3R.
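Since the linked code is not reproduced in this thread, here is a minimal sketch of what such a warm-up distillation loss could look like: the student's predicted pointmap is regressed toward the frozen DUSt3R teacher's pointmap with a per-point L1 penalty, and the loss is only active for the first `distill_max_steps` training steps. The function names, the choice of L1, and the hard step cutoff are all assumptions for illustration, not the repository's actual implementation.

```python
def point_distill_loss(student_pts, teacher_pts):
    """Mean per-point L1 distance between a student-predicted point cloud
    and a frozen teacher (e.g. DUSt3R) point cloud.

    student_pts, teacher_pts: equal-length lists of (x, y, z) tuples.
    """
    assert len(student_pts) == len(teacher_pts), "point clouds must align"
    total = 0.0
    for (sx, sy, sz), (tx, ty, tz) in zip(student_pts, teacher_pts):
        total += abs(sx - tx) + abs(sy - ty) + abs(sz - tz)
    return total / len(student_pts)


def distill_weight(step, distill_max_steps=1000):
    """Warm-up schedule: the distillation term is on for the first
    `distill_max_steps` steps, then switched off entirely."""
    return 1.0 if step < distill_max_steps else 0.0
```

In training, the total loss at a given step would then be something like `main_loss + distill_weight(step) * point_distill_loss(pred, teacher)`, so the teacher supervision vanishes after the warm-up phase.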

@sfchng

sfchng commented Dec 12, 2024

Do any other hyperparameters need to change to enable this point cloud distillation loss, apart from setting self.train_cfg.distill_max_steps=1000?
