
Question about PyTorch GPU memory usage #697

Closed
re-burn opened this issue Nov 21, 2021 · 1 comment

Comments

@re-burn

re-burn commented Nov 21, 2021

First, my setup: RTX 2080 Ti, 4 GPUs. A few days ago I could train with batch_size set as high as 32 without running out of GPU memory. Starting around Thursday night, using exactly the same dataset and exactly the same model (I only changed lr, anchors, and alpha), I started getting OOM errors when loading data. Only after reducing batch_size to 4 (i.e., one image per GPU) could I barely train, and it still hits OOM partway through. What could be causing this?
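For context, a minimal sketch of how per-GPU memory can be logged around a training step to narrow down which phase triggers the OOM. The toy model and tensors are stand-ins, not this project's actual training code:

```python
import torch
import torch.nn as nn

def log_gpu_memory(tag):
    # Print allocated and peak memory for every visible GPU (values in MiB).
    for i in range(torch.cuda.device_count()):
        alloc = torch.cuda.memory_allocated(i) / 1024 ** 2
        peak = torch.cuda.max_memory_allocated(i) / 1024 ** 2
        print(f"[{tag}] cuda:{i} allocated={alloc:.0f} MiB peak={peak:.0f} MiB")

if torch.cuda.is_available():
    model = nn.Linear(1024, 1024).cuda()       # toy stand-in for the detector
    x = torch.randn(32, 1024, device="cuda")   # toy stand-in for a batch of images

    log_gpu_memory("before forward")
    loss = model(x).sum()
    log_gpu_memory("after forward")
    loss.backward()
    log_gpu_memory("after backward")
```

Comparing the "after forward" and "after backward" peaks between a run that works and one that OOMs can show whether the extra memory comes from the model itself or from something downstream such as post-processing.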

re-burn closed this as completed Nov 21, 2021
@zylo117
Owner

zylo117 commented Nov 21, 2021

There are many possible causes. The post-processing stage can vary from run to run: for example, the more anchors that pass the threshold filter, the more memory gets used.
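To illustrate the point, a minimal sketch (not the repository's actual post-processing code) of threshold filtering whose downstream memory scales with the number of surviving anchors, and of capping the survivors with `topk` before NMS to bound that cost. The function name, threshold values, and cap are assumptions:

```python
import torch
from torchvision.ops import nms

def postprocess(boxes, scores, score_thresh=0.05, pre_nms_topk=1000, iou_thresh=0.5):
    # Keep only anchors above the confidence threshold; everything allocated
    # after this point grows with how many anchors survive.
    keep = scores > score_thresh
    boxes, scores = boxes[keep], scores[keep]

    # Bound the worst case: keep at most `pre_nms_topk` anchors before NMS.
    if scores.numel() > pre_nms_topk:
        scores, idx = scores.topk(pre_nms_topk)
        boxes = boxes[idx]

    keep = nms(boxes, scores, iou_thresh)
    return boxes[keep], scores[keep]

# Toy usage: 100k random anchors as an extreme case.
boxes = torch.rand(100_000, 4) * 512
boxes[:, 2:] += boxes[:, :2]   # ensure x2 >= x1 and y2 >= y1
scores = torch.rand(100_000)
out_boxes, out_scores = postprocess(boxes, scores)
print(out_boxes.shape, out_scores.shape)
```

A frame with many low-confidence detections can therefore allocate far more post-processing memory than a clean one, which would explain OOMs that appear only partway through training.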
