Hi @liuzhuang13,

There is another issue I met in my experiments: inference time decreased when the number of channels to be pruned was a power of 2; otherwise it increased and became longer than the baseline, which is not mentioned. Have you ever met the same issue in your experiments? Any suggestions?

My hardware and system:
GPU: GeForce RTX 2080 12G
CPU: 58G, 12 cores
System: Ubuntu 16.04

Regards,
Summer Gao
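For context on the observation above: cuDNN and Tensor Core kernels on RTX-class GPUs tend to be fastest when channel counts are multiples of 8 (or 32); irregular widths can fall back to slower kernel paths, which would explain the slowdown. A common workaround, not from this thread, is to round each pruned layer's kept-channel count up to a hardware-friendly multiple. A minimal helper sketch (the function name is hypothetical):

```python
def round_up_channels(kept: int, multiple: int = 8) -> int:
    """Round a pruned layer's kept-channel count up to the nearest
    multiple of `multiple`, keeping at least one full group.

    Assumption: aligning channel widths this way helps GPU kernels
    pick faster code paths; verify by timing on your own hardware.
    """
    if kept <= 0:
        return multiple
    # Integer ceiling division, then scale back up to the multiple.
    return ((kept + multiple - 1) // multiple) * multiple
```

For example, `round_up_channels(37)` gives 40 and `round_up_channels(64)` stays 64, so a pruning ratio would be honored approximately while keeping aligned widths.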