Hello, I trained with python main.py --multigpu 4,5,6,7 --config configs/reparam/mobilenetv1-prune.yaml --gradual sinp --flat-width 0.5 --print-freq 200 --data path --name mobilenet_prune --amp --prune-rate 0.2. Checking the weights after training, the channel counts are identical to the baseline weights. Is this just sparse training, meaning I need to write the pruning step myself?
Hi. First, this method is unstructured pruning, not channel pruning, so it does not reduce the number of channels. If you want to use it for channel pruning, see the extended method we proposed in the rebuttal; you will likely need to modify the code yourself. Second, this method has no prune-rate parameter. Third, for unstructured pruning, the weights stored in the modules after training are the theta from the paper; call getSparseWeight on each pruned module to obtain the sparsified weights. The pruned layers are the weights of all convolution and FC layers, excluding BatchNorm and biases.
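To illustrate the export step described above, here is a minimal, framework-free sketch of how one might walk the trained model and collect the sparsified weights via getSparseWeight. The `PrunedConv` class and the exact signature of `getSparseWeight` are assumptions for illustration; the real modules in the repo will differ, but the traversal pattern (skip anything without the method, so BatchNorm and biases are untouched) is the point.

```python
# Hedged sketch: exporting sparse weights after training.
# PrunedConv is a hypothetical stand-in for a pruned Conv/FC module;
# the repo's actual module classes and getSparseWeight signature may differ.

class PrunedConv:
    """Mock pruned module holding theta (trained weights) and a binary mask."""
    def __init__(self, weight, mask):
        self.weight = weight  # theta from the paper
        self.mask = mask      # 1 = keep, 0 = pruned

    def getSparseWeight(self):
        # Element-wise product: pruned entries become exactly zero.
        return [w * m for w, m in zip(self.weight, self.mask)]

def export_sparse_weights(named_modules):
    """Collect sparse weights from every module that supports pruning.
    Modules without getSparseWeight (e.g. BatchNorm) are skipped."""
    return {name: m.getSparseWeight()
            for name, m in named_modules
            if hasattr(m, "getSparseWeight")}

modules = [("conv1", PrunedConv([0.5, -1.2, 0.3], [1, 0, 1])),
           ("fc",    PrunedConv([2.0, 0.1],      [1, 1]))]
sparse = export_sparse_weights(modules)
print(sparse["conv1"])  # the -1.2 entry is masked to zero
```

In a real PyTorch model you would iterate `model.named_modules()` instead of a hand-built list and save the resulting dict as a state dict.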
Could you share a link to the rebuttal for reference?