compute v[i] with B[i]
vi = torch.sum(torch.mul(self.B[i], self.W[i].data)) / torch.sum(torch.mul(self.B[i], self.B[i]))

Why is the inverse for B[i] computed with torch.sum(torch.mul(self.B[i], self.B[i]))? In the paper, B[i] has shape K x N (K is the bit width), so B·Bᵀ is K x K. Why does the inverse then reduce to a plain sum?
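For reference, a minimal sketch of the K x K solve the question is describing, i.e. v* = (B·Bᵀ)⁻¹ B w for a layer's flattened weights w (the shapes and variable names below are assumptions, not taken from the repo):

```python
import torch

K, N = 2, 1024                                   # assumed: K bits, N weights in the layer
B = torch.randint(0, 2, (K, N)).float() * 2 - 1  # binary codes in {-1, +1}, shape K x N
w = torch.randn(N)                               # flattened full-precision weights

# Least-squares solution of min_v ||B^T v - w||^2 via the normal equations:
# v* = (B B^T)^{-1} B w, where B B^T is only K x K.
v_star = torch.linalg.solve(B @ B.t(), B @ w)    # shape (K,)
w_q = B.t() @ v_star                             # reconstructed (quantized) weights
```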
Solving for v* in the paper is really a linear regression problem. I flatten the B matrix into a 1-D matrix (I simplified the quantization, i.e. the quantization basis v1, v2, v3, ... is fixed in the ratio 1 : 2 : 4 : ... : 2^n). So my B is no longer the K x N matrix over {-1, 1}; its entries are instead taken from the odd levels {-(2^(K-1)-1), ..., -3, -1, 1, 3, ..., 2^(K-1)-1}, so the quantized weights are uniformly spaced. For the linear regression problem y = v·x, $v = \mathrm{cov}(x, y) / \mathrm{var}(x) = \left(\sum_i x_i y_i\right) / \left(\sum_i x_i^2\right)$. Here x is B and y is W, so the solution really does come out as a ratio of sums.

P.S. My LQ-Net code still has some issues; I'll fix it over the weekend if I have time.
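A minimal sketch of the simplification described above (my own reading of it; the exact level range and the nearest-level assignment are assumptions, not the repo's code), where the basis is fixed so a single scalar v is fit by regression through the origin:

```python
import torch

def quantize_simplified(w, k=2):
    """Fit w ~= v * B, where B holds fixed odd integer levels and v is one scalar."""
    # Odd levels, e.g. k = 2 -> {-3, -1, 1, 3}; the exact range is an assumption here.
    levels = torch.arange(-(2 ** k - 1), 2 ** k, 2, dtype=w.dtype, device=w.device)

    # Rough initial scale so the levels cover the weight range.
    v = w.abs().mean() / levels.abs().mean()

    # Assign each weight to its nearest level: B has the same shape as w.
    B = levels[torch.argmin((w.unsqueeze(-1) - v * levels).abs(), dim=-1)]

    # Regression through the origin, y = v * x with x = B and y = w:
    # v = sum(B * w) / sum(B * B)  -- the "sum form" asked about above.
    v = torch.sum(B * w) / torch.sum(B * B)
    return v, v * B

# example usage
w = torch.randn(4, 16)
v, w_q = quantize_simplified(w, k=2)
```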
Thanks for the reply, and I'm looking forward to the updated code. It seems the current code doesn't quantize activations, and every network needs its own hand-written layerdict. When I reproduced the original paper in PyTorch, the matrix I had to invert kept turning out singular, which puzzled me. For example, one BᵀB came out as [[30720, 1422, -1422], [1422, 30720, -30720], [-1422, -30720, 30720]], and the inverse computation raised an error.
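For what it's worth, a small sketch reproducing the reported matrix (the right-hand side below is a placeholder): rows 2 and 3 are exact negatives of each other, so the matrix is rank-deficient and has no inverse. Using a pseudo-inverse or a tiny ridge term is a common workaround, but that is an assumption on my part, not something the paper or the repo prescribes:

```python
import torch

# The 3 x 3 matrix reported above: row 3 = -row 2, so it is singular (rank 2).
BtB = torch.tensor([[ 30720.,   1422.,  -1422.],
                    [  1422.,  30720., -30720.],
                    [ -1422., -30720.,  30720.]])
print(torch.linalg.matrix_rank(BtB))         # tensor(2) -> torch.linalg.inv(BtB) raises

b = torch.randn(3)                           # placeholder right-hand side
v_pinv  = torch.linalg.pinv(BtB) @ b         # least-norm solution via pseudo-inverse
v_ridge = torch.linalg.solve(BtB + 1e-4 * torch.eye(3), b)   # tiny ridge regularization
```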