Hi,

My understanding of FairNAS is that when training the supernet, each batch waits for all paths to finish back-propagation and for their gradients to be summed before a single parameter update is performed. My question is: each node in the supernet belongs to only one path, so it receives a gradient only once. There is no summation, and thus no need to wait for all gradients to be back-propagated before updating the parameters. What does the gradient summation mentioned in the algorithm refer to?

Also, although FairNAS resolves many fairness issues, does a path-ordering problem still remain? That is, for two paths L1 and L2 that share the same node P, training L1 first changes P; when L2 is trained afterwards, does the already-modified node affect L2's result?

Thanks!
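For reference, here is a minimal, hypothetical sketch of one FairNAS-style training step in PyTorch, illustrating the loop described above. Names such as `supernet`, `set_active_ops`, `num_layers`, and `num_choices` are assumptions for illustration, not identifiers from the official implementation.

```python
# Sketch of one supernet training step with strict-fairness path sampling,
# assuming a PyTorch-style supernet that exposes a hypothetical
# `set_active_ops(path)` to select one candidate op per layer.
import numpy as np
import torch

def train_step(supernet, optimizer, criterion, images, labels,
               num_layers, num_choices):
    optimizer.zero_grad()
    # Strict fairness: within this batch, every candidate op of every layer is
    # activated exactly once across the `num_choices` sampled single-path models.
    choice_matrix = np.stack(
        [np.random.permutation(num_choices) for _ in range(num_layers)],
        axis=1,
    )  # shape: (num_choices, num_layers); each row is one single-path model
    for path in choice_matrix:
        supernet.set_active_ops(path)        # hypothetical: pick one op per layer
        loss = criterion(supernet(images), labels)
        loss.backward()                      # gradients accumulate in .grad buffers
    # A single update after all sampled paths have back-propagated. Weights that
    # belong to only one candidate op received exactly one gradient this step,
    # while any weights shared by all paths accumulated gradients from each
    # backward pass before this one optimizer.step().
    optimizer.step()
```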
I follow your reasoning and have the same question about what the addition refers to in the algorithm. Did you figure it out?