What you are seeing is just the effect of not fixing a random seed. The weights are initialized randomly on each run of the example, so you get different gradients on each run.
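If you want the gradients to come out the same every time, you can fix the global ND4J random seed before the graph and its variables are created. A minimal sketch follows; the seed value and class name are arbitrary placeholders, not part of the linked example:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class FixedSeedSketch {
    public static void main(String[] args) {
        // Fix the global ND4J RNG seed before any variables are created,
        // so the random weight initialization (and therefore the gradients)
        // is identical across runs.
        Nd4j.getRandom().setSeed(12345);

        // Sanity check: with the seed fixed, this array is the same on every run.
        INDArray weights = Nd4j.rand(3, 2);
        System.out.println(weights);
    }
}
```

In the linked example you would call `Nd4j.getRandom().setSeed(...)` at the top of `main`, before the SameDiff graph is built.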
Issue Description
In this example, https://github.com/eclipse/deeplearning4j-examples/blob/master/samediff-examples/src/main/java/org/nd4j/examples/samediff/quickstart/basics/Ex2_LinearRegression.java, why do gradMap.get("weights") and gradMap.get("bias") return different values each time I run it?
Version Information