Following a report on Stack Exchange, there seems to be a bug in `qml.qinfo.classical_fisher` in combination with `np.linalg.norm`. Here is a minimal non-working example:
(Everything is identical except for how `x` is normalized. Just computing the `probs` yields the same result, hence the suspicion that the bug arises in `classical_fisher`. I had a glance at the source code and don't see any obvious reason for the behavior.)
I'm not sure what exactly is up, but I do see HIPS/autograd#579, and PennyLane is pinned to an old version of autograd. I'm not sure whether `autograd.numpy.linalg.norm` claims to be differentiable, but it seems likely that it is not.
The norm of `coeff` is 1.0, so I get the same Jacobian if I remove the normalization step. However, if I use `np.linalg.norm` instead, I get a different Jacobian:
I've opened a bug report on the autograd repo. In the meantime, I would suggest using a different interface or normalizing manually, as done in the second QNode in your example.
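The manual-normalization workaround suggested above can be sketched in plain NumPy; the vector `coeff` below is a placeholder value, not the reporter's actual data:

```python
import numpy as np

coeff = np.array([0.3, 1.2, -0.7])

# Manual normalization: for a real 1-D vector this is numerically
# equivalent to dividing by np.linalg.norm(coeff), but it is built
# from primitives whose gradients autograd handles reliably.
coeff_manual = coeff / np.sqrt(np.sum(coeff ** 2))

# Sanity check: both normalizations produce the same unit vector.
coeff_builtin = coeff / np.linalg.norm(coeff)
print(np.allclose(coeff_manual, coeff_builtin))  # True
```

Inside a QNode, replacing the `np.linalg.norm` call with the `np.sqrt(np.sum(...))` form sidesteps the suspect norm gradient without changing the circuit's output.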