
[bug] classical fisher does not work with np.linalg.norm #4560

Closed
Qottmann opened this issue Aug 31, 2023 · 2 comments
Labels
bug 🐛 Something isn't working

Comments

Qottmann (Contributor) commented Aug 31, 2023

Following a report on Stack Exchange, there seems to be a bug in qml.qinfo.classical_fisher in combination with np.linalg.norm. Here is a minimal non-working example:

(Everything is identical except for how x is normalized. Computing just the probabilities yields the same result in both cases, hence the suspicion that the bug arises inside classical_fisher. I had a glance at the source code and don't see any obvious reason for the behavior.)

import pennylane as qml
from pennylane import numpy as np

dev_with_norm = qml.device("default.qubit", wires=1)

@qml.qinfo.classical_fisher
@qml.qnode(dev_with_norm)
def qnode0(theta, gamma=0):
    coeff = np.array([((1 + np.exp(-1.j * theta)) / 2) * np.sqrt(1 - gamma), (1 - np.exp(-1.j * theta)) / 2]) / (1 - gamma * np.cos(theta) ** 2)
    x = coeff/np.linalg.norm(coeff)
    qml.QubitStateVector(x, wires=range(1))
    return qml.probs(wires=range(1))

@qml.qinfo.classical_fisher
@qml.qnode(dev_with_norm)
def qnode1(theta, gamma=0):
    coeff = np.array([((1 + np.exp(-1.j * theta)) / 2) * np.sqrt(1 - gamma), (1 - np.exp(-1.j * theta)) / 2]) / (1 - gamma * np.cos(theta) ** 2)
    x = coeff/np.sqrt(np.sum(np.abs(coeff)**2))
    qml.QubitStateVector(x, wires=range(1))
    return qml.probs(wires=range(1))

x = np.array(10., requires_grad=True)
qnode0(x), qnode1(x)
(tensor([[1.83346903]], requires_grad=True),
 tensor([[1.]], requires_grad=True))
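As a sanity check independent of PennyLane and autograd: for gamma = 0 the normalized state has probabilities [(1 + cos θ)/2, (1 − cos θ)/2], whose classical Fisher information is identically 1. A minimal finite-difference sketch in plain NumPy (my own check, not part of the original report) agrees with qnode1 rather than qnode0:

```python
import numpy as np

def probs(theta, gamma=0.0):
    # Same state preparation as the QNodes above, evaluated in plain NumPy
    coeff = np.array([((1 + np.exp(-1j * theta)) / 2) * np.sqrt(1 - gamma),
                      (1 - np.exp(-1j * theta)) / 2]) / (1 - gamma * np.cos(theta) ** 2)
    x = coeff / np.sqrt(np.sum(np.abs(coeff) ** 2))
    return np.abs(x) ** 2

theta, eps = 10.0, 1e-6
dp = (probs(theta + eps) - probs(theta - eps)) / (2 * eps)  # central difference
cfi = np.sum(dp ** 2 / probs(theta))  # classical Fisher information
print(cfi)  # ≈ 1.0, in agreement with qnode1
```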
@github-actions github-actions bot added the bug 🐛 Something isn't working label Aug 31, 2023
timmysilv (Contributor) commented:

Not sure what exactly is going on, but I do see HIPS/autograd#579, and PennyLane is pinned to an old version of autograd. I'm not sure whether autograd.numpy.linalg.norm claims to be differentiable, but it seems likely that it is not.

mudit2812 (Contributor) commented:

This is an autograd bug rather than a PennyLane bug. Consider the following:

import autograd
import autograd.numpy as np

def get_sv(theta, gamma=0):
    coeff = np.array([((1 + np.exp(-1.j * theta)) / 2) * np.sqrt(1 - gamma), (1 - np.exp(-1.j * theta)) / 2]) / (1 - gamma * np.cos(theta) ** 2)
    x = coeff / np.sqrt(np.sum(np.real(coeff) ** 2 + np.imag(coeff) ** 2))
    res = np.real(x) ** 2 + np.imag(x) ** 2
    return res

x = np.array(10.)
autograd.jacobian(get_sv, 0)(10.0, 0.0)
array([ 0.27201056, -0.27201056])

The norm of coeff is 1.0, so I get the same Jacobian if I remove the normalization step entirely. However, if I use np.linalg.norm instead, I get a different Jacobian:

def get_sv(theta, gamma=0):
    coeff = np.array([((1 + np.exp(-1.j * theta)) / 2) * np.sqrt(1 - gamma), (1 - np.exp(-1.j * theta)) / 2]) / (1 - gamma * np.cos(theta) ** 2)
    x = coeff / np.linalg.norm(coeff)
    res = np.real(x) ** 2 + np.imag(x) ** 2
    return res

x = np.array(10.)
autograd.jacobian(get_sv, 0)(10.0, 0.0)
array([0.34547   , 0.56747525])
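A central finite difference in plain NumPy (no autodiff involved) reproduces the first Jacobian, which supports the conclusion that the manual normalization gives the correct result and np.linalg.norm is being differentiated incorrectly:

```python
import numpy as np

def get_probs(theta, gamma=0.0):
    # Same computation as get_sv above, in plain NumPy
    coeff = np.array([((1 + np.exp(-1j * theta)) / 2) * np.sqrt(1 - gamma),
                      (1 - np.exp(-1j * theta)) / 2]) / (1 - gamma * np.cos(theta) ** 2)
    x = coeff / np.linalg.norm(coeff)  # safe here: no gradient is taken through it
    return np.real(x) ** 2 + np.imag(x) ** 2

eps = 1e-6
jac = (get_probs(10.0 + eps) - get_probs(10.0 - eps)) / (2 * eps)
print(jac)  # ≈ [ 0.27201056 -0.27201056]
```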

I've opened a bug report on the autograd repo. In the meantime, I would suggest using a different interface or normalizing manually, as done in the second QNode in your example.
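The manual normalization can be factored into a small helper; a minimal sketch, assuming only np.linalg.norm is affected (the helper name complex_norm is my own, and inside a QNode one would use pennylane.numpy rather than plain NumPy):

```python
import numpy as np  # inside a QNode, use `from pennylane import numpy as np`

def complex_norm(c):
    # Builds the 2-norm from real and imaginary parts explicitly, avoiding
    # np.linalg.norm, which old autograd versions differentiate incorrectly
    # for complex input
    return np.sqrt(np.sum(np.real(c) ** 2 + np.imag(c) ** 2))

c = np.array([1 + 2j, 3 - 4j])
print(complex_norm(c))  # sqrt(30), same value as np.linalg.norm(c)
```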
