Our solver currently solves the optimization problem $$\min_{A} \frac{1}{2}\|K(t) - (I + A)K(0)(I + A^\top)\|^2 + \frac{\lambda}{2} \|A\|^2, \quad\mathrm{s.t.}~A_{i,j} = 0, \forall(i, j) \in \mathcal{I}$$
where $\mathcal{I} = \{(i, j) \mid K(t)^{-1}_{i,j} = 0~\text{and}~K(0)^{-1}_{i,j} = 0\}$.
Since $K(t)$ and $K(0)$ are variance-covariance matrices, the mask $\mathcal{I}$ is symmetric. However, in the application we are interested in, we want either $A_{i, j} \ne 0$ or $A_{j, i} \ne 0$, but not both. Therefore we need additional penalization on top of the current mask.
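For concreteness, the current problem can be sketched in NumPy as below. This is an illustrative sketch, not the solver's actual code; the function names `objective` and `zero_mask` and the tolerance `tol` are assumptions.

```python
import numpy as np

def objective(A, K0, Kt, lam):
    # 0.5 * ||K(t) - (I + A) K(0) (I + A)^T||_F^2 + 0.5 * lam * ||A||_F^2
    n = K0.shape[0]
    M = np.eye(n) + A
    resid = Kt - M @ K0 @ M.T
    return 0.5 * np.sum(resid ** 2) + 0.5 * lam * np.sum(A ** 2)

def zero_mask(K0, Kt, tol=1e-10):
    # Mask I = {(i, j) : K(t)^{-1}_{ij} = 0 and K(0)^{-1}_{ij} = 0},
    # detected numerically with a small tolerance.
    P0 = np.linalg.inv(K0)
    Pt = np.linalg.inv(Kt)
    return (np.abs(P0) < tol) & (np.abs(Pt) < tol)
```

Since `K0` and `Kt` are symmetric, their inverses are symmetric too, which is why the mask returned here is symmetric.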
Potential solution:
Here we propose to use the trimmed $\ell_1$ regularizer, following the work at http://proceedings.mlr.press/v97/yun19a/yun19a.pdf
We change the objective to $$\min_{A,w} \frac{1}{2}\|K(t) - (I + A)K(0)(I + A^\top)\|^2 + \frac{\lambda}{2} \|A\|^2 + \tau\sum_{(i, j) \in \mathcal{\bar I}_{--}} (w_{i,j} |A_{i, j}| + (1 - w_{i, j})|A_{j, i}|)$$
and we have the constraints $$A_{i,j} = 0, \forall(i, j) \in \mathcal{I},~\text{and}~w_{i, j}\in[0,1], \forall (i, j)\in\mathcal{\bar I}_{--}$$
where $\mathcal{\bar I}_{--} = \{(i, j)|i < j, K(t)^{-1}_{i,j} \ne 0~\mathrm{or}~K(0)^{-1}_{i,j} \ne 0\}$.
In this case, we hope each $w_{i,j}$ will converge to $0$ or $1$, so that we penalize only one of $A_{i,j}$ and $A_{j,i}$ with the $\ell_1$ norm and drive it to zero.
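Note that for fixed $A$, the inner minimization over $w_{i,j} \in [0,1]$ is linear in $w_{i,j}$, so the optimum sits at an endpoint: $w_{i,j} = 1$ when $|A_{i,j}| \le |A_{j,i}|$ and $w_{i,j} = 0$ otherwise, and the penalty reduces to $\tau \sum_{(i,j) \in \mathcal{\bar I}_{--}} \min(|A_{i,j}|, |A_{j,i}|)$. A minimal sketch of this reduced penalty (the function name `trimmed_l1` is illustrative):

```python
import numpy as np

def trimmed_l1(A, pairs, tau):
    # For fixed A, minimizing w |A_ij| + (1 - w) |A_ji| over w in [0, 1]
    # picks the smaller of the two magnitudes, so the penalty is
    # tau * sum over (i, j) of min(|A_ij|, |A_ji|).
    total = 0.0
    for i, j in pairs:
        total += min(abs(A[i, j]), abs(A[j, i]))
    return tau * total
```

This is why the trimmed penalty encourages the asymmetric sparsity pattern: it always shrinks the smaller of the pair $(A_{i,j}, A_{j,i})$ toward zero while leaving the larger one unpenalized.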