DOC: README.md
olivierverdier authored Oct 3, 2024
1 parent c0334c8 commit 4b64908
[…]

Optimisation on diffeomorphisms using PyTorch to compute the gradient automatically.
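As a minimal illustration of that mechanism (plain PyTorch autograd, not diffeopt-specific code):

```python
import torch

# Plain autograd: evaluate a cost at a parameter and get the gradient for free.
v = torch.zeros(2, requires_grad=True)  # stand-in for a deformation parameter
cost = ((v - torch.tensor([1.0, -1.0])) ** 2).sum()
cost.backward()
print(v.grad)  # tensor([-2., 2.])
```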

The general idea is to be able to minimise expressions of the form $g \mapsto F(g \cdot x_0)$, where
- $g$ is a group element, typically a diffeomorphism,
- $x_0$ is a *template*, typically either a density or a function (i.e., an image),
- $F$ is a *cost function*,
- $g \cdot x$ is an action (or representation) of the diffeomorphism group on densities or functions, made concrete just below.
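For reference, the standard actions in this setting (not spelled out in this excerpt) are composition for functions and Jacobian-weighted composition for densities:

$$
g \cdot I = I \circ g^{-1}, \qquad g \cdot \mu = \lvert Dg^{-1} \rvert \, (\mu \circ g^{-1}).
$$

A typical direct-matching cost against a target $x_1$ is $F(x) = \lVert x - x_1 \rVert_{L^2}^2$, so that minimising $g \mapsto F(g \cdot x_0)$ deforms the template $x_0$ towards $x_1$.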

This can be used for direct or indirect matching, both with several kinds of regularisation.

[…]

```python
from diffeopt.group.ddmatch.group import DiffeoGroup
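# This excerpt assumes a template image I0, defined in a part of the README
# elided above; a hypothetical stand-in is any 2-D tensor of the right shape:
import torch
I0 = torch.rand(64, 64)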
group = DiffeoGroup(I0.shape)
```

First, prepare the network, with one layer, which keeps a group element as a parameter and computes one or several actions on images.
Here, we want to compute, for the same group element, an action on functions and one on densities:
```python
from diffeopt.group.ddmatch.representation import DensityRepresentation, FunctionRepresentation
from diffeopt.sum_representation import OrbitProblem
srep = OrbitProblem(FunctionRepresentation(group), DensityRepresentation(group))
```
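Since the text calls this a network, `srep` is presumably a `torch.nn.Module` (an assumption about diffeopt's design, not stated in this excerpt), in which case the group element it stores is reachable through the usual parameter API:

```python
# Assuming srep is a torch.nn.Module: inspect its trainable parameters
# (the stored group element) before handing them to an optimizer.
for name, p in srep.named_parameters():
    print(name, p.shape)
```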

Now we prepare an optimizer. To initialize it, we need a learning rate and a cometric, as well as the network's parameters:
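The diffeopt call itself falls in the part of the README elided below. For reference, a stock PyTorch optimizer consumes the same parameters like this; the cometric is the extra, diffeopt-specific ingredient whose API this excerpt does not show:

```python
import torch

# Generic PyTorch pattern, not diffeopt's actual optimizer:
# parameters plus a learning rate. diffeopt adds a cometric on top.
optimizer = torch.optim.SGD(srep.parameters(), lr=0.1)
```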
[…]
