Commit

First public release for NExOS
Shuvomoy committed Nov 9, 2020
0 parents commit af45d8c
Showing 16 changed files with 2,783 additions and 0 deletions.
4 changes: 4 additions & 0 deletions .gitignore
@@ -0,0 +1,4 @@
/Manifest.toml
*.jld2
numerical_experiments/sparse_regression/data_files_in_jl_format/data_sparse_regression.jl
numerical_experiments/*
12 changes: 12 additions & 0 deletions .travis.yml
@@ -0,0 +1,12 @@
# Documentation: http://docs.travis-ci.com/user/languages/julia/
language: julia
os:
- linux
julia:
- 1.0
- 1.5
notifications:
email: false

after_success:
- julia -e 'using Pkg; Pkg.add("Coverage"); using Coverage; Codecov.submit(process_folder())'
21 changes: 21 additions & 0 deletions LICENSE
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2020 Shuvomoy Das Gupta <[email protected]> and contributors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
29 changes: 29 additions & 0 deletions Project.toml
@@ -0,0 +1,29 @@
name = "NExOS"
uuid = "a0d681ee-6dde-4d9d-b128-06c773d9ceb4"
authors = ["Shuvomoy Das Gupta <[email protected]>"]
version = "0.1.0"

[deps]
JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
MosekTools = "1ec41992-ff65-5c91-ac43-2df89e9693a4"
OSQP = "ab2f91bb-94b4-55e3-9ba0-7f65df51de79"
ProximalOperators = "a725b495-10eb-56fe-b38b-717eba820537"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
TSVD = "9449cd9e-2762-5aa3-a617-5413e99d722e"

[compat]
julia = "1"
JuMP = "0.21.3"
MosekTools = "0.9.3"
OSQP = "0.6.0"
ProximalOperators = "0.11.0"
TSVD = "0.4.0"

[extras]
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["Test"]

65 changes: 65 additions & 0 deletions README.md
@@ -0,0 +1,65 @@
# ```NExOS.jl```
[![Build Status](https://travis-ci.com/Shuvomoy/NExOS.jl.svg?branch=master)](https://travis-ci.com/Shuvomoy/NExOS.jl)

``NExOS.jl`` is a `Julia` package that implements the [**N**onconvex **Ex**terior-point **O**perator **S**plitting algorithm](http://www.optimization-online.org/DB_FILE/2020/11/8099.pdf) (**NExOS**). The package is tailored for minimizing a convex cost function over a nonconvex constraint set onto which projection is single-valued near points of interest. Such sets are called *prox-regular* sets, *e.g.*, sets defined by low-rank and sparsity constraints.

``NExOS.jl`` considers nonconvex optimization problems of the following form:

```
minimize f(x)+(β/2)‖x‖^2
subject to x ∈ X,
```

where the decision variable `x` can be a vector in `ℜ^d` or a matrix in `ℜ^(m×d)` or a combination of both. The cost function `f` is convex, `β` is a positive parameter (can be arbitrarily small), and the constraint set `X` is a nonconvex prox-regular set.
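
For instance, the sparse regression problem covered in the tutorials below is one concrete instance of this template (the bound `M` below is a modeling choice that keeps the constraint set compact):

```
minimize ‖Ax-b‖^2+(β/2)‖x‖^2
subject to card(x) ≦ k, ‖x‖_∞ ≦ M,
```

where `A` and `b` are given data, `card(x) ≦ k` limits the number of nonzero entries of `x`, and `M` bounds their magnitude.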

## Installation/Usage

In the `Julia` REPL, type

```] add https://github.com/Shuvomoy/NExOS.jl```
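
Equivalently, the installation can be scripted with Julia's built-in `Pkg` API (a minimal non-interactive sketch):

```julia
# Non-interactive installation using the standard Pkg API
using Pkg
Pkg.add(PackageSpec(url = "https://github.com/Shuvomoy/NExOS.jl"))
```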

## Examples
Please see the following `Jupyter notebook` tutorials, which describe how to use `NExOS.jl`.

1. [Affine rank minimization](https://nbviewer.jupyter.org/github/Shuvomoy/NExOS.jl/blob/master/tutorials/Affine%20rank%20minimization%20using%20NExOS.jl.ipynb).
2. [Matrix completion](https://nbviewer.jupyter.org/github/Shuvomoy/NExOS.jl/blob/master/tutorials/Matrix_completion_problem_NEXOS.ipynb).
3. [Sparse regression](https://nbviewer.jupyter.org/github/Shuvomoy/NExOS.jl/blob/master/tutorials/sparse_regression_using_NExOS.ipynb).

## Acceptable functions and sets

##### Objective function `f`
`NExOS.jl` allows for any convex `f`. We can classify the acceptable functions into two types:

1. The function `f` is an unconstrained convex function with an easy-to-compute proximal operator. To compute the proximal operators for this category of functions, `NExOS.jl` uses the package [`ProximalOperators.jl`](https://github.com/kul-forbes/ProximalOperators.jl); the functions available for this type are listed at this [link](https://kul-forbes.github.io/ProximalOperators.jl/stable/functions/). A small usage sketch appears after this list.

2. The function `f` is a constrained convex function (*i.e.*, a convex function over some convex constraint set). For such a function, the proximal operator usually has no closed-form solution, so `NExOS` computes it by solving a convex optimization problem using [`JuMP`](https://github.com/jump-dev/JuMP.jl) and any of the solvers supported by it (listed [here](https://jump.dev/JuMP.jl/stable/installation/#Getting-Solvers-1)). To learn more about this proximal operator computation process, please see [this blog post](https://shuvomoy.github.io/blog/programming/2020/09/08/proximal_operator_over_matrix.html) I wrote.
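
For functions of the first type, the proximal operator evaluation looks like the following minimal sketch (the least-squares loss is just one example from `ProximalOperators.jl`; the data here are made up for illustration):

```julia
# A type-1 objective: a convex loss whose proximal operator is cheap to
# evaluate, computed here via ProximalOperators.jl.
using ProximalOperators

A, b = randn(10, 5), randn(10)  # illustrative data
f = LeastSquares(A, b)          # f(x) = (1/2)‖Ax - b‖²
x0 = zeros(5)
γ = 1.0                         # proximal parameter
y, fy = prox(f, x0, γ)          # y = prox_{γf}(x0), fy = f(y)
```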

##### Constraint set `X`
The constraint set `X` is nonconvex, but it can be decomposed into the intersection of a compact convex set `C` and a nonconvex prox-regular set `N`, *i.e.*, `X = C ⋂ N`. Examples of `N` include:

1. **Sparse set.** `N = {x ∈ ℜ^d ∣ card(x) ≦ k}`, where `card(x)` denotes the number of nonzero components in `x` (the projection onto this set is sketched after this list),
2. **Low-rank set.** `N = { X ∈ ℜ^(m×d) ∣ rank(X) ≦ r}`, where `rank(X)` denotes the rank of the matrix `X`,
3. **Combination of low-rank and sparse set.** `N = {X ∈ ℜ^(m×d), x ∈ ℜ^d ∣ card(x) ≦ k, rank(X) ≦ r}`,
4. **Any other prox-regular set.** `N` can be any other prox-regular set, *e.g.,* a weakly convex set, a proximally smooth set, *etc.*
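
To make the sparse set concrete, its projection simply keeps the `k` largest-magnitude entries of a vector and zeroes out the rest. A minimal hard-thresholding sketch is below (`project_sparse` is an illustrative helper, not part of the `NExOS.jl` API):

```julia
# Projection onto {x ∈ ℜ^d : card(x) ≦ k}: keep the k largest-magnitude
# entries and zero out the rest. Illustrative helper, not a NExOS.jl export.
function project_sparse(x::AbstractVector, k::Integer)
    y = zero(x)
    idx = partialsortperm(abs.(x), 1:k, rev = true)  # indices of the k largest |xᵢ|
    y[idx] = x[idx]
    return y
end

project_sparse([0.1, -2.0, 0.5, 3.0], 2)  # returns [0.0, -2.0, 0.0, 3.0]
```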


## Citing
If you find `NExOS.jl` useful in your project, we kindly request that you cite the following paper:
```
@article{NExOS,
title={Exterior-point Operator Splitting for Nonconvex Learning},
author={Das Gupta, Shuvomoy and Stellato, Bartolomeo and Van Parys, Bart P.G.},
journal={Optimization Online Preprint},
note={\url{http://www.optimization-online.org/DB_FILE/2020/11/8099.pdf}},
year={2020}
}
```
A preprint can be downloaded [here](http://www.optimization-online.org/DB_HTML/2020/11/8099.html).

## Reporting issues
Please report any issues via the [GitHub issue tracker](https://github.com/Shuvomoy/NExOS.jl/issues). All types of issues are welcome, including bug reports, documentation typos, feature requests, and so on.

## Contact
Send an email :email: to [Shuvomoy Das Gupta](mailto:[email protected]) :rocket:!


88 changes: 88 additions & 0 deletions src/NExOS.jl
@@ -0,0 +1,88 @@
module NExOS

# First, we include the type file

include("./types.jl")

# Export the types and functions for external use

export ProxRegularSet, Problem, Setting, State, InitInfo, SparseSet, RankSet, LeastSquaresOverMatrix, SquaredLossMatrixCompletion, affine_operator_to_matrix

# Next, we include the utils file

include("./utils.jl")

export Π_exact, update_state!, inner_iteration, prox_NExOS, Π_NExOS, update_init_info!, update_init_info_experimental!, prox!

# Next, we include the file that solves the factor analysis problem; this is a special file, as it uses a somewhat specialized implementation

include("./factor_analysis.jl")

export ProblemFactorAnalysisModel, StateFactorAnalysisModel, InitInfoFactorAnalysisModel, update_state_fam!, inner_iteration_fam, prox_NExOS_fam



# The main solver function: runs NExOS on a general Problem instance

function solve!(problem::Problem, setting::Setting)

# create the initial state and the initial information
state = State(problem, setting)
init_info = InitInfo(problem, setting)

# the state now enters the outer iterations: we keep updating it until the termination condition (μ < μ_min) is met
while state.μ >= setting.μ_min
# run the outer iteration update procedure
state = update_state!(state, init_info, problem, setting)
# init_info = update_init_info!(state, init_info, problem, setting )
# experimental version: uncomment the previous line after you are done experimenting
init_info = update_init_info_experimental!(state, init_info, problem, setting )
end

if setting.verbose
@info "information about the best state found for smallest μ = $(state.μ)"
@info "μ = $(state.μ) | log fixed point gap = $(log10(state.fxd_pnt_gap)) | log feasibility gap = $(log10(state.fsblt_gap)) | inner iterations = $(state.i)"
end


return state
end

# Dedicated solver for the factor analysis problem
function solve!(problem::ProblemFactorAnalysisModel, setting::Setting)

# create the initial state
state = StateFactorAnalysisModel(problem, setting) # create the initial state; note that we could run a proximal evaluation here and use it to warm start later
init_info = InitInfoFactorAnalysisModel(problem, setting) # create the initial information

# create the optimization problem to compute the proximal operator

# the state now enters the outer iterations: we keep updating it until the termination condition (μ < μ_min) is met
while state.μ >= setting.μ_min
# run the outer iteration update procedure
state = update_state_fam!(state, init_info, problem, setting)
# init_info = update_init_info!(state, init_info, problem, setting )
# experimental version: uncomment the previous line after you are done experimenting
init_info = update_init_info_experimental_fam!(state, init_info, problem, setting )
end

if setting.verbose
@info "information about the best state found for smallest μ = $(state.μ)"
@info "μ = $(state.μ) | log fixed point gap = $(log10(state.fxd_pnt_gap)) | log feasibility gap = $(log10(state.fsblt_gap)) | inner iterations = $(state.i)"
end


return state
end


export solve!


end # module

2 comments on commit af45d8c

@Shuvomoy
Owner Author


@JuliaRegistrator


Registration pull request updated: JuliaRegistries/General/24336

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

git tag -a v0.1.0 -m "<description of version>" af45d8ca5d4a4b7643f3058a7ebe10d54fd1078d
git push origin v0.1.0
