diff --git a/.gitignore b/.gitignore new file mode 100644 index 0000000..124fcf3 --- /dev/null +++ b/.gitignore @@ -0,0 +1,4 @@ +/Manifest.toml +*.jld2 +numerical_experiments/sparse_regression/data_files_in_jl_format/data_sparse_regression.jl +numerical_experiments/* diff --git a/.travis.yml b/.travis.yml new file mode 100644 index 0000000..d554036 --- /dev/null +++ b/.travis.yml @@ -0,0 +1,12 @@ +# Documentation: http://docs.travis-ci.com/user/languages/julia/ +language: julia +os: + - linux +julia: + - 1.0 + - 1.5 +notifications: + email: false + +after_success: + - julia -e 'using Pkg; Pkg.add("Coverage"); using Coverage; Codecov.submit(process_folder())' \ No newline at end of file diff --git a/LICENSE b/LICENSE new file mode 100644 index 0000000..f2d7c1a --- /dev/null +++ b/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) 2020 Shuvomoy Das Gupta and contributors + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/Project.toml b/Project.toml new file mode 100644 index 0000000..21a5a9f --- /dev/null +++ b/Project.toml @@ -0,0 +1,29 @@ +name = "NExOS" +uuid = "a0d681ee-6dde-4d9d-b128-06c773d9ceb4" +authors = ["Shuvomoy Das Gupta "] +version = "0.1.0" + +[deps] +JuMP = "4076af6c-e467-56ae-b986-b466b2749572" +LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e" +MosekTools = "1ec41992-ff65-5c91-ac43-2df89e9693a4" +OSQP = "ab2f91bb-94b4-55e3-9ba0-7f65df51de79" +ProximalOperators = "a725b495-10eb-56fe-b38b-717eba820537" +Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c" +SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf" +TSVD = "9449cd9e-2762-5aa3-a617-5413e99d722e" + +[compat] +julia = "1" +JuMP = "0.21.3" +MosekTools = "0.9.3" +OSQP = "0.6.0" +ProximalOperators = "0.11.0" +TSVD = "0.4.0" + +[extras] +Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40" + +[targets] +test = ["Test"] + diff --git a/README.md b/README.md new file mode 100644 index 0000000..c01107a --- /dev/null +++ b/README.md @@ -0,0 +1,65 @@ +# ```NExOS.jl``` +[![Build Status](https://travis-ci.com/Shuvomoy/NExOS.jl.svg?branch=master)](https://travis-ci.com/Shuvomoy/NExOS.jl) + +``NExOS.jl`` is a `Julia` package that implements [**N**onconvex **Ex**terior-point **O**perator **S**plitting algorithm](http://www.optimization-online.org/DB_FILE/2020/11/8099.pdf) (**NExOS**). The package is tailored for minimizing a convex cost function over a nonconvex constraint set, where projection onto the constraint set is single-valued around points of interest. 
These types of sets are called *prox-regular* sets, *e.g.*, sets containing low-rank and sparsity constraints. + +``NExOS.jl`` considers nonconvex optimization problems of the following form: + +``` +minimize f(x)+(β/2)‖x‖^2 +subject to x ∈ X, +``` + +where the decision variable `x` can be a vector in `ℜ^d` or a matrix in `ℜ^(m×d)` or a combination of both. The cost function `f` is convex, `β` is a positive parameter (can be arbitrarily small), and the constraint set `X` is a nonconvex prox-regular set. + +## Installation/Usage + +In `Julia REPL`, type + +```] add https://github.com/Shuvomoy/NExOS.jl``` + +## Examples +Please see the following `jupyter notebook` tutorials that describe how to use `NExOS.jl`. + +1. [Affine rank minimization](https://nbviewer.jupyter.org/github/Shuvomoy/NExOS.jl/blob/master/tutorials/Affine%20rank%20minimization%20using%20NExOS.jl.ipynb). +2. [Matrix completion](https://nbviewer.jupyter.org/github/Shuvomoy/NExOS.jl/blob/master/tutorials/Matrix_completion_problem_NEXOS.ipynb). +3. [Sparse regression](https://nbviewer.jupyter.org/github/Shuvomoy/NExOS.jl/blob/master/tutorials/sparse_regression_using_NExOS.ipynb). + +## Acceptable functions and sets + +##### Objective function `f` +`NExOS.jl` allows for any `f` that is convex. We can classify them into two types: + +1. The function `f` is an unconstrained convex function with an easy-to-compute proximal operator. To compute the proximal operators for this category of functions, `NExOS.jl` uses the package [`ProximalOperators.jl`](https://github.com/kul-forbes/ProximalOperators.jl). The list of available functions for this type is available at this [link](https://kul-forbes.github.io/ProximalOperators.jl/stable/functions/). + +2. The function `f` is a constrained convex function (*i.e.*, a convex function over some convex constraint set). For such a function, no closed form solution usually exists, and in this situation `NExOS` computes the proximal operator of `f` by solving a convex optimization problem using [`JuMP`](https://github.com/jump-dev/JuMP.jl) and any of the solvers supported by it (listed [here](https://jump.dev/JuMP.jl/stable/installation/#Getting-Solvers-1)). To know more about this proximal operator computation process, please see [this blog post](https://shuvomoy.github.io/blog/programming/2020/09/08/proximal_operator_over_matrix.html) I wrote. + +##### Constraint set `X` +The constraint set `X` is nonconvex, but it can be decomposed into a convex compact set `C` and a nonconvex prox-regular set `N`, *i.e.*, `X = C ⋂ N`. For example: + +1. **Sparse set.** `N = {x ∈ ℜ^d ∣ card(x) ≦ k}`, where `card(x)` denotes the number of nonzero components in `x`, +2. **Low-rank set.** `N = { X ∈ ℜ^(m×d) ∣ rank(X) ≦ r}`, where `rank(X)` denotes the rank of the matrix `X`, +3. **Combination of low-rank and sparse set.** `N = {X ∈ ℜ^(m×d), x ∈ ℜ^d ∣ card(x) ≦ k, rank(X) ≦ r}`, +4. 
**Any other prox-regular set.** `N` can be any other prox-regular sets, *e.g.,* weakly convex sets, proximally smooth sets, *etc.* + + +## Citing +If you find `NExOS.jl` useful in your project, we kindly request that you cite the following paper: +``` +@article{NExOS, + title={Exterior-point Operator Splitting for Nonconvex Learning}, + author={Das Gupta, Shuvomoy and Stellato, Bartolomeo and Van Parys, Bart P.G.}, + journal={Optimization Online Preprint}, + note={\url{http://www.optimization-online.org/DB_FILE/2020/11/8099.pdf}}, + year={2020} +} +``` +A preprint can be downloaded [here](http://www.optimization-online.org/DB_HTML/2020/11/8099.html). + +## Reporting issues +Please report any issues via the [Github issue tracker](https://github.com/Shuvomoy/NExOS.jl/issues). All types of issues are welcome including bug reports, documentation typos, feature requests and so on. + +## Contact +Send an email :email: to [Shuvomoy Das Gupta](mailto:sdgupta@mit.edu) :rocket:! + + diff --git a/src/NExOS.jl b/src/NExOS.jl new file mode 100644 index 0000000..f4fcae0 --- /dev/null +++ b/src/NExOS.jl @@ -0,0 +1,88 @@ +module NExOS + +# First, we include the type file + +include("./types.jl") + +# Export all the types, and functions for usage + +export ProxRegularSet, Problem, Setting, State, InitInfo, SparseSet, RankSet, LeastSquaresOverMatrix, SquaredLossMatrixCompletion, affine_operator_to_matrix + +# Next, we include the utils file + +include("./utils.jl") + +export Π_exact, update_state!, inner_iteration, prox_NExOS, Π_NExOS, update_init_info!, update_init_info_experimental!, prox! + +# Next, we include the file that solves factor analysis problem, this a special file, as it is using somewhat specialized implementation + +include("./factor_analysis.jl") + +export ProblemFactorAnalysisModel, StateFactorAnalysisModel, InitInfoFactorAnalysisModel, update_state_fam!, inner_iteration_fam, prox_NExOS_fam + + + +# Export all the types, and functions for usage from the utils + +# the main solver function + +# export the solver function + +# Final solver that does everything + +function solve!(problem::Problem, setting::Setting) + + # create the initial state + state = State(problem, setting) # create the initial information + init_info = InitInfo(problem, setting) # create intial information + + # now this first state goes into the iteration_outer!(state, problem, setting) and we keep running it until our termination condtion has been met + while state.μ >= setting.μ_min + # run the outer iteration update procedure + state = update_state!(state, init_info, problem, setting) + # init_info = update_init_info!(state, init_info, problem, setting ) + # experimental version: uncomment the previous line after you are done experimenting + init_info = update_init_info_experimental!(state, init_info, problem, setting ) + end + + if setting.verbose == true + @info "information about the best state found for smallest μ = $(state.μ)" + @info "μ = $(state.μ) | log fixed point gap = $(log10(state.fxd_pnt_gap)) | log feasibility gap = $(log10(state.fsblt_gap)) | inner iterations = $(state.i)" + end + + + return state +end + +# Dedicated solver for factor analysis problem +function solve!(problem::ProblemFactorAnalysisModel, setting::Setting) + + # create the initial state + state = StateFactorAnalysisModel(problem, setting) # create the initial state, keep in mind actually we can run a proximal evaluation now that we can use to warm start later + init_info = InitInfoFactorAnalysisModel(problem, setting) # create intial 
information + + #create the optimization problem to compute the proximal operator + + # now this first state goes into the iteration_outer!(state, problem, setting) and we keep running it until our termination condtion has been met + while state.μ >= setting.μ_min + # run the outer iteration update procedure + state = update_state_fam!(state, init_info, problem, setting) + # init_info = update_init_info!(state, init_info, problem, setting ) + # experimental version: uncomment the previous line after you are done experimenting + init_info = update_init_info_experimental_fam!(state, init_info, problem, setting ) + end + + if setting.verbose == true + @info "information about the best state found for smallest μ = $(state.μ)" + @info "μ = $(state.μ) | log fixed point gap = $(log10(state.fxd_pnt_gap)) | log feasibility gap = $(log10(state.fsblt_gap)) | inner iterations = $(state.i)" + end + + + return state +end + + +export solve! + + +end # module diff --git a/src/factor_analysis.jl b/src/factor_analysis.jl new file mode 100644 index 0000000..e22a556 --- /dev/null +++ b/src/factor_analysis.jl @@ -0,0 +1,293 @@ +# Code for factor analysis model. We write the code for it seperately because it has a very special structure. + +## Define the one shot function FactorAnalysisModel + +struct ProblemFactorAnalysisModel #{A <: AbstractMatrix{ <: Real }, I <: Integer, R <: Real, V <: AbstractVector{ <:Real }} + Σ #::A # This is part of the data, which is a positive semidefinite matrix + r #::I # represents rank of the decision variable matrix X + M #::R # upper bound on the operator-2 norm of decision variable matrix X + Z0 #::A # one initial state provided by the user (for the matrix part) + z0 #::V # one initial state provided by the user (the vector part) +end + +## This structure contains all the necessary information to describe a state + +mutable struct StateFactorAnalysisModel #{R <: Real, A <: AbstractMatrix{ <: Real }, V <: AbstractVector{ <: Real}, I <: Integer} # Done + X #::A # the first iterate matrix part + x #::V # the first iterate vector part + Y #::A # the second iterate matrix part + y #::V # the second iterate vector part + Z #::A # the third iterate matrix part + z #::V # the third iterate vector part + i #::I # iteration number + fxd_pnt_gap #::R # fixed point gap for state + fsblt_gap #::R # how feasible the current point is, this is measured by || x- Π_C(x) || + μ #::R # current value of μ + γ #::R # current value of γ +end + +## this function constructs the very first state for the problem and setting for the factor analysis model +function StateFactorAnalysisModel(problem::ProblemFactorAnalysisModel, setting::Setting) + + # this function constructs the very first state for the problem and setting + Z0 = copy(problem.Z0) + z0 = copy(problem.z0) + X0 = zero(Z0) + x0 = zero(z0) + Y0 = zero(Z0) + y0 = zero(z0) + i = copy(setting.n_iter_max) # iteration number where best fixed point gap is reached + fxd_pnt_gap = copy(setting.big_M) + fsblt_gap = copy(setting.big_M) + μ = copy(setting.μ_max) + γ = γ_from_μ(μ, setting) + + return StateFactorAnalysisModel(X0, x0, Y0, y0, Z0, z0, i, fxd_pnt_gap, fsblt_gap, μ, γ) + +end + +## structure that describes the information to intialize our algorithm: + +mutable struct InitInfoFactorAnalysisModel #{ A <: AbstractMatrix{ <: Real }, V <: AbstractVector{ <: Real}, R <: Real} + + # this will have Z0, z0, γ, μ + Z0 #::A # initial condition (matrix part) to start an inner iteration + z0 #::V # initial condition (vector part) to start an inner 
iteration + μ #::R # μ required start an inner iteration + γ #::R # γ required to start an inner iteration + +end + +## Of course, we will need a constructor for the InitInfoFactorAnalysisModel type, given the user input. The user inputs are problem::problem, setting::setting. + +function InitInfoFactorAnalysisModel(problem::ProblemFactorAnalysisModel, setting::Setting) + + ## constructs the initial NExOS information + Z0 = copy(problem.Z0) + z0 = copy(problem.z0) # need to understand if copy is necessary + μ = copy(setting.μ_max) + γ = γ_from_μ(μ, setting) + + return InitInfoFactorAnalysisModel(Z0, z0, μ, γ) + +end + + + +## update the state for factor analysis model + +function update_state_fam!(state::StateFactorAnalysisModel, init_info::InitInfoFactorAnalysisModel, problem::ProblemFactorAnalysisModel, setting::Setting) + + # extract information from the state and init info + X = state.X + x = state.x + Y = state.Y + y = state.y + Z = init_info.Z0 + z = init_info.z0 # TODO: Need to write the InitInfoFactorAnalysisModel + β = setting.β + γ = init_info.γ + μ = init_info.μ + + # create best point variables + best_point_X = X + best_point_x = x + best_point_Y = y + best_point_y = y + best_point_Z = Z + best_point_z = z + best_fxd_pnt_gap = setting.big_M + best_fsblt_gap = setting.big_M + + i = 1 + while i <= setting.n_iter_max + X, x, Y, y, Z, z = inner_iteration_fam(X, x, Y, y, Z, z, β, γ, μ, problem) # Done: Need to write inner iteration factor analysis model: inner_iteration_fam + crnt_fxd_pnt_gap = norm(x-y, Inf)+norm(X-Y, Inf) + Y_fsbl = Π_exact(RankSet(problem.M, problem.r), Y) + crnt_fsblt_gap = norm(Y-Y_fsbl, Inf) # the vector part y is not changed in the second step of the factor analysis inner iteration, so we do not need norm(y-y_fsbl, Inf) part + + # update the best points so far if we have lower objective value + if crnt_fxd_pnt_gap <= best_fxd_pnt_gap + best_point_X = X + best_point_x = x + best_point_Y = Y + best_point_y = y + best_point_Z = Z + best_point_z = z + best_fxd_pnt_gap = crnt_fxd_pnt_gap + best_fsblt_gap = crnt_fsblt_gap + end # if + + # display important information + if setting.verbose == true + if mod(i, setting.freq) == 0 + @info "μ = $μ | iteration = $i | log fixed point gap = $(log10(crnt_fxd_pnt_gap)) | log feasibility gap = $(log10(crnt_fsblt_gap))" + end # if + end # if + + # termination condition check + if crnt_fxd_pnt_gap <= setting.tol + if setting.verbose == true + @info "μ = $μ | log fixed point gap reached $(log10(setting.tol))" + end + break + end #if + + # increment the count by 1 + i = i+1 + + end # while + + ## update the state + state.X = best_point_X + state.x = best_point_x + state.Y = best_point_Y + state.y = best_point_y + state.Z = best_point_Z + state.z = best_point_z + state.i = i + state.fxd_pnt_gap = best_fxd_pnt_gap + state.fsblt_gap = best_fsblt_gap + state.μ = μ + state.γ = γ + + # Print information about the best point + if setting.verbose == true + @info "information about the best state found for μ = $(state.μ)" + @info "μ = $(state.μ) | log fixed point gap = $(log10(state.fxd_pnt_gap)) | log feasibility gap = $(log10(state.fsblt_gap)) | inner iterations = $(state.i)" + end + + return state +end + +# inner iteration function for FactorAnalysisModel +function inner_iteration_fam(X, x, Y, y, Z, z, β, γ, μ, problem::ProblemFactorAnalysisModel) + #(X::A, x::V, Y::A, y::V, Z::A, z::V, β::R, γ::R, μ::R, problem::ProblemFactorAnalysisModel) where { A <: AbstractMatrix{ <: Real }, V <: AbstractVector{ <: Real}, R <: Real} + # old 
iteration: x = prox_NExOS(problem.f, β, γ, z) + X, x = prox_NExOS_fam(problem.Σ, problem.M, γ, Z, z) + Y = Π_NExOS(RankSet(problem.M, problem.r), β, γ, μ, 2*X - Z) + y = max.(2*x - z,0) + Z = Z + Y - X + z = z + y - x + # println("size of the new matrices = ", size(Z), size(z)) + return X, vec(x), Y, vec(y), Z, vec(z) +end + +# Proximal operator funciton evaluation using Convex, this is much slower than using JuMP, so I am commenting it for now. + +# function prox_NExOS_fam(Σ, M, γ, X, d) #(Σ::A, M::R, γ::R, X::A, d::V) where {R <: Real, A <: AbstractMatrix{R}, V <: AbstractVector{R}} # For now M is not used, may use it in a future version +# +# # This functions takes the input data Σ, γ, X, d and evaluates +# # the proximal operator of the function f at the point (X,D) +# +# # Data extraction +# # --------------- +# n = length(d) # dimension of the problem +# # println("*****************************") +# # println(size(d)) +# # println("the value of d is = ", d) +# # println("the type of d is", typeof(d)) +# D = LinearAlgebra.diagm(d) # creates the diagonal matrix D that embed +# +# # Create the variables +# # -------------------- +# X_tl = Convex.Semidefinite(n) # Here Semidefinite(n) encodes that +# # X_tl ≡ ̃X is a positive semidefinite matrix +# d_tl = Convex.Variable(n) # d_tl ≡ ̃d +# D_tl = diagm(d_tl) # Create the diagonal matrix ̃D from ̃d +# +# # Create terms of the objective function, which we write down +# # in three parts +# # ---------------------------------------------------------- +# t1 = square(norm2(Σ - X_tl - D_tl)) # norm2 computes Frobenius norm in Convex.jl +# t2 = square(norm2(X-X_tl)) +# t3 = square(norm2(D-D_tl)) +# +# # Create objective +# # ---------------- +# objective = t1 + (1/(2*γ))*(t2 + t3) # the objective to be minimized +# +# # create the problem instance +# # --------------------------- +# convex_problem = Convex.minimize(objective, [d_tl >= 0, Σ - D_tl in :SDP]) +# +# # set the solver +# # -------------- +# convex_solver = () -> SCS.Optimizer(verbose=false)#COSMO.Optimizer(verbose=false) +# +# # solve the problem +# # ----------------- +# Convex.solve!(convex_problem, convex_solver) +# +# # get the optimal solution +# # ------------------------ +# X_sol = X_tl.value +# d_sol = d_tl.value +# # println("d_sol = ", d_sol) +# +# # return the output +# return X_sol, vec(d_sol) +# +# end + +## put everything in a function +# uses JuMP +function prox_NExOS_fam(Σ, M, γ, X, d; X_tl_sv = nothing, d_tl_sv = nothing, warm_start = false) + + # This functions takes the input data Σ, γ, X, d and evaluates the proximal operator of the function f at the point (X,d) + + n = length(d) + + # prox_model = JuMP.Model(optimizer_with_attributes(COSMO.Optimizer, "verbose" => false)) + + prox_model = JuMP.Model(optimizer_with_attributes(Mosek.Optimizer)) + + + @variables( prox_model, + begin + d_tl[1:n] >= 0 + X_tl[1:n, 1:n], PSD + end + ) + + if warm_start == true + println("warm start enabled") + # Set warm-start + set_start_value.(X_tl, X_tl_sv) # Warm start + set_start_value.(d_tl, d_tl_sv) # Warm start + println("norm difference is = ", norm(start_value.(X_tl) - X_tl_sv)) + end + + t_1 = vec(Σ - X_tl - diagm(d_tl)) + t_2 = vec(X_tl-X) + t_3 = vec(diagm(d_tl)-diagm(d)) + obj = t_1'*t_1 + ((1/(2*γ))*(t_2'*t_2 + t_3'*t_3)) + + @objective(prox_model, Min, obj) + + @constraints(prox_model, begin + Symmetric(Σ - diagm(d_tl)) in PSDCone() + end) + + set_silent(prox_model) + + JuMP.optimize!(prox_model) + + # obj_val = JuMP.objective_value(prox_model) + X_sol = JuMP.value.(X_tl) + 
d_sol = JuMP.value.(d_tl) + + return X_sol, d_sol + +end + +function update_init_info_experimental_fam!(state::StateFactorAnalysisModel, init_info::InitInfoFactorAnalysisModel, problem::ProblemFactorAnalysisModel, setting::Setting) + + init_info.μ = init_info.μ*setting.μ_mult_fact + init_info.γ = γ_from_μ(init_info.μ, setting) + init_info.z0 = state.z + init_info.Z0 = state.Z + + return init_info + +end diff --git a/src/types.jl b/src/types.jl new file mode 100644 index 0000000..f575682 --- /dev/null +++ b/src/types.jl @@ -0,0 +1,363 @@ +##%% Load the packages + +# First, we load the packages. + +using ProximalOperators, LinearAlgebra, TSVD, SparseArrays, OSQP, JuMP, MosekTools + +##%% All the necessary types + +# # ProxRegularSet +# This is an abstract type that describes the nature of the set. + +abstract type ProxRegularSet end + + +# # problem +# This structure contains the function, the set, value of parameter β, and an intial point. + +struct Problem{F <: ProximableFunction, S <: ProxRegularSet, R <: Real, A <: AbstractVecOrMat{ <:Real}} + + f::F # the objective function + C::S # the constraint set + β::R # what value of β to pick + z0::A # one intial state provided by the user + + ## consider having a constructor that will create an intial staate if the user does not give one + +end + +##%% Setting function + +# # setting +# This struct represents the user setting. It contains information regarding user specified value of different parameters + +struct Setting + + # first comes description of different data type + + μ_max::Float64 # starting μ + μ_min::Float64 # ending μ + μ_mult_fact::Float64 # multiplicative factor to reduce μ + n_iter_min::Int64 # minimum number of iterations per fixed μ + n_iter_max::Int64 # maximum number of iterations per fixed μ + big_M::Float64 # a big number, it suffices to hard code it, but for now let me keep it + β::Float64 # β is the strong convexity parameter + tol::Float64 # tolerance for the fixed point gap + verbose::Bool # whether to print iteration information + freq::Int64 # how often to print the iteration solution + γ_updt_rule::Symbol # for now two rule: :safe and :adaptive, deafult will be :safe + + # constructor function + function Setting(; + μ_max = 1.0, + μ_min = 1e-12, + μ_mult_fact = 0.85, + n_iter_min = 1000, + n_iter_max = 1000, + big_M = 1e99, + β = 1e-10, + tol = 1e-8, + verbose = true, + freq = 10, + γ_updt_rule = :safe + ) + new(μ_max, μ_min, μ_mult_fact, n_iter_min, n_iter_max, big_M, β, tol, verbose, freq, γ_updt_rule) + end + +end + +## structure that describes the state (for the outer loop) + +# # state +# This structure contains all the necessary information to describe a state + +mutable struct State{T <: AbstractVecOrMat{<: Real}, I <: Integer, R <: Real} # Done + x::T # the first iterate + y::T # the second iterate + z::T # the third iterate + i::I # iteration number + fxd_pnt_gap::R # fixed point gap for state + fsblt_gap::R # how feasible the current point is, this is measured by || x- Π_C(x) || + μ::R # current value of μ + γ::R # current value of γ +end + +# Let us initialize the state data structure. 
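+# As a usage sketch (the variable names `problem` and `setting` are illustrative): given a
+# `problem::Problem` and `setting = Setting()`, calling `State(problem, setting)` below returns
+# a state whose iterates x and y are zero, whose z is a copy of problem.z0, whose fixed point
+# and feasibility gaps are initialized to setting.big_M, and whose μ and γ start at μ_max and
+# γ_from_μ(μ_max, setting), respectively.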
+ +function State(problem::Problem, setting::Setting) + + # this function constructs the very first state for the problem and setting + z0 = copy(problem.z0) + x0 = zero(z0) + y0 = zero(z0) + i = copy(setting.n_iter_max) # iteration number where best fixed point gap is reached + fxd_pnt_gap = copy(setting.big_M) + fsblt_gap = copy(setting.big_M) + μ = copy(setting.μ_max) + γ = γ_from_μ(μ, setting) # + + return State(x0, y0, z0, i, fxd_pnt_gap, fsblt_gap, μ, γ) + +end + +# Time to describe the data structure that we require to initialize our algorithm. + +## structure that describes the information to intialize our algorithm: + +mutable struct InitInfo{T <: AbstractVecOrMat{<: Real}, R <: Real} + + # this will have z0, γ, μ + z0::T # initial condition to start an inner iteration + μ::R # μ required start an inner iteration + γ::R # γ required to start an inner iteration + +end + +# Of course, we will need a constructor for the InitInfo type, given the user input. The user inputs are problem::problem, setting::setting. + +function InitInfo(problem::Problem, setting::Setting) + + ## constructs the initial NExOS information + z0 = copy(problem.z0) # need to understand if copy is necessary + μ = copy(setting.μ_max) + γ = γ_from_μ(μ, setting) + + return InitInfo(z0, μ, γ) + +end + +#%% +# Time to define our sets. We have two prox regular sets for now: +# 1. SparseSet: It is of the form: card(x) ≦ r, ||x||_∞ ≦ M +# 2. RankSet: It is of the form: rank(x) ≦ r, ||x||_2 ≦ M + +# We also consider the unit hypercube, which is not prox-regular however is defined here to apply our algorithm as a heuristics to the 3-SAT problem. + +# define sparse set first, via the `SparseSet` structure. + +struct SparseSet{R <: Real, I <: Integer} <: ProxRegularSet + M::R # M is the upper bound on the elements of x + r::I # r is the maximum cardinality + function SparseSet{R, I}(M::R, r::I) where {R <: Real, I <: Integer} + if r <= 0 || M <= 0 + error("parameter M and r must be a positive integer") + else + new(M, r) + end + end +end + +# Constructor for the sparse set +SparseSet(M::R, r::I) where {R <: Real, I <: Integer} = SparseSet{R,I}(M, r) + + +# Okay, we have done everything for the sparse set. Now, we do the same for the low rank set. First, we define the structure. + +struct RankSet{R<: Real, I <: Integer} <: ProxRegularSet + M::R + r::I + function RankSet{R, I}(M::R, r::I) where {R <: Real, I <: Integer} + if r <= 0 || M <= 0 + error("parameter M and r must be a positive integer") + else + new(M, r) + end + end +end + + +# Let us define the constructor for the low rank set. 
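+# For instance, `RankSet(1.0, 4)` encodes the prox-regular set {X : rank(X) ≤ 4, ‖X‖₂ ≤ 1.0};
+# nonpositive values of M or r are rejected by the inner constructor above with an error.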
+ +RankSet(M::R, r::I) where {R <: Real, I <: Integer} = RankSet{R, I}(M, r) + +# Time to define the unit hypercube: {0,1}^n + +struct UnitHyperCube{I <: Integer} <: ProxRegularSet + n::I + function UnitHyperCube{I}(n::I) where {I <: Integer} + if n <= 0 + error("parameter n must be a positive integer") + else + new(n) + end + end +end + +# Constructor for the unit hypercube + +UnitHyperCube(n::I) where {I <: Integer} = UnitHyperCube{I}(n) + +## Define the Least squarers function for affine rank minimization: + + +## Start with some helper functions + +function Afn_Op_ith_Mat(𝓐, k, i) + ## gives the i-th matrix of the affine oerator of 𝓐, such that + ## [𝓐(X)]_i = trace(A_i^T X) + m, kn = size(𝓐) + n = convert(Int64, kn/k) + m, kn = size(𝓐) + n = convert(Int64, kn/k) + return 𝓐[:, (i-1)*n+1:i*n] +end + +# TODO: Make this function faster +function Affn_Op(𝓐, k, X) + # Computes 𝓐(X), k is the dimension of the affine operator + # 𝓐 = [A_1 A_2 ... A_k] ∈ R^ {m × kn} + m, n = size(X) + m1, kn = size(𝓐) + n1 = convert(Int64, kn/k) + if m1 ≠ m || n1 ≠ n + @error "sizes of the input matrix and the affine operator matrix do not match" + end + z = zeros(k) # output will be stored in this matrix + for i in 1:k + z[i] = tr(Afn_Op_ith_Mat(𝓐, k, i)'*X) + end + return z +end + +function Affn_Op_Mat(𝓐, k, X) + # Computes 𝓐(X), k is the dimension of the affine operator + # 𝓐 = [A_1 A_2 ... A_k] ∈ R^ {m × kn} + A = affine_operator_to_matrix(𝓐, k) + # output will be stored in this matrix + return A*vec(X) +end + +# TODO: need to make this function faster +function affine_operator_to_matrix(𝓐::AbstractMatrix{R}, k::I) where {R <: Real, I <: Integer} + ## convert 𝓐 to a matrix A of size k × mn + ## 𝓐 = [A_1 A_2 ... A_k] ∈ R^ {m × kn}, where each + ## A_i is an m × n matrix, and our + ## AffineOp(X) = [ tr(A_1^T X) ; tr(A_2^T X); ... ; tr(A_k^T X) ], where X is an m × n matrix + ## our goal is coming up with a matrix A such that + ## AffineOp(X) = A*vec(X) + m, kn = size(𝓐) + if mod(kn, k) ≠ 0 + @error "dimension of the matrix [A_1 ... A_k] is not correct" + end + n = convert(Int64, kn/k) + A = zeros(k,m*n) + for i in 1:k + A[i,:] = transpose(vec(𝓐[:, (i-1)*n+1:i*n])) + end + return A +end + + + + +## Let us define the function least squares over matrices + + +abstract type LeastSquaresOverMatrix <: ProximableFunction end + +is_smooth(f::LeastSquaresOverMatrix) = true + +is_quadratic(f::LeastSquaresOverMatrix) = true + +fun_name(f::LeastSquaresOverMatrix) = "Least squares penalty over matrices" + +function LeastSquaresOverMatrix(𝓐::M, b::V, k::I, lam::R=R(1); iterative=false) where {R <: Real, V <: AbstractArray{R}, I <: Integer, M} # BUG: seems that this version of the objective value has a bug, for now use the other one that directly takes a k×mn matrix + # FIXME: Fix this function later on, for now work with the other version + ## In this case, 𝓐 is as follows: + ## 𝓐 = [A_1 A_2 ... 
A_k] ∈ R^ {m × kn}, where each + ## A_i is an m × n matrix, and + # b is the output observation with k entries + # this function will convert 𝓐 into a matrix Amat of size k × mn that can operate on vec(X) of size mn, which is the vectorized version of the input matrix X of size m × n + Amat = affine_operator_to_matrix(𝓐, k) + if iterative == false + ProximalOperators.LeastSquaresDirect(Amat, b, lam) + else + ProximalOperators.LeastSquaresIterative(Amat, b, lam) + end +end + +function LeastSquaresOverMatrix(𝐀::M, b::V, lam::R=R(1); iterative=false) where {R <: Real, V <: AbstractArray{R}, I <: Integer, M} + # in this case 𝐀 is given as a k × mn matrix, so that it acts directly on the vectorized operator + if size(𝐀)[1] != length(b) + @error "number of rows of A must be equal to number of elements of b " + end + if iterative == false + ProximalOperators.LeastSquaresDirect(𝐀, b, lam) + else + ProximalOperators.LeastSquaresIterative(𝐀, b, lam) + end +end + +# evaluating the LeastSquaresOverMatrix +# for direct +function (f::ProximalOperators.LeastSquaresDirect)(X::Array{R,2}) where {R <: Real} + return f(vec(X)) +end + +# for iterative +function (f::ProximalOperators.LeastSquaresIterative)(X::Array{R,2}) where {R <: Real} + return f(vec(X)) +end + +## Function for squared loss for matrix completion problem + +# data conversion for the matrix completion problem + +# older function: takes much longer to compute +# function matrix_completion_A_b(Zobs::AbstractMatrix{R}) where {R <: Real} +# # The function will take the matrix Zobs of size mxn such that has it has many missing (NaN) values and few observed value. Let the index of available values of Zobs be Ω. Then the function will create a matrix A and vector b, such that for any matrix X we have +# # \[ +# # f(X)=\sum_{(i,j)\in\Omega}(X_{ij}-Z_{ij})^{2}=\|Ax-b\|_{2}^{2}. +# # \] +# m, n = size(Zobs) +# zobs = vec(Zobs) +# # find the corresponding indices of the missing data again: +# index_available_vec = findall(x -> isnan(x)== false, zobs) +# b = zobs[index_available_vec] +# p = length(index_available_vec) +# A = spzeros(p, m*n) # m is size of b, +# for i in 1:p +# A[i,index_available_vec[i]] = 1.0 +# end +# return A, b +# end + +function matrix_completion_A_b(Zobs::AbstractMatrix{R}) where {R <: Real} + # The function will take the matrix Zobs of size mxn such that has it has many missing (NaN) values and few observed value. Let the index of available values of Zobs be Ω. Then the function will create a matrix A and vector b, such that for any matrix X we have + # \[ + # f(X)=\sum_{(i,j)\in\Omega}(X_{ij}-Z_{ij})^{2}=\|Ax-b\|_{2}^{2}. 
+ # \] + m, n = size(Zobs) + zobs = vec(Zobs) + # find the corresponding indices of the missing data again: + index_available_vec = findall(x -> isnan(x)== false, zobs) + b = zobs[index_available_vec] + p = length(index_available_vec) + I_A = 1:p + J_A = index_available_vec + V_A = ones(p) + A = sparse(I_A,J_A,V_A,p,m*n) + return A, b +end + +# time to define the squared loss function + +abstract type SquaredLossMatrixCompletion <: ProximableFunction end + +is_smooth(f::SquaredLossMatrixCompletion) = true + +is_quadratic(f::SquaredLossMatrixCompletion) = true + +fun_name(f::SquaredLossMatrixCompletion) = "Squared loss function for matrix completion problem" + +function SquaredLossMatrixCompletion(Zobs::AbstractMatrix{R}, lam::R=R(1); iterative = false) where {R <: Real} + # construct the data from Zobs + A, b = matrix_completion_A_b(Zobs) + + if iterative == false + ProximalOperators.LeastSquaresDirect(A, b, lam) + else + ProximalOperators.LeastSquaresIterative(A, b, lam) + end + +end diff --git a/src/utils.jl b/src/utils.jl new file mode 100644 index 0000000..0ad45e2 --- /dev/null +++ b/src/utils.jl @@ -0,0 +1,294 @@ +## Our rule for chosing γ from μ + +function γ_from_μ(μ::R, setting::Setting) where {R <: Real} + if setting.γ_updt_rule == :safe + return abs(cbrt(setting.μ_min)) # cbrt also works quite well + elseif setting.γ_updt_rule == :adaptive + γ_adaptive = (10^-3)*abs(sqrt(μ)) + return γ_adaptive # adaptive actually does not work very well + elseif setting.γ_updt_rule == :adaptiveplus + log_γ_adaptiveplus = ((3/(log10(setting.μ_max)+10))*(log10(μ)+10))-6 + γ_adaptiveplus = 10^log_γ_adaptiveplus + return γ_adaptiveplus + else + @error "either select :adaptive or :safe or :adaptiveplus as your γ selection rule" + return nothing + end +end + +# # Projection onto the sparse set +# First thing to do is, projecting onto the sparse set. Projection on the sparse set can be computed as follows. + +function Π_exact(C::SparseSet, x::AbstractVector{R}) where {R <: Real} + ## this function takes a vector x and returns a projection of x on the set + ## card(⋅) <= k, ||⋅||_∞ <= M + ## Extract the information from teh SparseSet + M = C.M + k = C.r + + n = length(x) + y = zero(x) + + perm = sortperm(abs.(x), rev=true) + ## sortperm(abs.(x), rev=true) returns a sorted array index set from larger to smaller components of absolute values of x + for i in 1:n + if i in perm[1:k] + if x[i]>M + y[i] = M + elseif x[i] < -M + y[i] = -M + else + y[i] = x[i] + end + end + end + + return y +end + +# Finally, the projection onto the low rank set. +# # Pseudocode for `Π_exact` +# --- + +# ![Algorithm](/literate_assets/low_rank_projection.png) + +function Π_exact(C::RankSet, x::AbstractMatrix{R}; iterative::Bool = true) where {R <: Real} + + ## extract the values + y = similar(x) + M = C.M # bound on the matrix elements + r = C.r # rank constraint rank(x) ≦ r + if iterative == true + U, S, V = tsvd(x, r) + for i in 1:r + if S[i] > M + S[i] = M + end + end + SVt = S.* V' # this line essentially does the same as Diagonal(X)*V' but more efficiently + mul!(y, U, SVt) + return y + else + F = svd(x) + σ_x = F.S + for i in 1:r + if σ_x[i] > M + σ_x[i] = M + end + end + y = F.U[:,1:r]*Diagonal(σ_x[1:r])*V[:,1:r]' + return y + end # if else +end + +# Projection onto the unit hypercube. 
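+# A small illustrative sketch (hypothetical values): with C = UnitHyperCube(3), the function
+# below maps [0.2, 0.7, 0.49] to [0.0, 1.0, 0.0], i.e., each component is set to 1 when it
+# exceeds 0.5 and to 0 otherwise.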
+ +function Π_exact(C::UnitHyperCube, x::AbstractVector{R}) where {R <: Real} + ## this function takes a vector x and returns a projection of x on the set {0,1}^n + n = C.n + y = zero(x) # creates a 0 vector with the same dimension as x + for i=1:n + if x[i] > 0.5 + y[i]=1 + end + end + return y +end + + + + +# update the state + +function update_state!(state::State, init_info::InitInfo, problem::Problem, setting::Setting) + + # extract information from the state and init info + x = state.x + y = state.y + z = init_info.z0 + β = setting.β + γ = init_info.γ + μ = init_info.μ + + # create best point variables + best_point_x = x + best_point_y = y + best_point_z = z + best_fxd_pnt_gap = setting.big_M + best_fsblt_gap = setting.big_M + + i = 1 + while i <= setting.n_iter_max + x, y, z = inner_iteration(x, y, z, β, γ, μ, problem) + crnt_fxd_pnt_gap = norm(x-y, Inf) + y_fsbl = Π_exact(problem.C, y) + crnt_fsblt_gap = norm(y-y_fsbl, Inf) + + # update the best points so far if we have lower objective value + if crnt_fxd_pnt_gap <= best_fxd_pnt_gap + best_point_x = x + best_point_y = y + best_point_z = z + best_fxd_pnt_gap = crnt_fxd_pnt_gap + best_fsblt_gap = crnt_fsblt_gap + end # if + + # display important information + if setting.verbose == true + if mod(i, setting.freq) == 0 + @info "μ = $μ | iteration = $i | log fixed point gap = $(log10(crnt_fxd_pnt_gap)) | log feasibility gap = $(log10(crnt_fsblt_gap))" + end # if + end # if + + # termination condition check + if crnt_fxd_pnt_gap <= setting.tol + if setting.verbose == true + @info "μ = $μ | log fixed point gap reached $(log10(setting.tol))" + end + break + end #if + + # increment the count by 1 + i = i+1 + + end # while + + ## update the state + state.x = best_point_x + state.y = best_point_y + state.z = best_point_z + state.i = i + state.fxd_pnt_gap = best_fxd_pnt_gap + state.fsblt_gap = best_fsblt_gap + state.μ = μ + state.γ = γ + + # Print information about the best point + if setting.verbose == true + @info "information about the best state found for μ = $(state.μ)" + @info "μ = $(state.μ) | log fixed point gap = $(log10(state.fxd_pnt_gap)) | log feasibility gap = $(log10(state.fsblt_gap)) | inner iterations = $(state.i)" + end + + return state +end + +## the inner iteration function +# The inner iteration function has the following pseudocde +# # Pseudocode for `Π_exact` +# --- + +# ![Algorithm](/literate_assets/inner_iteration.png) + +function inner_iteration(x::A, y::A, z::A, β::R, γ::R, μ::R, problem::Problem) where {A <: AbstractVecOrMat{<:Real}, R <: Real} + # old iteration: x = prox_NExOS(problem.f, β, γ, z) + x = prox_NExOS(problem.f, γ, z) + y = Π_NExOS(problem.C, β, γ, μ, 2*x - z) + z = z + y - x + return x, y, z +end + +## proximal function on f: this is the old function to be removed later +# function prox_NExOS(f::ProximableFunction, β::R, γ::R, x::A) where {R <: Real, A <: AbstractVecOrMat{<:Real}} +# prox_param = γ/((β*γ)+1) +# scaled_input = (1/((β*γ)+1)).*x +# prox_point = similar(x) # allocate prox_NExOS output +# prox!(prox_point, f, scaled_input, prox_param) # this is the prox! function from the ProximalOperators package +# return prox_point +# end + +## New proximal function, no scaling required +function prox_NExOS(f::ProximableFunction, γ::R, x::A) where {R <: Real, A <: AbstractVecOrMat{<:Real}} + prox_point = similar(x) # allocate prox_NExOS output + prox!(prox_point, f, x, γ) # this is the prox! 
function from the ProximalOperators package + return prox_point +end + + + +## NExOS projection function +# This implements the following pseudocode +# ![relaxed_projection_algorithm](/literate_assets/relaxed_projection.png) + +function Π_NExOS(C::ProxRegularSet, β::R, γ::R, μ::R, x::A) where {R <: Real, A <: AbstractVecOrMat{<:Real}} + dnm = γ + (μ*((β*γ)+1)) + scaled_input = (1/((β*γ)+1)).*x + pi_x_scaled_input = Π_exact(C, scaled_input) + t1 = (μ/dnm).*x + t2 = (γ/dnm)*pi_x_scaled_input + return t1+t2 +end + + +# Now we write the update init information function. + +function update_init_info!(state::State, init_info::InitInfo, problem::Problem, setting::Setting) + + init_info.μ = init_info.μ*setting.μ_mult_fact + init_info.γ = γ_from_μ(init_info.μ, setting) + x_μ = Π_NExOS(problem.C, setting.β, init_info.γ, init_info.μ, state.x) + u1 = similar(x_μ) # allocate memory + gradient!(u1, problem.f, x_μ) + u = u1 + setting.β.*x_μ # this is a gradient of the function f + (β/2)*||⋅||^2 at x_μ + # we are using the function gradient from proximaloperators.jl + init_info.z0 = x_μ + init_info.γ*u + + return init_info + +end + +# Experiment with the initial information + +function update_init_info_experimental!(state::State, init_info::InitInfo, problem::Problem, setting::Setting) + + init_info.μ = init_info.μ*setting.μ_mult_fact + init_info.γ = γ_from_μ(init_info.μ, setting) + init_info.z0 = state.z + + return init_info + +end + +## Extending the proximal opertor for least sqaures penalty over matrices + + + +function ProximalOperators.prox!(Y::Array{R,2}, f::ProximalOperators.LeastSquaresIterative, X::Array{R,2}, gamma::R=R(1)) where {R <: Real} + m, n = size(Y) + y = similar(vec(Y)) # create storaget to store the proximal operator + x = vec(X) + prox!(y, f, x, gamma) + # need to convert y back to Y + Y[:,:] = reshape(y,m,n) # Y[:,:] is used so that the contents are changed + return f(y) +end + +function ProximalOperators.prox!(Y::Array{R,2}, f::ProximalOperators.LeastSquaresDirect, X::Array{R,2}, gamma::R=R(1)) where {R <: Real} + m, n = size(Y) + y = similar(vec(Y)) # create storaget to store the proximal operator + x = vec(X) + prox!(y, f, x, gamma) + # need to convert y back to Y + Y[:,:] = reshape(y,m,n) # Y[:,:] is used so that the contents are changed + return f(y) +end + +# Computing gradient for the leastsquares over matrix function direct case +function ProximalOperators.gradient!(Y::Array{R,2}, f::ProximalOperators.LeastSquaresDirect, X::Array{R,2}) where {R<: Real} + m, n = size(Y) + y = similar(vec(Y)) # create storaget to store the proximal operator + x = vec(X) + gradient!(y, f, x) + Y[:,:] = reshape(y,m,n) # Y[:,:] is used so that the contents are changed + return f(y) +end + + +# Computing gradient for the leastsquares over matrix function direct case +function ProximalOperators.gradient!(Y::Array{R,2}, f::ProximalOperators.LeastSquaresIterative, X::Array{R,2}) where {R<: Real} + m, n = size(Y) + y = similar(vec(Y)) # create storaget to store the proximal operator + x = vec(X) + gradient!(y, f, x) + Y[:,:] = reshape(y,m,n) # Y[:,:] is used so that the contents are changed + return f(y) +end diff --git a/test/runtests.jl b/test/runtests.jl new file mode 100644 index 0000000..ece37ca --- /dev/null +++ b/test/runtests.jl @@ -0,0 +1,113 @@ +using NExOS +using ProximalOperators +using Test +using Random, LinearAlgebra + +@testset "sparse_regression" begin + + M = 1.0 + + r = 3 + + C = SparseSet(M, r) + + A = [0.0026751207469390887 -1.7874010558441433 1.9570015234024314 
0.48297821744005304 0.5499471860787609; -1.1926192117844148 -0.2962711608418759 0.34270253049651844 -0.9366799738457191 2.6135543879953094; -2.3956100254067567 1.5957363630260428 -0.5408329087542932 -1.595958552841769 -1.6414598361443185; 0.6791192541307551 2.895535427621422 -0.6676549207144326 -0.49424052539586016 -0.4336822064078515; 2.201382399088591 -0.578227952607833 -0.17602060199064307 -0.46261612338452723 0.24784285430931077; -0.49522601422578916 0.44636764792131967 1.2906418721153354 0.8301596930345838 0.5837176074011505] + + b = [-0.7765194270735608, 0.28815810383363427, -0.07926006593887291, 0.5659412043036721, 1.3718921913877151, 0.8834692513807884] + + m, n = size(A) + + z0 = [0.7363657041121843, 1.7271074489333478, -1.7929456862340958, 0.9359690211495514, 1.771070535933835] + + f = LeastSquares(A, b, iterative = true) + + setting = NExOS.Setting(μ_max = 2, μ_min = 1e-8, μ_mult_fact = 0.85, verbose = true, freq = 250, γ_updt_rule = :adaptive) + + problem = NExOS.Problem(f, C, setting.β, z0) + + state_final = NExOS.solve!(problem, setting) + + @test log10(state_final.fxd_pnt_gap) <= -4 + + @test log10(state_final.fsblt_gap) <= -4 + + @test abs(f(state_final.x) - 0.934) <= 1e-3 + +end + +@testset "affine_rank_minimization" begin + + m = 10 + + n = 2*m + + M = 1.0 + + k = convert(Int64, m*n/20) # k is the number of components in the affine operator A: R^m×n → R^k + + r = convert(Int64,round(m*.35)) # r corresponds to the rank of the matrix rank(X) <= r + + mcA = randn(k, m*n) + + b = randn(k) + + Z0 = zeros(m,n) + + f = LeastSquaresOverMatrix(mcA, b, 1.0, iterative = true); + + D = NExOS.RankSet(M, r) + + setting = NExOS.Setting(μ_max = 2, μ_min = 1e-8, μ_mult_fact = 0.5, verbose = true, freq = 250, γ_updt_rule = :safe) + + problem = NExOS.Problem(f, D, setting.β, Z0) + + state_final = NExOS.solve!(problem, setting) + + f(state_final.x) + + @test log10(state_final.fxd_pnt_gap) <= -4 + + @test log10(state_final.fsblt_gap) <= -4 + +end + +@testset "matrix_completion" begin + + Random.seed!(1234) + + # Construct a random m-by-n matrix matrix of rank k + m,n,k = 5,5,2 + + Zfull = randn(m,k)*randn(k,n) # ground truth data + + M = opnorm(Zfull,2) + + n_obs = 13 + + Zobs = fill(NaN,(m,n)) + + obs = randperm(m*n)[1:n_obs] + + Zobs[obs] = Zfull[obs] # partially observed matrix + + # let us find the indices of all the elements that are available + + f = SquaredLossMatrixCompletion(Zobs, iterative = true) + + r = k + + Z0 = zeros(size(Zobs)) + + C = RankSet(M, r) + + setting = NExOS.Setting(μ_max = 5, μ_min = 1e-8, μ_mult_fact = 0.5, n_iter_min = 1000, n_iter_max = 1000, verbose = true, freq = 250, tol = 1e-4, γ_updt_rule = :safe) + + problem = NExOS.Problem(f, C, setting.β, Z0) + + state_final = NExOS.solve!(problem, setting) + + @test log10(state_final.fxd_pnt_gap) <= -4 + + @test log10(state_final.fsblt_gap) <= -4 + +end diff --git a/tutorials/.ipynb_checkpoints/Affine rank minimization using NExOS.jl-checkpoint.ipynb b/tutorials/.ipynb_checkpoints/Affine rank minimization using NExOS.jl-checkpoint.ipynb new file mode 100644 index 0000000..250ad7c --- /dev/null +++ b/tutorials/.ipynb_checkpoints/Affine rank minimization using NExOS.jl-checkpoint.ipynb @@ -0,0 +1,383 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Solving affine rank minimization problem using `NExOS.jl`\n", + "**Shuvomoy Das Gupta**" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Background\n", + "\n", + "The problem in consideration can be written 
as:\n", + "\n", + "$$\n", + "\\begin{equation}\n", + "\\begin{array}{ll}\n", + "& \\textrm{minimize} & \\left\\Vert \\mathcal{A}(X)-b\\right\\Vert _{2}^{2}\\\\\n", + "& \\textrm{subject to} & \\mathop{{\\bf rank}}(X)\\leq r\\\\\n", + "& & \\|X\\|_{2}\\leq M\\\\\n", + "& & X\\in\\mathbf{R}^{m\\times n},\n", + "\\end{array}\n", + "\\end{equation}\n", + "$$\n", + "\n", + " where $X\\in\\mathbf{R}^{m\\times n}$ is the decision variable, $b\\in\\mathbf{R}^{k}$ is a noisy measurement data, and $\\mathcal{A}:\\mathbf{R}^{m\\times n}\\to\\mathbf{R}^{k}$ is a linear map. The affine map $\\mathcal{A}$ can be determined by $k$ matrices $A_{1},\\ldots,A_{k}$ in $\\mathbf{R}^{m\\times n}$ such that\n", + " \n", + "$$\n", + "\\mathcal{A}(X)=(\\mathbf{tr}(A_{1}^{T}X),\\ldots,\\mathbf{tr}(A_{k}^{T}X)).\n", + "$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Data Generation\n", + "\n", + "First we generate our data for this example." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "10" + ] + }, + "execution_count": 1, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "using Random, NExOS, ProximalOperators, LinearAlgebra\n", + "\n", + "Random.seed!(1234)\n", + "\n", + "m = 10\n", + "\n", + "n = 2*m\n", + "\n", + "M = 1.0\n", + "\n", + "k = convert(Int64, m*n/20)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, $k$ is the number of components in output of the affine operator $\\mathcal{A}$, i.e., for any matrix $X \\in \\mathbf{X}$, we have $\\mathcal{A}(X) \\in \\mathbf{R}^k$." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "4" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "r = convert(Int64,round(m*.35))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, $r$ corresponds to the rank of the matrix $\\mathbf{rank}(X) <= r$." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The affine operator $\\mathcal{A}$ is passed as a $k \\times mn$ matrix $\\bar{A}$, so that when it acts on $\\mathbf{vec}(X)$ we have $\\bar{A} (\\mathbf{vec}(X)) = \\mathcal{A}(X)$." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "10×200 Array{Float64,2}:\n", + " 0.867347 -0.560501 1.56417 … 0.384485 0.685573 -0.858928\n", + " -0.901744 -0.0192918 -1.39674 0.315489 1.21533 1.12188\n", + " -0.494479 0.128064 1.1055 -0.382068 1.29772 -2.45236\n", + " -0.902914 1.85278 -1.10673 0.691941 -1.71088 -2.30555\n", + " 0.864401 -0.827763 -3.21136 0.0473293 -0.747106 1.54823\n", + " 2.21188 0.110096 -0.0740145 … -0.455901 0.0330671 -0.297855\n", + " 0.532813 -0.251176 0.150976 0.100961 2.05177 1.58331\n", + " -0.271735 0.369714 0.769278 -1.12375 1.05237 0.562472\n", + " 0.502334 0.0721164 -0.310153 -0.579068 0.430384 0.85342\n", + " -0.516984 -1.50343 -0.602707 -0.493044 0.211279 -0.321671" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "barA = randn(k, m*n)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The vector $b$ is the observed output." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "10-element Array{Float64,1}:\n", + " -0.4667377936599136\n", + " -0.17047970356351902\n", + " 0.8292034377961193\n", + " -0.4500585937344793\n", + " -1.3038856208344294\n", + " 0.5869555339344937\n", + " 0.17548586215288262\n", + " -0.2760027307659979\n", + " -0.2631151380019278\n", + " -1.1348769546238908" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "b = randn(k)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us chose an initial point." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [], + "source": [ + "Z0 = zeros(m,n);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Create the objective function." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "f = LeastSquaresOverMatrix(barA, b, 1.0, iterative = true);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us create the constraint set." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "RankSet{Float64,Int64}(1.0, 4)" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "C = RankSet(M, r)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Create the `setting` file, note that we are working with the `:safe` rule for updating the proximal parameter $\\gamma$. Also, the solver output is turned off here, we could turn it on by setting `verbose=true`." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "execution": { + "iopub.execute_input": "2020-11-05T18:59:11.425Z", + "iopub.status.busy": "2020-11-05T18:59:10.909Z", + "iopub.status.idle": "2020-11-05T18:59:12.266Z" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "Problem{ProximalOperators.LeastSquaresIterative{1,Float64,Float64,Array{Float64,2},Array{Float64,1},ProximalOperators.AAc},RankSet{Float64,Int64},Float64,Array{Float64,2}}(description : Least squares penalty\n", + "domain : n/a\n", + "expression : n/a\n", + "parameters : n/a, RankSet{Float64,Int64}(1.0, 4), 1.0e-10, [0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0])" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "setting = Setting(μ_max = 2, μ_min = 1e-8, μ_mult_fact = 0.5, verbose = false, freq = 500, γ_updt_rule = :safe)\n", + "\n", + "problem = Problem(f, C, setting.β, Z0)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Solve the problem!" 
+ ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "State{Array{Float64,2},Int64,Float64}([0.009640555740470996 -0.0039049066980724172 … 0.0032744246264381385 -0.011401924226827814; 0.017039021787684776 -0.006486467044692794 … -0.006723985185855889 0.011346398802052203; … ; -0.0031155465177579336 0.012103228512837969 … -0.0030847566736254646 0.006789864137943311; 0.0037643322883237993 0.011064929502908967 … 0.003141509734574783 -0.006391901052935991], [0.009640555310037102 -0.003904906555286084 … 0.0032744247618116023 -0.011401924380723396; 0.017039022426592925 -0.006486466874719959 … -0.006723985197637846 0.011346398967171602; … ; -0.0031155462203965234 0.012103228542114463 … -0.0030847567074370006 0.006789863932342989; 0.0037643323913579916 0.011064929307709616 … 0.0031415097913574316 -0.00639190102063273], [0.009640555709940084 -0.003904906817854433 … 0.003274424252976659 -0.011401923965769611; 0.017039021755258916 -0.006486467029139976 … -0.006723985147301422 0.011346398638919817; … ; -0.0031155461667072546 0.012103228769906214 … -0.003084756747081626 0.006789864408033303; 0.0037643322644109536 0.011064929211436649 … 0.003141509432661576 -0.006391901114060031], 1, 9.267131556491698e-10, 2.192690473634684e-15, 7.450580596923828e-9, 0.002154434690031884)" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "state_final = solve!(problem, setting)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Observe the output.** Let us check the objective value first, we want it to be small.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "2.1913058045348726e-14" + ] + }, + "execution_count": 10, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "f(state_final.x)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, we want to check what if the best state found by our algorithm has converged to a locally optimal solution, which is checked by testing whether the best state has reached the desired fixed point gap and feasibility gap." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "true" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "log10(state_final.fxd_pnt_gap) <= -4\n", + "\n", + "log10(state_final.fsblt_gap) <= -4" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Julia 1.5.0", + "language": "julia", + "name": "julia-1.5" + }, + "language_info": { + "file_extension": ".jl", + "mimetype": "application/julia", + "name": "julia", + "version": "1.5.0" + }, + "nteract": { + "version": "0.26.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/tutorials/Affine rank minimization using NExOS.jl.ipynb b/tutorials/Affine rank minimization using NExOS.jl.ipynb new file mode 100644 index 0000000..250ad7c --- /dev/null +++ b/tutorials/Affine rank minimization using NExOS.jl.ipynb @@ -0,0 +1,383 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Solving affine rank minimization problem using `NExOS.jl`\n", + "**Shuvomoy Das Gupta**" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Background\n", + "\n", + "The problem in consideration can be written as:\n", + "\n", + "$$\n", + "\\begin{equation}\n", + "\\begin{array}{ll}\n", + "& \\textrm{minimize} & \\left\\Vert \\mathcal{A}(X)-b\\right\\Vert _{2}^{2}\\\\\n", + "& \\textrm{subject to} & \\mathop{{\\bf rank}}(X)\\leq r\\\\\n", + "& & \\|X\\|_{2}\\leq M\\\\\n", + "& & X\\in\\mathbf{R}^{m\\times n},\n", + "\\end{array}\n", + "\\end{equation}\n", + "$$\n", + "\n", + " where $X\\in\\mathbf{R}^{m\\times n}$ is the decision variable, $b\\in\\mathbf{R}^{k}$ is a noisy measurement data, and $\\mathcal{A}:\\mathbf{R}^{m\\times n}\\to\\mathbf{R}^{k}$ is a linear map. The affine map $\\mathcal{A}$ can be determined by $k$ matrices $A_{1},\\ldots,A_{k}$ in $\\mathbf{R}^{m\\times n}$ such that\n", + " \n", + "$$\n", + "\\mathcal{A}(X)=(\\mathbf{tr}(A_{1}^{T}X),\\ldots,\\mathbf{tr}(A_{k}^{T}X)).\n", + "$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Data Generation\n", + "\n", + "First we generate our data for this example." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "10" + ] + }, + "execution_count": 1, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "using Random, NExOS, ProximalOperators, LinearAlgebra\n", + "\n", + "Random.seed!(1234)\n", + "\n", + "m = 10\n", + "\n", + "n = 2*m\n", + "\n", + "M = 1.0\n", + "\n", + "k = convert(Int64, m*n/20)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, $k$ is the number of components in output of the affine operator $\\mathcal{A}$, i.e., for any matrix $X \\in \\mathbf{X}$, we have $\\mathcal{A}(X) \\in \\mathbf{R}^k$." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "4" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "r = convert(Int64,round(m*.35))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, $r$ corresponds to the rank of the matrix $\\mathbf{rank}(X) <= r$." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The affine operator $\\mathcal{A}$ is passed as a $k \\times mn$ matrix $\\bar{A}$, so that when it acts on $\\mathbf{vec}(X)$ we have $\\bar{A} (\\mathbf{vec}(X)) = \\mathcal{A}(X)$." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "10×200 Array{Float64,2}:\n", + " 0.867347 -0.560501 1.56417 … 0.384485 0.685573 -0.858928\n", + " -0.901744 -0.0192918 -1.39674 0.315489 1.21533 1.12188\n", + " -0.494479 0.128064 1.1055 -0.382068 1.29772 -2.45236\n", + " -0.902914 1.85278 -1.10673 0.691941 -1.71088 -2.30555\n", + " 0.864401 -0.827763 -3.21136 0.0473293 -0.747106 1.54823\n", + " 2.21188 0.110096 -0.0740145 … -0.455901 0.0330671 -0.297855\n", + " 0.532813 -0.251176 0.150976 0.100961 2.05177 1.58331\n", + " -0.271735 0.369714 0.769278 -1.12375 1.05237 0.562472\n", + " 0.502334 0.0721164 -0.310153 -0.579068 0.430384 0.85342\n", + " -0.516984 -1.50343 -0.602707 -0.493044 0.211279 -0.321671" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "barA = randn(k, m*n)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The vector $b$ is the observed output." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "10-element Array{Float64,1}:\n", + " -0.4667377936599136\n", + " -0.17047970356351902\n", + " 0.8292034377961193\n", + " -0.4500585937344793\n", + " -1.3038856208344294\n", + " 0.5869555339344937\n", + " 0.17548586215288262\n", + " -0.2760027307659979\n", + " -0.2631151380019278\n", + " -1.1348769546238908" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "b = randn(k)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us chose an initial point." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [], + "source": [ + "Z0 = zeros(m,n);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Create the objective function." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "f = LeastSquaresOverMatrix(barA, b, 1.0, iterative = true);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let us create the constraint set." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "RankSet{Float64,Int64}(1.0, 4)" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "C = RankSet(M, r)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Create the `setting` file, note that we are working with the `:safe` rule for updating the proximal parameter $\\gamma$. Also, the solver output is turned off here, we could turn it on by setting `verbose=true`." 
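+ ,
+ "\n",
+ "\n",
+ "The parameters `μ_max`, `μ_min`, and `μ_mult_fact` in the next cell are not explained in this tutorial; our reading (an assumption on our part, not something stated here) is that they control the outer penalty-parameter schedule of NExOS: the penalty parameter starts at `μ_max` and is multiplied by `μ_mult_fact` at each outer stage until it falls below `μ_min`. Under that reading, the values below imply about 28 outer stages, as the sketch shows.\n",
+ "\n",
+ "```julia\n",
+ "# hypothetical μ schedule implied by μ_max = 2, μ_mult_fact = 0.5, μ_min = 1e-8\n",
+ "μ_schedule = [2.0 * 0.5^i for i in 0:40 if 2.0 * 0.5^i >= 1e-8]\n",
+ "length(μ_schedule)   # 28 values under this assumption\n",
+ "```"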
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "execution": { + "iopub.execute_input": "2020-11-05T18:59:11.425Z", + "iopub.status.busy": "2020-11-05T18:59:10.909Z", + "iopub.status.idle": "2020-11-05T18:59:12.266Z" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "Problem{ProximalOperators.LeastSquaresIterative{1,Float64,Float64,Array{Float64,2},Array{Float64,1},ProximalOperators.AAc},RankSet{Float64,Int64},Float64,Array{Float64,2}}(description : Least squares penalty\n", + "domain : n/a\n", + "expression : n/a\n", + "parameters : n/a, RankSet{Float64,Int64}(1.0, 4), 1.0e-10, [0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0])" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "setting = Setting(μ_max = 2, μ_min = 1e-8, μ_mult_fact = 0.5, verbose = false, freq = 500, γ_updt_rule = :safe)\n", + "\n", + "problem = Problem(f, C, setting.β, Z0)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Solve the problem!" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "State{Array{Float64,2},Int64,Float64}([0.009640555740470996 -0.0039049066980724172 … 0.0032744246264381385 -0.011401924226827814; 0.017039021787684776 -0.006486467044692794 … -0.006723985185855889 0.011346398802052203; … ; -0.0031155465177579336 0.012103228512837969 … -0.0030847566736254646 0.006789864137943311; 0.0037643322883237993 0.011064929502908967 … 0.003141509734574783 -0.006391901052935991], [0.009640555310037102 -0.003904906555286084 … 0.0032744247618116023 -0.011401924380723396; 0.017039022426592925 -0.006486466874719959 … -0.006723985197637846 0.011346398967171602; … ; -0.0031155462203965234 0.012103228542114463 … -0.0030847567074370006 0.006789863932342989; 0.0037643323913579916 0.011064929307709616 … 0.0031415097913574316 -0.00639190102063273], [0.009640555709940084 -0.003904906817854433 … 0.003274424252976659 -0.011401923965769611; 0.017039021755258916 -0.006486467029139976 … -0.006723985147301422 0.011346398638919817; … ; -0.0031155461667072546 0.012103228769906214 … -0.003084756747081626 0.006789864408033303; 0.0037643322644109536 0.011064929211436649 … 0.003141509432661576 -0.006391901114060031], 1, 9.267131556491698e-10, 2.192690473634684e-15, 7.450580596923828e-9, 0.002154434690031884)" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "state_final = solve!(problem, setting)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Observe the output.** Let us check the objective value first, we want it to be small.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "2.1913058045348726e-14" + ] + }, + "execution_count": 10, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "f(state_final.x)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, we want to check what if the best state found by our algorithm has converged to a locally optimal solution, which is checked by testing whether the best state has reached the desired fixed point gap and feasibility gap." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "true" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "log10(state_final.fxd_pnt_gap) <= -4\n", + "\n", + "log10(state_final.fsblt_gap) <= -4" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Julia 1.5.0", + "language": "julia", + "name": "julia-1.5" + }, + "language_info": { + "file_extension": ".jl", + "mimetype": "application/julia", + "name": "julia", + "version": "1.5.0" + }, + "nteract": { + "version": "0.26.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/tutorials/Matrix_completion_problem_NEXOS.ipynb b/tutorials/Matrix_completion_problem_NEXOS.ipynb new file mode 100644 index 0000000..9cdf1b2 --- /dev/null +++ b/tutorials/Matrix_completion_problem_NEXOS.ipynb @@ -0,0 +1,312 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "source": [ + "### Matrix Completion Problem\n", + "\n", + "A matrix completion problem can be formulated as the following optimization problem:\n", + "\n", + "$$\n", + "\\begin{array}{ll}\n", + "\\textrm{minimize} & \\sum_{(i,j)\\in\\Omega}(X_{ij}-Z_{ij})^{2}\\\\\n", + "\\textrm{subject to} & \\mathop{{\\bf rank}}(X)\\leq r\\\\\n", + " & \\|X\\|_{2}\\leq M\\\\\n", + " & X\\in\\mathbf{R}^{m\\times n},\n", + "\\end{array}\n", + "$$\n", + "\n", + "where $Z\\in\\mathbf{R}^{m\\times n}$ is the matrix whose entries $Z_{ij}$ are observable for $(i,j)\\in\\Omega$. Based on these observed entries, our goal is to construct a matrix $X\\in\\mathbf{R}^{m\\times n}$ that has rank $r$. \n", + "\n", + "First, of course, we load our packages." + ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "using Random, LinearAlgebra, NExOS\n", + "\n", + "Random.seed!(1234)" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 1, + "data": { + "text/plain": "MersenneTwister(UInt32[0x000004d2], Random.DSFMT.DSFMT_state(Int32[-1393240018, 1073611148, 45497681, 1072875908, 436273599, 1073674613, -2043716458, 1073445557, -254908435, 1072827086 … -599655111, 1073144102, 367655457, 1072985259, -1278750689, 1018350124, -597141475, 249849711, 382, 0]), [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0 … 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], UInt128[0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000 … 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000, 0x00000000000000000000000000000000], 1002, 0)" + }, + "metadata": {} + } + ], + "execution_count": 1, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-06T21:50:39.124Z", + "iopub.execute_input": "2020-11-06T21:50:39.784Z", + "iopub.status.idle": "2020-11-06T21:50:53.020Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Construct a random $m\\times n$ matrix matrix of rank $k$." 
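+ ,
+ "\n",
+ "\n",
+ "As a quick aside on the construction used in the next cell: the product of an m × k factor and a k × n factor has rank at most k, and with Gaussian entries its rank equals k almost surely. A standalone sketch of that check:\n",
+ "\n",
+ "```julia\n",
+ "using LinearAlgebra, Random\n",
+ "Random.seed!(1234)\n",
+ "m, n, k = 40, 40, 5\n",
+ "Z = randn(m, k) * randn(k, n)   # same construction as in the cell below\n",
+ "rank(Z)                         # returns 5, i.e., k (almost surely)\n",
+ "```"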
+ ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "m,n,k = 40,40,5\n", + "Zfull = randn(m,k)*randn(k,n) # ground truth data\n", + "Zfull = Zfull/opnorm(Zfull,2) # scale the matrix so its operator norm is 1\n", + "M = opnorm(Zfull,2)" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 2, + "data": { + "text/plain": "1.0000000000000002" + }, + "metadata": {} + } + ], + "execution_count": 2, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-06T21:50:53.037Z", + "iopub.execute_input": "2020-11-06T21:50:53.577Z", + "iopub.status.idle": "2020-11-06T21:50:55.961Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Suppose that we only observe a fraction of entries in Zfull. Let us find the indices of all the elements that are available." + ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "n_obs = 600\n", + "Zobs = fill(NaN,(m,n))\n", + "obs = randperm(m*n)[1:n_obs]\n", + "Zobs[obs] .= Zfull[obs] # partially observed matrix" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 3, + "data": { + "text/plain": "600-element view(::Array{Float64,1}, [314, 484, 770, 1462, 495, 602, 637, 1284, 358, 1510 … 1159, 371, 157, 537, 589, 533, 221, 1025, 564, 529]) with eltype Float64:\n 0.04614085153216508\n 0.029552887430458176\n -0.0184886116014581\n -0.04597613170772224\n 0.06158831071378109\n 0.04359079928175278\n -0.014791813279264342\n 0.006936983760815719\n 0.01650141696962079\n -0.014561284205705108\n 0.0251527010880924\n 0.008374291722196085\n -0.00335913745738921\n ⋮\n -0.008553749580091464\n -0.010747230035676235\n -0.01914100968377146\n -0.055173021978315376\n -0.16168361561961594\n -0.02789400226183489\n -0.02613098308987431\n -0.010856985785824986\n 0.01568346928771265\n -0.025649452543356512\n -0.003523186151803044\n 0.022307043854383427" + }, + "metadata": {} + } + ], + "execution_count": 3, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-06T21:50:55.972Z", + "iopub.execute_input": "2020-11-06T21:50:55.979Z", + "iopub.status.idle": "2020-11-06T21:50:57.500Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Plot the ground-truth, full dataset and our partial observations" + ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "using PyPlot" + ], + "outputs": [], + "execution_count": 4, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-06T21:50:57.512Z", + "iopub.execute_input": "2020-11-06T21:50:57.523Z", + "iopub.status.idle": "2020-11-06T21:51:06.802Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Plot the ground-truth, full dataset and our partial observations" + ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "figure(figsize=(7,14))\n", + "subplot(1,2,1)\n", + "imshow(Zfull,cmap=ColorMap(\"Blues\"),interpolation=\"None\")\n", + "xticks([]),yticks([]),title(\"True Data\",fontweight=\"bold\")\n", + "\n", + "subplot(1,2,2)\n", + "imshow(Zobs,cmap=ColorMap(\"Blues\"),interpolation=\"None\")\n", + "xticks([]),yticks([]),title(\"Observed Data\",fontweight=\"bold\")\n", + "show()\n", + "PyPlot.display_figs()" + ], + "outputs": [ + { + "output_type": "display_data", + "data": { + "text/plain": "Figure(PyObject
)", + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAEjCAYAAADOhyC+AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3dd3RUdfo/8CekTnpPCIEECKF3VKQjSlFAYNUVG7blq9hd1rWgsMq66FpQAXUVUBFBRUABAQUB6b13SCGkJ6Rn0u/vDw/5MeH9jJkkClzer3M8xzwzc+/n3rn35sPkvudxMgzDECIiIiKTaHSpB0BERETUkDi5ISIiIlPh5IaIiIhMhZMbIiIiMhVOboiIiMhUOLkhIiIiU+HkhoiIiEyFkxsiIiIyFU5uiIiIyFQ4ubkEoqOjxcnJqVb/ffbZZ5d6uBcZMGCAzRjd3NwkICBA2rVrJ3fddZesXLmy3uvYt2+fTJkyRaZMmSLr16+v/6CJTCYuLk6eeuopad++vfj4+IiHh4dER0fLXXfdJRs3brzo+Z999ln1OTtlypQ/f8CXqSlTpjh0vb3w2teoUSPx8PCQ0NBQ6dGjhzzxxBNy4MCBeo9p6dKl1de/hISEei/vauRyqQdAV77y8nLJzc2V3NxcOXr0qCxYsEBGjBgh8+fPFx8fnzotc9++ffKvf/2r+ucBAwY00GiJrnxff/21PPDAA2K1Wm3qiYmJkpiYKAsWLJBnn31W3n777Us0wquDYRhSWloqmZmZkpmZKbt375ZZs2bJpEmTbK5fjlq6dKl8/vnnIvLbtS86OrqBRnz14Cc3l0BCQoIYhlH9X1RUVPVj69ats3ns/vvvh8soLi7+k0Zr39y5c6WyslJSU1Nl9uzZEhISIiIiy5Ytk7Fjx17i0RGZz549e+Tee++tnti88MILkp6eLoWFhfLFF1+IxWIREZF33nlHZs6ceSmH6pDL5ZrmiPj4eCkrK5OTJ0/KpEmTxMXFRaqqquTVV1+Vt95661IP7+pm0CUXFRVliIghIsa6detsHjtfj4qKMrZs2WL069fP8PT0NPr373/Ray80btw4dZlLly41Bg8ebAQGBhouLi5GRESEce+99xonTpyo1Xj79+9fvey5c+faPLZ//37D2dm5+vE1a9ZUPzZ16lSjT58+RuPGjQ0PDw/D3d3daN68ufHggw8a8fHxcH/U/G/y5MmGYRjG4sWLjaFDhxrNmjUzvL29DRcXFyMsLMwYMWKEsWHDhlptB9GVaNSoUdXnw/Dhwy96fNq0adWPh4SEGOXl5YZhGMbcuXNtzqMPP/zQaN26teHm5ma0atXKmDlzps1ycnNzjQkTJhjNmzc33NzcDIvFYjRt2tQYOnSoMX/+fJvnxsXFGePHj69+ro+Pj9G3b1/jm2++sXneunXrqscwbtw4Y86cOUb79u0NV1dX48UXXzTCwsIMETF8fX2N4uJim9e2atXKEBHD3d3dyMrKMgzDMKqqqoy5c+caffv2Nfz8/AxXV1cjKirKmDBhgpGamnrRvvn444+N2NhYw83NzYiNjTU+/PBDY/Lkyer1DLnwenThdcswDGPmzJnVj3l7exs5OTmGYRiG1Wo17r//fqNz585GcHCw4erqanh6ehodO3Y0Xn75ZaOwsNAwDMOIj49Xr30XXstrey29mnFycxmozeTG09PTsFgs1T/XdXLzz3/+Uz1xvL29jZ07d/7ueO1NbgzDMG6++ebqxx9//PHqeufOndV1N27c2MjOzr5om7TJzVNPPaU+x9nZ2fj1119/f8cTXWEqKioMLy+v6mN9yZIlFz0nOzvb5nzYvn27YRi2k5smTZrAc+eNN96oXs6Fk6ia/919993Vz9uxY4fh4+OjPvf555+vfu6Fk5vg4OCLzu0Lr08LFy6sft3WrVur62PHjjUM47eJzZ133mn3mnLhL/q3334bPu/CfVHfyU15ebnh5+dX/fiiRYsMwzCMnJwcu5OWwYMHG4ZR+8lNba+lVzP+WeoKUVxcLD179pQTJ05IUVGRzJo1y+Fl7Nq1S9544w0RERk6dKgkJCRIaWmprF27Vtzc3KSwsFAeffTReo+1Q4cO1f8fFxdX/f9TpkyRAwcOyLlz56S8vFzS09PlgQceEBGR1NRUmT9/voj89me7uXPnVr9u8uTJ1X+mO38j5G233SabN2+W9PR0KSsrk7y8PPnwww9FRKSyslKmT59e7+0gutxkZ2dLUVFR9c8tWrS46DmBgYHi7+9f/XNiYuJFz8nKypJly5ZJQUGBzU20U6ZMkZycHBERWbt2rYiIXH/99ZKVlSVWq1VOnz4t8+bNk0GDBlW/5sEHH5SCggLx9/eXNWvWSElJiZw5c0b69u0rIiJvvPGGHDp0CI7h6aeflvT0dMnOzpZx48bJ3/72N3FychIRkXnz5lU/98L/Hz9+vIiILF68WBYuXCgiIvfff7+kpqZKSUmJfPXVVyLy2zXlH//4h4iIFBQUyOTJk6uX8emnn0pBQYGsWLFCsrKyLhpbXbm4uEjr1q2rfz5//bNYLDJ//nw5ffq0FBQUSFlZmZw6dUq6dOkiIiI//fSTHDx4UKKjo8UwDBk3blz1Mi68VeH8vYe1vZZezXhD8RXk888/l6ZNm4qISLt27Rx+/dKlS6v/f9WqVfAmtV27dklWVpYEBwfXeZwXOn+hEhEJCgqSl156qXod5eXlNs89cuRIrZcbGRkpU6dOlXXr1klycrKUlpbWeVlEVwrDMBrkeWPGjJHhw4eLiMi4cePk448/lq1bt4rVapVNmzbJiBEjpGXLlrJv3z45fPiwTJkyRdq3by9t2rSR0aNHi5eXl4iInDp1qnrikpubKzfeeCMcy+rVq23+0SMiEhMTI2+//bY0avTbv7EDAwNFROSGG26QtWvXyurVqyUjI0MCAgLk66+/FhGR2NjY6l/wS5YsqV7WZ599BpNOq1atEhGRLVu2SGFhoYiIdO/eXR566CEREbn55ptlzJgxsmDBArv7q67OX//c3d2lpKRExo0bJ4cPH5a8vDypqqqyee6RI0ekY8eOtVpuQ15LzYqTmytESEhI9cRGYxhG9clUUVFx0ePp6em1Wld2dna9JjcXRiHP/8ty+/btMnDgQKmsrFRfVzP5oSkoKJBevXpJampqvZdFdCUJDg4WLy+v6k9v4uLipFOnTjbPOXfunOTl5VX/fGFgQatFRUXJ1q1bRUQkIyNDRETmzJkj999/vxw4cEBmzJhR/VyLxSJTp06VZ599ttbXFPTpSNeuXasnNhcaP368rF27VioqKmTBggUSFRUl2dnZ1Y+dV5t1FxYWSmlpqc36a15H0f6pq/Lycjl27Fj1z+evf2+//bZMnDjR7mtre81qyGupmfHPUlcIT09PWPfw8Kj+/wvTBqdOnbrouWFhYdX//5///McmlXX+v6qqKpuPVR21d+9e+fnnn6t/HjVqlIiILFy4sPpkvPvuuyUrK0sMw5D3338fLufCT3xq+uW
XX6onNu3bt5e4uDipqqpqkO+XILqcOTs7y0033VT985w5cy56zuzZs6v/PyQkRLp163bRc2r+qerCn0NDQ0Xkt8nH/v37JSkpSVavXi0zZ86U1q1bi9VqlYkTJ0pKSorNNaVNmzbwmmIYhrz++usXjUG7po0aNao6dTlv3rzqP0m5ubnZ/LnmwnUvWLBAvZ65u7vb/GMtKSnJ7r6oj48++kjy8/NFRMTHx6f6z3dffvll9XPee+89KS4uFsMwZMyYMXA59q5/jl5Lr1ac3FzhLvzT0vLly0VEZNGiRbJ9+/aLnnt+oiEi8uabb8ry5culqKhICgsLZdu2bfLUU0+pJ5s9hmFIWlqazJ49WwYPHlx94g0fPlxuuOEGEfntb9HneXh4iMVikf3798t7770HlxkUFFT9/0ePHpWysrLqny9clouLi3h5eUlaWpq8+OKLDo+d6EozadIkcXV1FZHfvnLh5ZdflqysLCkuLpb58+fbfEHfK6+8YnO+nLd48WJZsWKFFBYWyueff179qY3FYpE+ffqIiMiLL74oS5YskYqKCunXr5/ccccdEhMTIyK/nfNnz56VmJiY6j83HTt2TCZOnCipqalSXl4ucXFxMmvWLOnUqZNDE4gLJzG7d++WH374QUR++1PahZOU0aNHV///Cy+8IBs2bJCSkhLJy8uT9evXy4MPPiiPPfaYiIj06tVLvL29q5c5e/ZsKSwslJUrV8rixYtrPTakoqJCTp06JZMmTZJnn322uj558mTx8/MTEdtrlre3tzg5Ocn3338vK1asgMu88Pp34MABmz9hOXotvWr9GXctk321jYIj8+fPt7lT/nxq4cJExYXLfOGFF+zejX8+hWXPhWkp7b+RI0ca+fn51a/ZsmWL0ahRo4ueFxsbW/3/48aNq35+cnKy4e7uDtMCOTk5Rnh4uN1lafuLyAwWLlxok55E/z3zzDM2r6lNWmratGnVz2/ZsqW67MjISMNqtRqG8VtaytfX1+5YzqeKakbBNSdOnLhoGb/88ovNc6qqqoy77rrL7novXIeWlgoJCalzWgr916hRI+OVV16xec2F8fwLn3fhPr5w3d999x1ctmE4fi29WnFycxmoz+SmqqrKeOedd4yYmBjD3d3daN++vfHJJ5/Y/Z6b5cuXGzfffLMREhJiuLi4GCEhIUa3bt2MZ555xtiyZcvvjrfm5MbZ2dnw9/c32rZta9x1113GqlWr4Ou+++47o1OnToaHh4cRFRVlvP7668acOXPUE3LRokVGx44dbS7i57dl//79xqBBgwwfHx8jKCjIeOihh4y9e/dyckNXjVOnThlPPvmk0bZtW8PT09Nwc3MzmjZtatx5553wu57Q99yc/86XmJiYi77nZvr06caQIUOMyMhIw8PDw3B1dTWaNm1qjBs3zoiLi7N5bkJCgjFhwoTq65C3t7fRqlUr4/bbbzc+++wzo7S01DCM2k9uDMMwBgwYUP3cVq1awedUVVUZ8+bNMwYOHGgEBAQYLi4uRnh4uHHdddcZL730knHw4EGb53/00UdGq1atDFdXVyMmJsaYPn16vb7nxsnJyXBzczNCQkKM7t27G0888cRF6zSM3yL8r776qhEdHW24u7sbnTt3Nr7//nub63TNdU+ePNmIjo42XFxcbCY3huH4tfRq5GQYtbz9noiIiOgKwHtuiIiIyFQ4uSEiIiJT4eSGiIiITIWTGyIiIjIVTm6IiIjIVDi5ISIiIlOpc2+pqqoqSUlJER8fH7tfFU1EVw7DMKSgoEAiIiJg358rBa9PRObjyPWpzpOblJSU323kSERXpqSkJImMjLzUw6gzXp+IzKs216c6T258fHxEROSOGT+Lq8XL5rHro33ga87klsG6iIirM/7XVbKd18B1ZBbC+vAuoXj5eeWwbncd54phPT2nBNa7NQ+AdU9XPPOsqNK/V7GsEj9WUlEF695uzrCeU3xx13ARkSrlOx0TM/B+FRFpEuwF6zmFpbDepjE+PmKCPGD919O56rq1f5VHBeJlFZfj/ZRnxfujTNmvAV6u6pi6RnjD+g8HM2B9eIcQWI8/h4+nA2fzYT3YB2+ziIi3Bz4OWga72/xcUlQoL9zaq/r8vlKdH39SUpL4+vpe4tH8f11f/hnW9752E6w/tFBvBjv7zk6wfuP0zbC+5unevzM6WxMWHYb1Wbe1V18zbv4+WLeW4Q7WPhZ8Hmnb9md4+Gu8z2ND8XXul8P4vBYRadRIuT6F4GtEcRm+DuUX4d9TpRV4v/44oac6phmb4mH9m81JsD76ejyJeKZfC3UdyKj/7VQf87HgqUjLMNv9VFpcKLPGDajV9anOk5vzv1RcLV7i5mk7AIs3XrF7Of5lJyLi5ox/0buV6a9BXCz4l7PFSxuTY5MnERFXKx6rSwn+BeLuiQ9kD2XiUV6Ff6GKiDhV4O0zlF/C7u54HW7i2ORG268ictH7f55rJb5wuXvh51u8Lcry8VhF9MmNhxdeVqVykXVzUtah7VdPfXLjqRz/rhY8Kdae71GK1+FqwWNy89QnN+7K5MbihV9zpf8p5/z4fX19L6vJjbM7/gWpjdHVgs8Ve69x8XBsHRrtvLa3HG28Fc74vHNVJjeX8j3TtsFDuW65eBSpy9ImN9q+LXfG1yGXKvx7qrIc71d7+89D+V3orBw32vMdfY9cLHj5IvpxoP3urM316cr9ozoRERERwMkNERERmUqdG2fm5+eLn5+ffLP15EUfq/9wNAu+JsxX/yjfS/kTzfY4fL/Fw9fhmwVPnMP3hnyy+hSs9+3aRB1TUhb+uPGWTvj+newi/JFipXIPTZZy38upVHxPhYiIi3JvkqZHNL7fp0J52zuH448B/dz09+7XMzmwfiS54HdGZyvEF/+JJNBL/+vpwOhAWP/hOD4GT6Tkwfo1LfBytHuTLMr9UiIi2h8VtVcsXnca1h8b1RbWo/zwfvrhUKY6ppQc/CexyCDbj4rLigtl4d96S15e3mX15xxHnb8+oe24ccZW+Jo1j1+vLu8R5f6Tj+zcf3KpPLf8OKy/Obx1gyy/9T9XqY85K7cXHHl9MKzf++V+WJ93T2fHB6a4ax6+D+ire7s02Do0C/achfX5O1Ng/djJbFg/9dawBhuTZszs3bC++KHuDbL83v/9VX1s8z/61WoZ9s7rmvjJDREREZkKJzdERERkKpzcEBERkalwckNERESmwskNERERmUqdv8TvPC9XF/FytU3SaKmoHhH63c1zt+O7yk8m4CROZif8ra5NlMRNWBhOAUX6u8O6iP6tmvuTcYqqVSj+4rgTGTipMqZdGKx/lG3nS6Ec/HK1YTF4P328E38bZSvlW4KjffUvYNISRdc294f1rzcmwnpuAP7CxsEd8H4SEfFVUlyHEs/B+r5d+Ns5B7TB3+h5Q3OcjNukJMTsOZ2Jj4OsVCVdqHwL8m4lhdbYX/8SvwMn8Tr+0r2xzc/WIidZqC7lyrP8UJp4etueT439PR1ezrZDafiBPzgt9dDXB9XHZv+1I6xrqajHFh+B9Zlj2sH6V0rSx9
1d/7WhfWmdxtFU1Gs/49Rrn6Y4FSrScKmo6KeXw3rC9OHqa4I88O+Xg4fTYT1j42pYf/gbnA4eEotTnj8qiWURkblj8bc/ZyiJSs20X3DKc88ZnEiNCNa/kPKPwE9uiIiIyFQ4uSEiIiJT4eSGiIiITIWTGyIiIjIVTm6IiIjIVDi5ISIiIlOpd+PMF5fsEQ8v28aZ6YXl8DXWMhwZFhHxteDYq5cbnn/lWnFM211pLOmhNDrUYswiIi2CcbTWRUk7ZiqNM09n4Gi3n6cbrlv0qOWpdNwYdHC7IFhftDsV1q9riZ+fXYTfO3uCvfF75600Q/V2V5rrpVlh3V6TyoNJuLFqizAfWB8Wi7d7byqOV2vx7eYhOPYvojdQbaN8VUBRGW61qZ1HnRrjKPOZHBylFxHxccfvxf5k2+PJbI0z7/p0s7h52kZQtShsXTRUU8axX+yF9QX3dXV4TI6u41QSju76+eEYc4CX/vUZ3z7QzfGBAY7uj8Ezt6nL+ukx/DUPmr//cAzW3x7ZxqHl2KONVxvrffNxg9Ev7na8weiwD7fDersmfrDeUNs97qsD6mOf31W7c5KNM4mIiOiqxckNERERmQonN0RERGQqnNwQERGRqXByQ0RERKZS78aZ2cUV4ia2qQ4PJU5kGDixISIyPDYY1qeswHeu+/tod/LjBNLEfi1gfcY23MRRRKRXBG7G9vyyw7B+fx/c3OxoCk7ijOsSAevTN+PmjiIiFiX1EmTB231Te9z48cd9uBGgjydOPsWE4/SRiJ446xWJG2dmleBUT1ZBCawXleiJth1bcSO9nLaRsB7giQ/5tqE4gXRHR/weLT2KU2giIrnFZbBeZeC0VMtAXK9SgoyH03CCq28U3t8iIq6N8L9jQmo057QWupiqceaHt3eoderrlo92qI890BMfT79uS4D19scyYD0wEB9n74zGTTAbkqPJqx8P4W14ZSm+/jWkzUpjX1G2wd1V/93iKC0d1GPqOljfNWmguixL18dh3bp3hkNjqksqSrPy0etg/envjzbYOpDaJqIaCj+5ISIiIlPh5IaIiIhMhZMbIiIiMhVOboiIiMhUOLkhIiIiU6l3Wio+o0BcLbapjiaBXvC5Ph76He3llTgZEhuJ+13EKumWxHM4ifOxcve9h5277F9bcwLWnZxwGixOWfcTvaNhfW1CNqxXVentvu7v3gTWX1yMEwwz7sQ9bvYl5cN6l6Y4WaL1+BIR2ZeMe2dN3xAH61qyoX9r3PfpwFmcNhMR+falIbB+MAO/5q0FuBfQ0AGtYD3IA6fQ9p7B+09ExNsDn1Y/Hc6EdV8loab18DkQj4+blFycNhMRUQ5Z6Rlte35ZS/VkmtmteORa9bGv9pyF9bMzRzm0jiFKT6FXVuNUqJZssafZEz/A+pkPRjq0nG8P4kSlvXTQwr3JsH5nV3zdWnM0C9YjlD5H//ctvs7F2klzDvpgK6yvfeJ69TWIve3WOJqK0gyYvhnW1z/dG9a7v4aTXSIi3t74mrbhGbysP0OvN3+F9evb2KZ9S4txb0WEn9wQERGRqXByQ0RERKbCyQ0RERGZCic3REREZCqc3BAREZGp1Dst1SM6QDy8bO9UD/fB6Q8PF30uNUdJMxUr6Q1fC16Hm9LXyuKK1x2o9BoSEWnkhJNDKTlWWD+dgVND+SWVsN6zGb7DP89OL6XVp3BSpmNLnDRKL8YJGi8PvP9yrXjduXiTRURPd7VREg+JmfiO99Q83JMp3M9DXXdiPu6zdCgVvxfNovF+8rXg4yA+Dy8/NtxbHVPrENwr6ru9OH2itJCS1qF4OWeycIoq1BfXRUROp+N9HlDjPPKoxMeFmby0Eqcg/z0sVn3NOz+ehPW7uuGeU6M+3QXrqx/r+Tujqz9HU1GauWMd7wWkpaK0tM99vfD+2/Z8f4fXfTl6ZTU+1l4doh9riJaK0ux+2fFkV4dJP8O6Rfld27tDGKxPv7Wtw+vOz8dJ4xBv23WXONV+ysJPboiIiMhUOLkhIiIiU+HkhoiIiEyFkxsiIiIyFU5uiIiIyFScDEPLatiXn58vfn5+8uiCneLuaZscSczESRU3O2mpAbEBsJ5eUA7r5UpCx7URTkul5uMkTmk5TjKJiAR64R4cvkqPrCZ++PnxSs+pszk4yaRsgoiI+Cl9iNyc8b5NVfoNackaZ2XlId76XepFpVWwXiX4PdLGmnAOj9XTTe//VeRgmi7EC29HcTneBk2uVT9uTqfhvlM3dwyF9WMZOIpWohyb3ZvilF2ykjYTEfG34H24O9F2rOXWQln8aD/Jy8sTX1+cFrwSnL8+XartuPtL3MNs/j2415vWc6qkTD/OHO0F1FDJnT/DaCVttuThHg22Du09cnfB50qC0q/ulyd7ObxuLU23VNk+bawa7TgTEWnz/GpYH9YrCtZPpODrmdaL7bnlx2H9zeGt1THVliPnNT+5ISIiIlPh5IaIiIhMhZMbIiIiMhVOboiIiMhUOLkhIiIiU/lD0lJBSr+muCy9QdHYzuGwvuJEFqzHBON+Qz/sS4f15qE4YVJpZ/PLlLSK1uvoeCq+q/yGNsGwvvsMfr69d8Tiju/kL1T6Ubko6SctLaX12jqWhhNwIiKB3jgltvEA7qV0e+9msN4qyBPWv92HlyMiEqq8F+3DcV+mnGL8np7JwYm2uHSckGgRho8nEZEOjfF27E7Cy2of7gXrWk8yLdF2MBkfTyIizZR9W/NYKy0ulJljrzFNWurm99eJq6XG9ckHH/uz/9pRXd7CvcmwPn9nCqwvG39NLUdq3w3vb1EfK1cSfuGBynn0QDeH1t3nrY2wbu9XhofSs27tE9c7tG7Nw98cgvVP7+jQIMu3Z9aWBFif0Cv6D1/30FnbYX3VhOv+8HU/tvgIrBdYcZL5i7s7/2FjYVqKiIiIrlqc3BAREZGpcHJDREREpsLJDREREZkKJzdERERkKpzcEBERkano3RBrycetkbi7286RVu1Lhc9tHIQjryIiSQXFsO6tRJ/LKnAc8Z7rmsD6YiUi7uWh7wIvd/yYFpd+sm9zWN96NlddB1JYgiN2InqE26WRY/PUSge/AOCvSlRfRORgRiGsj+2HG7GlKg0eN51MgvV2TfzUdbu54Fj0niQ9uo40DcD7tWsTvN27kvA2i4h8vPo0rD87vBWsz9uOY8aN/XGkd1i7IFgvr9KbfzYSvJ+2x52z+bnC6th+u9x5e7iKW414sr3It+ZIBt4vDRX51thrynjjjK2w7qc0/P12H46t394lAtY3TewL650nr1XHtPkf/dTHHKE1lgz2xV/98PnOM/qYEvBXJPzvdsfi4zOW48ajdYmCa+/dmsdxZF6LfE9ahce04/Q5WBcR+emxnr8zOlszx7SD9d7//dWh5dz75X71sXn34Ph4zSaflaW1vz7xkxsiIiIyFU5uiIiIyFQ4uSEiIiJT4eSGiIiITIWTGyIiIjKVejfO/GjdYbF42zYRzLHiJo45xbguIuLhiudZPx3CKac2Tfxh3VqG1zEgBj9/b4p+93XvKJzS2ZiQB+uVV
XhXHkvGzx/fBzeQ/OmEfqf74Xj82KODcFJrw2m87opKnKzxcMPpNHtpLC2x1DwQJxs0R9JwYq5EaWAqIrLveCash4XiZN7ILmGw3sgJb0NRGV63m7O+P5S+lpKnNMLUnh/qjZsQas1etQShiEix0mTRtcbKS4oK5KVbupimceZLS/eIh5ft9WnSjTENtp7mz6yA9fh3b2mwdWje2xgP608pqc2r1a2f7IT17//mWNJNSziVKdcIEZEDuxNgPbgxTjyeemsYrE9cdgzW3xrRBtaf/v6oOiYn5Vr37ki8rMsJG2cSERHRVYuTGyIiIjIVTm6IiIjIVDi5ISIiIlPh5IaIiIhMpd5pqbd+OiCWGmmEuOwS+JoQJf0hIlJQiu84P3QW9wUZEBuoLAenQo6m4V5AEf56oie/BCevXJWkTJAX7jmlpVjCffD+2JGIt1lExFPpd1Wq3LHfvxVOiWm9kWJCLIFVQmEAACAASURBVLAeovTTEtETZxVKq6OsfHx8eFvw/vCz0/+rbRge7+Y4nBKLSyuA9T5tQmC9iR/u06Md4yIiLkr8KVdJER5Teo8N7YT7WuUpx6VVSUSJiCSfs8J6+wjbc7e0qFCm/aWbadJSl2o7Hl+C0ypFSt+4uWM7/ZHDqZMnl+JteH9U2z95JP/fhO+OwHpjX3yeioi8fBNOx93y0Q5YX/HItY4PzEH9390M6wlKGjbx/REOLf+2ObvVx7S0VH4x7vl3TEmkamO68/O9sL5wXFd1TN1fWwfru18eaPMz01JERER01eLkhoiIiEyFkxsiIiIyFU5uiIiIyFQ4uSEiIiJT0WMotVRcVilVrrZJncwCfNe1Pa2VlE5cJh6i1ouqtBInk0qUNFGhktISEbEqj7l54f5LWi8gLcVyTknPFJfqPbhCfd1hPa8I7/MKpd+VpmavofO8XPVDReuzVFKOt8PZGa9DC+6V29kGZ2W87q74PXKUmzJWe8mkXCV1EO6Lk3mVyjGrpaI8lW3TkoIiIhalZ1hujXWU2jn2zG7MbD1h0rZGquy8fw+LhXVvN3xOzBjtWCpqyMxt6mOrH+vp0LI0jyw6DOtFyvHX6ZU16rIOvHojrNc2DfN7Zv2lHazP2Iz7bImI3D53D6w3VCpqxP9w7yoRkWXjcf+qDc/0hvVmT/zQIGNa9GB39bFeb/4K61ue69cg69YSvfZYLPg1NfdtuRWnfBF+ckNERESmwskNERERmQonN0RERGQqnNwQERGRqXByQ0RERKZS77TU5lM54mqx7ZcSoqRCErOK1eU82L0prKcV4F4sIZ44NbTqcDKsawkkLzt3dh88lQXrnWNxH6J9yThdNeHaZrD+4o+4d4vS+kNERPpG415RL25JhPWBrXEPLi1tFqDctX40U3/vSpQmUkt/wtt3y6DWsH5bhzBYn7PjrLruMW1DYT2yPU7fvZqYA+unM3B/rNS8UljPyMO9mkREIoO8lHXgO/3bNsPvaYaSOqxQ0lXxaXpPsubhuA9LeY1lacu+GiSm6Ptv8UN6+gRpFoCvT44qVJJ3IiIB986H9Zx5dzu0jo9ua+/Q89s8v1p9bNaWBFjXUlETlx2D9bdGtHFoTI/3bl6nxxzx5W58HUpJr32C5/ec+WBkgy1Lo6WitBRcSAi+nq194npYn3NnR4fHtGliX1gf9uF2m5+1ZCnCT26IiIjIVDi5ISIiIlPh5IaIiIhMhZMbIiIiMhVOboiIiMhU6p2WcndpJK4utnOkddvPwOcGBXmqy9mShJNJPu54/vXakiOw/spo3HskJtAb1l9ehe/WFxH5ajzu3TJzG04mvToY95kZ/T/cH+aWro1hPT67RB3TurhcWF/6FL7b/MF5u2C9cTDeHxsPpMH6vf2j1DEFeOJ0XMKs22B9bwLehtFv4rv1xwzB76mIyLA7J8N69NARsP7VI/gO/6R8nAbrEO4H61/s0RNcb7y5CNYPLnga1o9n4JTOf9ecgvW/D4qB9eh+LdQx+XvivmcPLdhr83OFFafGzMQybDqse4WHO7ysJ5fiROC873A/owm9oh1a/uZ/NEy/HxGRDCV52v2FH2E9acatsH5s2pAGG9OXyw7BuqNpKXu0RNbUYXgd649nwvo93SNh/cvtOKHbkObswL9zHrxWvy47SusL9vlO/Pu8IcVn4d95Kx+9zubn/Px88ft77ZbJT26IiIjIVDi5ISIiIlPh5IaIiIhMhZMbIiIiMhVOboiIiMhUOLkhIiIiU3EyDKNOnfLy8/PFz89PHvlqh7h72saKs5WGfy1D9Cj4caX5WKC3Yw3oejTFEefdZ3HEVYuai4hYy3FDyMJS3CCzd3PcnPBoOm6yGOiJk/hpyv4TEWnmj/fHaSU+fq2yP7afwfvbojTUbB+OG1GKiBxOw9unvWbVkWxY15qb9myJm3+K6PtwW3werJ/NxsdBjxZBsH5dU/yebojHcXYRkaQsvI5W4T6wfiQZj1VrwFlajo+/FsH6+ZWUi48PXw/b/VdaXCgz7rxG8vLyxNcXb/uV4Pz1adh7v4irxfYc+P5v18DX3Dd/v7q8L+7u7ND6b/loB6xf2yIA1icPbgXrd36+F9ZFRKxl+HwpsOLId/82uOGvtm7NrZ/sVB/T9q1mqvJ1B5tO4GvEqgnXwbo9Y2bvhnWtGeo1/14P68VKE9PrOuKv9BDRm0j2eWsjrGsNJBtS58lrYX3/vwY1yPL7v7sZ1lso1z8RkTPKNbNmc87z53Vtrk/85IaIiIhMhZMbIiIiMhVOboiIiMhUOLkhIiIiU+HkhoiIiEyl3o0zU84Vi6vVdo7UuSluNhjqo6/ujY9wg8dpz+JmXrN/Pg3rnSJwU8GBLfxhPa9UTya5NMJzv/WncFLmUCpuvrj/TA6sD+8UBuuj2oSqYzqVi1NOA5TtO30OJ5n8LM6wvu0kTimcSMONF0VEmiqpnk4heExbPAtgPa8IvxdN/dzUdW84jZNGZRU46dandTCsZxXi5EliLt5/IV76/nBzxgm1EmVMwzvh93t7Am6oeTQRH0+9W+DzTkSkWElY+Vtsz8kSp3pfEi4rZ1MLxdnDNhD6wIID8LmOJqLsWfHItQ2ynIXjuqqPaUmjSTfia+DoT/E1VqM1BbWXiJq06gSsTx2KmwqfycEpPi0VFfXUMlhPfA83yhXRU5ianS8NcOj5r/2M3wcRkQHTcXKoVEncOuqxxbiB9MwxerPhVlH4uqx5fsVxWJ92S2tY3/BMb1h/ZTU+NkRErGUNsz8uxE9uiIiIyFQ4uSEiIiJT4eSGiIiITIWTGyIiIjIVTm6IiIjIVOrdW+qBz7eJW43eUsHeOEmi9WQSEQn3wYmYk5k4gaTdAW9xw2mPm9vg3kGbz+BEiohIp8a4V8+a4+dgvWkA7qWUmlcK69dF494Yp7NwgkBET9y0DcPrjs/G684qxMmk8kq8fFdnfR7s54nf71bBHso68CEXdw6P1V7a4XgSTg5FN8b7tmME7m+ijUlL+CXl6ik7axnehxY3vA9bBeH9dCwTJ7XyS/B51FJZjoiIp7LuvBrLKikqkCkj
upqmt9QfvR1aGqa4GPd38lTOlfVP44TJpfTccpySOZmG044iIkse7vFHDedPoyWQEjNxUnX5/zmejOv15q+w3i4a99GrVK7Lc8d2cnjdf7RhH26H9ZWP6n3BapvIYm8pIiIiumpxckNERESmwskNERERmQonN0RERGQqnNwQERGRqdS7kYxh/PbfhXo3xb0rMq04DSMi8urCQ7B+c59oWD+ZjHsKhQfihJO7M+6l1F5JGYnoMz83JTm0Zm8KrPv74xSLq9ILaEDzAHVMP5/GSS1fd/xWujrjVE+Ikk47loLTY0kpekKiS+sQWC+vdIf1tAKcJAn3wUmSCk/9MM0pxO+3tztelrc7fu8OpuBU3pFUvN3do/Q+ToOVPl/vb4qH9V6R+PlpFryfNh/LhPVQb9yrTETkTC4+93o2s00cFLvgdV6pRn68Q1wstr3P1jx+PXzu62v1HkEvDsL9mrSUU9PHv4f1pBm3qutwlNbT6OWb8Fgd9eZw3Dto7s4zDbL8umj34k+wfuT1wepr+r69CdY3/r0PrGfk4bRqXVJRGm8vfP399I4ODbYOzfcH0mD91k7hDbL8Q4fx8u+bv199TUP2dTuPn9wQERGRqXByQ0RERKbCyQ0RERGZCic3REREZCqc3BAREZGp1Lu31Kcbjoint22/nuQCnNCpUPr3iIis3I/vsG6rJK/8LThB49zICdbP5uA74KPs9ONpJHhZzQLwne7rTuTCekYe7hHUwcFtExFZcygd1kd3bwzrG07gdNV1LXAiK9eK+zgVKv2SRES8lb5FHi64XliGeyMVKD2TogJx6kpE5KzS46mkHC+rV3PcjyRe6Wu1Kw7vv5s6hKpj8nPHybxEJbGk9R5rG+4F69ryT9rpSab1ErO42S6rrLhQPr33OlP3lprwHe4dVFGlH+OrN+KkW+L7I2D9js/2wLqLkrRMysB9i6LDcC80EZF59ziWMOn9X9zPKFPp39dBSUEufqi7Q+u9XGnpncIS5RpYgpOEzUK8YV2k4dJPY7/YC+sL7uvaIMv/M9w+F58TIiLfPtCtVstgbykiIiK6anFyQ0RERKbCyQ0RERGZCic3REREZCqc3BAREZGp1Lu31I9HssTVYpvSsJbhu839vfTUy7ujO8L65J+Ow3q5H045hSo9k27riPvufKOktERE+sXgNNMn6xJg/bWR7WF9bzrug9UyAPdF+mhTojqmMT0iYD05D6dh7rmmCax/sC4O1luE4zvQ+7bUeyntTymC9fZhePtOZ+NUT1wGXk66kjYTEcktxEkji9Jra3467hU1RkmbfTIWpxFe/0XvQ7TrWAas3967Gawbvvi8iFPST2WVONXTW0mCiYi4NsbJq4waKaoSD3P1lkI27nOsB5yInorqMOlnWA8Jwft73VO9fmd0tffSyhOw/u9hsbD+tNJz6vYu+JqiiZm4Un1sRL/msP7uyDYOrSPqqWWwnvgefh/qoo2SRtT6iF1KEf56D8Q/mpbUylESmGUVOKnaT0nf2fP4kqO2yy7GqUKEn9wQERGRqXByQ0RERKbCyQ0RERGZCic3REREZCqc3BAREZGpcHJDREREplLvxpmrdieIl7dtBDXdiiOseaU4OiYisupINqxnF+Co78gu4bC+JwlHfcuV+GxsKI4DiohUKLsmwhfHzb/engzrrq54Djm8E46nxylNHEVEVm3BMfG7BrWE9dNKU7zmITha6KM0ZUywMyatWWm2EtMO8sbR5yBPHN8uKMXRQhGRiir8HmUrzVvbR+AmdyXl+PhYfxTHuu+4FkfsRUS6hODY/Gf78PGRX4zj13cox3hhOf6qhS0J+eqYUnPwcXB9y0Cbn0uKCuRfI7uapnHmgi0nLmrsO7Ij3q910eLZH2E97p2bG2wdDaXZEz/AuqsbPu9u6Ytj3XHK1ymIiKxduQ/WrUvH/87oaueRRYdh/aPb8NdwXGnGf3sI1kuUZsNf3O1Y81QRkR8P4WvazXaaATvi9bX4azIaImLPxplERER01eLkhoiIiEyFkxsiIiIyFU5uiIiIyFQ4uSEiIiJTqXdaqs3EJeLsbps46t4Op4CKS3HKQ0SkcQBO74xpi+/gnrPrLKy7ueK0T6CnK6xXKmkbEZHELNzIMUxpYlZagRM3N7YKgPUP1pyG9fv64gaLIiIdgnAS5+MdZ2DdoiQhmgfhJoGnsnCTyuRsvC9ERNo0wWM6m40TOjFhOKHWUhmT1mhTRCTMG7+vsYE4FXUwEyc9tp7OgfUKJWXnqTTmFBFxd8HHoJuSmhvWOgjWtQaq2hmrHZciIh2UlNixNNv3tay4UL4e38c0aSm0Hf3e2QRfU1SkNw1trKQql//ftQ6N64b3t8D6L082XEPNWz7aAesrHnFsrA2pz1sbYX3TxL5/8kjq7pXVuFHpq0Nwo1IRkSeXHoX1HpH4fLyvR1NY7/TKGlivUH7nHHl9sDqmS/VeXP/GBvWxrf/sD+s1x1pRUiTbXx7GtBQRERFdfTi5ISIiIlPh5IaIiIhMhZMbIiIiMhVOboiIiMhU6p2Wmrfp+EW9WwrKcOog306PoA0nzuEBOuG+RV2b4TulE87hZI3Wm6NFsKc6phLlTvTGSm+plQdxzw53JSXTIxqnqOz1Utp0JB3Wb+yM++UkKOknbbu1PlGp+XpfMDcX/Jp8K07HeXvgpJG/BaeMyiv1Q1R7LL8Er7tFEE4U5SnP3x2PU1SD7fRhaR2IkzUrjmfBemk5fr/7tsQptGKlD9aRNPxei4hkFeDzok1j29RGaVGhvPmX7qZJS73z8wGxeNlenzIL8Xv98k31731zOXj6e5zQWan0pTv+xlCHln/73D3qY5u2xcP6jX3xvk1Q+lRt/Hsfh8Z0tYqZuBLWB18fpb5m1l/a/VHDERGRv/9wDNbfHtlGfU2PqetgfdekgTY/s7cUERERXbU4uSEiIiJT4eSGiIiITIWTGyIiIjIVTm6IiIjIVPQGObXUNtRXvH1s71qe8tNx+NwgH9w7SEQk1A+nWApLcPKqX1Pcj6dFIO5n9MQHuJ/MsEH6HdzlSl+hv3ZoDOuuzjg1tPpwJqzvSsBJnDIlpSUiEncKJ7KK2uD0zqHT2bDet6U/rId6usP6wChcFxHZl5kL61p6R2vntf9MHqz3j8XvtYiIpxuen28+jZcV5IkPeQ8t8VVQCuv2epItP4ZTUTfE4H3+5Ezcb6hLZA9Yv61DBKwfTDmljsmqpAV/3Jls83Nlid5D7Er00LVRtU59aT2ZRP74vkyW6/4B69bt/3V4WX9tj69Pbs74XOn4Mu5bdPC1G2H92we66StXHhv96S5YP3IgCdYnrcJ9nKYO1fs4OWrYh9thfeWj18F6h0k/w3r/bk3Udfi44wTotFta/87oaqcwD5+v9q5Pg2dug/WfHuvZIGN6sne0w6+pmYpqCPzkhoiIiEyFkxsiIiIyFU5uiIiIyFQ4uSEiIiJT4eSGiIiITKXeaan5+5PF3dM2mfJUr+bwuVuScf8oEZGtcThx0zwE9+lZeDgN1jfsSYb1J8d2h3UPV5ySERFJycd
JreUncGLp7c/xXegjh3XE63bDd9L3jtbTHR6u+DVaf6cwZf/lKf2r9qfiBNey9XoS54aeuI/JnmM4JXbXgGhYv6djK1j/bB9+T0X0RNtNbQJhPcSCU1/z9qTA+okjeN3lPfSExD8HtID18V/vg/WvJuKkQFwuTkK8uBKnEfOK9f5fIb44qTisQ4jNz9aiAtn3b3UxV5wHvtovrhbb/lkLx3WFz61LIkrrs2Q3UQTc+fS9Dq9b88kunECacye+Dm05hq9ngz7YCutrn7je4TG5K9etkMb4PHU0FWUZ8rb62LVDcQpowzO9HVrHoak3wfoQJX0kIrJaSSBNXYOvp5NuxD24ek7bAOt5ezbCevkgPY31/hh8HGh9qqbd0xnW96YWwvqvR/DxVJd+Yc8tt73WlRbjdSL85IaIiIhMhZMbIiIiMhVOboiIiMhUOLkhIiIiU+HkhoiIiEyFkxsiIiIyFSfDMPQOW3bk5+eLn5+f/N/8HeLmaRu1dG2EY8n+Fj15fjilANaDfHB0t6QcR5mjA3EDziplM8vtNBjTpObhZopN/PFYsworYD0qED+/QIlpi4i4OOF9W6pEosO8XWH9ZFYJrGsN1wKUhpMiIkVleN0tg3D8+GAqjjiXKw1DmwbqDVcrKvF4MwpwLLq0Au/bxn54HdHKe3QmR49dF5fj7VBOC3Ub3F3xvz2ylWaeLYI91TFlFeNjMMTL9n0tKSqUaWO6SV5eXq0bTl6Ozl+fLtV2DHwPN0ONDvWG9bljO8H63V/irw8QEZl/TxdYv+F9vO5fnuwF60Nn4QaSqybgBpJ1cefne2Fdi+VrtNh1gLfe2LeoFB/7y8ZfA+taBL5MaT5bl4jzgOmbYX39047F0ycuOwbrCZl6A9xFD+KvRdH8GcdHbTlyXvOTGyIiIjIVTm6IiIjIVDi5ISIiIlPh5IaIiIhMhZMbIiIiMpV6N848kHBOXDxs0xsRQbhZ45aj+epypt+Gm3OdKSiG9fVKo81tp3FzzpvaBeMxKcsREWkaiNMnQV5usF5WgVMvXZvg/fHl1rOw3iEqQB1Tm1CcBkvKxemdRTtw48debUJhPbsIL+fAmTxYFxHpGu0P6+uPZ8P6/ddGwnp+GW5U+sNB3IBTRCQ1G6cCHhmIm7dGeuP9982hdFhfexRvg48Fp9BERAqseDt6t8JNAntF4Pf7zQ2nYb17FN7fyUqKT0RkRBt8/M/dYXsMlltr35juSnXNv9fDelJilvqatP/dBusfbIqHdU93fGnVUlFPLj0K61oiyp5AO8khxNHUy7APcXpGRGTlo3hZWgpTc9uc3bCuNaJs+8JqdVlH/zME1jtPXgvr+/816HdGV3+OpqI0P+/C13d72+Dodj/ZD19LHWWvwehd10bA+rhrmtV5ffzkhoiIiEyFkxsiIiIyFU5uiIiIyFQ4uSEiIiJT4eSGiIiITKXeaamqKkOqatwJf64QpzbKlB5EIiI7UnNgvWMI7h9xLBmnd2LC8fOb++Pk0/ZGegqoeyTuA5OkpFJClT5Om+NxSkxpE6X2ZBLR+071VxJLX/98AtbdOoTBekIGTsvYa0Hm7eYM67d3C4f1EAtOc6yNw8eA1l9MRGTdBrx9pzvhdfdrHgLrAZ44FZVnxaeIrydOzImIHInHib12PfGd/75K8ipXOY+CvPCYtLqISIQPTonVPFcrSvSeWWax86UBsB58/0L1Nc/8gHv49I3yg/UVj1zr0JjeH9XWoefb42jvIEe1VK6x9ozoiM87jbtyTdFoiSgRkfHfHoL1ewZGO7SOP0NcphXWW4Tg89fLS09tahxNg4V567+PHKEl3UREtp3SU8t1xU9uiIiIyFQ4uSEiIiJT4eSGiIiITIWTGyIiIjIVTm6IiIjIVOqdlmoc5CmuFtveSR6u+E53L6XfiojI3rM4pdNY6ZOi9U/xdsfrPnEO9yAK8NKTOMn5OK3i7oxjTonn8PNrpsnO8/LAd7q7KssXEckswmmpDCted1iYD6x7uuJ5bYgvvjO+pByv97fHcAouuxj3WKqowu91I2Wzg+wkAtp3bqo+hqTm4DRCsbINhUqfqHBlP4mItIjECZrkQrzusiq8b4OVdeSX4OeXVuhpxKPZOLFXM/VV7mT+tJQmMlpP9OyNw2m6d0e2+aOGU2dan6pcpW/cF3fjvn4a7fpnz3098Hka9vC3sD5icMOlx0rK8PnyjwEtYX3qmlOwvl1Jc3p76L/XFtzX9XdGZ0tLRWm2PNcP1kd/ukt9TVJaAawP6d4E1qMD8O/I7tGOpeYeWHBAfUzruVYf/OSGiIiITIWTGyIiIjIVTm6IiIjIVDi5ISIiIlPh5IaIiIhMpd5pqV4t/cXiZZvISVBSQ/5eej8eLcWSZcV3+I/uHArrB1JxKurTlSdh/ebe0eqYlu9Ng/XnboqB9dnbz8J6p0h8V/nB0ziBkVaAEzoiImfPlcD6qXS83f5+OHHTzB/Xw31wMulYBl6viEiV4DTYwq1JsN42KhDWO0d4wXpppd7X6qUhsbCeVYKPQa2HWZaSjNu1B7+nUUP0lEzvlgGwvj+1GNYTMzNgPTYcJ93ClPdo4Y4UdUw74vC/Y6pq9AyrUFJjV6qXVp4Qd0/bHnEJmfhc2TfFsZ47IiLPrzgO69Nuae3QciyjP4F165K/OTwmrU9VlylrHVpO6EPfwHrG7DscHpMm/dPbHXr+lNX4Oj5lSCv1NVpbvCaPLoH15A9HOzSmxxYfUR9boFw/xnaLdGgdjiqv1M/jXZMG/qHr1tQlEdX6n6tsfq4sxecuwk9uiIiIyFQ4uSEiIiJT4eSGiIiITIWTGyIiIjIVTm6IiIjIVJwMQ7uX3L78/Hzx8/OTFk8skkbunjaP/WVAC/iazEK9b02c0u/ihZvwXfD/+Br3qegcGwzrD3THd6dvPovTMyIiPkqfqu0JebDuojRHCvHBvTnS8nECaVw33ONDRMRZWUdcLr6LfEs8HmuEkpb69QhO7mQqCRMRkRuuwX1j2oXjPinaEVdUhu/wLyjV+1rtjDsH6yO7hMF6pA/e7m/2p8O6uwue/7sr/dNERFyc8Wu0U23Cdc1g/e/fH4b1dOW9GHW93mcrNhi/FwfSbJdVWlwo797eQ/Ly8sTX17HeMZeT89enK307GlKftzbC+qaJfRtsHd/uw4m9fam4n9ymY5mwHhHkCeu/bkuA9XPp+DogItJ/UAdY79AU94B7a8Qf3y/s7z8cg/XYUHx9+mxDIqy7ueHA84ZneqvrvuWjHbBeMzl53r+H4fRdNwd7SzUER85rfnJDREREpsLJDREREZkKJzdERERkKpzcEBERkalwckNERESmwskNERERmUq9G2fefWML8ajROHNPIo4fW5TYmojItOHtYH3GVhyBe/5W3Jhuv9I4871N8cqY9EivJtwfx2rbh+G6prwKR++mrsLN+EREOihNJ1NycFPG7lH+sJ6Yg2Po9/TGkflgC46zi4jsScEx/vXHcTwzPADHPF2dccw9uwA3tRQRual9CKzHZ+PXrD6cBevafgpVmlTuOYujrSIiaU
osv4sSPX1m6SH8/CjcgDO2ezis7zyjjym7uALWi0pt62XF+tc1XIkeX3xE3Go0zvz0DhwN1qLSIg0bl0Y6vbIG1g+8emODrWNAB/z1CJrhH+PI8PL/u1Z9zewtuFluciq+RvRohxsgq00W7+uqrvuPNurTXbC+9OEeDi/r7ZE4bt5h0s+w3q0t3k9f3N0Z1gd9sFVdt/a1Hp1j8bX0UkS+GwI/uSEiIiJT4eSGiIiITIWTGyIiIjIVTm6IiIjIVDi5ISIiIlOpd1qqqZ+HWLxtm31F+uFkjZ8bTp6IiLyw4gisN1IaRWYV4zu4PZRGh6XluPniwNZB6pg2nsRpn9Jy3ODx+/246WRMuA+sd4nAqSFlk0VEpFBpIlmpJK+uicAJnVwrTs9U4k2TlAKcrhIRScvHySRX5b3YfQLvp97tcJqjTbiXuu4BzXCj1Nl7zsL6sZM4LdUkEL8Xxcp7He7rpo5pZFs8pvfW48ReVhZOurW9NgLWVx/JhnVXpWGniMjWg7hB4dPDbRvTWgud5Et1KVeelsEe4uFVuxSjvURU82dWwHr8u7c4NJ57v9wP62Vl+Lx++vuj6rKm34obGmqmDo116Pk9onFaz56yCrwdB1/Dqa+vlPN0tJJMaqVcS7XkqYjIvnh8HV/3VC9YDx+/CNbT/ncbrD/09UF13fd0xufwtLUnYf3Q1JvUZSG3z90D62ufuN6h5YiItHj2R/zAPTiR9a+f8DZMHoybXf/ZBxpYPQAACmlJREFU+MkNERERmQonN0RERGQqnNwQERGRqXByQ0RERKbCyQ0RERGZipNhGPpt5nbk5+eLn5+fTFy0W9y9bHu3pOXj/jT2UkA3xOA78zfG4z5V7cNxuuVEphXW03Jx2sfboie4SpUEQ5MAD1h3c8Eb6O+BQ2nHMnBKJrdI7++j9bXKUvovRfjjsVYo6YJIf5x0szcLzigsh/XTGbiHSZNAvA2+HrjPV3o+Xr6ISPMgvH3FSjrubA7eT4Fe+DjQzg6rkqISEamowo/5KsdBah4+Nhv74W0LU/pdHc/Ax76IiLuSXKuotN3AsuJC+ez+npKXlye+vldmTxmR/399aqjteGdDHKz/fASn0Do2xb3Kjqbkw/qy8dfUbWAOuPvLfbAe4IXP+WPJubC+5nHHkzhXq/Hf4r5xViX1Ok9JJo2ZvRvWtV/fS+rQ78pR2pgWP9Qd1m94f4u6LHdXfO0vrxHfrbAWyYbnbqrVec1PboiIiMhUOLkhIiIiU+HkhoiIiEyFkxsiIiIyFU5uiIiIyFTq3VvKxclJXJxsU0LtwnCSSUvDiIisOIx7/hSW4KSMxRXPy7TeUpFKQifQU98FKUpKJzEbp5w83fGyij1xeqZ3NL7be18qXr6IiKey3QVWXO8ThXtL/Xgc9yfKLsLbrO1XEZFzxfg1kUH4OIjPKFSej3tI+Vn09yjUGz+2OR7vwyMJuM9Mz7ahsN42DB83p7Nw6kpEpIkffo3Wq8zJCafsQrxxKmrzqRxYjw7Re3AlZuLk2i0dQ2x+thYZ8pm6FHObtOqE+pjWl+nZ/i3+qOE0uBTlujX/ni5/+Lr7vbMJ1n99tg+sD/pgK6zXpWeSo8vq9MoaWD/wKu6PpfV3EhFpH4F7YU0Z4lj/JS2BpLE3pm8f6ObQsjSOjumXJ3EvLxGRzpPXwvqI65va/Fxa5C4bark+fnJDREREpsLJDREREZkKJzdERERkKpzcEBERkalwckNERESmUu+01NG0AnG12Pa3KC6tgM9Nz8KJDRGRVU/3hfX5e5NgPcgTJ0kW7kqBda1/ysEzuH+KiJ5+atcEp5y83PBcsWs4fv6/Vx6HdRc7yaRB7cNgXest9fHmM7DeOgKPKUTpsZSppKhERJr44X373aZEWJ84sjWsW1xwmm72FnwMiIhkFuBk0s1tg2G9e1OcXtB6km04iY+P8gq9t9QB5Zga0AaPSTtutiXgvmp/6YKPgWWHcJ8jEZGRnXEa7Ls9aTY/l1txks1M2r6wGtaP/meI+hqtv9l/1p2C9em3tnV8YEDsc6vUx068ORTWx36xF9bXPYXTKl/uPgvr93SPhPWop5apY7qhZxSsnzun9z1DHE1F3fk53mYRPYWpuaNfNKy/tzEe1u2lj7SUmJaWem45/p3gaE+yuLP42vFnmPbLaVh//oaW6mv2/2tQrZadn58vb9VyHPzkhoiIiEyFkxsiIiIyFU5uiIiIyFQ4uSEiIiJT4eSGiIiITKXeaal7ujURT2/bBMpPp3Hfoq7R/upyNsfh3lKfr8N3qD98I+7p0iUKr6OwtBLW3ZReTSIi7cPxXfYJOTiZ1CzADda/3IsTXE/fiO8ef/+XOHVMK3Ynw/rATuGw7uWGE0iHkgtgvbEv3obrInG6SkRkyxl8J/+d/aNh/UAqTs0lKP2PQv081HUXluBk3gzluAkLwOmqoe2CYD2kRSCsH8zA+09E5GxuGaw7K4fakj2psH5Pzyaw7uWKT9trm+vn1/4UvG+vaR5g83NJkav8pC7FHGKa4/fUnjBfnCKct3gfrDualrr1k52w3kRJZtqz4L6uDj1fS0Vpqir1pODcsZ1g3V6aqSEsHKdvs70+S8ieRJw0SkjG9af6NleXpfXOavbED7AeGobTnLsmDYT1uTtxGrZr6xBYb0hTVp/EdQf7Ztkz7qsDNj+XFdc+zclPboiIiMhUOLkhIiIiU+HkhoiIiEyFkxsiIiIyFU5uiIiIyFQ4uSEiIiJTcTIMw/j9p10sPz9f/Pz85PGFO8Xd09vmsUNJODLnY8FxShGR2HBvWI8JxjHgojIc7da25uttuDlcbKQenz1XiCPfo5UmhFalmWJhGa5/rzT5bNcsANZFRIK98T5MyS2B9ZwiHEvu1wrHYbWQp6edyHx+CX4vTmXhZnk+HjjKXFqO167F00VEwn3w/tCOj5IKfIAk5eL3OvlcMawH++rx9KRMHFcc1hEfN7+ezIF1d2WfD22LY+sbTuvN8ryVfX4g4ZzNzxUlRbL5hSGSl5cnvr6Ox5AvF+evTyNnrBdXi+21ZdGD3RtsPf/6CcdhJw/GcdhnfjgG6++ObNNgY3rtZ9zM8+WbYmB9wndHYH3WX9o12JgayqUc69BZ22F91YTrHF7Wk0uPwnqxct369I4ODq/DUdqY3h/l2NcaaMff2sPp6mt8PPE1vmZj0PPndW2uT/zkhoiIiEyFkxsiIiIyFU5uiIiIyFQ4uSEiIiJT4eSGiIiITKXejTPbh1rE4u1pUytW0kGxIXrCZOay47D+lwG4QaabixOsdw7HjcdcXPA8rnkQbqQoIuKEV6EmirKKcBPHwym4yWL7KJyKahbgro7peDpO74T54tf4e+lJI6RESSwFWnADThGRFOUO/x5NcQJu1WHcJLW0HC+nZ1SEum5vpYnk1gS8Du099VWSfAHK/ovw0/erh6sfrGcV4+PjTCpuPDqgc2NYTyssh/VALz2NeDQZr6NJsG1z2HJrncKTl61593a9KFWhpV5iGuvpixmjcWJEO7c1rZT0Z10MmL4Z19viVJ5m1/EM5
RGcQHro64Pqsmb/taND6x716S5YjwzCTYtLlGtEXTy/Av/OmXZLa1hPOJPbYOt2NIGk7aelD/doiOGIiONj0pxVkqfrn+6tvqbtC6sbZN0X4ic3REREZCqc3BAREZGpcHJDREREpsLJDREREZlKnW8oPt+1wVp08VfNlxUXwdeUFOEbIUVEqkrxjbKlxfir7A1nfGeo8nSpKNHGpN8QWFaMWxpYC/GcsERpdVBuxYMqE3xTaokbXs5vY8ItDUpd8L4tr8I3iJa44JtPtRuKrY30m1VLle22KuvQ9keFtu5C/T1qpNxQXKYcCNoNxaUGHmtZMd62Und8c7CISJkV3/RYWoVvyq5Ujk3t2C9xUtpXKDcsi+j7vNzZrcbzfhtLHbuyXDbOjz8//+IbqSuseH+XFdtpMQKWI6LvV+35VuV6oz3fHu2aVgquyfbWoR1/2vO1c8veazTqtbEYH39lpfgYr8v+084vdT+VOrafGpKjx9mlpB0f9sZa2317/ufaXJ/q3Fvq7Nmz0rRp07q8lIguc0lJSRIZGXmph1FnvD4RmVdtrk91ntxUVVVJSkqK+Pj4iJP2z2EiuqIYhiEFBQUSEREhjRpduX+15vWJyHwcuT7VeXJDREREdDm6cv9pRkRERARwckNERESmwskNERERmQonN0RERGQqnNwQERGRqXByQ0RERKbCyQ0RERGZCic3REREZCqc3BAREZGpcHJDREREpsLJDREREZkKJzdERERkKv8P8gJ3Ff+6VYoAAAAASUVORK5CYII=" + }, + "metadata": {} + } + ], + "execution_count": 5, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-06T21:51:06.820Z", + "iopub.execute_input": "2020-11-06T21:51:07.061Z", + "iopub.status.idle": "2020-11-06T21:51:11.112Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Let us create our problem now:" + ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "M = opnorm(Zfull,2)\n", + "f = SquaredLossMatrixCompletion(Zobs, iterative = true)\n", + "r = rank(Zfull)\n", + "Z0 = zeros(size(Zobs))\n", + "C = RankSet(M, r)\n", + "setting = NExOS.Setting(μ_max = 5, μ_min = 1e-8, μ_mult_fact = 0.5, n_iter_min = 1000, n_iter_max = 1000, verbose = false, freq = 1000, tol = 1e-4, γ_updt_rule = :safe)\n", + "problem = NExOS.Problem(f, C, setting.β, Z0)" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 6, + "data": { + "text/plain": "Problem{ProximalOperators.LeastSquaresIterative{1,Float64,Float64,SparseArrays.SparseMatrixCSC{Float64,Int64},Array{Float64,1},ProximalOperators.AAc},RankSet{Float64,Int64},Float64,Array{Float64,2}}(description : Least squares penalty\ndomain : n/a\nexpression : n/a\nparameters : n/a, RankSet{Float64,Int64}(1.0000000000000002, 5), 1.0e-10, [0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0])" + }, + "metadata": {} + } + ], + "execution_count": 6, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-06T21:51:11.124Z", + "iopub.execute_input": "2020-11-06T21:51:11.136Z", + "iopub.status.idle": "2020-11-06T21:51:12.475Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Time to solve our problem..." 
+ ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "state_final = NExOS.solve!(problem, setting)\n", + "\n", + "Z_estimated = state_final.x" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 7, + "data": { + "text/plain": "40×40 Array{Float64,2}:\n 0.00921175 -0.0038448 -0.00129091 … -0.00447769 -0.00843496\n -0.0147354 0.0011719 -0.000635124 0.00706023 0.00987895\n -0.0444565 -0.00207041 -0.0177526 0.0141985 0.00632431\n -0.0026325 0.00607353 0.0049039 -0.00903814 0.00128217\n 0.0112686 0.0109044 -0.0102908 0.00818285 -0.0127803\n 0.0291193 -0.00187681 0.000716274 … 0.013873 -0.00132453\n -0.0028306 -0.0188753 -0.0163884 0.0218045 -0.00466638\n -0.00147231 0.0134306 0.0115974 -0.00983515 0.00878667\n 0.0105054 0.00433151 0.00609612 0.00531648 0.00636103\n -0.0051754 0.0209462 0.0130169 -0.0116038 0.0111824\n -0.0119 -0.000936404 -0.0142817 … 0.00737043 -0.00860866\n -0.0144667 -0.00838286 -0.0160724 0.0259919 0.00282836\n -0.00400986 0.0017526 -0.00431504 0.019474 0.00892805\n ⋮ ⋱ \n 0.0347436 0.0353489 0.0538003 -0.0787362 0.00203755\n -0.00400174 0.0174897 0.0194419 -0.0309039 0.00560616\n -0.0107255 -0.0128399 -0.000264674 … 0.0154012 0.0140037\n 0.0289964 0.00806107 0.0227597 -0.0375122 -0.00791514\n -0.00530796 0.00876034 0.00745653 -0.00555373 0.00819782\n -0.0171454 -0.0176783 -0.0123308 0.0109885 -0.000278122\n 0.00567607 0.00238189 0.00376122 0.00607806 0.00619147\n -0.00492816 -0.0150325 -0.013713 … 0.0200635 -0.00200861\n 0.0283517 0.0228515 0.0230801 0.00278105 0.0182905\n -0.0135796 0.00143811 -0.0111562 0.0100748 -0.00156238\n -0.00264522 0.00379569 0.0072004 -0.000408534 0.00941569\n -0.0167782 -0.00217151 -0.0046546 0.00407195 0.00347602" + }, + "metadata": {} + } + ], + "execution_count": 7, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-06T21:51:12.485Z", + "iopub.execute_input": "2020-11-06T21:51:12.491Z", + "iopub.status.idle": "2020-11-06T21:51:18.816Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Finally, we plot a simple histogram to see how much of the matrix has been recovered." + ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "figure(figsize=(8,3))\n", + "PyPlot.hist(vec(Zfull - Z_estimated ),100)\n", + "xlim([-0.5,0.5]),xlabel(\"Absolute Errors/Residuals\",fontweight=\"bold\"),tight_layout()\n", + "show()\n", + "PyPlot.display_figs()" + ], + "outputs": [ + { + "output_type": "display_data", + "data": { + "text/plain": "Figure(PyObject
)", + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAxYAAAEiCAYAAABkykQ1AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAPYQAAD2EBqD+naQAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3deXxU9b3/8feQZUgwCRJIQiSQYKMsQdk0skmsAiI77VXEIgj3UbggFKliIgjBW0OhiqiIVa9CWmW5LqBeXEgrm+yERVkuQg2bECNeOgkQE0i+vz/45ZQhISY5M5kkvJ6Pxzya+c73nPnMzFc67/me7zkOY4wRAAAAANhQz9cFAAAAAKj9CBYAAAAAbCNYAAAAALCNYAEAAADANoIFAAAAANsIFgAAAABsI1gAAAAAsI1gAQAAAMA2f18XUBXFxcU6efKkQkJC5HA4fF0OAAAAUCcZY5SXl6fo6GjVq1f+nEStDBYnT55UTEyMr8sAAAAArgnHjx9Xs2bNyu1TK4NFSEiIpEsvMDQ01MfVAAAAAHVTbm6uYmJirO/f5amVwaLk8KfQ0FCCBQAAAOBlFVl+wOJtAAAAALYRLAAAAADYRrAAAAAAYBvBAgAAAIBtBAsAAAAAthEsAAAAANhGsAAAAABgG8ECAAAAgG218gJ5AIDaJzZ5lfX3kT/282ElAABvYMYCAAAAgG0ECwAAAAC2ESwAAAAA2EawAAAAAGAbwQIAAACAbQQLAAAAALYRLAAAAADYVulgsX79eg0YMEDR0dFyOBxauXKl2+PGGKWmpio6OlpBQUFKSkrSvn373PoUFBRo4sSJaty4sRo0aKCBAwfqxIkT9l4JAAAAAJ+pdLA4d+6cbr31Vi1YsKDMx+fOnat58+ZpwYIF2r59u6KiotSrVy/l5eVZfSZPnqwVK1Zo2bJl+vLLL3X27Fn1799fRUVFVX8lAAAAAHym0lfe7tu3r/r27VvmY8YYzZ8/X9OmTdPQoUMlSenp6YqMjNSSJUs0duxYuVwuvfnmm/rrX/+qe+65R5L09ttvKyYmRn/729/Up08fGy8HAAAAgC94dI1FVlaWsrOz1bt3b6vN6XSqZ8+e2rRpkyQpMzNTFy5ccOsTHR2thIQEq8+VCgoKlJub63YDAAAAUHN4NFhkZ2dLkiIjI93aIyMjrceys7MVGBio66+//qp9rjR79myFhYVZt5iYGE+WDQAAAMAmr5wVyuFwuN03xpRqu1J5fVJSUuRyuazb8ePHPVYrAAAAAPs8GiyioqIkqdTMQ05OjjWLERUVpcLCQp05c+aqfa7kdDoVGhrqdgMAAABQc3g0WMTFxSkqKkoZGRlWW2FhodatW6euXbtKkjp16qSAgAC3PqdOndLevXutPgAAAABql0qfFers2bM6fPiwdT8rK0u7d+9Wo0aN1Lx5c02ePFlpaWmKj49XfHy80tLSFBwcrOHDh0uSwsLCNGbMGP3+979XeHi4GjVqpMcff1zt2rWzzhIFAAAAoHapdLDYsWOH7rrrLuv+lClTJEkjR47U4sWLNXXqVOXn52v8+PE6c+aMEhMTtXr1aoWEhFjbvPDCC/L399f999+v/Px83X333Vq8eLH8/Pw88JIAAAAAVDeHMcb4uojKys3NVVhYmFwuF+stAKCWiE1eZf195I/9fFgJAKCiKvO92ytnhQIAAABwbSFYAAAAALCNYAEAAADANoIFAAAAANsIFgAAAABsI1gAAAAAsI1gAQAAAMA2ggUAAAAA2wgWAAAAAGwjWAAAAACwjWABAAAAwDZ/XxcAAKjbYpNX+boEAEA1YMYCAAAAgG0ECwAAAAC2ESwAAAAA2EawAAAAAGAbwQIAAACAbQQLAAAAALYRLAAAAADYRrAAAAAAYBvBAgAAAIBtBAsAAAAAthEsAAAAANhGsAAAAABgG8ECAOAzscmrFJu8ytdlAAA8gGABAAAAwDaCBQAAAADbCBYAAAAAbCNYAAAAALDN48Hi4sWLmj59uuLi4hQUFKSWLVvqmWeeUXFxsdXHGKPU1FRFR0crKChISUlJ2rdvn6dLAQAAAFBNPB4s5syZoz//+c9asGCBDhw4oLlz5+pPf/qTXn75ZavP3LlzNW/ePC1YsEDbt29XVFSUevXqpby8PE+XAwAAAKAaeDxYbN68WYMGDVK/fv0UGxurX//61+rdu7d27Ngh6dJsxfz58zVt2jQNHTpUCQkJSk9P1/nz57VkyRJPlwMAAACgGng8WHTv3l1///vf9c0330iS9uzZoy+//FL33XefJCkrK0vZ2dnq3bu3tY3T6VTPnj21adMmT5cDAAAAoBr4e3qHTz75pFwul1q1aiU/Pz8VFRXp2Wef1YMPPihJys7OliRFRka6bRcZGamjR4+Wuc+CggIVFBRY93Nzcz1dNgAAAAAbPD5jsXz5cr399ttasmSJdu7cqfT0dD333HNKT0936+dwONzuG2NKtZWYPXu2wsLCrFtMTIynywYAAABgg8eDxRNPPKHk5GQNGzZM7dq104gRI/TYY49p9uzZkqSoqChJ/5q5KJGTk1NqFqNESkqKXC6XdTt+/LinywYAAABgg8cPhTp//rzq1XPPK35+ftbpZuPi4hQVFaWMjAx16NBBklRYWKh169Zpzpw5Ze7T6XTK6XR6ulQAgIfEJq+y/j7yx36V6g8AqBs8HiwGDBigZ599Vs2bN1fbtm21a9cuzZs3T6NHj5Z06RCoyZMnKy0tTfHx8YqPj1daWpqCg4M1fPhwT5cDAAAAoBp4PFi8/PLLevrppzV+/Hjl5OQoOjpaY8eO1YwZM6w+U6dOVX5+vsaPH68zZ84oMTFRq1evVkhIiKfLAQAAAFANHMYY4+siKis3N1dhYWFyuVwKDQ31dTkAcM0r71Coihz2VJHDpwAA1a8y37s9vngbAAAAwLWHYAEAAADANoIFAAAAANsIFgAAAABsI1gAAAAAsI1gAQAAAMA2ggUAAAAA2wgWAAAAAGzz+JW3AQCoqvIutAcAqNmYsQAAAABgG8ECAAAAgG0ECwAAAAC2scYCAOAVl6+XAADUfcxYAAAAALCNYAEAAADANoIFAAAAANsIFgAAAABsI1gAAAAAsI1gAQCokWKTV3FmKQCoRQgWAAAAAGzjOhYAgCpjRgEAUIIZCwAAAAC2ESwAAAAA2EawAAAAAGAbaywAAB7FugsAuDYxYwEAAADANmYsAAA+xywHANR+zFgAAAAAsI1gAQAAAMA2ggUAAAAA27wSLL777jv95je/UXh4uIKDg9W+fXtlZmZajxtjlJqaqujoaAUFBSkpKUn79u3zRikAAAAAqoHHg8WZM2fUrVs3BQQE6NNPP9X+/fv1/PPPq2HDhlafuXPnat68eVqwYIG2b9+uqKgo9erVS3l5eZ4uBwAAAEA18PhZoebMmaOYmBgtWrTIaouNjbX+NsZo/vz5mjZtmoYOHSpJSk9PV2RkpJYsWaKxY8d6uiQAAAAAXubxGYuPPvpInTt31r/
9278pIiJCHTp00BtvvGE9npWVpezsbPXu3dtqczqd6tmzpzZt2uTpcgAAAABUA48Hi2+//Vavvvqq4uPj9fnnn2vcuHGaNGmS/vKXv0iSsrOzJUmRkZFu20VGRlqPXamgoEC5ubluNwAAAAA1h8cPhSouLlbnzp2VlpYmSerQoYP27dunV199VQ8//LDVz+FwuG1njCnVVmL27NmaNWuWp0sFAAAA4CEen7Fo2rSp2rRp49bWunVrHTt2TJIUFRUlSaVmJ3JyckrNYpRISUmRy+WybsePH/d02QAAAABs8Hiw6Natmw4ePOjW9s0336hFixaSpLi4OEVFRSkjI8N6vLCwUOvWrVPXrl3L3KfT6VRoaKjbDQAAAEDN4fFDoR577DF17dpVaWlpuv/++7Vt2za9/vrrev311yVdOgRq8uTJSktLU3x8vOLj45WWlqbg4GANHz7c0+UAAAAAqAYeDxa33XabVqxYoZSUFD3zzDOKi4vT/Pnz9dBDD1l9pk6dqvz8fI0fP15nzpxRYmKiVq9erZCQEE+XAwAAAKAaOIwxxtdFVFZubq7CwsLkcrk4LAoAfCg2eZXXn+PIH/t5/TkAAGWrzPduj89YAADqvuoIFACA2sXji7cBAAAAXHsIFgAAAABsI1gAAAAAsI1gAQAAAMA2ggUAAAAA2wgWAIBaITZ5FWejAoAajGABAAAAwDaCBQAAAADbCBYAAAAAbOPK2wCAGo11FQBQOzBjAQAAAMA2ggUAAAAA2wgWAAAAAGwjWAAAAACwjWABAAAAwDbOCgUAqBDOzgQAKA8zFgAAAABsI1gAAAAAsI1gAQAAAMA2ggUAAAAA2wgWAAAAAGwjWAAAAACwjWABAAAAwDaCBQAAAADbCBYAAAAAbCNYAAAAALCNYAEAAADANoIFAAAAANsIFgAAAABs83qwmD17thwOhyZPnmy1GWOUmpqq6OhoBQUFKSkpSfv27fN2KQAAAAC8xKvBYvv27Xr99dd1yy23uLXPnTtX8+bN04IFC7R9+3ZFRUWpV69eysvL82Y5AAAAALzEa8Hi7Nmzeuihh/TGG2/o+uuvt9qNMZo/f76mTZumoUOHKiEhQenp6Tp//ryWLFnirXIAAAAAeJHXgsWECRPUr18/3XPPPW7tWVlZys7OVu/eva02p9Opnj17atOmTWXuq6CgQLm5uW43AAAAADWHvzd2umzZMmVmZmrHjh2lHsvOzpYkRUZGurVHRkbq6NGjZe5v9uzZmjVrlucLBQAAAOARHp+xOH78uH73u9/pnXfeUf369a/az+FwuN03xpRqK5GSkiKXy2Xdjh8/7tGaAQAAANjj8RmLzMxM5eTkqFOnTlZbUVGR1q9frwULFujgwYOSLs1cNG3a1OqTk5NTahajhNPplNPp9HSpAIBaKDZ5lfX3kT/282ElAIDLeXzG4u6779bXX3+t3bt3W7fOnTvroYce0u7du9WyZUtFRUUpIyPD2qawsFDr1q1T165dPV0OAAAAgGrg8RmLkJAQJSQkuLU1aNBA4eHhVvvkyZOVlpam+Ph4xcfHKy0tTcHBwRo+fLinywEAVFHJzACzAgCAivDK4u2fM3XqVOXn52v8+PE6c+aMEhMTtXr1aoWEhPiiHAAAAAA2VUuwWLt2rdt9h8Oh1NRUpaamVsfTAwAAAPAyr155GwAAAMC1gWABAAAAwDaCBQAAAADbCBYAAAAAbCNYAAAAALCNYAEAAADANoIFAAAAANsIFgAAAABsI1gAAAAAsI1gAQAAAMA2ggUAoM6ITV6l2ORVvi4DAK5JBAsAAAAAthEsAAAAANhGsAAAAABgm7+vCwAA1GysWQAAVAQzFgAAAABsY8YCAFDrMasCAL7HjAUAAAAA2wgWAHAN47oPAABPIVgAAAAAsI1gAQAAAMA2ggUAAAAA2zgrFACgzq2zuPz1HPljPx9WAgDXDmYsAAAAANhGsAAAAABgG8ECAAAAgG0ECwAAAAC2ESwAAAAA2EawAAAAAGCbx4PF7NmzddtttykkJEQREREaPHiwDh486NbHGKPU1FRFR0crKChISUlJ2rdvn6dLAQAAAFBNPB4s1q1bpwkTJmjLli3KyMjQxYsX1bt3b507d87qM3fuXM2bN08LFizQ9u3bFRUVpV69eikvL8/T5QAAyhCbvKpOXLuirrwOAKgLPH6BvM8++8zt/qJFixQREaHMzEzdeeedMsZo/vz5mjZtmoYOHSpJSk9PV2RkpJYsWaKxY8d6uiQAAAAAXub1NRYul0uS1KhRI0lSVlaWsrOz1bt3b6uP0+lUz549tWnTJm+XAwAAAMALPD5jcTljjKZMmaLu3bsrISFBkpSdnS1JioyMdOsbGRmpo0ePlrmfgoICFRQUWPdzc3O9VDEAAACAqvDqjMWjjz6qr776SkuXLi31mMPhcLtvjCnVVmL27NkKCwuzbjExMV6pFwAAAEDVeC1YTJw4UR999JHWrFmjZs2aWe1RUVGS/jVzUSInJ6fULEaJlJQUuVwu63b8+HFvlQ0AAACgCjx+KJQxRhMnTtSKFSu0du1axcXFuT0eFxenqKgoZWRkqEOHDpKkwsJCrVu3TnPmzClzn06nU06n09OlAsA14fKzJh35Yz8fVgIAqMs8HiwmTJigJUuW6MMPP1RISIg1MxEWFqagoCA5HA5NnjxZaWlpio+PV3x8vNLS0hQcHKzhw4d7uhwAAAAA1cDjweLVV1+VJCUlJbm1L1q0SKNGjZIkTZ06Vfn5+Ro/frzOnDmjxMRErV69WiEhIZ4uBwBwGa75AADwFq8cCvVzHA6HUlNTlZqa6umnBwAAAOADXr+OBQAAAIC6j2ABAAAAwDavXiAPAICa4sr1JZwhCwA8ixkLAAAAALYxYwEAqNM4ExYAVA9mLAAAAADYRrAAAAAAYBvBAgAAAIBtBAsAAAAAthEsAAAAANhGsAAAAABgG8ECAAAAgG1cxwIA6iiu31B5Je8ZV+UGgMpjxgIAAACAbQQLAAAAALYRLAAAAADYRrAAAAAAYBvBAgAAAIBtnBUKAOoYzgZVebxnAGAfMxYAAAAAbCNYAEANEZu8il/OAQC1FsECAAAAgG0ECwAAAAC2sXgbAHBNquphZyXbHfljP0+WAwC1HjMWAAAAAGxjxgIAapErf2XnV3PvKGs2g/caAMrHjAUAAAAA25ixAIBqUB3H5XOqWu8q7/292ud7+TbMeACo65ixAAAAAGAbMxYAUMOU9ev31X4tZ5bCd3jvAcCdT2csFi5cqLi4ONWvX1+dOnXShg0bfFkOAAAAgCry2YzF8uXLNXnyZC1cuFDdunXTa6+9pr59+2r//v1q3ry5r8oCgEofF1+ZMzVVZq0Fv4jXPnXlM+NaHQCqwmczFvPmzdOYMWP07//+72rdurXmz5+vmJgYvfrqq74qCQAAAEAV+WTGorCwUJmZmUpOTnZr7927tzZt2uSLkgCgXFf+gluRMwRVZH+49lRkLFVmpqC82YXaeN0TZkuA2ssnweL06dMqKipSZGSkW3tkZKSys7NL9S8oKF
BBQYF13+VySZJyc3O9WyiAa1JxwXnr75J/Z0rarrwPVFRlxlJl/v/tyv2V9Vhl91vePr3Nl88NoLSS/xaNMT/b16dnhXI4HG73jTGl2iRp9uzZmjVrVqn2mJgYr9UGAJIUNr/8+0BFVWYsVWWcVWSbyu7Xl+Od/9aAmiUvL09hYWHl9vFJsGjcuLH8/PxKzU7k5OSUmsWQpJSUFE2ZMsW6X1xcrP/7v/9TeHh4mUEEl+Tm5iomJkbHjx9XaGior8tBLcU4gqcwluApjCV4CmPp5xljlJeXp+jo6J/t65NgERgYqE6dOikjI0NDhgyx2jMyMjRo0KBS/Z1Op5xOp1tbw4YNvV5nXREaGsp/LLCNcQRPYSzBUxhL8BTGUvl+bqaihM8OhZoyZYpGjBihzp07q0uXLnr99dd17NgxjRs3zlclAQAAAKginwWLBx54QD/++KOeeeYZnTp1SgkJCfrkk0/UokULX5UEAAAAoIp8unh7/PjxGj9+vC9LqNOcTqdmzpxZ6jAyoDIYR/AUxhI8hbEET2EseZbDVOTcUQAAAABQDp9deRsAAABA3UGwAAAAAGAbwQIAAACAbQSLOuTMmTMaMWKEwsLCFBYWphEjRuif//xnhbcfO3asHA6H5s/ncqfXusqOpQsXLujJJ59Uu3bt1KBBA0VHR+vhhx/WyZMnq7Fq1AQLFy5UXFyc6tevr06dOmnDhg3l9l+3bp06deqk+vXrq2XLlvrzn/9cTZWipqvMWPrggw/Uq1cvNWnSRKGhoerSpYs+//zzaqwWNVVl/00qsXHjRvn7+6t9+/ZerrBuIVjUIcOHD9fu3bv12Wef6bPPPtPu3bs1YsSICm27cuVKbd26tUJXVUTdV9mxdP78ee3cuVNPP/20du7cqQ8++EDffPONBg4cWI1Vw9eWL1+uyZMna9q0adq1a5d69Oihvn376tixY2X2z8rK0n333acePXpo165deuqppzRp0iS9//771Vw5aprKjqX169erV69e+uSTT5SZmam77rpLAwYM0K5du6q5ctQklR1HJVwulx5++GHdfffd1VRpHWJQJ+zfv99IMlu2bLHaNm/ebCSZ//3f/y132xMnTpgbbrjB7N2717Ro0cK88MIL3i4XNZidsXS5bdu2GUnm6NGj3igTNdDtt99uxo0b59bWqlUrk5ycXGb/qVOnmlatWrm1jR071txxxx1eqxG1Q2XHUlnatGljZs2a5enSUItUdRw98MADZvr06WbmzJnm1ltv9WaJdQ4zFnXE5s2bFRYWpsTERKvtjjvuUFhYmDZt2nTV7YqLizVixAg98cQTatu2bXWUihquqmPpSi6XSw6HQw0bNvRGmahhCgsLlZmZqd69e7u19+7d+6rjZvPmzaX69+nTRzt27NCFCxe8VitqtqqMpSsVFxcrLy9PjRo18kaJqAWqOo4WLVqkf/zjH5o5c6a3S6yTfHqBPHhOdna2IiIiSrVHREQoOzv7qtvNmTNH/v7+mjRpkjfLQy1S1bF0uZ9++knJyckaPny4QkNDPV0iaqDTp0+rqKhIkZGRbu2RkZFXHTfZ2dll9r948aJOnz6tpk2beq1e1FxVGUtXev7553Xu3Dndf//93igRtUBVxtGhQ4eUnJysDRs2yN+fr8hVwYxFDZeamiqHw1HubceOHZIkh8NRantjTJntkpSZmakXX3xRixcvvmof1B3eHEuXu3DhgoYNG6bi4mItXLjQ468DNduVY+Tnxk1Z/ctqx7WnsmOpxNKlS5Wamqrly5eX+SMJri0VHUdFRUUaPny4Zs2apZtuuqm6yqtziGM13KOPPqphw4aV2yc2NlZfffWVvv/++1KP/fDDD6XSeokNGzYoJydHzZs3t9qKior0+9//XvPnz9eRI0ds1Y6axZtjqcSFCxd0//33KysrS1988QWzFdeQxo0by8/Pr9QvgTk5OVcdN1FRUWX29/f3V3h4uNdqRc1WlbFUYvny5RozZozeffdd3XPPPd4sEzVcZcdRXl6eduzYoV27dunRRx+VdOmQOmOM/P39tXr1av3yl7+sltprM4JFDde4cWM1btz4Z/t16dJFLpdL27Zt0+233y5J2rp1q1wul7p27VrmNiNGjCj1D2+fPn00YsQIPfLII/aLR43izbEk/StUHDp0SGvWrOGL4TUmMDBQnTp1UkZGhoYMGWK1Z2RkaNCgQWVu06VLF3388cdubatXr1bnzp0VEBDg1XpRc1VlLEmXZipGjx6tpUuXql+/ftVRKmqwyo6j0NBQff31125tCxcu1BdffKH33ntPcXFxXq+5TvDhwnF42L333mtuueUWs3nzZrN582bTrl07079/f7c+N998s/nggw+uug/OCgVjKj+WLly4YAYOHGiaNWtmdu/ebU6dOmXdCgoKfPES4APLli0zAQEB5s033zT79+83kydPNg0aNDBHjhwxxhiTnJxsRowYYfX/9ttvTXBwsHnsscfM/v37zZtvvmkCAgLMe++956uXgBqismNpyZIlxt/f37zyyitu//7885//9NVLQA1Q2XF0Jc4KVXnMWNQh77zzjiZNmmSdAWHgwIFasGCBW5+DBw/K5XL5ojzUIpUdSydOnNBHH30kSaUuJrRmzRolJSV5v2j43AMPPKAff/xRzzzzjE6dOqWEhAR98sknatGihSTp1KlTbuePj4uL0yeffKLHHntMr7zyiqKjo/XSSy/pV7/6la9eAmqIyo6l1157TRcvXtSECRM0YcIEq33kyJFavHhxdZePGqKy4wj2OYz5/yvlAAAAAKCKOCsUAAAAANsIFgAAAABsI1gAAAAAsI1gAQAAAMA2ggUAAAAA2wgWAAAAAGwjWAAAAACwjWABAAAAwDaCBQAAAADbCBYAIOnIkSNyOBxyOBxau3at159v8eLF1vOhbli7dq31mR45cuSq/VJTU7362TO2APgKwQLANWHRokXWl6169eqV+8Wvphs1apQcDoeSkpI8ts+kpCTr/bnytnLlSo89j7eNGzdOkZGRKi4utt6nkpufn58aN26s++67T7t27fL4c4eGhioxMVGJiYlyOp0e3z8A1HT+vi4AAKrD4sWLrb+NMUpPT9fMmTN9V1ANFRgYqA4dOri1NWrU6Kr9jTG6ePGiAgICSj1WVFQkSfLz86tyPYWFhQoMDKxQX2OMPv74Yw0YMED16rn/bpaYmKizZ89q3759+vTTT7Vjxw4dPXpUQUFBVa7tSh07dtSWLVs8tj8AqG2YsQBQ52VlZWnDhg2SpM6dO0uS0tPTZYwps/93332nAQMGKDg4WM2aNdMrr7xiPVZUVKSUlBS1bNlS9evXV8OGDdWxY0f96U9/svrk5+frqaee0o033qjAwECFh4dryJAh2rt3b7l1lswajBo1ymorOWwmNjZWkhQbG6v09HRJ0rp160odvnXy5EmNHj1a0dHRCgwMVMuWLfWf//mfunjxYoXeq6ZNm2rLli1utzvvvFOS+yE2n332mdq2bauAgABt3LjRrc6//OUv1ms/fvy4JOmjjz5S9+7ddd111
ykoKEgdO3bUW2+95fbcJfueO3euhg4dqgYNGui3v/2tJOn5559Xq1atFBwcrJCQELVt21aPP/642/bbt2/XyZMnNXjw4FKva8uWLdq7d6+efvppSdIPP/ygAwcOWI8XFBRo5syZio+Pl9PpVEREhEaPHq3Tp09bfbKzs/XQQw+padOmCgwMVJMmTZSUlKRVq1ZJKvtQKGOMpk+frvDwcDVs2FATJ05UYWFhqfoq8tlL0nPPPaf27durUaNGCggIUEREhIYOHapvvvmm3M91y5YtuvvuuxUeHi6n06lmzZpp4MCB+sc//lHudgBQKQYA6rgZM2YYSSYqKsrs2bPHSDKSzNq1a60+WVlZVnuDBg1MXFycady4sdX24YcfGmOMefHFF40k4+fnZ2655Rbzi1/8wgQGBpqePXta+7rnnnuMJONwOEyrVq3MddddZySZ6667zhw4cMAYY8yiRYusfZfo2bWgwScAAApbSURBVLOnkWRGjhxptc2cOdNIMi1atDDGGDN48GCrrpCQEJOYmGgSExNNZmam+eGHH0xMTIz12C233GL8/f2NJPPII4+U+x6VPHfJ85Tl8poDAwNNixYtTMuWLc2aNWusOgMCAozD4TA33XSTueGGG0xWVpb561//am0XGRlpWrRoYd3/wx/+YO3/8n2X1P/b3/7WfPjhh9Zjbdq0Ma1atTJBQUGlak1JSTENGjQw+fn5xhhjRo4cWeo9nj59uvX5fffdd1b7fffd5/a5hoaGWs93/vx5Y4wxQ4YMsT7Hjh07mpiYGONwOMzMmTONMcasWbPGer6srCxjjDEvvfSS1dasWTMTERFhGjRoUKXP3hhj+vXrZxo0aGBat25tEhISjJ+fn5FkYmJirNd95dgqKioy4eHh1vvfvn1706RJEyPJrFmzptxxAQCVQbAAUKcVFxebuLg4I8lMmTLFGGNMhw4djCQzatQoq9/lweLBBx80xcXFJi8vz8THxxtJJjEx0RhjzKOPPlrqC2BeXp7Ztm2bMcaYL774wtrPCy+8YIwx5vjx41a4ePjhh40xVQ8WxvzrC/PlYcYYY1JTU60vjzk5OcYYY1auXGmFnEOHDl31fSp57rJuZ86cKVXz448/bm178eJFq05JZsGCBdZjRUVFpnnz5tZ7+NNPP5ni4mLrS3pQUJA5d+6cMeZfweKmm24yP/74o7Xv5557zkgySUlJ1n5/+ukns3HjRrfX0KZNGzN06NBS71PJc7dt29ZIMsHBweall16y+q1du9bqt27dOmOMMSdPnjRBQUFGkvmv//ovY4wxCQkJRpJZtGiRte3JkyetsFhWsGjWrJmRZLp3724uXLhgzp07Z26++eYqf/Z79+41hYWF1v2MjAxrX3/7299KfU7GGHP69OlSdZXs6/vvvzcA4CkcCgWgTlu7dq2ysrIkSSNGjHD73/fee0/nzp0rtc2wYcPkcDh03XXXqX///pJkHcbUv39/ORwOpaenKzo6WnfddZf+8Ic/WOsQtm/fbu1n+PDhkqRmzZqpR48ekqQdO3Z442VKkrZt2yZJ+v777xURESGHw2EdFmSM0datW392H4GBgdYC5JKbv3/p5XhTpkyx/r58DUVQUJD+4z/+w7p/+vRpHTt2TJI0dOhQOZ1OORwODRs2TNKlw8b27dvntu9Ro0ZZ76efn5/69OmjwMBArV27Vk2aNFH37t01depUBQcHW9scPnxY+/fvL/MwKEnaunWr9Tw33nij+vbtaz1W8r5JUs+ePeVwOBQdHa38/HxJstZNDBgwQJI0ZswY/eIXv1D//v319ttvKzo6usznzM3N1YkTJyRJAwcOlL+/v4KDg9WvX78y+1fEsWPHdNdddyk0NFT16tVTr169rMdOnjxZ5jbh4eHq0qWLJKlVq1Zq166dHnzwQe3atUuNGzeuci0AcCUWbwOo0y5ftF1yFqWSRcVnz57Ve++9p5EjR7ptU95pOvv06aOdO3fq3Xff1Z49e7Rr1y6tXbtWixcv1uHDhyu8n7KU9C+pT5JcLleFtzf/f81ISEiI2rRpU+rxy7+IX03JGoufExUVVWZ7kyZNSi2cLlHR9+PKfSckJGjfvn1asmSJdu3apT179mjjxo164403dODAAbVo0UIrV66Uv7//Vb+0FxcXKzMzU/fee6++/vpr/frXv9bOnTtVr149t7U2iYmJV63n2WefVbdu3fT5559r7969Wr9+vVatWqW1a9da6yyu5vLXfvnzXfl4eZ/9t99+q8GDB6uwsFAhISHq1KmTLl68qN27d5fa9kp///vftWTJEm3cuFH79+/Xu+++q2XLlunUqVN64oknyq0dACqKGQsAddbZs2f1/vvvW/ddLpdcLpfOnj1rtV0ePEosXbpUxhidO3fO+sKYkJAgSfrqq68UERGhZ599Vv/zP/9jfQn//vvvdfDgQd12223Wft555x1J0okTJ0otHi9LRESEJOnQoUOSLv2aX9YX1pKAcOVsy+233y5J8vf317Jly6zF1xkZGRo/fryGDBly1eeurKuFhCvbIyIi1Lx5c0nS+++/r4KCAhljtGzZMkmXZjjatm1b7j4OHTokh8OhGTNmaMWKFdq/f7+Cg4OVn59vzQB9+OGH6tGjx1XPYOVwONS5c2frTGB79uzRf//3f0v61/smSSkpKdb79uWXXyo1NVVjxoyRJG3cuFE9e/bUSy+9pC+++MJa1L9+/foynzM0NFTNmjWTdGnx+sWLF5Wfn69PP/20VN+KfPa7du2yFn5//vnn2r59u5588skyn/tyxhht2rRJo0aN0ltvvaUtW7ZYYfpqtQNAlfjwMCwA8Kq33nrLOrb8q6++cnvs5ZdfttYeZGVllbl4u2SBqySzYsUKY4wx06ZNMw6Hw8TExJiOHTtaC6mDg4OttQiXL95u3bq1CQkJqdDi7ddee81q69y5s4mNjTX16tUrdZx9yQJySSYhIcEkJiaa8+fPm5ycHHPDDTdYC6BvvfVW07JlSxMQEGB+7p/7kmP8AwMDrQXhJbdly5ZdteYSZa0HKFHZxduXr2Ewxpg33njDSDJNmzY1HTp0MNHR0dZC6/3795ucnBxTr1498+KLL7ptV9bi7fPnz1uf66233mq19+nTx+p78803mzZt2liLrEsWOHfr1s0EBgaaG2+80XTs2NFag9G1a1djTNlrLObPn++2eDsyMtI4nc4qffYHDhywFmuHhoaahIQEtxMMlLxvV35OFy5cMNKlBf1t2rQxCQkJ1r6feuqpq4wIAKg8ZiwA1Fklp2WNj49Xu3bt3B4bMmSIHA6HdU2Ly7322mtq06aNzp49q+joaL344ovWsft33nmn7r33XhUXF2vv3r0qLi7WL3/5S3366adq2LChpEu/TqekpCguLk6HDh2Sv7+/Bg8erM2bN6tVq1ZXrfeRRx7RpEmT1LhxYx0+fFi9evXS7373u1L9Ro8erV/96lcKCwvT3r17tXXrVhUVFalJkybasmWLHnnkEYWHh2vfvn3Kz89Xjx499MILL1ToPSssLNTWrVvdbqdOnarQtlfz
m9/8RitXrlTXrl2Vl5en7OxstW/fXm+++aamTZv2s9t36NBBQ4YMUWBgoPbv369z587pjjvu0LvvvqvWrVvr448/VnFxsQYNGvSz+woKCtKkSZMkXZq1KJkVWLlypWbMmKH4+Hh9++23ys7OVuvWrTV9+nRrtuqBBx7QbbfdptzcXH399ddq2LChhg0bpqVLl171+SZOnKjk5GRdf/31crlc6t+/f5mfaUU++1atWumtt95SXFycCgsL1bhx43Kfu4Sfn5/GjRunuLg4fffddzp8+LBiY2P1+OOPa8aMGT+7PQBUlMOYq5zIHQCAWmDQoEE6duyYV66mDQCoOBZvAwBqtW7dumncuHG+LgMArnnMWAAAAACwjTUWAAAAAGwjWAAAAACwjWABAAAAwDaCBQAAAADbCBYAAAAAbCNYAAAAALCNYAEAAADANoIFAAAAANsIFgAAAABs+38V40mqLxAsuAAAAABJRU5ErkJggg==" + }, + "metadata": {} + } + ], + "execution_count": 8, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-06T21:51:18.824Z", + "iopub.execute_input": "2020-11-06T21:51:18.830Z", + "iopub.status.idle": "2020-11-06T21:51:19.679Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "So, `NExOS` does a good job!" + ], + "metadata": {} + } + ], + "metadata": { + "language_info": { + "file_extension": ".jl", + "name": "julia", + "mimetype": "application/julia", + "version": "1.5.0" + }, + "kernelspec": { + "name": "julia-1.5", + "display_name": "Julia 1.5.0", + "language": "julia" + }, + "nteract": { + "version": "0.26.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} \ No newline at end of file diff --git a/tutorials/Matrix_completion_problem_NEXOS.jmd b/tutorials/Matrix_completion_problem_NEXOS.jmd new file mode 100644 index 0000000..e63d430 --- /dev/null +++ b/tutorials/Matrix_completion_problem_NEXOS.jmd @@ -0,0 +1,99 @@ + ### Matrix Completion Problem + +A matrix completion problem can be formulated as the following optimization problem: + +$$ +\begin{array}{ll} +\textrm{minimize} & \sum_{(i,j)\in\Omega}(X_{ij}-Z_{ij})^{2}\\ +\textrm{subject to} & \mathop{{\bf rank}}(X)\leq r\\ + & \|X\|_{2}\leq M\\ + & X\in\mathbf{R}^{m\times n}, +\end{array} +$$ + +where $Z\in\mathbf{R}^{m\times n}$ is the matrix whose entries $Z_{ij}$ are observable for $(i,j)\in\Omega$. Based on these observed entries, our goal is to construct a matrix $X\in\mathbf{R}^{m\times n}$ that has rank $r$. + +First, of course, we load our packages. + +```julia +using Random, LinearAlgebra, NExOS + +Random.seed!(1234) +``` + +Construct a random $$m\times n$$ matrix matrix of rank $n$. + +```julia +m,n,k = 40,40,2 +Zfull = randn(m,k)*randn(k,n) # ground truth data +M = opnorm(Zfull,2) # this is the bound on our constraint ||X||_2 ≦ M +``` + +Suppose that we only observe a fraction of entries in Zfull. Let us find the indices of all the elements that are available. 
+ +```julia +n_obs = 600 +Zobs = fill(NaN,(m,n)) +obs = randperm(m*n)[1:n_obs] +Zobs[obs] .= Zfull[obs] # partially observed matrix +``` + +Plot the ground-truth, full dataset and our partial observations + +```julia +using PyPlot +``` + +Plot the ground-truth, full dataset and our partial observations + +```julia +figure(figsize=(7,14)) +subplot(1,2,1) +imshow(Zfull,cmap=ColorMap("Blues"),interpolation="None") +xticks([]),yticks([]),title("True Data",fontweight="bold") + +subplot(1,2,2) +imshow(Zobs,cmap=ColorMap("Blues"),interpolation="None") +xticks([]),yticks([]),title("Observed Data",fontweight="bold") +show() +PyPlot.display_figs() +``` + +Let us create our problem now: + +```julia +M = opnorm(Zfull,2) +f = SquaredLossMatrixCompletion(Zobs, iterative = true) +r = rank(Zfull) +Z0 = zeros(size(Zobs)) +C = RankSet(M, r) +setting = NExOS.Setting(μ_max = 5, μ_min = 1e-8, μ_mult_fact = 0.5, n_iter_min = 1000, n_iter_max = 1000, verbose = true, freq = 1000, tol = 1e-4, γ_updt_rule = :safe) +problem = NExOS.Problem(f, C, setting.β, Z0) +``` + +Time to solve our problem + +```julia +state_final = NExOS.solve!(problem, setting) + +# +Z_estimated = state_final.x +``` + +Finally, we plot a simple histogram to see how much of the matrix has been recovered. + +```julia +figure(figsize=(8,3)) +PyPlot.hist(vec(Zfull - Z_estimated ),100) +xlim([-0.5,0.5]),xlabel("Absolute Errors/Residuals",fontweight="bold"),tight_layout() +show() +PyPlot.display_figs() +``` +So, `NExOS` does a good job! + +```julia +using Weave +cd("C:\\Users\\shuvo\\Desktop") # directory that contains the .jmd file +tangle("Matrix_completion_problem_NEXOS.jmd", informat = "markdown") # convert the .jmd file into a .jl file that will contain the code +convert_doc("Matrix_completion_problem_NEXOS.jmd", "Matrix_completion_problem_NEXOS.ipynb") + ``` diff --git a/tutorials/sparse_regression_using_NExOS.ipynb b/tutorials/sparse_regression_using_NExOS.ipynb new file mode 100644 index 0000000..94bb82b --- /dev/null +++ b/tutorials/sparse_regression_using_NExOS.ipynb @@ -0,0 +1,260 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "source": [ + "### Solving Sparse Regression using `NExOS.jl`\n", + "\n", + "The sparse regression problem (also known as regressor selection problem) is concerned with approximating a vector $b\\in\\mathbf{R}^{m}$ with a linear combination of at most $k$ columns of a matrix $A\\in\\mathbf{R}^{m\\times d}$ with bounded coefficients. The problem can be written as the following optimization\n", + "problem\n", + "$$\n", + "\\begin{equation}\n", + "\\begin{array}{ll}\n", + "\\textrm{minimize} & \\|Ax-b\\|_{2}^{2}+\\frac{\\beta}{2}\\|x\\|^{2}\\\\\n", + "\\textrm{subject to} & \\mathbf{card}(x)\\leq k\\\\\n", + " & \\|x\\|_{\\infty}\\leq M,\n", + "\\end{array}\n", + "\\end{equation}\n", + "$$\n", + "where $x\\in\\mathbf{R}^{d}$ is the decision variable, and $A\\in\\mathbf{R}^{m\\times d},b\\in\\mathbf{R}^{m},$ and $M>0$ are problem data.\n", + "\n", + "First, load the packages." + ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "using Random, NExOS, ProximalOperators" + ], + "outputs": [], + "execution_count": 1, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-08T18:09:31.752Z", + "iopub.execute_input": "2020-11-08T18:09:32.311Z", + "iopub.status.idle": "2020-11-08T18:09:43.141Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Let us generate some random data for this problem." 
+ ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "m = 25\n", + "n = 50\n", + "A = randn(m,n)\n", + "A = randn(m,n)\n", + "b = randn(m)\n", + "M = 100\n", + "k = convert(Int64, round(m/3))\n", + "beta = 10^-10" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 2, + "data": { + "text/plain": "1.0000000000000006e-10" + }, + "metadata": {} + } + ], + "execution_count": 2, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-08T18:09:53.303Z", + "iopub.execute_input": "2020-11-08T18:09:53.742Z", + "iopub.status.idle": "2020-11-08T18:09:54.811Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Create the problem instance in `NExOS`." + ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "C = SparseSet(M, k) # Create the set\n", + "f = LeastSquares(A, b, iterative = true) # Create the function\n", + "setting = NExOS.Setting(μ_max = 2, μ_min = 1e-8, μ_mult_fact = 0.85, verbose = false, freq = 250, γ_updt_rule = :adaptive, β = beta) # setting\n", + "z0 = zeros(n) # create an initial point\n", + "problem = NExOS.Problem(f, C, setting.β, z0) # problem instance" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 3, + "data": { + "text/plain": "Problem{ProximalOperators.LeastSquaresIterative{1,Float64,Float64,Array{Float64,2},Array{Float64,1},ProximalOperators.AAc},SparseSet{Int64,Int64},Float64,Array{Float64,1}}(description : Least squares penalty\ndomain : n/a\nexpression : n/a\nparameters : n/a, SparseSet{Int64,Int64}(100, 8), 1.0000000000000006e-10, [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0 … 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])" + }, + "metadata": {} + } + ], + "execution_count": 3, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-08T18:10:09.043Z", + "iopub.execute_input": "2020-11-08T18:10:09.060Z", + "iopub.status.idle": "2020-11-08T18:10:10.880Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Time to solve the problem." 
+ ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "state_final = NExOS.solve!(problem, setting)" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 4, + "data": { + "text/plain": "State{Array{Float64,1},Int64,Float64}([2.3604068371270658e-9, -1.8614022509121833e-8, 1.4283929504262508e-8, -0.16944938189544959, 1.607411136746223e-8, 1.9905400634770604e-8, 0.29495341217575866, 0.20254672745957286, 3.219912492554042e-8, 2.295088435599488e-8 … -0.17321307976429576, -4.257769720060745e-8, 0.3490200698603441, -4.066531833939707e-9, -1.8881561945230612e-9, 1.6860244710903703e-8, -6.390798535020245e-9, -7.4146058898080014e-9, 2.3471740858500734e-8, 2.9259207732582412e-8], [2.503142602875358e-9, -1.9739343790018626e-8, 1.5147520155775593e-8, -0.1694493825596602, 1.7045900328179816e-8, 2.1108862617962955e-8, 0.2949534115875845, 0.2025467262194669, 3.414578090974902e-8, 2.4338410440704434e-8 … -0.17321307916293596, -4.5151862513441285e-8, 0.3490200698298937, -4.312348720089836e-9, -2.0023296492550167e-9, 1.7879582823859328e-8, -6.777157090410619e-9, -7.86286864718721e-9, 2.4890840888034824e-8, 3.1028160171478726e-8], [-2.347730063129355e-8, 1.8513760949107962e-7, -1.420704045266314e-7, -0.16944938189544959, -1.5987550720778272e-7, -1.9798255216803828e-7, 0.29495341217575866, 0.20254672745957286, -3.202573022126649e-7, -2.2827280880911574e-7 … -0.17321307976429576, 4.2348468931166974e-7, 0.349020069860344, 4.04459959795016e-8, 1.8780106042461643e-8, -1.6769473435006287e-7, 6.35637466708413e-8, 7.374677090721157e-8, -2.3345419886594505e-7, -2.910168024607876e-7], 2, 2.5741653128338348e-9, 4.5151862513441285e-8, 9.385625688121253e-9, 9.68794389337658e-8)" + }, + "metadata": {} + } + ], + "execution_count": 4, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-08T18:10:15.625Z", + "iopub.execute_input": "2020-11-08T18:10:15.635Z", + "iopub.status.idle": "2020-11-08T18:10:19.545Z" + } + } + }, + { + "cell_type": "markdown", + "source": [ + "Let us take a look at the quality of the solution." 
+ ], + "metadata": {} + }, + { + "cell_type": "code", + "source": [ + "log10(state_final.fxd_pnt_gap) <= -4 # if the fixed point gap is less than 10^-4 (to determin if the algorithm has converged)" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 5, + "data": { + "text/plain": "true" + }, + "metadata": {} + } + ], + "execution_count": 5, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-08T18:10:27.985Z", + "iopub.execute_input": "2020-11-08T18:10:27.993Z", + "iopub.status.idle": "2020-11-08T18:10:28.213Z" + } + } + }, + { + "cell_type": "code", + "source": [ + "log10(state_final.fsblt_gap) <= -4 # this is to test if the found solution by NExOS is locally optimal" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 6, + "data": { + "text/plain": "true" + }, + "metadata": {} + } + ], + "execution_count": 6, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-08T18:10:30.403Z", + "iopub.execute_input": "2020-11-08T18:10:30.411Z", + "iopub.status.idle": "2020-11-08T18:10:30.424Z" + } + } + }, + { + "cell_type": "code", + "source": [ + "f(state_final.x) # this gives the objective value of the solution found by NExOS" + ], + "outputs": [ + { + "output_type": "execute_result", + "execution_count": 7, + "data": { + "text/plain": "2.6773713078392" + }, + "metadata": {} + } + ], + "execution_count": 7, + "metadata": { + "execution": { + "iopub.status.busy": "2020-11-08T18:10:32.985Z", + "iopub.execute_input": "2020-11-08T18:10:32.993Z", + "iopub.status.idle": "2020-11-08T18:10:33.014Z" + } + } + }, + { + "cell_type": "code", + "source": [], + "outputs": [], + "execution_count": null, + "metadata": { + "collapsed": true, + "jupyter": { + "source_hidden": false, + "outputs_hidden": false + }, + "nteract": { + "transient": { + "deleting": false + } + } + } + } + ], + "metadata": { + "language_info": { + "file_extension": ".jl", + "name": "julia", + "mimetype": "application/julia", + "version": "1.5.0" + }, + "kernelspec": { + "name": "julia-1.5", + "display_name": "Julia 1.5.0", + "language": "julia" + }, + "nteract": { + "version": "0.26.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} \ No newline at end of file diff --git a/tutorials/sparse_regression_using_NExOS.jmd b/tutorials/sparse_regression_using_NExOS.jmd new file mode 100644 index 0000000..d851309 --- /dev/null +++ b/tutorials/sparse_regression_using_NExOS.jmd @@ -0,0 +1,64 @@ + ### Solving Sparse Regression using `NExOS.jl` + +The sparse regression problem (also known as regressor selection problem) is concerned with approximating a vector $b\in\mathbf{R}^{m}$ with a linear combination of at most $k$ columns of a matrix $A\in\mathbf{R}^{m\times d}$ with bounded coefficients. The problem can be written as the following optimization +problem +$$ +\begin{equation} +\begin{array}{ll} +\textrm{minimize} & \|Ax-b\|_{2}^{2}+\frac{\beta}{2}\|x\|^{2}\\ +\textrm{subject to} & \mathbf{card}(x)\leq k\\ + & \|x\|_{\infty}\leq M, +\end{array} +\end{equation} +$$ +where $x\in\mathbf{R}^{d}$ is the decision variable, and $A\in\mathbf{R}^{m\times d},b\in\mathbf{R}^{m},$ and $M>0$ are problem data. + +First, load the packages. + +```julia +using Random, NExOS, ProximalOperators +``` + +Let us generate some random data for this problem. + + +```julia +m = 25 +n = 50 +A = randn(m,n) +A = randn(m,n) +b = randn(m) +M = 100 +k = convert(Int64, round(m/3)) +beta = 10^-10 +``` + +Create the problem instance in `NExOS`. 
+
+```julia
+C = SparseSet(M, k) # create the constraint set: card(x) ≤ k, ‖x‖_∞ ≤ M
+f = LeastSquares(A, b, iterative = true) # create the least-squares cost function
+setting = NExOS.Setting(μ_max = 2, μ_min = 1e-8, μ_mult_fact = 0.85, verbose = true, freq = 250, γ_updt_rule = :adaptive, β = beta) # algorithm settings
+z0 = zeros(n) # create an initial point
+problem = NExOS.Problem(f, C, setting.β, z0) # problem instance
+```
+
+Time to solve the problem.
+
+```julia
+state_final = NExOS.solve!(problem, setting)
+```
+
+Let us take a look at the quality of the solution.
+
+```julia
+log10(state_final.fxd_pnt_gap) <= -4 # the algorithm is considered converged if the fixed point gap is at most 10^-4
+```
+
+```julia
+log10(state_final.fsblt_gap) <= -4 # tests whether the solution found by NExOS is locally optimal (feasibility gap at most 10^-4)
+```
+
+```julia
+f(state_final.x) # objective value at the solution found by NExOS
+```
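+
+As an extra sanity check (not part of the solver output above), we can verify directly that the recovered point satisfies both constraints of the sparse regression problem: it has at most $k$ nonzero entries and its entries are bounded by $M$ in absolute value. The snippet below is a minimal sketch using only the variables `k`, `M`, and `state_final.x` defined earlier; the tolerance `tol` for deciding which entries count as numerically zero is an assumption, not something prescribed by `NExOS`.
+
+```julia
+using LinearAlgebra
+
+x_sol = state_final.x                  # solution returned by NExOS
+tol = 1e-6                             # entries below this magnitude are treated as zero (an assumption)
+nnz_x = count(v -> abs(v) > tol, x_sol) # number of numerically nonzero entries
+@show nnz_x <= k                       # cardinality constraint: card(x) ≤ k
+@show norm(x_sol, Inf) <= M            # bound constraint: ‖x‖_∞ ≤ M
+```
+
+If both checks return `true`, the point returned by `NExOS` is feasible for the original sparse regression problem up to the chosen tolerance.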