
rflamary/nonconvex-optimization: Matlab/Octave toolbox for nonconvex optimization


Repository: rflamary/nonconvex-optimization

URL: https://github.com/rflamary/nonconvex-optimization

Language: MATLAB 100.0%

Introduction:

Non-convex optimization Toolbox

This Matlab toolbox provides a generic solver for proximal gradient descent in the convex and non-convex cases. It is a complete reimplementation of the GIST algorithm proposed in [1], with new regularization terms such as the lp pseudo-norm with p=1/2.

If you use this toolbox in your research, please cite the paper Non-convex regularization in remote sensing:

D. Tuia, R. Flamary and M. Barlaud, "Non-convex regularization in remote sensing",
IEEE Transactions on Geoscience and Remote Sensing, (to appear) 2016.

The code solves optimization problems of the form:

min_x f(x)+lambda g(x)
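Problems of this form are handled by proximal gradient descent, which alternates a gradient step on the smooth term f with the proximal operator of the regularizer g. A minimal sketch of the iteration is given below; the function names prox_g and grad_f are illustrative, not the toolbox API, and the actual GIST algorithm of [1] additionally selects the step size with a Barzilai-Borwein rule and a line search:

```matlab
% Generic proximal gradient descent sketch (illustrative, not the toolbox API).
% Solves min_x f(x) + lambda*g(x) given grad_f and the prox of g.
function x = prox_grad_descent(grad_f, prox_g, x0, lambda, t, niter)
    x = x0;
    for k = 1:niter
        % gradient step on the smooth data-fitting term f
        z = x - t * grad_f(x);
        % proximal step on the (possibly non-convex) regularizer g
        x = prox_g(z, t * lambda);
    end
end
```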

We provide solvers for the following data-fitting terms f(x):

  • Least squares (linear regression)
  • Linear SVM with quadratic Hinge loss
  • Linear logistic regression
  • Calibrated Hinge loss

The regularization terms g(x) that have been implemented include:

  • Lasso (l1)
  • Ridge (squared l2)
  • Log sum penalty (LSP) ([2],prox in [1])
  • lp regularization with p=1/2 (prox in [3])
  • Group lasso (l1-l2)
  • Minimax concave penalty (MCP)
  • Indicator function on convex (projection)
  • Indicator function on simplex (projection)
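Several of these proximal operators have simple closed forms. For instance, the group lasso (l1-l2) prox reduces to block soft-thresholding applied to each group of coefficients; a sketch for one group (the function name is illustrative):

```matlab
% Block soft-thresholding: proximal operator of lambda*||x||_2 on one group.
% The group-lasso prox applies this to each group of coefficients in turn.
function p = prox_group_l2(x, lambda)
    nx = norm(x, 2);
    if nx <= lambda
        p = zeros(size(x));        % the whole group is shrunk to zero
    else
        p = (1 - lambda / nx) * x; % shrink the group towards the origin
    end
end
```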

New regularization terms can be easily implemented as discussed in section 3.

Start using the toolbox

Installation

All the functions in the toolbox are given in the folder /utils.

The unmix folder contains code and data downloaded from the website of Jose M. Bioucas Dias.

In order to use the toolbox, we recommend executing the following command:

addpath(genpath('.'))

replacing '.' by the location of the toolbox folder on your machine if you are not working in its root folder.

Entry points

We recommend looking at the following files to see how to use the toolbox:

  • demo/demo_classif.m : contains an example of a 4-class linear classification problem and shows how to learn different classifiers.
  • demo/demo_unmix.m : shows an example of linear unmixing with a positivity constraint and non-convex regularization.
  • demo/visu_classif.m : reproduces the example figure in the paper.

Solving your own optimization problem

New regularization terms

All the regularization terms (and their proximal operators) are defined in the function utils/get_reg_prox.m.

If you want to add a regularization term (or a projection), you only need to add a case to the switch beginning at line 37 and define two functions:

  • g(x) : R^d->R, loss function for the regularization term
  • prox_g(x,lambda) : R^d->R^d, proximal operator of lambda*g(x)
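As a concrete illustration of this two-function contract, here is what the pair could look like for the Lasso (l1) term; the soft-thresholding formula is standard, but the function names and signatures below are illustrative and may differ from those in utils/get_reg_prox.m:

```matlab
% g(x): l1 loss for the Lasso regularizer
function l = loss_l1(x)
    l = sum(abs(x(:)));
end

% prox_g(x, lambda): proximal operator of lambda*||x||_1 (soft thresholding)
function p = prox_l1(x, lambda)
    p = sign(x) .* max(abs(x) - lambda, 0);
end
```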

For a simple example, look at the implementation of the Lasso loss (line 124) and its soft-thresholding proximal operator (line 128).

Note that, in order to limit the number of files, the loss and proximal operator functions are all implemented as subfunctions of the file utils/get_reg_prox.m.

Data fitting term

You can easily change the data-fitting term by providing new loss and gradient functions to the optimization function utils/gist_opt.m.

A good starting point is the least-squares implementation in utils/gist_least.m. Changing the data-fitting term only requires coding the loss function at line 63 and the corresponding gradient function at line 59.
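For reference, the least-squares loss and its gradient are as follows (a sketch of the math coded in gist_least.m, with illustrative names; X is the data matrix, y the target vector):

```matlab
% Least-squares data-fitting term f(w) = 0.5*||X*w - y||^2 and its gradient,
% as would be plugged into the solver (names are illustrative).
function [f, g] = least_squares_loss(w, X, y)
    r = X * w - y;       % residual
    f = 0.5 * (r' * r);  % loss value
    g = X' * r;          % gradient with respect to w
end
```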

Contact and contributors

Acknowledgements

We want to thank Jose M. Bioucas Dias for providing the unmixing dataset and functions on his website.

References

[1] Gong, P., Zhang, C., Lu, Z., Huang, J., & Ye, J. (2013, June). A General Iterative Shrinkage and Thresholding Algorithm for Non-convex Regularized Optimization Problems. In ICML (2) (pp. 37-45).

[2] Candes, E. J., Wakin, M. B., & Boyd, S. P. (2008). Enhancing sparsity by reweighted l1 minimization. Journal of Fourier Analysis and Applications, 14(5-6), 877-905.

[3] Xu, Z., Chang, X., Xu, F., & Zhang, H. (2012). L1/2 regularization: A thresholding representation theory and a fast solver. IEEE Transactions on Neural Networks and Learning Systems, 23(7), 1013-1027.

Copyright 2016



