Solving general optimization problems
-------------------------------------

You can use SVM-light to solve general optimization problems of the form:

   min 0.5 w*w + C sum_i C_i \xi_i
   s.t. x_i * w > rhs_i - \xi_i

Use the option "-z o". This allows specifying a training set where the examples are the inequality constraints. For example, to specify the problem

   min 0.5 w*w + 10 (1000 \xi_1 + 1 \xi_2 + 1 \xi_3 + 1 \xi_4)
   s.t. 1 w_1 >= 0 - \xi_1
        -2 w_1 >= 1 - \xi_2
        2 w_3 >= 2 - \xi_3
        2 w_2 + 1 w_3 >= 3 - \xi_4

you can use the training set

   0 cost:1000 1:1
   1 1:-2
   2 3:2
   3 2:2 3:1

and run

   svm_learn -c 10 -z o train.dat model

The format is just like the normal SVM-light format, and each line corresponds to one inequality. However, the first element of each line is the right-hand side of the inequality rather than a target value, and the remainder of the line specifies the left-hand side. The optional parameter cost:<value> lets you specify a factor by which the corresponding slack variable is weighted in the objective. The general regularization parameter (10 in the example) is specified with the option -c <value> on the command line.
wolffd@0 | 29 |
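If you generate such constraint files programmatically, a small helper can avoid formatting mistakes (for instance, feature indices must appear in increasing order). The following is a minimal sketch in Python, not part of SVM-light; the helper name write_constraints is made up, and train.dat is the file used in the example above:

   # Illustrative helper (not part of SVM-light): writes inequalities of the
   # form  sum_j coeff_j * w_j >= rhs - xi  in the sparse "-z o" format.
   def write_constraints(path, constraints):
       """constraints: list of (rhs, {feature_index: coefficient}, cost).
       Feature indices are 1-based; cost may be None to omit the factor."""
       with open(path, "w") as f:
           for rhs, coeffs, cost in constraints:
               fields = [str(rhs)]
               if cost is not None:
                   fields.append("cost:%g" % cost)
               # SVM-light expects feature indices in increasing order.
               for idx in sorted(coeffs):
                   fields.append("%d:%g" % (idx, coeffs[idx]))
               f.write(" ".join(fields) + "\n")

   # Reproduces the example training set shown above.
   write_constraints("train.dat", [
       (0, {1: 1},       1000),  #  1 w_1         >= 0 - xi_1, slack cost 1000
       (1, {1: -2},      None),  # -2 w_1         >= 1 - xi_2
       (2, {3: 2},       None),  #  2 w_3         >= 2 - xi_3
       (3, {2: 2, 3: 1}, None),  #  2 w_2 + 1 w_3 >= 3 - xi_4
   ])
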
To evaluate new inequalities with the learned model, you can use svm_classify in the normal way.

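For example (new.dat and predictions are just illustrative file names), if new.dat contains additional lines in the same sparse format, a call along the lines of

   svm_classify new.dat model predictions

should write one predicted value per input line to the file predictions.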