Solving general optimization problems
-------------------------------------

You can use SVM-light to solve general optimization problems of the form:

   min    0.5 w*w + C sum_i C_i \xi_i
   s.t.   x_i * w >= rhs_i - \xi_i

Use the option "-z o". This allows specifying a training set where the examples are the inequality constraints. For example, to specify the problem

   min    0.5 w*w + 10 (1000 \xi_1 + 1 \xi_2 + 1 \xi_3 + 1 \xi_4)
   s.t.    1 w_1         >= 0 - \xi_1
          -2 w_1         >= 1 - \xi_2
           2 w_3         >= 2 - \xi_3
           2 w_2 + 1 w_3 >= 3 - \xi_4

you can use the training set

   0 cost:1000 1:1
   1 1:-2
   2 3:2
   3 2:2 3:1

and run

   svm_learn -c 10 -z o train.dat model

The format is just like the normal SVM-light format, with each line corresponding to one inequality. However, the first element of each line is the right-hand side of the inequality; the remainder of the line specifies the left-hand side. The optional cost: parameter lets you specify a factor by which the value of that slack variable is weighted in the objective. The general regularization parameter (10 in the example) is specified with the option -c on the command line.

To classify new inequalities, you can use svm_classify in the normal way.
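The mapping from constraints to training-file lines can be sketched with a small helper. This is not part of SVM-light; it is a hypothetical Python snippet that only formats lines in the style described above (right-hand side first, an optional cost: factor, then the sparse left-hand-side coefficients as index:value pairs):

```python
# Hypothetical helper (not part of SVM-light) that encodes one
# inequality  sum_j a_j * w_j >= rhs - xi  as a training-file line
# "rhs [cost:FACTOR] j1:a1 j2:a2 ...".

def encode_constraint(rhs, coeffs, cost=None):
    """coeffs: dict mapping 1-based feature index -> coefficient."""
    parts = [str(rhs)]
    if cost is not None:
        parts.append("cost:%g" % cost)       # optional slack weight
    for j in sorted(coeffs):                 # indices must be ascending
        parts.append("%d:%g" % (j, coeffs[j]))
    return " ".join(parts)

# The four constraints from the example above:
lines = [
    encode_constraint(0, {1: 1}, cost=1000),  #  1 w_1         >= 0 - xi_1
    encode_constraint(1, {1: -2}),            # -2 w_1         >= 1 - xi_2
    encode_constraint(2, {3: 2}),             #  2 w_3         >= 2 - xi_3
    encode_constraint(3, {2: 2, 3: 1}),       #  2 w_2 + 1 w_3 >= 3 - xi_4
]
print("\n".join(lines))
```

Writing the printed lines to train.dat reproduces the training set shown above.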