$title  Interior Point QP versus NLP for Entropy-Based Matrix Balancing

*       Compare the efficiency of a direct nonlinear programming (NLP)
*       formulation with a sequential quadratic programming approach
*       for solving an entropy-based matrix balancing problem.

*	Thomas F. Rutherford
*	Ann Arbor, MI
*	September, 2006


*       Define the solvers to use in this test.  What the heck, why
*       not let the Danes duke it out?

option nlp=conopt;
option qcp=mosek;
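
*       CONOPT solves the direct NLP formulation; MOSEK's interior point
*       optimizer handles the QCP (quadratic programming) approximations.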


*       Define the test problem dimension here or on the command line:

$if not set dimen $set dimen 100
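
*       e.g. invoke GAMS with  --dimen=250  on the command line to run
*       the larger cases reported below.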

set     i   SAM row and column indices   /1*%dimen%/;

alias  (i,j);

parameter sam(i,j)  Base year social accounts (random);

*       Generate a random matrix in which roughly 10% of the cells are zero:

loop((i,j)$(uniform(0,1) > 0.1), sam(i,j) = uniform(0,1););

set             nz(i,j)                 Non-zero structure of the data;

parameter       es0(i,j)                Reference point for QP approximation;

nz(i,j) = yes$(sam(i,j) > 0);
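
*       Only cells which are non-zero in the base data enter the
*       objective and balance conditions.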

*       Balance the SAM by minimizing an entropy norm (or its QP approximation)

variable        obj             Objective -- entropy norm (or QP approximation);

positive
variable        es(i,j)         Estimated matrix;

equations       entobj          Defines the entropy norm
                qpobj           Defines the QP approximation of the entropy norm
                balance(i)      SAM balance condition;

qpobj..         obj =e= sum(nz(i,j), log(es0(i,j)/sam(i,j))*(es(i,j)-es0(i,j)) +
                                        1/2*es0(i,j)*sqr(es(i,j)/es0(i,j)-1));

entobj..        obj =e= sum(nz(i,j), es(i,j) * (log(es(i,j)/sam(i,j))-1));

balance(i)..    sum(nz(i,j), es(i,j)) =e= sum(nz(j,i), es(j,i));
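
*       The QP objective above is the second-order Taylor expansion of the
*       entropy objective around the reference point es0:  with
*       f(x) = x*(log(x/a)-1), we have f'(x) = log(x/a) and f''(x) = 1/x,
*       so that
*
*           f(x) =~ f(x0) + log(x0/a)*(x-x0) + 1/(2*x0)*sqr(x-x0)
*                 = f(x0) + log(x0/a)*(x-x0) + 1/2*x0*sqr(x/x0-1).
*
*       Dropping the constant f(x0) and substituting x=es(i,j), x0=es0(i,j)
*       and a=sam(i,j) gives the expression in qpobj.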


*       Start from the unbalanced data, and fix cells which are zero in
*       the base data so that the non-zero pattern is preserved:

es.l(i,j) = sam(i,j);
es.fx(i,j)$(sam(i,j) = 0) = 0;

model entropyqp /qpobj, balance/;
model entropy   /entobj, balance/;

parameter       reslog          Resource log
                qpadjust        Adjustment in the QP reference point
                dev             Current deviation /1/;

*       Impose a lower bound (this keeps log(es/sam) well defined in the
*       NLP and keeps the QP reference point strictly positive):

es.lo(i,j) = sam(i,j)/100;

set     qpiter  QP iterations /qp0*qp10/;
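
*       Sequential QP: starting from the base data, solve the QP
*       approximation at the current reference point es0, move the
*       reference point to the QP solution, and repeat until the
*       1-norm change in the solution falls below 1e-5: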

es0(i,j) = sam(i,j);
loop(qpiter$(dev>1e-5),
        solve entropyqp using QCP minimizing obj;
        reslog(qpiter) = entropyqp.resusd;
        qpadjust(qpiter) = sum((i,j), abs(es.l(i,j)-es0(i,j)));
        dev = qpadjust(qpiter);
        es0(i,j) = es.l(i,j);
);
display qpadjust;
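
*       Save the final QP solution for comparison with the NLP solution: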
es0(i,j) = es.l(i,j);

*       Solve the NLP formulation from the same starting point:

es.l(i,j) = sam(i,j);
solve entropy using NLP minimizing obj;
reslog("NLP") = entropy.resusd;
reslog("Deviation") = sum((i,j), abs(es.l(i,j)-es0(i,j)));
display reslog;
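
*       Illustrative cross-check (entval is simply a scratch parameter):
*       evaluate the entropy objective at the final QP solution (es0)
*       and at the NLP solution (es.l):

parameter entval  Entropy objective at the QP and NLP solutions;

entval("qp")  = sum(nz(i,j), es0(i,j)*(log(es0(i,j)/sam(i,j))-1));
entval("nlp") = sum(nz(i,j), es.l(i,j)*(log(es.l(i,j)/sam(i,j))-1));

display entval;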


$ontext

Results from my IBM X40 (1 GHz).  Entries are resource usage in seconds;
rows give the problem dimension, and the columns report the direct NLP
solve, the QP sequence in total, and the individual QP iterations:

             nlp          qp         qp0         qp1         qp2         qp3
100          5.9         1.2         0.3         0.3         0.3         0.4
250        124.2         7.6         1.8         1.9         1.9         1.9
500       1013.8        33.9         8.8         8.6         8.2         8.3

$offtext