Discussion:
[Scikit-learn-general] Scipy optimizer problem when adding Jacobian
Bao Thien
2014-07-08 12:44:40 UTC
Dear all,

I need to optimize a loss function and am currently using optimizers from
scipy.optimize.minimize.
More details (a minimal call sketch is given after this list):
+ parameters to optimize: X, of size about 50
+ initial parameters: X0
+ bounds: all parameters are in [0, 1]
+ loss function: L (defined)
+ Jacobian (gradient): J (defined)

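Roughly, the call looks like the sketch below; the loss L, gradient J and
starting point X0 here are only placeholders standing in for my real functions:

    import numpy as np
    from scipy.optimize import minimize

    # Placeholder loss and gradient; the real L and J are more involved.
    def L(x):
        return np.sum((x - 0.5) ** 2)

    def J(x):
        return 2.0 * (x - 0.5)

    X0 = np.full(50, 0.5)            # initial parameters
    bounds = [(0.0, 1.0)] * 50       # every parameter constrained to [0, 1]

    res = minimize(L, X0, jac=J, bounds=bounds, method='L-BFGS-B')
    print(res.message, res.fun)
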
I have tried with many optimizers: SLSQP, L-BFGS-B, CG, Newton-CG, COBYLA.

When I don't provide the Jacobian, all the optimizers run and exit
successfully, and the final loss values are more or less the same
(the initial loss is 21.18 and the final loss is ~16.87), but they take
some time to finish. To speed up the computation, I was advised to
provide the Jacobian.

However, when I provide the Jacobian, something strange happens (a gradient
sanity check is sketched after this list):

- with SLSQP: the optimizer finishes successfully, but the final loss is
~20.80 (not much less than the initial loss value of 21.18)

- with L-BFGS-B: ABNORMAL_TERMINATION_IN_LNSRCH
  Line search cannot locate an adequate point after 20 function
  and gradient evaluations. Previous x, f and g restored.
  Possible causes: 1 error in function or gradient evaluation;
                   2 rounding errors dominate computation.
  and it returns exactly the initial parameters: X0

- with CG/Newton-CG: warning: Desired error not necessarily achieved due
  to precision loss.
  It returns the same loss value as the initial one (21.18) and
  exactly the initial parameters: X0.

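Since the L-BFGS-B message mentions a possible error in the function or
gradient evaluation, a sanity check along the lines below (using
scipy.optimize.check_grad, with L and J again standing in for my actual loss
and gradient) should show whether the analytic Jacobian agrees with a
finite-difference approximation:

    import numpy as np
    from scipy.optimize import check_grad, approx_fprime

    # Compare the analytic gradient J with a finite-difference estimate of L
    # at a few random points inside the [0, 1] box.
    rng = np.random.RandomState(0)
    for _ in range(5):
        x = rng.uniform(0.0, 1.0, size=50)
        print("gradient error:", check_grad(L, J, x))   # ||J(x) - fd grad||_2

    # Element-wise comparison at one point, to see which components are off.
    x = rng.uniform(0.0, 1.0, size=50)
    print(np.max(np.abs(approx_fprime(x, L, 1e-8) - J(x))))
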
Has anyone faced this problem before? If you have any hint or clue on how to
overcome it, that would be very much appreciated.

Regards,

T.Bao
--
Nguyen Thien Bao

NeuroInformatics Laboratory (NILab), Fondazione Bruno Kessler (FBK),
Trento, Italy
Centro Interdipartimentale Mente e Cervello (CIMeC), Universita degli Studi
di Trento, Italy
Surgical Planning Laboratory (SPL), Department of Radiology, BWH, Harvard
Medical School, USA
Email: bao at bwh.harvard.edu or tbnguyen at fbk.eu or ntbaovn at gmail.com
Fax: +39.0461.283.091
Cellphone: +1. 857.265.6408 (USA)
+39.345.293.1006 (Italy)
+84.996.352.452 (VietNam)
Gael Varoquaux
2014-07-08 13:29:11 UTC
Hi,

I believe that this is a question for the scipy mailing list.

Gaël
--
Gael Varoquaux
Researcher, INRIA Parietal
Laboratoire de Neuro-Imagerie Assistee par Ordinateur
NeuroSpin/CEA Saclay, Bat 145, 91191 Gif-sur-Yvette France
Phone: ++ 33-1-69-08-79-68
http://gael-varoquaux.info http://twitter.com/GaelVaroquaux