From:    Christophe Kolodziejczyk <ck.statalist@gmail.com>
To:      statalist@hsphsun2.harvard.edu
Subject: Re: st: programming in stata - problems with optimize
Date:    Sun, 24 Jul 2011 11:37:14 +0200
As Matthew points out, your objective function seems to be ill-behaved. It looks like you are trying to solve the normal equations, but not quite. Moreover, the code of your objective function is not consistent with the formula you give at the beginning of your post.

Optimize helps you find the maximum or minimum of a function and is not originally designed for finding the roots of an equation. However, the methods used to optimize functions, like Newton-Raphson, are root-finding methods: they find the roots of the gradient of the objective function, and this is why Matthew proposes optimizing the square of your function. You can also think of the GMM method, where the problem is solved by minimizing a quadratic form of the moments. The problem with this approach is that you have to find the primitive of the equation you want to solve. Therefore, a way to use optimize as a root-finder is to specify the gradient, which in this case is equal to the equation you are trying to solve, and to use the d1 evaluator. The solution will be the set of parameters at which the gradient is zero.

Here is an example: I want to find mu such that Sum_i (x_i - mu) = 0, whose solution I know is the mean.

clear *
set obs 1000
drawnorm x
sum x

mata:
x = st_data(., "x")

void myf(todo, p, x, f, g, H)
{
    f = 0    // the objective function is irrelevant in this case
    if (todo == 1) {
        g = quadcolsum((x :- p))    // the equation I want to solve, implicitly set equal to zero
    }
}

S = optimize_init()
optimize_init_evaluator(S, &myf())
optimize_init_evaluatortype(S, "d1")
optimize_init_params(S, (1))
optimize_init_argument(S, 1, x)
p = optimize(S)
p
end

sum x    // should be equal to p

Now let's say I want to solve for x the equation x^2 - 2x + 1 = 0. Write then:

void myf2(todo, p, f, g, H)
{
    f = 0
    if (todo == 1) {
        g = p^2 - 2*p + 1
    }
}

S = optimize_init()
optimize_init_evaluator(S, &myf2())
optimize_init_evaluatortype(S, "d1")
optimize_init_params(S, (1))
p = optimize(S)
p

You will find that p = 1.

You might also have a look at Ben Jann's root finder, -mm_root()-, from his moremata package.

Christophe

2011/7/22 A.J van der Vlist <vlist001@hotmail.com>:
> Dear Statalisters,
>
> I encounter problems in optimizing function f:
>
> I would like to solve for mu:
>
> sum over i { Nx_i' * (mu - Xbeta_i) } = Sk
>
> Nx    = [n x 1] vector of 1/n's, in order to sum over i
> mu    = parameter
> Xbeta = [n x 1] vector of fitted values
> Sk    = scalar
>
> This is what I tried:
>
> scalar Sk1 = Sm1_SRA
>
> mata:
> X = st_data(., "Xbeta")
> Nx = st_data(., "Nxi")
>
> void mymodelb(todo, mu2, f, g, H)
> {
>     external Nx, X
>     hlp = (mu2 :- X)
>     f = (Nx' * normal(hlp)) - Sk1
> }
>
> S = optimize_init()
> optimize_init_evaluator(S, &mymodelb())
> optimize_init_params(S, 0)
> mu2 = optimize(S)
>
> Problems:
>
> 1) I get problems when running with Sk1 in f.
> 2) When I substitute Sk1 = 0.501 in f, the code gives a result - yet an unreasonably high one (+e11).
>
> Suggestions are very welcome.
>
> Arno

*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/
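[A minimal sketch of how the same d1 trick might be applied to Arno's equation Nx' * normal(mu - Xbeta) - Sk = 0. The variable names Xbeta and Nxi and the scalar Sk1 are taken from his post; reading the Stata scalar with st_numscalar() and passing everything through optimize_init_argument() rather than -external- is an assumption about the wiring, not a tested fix.]

mata:
Xb  = st_data(., "Xbeta")                    // fitted values, as in Arno's post
Nx  = st_data(., "Nxi")                      // vector of 1/n weights
Sk1 = st_numscalar("Sk1")                    // read the Stata scalar into Mata explicitly

void myf3(todo, mu, Xb, Nx, Sk1, f, g, H)
{
    f = 0                                    // dummy objective, as in the examples above
    if (todo == 1) {
        g = (Nx' * normal(mu :- Xb)) - Sk1   // the equation to be driven to zero
    }
}

S = optimize_init()
optimize_init_evaluator(S, &myf3())
optimize_init_evaluatortype(S, "d1")
optimize_init_params(S, 0)
optimize_init_argument(S, 1, Xb)
optimize_init_argument(S, 2, Nx)
optimize_init_argument(S, 3, Sk1)
mu = optimize(S)
mu
end

Keeping the equation in the gradient slot and leaving f = 0 mirrors the two examples above. Note that, assuming the entries of Nx are positive, a root only exists when Sk1 lies strictly between 0 and the sum of Nx, which may be part of why a fixed Sk1 = 0.501 produced an implausible answer.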