Re: st: Regression analysis with a minimum function on the RHS
From: "Sebastian van Baal" <[email protected]>
To: <[email protected]>
Subject: Re: st: Regression analysis with a minimum function on the RHS
Date: Sat, 6 Mar 2010 00:18:25 +0100

Thank you for your suggestions! 
 
> On Fri, Mar 5, 2010 at 2:14 PM, Austin Nichols <[email protected]> wrote:
> This seems likely to be problematic no matter what you do--typically
> the objective function should be differentiable in the parameters in
> these kinds of problems.  What is the theory that drives this
> specification?  Is there an alternative parameterization that is
> differentiable?
I admit it is an unusual hypothesis. My model is based on psychological and
microeconomic theory -- a mixture that creates all sorts of problems but is
also very interesting (to me). An alternative parameterization could be the
following: my original problem
y = {b0} + min({b1}*x1 , {b2}*x2) 
is formally equivalent to
y = {b0} + 0.5*[{b1}*x1 + {b2}*x2 - abs({b1}*x1 - {b2}*x2)]. 
Do you think the second parameterization is better suited for estimation
with -nl-?
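
For concreteness, this is how I would pass the two parameterizations to -nl-
(a rough, untested sketch; the variable names y, x1, x2 and the inline
starting values are just placeholders):

* original parameterization with min()
nl (y = {b0=0} + min({b1=1}*x1, {b2=1}*x2))

* equivalent parameterization with abs()
nl (y = {b0=0} + 0.5*({b1=1}*x1 + {b2=1}*x2 - abs({b1=1}*x1 - {b2=1}*x2)))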
> Steve:
> How about this approach?
> 1. run -sureg- to fit the regressions separately on x1 and x2.  Apply
> -constraint- first to get equal intercepts.
> 2. Use b0 + b1*x1 where b1*x1 < b2*x2; otherwise use b2*x2.
This is a good approach that I hadn't thought of. The results look
promising, but I will have to consult the literature before deciding on it.
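
In case it helps to be explicit, this is roughly how I read the two steps
(untested; the equation names m1/m2, the constraint number, and the
generated variable are my own placeholders, and I am assuming -sureg-
accepts the same dependent variable in two named equations):

* step 1: two equations with a common intercept
constraint 1 [m1]_cons = [m2]_cons
sureg (m1: y x1) (m2: y x2), constraints(1)

* step 2: fitted values use whichever slope term is smaller
gen double yhat = [m1]_b[_cons] + min([m1]_b[x1]*x1, [m2]_b[x2]*x2)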
Thanks again 
Sebastian