One. Basic commands for linear regression
regress y x1 x2    (the command regress can be abbreviated to reg)
Take the Nerlove data as an example (the data are attached below):
regress lntc lnq lnpf lnpk lnpl 
The upper part of the output is the analysis-of-variance table, showing the regression sum of squares, the residual sum of squares, the mean squares, and the F test of overall significance. The goodness of fit R2 and the adjusted R2 are reported to the right of that panel, and Root MSE is the standard error of the regression equation.
The lower part of the table reports the point estimates of the regression coefficients together with their standard errors, t statistics, and confidence intervals.
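As a small supplement (not part of the original example), the quantities shown in the table are also stored by Stata after regress and can be displayed directly, for example in a do-file:
di e(r2)       // goodness of fit R2
di e(r2_a)     // adjusted R2
di e(rmse)     // Root MSE, the standard error of the equation
di e(F)        // F statistic of overall significance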
If the regression should not include a constant term, add the option noconstant after the comma:
regress lntc lnq lnpf lnpk lnpl, noconstant
Two. Regression with constraints
If the regression should use only the observations that satisfy a certain condition, add if condition after the variable list.
If, for example, enterprises with q>=6000 are classified as large enterprises, the command is:
 regress lntc lnq lnpf lnpk lnpl if q>=6000 
Alternatively, use a dummy variable: define a new variable large that equals 1 for a large enterprise and 0 otherwise. The code is
g large = (q >= 6000)
regress lntc lnq lnpf lnpk lnpl if large
The output is identical to the result obtained with the if condition above, since if large selects the observations where large is nonzero (here, equal to 1).
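As an optional check (an addition to the original steps), one can confirm that the dummy and the if condition pick out the same observations; both counts should be identical:
count if large
count if q >= 6000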
To run the regression on the non-large enterprises instead, write:
 regress lntc lnq lnpf lnpk lnpl if large==0 
If the regression coefficients must satisfy certain linear restrictions, for example a1+a2+a3=1 or a1=2*a2, this can be done by defining constraints (a sketch of the proportionality case follows the examples below):
constraint def 1 lnpl + lnpk + lnpf = 1
cnsreg lntc lnq lnpf lnpk lnpl, c(1)
Here constraint def 1 defines the first constraint, cnsreg runs the constrained regression, and the option c(1) tells it to impose constraint 1.
If several restrictions must hold at the same time, define further constraints 2, 3, and so on. Taking the additional constraint lnq=1 as an example:
cons def 2 lnq=1
cnsreg lntc lnq lnpf lnpk lnpl, c(1-2)
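The proportionality restriction mentioned earlier (one coefficient equal to twice another) is defined in the same way. A minimal sketch, assuming for illustration that the coefficient on lnpk should equal twice the coefficient on lnpl:
constraint def 3 lnpk = 2*lnpl
cnsreg lntc lnq lnpf lnpk lnpl, c(3)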
Three. Prediction, testing, and related calculations
To compute the fitted values of the dependent variable and save them in a new variable yhat, use predict. Taking the unconstrained regression as an example:
regress lntc lnq lnpf lnpk lnpl
predict yhat
The fitted values are saved as a new variable in the dataset.
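For a quick look at the fitted values next to the actual dependent variable (an extra step, not in the original), list a few observations:
list lntc yhat in 1/5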
To further compute the residuals and save them in e1, use predict with the residual option (which can be abbreviated):
predict e1, residual
The residuals e1 are likewise stored as a new variable in the dataset.
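As a consistency check (an addition; the variable name e2 is only illustrative), the residuals can also be computed by hand from the fitted values and compared with e1; the two variables should have identical summary statistics:
g e2 = lntc - yhat
su e1 e2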
To calculate an expression involving an estimated coefficient, use display directly, for example to compute the square of the coefficient on lnq:
 di _b[lnq]^2 
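If a standard error for such a nonlinear function of the coefficients is also needed, Stata's nlcom command (delta method) can be used; this step goes beyond the original example:
nlcom _b[lnq]^2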
To test hypotheses about the coefficients, use test followed by the condition. Taking lnq=1 and the joint hypothesis lnq=1, lnpl+lnpk+lnpf=1 as examples:
te lnq=1
te (lnq=1) (lnpl+lnpk+lnpf=1)
 
Testing lnq=1 alone, the F test rejects the null hypothesis; the joint test of lnq=1 and lnpl+lnpk+lnpf=1 also rejects the null that both restrictions hold.
To test that two coefficients are simultaneously equal to 0, use test variable1 variable2. Taking the joint test that the coefficients on lnpl and lnpk are both 0 as an example:
 test lnpl lnpk 
Here the F test does not reject the null hypothesis that both coefficients are 0.
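A related hypothesis that is sometimes of interest (not part of the original example) is that two coefficients are equal to each other rather than both zero; test handles this as well:
test lnpl = lnpk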