@c Copyright (C) 1996, 1997 John W. Eaton
@c This is part of the Octave manual.
@c For copying conditions, see the file gpl.texi.
@node Optimization, Statistics, Differential Equations, Top
@chapter Optimization
@menu
* Quadratic Programming::
* Nonlinear Programming::
* Linear Least Squares::
@end menu
@c @cindex linear programming
@cindex quadratic programming
@cindex nonlinear programming
@cindex optimization
@cindex LP
@cindex QP
@cindex NLP
@node Quadratic Programming, Nonlinear Programming, Optimization, Optimization
@section Quadratic Programming
@node Nonlinear Programming, Linear Least Squares, Quadratic Programming, Optimization
@section Nonlinear Programming
@node Linear Least Squares, , Nonlinear Programming, Optimization
@section Linear Least Squares
@deftypefn {Function File} {[@var{beta}, @var{v}, @var{r}] =} gls (@var{y}, @var{x}, @var{o})
Generalized least squares estimation for the multivariate model
@iftex
@tex
$y = x b + e$
with $\bar{e} = 0$ and cov(vec($e$)) = $(s^2)o$,
@end tex
@end iftex
@ifinfo
@code{@var{y} = @var{x} * @var{b} + @var{e}} with @code{mean (@var{e}) =
0} and @code{cov (vec (@var{e})) = (@var{s}^2)*@var{o}},
@end ifinfo
where
@iftex
@tex
$y$ is a $t \times p$ matrix, $x$ is a $t \times k$ matrix, $b$ is a $k
\times p$ matrix, $e$ is a $t \times p$ matrix, and $o$ is a $tp \times
tp$ matrix.
@end tex
@end iftex
@ifinfo
@var{y} is a @var{t} by @var{p} matrix, @var{x} is a @var{t} by @var{k}
matrix, @var{b} is a @var{k} by @var{p} matrix, @var{e} is a @var{t} by
@var{p} matrix, and @var{o} is a @var{t}@var{p} by @var{t}@var{p}
matrix.
@end ifinfo
@noindent
Each row of @var{y} and @var{x} is an observation and each column a
variable.
The return values @var{beta}, @var{v}, and @var{r} are defined as
follows.
@table @var
@item beta
The GLS estimator for @var{b}.
@item v
The GLS estimator for @code{@var{s}^2}.
@item r
The matrix of GLS residuals, @code{@var{r} = @var{y} - @var{x} *
@var{beta}}.
@end table
@end deftypefn
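
The following sketch illustrates the calling convention of @code{gls};
the data and the choice of covariance matrix @var{o} are hypothetical
and serve only as an example (with @var{o} equal to the identity
matrix, the result coincides with ordinary least squares).

@example
@group
x = [ones(5,1), (1:5)'];        # design matrix: intercept and trend
y = [2.1; 3.9; 6.2; 8.1; 9.8];  # observed responses (hypothetical data)
o = eye (rows (y));             # assumed covariance structure (identity here)
[beta, v, r] = gls (y, x, o);
@end group
@end example
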
@deftypefn {Function File} {[@var{beta}, @var{sigma}, @var{r}] =} ols (@var{y}, @var{x})
Ordinary least squares estimation for the multivariate model
@iftex
@tex
$y = x b + e$
with
$\bar{e} = 0$, and cov(vec($e$)) = kron ($s, I$)
@end tex
@end iftex
@ifinfo
@code{@var{y} = @var{x}*@var{b} + @var{e}} with
@code{mean (@var{e}) = 0} and @code{cov (vec (@var{e})) = kron (@var{s},
@var{I})}.
@end ifinfo
where
@iftex
@tex
$y$ is a $t \times p$ matrix, $x$ is a $t \times k$ matrix,
$b$ is a $k \times p$ matrix, and $e$ is a $t \times p$ matrix.
@end tex
@end iftex
@ifinfo
@var{y} is a @var{t} by @var{p} matrix, @var{x} is a @var{t} by @var{k}
matrix, @var{b} is a @var{k} by @var{p} matrix, and @var{e} is a @var{t}
by @var{p} matrix.
@end ifinfo
Each row of @var{y} and @var{x} is an observation and each column a
variable.
The return values @var{beta}, @var{sigma}, and @var{r} are defined as
follows.
@table @var
@item beta
The OLS estimator for @var{b}, @code{@var{beta} = pinv (@var{x}) *
@var{y}}, where @code{pinv (@var{x})} denotes the pseudoinverse of
@var{x}.
@item sigma
The OLS estimator for the matrix @var{s},
@example
@group
@var{sigma} = (@var{y} - @var{x}*@var{beta})'
  * (@var{y} - @var{x}*@var{beta}) / (@var{t} - rank (@var{x}))
@end group
@end example
@item r
The matrix of OLS residuals, @code{@var{r} = @var{y} - @var{x} * @var{beta}}.
@end table
@end deftypefn
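
As a minimal, hypothetical example of the calling convention of
@code{ols} (the data below are invented for illustration only):

@example
@group
x = [ones(5,1), (1:5)'];         # design matrix: intercept and trend
y = [2.0; 4.1; 5.9; 8.2; 10.1];  # observed responses (hypothetical data)
[beta, sigma, r] = ols (y, x);   # beta equals pinv (x) * y
@end group
@end example
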