Monte Carlo simulation of a linear regression model with a lagged dependent variable
I have a linear regression model with a lagged dependent variable: y_t = beta_0 + beta_1 * y_{t-1} + u_t. The initial starting point is y_0 = 2, and I know the true coefficients are beta_0 = 2 and beta_1 = 1. How can I perform a Monte Carlo simulation that estimates the bias of the OLS coefficients?
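Here is a minimal sketch of one way to set this up, assuming i.i.d. standard normal errors u_t, a sample size of T = 100, and 10,000 replications (none of these are specified in the question, so adjust them as needed): simulate the series recursively from y_0, re-estimate the coefficients by OLS in each replication, and average the estimation errors across replications.

% Monte Carlo study of OLS bias in y_t = beta0 + beta1*y_{t-1} + u_t
rng(1);                       % for reproducibility
beta0 = 2;  beta1 = 1;        % true coefficients (from the question)
y0    = 2;                    % initial value (from the question)
T     = 100;                  % sample size per replication (assumed)
R     = 10000;                % number of Monte Carlo replications (assumed)

betaHat = zeros(R, 2);        % store [beta0hat, beta1hat] for each replication
for r = 1:R
    u = randn(T, 1);          % i.i.d. N(0,1) errors (assumed)
    y = zeros(T+1, 1);
    y(1) = y0;
    for t = 2:T+1
        y(t) = beta0 + beta1*y(t-1) + u(t-1);   % generate the series recursively
    end
    % OLS regression of y_t on a constant and y_{t-1}
    Y = y(2:end);
    X = [ones(T, 1), y(1:end-1)];
    betaHat(r, :) = (X \ Y)';
end

bias = mean(betaHat) - [beta0, beta1];          % average estimation error
fprintf('Estimated bias: beta0hat %.4f, beta1hat %.4f\n', bias(1), bias(2))

Note that with beta_1 = 1 the process has a unit root (with drift), so the usual downward small-sample bias of the OLS estimator of beta_1 should show up clearly in the output.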