Time Series
Math 419/592, Winter 2009
Prof. Andrew Ross, Eastern Michigan University
Overview of Stochastic Models

But first, a word from our sponsor
- Take Math 560 (Optimization) this fall!
- Sign up soon or it will disappear

Outline
- Look at the data!
- Common Models
- Multivariate Data
- Cycles/Seasonality
- Filters

Look at the data! (or else!)
- Atmospheric CO2
  - Years: 1958 to now; vertical scale 300 to 400-ish
- Ancient sunspot data

Our Basic Procedure
1. Look at the data
2. Quantify any pattern you see
3. Remove the pattern
4. Look at the residuals
5. Repeat at step 2 until no patterns left

Our basic procedure, version 2.0
- Look at the data
- Suck the life out of it
- Spend hours poring over the noise
- What should noise look like?

One of these things is not like the others

Stationarity
- The upper-right-corner plot is stationary.
- Mean doesn't change in time
  - no Trend
  - no Seasons (known frequency)
  - no Cycles (unknown frequency)
- Variance doesn't change in time
- Correlations don't change in time
- Up to here: weakly stationary
- Joint distributions don't change in time
  - That makes it strongly stationary

Our Basic Notation
- Time is "t", not "n"
  - even though it's discrete
- State (value) is Y, not X
  - to avoid confusion with the x-axis, which is time
- Value at time t is Yt, not Y(t)
  - because time is discrete
- Of course, other books do other things.

Detrending: deterministic trend
- Fit a plain linear regression, then subtract it out:
  - Fit Yt = m*t + b
  - New data is Zt = Yt - m*t - b
- Or use a quadratic fit, exponential fit, etc.

Detrending: stochastic trend
- Differencing
- For a linear trend, new data is Zt = Yt - Yt-1
- To remove a quadratic trend, do it again:
  - Wt = Zt - Zt-1 = Yt - 2*Yt-1 + Yt-2
- Like taking derivatives
- What's the equivalent if you think the trend is exponential, not linear?
- Hard to decide: regression or differencing?

Removing Cycles/Seasons
- Will get to it later.
- For the next few slides, assume no cycles/seasons.

A brief big-picture moment
- How do you compare two quantities? Multiply them!
- If they're both positive, you'll get a big, positive answer
- If they're both big and negative, you also get a big, positive answer
- If one is positive and one is negative, you get a negative answer
- If one is big ...

E.g.
- River flow data over many decades
- Traffic on computer networks

How to calculate ACF
- R, S-Plus, SAS, SPSS, Matlab, Scilab will do it for you
- Excel: download PopTools (free!)
  - http://www.cse.csiro.au/poptools/
- Excel, etc.: do it yourself:
  1. First find the average and std. dev. of the data
  2. Next, find the AutoCoVariance Function (ACVF)
  3. Then divide by the variance of the data to get the ACF
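The do-it-yourself ACF recipe might look like the following sketch (NumPy assumed; the series, seed, and lag count are invented for the illustration):

```python
import numpy as np

def acf(y, max_lag):
    """Sample ACF: the ACVF at each lag divided by the lag-0 ACVF (the variance).
    Each sum is divided by N (not N-h), so estimates stay in [-1, 1],
    and Y-bar is the mean of the whole series."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    dev = y - y.mean()                      # subtract the overall mean once
    acvf0 = np.dot(dev, dev) / n            # lag-0 autocovariance = variance
    return np.array([np.dot(dev[: n - h], dev[h:]) / n / acvf0
                     for h in range(max_lag + 1)])

rng = np.random.default_rng(2)
white = rng.normal(size=2000)               # white noise: i.i.d., mean zero
r = acf(white, 5)
print(r[0])                                 # lag 0 is always exactly 1.0
```

For white noise, all the other lags should hover near zero (roughly within 2/sqrt(N) of it).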
ACVF at lag h
- gamma-hat(h) = (1/N) * sum of (Yt - Ybar)(Yt+h - Ybar) for t = 1 to N-h
- Y-bar is the mean of the whole data set
  - Not just the mean of the N-h data points used at lag h
- Left side (old way): divide by N-h; can produce correlations > 1
- Right side (new way): divide by N, as above
- The difference is "end effects"
  - Pg. 30 of Peña, Tiao, Tsay
  - (if it makes a difference, you're up to no good?)

Common Models
- White Noise
- AR
- MA
- ARMA
- ARIMA
- SARIMA
- ARMAX
- Kalman Filter
- Exponential Smoothing, trend, seasons

White Noise
- Sequence of i.i.d. variables et
- Mean = zero; finite std. dev., often unknown
- Often, but not always, Gaussian
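Stepping back to the detrending slides: both routes (fit-and-subtract, and differencing) can be sketched in a few lines. This is a minimal illustration on synthetic data; NumPy, the seed, and the trend coefficients are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(100, dtype=float)
y = 2.0 + 0.5 * t + rng.normal(size=100)    # linear trend plus noise

# Route 1 (deterministic trend): fit Yt = m*t + b, keep Zt = Yt - m*t - b
m, b = np.polyfit(t, y, deg=1)
z_reg = y - (m * t + b)

# Route 2 (stochastic trend): difference, Zt = Yt - Yt-1 (one point shorter)
z_diff = np.diff(y)
# Differencing twice removes a quadratic trend: Wt = Yt - 2*Yt-1 + Yt-2
w = np.diff(y, n=2)

# For an exponential trend, the usual trick is to take logs first,
# then detrend the logged series.
print(len(y), len(z_diff), len(w))   # 100 99 98
```

Note the trade-off hinted at on the slide: the regression residuals average to zero by construction, while the first differences average to roughly the slope of the removed trend.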
AR: AutoRegressive
- Order 1: Yt = a*Yt-1 + et
  - E.g. New = (90% of old) + random fluctuation
- Order 2: Yt = a1*Yt-1 + a2*Yt-2 + et
- Order p is denoted AR(p)
  - p = 1, 2 common; p > 2 rare
- AR(p) is like a pth-order ODE
- AR(1) is not stationary if |a| >= 1
- E[Yt] = 0; can generalize

Things to do with AR
- Find the appropriate order
- Estimate the coefficients
  - via the Yule-Walker equations
- Estimate the std. dev. of the white noise
- If the estimated |a| > 0.98, try differencing.

MA: Moving Average
- Order 1: Yt = b0*et + b1*et-1
- Order q: MA(q)
- In real data, much less common than AR
  - But still important in the theory of filters
- Stationary regardless of the b values
- E[Yt] = 0; can generalize

ACF of an MA process
- Drops to zero after lag = q
  - That's a good way to determine what q should be!

ACF of an AR process?
- Never completely dies off; not useful for finding the order p.
- AR(1) has exponential decay in its ACF
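The cutoff-versus-decay contrast can be seen by simulation; a sketch comparing an MA(1) with an AR(1) (the 0.8 coefficients, seed, and series length are arbitrary choices for the illustration):

```python
import numpy as np

def acf_at(y, h):
    """Sample autocorrelation at lag h (divide-by-N convention)."""
    dev = y - y.mean()
    return float(np.dot(dev[:len(y) - h], dev[h:]) / np.dot(dev, dev))

rng = np.random.default_rng(5)
e = rng.normal(size=20001)

ma1 = e[1:] + 0.8 * e[:-1]           # MA(1): Yt = et + 0.8*et-1
ar1 = np.zeros(20000)                # AR(1): Yt = 0.8*Yt-1 + et
for t in range(1, 20000):
    ar1[t] = 0.8 * ar1[t - 1] + e[t]

# MA(1): theoretical ACF is 0.8/(1 + 0.8^2), about 0.49, at lag 1,
# then essentially 0 from lag 2 on.
# AR(1): theoretical ACF decays like 0.8^h and never cuts off.
for h in (1, 2, 3):
    print(h, round(acf_at(ma1, h), 2), round(acf_at(ar1, h), 2))
```

The MA(1) sample ACF drops to noise level after lag 1, while the AR(1) sample ACF keeps shrinking geometrically, which is exactly why the ACF identifies q but not p.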
- Instead, use the Partial ACF (PACF), which dies off after lag = p
- The PACF of an MA process never dies off.

ARMA
- ARMA(p,q) combines AR and MA
- Often p, q <= 1 or 2

ARIMA
- AR-Integrated-MA
- ARIMA(p,d,q)
  - d = order of differencing before applying ARMA(p,q)
- For nonstationary data with a stochastic trend
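The "I" step is just differencing before the ARMA fit. A toy sketch on a synthetic ARIMA(1,1,0)-style series (the coefficient 0.6, seed, and length are invented; for AR(1), the Yule-Walker estimate reduces to the lag-1 sample autocorrelation):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3000
inc = np.zeros(n)
for t in range(1, n):                       # increments follow an AR(1), a = 0.6
    inc[t] = 0.6 * inc[t - 1] + rng.normal()
y = np.cumsum(inc)                          # integrate: level has a stochastic trend

z = np.diff(y)                              # d = 1 differencing undoes the integration
dev = z - z.mean()
a_hat = float(np.dot(dev[:-1], dev[1:]) / np.dot(dev, dev))   # Yule-Walker for AR(1)
print(round(a_hat, 1))                      # near the true 0.6
```

So an ARMA(1,0) fit on the differenced series z is the ARIMA(1,1,0) fit on the nonstationary level y.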
SARIMA, ARMAX
- Seasonal ARIMA(p,d,q)-and-(P,D,Q)S
- Often S = 12 (monthly), 4 (quarterly), or 52 (weekly)
- Or S = 7 for daily data inside a week
- ARMAX = ARMA with outside explanatory variables (halfway to multivariate time series)

State Space Model, Kalman Filter
- Underlying process that we don't see
- We get noisy observations of it
- Like a Hidden Markov Model (HMM), but the state is continuous rather than discrete.
- AR/MA, etc. can be written in this form too.
- State evolution (vector): St = F * St-1 + ht
- Observations (scalar): Yt = H * St + et

ARCH, GARCH(p,q)
- (Generalized) AutoRegressive Conditional Heteroskedastic (heteroscedastic?)
- Like ARMA, but the variance changes randomly in time too.
- Used for many financial models

Exponential Smoothing
- More a method than a model.
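The standard simple-exponential-smoothing recursion, st = alpha*yt + (1 - alpha)*st-1, is easy to sketch (a minimal version; the data values are invented, and initializing at the first observation is just one common convention):

```python
import numpy as np

def exponential_smoothing(y, alpha=0.3):
    """Simple exponential smoothing; the one-step-ahead forecast
    is the last smoothed value."""
    s = [y[0]]                       # initialize at the first observation
    for value in y[1:]:
        s.append(alpha * value + (1 - alpha) * s[-1])
    return np.array(s)

y = np.array([10.0, 12.0, 11.0, 13.0, 12.5])
s = exponential_smoothing(y, alpha=0.5)
print(s.tolist())    # [10.0, 11.0, 11.0, 12.0, 12.25]
```

With alpha = 0.5 each smoothed point is the average of the new observation and the previous smoothed value; a larger alpha tracks the data more closely, a smaller alpha smooths harder.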