Time Series Analysis with Long Memory in View. Wiley Series in Probability and Statistics

  • ID: 4481381
  • Book
  • 300 Pages
  • John Wiley and Sons Ltd

Provides a simple exposition of the basic time series material, and insights into underlying technical aspects and methods of proof

Long memory time series are characterized by a strong dependence between distant events. This book introduces readers to the theory and foundations of univariate time series analysis, with a focus on long memory and fractional integration, which are embedded into the general framework. It presents the general theory of time series, including topics rarely treated in other time series texts, such as ergodicity, persistence versus memory, asymptotic properties of the periodogram, and Whittle estimation. Further chapters address functional central limit theory, parametric and semiparametric estimation of the long memory parameter, and locally optimal tests.
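The defining feature described above, dependence that decays only slowly across distant observations, can be made concrete with a short simulation. The sketch below, a minimal illustration not taken from the book, generates fractionally integrated noise (an ARFIMA(0, d, 0) process) via the standard MA(∞) expansion of (1 − L)^(−d), using the well-known coefficient recursion ψ_j = ψ_{j−1}(j − 1 + d)/j; the function names are the author's own choices:

```python
import numpy as np

def frac_weights(d, n):
    """First n MA(inf) coefficients psi_j of (1 - L)^(-d).

    psi_0 = 1 and psi_j = psi_{j-1} * (j - 1 + d) / j, which equals
    Gamma(j + d) / (Gamma(d) * Gamma(j + 1)).
    """
    psi = np.empty(n)
    psi[0] = 1.0
    for j in range(1, n):
        psi[j] = psi[j - 1] * (j - 1 + d) / j
    return psi

def simulate_fi(d, T, burn=500, seed=0):
    """Simulate fractionally integrated noise x_t = (1 - L)^(-d) eps_t
    by (truncated) moving-average filtering of Gaussian innovations."""
    rng = np.random.default_rng(seed)
    psi = frac_weights(d, T + burn)
    eps = rng.standard_normal(T + burn)
    x = np.convolve(eps, psi)[: T + burn]  # x_t = sum_{j<=t} psi_j eps_{t-j}
    return x[burn:]                        # drop burn-in to reduce truncation bias

def sample_acf(x, k):
    """Sample autocorrelation at lag k."""
    xc = x - x.mean()
    return float(np.dot(xc[:-k], xc[k:]) / np.dot(xc, xc))

# For 0 < d < 1/2 the autocorrelations decay hyperbolically (like k^(2d-1))
# rather than geometrically -- the hallmark of long memory.
x = simulate_fi(d=0.3, T=2000)
```

Varying `d` toward 1/2 makes the slow decay of `sample_acf(x, k)` across large lags increasingly visible, which is exactly the "strong dependence between distant events" the book formalizes.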

Intuitive and easy to read, Time Series Analysis with Long Memory in View offers chapters that cover: Stationary Processes; Moving Averages and Linear Processes; Frequency Domain Analysis; Differencing and Integration; Fractionally Integrated Processes; Sample Mean; Parametric Estimators; Semiparametric Estimators; and Testing. It also discusses further topics. This book:

  • Offers beginning-of-chapter examples as well as end-of-chapter technical arguments and proofs
  • Contains many new results on long memory processes that have not appeared in existing textbooks
  • Takes a basic mathematics (calculus) approach to the topic of time series analysis with long memory
  • Contains 25 illustrative figures as well as lists of notation and acronyms

Time Series Analysis with Long Memory in View is an ideal text for first-year PhD students, researchers, and practitioners in statistics, econometrics, and any application area that uses time series over a long period. It would also benefit advanced undergraduates in those areas who require a rigorous introduction to time series analysis.

Note: Product cover images may vary from those shown


List of Figures


Chapter 1 Introduction

1.1 Empirical Examples

1.2 Overview

Chapter 2 Stationary Processes

2.1 Stochastic Processes

2.2 Ergodicity

2.3 Memory and Persistence

2.4 Technical Appendix: Proofs

Proof of Proposition 2.7

Chapter 3 Moving Averages and Linear Processes

3.1 Infinite Series and Summability

3.2 Wold Decomposition and Invertibility

3.3 Persistence versus Memory

3.4 Autoregressive Moving Average Processes

3.5 Technical Appendix: Proofs

Proof of Lemma 3.3

Proof of Lemma 3.4

Proof of Proposition 3.7

Proof of Lemma 3.12

Proof of Lemma 3.13

Proof of Proposition 3.11

Chapter 4 Frequency Domain Analysis

4.1 Decomposition into Cycles

4.2 Complex Numbers and Transfer Functions

4.3 The Spectrum

4.4 Parametric Spectra

4.5 (Asymptotic) Properties of the Periodogram

4.6 Whittle Estimation

4.7 Technical Appendix: Proofs

Proof of Proposition 4.1

Proof of Proposition 4.3

Proof of Proposition 4.4

Proof of Proposition 4.9

Proof of Proposition 4.10

Proof of Corollary 4.12

Proof of Proposition 4.14

Proof of Lemma 4.17

Proof of Corollary 4.18

Chapter 5 Differencing and Integration

5.1 Integer Case

5.2 Approximating Sequences and Functions

5.3 Fractional Case

5.4 Technical Appendix: Proofs

Proof of Lemma 5.1

Proof of Lemma 5.2

Proof of Lemma 5.4

Chapter 6 Fractionally Integrated Processes

6.1 Definition and Properties

6.2 Examples and Discussion

6.3 Nonstationarity and Type I vs. II

6.4 Practical Issues

6.5 Frequency Domain Assumptions

6.6 Technical Appendix: Proofs

Proof of Proposition 6.3

Proof of Corollary 6.6

Proof of Lemma 6.7

Proof of Proposition 6.8

Proof of Proposition 6.13

Chapter 7 Sample Mean

7.1 Central Limit Theorem for I(0) Processes

7.2 Central Limit Theorem for I(d) Processes

7.3 Functional Central Limit Theory

7.4 Inference about the Mean

7.5 Sample Autocorrelation

7.6 Technical Appendix: Proofs

Proof of Proposition 7.4

Proof of Lemma 7.7

Proof of Proposition 7.8

Chapter 8 Parametric Estimators

8.1 Parametric Assumptions

8.2 Exact Maximum Likelihood Estimation

8.3 Conditional Sum of Squares

8.4 Parametric Whittle Estimation

8.5 Log-Periodogram Regression of FEXP Processes

8.6 Fractionally Integrated Noise

8.7 Technical Appendix: Proofs

Proof of Corollary 8.3

Proof of Proposition 8.9

Chapter 9 Semiparametric Estimators

9.1 Local Log-Periodogram Regression

9.2 Local Whittle Estimation

9.3 Finite Sample Approximation

9.4 Bias Approximation and Reduction

9.5 Bandwidth Selection

9.6 Global Estimators

9.7 Technical Appendix: Proofs

Proof of Lemma 9.3

Chapter 10 Testing

10.1 Hypotheses on Fractional Integration

10.2 Rescaled Range or Variance

10.3 The Score Test Principle

10.4 Lagrange Multiplier (LM) Test

10.5 LM Test in the Frequency Domain

10.6 Regression-Based LM Test

10.7 Technical Appendix: Proofs

Proof of Proposition 10.6

Derivations for Section 10.5

Chapter 11 Further Topics

11.1 Model Selection and Specification Testing

11.2 Spurious Long Memory

11.3 Forecasting

11.4 Cyclical and Seasonal Models

11.5 Long Memory in Volatility

11.6 Fractional Cointegration

11.7 R Packages

11.8 Neglected Topics





Uwe Hassler, PhD, is a full professor of statistics and econometric methods at Goethe University, Frankfurt. He is also an associate editor of Advances in Statistical Analysis. He received his PhD from FU Berlin in 1993 and is a recipient of the Opus magnum grant from VolkswagenStiftung.
