
Reviews

Textbook with an Indian Flavour

Econometrics: Theoretical Foundations and Empirical Perspectives

by D N Nachane; Oxford University Press, New Delhi, 2006; pp 868, Rs 395.

ASHOK K NAG

Textbooks come in different types and flavours. Some define and lay the foundation of a subject; Samuelson’s Economics is a foremost example. Some are curriculum-oriented and are highly useful if written by competent authors. Some are expository and capture the essential developments of a subject up to a point in time. Dilip Nachane, a past president of the Indian Econometric Society and a distinguished practitioner of the subject, has authored a textbook of international standard, largely for Indian graduate students, researchers and teachers. Although there is no dearth of good econometrics textbooks, Nachane’s book has a typically Indian flavour, in the sense that he takes pains to illustrate various techniques with Indian data and follows a very systematic and constructive approach to proofs of theorems and solutions of problems. The author has attempted to make the book almost self-contained: it covers all the mathematical and statistical results that are prerequisites for a rigorous exposition of econometric techniques and tools. One is reminded of C R Rao’s textbook on linear statistical inference, which became an almost indispensable reference for all students of statistics in the early 1970s. If one were to look for a result on matrix algebra or a theorem on probability limits, one only needed to consult Rao’s book. It is hoped that, at least for Indian students of econometrics, Nachane’s book will serve a similar purpose.

The 868-page book is divided into five parts. The first two parts, comprising nine chapters, cover the mathematical and statistical preliminaries. Part three has four chapters on the workhorse of econometrics, viz, linear regression models. The next two parts, comprising nine chapters, cover time series econometrics, which is obviously the author’s favourite subject. Many of the latest developments in this area are dealt with in detail; in fact, a little more than half of the book is devoted to this branch of econometrics. There are separate chapters on new developments such as ‘Causality and Exogeneity’, ‘Unit Roots and Fractional Differencing’, ‘Cointegration’ and ‘Multivariate Time Series Modelling’. It is also very helpful to have a full chapter devoted to ‘Univariate Spectral Analysis’.

The Classical Viewpoint

Although time series or dynamic econometrics has been the most fruitful area in which econometricians have made fundamental contributions over the last few decades, a number of interesting and highly useful developments have taken place in other areas of statistical inference and data analysis that are generally given a go-by in most textbooks on econometrics. Even a subject like stochastic processes, which encompasses time series analysis and has found many important applications in finance, is mentioned only in passing. Yet it is in finance that practitioners use the theories and tools of stochastic calculus (Ito’s calculus, for example) as part of their daily routine, and the subject probably deserves more attention. Similarly, techniques like Monte Carlo simulation, nonparametric regression, generalised additive models, and newer exploratory data analysis techniques like the data cloud, data depth, etc, need to be introduced. Doubtless, no textbook can cover all subjects, and an author has to choose as per his or her predilection. But a perusal of the standard textbooks shows that most of the tools and techniques are explicated with reference to policy-oriented problems (except perhaps forecasting, which finds extensive use in business), and that is why time series techniques like cointegration and the nature of non-stationarity have received so much attention. It is high time that academicians looked beyond researchers in government, universities and policy-making bodies for the readership of their textbooks and addressed the research needs of practitioners in business.

The author states at the outset that he intends to write a textbook without professing a specific point of view (preface, p xiii). But while discussing various approaches to statistical inference in the ‘Introduction’ chapter, he makes his preference clear: “(this) book is exclusively devoted to the classical viewpoint”. It is difficult to agree with him, at the beginning of the 21st century, that the Bayesian method is yet to make any “significant headway as an alternative dynamic modelling methodology”. Standard time series tools like the AIC/SBC have a clear Bayesian interpretation and, given today’s ready availability of cheap computing power, methods like Markov chain Monte Carlo (MCMC) are not really difficult to implement; software packages incorporating these methods are also becoming available. The Bayesian Analysis of Time Series (BATS) package is one example. In the area of operational risk, for example, application software implementing Bayesian methodology is already available. Without going into philosophical debates, I believe it should be possible for a neutral author to give a balanced view of alternative approaches. The fact is that policy researchers are reluctant to make their “prior” explicit and find it convenient to take shelter under the so-called “objectivity” of the classical school. Indeed, for policy-oriented research, the preferred approach should be the Bayesian one, whereas the classical approach should be extremely useful in analysing many other problems.
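To make the point about cheap computing concrete, here is a minimal sketch (my illustration, not the book’s and not the BATS package) of a random-walk Metropolis sampler for the coefficient of an AR(1) model; the simulated data, flat prior and tuning constants are assumptions chosen purely for exposition:

```python
# Minimal sketch: Bayesian estimation of an AR(1) coefficient via
# random-walk Metropolis. All settings here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Simulate y_t = phi * y_{t-1} + e_t with phi assumed to be 0.7
phi_true, n = 0.7, 500
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.normal()

def log_posterior(phi):
    """Gaussian likelihood (unit error variance) plus a flat prior on (-1, 1)."""
    if abs(phi) >= 1:
        return -np.inf
    resid = y[1:] - phi * y[:-1]
    return -0.5 * np.sum(resid ** 2)

draws, phi = [], 0.0
for _ in range(5000):
    prop = phi + 0.05 * rng.normal()          # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(phi):
        phi = prop                            # accept the proposal
    draws.append(phi)

print("posterior mean of phi:", np.mean(draws[1000:]))  # after burn-in
```

A few dozen lines and a fraction of a second of computing time suffice, which is precisely why the “too hard to implement” objection no longer carries much weight.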

The greatest value of the book lies in its detailed exposition of various techniques through examples. The inclusion of the data used in all these examples is also very helpful, as readers can use any software package to verify the reported numbers. However, these data could have been supplied on an accompanying CD to reduce the book’s bulk. These examples also set one thinking about whether these techniques are all tales “full of sound and fury”, signifying nothing very substantial.

Let me illustrate my point with an interesting example from the book.

In the chapter on ‘Cointegration’, Johansen’s maximum likelihood method of testing for the presence of a cointegrating relationship among a set of time series is discussed with a 10-year dataset consisting of monthly data on the wholesale price index (WPI), money supply (M3) and the index of industrial production (IIP). The presence of a single cointegrating vector is reported, signifying the existence of a causal relationship among these variables. The author then proceeds to test two hypotheses: one that M3 has no long-term impact on prices, and the other that M3 has no long-term effect on output. The corresponding test statistics indicate non-rejection of the null in both cases. What does this mean? That for the period under study, money supply did not have any impact on either output or inflation. In that event, one should ideally subscribe to the notion of the endogeneity of money supply in the Indian context. What role is then left for the monetary authority? I doubt whether any policymaker will hang up his boots when confronted with such results.
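Readers wishing to replicate this kind of exercise can do so readily; the following is a minimal sketch using the Johansen trace test as implemented in Python’s statsmodels. The data here are synthetic (the book’s WPI/M3/IIP series are not reproduced), constructed so that exactly one cointegrating vector exists:

```python
# Minimal sketch: Johansen's trace test on synthetic data with one
# cointegrating relation. The series are stand-ins, not the book's data.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(1)
n = 120                                    # e.g. ten years of monthly data

x1 = np.cumsum(rng.normal(size=n))         # random walk (I(1))
x2 = np.cumsum(rng.normal(size=n))         # independent random walk
x3 = 0.8 * x1 + rng.normal(scale=0.5, size=n)  # cointegrated with x1
data = np.column_stack([x1, x2, x3])

res = coint_johansen(data, det_order=0, k_ar_diff=1)
for r, (stat, cvals) in enumerate(zip(res.lr1, res.cvt)):
    print(f"H0: rank <= {r}: trace = {stat:.2f}, 95% cv = {cvals[1]:.2f}")
```

The trace statistic should reject rank 0 but not rank 1, mirroring the single cointegrating vector the book reports for its macroeconomic triple.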

The final chapter of the book gives a very succinct summary of the various approaches to the practice of econometrics. Whether the methodological debates that rage between votaries of the various approaches are sterile “hair-splitting” or not is a matter of opinion; the question that remains open is the utility of econometrics in settling theoretical debates in economics. Consider the notion of the ‘data generating process’ (DGP) popularised by the LSE approach. The autoregressive distributed lag (ARDL) specification of a DGP does not tell us how many lags one has to go back in time for the model to hold good. Many a time, as any practitioner would know, estimated model parameters change considerably depending on the number of lags included in the model. What sanctity does the notion of a DGP then have when its specification is not invariant over a sufficiently long period? Similarly, when one tries to fit an ARMA model to a series that has been identified as stationary, it is a common experience to observe parameter instability with the inclusion of new data points. In this connection, I am reminded of a comment by the late J Roy, considered one of the finest teachers of statistics that the Indian Statistical Institute (ISI), Kolkata has ever had, to the effect that estimation is more fundamental to statistics than hypothesis testing. If econometrics is nothing but the application of statistics to economic data, then we need to understand the status of estimation in the practice of econometrics. We may recall here what Larry Summers once remarked: that there is no deep parameter to estimate in economics. One has, therefore, to remain a reluctant sceptic about the author’s fervent wish that econometrics will be able to overcome “the huge credibility gap that currently exists between economic theory, empirical analysis of data, and policy prescriptions”.
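The parameter instability is easy to demonstrate for oneself. Below is a minimal sketch (again my illustration, not the book’s) that refits an ARMA(1,1) model on an expanding sample using statsmodels and prints how the estimates drift; with real macroeconomic series the effect is typically starker than with the well-behaved simulated data used here:

```python
# Minimal sketch: refit an ARMA(1,1) on expanding samples and watch the
# estimated coefficients move as new data points are included.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
e = rng.normal(size=400)
y = np.zeros(400)
for t in range(1, 400):                    # true model: phi=0.6, theta=0.3
    y[t] = 0.6 * y[t - 1] + e[t] + 0.3 * e[t - 1]

for n in (150, 250, 400):                  # expanding estimation windows
    fit = ARIMA(y[:n], order=(1, 0, 1)).fit()
    print(f"n={n}: ar={fit.arparams[0]:.3f}, ma={fit.maparams[0]:.3f}")
```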

The above comments in no way diminish the utility of the book as a reference textbook. I have, however, one comment on the author’s treatment of some mathematical topics. It is clear that the author is very comfortable with highly complex mathematics, but most students of econometrics may not need the level of detail given in the book. For example, the author has devoted a full chapter to ‘Measure Theory and Lebesgue Integration’. These mathematical preliminaries could probably be given a shorter treatment to pave the way for the inclusion of topics of more direct relevance to the subject.

EPW

Email: ashok.nag@gmail.com

