Statistics for the Future

Published on 6th August 2019

“As we interpret it, the fundamental aim of statistics is to give determinate and adequate knowledge of reality with the help of numbers and numerical analysis.” – Professor Mahalanobis

I would like to take this opportunity to review some of the recent challenges in the field of statistics and, in that context, to envision our future plans. The shared understanding that statistics is a 'key technology' for dealing with real-life problems is resonating across the world. Aptly, the 50th session of the United Nations Statistical Commission (UNSC), held in March 2019, had 'Better Data Better Lives' as its leitmotif.

The Reserve Bank's statistics and information management system has evolved over several decades in response to demands for national-level statistics of the highest quality and their dissemination as a 'public good'. Over the years, more and more information has been compiled and released in the public domain, including data that serve as inputs to policy-making.

With India's growing integration with the global economy and the rising sophistication and complexity of economic activity, information needs have exploded. This has challenged practitioners to exploit innovations in analytical tools so as to keep pace with fast-changing dynamics. For instance, in the aftermath of the global financial crisis, statistical systems across central banks have undergone a paradigm shift.

The focus is increasingly on monitoring risks in the financial sector, global linkages and sectoral accounts in terms of analysis of vulnerabilities, interconnectedness and spillovers. At the same time, information management and dissemination have become technologically more advanced, with an emphasis on higher frequency, greater granularity, better validation and integration into multi-purpose, structured data production processes. In the Reserve Bank too, we propose to leverage our new-age data warehouse to support a granular data access lab to facilitate research, and a sandbox environment for evaluating regulatory tools.

Let me now turn to other modern-day challenges confronting professional statisticians, within the Reserve Bank as well as outside it.

Big data is the new buzzword in the world of statistics, and it has already started changing the way the world views itself. Corporations are making large investments to predict the behaviour of consumers by exploiting advances in the field of data analytics. This information technology revolution has also created a problem of plenty, underscoring the need for rigorous processes of classification, aggregation and analysis. Given the sheer volume of information, extracting 'signals' and ignoring 'noise' is vital to the productive use of new-age technologies for analytical needs and policy decision support. Big data analytics is increasingly being employed to assess food inflation, to develop risk profiles and stress scenarios for the corporate sector, and to conduct sentiment analysis with artificial intelligence and machine learning techniques.
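To make the signal-versus-noise point concrete, here is a minimal sketch in Python, on entirely simulated data rather than any official series, in which a simple moving average recovers a slow-moving trend buried in noise. The series, window length and parameters are illustrative assumptions, not a description of any method used in the Reserve Bank.

```python
# Illustrative sketch (simulated data): extracting a slow-moving 'signal'
# from a 'noisy' series with a simple moving average.
import numpy as np

rng = np.random.default_rng(seed=42)

n = 120                                          # e.g., 120 monthly observations
trend = 0.05 * np.arange(n)                      # the underlying 'signal'
series = trend + rng.normal(scale=1.0, size=n)   # signal buried in noise

window = 12                                      # 12-month moving average
smoothed = np.convolve(series, np.ones(window) / window, mode="valid")

# Align the (trailing) moving average with the portion of signal it estimates.
aligned_trend = trend[window - 1:]
raw_err = np.mean(np.abs(series[window - 1:] - aligned_trend))
smooth_err = np.mean(np.abs(smoothed - aligned_trend))
print(f"mean abs deviation from signal: raw={raw_err:.2f}, smoothed={smooth_err:.2f}")
```

The smoothed series tracks the trend far more closely than the raw observations, which is the essence of filtering noise before drawing analytical conclusions.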

In recent times, there have been animated discussions on the precision of statistical methods. The doctrine, even tradition, of statistical significance in scientific research has come under a cloud, and the journal The American Statistician published a special edition on the issue earlier this year. Critics allege that significance tests are susceptible to manipulation to make desired results significant and undesired results non-significant. Further, some important results may be discarded at the conception stage itself just because they are highly unlikely. Similarly, the opportunity to cherry-pick variables is ever present; in other words, correlations can conveniently be extended to establish spurious causality. In this context, dos and don'ts have been cited: 'don't say statistically significant' is one of them. Yet, as the global financial crisis demonstrated, tail risks materialised and the world was not the same again. These lessons inform our modelling of corporate financial risks on an ongoing basis. As The American Statistician recommends, "Accept uncertainty. Be Thoughtful, Open, and Modest"; in short, "ATOM."
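The cherry-picking critique is easy to demonstrate. The following self-contained Python sketch, on purely simulated data, tests 100 unrelated variables against a pure-noise outcome; by chance alone, around five of them will appear 'statistically significant' at the conventional 5 per cent level.

```python
# Simulated illustration of cherry-picking: test enough unrelated variables
# and some will look 'significant' purely by chance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

n_obs, n_candidates = 50, 100
outcome = rng.normal(size=n_obs)                      # pure noise
candidates = rng.normal(size=(n_candidates, n_obs))   # 100 unrelated series

# p-value of the Pearson correlation between each candidate and the outcome
p_values = np.array([stats.pearsonr(x, outcome)[1] for x in candidates])
spurious = int((p_values < 0.05).sum())
print(f"{spurious} of {n_candidates} unrelated variables test 'significant' at p < 0.05")
# Roughly 5 false 'discoveries' are expected at the 5 per cent level.
```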

In my view, in an era marked by widespread use of the internet and social media, there is no substitute for rigorous statistical testing in establishing empirical regularities. In a deluge of data and results, what is vital is not just knowing which facts warrant importance, but also which are to be ignored or even strongly refuted. Deviations from stylised facts and common sense should be investigated, but backed by robust analysis and peer review of the data used and the methods employed before conclusions are drawn. The solution, then, may not be less statistics but more of it, used in the manner it is meant to be.
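One standard safeguard, offered here purely as a textbook illustration rather than a Reserve Bank procedure, is to correct for multiple comparisons before declaring significance. The sketch below implements Holm's step-down correction; the function name and example p-values are assumptions for illustration.

```python
# Textbook multiple-comparison safeguard: Holm's step-down correction.
import numpy as np

def holm_survivors(p_values, alpha=0.05):
    """Boolean mask of hypotheses surviving Holm's correction at
    family-wise error rate alpha."""
    p = np.asarray(p_values, dtype=float)
    m = len(p)
    order = np.argsort(p)                  # test the smallest p-values first
    survives = np.zeros(m, dtype=bool)
    for rank, idx in enumerate(order):
        if p[idx] <= alpha / (m - rank):   # threshold tightens with rank
            survives[idx] = True
        else:
            break                          # step-down: once one fails, stop
    return survives

# Example: five raw p-values, three of which pass 0.05 on their own.
p = [0.001, 0.02, 0.04, 0.30, 0.60]
print(holm_survivors(p))  # only the strongest result survives
```

Applied to a batch of marginal results like those in the earlier simulation, such a correction lets few or none of the spurious 'discoveries' through, which is precisely what using more statistics, properly, means.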

In most countries today, the profession of statistics is required to meet increasing analytical demands for decision support. The rising demands on the profession, as pointed out recently by the US Bureau of Labor Statistics, speak to the public's expectations of, and reliance on, the quality of statistics, statistical methods and the statistician. In the Reserve Bank, we will continue to refine the methodologies used for forecasting and assessing macroeconomic developments on an ongoing basis. Research and analytics using cutting-edge techniques will be pursued and, in particular, nowcasting of growth and inflation will be further strengthened.
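For readers unfamiliar with the term, nowcasting uses timely higher-frequency indicators to estimate a slower-moving aggregate before official data arrive. The sketch below is a deliberately simplified 'bridge equation' on simulated data; the indicator, coefficients and figures are illustrative assumptions and not the Reserve Bank's models.

```python
# Simplified 'bridge equation' nowcast on simulated data: regress quarterly
# GDP growth on a monthly indicator averaged to quarters, then apply the fit
# to the current quarter's partial monthly readings.
import numpy as np

rng = np.random.default_rng(seed=7)

quarters = 40
monthly = rng.normal(loc=5.0, scale=2.0, size=(quarters, 3))  # e.g., an activity indicator
gdp = 2.0 + 0.6 * monthly.mean(axis=1) + rng.normal(scale=0.5, size=quarters)

# Fit gdp_t = a + b * mean(monthly_t) on history.
x = monthly.mean(axis=1)
b, a = np.polyfit(x, gdp, deg=1)   # slope first, then intercept

# Nowcast the current quarter with only two of three months observed.
current_months = np.array([5.8, 6.1])
nowcast = a + b * current_months.mean()
print(f"GDP growth nowcast for the current quarter: {nowcast:.2f}%")
```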

By Mr Shaktikanta Das, Governor of the Reserve Bank of India.

