Budget data for accountability
It isn’t easy to understand budgets in India today. A citizen who looks at a budget website out of curiosity will be bewildered by the number of documents and jargon. Should one look at the Annual Financial Statement, Expenditure Budget or Demands for Grants? What are the differences between Budget Estimates, Revised Estimates and Actuals; and how do supplementary budgets fit into this picture?
Even if the intrepid citizen overcomes these hurdles in comprehension, she faces many difficulties in interpreting the available data. Several programmes are broken up across many line items. Finding out how much the government spends on subjects such as ‘education’, ‘nutrition’, or ‘women’ is an exercise that demands a great deal of time and patience. Often, the data presented are misleading – for instance, expenditure on building roads under the Pradhan Mantri Gram Sadak Yojana is accounted for as revenue expenditure instead of capital expenditure.
But there exists a bigger problem. A key reason budgets are presented publicly is to enable ordinary citizens to judge the priorities and performance of the government. Indian history supplies many examples of how the budgets of the British Raj were used to challenge government policy. Gopal Krishna Gokhale created a sensation in 1902 when he challenged the Government’s budget, using its own data to tell a compelling story about the misplaced priorities of the British Raj. Similarly, Dr. Ambedkar’s doctoral thesis, published in 1925, was on the evolution of provincial finance in India, and used budgets as its primary data source.
Since then, the functions and size of government have expanded, and budgets have grown more complicated and unwieldy. There are a large number of ambitious government programmes in a number of specialised sectors, but the problems they aim to address seem to persist. Today, we all intuitively understand that it isn’t enough to judge the government on allocations alone. How, then, can we evaluate the government on the results of its efforts, not just the amounts spent?
Changes introduced in this year’s Union Budget mark a step towards this goal. The removal of the distinction between plan and non-plan expenditure has been widely reported. In addition, budget documents are now much more compact and understandable. Schemes are presented as one line item instead of being disaggregated across many; and are neatly categorised so that the reader can distinguish between different types of schemes. New statements on Centrally Sponsored Schemes, Railways, and so on bring together information which was scattered across multiple documents. Suyash Rai of NIPFP covers these changes quite exhaustively here.
But the most promising change is the publication of an Outcome Budget for 2017-18. While budgets traditionally display expenditure, evaluating a programme also requires data on outputs – what the programme is producing – and outcomes – the ultimate benefit which the programme delivers. For example, an agriculture scheme might have seeds and fertiliser distributed as outputs, and improvements in soil fertility and agricultural GDP as outcomes. For every scheme of the Union Government, this year’s Outcome Budget presents the allocation, expected outputs and outcomes.
This effort is not wholly new. Outcome Budgeting was introduced by the then Minister of Finance, P. Chidambaram, in 2004-05. Every ministry and department was asked to publish documents stating policy changes, outputs and outcomes. However, this effort failed to take root: outcome documents were often unavailable publicly, voluminous, or of poor quality.
For example, look at the Department of School Education and Literacy. It published outcome budgets which were more than a hundred pages long in 2011-12, 2013-14, and 2016-17 – documents for the remaining three years are unavailable online. Outputs and outcomes for the Sarva Shiksha Abhiyan (SSA) in previous years were often vague, and kept changing. In 2016-17, for example, one of the outputs of the SSA was declared to be “Recruitment of teachers to attain ideal PTR”, which would lead to the outcome of “Enhanced learning levels and retention”. How many teachers were to be recruited? What did “enhanced” learning levels mean? The document was silent on these questions.
The consolidated Outcome Budget avoids many of these pitfalls by presenting a comprehensive view of the outputs and outcomes of all government programmes in one central document.
Looking again at the Department of School Education and Literacy, we find a concise two pages in the Outcome Budget 2017-18. All SSA outputs are quantified – gross enrolment of 98.5, 9.5 crore textbooks to be distributed, and so on. The outcome, however, is less clear – it says “GER at elementary level will be increased to 97.5 in 2016-17; efforts to improve quality and retention at the elementary level”. Similar problems with outcomes exist for other programmes – a common pattern is that the outcome of a programme is said to be “improvement” or “enhancement” in some area, whether infrastructure or AYUSH services or fertiliser usage. Such statements, bereft of concrete, measurable targets, do not help meet the goal of holding the government accountable for its performance.
Indians have grown accustomed to using budget allocations to understand how the government is trying to improve things, but using perceptions and personal experiences to judge success. Putting numbers to the government’s good intentions would help make this more concrete. The next step would be to integrate these outcomes and outputs into the cycle of budgeting and planning. Though this Budget marks an important first step, much more needs to be done to make budgets truly an instrument of public accountability.