Business Standard

How data can mislead in analysing social policy

India needs a transformation in how data is measured and analysed to form policy

Avani Kapur 


Visiting the Swachh Vidyalaya campaign website on Independence Day, one is greeted with a congratulatory message on the completion of the Swachh Vidyalaya (SV) campaign, launched to ensure toilets, including separate ones for girls, in all schools. The message is accompanied by a bar showing a 100% completion rate against the target of constructing 4.17 lakh toilets in 2.61 lakh schools across the country.

Similarly, the launch of the Pradhan Mantri Jan Dhan Yojana (PMJDY) to ensure financial inclusion was accompanied by the creation of the PMJDY portal to track progress. A look at the numbers indicates that in the span of one year, over 17 crore accounts were opened, 15 crore debit cards were issued, and fewer than 50% of accounts had a zero balance.

Whilst the successful achievement of the targets for the PMJDY and SV campaigns is indeed significant and should be lauded, their very “success” points to some fault lines in the government’s management information systems which should not be ignored amid the celebrations.

Let me elaborate on what I mean.

The Mission Mode Model

As early as October 2011, the Supreme Court (SC) of India had directed all States and Union Territories to build toilets, particularly for girls, in all government schools by the end of November the same year. At the same time, an annual computerised system to collect school-level data, including on the number of toilet facilities, was already available through the District Information System for Education (DISE). In fact, if one compares the programme pre- and post-Swachh Bharat, the only noticeable differences between 2011 and today are that a) there is a lag between DISE data collection and publication (even today, the latest DISE data available is for 2013-14), and thus there was no regular portal monitoring progress in real time, and b) there was no significant political push for the programme.

The fact that results were fast-tracked only when the entire machinery was mobilised in mission mode for a specific goal is indicative of systemic weaknesses in the existing planning, implementation and decision-making processes that need to be addressed.

The Output Model

Another important fault line relates to what we are measuring, or rather, what we are able to measure regularly. As a country, we are relatively good at monitoring targets related to inputs and outputs. Interestingly, by design, in both programmes (SV and PMJDY) the focus has been on monitoring progress on the number of toilets built or the number of accounts opened. Little information is available on whether these toilets are increasing attendance, or whether the accounts opened have had an “impact” in terms of easier access to benefits.

Whilst measuring outputs is important, for programmes to succeed in the long run, a regular system for tracking outcomes will also need to be developed.

The Stick Model

The last important fault line ties into the question of incentives and the usage of data. To give an example: data published by DISE suggests that between 2011 and 2012, the proportion of schools with girls’ toilets increased from 72.2% to 88.3%. Interestingly, however, during the same period the proportion of schools with boys’ toilets decreased from 81.1% to 67.1% (remember, this was the time the SC was pushing for separate toilets for girls).

An off-the-record conversation with a state official provided some explanation: the push to show achievements in girls’ toilets led the state to “convert” existing toilet facilities into girls’ toilets!

The problem is a simple one. If the data generated by me is used primarily to monitor me, I am incentivised to report positively on it. Or, to be less pessimistic: if I know I am being monitored on certain things, I will focus only on those things.
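The arithmetic behind this kind of reclassification can be sketched in a few lines. The numbers below are entirely hypothetical (they do not reproduce the DISE figures); the point is simply that relabelling existing facilities moves the monitored indicator up and the unmonitored one down without a single new toilet being built:

```python
# Toy illustration with made-up data: four schools, each reporting
# counts of girls-only and boys-only toilet blocks.
schools = [
    {"girls": 1, "boys": 1},
    {"girls": 0, "boys": 1},
    {"girls": 1, "boys": 0},
    {"girls": 0, "boys": 0},
]

def coverage(schools, key):
    """Share of schools reporting at least one toilet of the given type."""
    return sum(1 for s in schools if s[key] > 0) / len(schools)

print(coverage(schools, "girls"), coverage(schools, "boys"))  # 0.5 0.5

# "Convert" every boys' toilet into a girls' toilet on paper:
for s in schools:
    s["girls"] += s["boys"]
    s["boys"] = 0

print(coverage(schools, "girls"), coverage(schools, "boys"))  # 0.75 0.0
```

Girls' coverage rises and boys' coverage collapses, yet the total stock of toilets is unchanged — exactly the pattern a monitoring system fixated on one indicator will reward.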

Most sector schemes emphasise the need for and importance of local-level planning. Consequently, formats are designed for collecting disaggregated data; numerous hours are spent by frontline workers filling them in (anganwadi workers are meant to fill 37 formats across 11 registers!); and computer operators at the block/district level have the daunting task of turning thousands of pages of data into monthly or quarterly reports to be handed over to their supervisors. But then what? What happens with the data? What is it used for? More often than not, it is used only as a tool for monitoring performance. Rarely does it feed effectively into a planning process, a process for identifying bottlenecks, or a tool for the user to diagnose or learn from inefficiencies.

The new government seems keen to use statistics and data and to create regular systems for monitoring performance. That is a positive step. However, for India to move towards a sound evidence-based policy-making system, we need a radical transformation in the manner in which we view and use data. In the words of Dr Suresh Tendulkar, we need a system “involving continuing interaction between data generators and data users so that demand for and supply of data are taken to be realistically inter-dependent and mutually interactive in character.”
Avani Kapur works as Senior Researcher and Lead, Public Finance, at the Accountability Initiative, Centre for Policy Research, New Delhi. Her work focuses on public finance and accountability in the social sector.
This is her first post on her blog, Social Specs, a part of Business Standard's platform, Punditry.
Avani tweets as @avani_kapur

First Published: Thu, August 20 2015. 08:30 IST